I'm a physics major (junior) who would like to learn a programming language. Many suggest Python as a good starting language, so let's suppose that's the first language I want to learn. This isn't for employability or general usefulness; rather, I'm just curious.
How exactly does one go about "learning" a programming language and how does one determine what has actually been "learned"? When can you say you "know" a given language? The issue is that I have no idea what "learning" looks like when applied to coding. I can very easily determine what I have learned and how well I have learned it when it comes to things like physics and mathematics, but not for coding.
I have taken two coding classes (technically one, since the second focused on a hardware description language). That one class was on C/C++. I got an A, and my professor seemed to think that an A translates to "knowing" how to code in C; at least, that we could put it on our resumes as a skill without being deceitful. Yet I would not dare put C/C++ on any resume. As far as I can tell, I learned nothing. So, if a semester ending in an A leaves me no closer to "knowing" C, what exactly does "learning" look like in the first place? This bothered me a lot, since if I end a semester with an A in, say, a math class, I can confidently demonstrate at a later date that I "learned" most of the course material.
I guess I'm just asking for a more detailed explanation of how one "learns" coding. All the answers I've encountered so far have been vague and unsatisfying. I hope to take a computational physics class in Python next semester, which will hopefully leave a more lasting impression than that C class did.