For those who ask: "What programming language should I learn?"

  • Thread starter pbuk
In summary, when deciding which programming language to learn, it is important to consider your goals and the specific applications you are interested in. For hobbyists, Python may be a good choice due to its ease of use and versatility. However, for job opportunities, C or C++ may be more in demand. It is also worth considering the speed and efficiency of each language in different contexts, such as web development, scientific research, or game development. Ultimately, it is important to choose a language that aligns with your goals, rather than relying solely on generalizations or assumptions about a language's speed or popularity.
  • #36
That's fair, but I might focus on that.

I do think C++ has gotten a bit of an unfair rap (but some of its problems are ignored, so maybe it averages out) because it exposes what people don't understand. Scoping, certainly. Strong typing. Pointers. Maybe you remember being confused by call-by-value. Toss in the STL and now you expose a lack of understanding of data structures - do I want a list or a vector or a deque or...
 
Last edited:
  • #37
One thing that’s not been brought up here is the fragile base class issue of OOP languages like Java or C++.

https://en.wikipedia.org/wiki/Fragile_base_class

C++ allows for multiple inheritance and as a result can introduce some funky errors in code. As an example, say you have a class C that inherits from classes PA and PB, which in turn inherit from a common class G. G has a variable x that a method in class C changes.

The question is how one should access x from methods in C: via PA to G, or via PB to G? It turns out that in C++ (without virtual inheritance) there are two instances of the G class and hence two x variables. The solution was to always design with getter and setter functions and not allow direct access to the variables of another class.
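For contrast, here is a minimal Python sketch of the same diamond (class names mirror the description above; this is illustrative, not from any post). Python linearizes multiple inheritance with the C3 MRO, so there is a single G and a single x, and the ambiguity does not arise:

```python
class G:
    def __init__(self):
        self.x = 0

class PA(G):
    pass

class PB(G):
    pass

class C(PA, PB):
    def set_x(self, n):
        self.x = n  # unambiguous: only one x exists on the instance

c = C()
c.set_x(5)
# The method resolution order shows one shared G, not two copies:
print([k.__name__ for k in C.__mro__])  # ['C', 'PA', 'PB', 'G', 'object']
print(c.x)  # 5
```

Whether a single shared base or two separate copies is "right" depends on the design, which is why C++ makes you choose (virtual vs. non-virtual inheritance) while Python picks one answer for you.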

These subtleties can really mess with a beginning student and must be approached carefully as they get comfortable with whatever OOP language they choose.

In the past, I’ve championed the Processing approach to learning to program because of its immediate feedback and relatively simple editor and runtime. It’s based on Java but introduces students to the setup and draw model, allowing them to get immediate feedback when something is run. Java is a good language on several fronts: single inheritance, platform agnostic, rich library of features, good vendor support and great IDE tools.

The student has a clear path from Processing to more serious projects and an easier transition from hobbyist level to professional developer while maintaining what they’ve learned.

Processing has a similar path for Python folks when the Jython mode is selected although it’s not as easy to transition to developer level since Jython is at v2.7 and most modern Python development is done with Python 3.

Another popular Java student IDE is BlueJ. I’ve played with it but feel that Processing has more to offer.

Anyway, this is an endless flame war topic with no clear winner, only many options for folks to choose from and lots of people with varying opinions and positions on what works best.
 
  • Informative
Likes jack action
  • #38
jedishrfu said:
One thing that’s not been brought up here is the fragile base class issue of OOP languages like Java or C++.
...
These subtleties can really mess with a beginning student
I don't think multiple inheritance is something a beginning student should be learning anyway.

jedishrfu said:
The student has a clear path from Processing to more serious projects and an easier transition from hobbyist level to professional developer while maintaining what they’ve learned.
I accept what you say; however, I believe it is better to be able to move to more serious projects without having to change language.

jedishrfu said:
Processing has a similar path for Python folks when the Jython mode is selected although it’s not as easy to transition to developer level since Jython is at v2.7 and most modern Python development is done with Python 3.
No, Python 2.7 reached end of life three years ago and should not be used by anyone for anything, let alone be taught to beginners.

jedishrfu said:
Anyway, this is an endless flame war topic with no clear winner, only many options for folks to choose from and lots of people with varying opinions and positions on what works best.
I have two thoughts on this:
  1. Although there are many opinions stated among posters here and elsewhere, I believe that there is now a consensus among the academic community, particularly in STEM, that Python is the best first language (some indicative evidence below). I think PF would benefit from a sticky post joining this consensus.
  2. Alternatively, there does at least appear to be a consensus on here for "don't learn C++ just because someone has told you it is fast", and I think a post along those lines that we could link in answers would be useful.
@Greg Bernhardt what do you think?

University of Cambridge Department of Computer Science and Technology said:
FAQ: Can you recommend any books or activities to do that would help my application and/or studies?
A: If you choose to learn a new language, it may be a good idea to learn one that is not explicitly taught in the Tripos. Doing so obviously helps to avoid repetition, but also gives you a wider perspective on languages that can be useful later in the degree and in employment. A popular choice is Python, for which there are many tutorials available.
https://www.cst.cam.ac.uk/admissions/undergraduate/faqs

Since 2016 Python has been the first language taught to University of Cambridge Engineering students
https://github.com/CambridgeEngineering/PartIA-Computing-Michaelmas

MIT OpenCourseWare Introductory Programming said:
This page will help you begin to learn programming and computer science, with some suggested introductory courses on OCW.
These courses introduce principles of computer science and begin to develop programming skills, specifically in the Python language. Learn more about these courses’ learning goals, history and student experience in this MIT news article.
Introduction to Computer Science and Programming in Python
Introduction to Computational Thinking and Data Science
Programming for the Puzzled
https://ocw.mit.edu/course-lists/general-introductions-to-programming/
 
  • #39
pbuk said:
Although there are many opinions stated among posters here and elsewhere, I believe that there is now a consensus among the academic community, particularly in STEM, that Python is the best first language (some indicative evidence below). I think PF would benefit from a sticky post joining this consensus.
Among the academic community, particularly in STEM, it is prototyping that is mostly needed, not serious programming. For that purpose Python is a good choice. But the academic community is an extremely small sample of all programmers.
To the general question: "what programming language should I learn", I'd say the consensus in the industry is that it should at least be an object-oriented language with static typing.

Even people using Python would do well to learn a statically typed language with separation of concerns first. That would greatly improve their programming style.
 
  • Like
Likes Wrichik Basu
  • #40
Rene Dekker said:
Among the academic community, particularly in STEM, it is prototyping that is mostly needed, not serious programming.
I think you have misunderstood my point, which is that if institutions such as the University of Cambridge and MIT are recommending and using Python as the first language to learn for computing and STEM students then why would it not be good for PF to do that too?

Rene Dekker said:
Even people using Python would do well to learn a statically typed language with separation of concerns first.
I don't understand your reference to separation of concerns as a language feature; surely it's a (highly desirable) programming style that can (and should) be used in any language?
 
  • #41
pbuk said:
Although there are many opinions stated among posters here and elsewhere, I believe that there is now a consensus among the academic community, particularly in STEM, that Python is the best first language (some indicative evidence below).
I beg to differ on this. Python is a good language for scientific computing, no questions regarding that. But that's primarily due to the availability of libraries like numpy, scipy and matplotlib - the foundations. If these libraries were written for Java, the scenario would have been different.

An absolute beginner needs to learn programming first, before going into scientific or numerical programming. Without the notion of data types, for instance, I am not convinced programming logic can be developed. A solid base in programming will help you master any programming language in the future.

I am not saying stay with Java. Switch to whatever language you want, based on your needs. But start with Java.
 
  • #42
Wrichik Basu said:
I beg to differ on this.
Then you are differing with the recommendations I quoted from the University of Cambridge and MIT, which are not specific to scientists.

Wrichik Basu said:
An absolute beginner needs to learn programming first, before going into scientific or numerical programming. Without the notion of data types, for instance, I am not convinced programming logic can be developed. A solid base in programming will help you master any programming language in the future.
I agree completely. Data types are encountered early in learning Python with
Python:
# Adding numbers.
print(2 + 2) # 4

# Adding strings (concatenation).
print("2" + "2") # 22

# You can't mix types.
print(2 + "2") # TypeError: unsupported operand type(s) for +: 'int' and 'str'
 
  • #43
On Python:

There are research groups out there being strangled (get it?) by too much Python. They depend on thousands of lines of unmaintainable Python code written by a parade of students, and never really designed. "Prototypes" are sent straight into production.

This was done with American automobiles in the 1980s and it worked about as well. I suppose it worked out very well... if you are Toyota.

On C++:

Just because you can doesn't mean you should.

We have a discussion upstream about the "diamond of death" inheritance pattern. As a friend said, "this should only be used in compiler test suites, and not any actual running code." I dislike it too, not because of the problems with it so much as it is almost never what you want.

There is a tendency of beginning C++ programmers to overuse inheritance. This is aggravated by textbooks using geometric shapes to showcase it ("look, it inherits the perimeter class") which is a bad choice - it's contrived, saves little coding, and turns the classes into monstrosities of exception handling.

The design philosophy is good - if you write code to do one thing, you write it one time and put it in one place. If you are using a feature of the language in some other way, there will be harder times ahead.
 
  • Like
Likes Wrichik Basu, jack action and jedishrfu
  • #44
Vanadium 50 said:
There are research groups out there being strangled (get it?) by too much Python. They depend on thousands of lines of unmaintainable Python code written by a parade of students, and never really designed. "Prototypes" are sent straight into production.
Yes, you can write bad code in any language. In the 1990s the same groups struggled with FORTRAN77 legacy code, in the 2000s it was Fortran 90, the 2010s C++ and Java. Perhaps if we teach people how to write good Python code from the get-go this will be better in future.

Vanadium 50 said:
We have a discussion upstream about the "diamond of death" inheritance pattern. As a friend said, "this should only be used in compiler test suites, and not any actual running code." I dislike it too, not because of the problems with it so much as it is almost never what you want.
Absolutely.

Vanadium 50 said:
There is a tendency of beginning C++ programmers to overuse inheritance. This is aggravated by textbooks using geometric shapes to showcase it ("look, it inherits the perimeter class") which is a bad choice - it's contrived, saves little coding, and turns the classes into monstrosities of exception handling.
I don't know what idiot it was that confused the perfectly sensible "Shape" example of polymorphism and the equally sensible "calculatePerimeter" example of function overloading to come up with a Perimeter class. I hope they were sacked, but somehow their legacy lingers on. Personally I prefer the "Animal" example.
 
  • #45
pbuk said:
you can write bad code in any language
Go too far down that path and you will reach the conclusion "nothing wrong with just writing that code in C++".

If the story is "There's a lot of Python out there, so it's a good thing to learn" it's hard to argue with. If the story is "Python encourages good programming habits" I think you can find a lot of counter examples.

But I think this dances around the real issue - people want to be programmers without learning any programming (and worse, they think they are good at it). Not too long ago, we had a thread where someone wanted to learn a particular database query language - but wasn't willing to put in more than two days on it.

To steal a phrase, if you want to build a doghouse, you can run down to Home Depot, pick up some wood and nails and go. If you want to build a house, you probably at least want blueprints. If you want to build a skyscraper, you want engineering. CS gets in trouble when people treat skyscrapers as great big doghouses.
 
  • Like
Likes Klystron, Wrichik Basu and jack action
  • #46
pbuk said:
I don't think multiple inheritance is something a beginning student should be learning anyway.
I accept what you say, however I believe it is better to be able to move to more serious projects without having to change language.
No, Python 2.7 reached end of life three years ago and should not be used by anyone for anything, let alone be taught to beginners.
I have two thoughts on this:
  1. Although there are many opinions stated among posters here and elsewhere, I believe that there is now a consensus among the academic community, particularly in STEM, that Python is the best first language (some indicative evidence below). I think PF would benefit from a sticky post joining this consensus.
  2. Alternatively, there does at least appear to be a consensus on here for "don't learn C++ just because someone has told you it is fast", and I think a post along those lines that we could link in answers would be useful.
@Greg Bernhardt what do you think?

Since 2016 Python has been the first language taught to University of Cambridge Engineering students
https://github.com/CambridgeEngineering/PartIA-Computing-Michaelmas
Would you like to take a try at writing this post?
 
  • Haha
Likes jedishrfu
  • #47
Python's indentation is a bit hard to get used to. In C++, I can use Code::Blocks to format on save, and it auto-formats code for me. I'm pretty sure there's no equivalent in Python because indentation is the whole thing it is built on.
 
  • Like
Likes Wrichik Basu
  • #50
jedishrfu said:
One thing that’s not been brought up here is the fragile base class issue of OOP languages like Java or C++.

https://en.wikipedia.org/wiki/Fragile_base_class
Very interesting: I have never run into this problem.
I converted the example provided in that wiki article to C++ and included it here:
Wiki example to C++:
class base {
public:
   // int incA() { return counter++; }
   int incA() { return incB(); }
   virtual int incB() { return counter++; }
private:
   static int counter;
};
int base::counter = 0;

class derived: public base {
public:
   int incB() { return incA(); }
};
First off, I think this is an excellent example of how mentorship is really important to a beginning programmer. Without mentorship, a novice will fiddle with this code until it works and leave mystery and frustration in its wake.

Although the semantics of the language are very important, in my mind, this "fragility" is more profound than just semantics. In contrast to addressing semantics, I think it is important to track this issue all the way through the design process - because that is where the problem lies.

When coded exactly as it is written above, it will drop into infinite recursion and fail.
The way to correct this would depend on the actual purpose of base::incA(). Of course, in this case it is only to provide an example. But in "real life", you would need to determine if it was correct for it to call this->incB() or base::incB().
If base::incB() is needed, then "int incA() { return incB(); }" needs to be corrected to "int incA() { return base::incB(); }". And the bug is fixed.

On the other hand, it could be that base::incB() really should be relying on the derived version of incA().
If that is the case, the problem is with derived::incA(). That function should not rely on base::incB(), and the fact that it was coded that way indicates that the coder did not understand how the classes 'base' and 'derived' are being used to address the "real life" problem. In such a case, I would expect that derived::incA() has been completely mis-coded and that fixing it will involve more than just replacing one or two lines of code. In other words, the problem isn't just inadvertent recursion, it is a misunderstanding of what incA() was supposed to do.
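For what it's worth, the trap isn't specific to C++: any language with open dynamic dispatch has it. Here is a rough Python sketch of the same Wikipedia example (names adapted; purely illustrative):

```python
class Base:
    counter = 0

    def inc_a(self):
        # Dynamic dispatch: on a Derived instance this calls Derived.inc_b
        return self.inc_b()

    def inc_b(self):
        Base.counter += 1
        return Base.counter

class Derived(Base):
    def inc_b(self):
        # The override calls back into the base's inc_a, which dispatches
        # right back here: unbounded mutual recursion.
        return self.inc_a()

try:
    Derived().inc_a()
except RecursionError:
    print("inc_a/inc_b recurse until the stack limit, as in the C++ version")
```

As with the C++ fix discussed above, making the base call explicit (Base.inc_b(self) instead of self.inc_b()) breaks the cycle, at the cost of pinning the base class to one implementation.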
jedishrfu said:
C++ allows for multiple inheritance and as a result can introduce some funky errors in code. As an example, say you have a class C that inherits from classes PA and PB, which in turn inherit from a common class G. G has a variable x that a method in class C changes.

The question is how one should access x from methods in C: via PA to G, or via PB to G? It turns out that in C++ (without virtual inheritance) there are two instances of the G class and hence two x variables. The solution was to always design with getter and setter functions and not allow direct access to the variables of another class.
So here is my rendition of the example you describe in C++. If this is not what you have in mind, let me know and I will take another shot at it.
Multiple inheritance example in C++:
class G { public: int x; };
class PA: public G {};
class PB: public G {};
class C: public PA, public PB
{
   void SetX(int n) { x=n; }
   int GetX() { return x; }
};

That code will not compile: the C++ compiler complains that the x's in SetX() and GetX() are ambiguous.
To correct the code, you need to replace each 'x' with either 'PA::x' or 'PB::x'.
 
  • #51
pbuk said:
Although there are many opinions stated among posters here and elsewhere, I believe that there is now a consensus among the academic community, particularly in STEM, that Python is the best first language (some indicative evidence below). I think PF would benefit from a sticky post joining this consensus.
If such a sticky note is included, it should specify its advantages. In your "spoiler" section, only the availability of tutorials is mentioned as an advantage. If that is the only advantage, Python is not worth a sticky note. If there are more advantages, then they would need to be enumerated.
pbuk said:
Alternatively, there does at least appear to be a consensus on here for "don't learn C++ just because someone has told you it is fast", and I think a post along those lines that we could link in answers would be useful.
And don't learn assembler just because it is fast - or either C/C++ or assembler because they create binaries that are easy on the processor's other resources.

What might interest someone to learn C/C++/C#:
1) it has application in industry;
2) since most operating systems are written in C/C++/C# and assembler, you have more confidence that the restrictions to the resources in your operating environment will be at a minimum;
3) Once you gain some mastery in it, it is quick to code and check.

Previous academic "darlings" have included Basic, Pascal, Ada, Lisp, Forth, C, and C++.
The difference between C, C++, and later C#, and those others is that C/C++/C# have led to high productivity in the workplace. Coders got good at it and ran with it. It was a solid replacement for all of those Fortran-like languages that preceded it and it took the lead as the favorite language for "heavy lifting".

I don't know where Python lands. I find it a handy "supplemental" language. In contrast, I would describe Pascal as 'lame'. I don't think Python falls into that category. Python has found a place in industry.
 
  • #52
Greg Bernhardt said:
Would you like to take a try at writing this post?
Yes, I started this thread as a first prototype :smile:
 
  • #53
shivajikobardan said:
I'm pretty sure there's not [an equivalent to Code::Blocks for] Python.
You must be joking - there are many IDEs for Python, including in no particular order:
  • IDLE which is the 'default' Python IDE, shipped with Python since forever;
  • Thonny, which is easy to install along with Python and good for beginners;
  • PyCharm, as for Thonny but also part of a robust professional family offering from JetBrains;
  • Visual Studio Code, which is not really an IDE but whose plugins mean it has pretty much replaced IDEs for a large part of the development community in most languages;
  • The online IDE from Programiz which is probably the easiest and quickest way to start programming and can run on almost anything e.g. a Chromebook.
 
Last edited:
  • #54
pbuk said:
unsupported operand type(s) for +: 'int' and 'str'
I admit I never tried that in Python, but I find that astonishing and not very Pythonic. I would have expected Python to promote the 2 to a "2" and make this a "22".
 
  • #55
Vanadium 50 said:
I would have expected Python to promote the 2 to a "2" and make this a "22".
That would be guessing that the intent was string concatenation rather than integer addition. But why should that be the default? One could just as easily argue that it's more likely that the programmer meant to do integer addition so the "2" should be interpreted as an integer and added to the 2 to make 4.

Python would rather make the programmer be explicit and correct about stating what they intend, instead of guessing about incorrect constructs that could be interpreted multiple ways.
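To make the "be explicit" point concrete, a short sketch (plain Python, nothing beyond the standard behaviour):

```python
# The programmer states which interpretation is meant:
print(2 + int("2"))   # 4 -- arithmetic was intended
print(str(2) + "2")   # 22 -- concatenation was intended

# Without an explicit conversion, Python raises rather than guesses:
try:
    2 + "2"
except TypeError as err:
    print(err)  # unsupported operand type(s) for +: 'int' and 'str'
```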
 
  • Like
Likes pbuk
  • #56
PeterDonis said:
Python would rather make the programmer be explicit and correct about stating what they intend, instead of guessing about incorrect constructs that could be interpreted multiple ways.
That was a good decision. Shame that whoever made it was out of office when the decisions about the relationships between True, False and 1, 0 were made (and why were they not fixed in v3?). Anyway this is OT.
Python:
# The power of truth.
print(10 ** True) # 10

# The power of double truth.
print(10 ** (True + True)) # 100

# Wait for it...
print(True == 1) # True
print((True + False) == 1) # True
one = 1
print((True + False) is one) # True
print(True is one) # False - gotcha!
 
  • #57
pbuk said:
the relationships between True, False and 1, 0
I agree that treating bool as a subtype of int doesn't make a lot of sense. Historically bool was tacked on late in the evolution of Python 2, as a subtype of int for backward compatibility, but I don't think the consequences were ever fully considered.

pbuk said:
why were they not fixed in v3?
My ranting answer: they were too busy making bad decisions about how to "fix" the handling of Unicode to make good decisions about other stuff. (AFAIK making bool its own independent type has never been on the radar of the Python core developers anyway.)
 
  • #58
pbuk said:
I agree completely. Data types are encountered early in learning Python with
Python:
# Adding numbers.
print(2 + 2) # 4

# Adding strings (concatenation).
print("2" + "2") # 22

# You can't mix types.
print(2 + "2") # TypeError: unsupported operand type(s) for +: 'int' and 'str'
I don't want to be harsh on you, but this is exactly the kind of thing that we are trying to avoid. If "data types" are mentioned, and you can only think of numbers and strings, then you may have learned a programming language, but have not learned software design.

The most striking disadvantages of Python compared with, for example, Java:
- limited focus on separated data structures with defined, encapsulated functionality.
- no static type checking, meaning that type faults come up at run-time rather than compile time.
- limited support for separation of concerns.
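A small illustrative sketch of the run-time point (the function and values are made up): the bad call below would be rejected at compile time in a statically typed language like Java, but Python only objects when the line actually executes.

```python
def total(prices):
    # Nothing declares what 'prices' must be; any fault is deferred
    # until this line runs with the wrong kind of argument.
    return sum(prices)

print(total([1.50, 2.25]))  # 3.75 -- fine

# This call type-checks nowhere; it simply fails at run time:
try:
    total("not a list of numbers")
except TypeError as err:
    print(err)  # sum() cannot add str elements to the integer 0
```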

If you have already learned software design based on a more structured language, then it is easy to put something together in Python, and you can even write decent programs in it. But the lack of static type checking would still be blocking for large scale deployment.

So as a first language to learn software design, I would definitely say that there is a consensus NOT to use Python, or any other dynamically typed language.
There is a consensus to use a more structured language like Java.

I also agree with the others that C++ is not suited as a first language. It is overly complex, with loads of features and peculiarities that distract from the problem-solving focus. Multiple inheritance, as discussed in this thread, is almost never required or used.
 
  • #59
Rene Dekker said:
limited support for separation of concerns
What exactly is the difference between Python and Java in this respect?
 
  • #60
PeterDonis said:
But why
I ask myself that every time I use Python. Why? Why oh why?

With strong typing, the language never needs to guess. With weak typing, it does, although often the choice is obvious. Python is kind of slushy. But the guesswork goes like this: 2 + "a" can only be "2a" so 2 + "2" should be 22.

I'm not complaining about what it's doing, just astonished. Like running into an exotic dancer in church on Sunday. You can be glad she's there but still be surprised to see her.
 
  • #61
.Scott said:
2) since most operating systems are written in C/C++/C# and assembler,
I wouldn't treat C# in one breath with C and C++. C# is a more structured, higher-level language based on and inspired by Java. Just like Java, it is compiled to byte-code which then runs on a virtual machine (which nowadays uses advanced JIT compiler techniques to speed up the execution). It is not in the same category as C and C++.
 
  • #62
pbuk said:
TL;DR Summary: If you are new to programming, learn Python - it is relatively easy to learn and you can do a lot with it.
This sounds way too much like you are promoting a policy instead of offering direction.
How about, "If you are new to programming and looking for advice, start with Python - it ..." ?
pbuk said:
Learn … / If you want to write code for …

This "Learn X, if you want to develop Y" block has a context problem. By including it as "advice to a novice", you are advising someone who is "new to programming" to "Learn C if you want to code device drivers" - though you are reversing this in your first "tip".

So there are two issues: the first relates to qualifying the audience; the second is phrasing the heading to match the audience.

So the first: I would like to say that telling a novice to "Learn C if you want to code device drivers" is categorically wrong. But I know of several EE types that have done exactly that with serial drivers in Unix systems. That happens to be a relatively cookie-cutter type procedure. Is someone who has already read through the data sheet for an Intel processor and wired up a new device to his computer "new to programming"? I think the "qualifier" is what I mentioned before: are they "looking for advice".

Now the phrasing. How about "You'll eventually need", instead of "Learn"?

All novice coders that I have ever worked with are 1-language creatures. So, except in special situations, we don't want to tell them to tackle C until they say "how about C".

pbuk said:
  • Do not learn C++ (or any other dialect of C) because you believe that it is "fast" in some general sense - the contexts in which this may be relevant are limited to those mentioned above (games, embedded systems etc.)
I think your purpose with this tip is to discourage someone who is over-eager with C++ by souring the grapes. I wouldn't do that. All computer languages have friends (even RPG). C++ is great, so is Python, so is JavaScript, so is ... is ... Cobol (had a hard time getting that one out). Besides, you're describing C++ as a language of last resort - but for many large projects there are so many inherent complexities that the conciseness and power of C++ can be essential.

So I would say something like this:
C and C++ are powerful and concise languages, often well suited for large projects, OS code, and code that needs to make the most of limited processor resources; they are also cryptic, subtle, and vast.
Picking one of these as your first language could end your career before it ever gets started.
pbuk said:
  • Do not learn C++ because you believe that all interviewers expect you to use it to answer coding questions - in 2023 this is no longer true in many cases, and it is easier and quicker to solve coding problems correctly in e.g. Python. Of course if you are going for a job writing device drivers at NVIDIA this is not going to apply, but if you think you can get a job writing device drivers at NVIDIA you don't need advice on learning to code anyway.
By "coding questions", you mean explaining a software concept or algorithm. But the phrase "coding question" certainly also includes questions on semantics - which would, of course, require language-specific knowledge.

I would replace this with:
In a job interview or other situation where you want to describe software concepts or algorithms,
don't default to C/C++ unless both you and the interviewer are very familiar with it. Expect to find better success with Python, an improvised pseudo-code, a narrative procedure, a flow chart, or some combination.

pbuk said:
  • Do not learn how to implement algorithms and data structures like linked lists, hash tables, trees, indexing, sorting etc - these algorithms have already been implemented optimally in the languages that you will learn (although the comment about NVIDIA etc. above applies).
In a practical sense, what is the difference between "learning to implement" and "learning how it has been implemented"?
Anyone who sticks with programming long enough will pick up almost all of that stuff - and it is all valuable. If your database is slow, it might be because some automatic decisions have been made by your software tools on your behalf. Understanding the mechanisms under the hood helps in directing you to the solution.
I would drop this item completely.
 
  • Like
Likes DrClaude and Vanadium 50
  • #63
Vanadium 50 said:
With strong typing, the language never needs to guess.
Python is strongly typed. What differentiates it from, for example, Java, is that it is dynamically typed.

I suppose you could say that Python never "needs" to guess, but I would put it that it refuses to guess.

The other key difference between Python and most other languages is the meaning of a variable in Python. In, say, Java, declaring a variable means setting aside a piece of memory to store a particular kind of data. The "type" of the variable is what kind of data will be stored in that piece of memory.

In Python, however, declaring a variable means declaring a namespace binding. In other words, a "variable" is an assignment of a name in the current namespace to an object. The namespace is just a Python dict, so the name is a dictionary key and the object is the dictionary value corresponding to that key. The object itself is a pointer to a data structure, and the "variable type" is the type of the object. Python type annotations are declarations that a particular name in a namespace will only be bound to objects of a particular type; but the interpreter currently doesn't enforce them (i.e., it doesn't check actual variable assignments to see if they are consistent with the type annotations).
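A short sketch of that model (plain Python; the names are made up): a name is just a rebindable namespace key, and annotations are recorded but never checked by the interpreter.

```python
# A name is a namespace key; rebinding it may change the "type" freely,
# because the type lives on the object, not on the name.
x = 42
print(type(x).__name__)   # int
x = "forty-two"
print(type(x).__name__)   # str

# An annotation is a declaration only; the interpreter does not enforce it.
y: int = "not an int"     # runs without complaint
print(type(y).__name__)   # str
```

External checkers like mypy do enforce annotations, but that is a separate tool run before execution, not part of the interpreter.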
 
  • #65
.Scott said:
But I know of several EE types that have done exactly that with serial drivers in Unix systems.
One of the very first pieces of code I wrote in physics, and I think it might have been the first that other people would use after I moved on, was similar.

The task was to read out our data, arranged by crate, card and channel, and decode it, e.g. Byte 3 in Card 7 in Crate 2 is temperature. I've said that people were preoccupied by speed, but the code I was replacing limited how fast we could take data.

I used a union of structs.

So not just C, but esoteric C. One side of the union was card/crate/channel and the other was what those variables meant.

It was blazingly fast. It was also very fragile, and the comments exceeded the length of the code, and expressed the idea "if you touch this code, it will break."
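
For readers who haven't seen the trick, here is a rough Python ctypes sketch of the same overlay idea. The field layout here is hypothetical (the original was plain C and the real record format isn't given): both structs describe the same bytes, so writing through one view and reading through the other is free.

```python
import ctypes

# One side of the union: the raw crate/card/channel addressing.
class Raw(ctypes.Structure):
    _fields_ = [("crate", ctypes.c_uint8),
                ("card", ctypes.c_uint8),
                ("channel", ctypes.c_uint8),
                ("payload", ctypes.c_uint8)]

# The other side: what those bytes mean (hypothetical layout:
# byte 3 of the record is a temperature reading).
class Decoded(ctypes.Structure):
    _fields_ = [("_skip", ctypes.c_uint8 * 3),
                ("temperature", ctypes.c_uint8)]

# Both views share the same 4 bytes of storage.
class Record(ctypes.Union):
    _fields_ = [("raw", Raw), ("decoded", Decoded)]

rec = Record()
rec.raw.crate, rec.raw.card, rec.raw.channel, rec.raw.payload = 2, 7, 0, 21
print(rec.decoded.temperature)  # 21: same byte, read through the other view
```

Fast, because there is no copying or parsing at all; fragile, because any change to either layout silently breaks the other.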

My first foray into 8086 assembler was a device driver. This was shortly after the IBM PC was released. I got a deal on a hard disk with one bad head, so I wrote a driver that ignored that head. I was not an expert in 8086 assembler (I'm still not) but it was easy enough to take an existing driver and adjust it. (Finding which 4's had to be turned into 3's was a bit of work, to be sure.)
 
  • Like
Likes .Scott and PeterDonis
  • #66
All this talk about addition, concatenation and data types being handled so badly in all of these languages, when you've got PHP that handles it so well:

PHP:
$numberOfApples = 3;
$numberOfOranges = 2;

$numberOfApples  .= " apples";
$numberOfOranges .= " oranges";

echo "$numberOfApples + $numberOfOranges = ". ($numberOfApples + $numberOfOranges) . " fruits";

# Prints "3 apples + 2 oranges = 5 fruits"
# ... with 2 nice notices (not errors, not warnings)
# saying "A non well formed numeric value encountered" on line 7.

Declaring variables ... Pfft! ... and data type ... Pfft! ... statically ... Pfft!
Using the same syntax for addition & string concatenation ... :doh:

It is not that difficult to understand what the user means when done correctly.

And if the user wants a specific type - when he needs it - he just has to specify it then:

PHP:
$numberOfApples = "3 apples";

echo $numberOfApples; # Prints "3 apples"
echo (int) $numberOfApples; # Prints "3"

I don't remember ever being surprised by the assumptions made by PHP. It works as expected, logically, all the time.

PHP takes the best out of every other language and incorporates it into its own. Once you understand PHP, every other language seems familiar somehow.

Do you want procedural or object-oriented? You have both. Do you need a list or an array? In PHP, there are no lists: a list is simply an array. On the plus side, because it is an array, you can even manipulate the indices of your "list"!

Because it was primarily designed for the web, you can easily use HTML for any user interface you need, and all that is needed is a server (yours or someone else's) to run your program. No need to compile it for different architectures or to use some special-purpose virtual machine. Perfect when you are just "learning" by doing fun stuff.

If that wasn't enough, you have the best documentation on the web - filled with examples to explain the different functions - and most likely the largest community to help solve the more complex problems. I don't think I ever had a question that wasn't already asked and answered on the web.

If you need a better-suited language for some specialized task later on, already knowing PHP will probably be helpful for learning that new language (which is most likely more complicated/strict somehow).
 
  • #67
pbuk said:
# You can't mix types.
print(2 + "2")  # TypeError: unsupported operand type(s) for +: 'int' and 'str'

Vanadium 50 said:
I would have expected Python to promote the 2 to a "2" and make this a "22".

PeterDonis said:
One could just as easily argue that it's more likely that the programmer meant to do integer addition so the "2" should be interpreted as an integer and added to the 2 to make 4.
With a slight change in the code ("2" replaced by '2') and language, I wouldn't expect either "22" or 4. Case in point:
C:
printf("%d", 2 + '2');
The above prints 52, using well-documented conversion rules.
 
  • #68
Mark44 said:
The above prints 52, using well-documented conversion rules.
Yes, and note that in this case, the string '2' is what is being "converted", not the integer 2 (though the "conversion" of the string is using its ASCII code, not its value if interpreted as a decimal digit).
 
  • #69
I have used both languages. Python is OK for small hacks but might be a nightmare for big jobs, the stuff that takes person-years.

I also could never figure out what advantage Python had over MacLISP or Scheme or some other interpreted language that already existed.
 
  • #70
PeterDonis said:
Yes, and note that in this case, the string '2' is what is being "converted", not the integer 2 (though the "conversion" of the string is using its ASCII code, not its value if interpreted as a decimal digit).
Technically, '2' is a character, very different from the string "2". Internally, '2' is stored, as you note, as its ASCII code. In the expression, '2' is promoted to an int value (50) and then added to 2.
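
For contrast with the C snippet above, here is a quick Python sketch (mine) making all three possible readings explicit; Python simply refuses the ambiguous one:

```python
# The ambiguous expression is rejected outright.
try:
    2 + "2"
except TypeError as e:
    print(e)

as_char_code = 2 + ord("2")   # 52: the arithmetic C's 2 + '2' performs
as_digit = 2 + int("2")       # 4: if you meant the digit's numeric value
as_text = str(2) + "2"        # "22": if you meant concatenation

print(as_char_code, as_digit, as_text)
```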
 
