Learn C++ with Bjarne Stroustrup's "Programming Principles and Practice" Book

  • C/C++
  • Thread starter Hacker Jack
In summary, "Programming Principles and Practice" by Bjarne Stroustrup is a comprehensive guide to learning the C++ programming language. It covers a wide range of topics, from basic concepts to advanced techniques, and provides clear explanations and examples. With Stroustrup's expertise and experience, readers can gain a strong foundation in C++ and use it to create efficient and reliable programs.
  • #36
@yungman many programmers never come to understand race conditions. I remember arguing, years ago at Bell Labs, with a designer who wanted to leave in a design that could result in a race condition. "But it will never happen in a million years," was his reply. I considered the guy a total idiot. Do you want to fly in an airplane whose software could ever, even remotely, have a race condition? Remember all those (was it Toyota?) cars that got stuck in full acceleration mode even though people were pumping the brakes? They never exactly revealed what caused it, but multiple people died and many accidents occurred because some designer pushed a bad design into code somewhere.
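For anyone reading along who has never been bitten: a data race can be as small as two threads bumping one shared counter. A minimal sketch (the names and counts are just for illustration):

#include <iostream>
#include <mutex>
#include <thread>

long counter = 0;        // shared state
std::mutex counter_mtx;  // protects counter

void bump(int times) {
    for (int i = 0; i < times; ++i) {
        // Without this lock, ++counter is a read-modify-write race:
        // two threads can read the same old value, and one update is lost.
        std::lock_guard<std::mutex> lock(counter_mtx);
        ++counter;
    }
}

int main() {
    std::thread t1(bump, 1000000);
    std::thread t2(bump, 1000000);
    t1.join();
    t2.join();
    std::cout << counter << '\n';  // 2000000 with the lock; usually less without it
}

Remove the lock and the final count comes up short only sometimes, which is exactly why "it will never happen in a million years" is such a dangerous argument.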

Anyway, that designer in MY case was a close friend of the department head, and he was thus untouchable, and his views always won out in arguments. I moved to get away from him, and years later, they finally had to fire him--not because he harassed all the women and Jews, not because he was an idiot, oh no--but finally he got fired because in his arrogance, he got drunk at a company picnic and struck a supervisor in the face with his fist. And not just any supervisor--he struck the only African American supervisor, and at that point, there was nothing even his well-placed manager friend could do to keep him from getting fired. I was one of about two dozen people who went out for a beer to celebrate afterwards.

That guy getting fired was the first time in my life I started to think about karma being, maybe, a real thing.
 
  • #37
@Hacker Jack I apologize that we have hijacked your interesting thread to vent about corporate management issues. I too am always fascinated to find out why anyone bothers to spend so much of a lifetime on C++, as I personally found it so--(there are no words to describe it, but life is too short).
 
  • #38
harborsparrow said:
what are streamers?

cout << "Object contents: " << object;
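That line works because C++ lets you overload the stream-insertion operator for your own types. A minimal sketch, assuming a hypothetical Point struct:

#include <iostream>

struct Point {
    int x = 0;
    int y = 0;
};

// Overload operator<< so a Point can be streamed to any std::ostream,
// just like a built-in type.
std::ostream& operator<<(std::ostream& os, const Point& p) {
    return os << "(" << p.x << ", " << p.y << ")";
}

int main() {
    Point object{3, 4};
    std::cout << "Object contents: " << object << '\n';  // Object contents: (3, 4)
}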
 
  • Love
Likes harborsparrow
  • #39
jbunniii said:
If you're looking for up to date C++ in book form, Josuttis's C++17: The Complete Guide is good and has no competition. But it assumes you already know pre-17 C++ and just want to learn the new stuff. And it doesn't touch C++20 at all.
Lippman/Lajoie/Moo's C++ Primer (6th ed.) is apparently due Real Soon Now. At least it's available for pre-order at Amazon. Maybe this will fill the bill for an up-to-date "complete modern C++" book.
 
  • Like
  • Informative
Likes jbunniii and harborsparrow
  • #40
harborsparrow said:
And certain other features such as operator overloading might as well be, in my experience, academic toys. Because in complex real-world code, I have never seen a good reason to do anything like that

Here's a (very simple) example of when you might choose to overload a comparison operator. You might think "well, I can always take an object, and from the contents of that object derive an integer value, and then sort or select on that. For example red is 3, green is 2 and blue is 1, so red > green and green > blue." Here is a case where that does not work:

rock > scissors
scissors > paper
paper > rock
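A sketch of what such an overload could look like (the Move enum and names are just for illustration):

#include <iostream>

enum class Move { Rock, Paper, Scissors };

// The relation is cyclic, so no single integer key can encode it;
// the operator has to compare the pair directly.
bool operator>(Move a, Move b) {
    return (a == Move::Rock     && b == Move::Scissors)
        || (a == Move::Scissors && b == Move::Paper)
        || (a == Move::Paper    && b == Move::Rock);
}

int main() {
    std::cout << std::boolalpha
              << (Move::Rock  > Move::Scissors) << '\n'   // true
              << (Move::Paper > Move::Rock)     << '\n';  // true
}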
 
  • Informative
Likes harborsparrow
  • #41
Vanadium 50 said:
Here's a (very simple) example of when you might choose to overload a comparison operator. [...] Here is a case where that does not work:

rock > scissors
scissors > paper
paper > rock

Interesting example. I think there are other ways that I personally would choose to deal with it though. Because the comparison going on is not really "greater than" in the arithmetic sense, I would probably just write a function to implement that logic and call it, rather than overload an operator. But then, I am a big proponent of making things extremely obvious to readers of my code, since I had the privilege of maintaining a pile of legacy code written by others, and that sharpened up my cussing vocabulary quite a bit.
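For instance, a plain named function (hypothetical, reusing the same rock-paper-scissors rule) keeps it obvious at the call site that this is game logic, not arithmetic:

#include <iostream>

enum class Move { Rock, Paper, Scissors };

// Same cyclic rule, but a reader ten years later sees
// beats(mine, theirs) and knows immediately what is going on.
bool beats(Move a, Move b) {
    return (a == Move::Rock     && b == Move::Scissors)
        || (a == Move::Scissors && b == Move::Paper)
        || (a == Move::Paper    && b == Move::Rock);
}

int main() {
    std::cout << std::boolalpha << beats(Move::Paper, Move::Rock) << '\n';  // true
}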
 
  • #42
To be honest, I probably wouldn't code this up with an overloaded ">" either. But it's an example where the standard trick of "oh, just assign an integer to it somehow" doesn't do it.
 
  • Love
Likes harborsparrow
  • #43
harborsparrow said:
@yungman is correct. Most CS majors tend to be offered numerical programming as an option rather than a requirement, and I agree that it (as well as some assembler level programming) would be useful as a requirement instead.

But CS sees, accurately I think, that it will mainly be needed in scientific fields. Programmers can work for decades in industries such as telecom or pharma and never need to do more than very trivial numerical work. It will be, for those types, lots of database, web services, or UI work--very different from what physicists need.

@yungman is also correct that CS departments do not teach enough "close to the metal" real-time programming, which is a dark art requiring intimate knowledge of the hardware. However, EE departments tend to offer those courses, and they are usually available as options for CS majors.

This was true at the two CS departments I had contact with, anyway (Univ of Tenn and Univ of Penn).

P.S. I learned a little of the "dark art" doing microprocessor programming in telecom many decades ago. Not only did it have to be at assembler level, but RAM was so small that the code had to jump around to take advantage of every free byte in order to fit into the tiny chip. These days, memory is cheap and people think they can afford the overhead of OO and compiled languages for robotics etc. That is debatable.
The reason I keep mentioning cross learning is that the reliability and speed of ALL the new products with computers in them are going DOWN. I am talking about everything I bought in the last 2 years. I know people here think this is off topic in this forum, and it got deleted before. I am talking about a 2-year-old new car, a new, expensive 82" Samsung TV, numerous printers, new DirecTV receivers. They are SLOW, unreliable, confused. You have to restart them, unplug them all the time. Things I bought up to 5 years ago are NOTHING like that.

Obviously there is a big disconnect between hardware and software. This is going to blow up in the industry as people get fed up with it. As expected, I just bought an Epson printer that cost over $300, and it occasionally prints an old file from its cache when asked to print a new file, or prints halfway, stops, and gets confused. This is the 6th printer I have bought in the last two or three years, after Canon and HP. Now I have a Brother on the way. Don't even get me started on cars and TVs or else this post is going to get deleted. There's a HUGE difference between the latest and greatest and ones that are still considered quite new, only a few years old... and the difference is NOT in a good way.

Then I started learning C++ and saw all the fancy "elegant" ways of doing things, and I started to realize software and maybe hardware are taking on a life of their own, forgetting they are just TOOLS to get a job done. They are no more and no less than, say, the Word program... there to make people's lives easier, and then GET OUT OF THE WAY!
 
  • Like
Likes harborsparrow
  • #44
On the question of why colleges (and high schools) switched to Java, it has to do with the relative "safety" for the average programmer in using Java vs. using C++. It is difficult if not impossible in Java to cause a segmentation fault, or to crash an entire machine, or to screw up the DLLs so badly that it is necessary to reinstall the operating system. All of this IS possible with C++.

Here's an analogous situation. By the early 2000's, there were approximately 40 major and minor versions of Windows in public use. Each version of Windows (counting not only, say, Windows 2000 but Windows 2000 Professional, and then versions 1 vs 2 etc) had a slightly different set and configuration of DLL's (dynamic link libraries), and the ecosystem of Windows operating systems had become utterly unmanageable even to expert C++ programmers with years of experience. This situation was known as "DLL hell" as most of you may remember.

To solve it, someone at Microsoft took a look at what James Gosling at Sun did when he created the Java programming language. Java does not run on the machine's hardware directly; instead, it runs on a Java "virtual machine", and this "virtual machine" looks the same no matter which operating system it happens to be running on. Microsoft used this idea of a virtual machine to bootstrap itself out of DLL hell by creating something called .NET, which was a Microsoft virtual machine that would be implemented for every Windows operating system (and at least some Linuxes), and all so-called application programming would be done against the .NET virtual machine instead of against the operating system directly. This saved Microsoft's bacon. Further, Microsoft went to ECMA and got both their .NET virtual machine and its premier language, C#, standardized. Languages which never create an international standard for tools builders to work against have usually failed (think: Ruby, just for one example).

So--Java became the new programming language for kids to learn because it would run on Mac OS X and it would run on Windows, and it was a decent programming language that was fully object oriented. Both it and C# have now been extended to include everything but the kitchen sink, but you still cannot create a segmentation fault using them to save your life. Furthermore, over the years, the .NET VM and Windows compilers have become so smart and highly optimized that performance for numerical number-crunching programs is very high indeed, and it is probably well worth considering these newer and better behaved "sweeter" languages over C++ whenever it is feasible. The .NET VM is so good, in fact, that dozens and dozens of programming languages now target it, including modern versions of COBOL, the ancient language of the business world that still exists due to legacy code written back in the 1960s.

As far as I can tell, physicists, astrophysicists, the space program, and people writing actual operating systems are pretty much the only holdouts who still actively use C++ with all its problems, and that is in large part because there are a lot of legacy number-crunching programs lying around written in C++, FORTRAN and the like, not because more modern languages would not be up to the task. At least, this is my impression based on a long career in software development that spanned literally dozens of languages and reached from short stints in physics and astrophysics projects, to biology, to telecommunications, to big pharma, to web development, and eventually to teaching.

C++ on the other hand has all the portability issues that C always had. If you change hardware from 32-bit to 64-bit, the C and C++ compilers have to be doctored. But with a virtual machine underneath, not so much. Hence, Java. Java solved a million more problems than just automatic garbage collection, and BTW, automatic garbage collection also introduced some new problems, for really high performance applications or large-scale websites, just for example. You don't get anything for free in computers; there is always a trade-off.

By the way, since Microsoft tended to name everything differently than the UNIX/Linux world, it took everyone about a decade to figure out what the hell .NET actually was, and that it was a good thing. Where they came up with the name .NET I don't know, but the way they name (mis-name) things doesn't do them any favors.
 
  • #45
@yungman you are talking about the safety and reliability of the so-called "internet of things" (a dumb name), referring to all the devices now in common use that have microprocessors inside them, especially in smarthome and automotive products. It is true that these started out low quality and have big security issues. But I think it is a well recognized issue (at least in computer science circles) and that help is on the way. A lot of attention is now being put into finding out and exposing the hacking weaknesses of these products at the Black Hat conferences, and companies such as Google have entered the marketplace with a higher standard of quality than was at first being applied.

I've been watching (and using) smarthome products from the get-go. You can see just by reading reviews on Amazon (which can be pretty entertaining, by the way) how these products are slowly getting their act together and coming out with more stable and secure versions. Most of them were horribly rushed into the market with little forethought because they were desperately trying to beat out competitors, and also they were innovating and doing things never done before--and no one was giving them lots of money to do it; they had to do it on shoestring budgets.

The marketplace has not been kind to any kind of culture that would develop reliable software since the crash of the telecom industry thirty years ago in the US. Amazon, Google, Microsoft and those big corporations are the only players which had the resources to fund long-term development, but those were not the innovators in IoT or smarthome products--the big corporations followed along later and helped clean up the mess.

I am actually an admirer of Jeff Bezos, founder of Amazon, because he refused to promise his investors that Amazon would make a profit for, like, 8 years at first. And it took him almost the entire 8 years to make it profitable, and he did it by being smart. He's no angel but he has been extremely innovative in software and in business and gets less credit than he should.
 
  • #46
Hi Harborsparrow

I am not talking about smart phones and Windows stuff; for whatever reason, I don't have issues with laptops. Between me and my wife, we buy one laptop every two years max, always in the price range of about $700, which gets us a flash drive (SSD) and a latest-gen i5 or i7. We don't play video games, so we save on the graphics. Laptops are the one thing that has always been quite reliable. I always get HP; a few Lenovos did not work out very well. I don't put anything important on my smart phone, only my play email address, no bank info or anything. They can take my phone and I won't worry about anything. So those are not my issue.

My issue is all the appliances (necessities). I have a 3-year-old Samsung 65" TV; it works perfectly, and the remote is fast and consistent. Then I bought an 82" Samsung early this year, and man, that is bad. The remote is so, so slow; it has to "think" before doing anything as simple as navigating to the menu and through the menu. It is no more sophisticated than the 3-year-old one, just slow. And it gets confused sometimes and has to start all over again.

Then the car. I have a 2014 Mercedes ML; other than changing oil and such, it has only been to the shop once, about a warning signal that turned out to be tire pressure and got fixed by inflating the tires. It has voice, navigation and all the fancy stuff. We had so much luck with the ML and an old 2003 E class, so when it was time to replace the E, we went and bought a 2018 E. MAN! Is it a bomb. It spent at least a month in the shop in the first half year, ALL computer problems. You can be driving and all of a sudden you lose control of the radio and everything in the center console. I had to stop the car and put it in park; then it reset. The radio changes stations by itself, and there are all the stupid mouse pads and joysticks. They never got it fixed; they just say wait for the next software update. They never got the garage door opener to work. They even sent a specialist to our home and still couldn't make it work, but he could open my garage door with the C class he drove to work! Then about the printers: I am still playing musical printers. Then, of all things, my 2-year-old washer got confused and started blinking! Had to turn it off to reset it. Since when does a clothes washer get confused? Again, it's a high-end Maytag model.

Anything that is over 4 years old never has issues like this. We buy a lot of stuff. I don't care if they break down eventually, but these are not working right from the get-go. Yes, I blame the software/firmware. Learning C++ makes me start to understand why. Too fancy, too elegant. What gets me is that the computers are getting faster and faster, and the stuff is getting slower and slower. Must be all the calling from class to class, the aggregations and collaborations that I just learned about. Keep pushing on the stack (I mean the stack pointer, when you make a call and store the existing data). All the push and pop, and general-purpose languages that work on all platforms, causing a lot of overhead. Imagine you try to make a 3-point U-turn in a busy street, you move the shifter, and the transmission has to think, "are you sure you want to do this?" before it switches. AND YES, I am talking about my 2018 E! It makes my blood boil just talking about all this. Yes, I blame the software. They told me as much.

I am not saying hardware is never at fault. But hardware is really dumb, only a few control bits, say a DAC and an ADC. Those either work or they fail; it's not intermittent, the failure is very noticeable, and you don't have to keep power-cycling to reset them. Hardware is dumb like a rock.
 
  • Love
Likes harborsparrow
  • #47
The software in automobiles now is just plain scary sometimes. And self-driving cars are coming soon. Think about that!
 
  • Like
Likes yungman
  • #48
harborsparrow said:
The software in automobiles now is just plain scary sometimes. And self-driving cars are coming soon. Think about that!
Do you know I had to special-order the car just to get the lowest level of equipment. I don't have intelligent braking, no driving assist. Still it's like that. The funny thing is they advertise that the new ML can "hop" out of a pothole. Imagine when that goes south: you could be driving down the road hopping like a low rider! The 2014 is not any inferior in high tech; it's just more reliable.

No, I will NEVER buy a self-driving car. If I get too old, I'll take Uber. Just look at the Boeing 737 MAX.

I am seriously thinking about a Tesla next. I don't drive that much, particularly during this virus, and a stupid gas car dies if you don't drive it or only drive short distances. I have to make up a reason to take a longer drive every other week to keep them alive. An electric car doesn't have that problem... BUT then, a Tesla??! Another gadgety one I don't need.
 
  • Haha
Likes harborsparrow
  • #49
Don't mistake me for anti-technology. I have a lot of patience with computers; that's understandable. I just don't have patience with "appliances"; cars, printers, TVs etc. are appliances to me, meant to make my life easier, not more challenging. I PAID for them to make my life easier.
 
  • Love
Likes harborsparrow
  • #50
Well--@yungman--you should have read the report in the 1980's when several of the Bell Labs engineers had to rush to testify before the U.S. Congress to try and get them NOT to build lasers that could shoot things from space. Bell Labs had officially lobbied for the project because the company stood to get a lot of money out of it, but the engineers rebelled and essentially told Congress: "Are you effing out of your MINDS?" And they cited all the rockets going off course and blowing up in the early days of space travel due to programming errors, and all the ongoing failures in software, and the absolute dumbness it would take to trust software not to shoot at the wrong thing, etc. But funding finally did that project in, not wisdom. Thank goodness it never happened.
 
  • Like
Likes yungman
  • #51
And in fact, anti-missile software did mistakenly shoot down an Iranian passenger airliner once, and the government tried to deny it at first but had to admit it had happened. Tragic. And it could easily have started a war.
 
  • #52
harborsparrow said:
Well--@yungman--you should have read the report in the 1980's when several of the Bell Labs engineers had to rush to testify before the U.S. Congress to try and get them NOT to build lasers that could shoot things from space. [...]
I actually don't mind that. Like I said, I don't mind problems with computers and Windows, as they have to change and keep up with the times, avoid viruses, hacking and all that. That, to me, is understandable. But I don't want my appliances to be like that; appliances are not for adventure, for me, especially when they're not getting better than they were 4 years ago. Why could they do it right up until 4 years ago, and then it all went to hell? Something is just not right. My 65" Samsung is a smart TV; it's not like it's primitive. My 2014 SUV has all the smarts of the 2018, just fewer mouse pads to screw things up accidentally. As far as I'm concerned, the newer printers don't do anything more than the 4-year-old ones; they all do duplex printing, scanning and copying. So where's the improvement? Getting slower, to train your patience? That's what bugs me.

You always have bugs when things first come out; it takes time to perfect a product. The problem is the life cycle is so short you don't have time to fix anything, and instead you hope the next model will fix the old problems... and then it introduces a new set of problems.
 
  • Like
Likes harborsparrow
  • #53
harborsparrow said:
And certain other features such as operator overloading might as well be, in my experience, academic toys. Because in complex real-world code, I have never seen a good reason to do anything like that, and I would in fact advise against using things like operator overloading (without a strong compelling reason, which I did not encounter--one may exist though) because--think of someone coming along ten years later, trying to fix a big hunk of unknown code, and not knowing immediately that the operator has been overloaded.
Some standard containers, such as std::map and std::set, require that the type stored in the container be comparable using "<". If you want to use these containers with your own types, that's one obvious situation where operator overloading is needed. It's certainly not just an academic toy; it's commonly used in every large C++ code base that I've worked on, most of which are commercially shipped products running in hundreds of millions of phones.
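For example, a minimal sketch with a hypothetical Employee type:

#include <iostream>
#include <set>
#include <string>

struct Employee {
    std::string name;
    int id;
};

// std::set needs a strict weak ordering; overloading operator<
// lets Employee go straight into the container.
bool operator<(const Employee& a, const Employee& b) {
    return a.id < b.id;
}

int main() {
    std::set<Employee> staff{{"Ada", 2}, {"Grace", 1}};
    for (const auto& e : staff)
        std::cout << e.id << " " << e.name << '\n';  // sorted by id: 1 Grace, 2 Ada
}

(You could also pass a separate comparator type to the container instead of overloading the operator; either way, the container needs some "<" to order by.)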
 
  • Informative
Likes harborsparrow
  • #54
My background is odd, unlikely. Maybe everyone's is. I learned C (and started learning C++) because I got a job in 1984 at Bell Labs and they told me to. Before that, I had done a little Fortran, Snobol, Pascal, and some assembler. I had designed a little hardware and a bit-slice microprocessor. I was interested in technology and had repaired telephone switchboards for a couple of years (my first exposure to a "computer" actually), I had studied radio engineering, and I had worked on the software and hardware for the very first Positron Emission Tomography (PET) scanner ever built, but mostly I had been living in poverty. I was lucky enough to claw through public college with a combination of scholarships, work study, and assistantships, working side jobs in factories and restaurants, borrowing textbooks, and not owning a car. I was motivated by fear of being 40 years old if I did not find a profession that would earn a living wage.

I could easily have ended up homeless, and I still feel enormous compassion for people who do.

At Bell Labs, I knew I had been hired to fill a quota (being female), not because they thought I would be the best choice. But I found enough generous mentoring to become as good as I needed to be. I would say that there, I soon became slightly above average in performance, not a star, and it was definitely the school of hard knocks. You don't even want to know some of the gender discrimination that I encountered, and it didn't matter all that much because I had been raised among boys and I could take it. I did take it and survived, if not thrived.

I typically spent at least half my waking hours learning new stuff, and I didn't always do exactly as expected in small things, but I usually got the needed results on time, if in some unorthodox way. When I did fail at something, I agonized, because I often could not easily tell whether I had truly failed through stupidity or whether the deck had been stacked against me. Eventually, I got mad enough about that to accept that it was impossible to know for sure.

For years, I have kept a daily log of what I got done. I used to keep it in a notebook, and now I keep it in a wiki. I look at this log when I am discouraged, to give myself the courage to go on. An old telephone repairman gave me the advice to keep this log. He said it would be useful when management tried to blame me for "dang", as he called it. It has served me well, and I'll pass that along to any young people who ever read this. The log consists of one single line describing something I completed. Okay, maybe, occasionally 2 lines.

I lasted a dozen years at Bell Labs as it slowly downsized. Two years of microprocessor programming for consumer products (some 4-bit chip). Two years of C programming for SONET systems on Motorola 68000 controllers. I learned to make a PC do polling instead of interrupts, so that I could guarantee nothing would ever block. I learned weird realtime tricks like software debouncing for a shared memory item. They tried to teach me C++ but I hated it. After that, I got shunted into writing requirements for transmission and network operations systems. As @yungman noted, the "software" profession requires one to constantly learn the latest new thing, and it was always one huge crap shoot to see if I had chosen the right new language to learn to keep myself employed. I embraced the WWW. I liked Microsoft and bought and studied PCs although Bell Labs officially hated them. I made enemies there by solving problems quickly on Windows that were going to take two years to solve using UNIX. I left Bell Labs after a dozen years and became an independent consultant. I negotiated my own contracts, worked short-term jobs for high pay, and kept moving here and there. I worked in big pharma until 9-11 shook the world up. Jobs in software got very lean in the early 2000's due in part to outsourcing, and I became a college professor (non-tenured) and that lasted for about 10 years. I taught computer architecture to grad students, Java programming (had to learn it myself first, and quickly), and software engineering. I learned Ruby and various other programming languages so that I could teach them. After that, I worked for biologists where I was the only programmer, sysadmin, web developer, chief cook and bottle washer. I became good at database design and programming (mentored by biologists, who are fearless about databases and statistics). The pay was low but the work was interesting and my boss sometimes actually said thank you. Around 2010, I refused to commute any more and started working remotely from home, which I loved. My boss snuck around and allowed it because he didn't want to lose me. So the pandemic hasn't bothered me as much as it has some people; my home office was already in good shape.

My formal training was to program close to the metal--and I liked it and was good at it, @yungman--but I left it behind because the level of gender harassment in that field was just not worth putting up with. Over the years, I kept clawing my way back to actually writing code. I have worked extensively with both Linux and Windows, whereas most programmers only ever learn one. This became an advantage. I would string things together to make something happen that could only be done using four different technologies at once, or I would get something done by some unorthodox approach that no one else thought would work.

I stayed employed, but I had to take what was available to me, not particularly what I wanted. Many good technologists left the field during those bad economic downturns. I only know maybe two women my age who survived in the tech trenches throughout their entire lives. I am proud to have survived. I grew to like the work and be good at it, but unfortunately, most of the people who are in my day to day life have no idea what I have accomplished. They look down on me because I am not a good quilter or a good whatever that women are expected to be good at, and they don't see or appreciate my accomplishments. That frustrates me sometimes.

So that's my war story.

I've enjoyed reading your war stories. Keep them coming.
 
  • Like
  • Love
Likes yungman, Mark44 and Jarvis323
  • #55
jbunniii said:
Some standard containers, such as std::map and std::set, require that the type stored in the container be comparable using "<". If you want to use these containers with your own types, that's one obvious situation where operator overloading is needed. It's certainly not just an academic toy; it's commonly used in every large C++ code base that I've worked on, most of which are commercially shipped products running in hundreds of millions of phones.

Good to know! Since I never properly learned C++, I hadn't come across this before.
 
  • #56
harborsparrow said:
My background is odd, unlikely. Maybe everyone's is. I learned C (and started learning C++) because I got a job in 1984 at Bell Labs and they told me to. [...]
You should write a book, or at least a Medium article or something.
 
  • Like
  • Wow
Likes yungman and harborsparrow
  • #57
Old timer here. Started programming Fortran IV and assembly and some APL back in 1968 (in high school). Got my first job in 1973, mostly assembly on a multi-computer | multi-tasking database server (using six HP 2100 mini-computers), with some Fortran used for offline processing. In 1975 and later, it was still mostly assembly for work, while using Fortran and APL on an IBM 370 at a local community college where I was helping students. I didn't learn C until 1985 when I got an Atari ST. My first experience with C++ started with Programming Windows 3.1 by Charles Petzold. An interesting thing is how little has changed in the basic Windows API since then. My career mostly involved operating systems for mini-computers and later embedded systems, mostly C with some assembly, so my experience with C++ was for personal use or to help others. I'm still using versions of Visual Studio (2005 to 2019) for C, C++, and C# with Windows (XP, 7, 10).

- - - other languages - - -

I use NetBeans / Java to help people asking questions in a few forums. I don't like Java's lack of functionality for native types, or worse yet, its native linked list: native linked list iterators, and no way to move nodes within or between lists (no equivalent to C++ std::list::splice()). I also use Python to help at some forums. My main complaint is how slow it is. Unless 80% or so of a Python program uses libraries built with compiled languages, I don't see much practical use for it.
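For reference, here is the std::list::splice() behavior being described, a minimal sketch:

#include <iostream>
#include <iterator>
#include <list>

int main() {
    std::list<int> a{1, 2, 3};
    std::list<int> b{10, 20};

    // Move (not copy) the node holding 20 to the front of b, then move
    // all of b to the front of a. No elements are allocated or freed;
    // the list nodes are relinked, and iterators to moved nodes stay valid.
    b.splice(b.begin(), b, std::next(b.begin()));
    a.splice(a.begin(), b);  // b is empty afterwards

    for (int x : a) std::cout << x << ' ';  // 20 10 1 2 3
    std::cout << '\n';
}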

Some of my hardware type co-workers use Visual Basic, since it has a drag and drop interface for creating user interfaces, such as real time graphs, bar charts, general information displays, used with code they added for monitoring hardware during development. Other co-workers used Matlab for math problems, which would be converted to C if it was going to be used in an embedded device.
 
  • Like
Likes harborsparrow and Mark44
  • #58
harborsparrow said:
My background is odd, unlikely. Maybe everyone's is. I learned C (and started learning C++) because I got a job in 1984 at Bell Labs and they told me to. [...]
I don't even want to pretend I can understand what you went through because I'm a male, but I went through a little bit of something similar as a Chinese; back in the 70's and 80's, I did feel some discrimination. But my burning love for electronics just couldn't let that hinder me. Whatever they told me to do, I did more. Like when I was a test tech, I was supposed to only test the boards, but I started writing tiny assembly programs to test the hardware. They recognized that and gave me more. That's how I kept moving up. In 91, I wanted to become the manager of EE. I felt there was some discrimination; I had done a good job in designing, but had never been a manager. The discussion got quite bad and we had to go to the personnel office to resolve the issue. I asked them, what do I have to do to become the manager? They told me what they wanted to see me doing. I did it. I got the promotion. One thing in life I have learned: you don't get what you deserve from others, for different reasons, and not all of them are discrimination; it can be laziness on their part, not wanting to rock the boat. I always work hard, but I DEMAND recognition... in the form of raises and promotions. I don't sit there waiting for them to reward me. Other than being promoted to EE in 1980, I asked for every raise and every promotion.

I hope you don't get offended; my point is, do a good job, then speak up and demand what you deserve. Don't wait for them to recognize you, and don't spend any time thinking about gender discrimination. My wife is the same way as me. She worked for the federal government in AFDC and Childcare; she only had a HS education, and she moved her way up to grade 12 through hard work in AFDC. She saw an opening in Childcare that was a grade 13. That was around the time I was fighting to get the manager of EE position. We talked, and I encouraged her to go for it and talk to the regional administrator. She did. It's funny: I cheered her on and she cheered me on. She got some resistance also, but she really spoke up. Finally they put her in a temporary position for 3 months, saying that she was going to go back to AFDC, that it was only temporary. She worked her heart out. She got to stay, became permanent, and was promoted to grade 13. That's the highest grade without being a manager; usually those people have master's degrees. Point is, ASK; all they can say is no! I am surprised, though: we have women programmers and nobody makes anything of it, we just work together and get the job done. It's engineering that has fewer women, and I blame it on the old generation that kept giving dolls to the girls and teaching them how to cook and sew! Read the next paragraph.

When it comes to gender, I keep encouraging my little girl (granddaughter) to learn science, math and CS. I want her to grow up with all of that instead of cooking, sewing and all that. My wife is old school, and my little girl's parents are not that educated. They all keep concentrating on English, writing and all that. I am the only one who keeps talking about science, engineering, CS. I even bought an electronics kit for kids and played with her like 2 years back, taught her the AND, NAND, OR, NOR truth tables. I am willing to learn Python if she has an interest in it now, so we can actually play together. I believe women can do everything a man can do. It's just that the old generation typecast them, automatically giving girls a doll and teaching them cooking and sewing. The hell with that. I want my little girl to be strong, knowledgeable. I taught her how to use tools like screwdrivers, pliers, socket wrenches and all when she was like 8 or 9! Just a little exposure at a time.

Hell, I even want her to be strong physically. We play at sparring (no-contact punching and kicking) on and off, to get her used to blocking, ducking and punching. Her stupid mother made her waste so much time learning ballet, and she just didn't like it. Finally my little girl gathered the courage (through my moral support) to tell mommy she didn't want to take it anymore! I tried to encourage her to take martial arts. I don't want any man to kick her around. Hell, nobody kicks my little girl around! That, I am still working on. I have been "brainwashing" her by talking about how important it is to study math, that she has to take at least two semesters of calculus even if she ends up in business school. BUT I told her, without any doubt: don't listen to what I said; choose the career she wants. I just want her to have all the tools so she can be ready for anything. She is getting straight A's and was awarded student of the year twice in the last two years. I am so proud of her. She's coming over today to stay with us for two weeks. I am so stoked. She stays with us on average like 6 weeks a year.

As for me, I always joke (seriously, too) that my wife is the big boss and I only work and get an allowance! It is quite true. We have a rental business and she runs it. She is in charge of our finances, taxes, checking and all. I am lucky; I'd be in big trouble without her. I am all for women being strong. Hey, I benefit from her being strong; the hell with being chauvinistic. (My English is so bad, I had to ask her how to spell chauvinistic!)

Funny you mentioned bit-slice. I so wanted to learn that at the time, but I didn't get the chance, and then they were gone pretty fast.
 
  • #59
yungman said:
I hope you don’t get offended, my point is do a good job, then speak up and demand what you deserve,

Ha ha @yungman. I had a good boss for a span of 4 years, and I asked him one year to be promoted, and he told me (and he always told me the absolute truth as far as I could tell): "Sorry, I can't get you promoted, but it has nothing to do with you. The last time the managers had a promotion meeting, I vetoed the promotion of (so-and-so), and his supervisor was angry at me and told me he would never support any of my candidates for promotion again." And I knew the so-and-so who my boss had refused to promote, and I understood that it was important that he had vetoed the promotion of that particular person. And so I just laughed and said, "Okay". And that was the only time I tried to get promoted within that corporation. The rest of the time, I knew there was no hope for various reasons. I really didn't want to be a manager there anyway, because they were mostly snakes.

Further, because I never had formal power, I developed a number of sideways ways of exercising power, influencing people, delegating work to others when I was overloaded, and working around people who were roadblocks. Over time, I think I became reasonably facile at all that, and those skills were more important to me personally than being the boss.

The ONE thing that gender really, truly cost me was that I was never paid equally. Bell Labs forbade people to tell their salaries, but I would always get some of the men to tell me anyway on the sneak, and I was often paid about 20 to 30% less than my peers even though, in several cases, I'm pretty sure I was far more productive and helpful than they were.

When I was a consultant, I asked for what I was worth. It was only a few years, but I proved to myself that it was possible to be paid what I was worth. It required a lot of hard-nosed, cold negotiation at the time, and it required walking away from some opportunities that looked like I would really enjoy them, but if they wouldn't pay, I decided I would just go to the beach. Best decision I ever made. Maybe I'm delusional, but I made sure that I was always worth what I was paid. That way, I could sleep well at night.

The next best decision I ever made was to work for less money near the end of my career--for the biologist--because it was peaceful and less cut-throat and the work was interesting as all hell and I had a good boss. Money is not everything.

I also once turned down a very high paying job because it was making missiles. They didn't call them missiles, they called them something odd like "autonomous flying vehicles", but it was before drones, and they showed a film and it was ballistic missiles, and I didn't want to be part of that. I knew other people were going to do it, but I just didn't want it on my conscience to have worked on anything that I knew would be used to kill.
 
Last edited:
  • #60
Vanadium 50 said:
Seriously, multiple inheritance is one of those ideas that looks great on paper, but not so good in real life. The advantages tend to be small or speculative, [...]
I'm surprised to find a point on which I totally disagree with you (since this is quite rare).

In a prehistoric age, I was working on a project when only single inheritance was available and I felt quite restricted and frustrated. As soon as I got a beta copy of the AT&T C++ compiler with MI implemented, I felt extremely liberated.

but the issues - including the Diamond of Death (just like on ski slopes, the diamond should serve as a warning) - tend to be very real.
The C++ design principle that "ambiguities are illegal" solves (imho) the so-called Diamond of Death problem. If there is an ambiguity as to which parent class method should be called, the user must qualify explicitly. I never had any problem with that.
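A minimal sketch of the "qualify explicitly" rule, with hypothetical class names:

#include <iostream>

struct Logger { void report() { std::cout << "log\n"; } };
struct Alarm  { void report() { std::cout << "alarm\n"; } };

struct Monitor : Logger, Alarm {};  // multiple inheritance

int main() {
    Monitor m;
    // m.report();        // error: ambiguous -- Logger::report or Alarm::report?
    m.Logger::report();   // fine: the parent class is named explicitly
    m.Alarm::report();
}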
 
  • Informative
Likes harborsparrow
  • #61
harborsparrow said:
I remember arguing, years ago at Bell Labs, with a designer who wanted to leave in a design that could result in a race condition. "But it will never happen in a million years," was his reply. I considered the guy a total idiot.
Ha! I've had that argument too. Fortunately, I was senior enough that pointing out that the code would run in the real world millions of times more often than in tests was enough to make sense prevail. That, and coming up with an alternate design that avoided the race.

Btw, this reminds me of Brandolini's law, which I only recently learned about. :headbang:
 
  • Love
Likes harborsparrow
  • #62
@strangerep thanks for pointing me to Brandolini's law!
 
  • #63
yungman said:
...I don't drive that much, particularly during this virus, and a stupid gas car dies if you don't drive it or only drive short distances. I have to make up a reason to take a longer drive every other week to keep them alive.

@yungman we are having exactly the same problem with a 2015 Honda Fit. It dies if not run maybe an hour every week, but the 16-year-old Toyota Camry can sit in the driveway unstarted for 2 months and the battery doesn't run down. I am convinced that it is crappy firmware design somewhere in the car, draining the battery while it's just sitting there. Dumb. Bad. It ought to be able to go longer than a week!
 
  • #64
harborsparrow said:
@yungman we are having exactly the same problem with a 2015 Honda Fit. It dies if not run maybe an hour every week, but the 16-year-old Toyota Camry can sit in the driveway unstarted for 2 months and the battery doesn't run down. I am convinced that it is crappy firmware design somewhere in the car, draining the battery while it's just sitting there. Dumb. Bad. It ought to be able to go longer than a week!
Possibly this is due to the parasitic current draw of security sensors, which might not be present in your Toyota Camry. It would be interesting to see how large a current draw there is for a car just sitting idle. There might also be sensors that unlock the doors when you come close or press a button on a fob.

I have five motorcycles that I don't ride much this time of year. I have trickle chargers on four of them to keep the batteries charged up. I just bought the fifth one, and it's mostly a pile of parts that I have to put together -- it doesn't even have a battery yet. I also have a trickle charger on a generator, to make sure it will start when (not if) the power goes out.
 
  • Informative
Likes harborsparrow
  • #65
harborsparrow said:
@yungman we are having exactly the same problem with a 2015 Honda Fit. It dies if not run maybe an hour every week. [...]
Check the battery. How old is the battery? I assume it's 5 years old and originally came with the car. I've seen this symptom before, where you have to start and drive the car to charge up the battery like this.

Like Mark said, check for leakage; if that's OK, chances are it's the battery. It has happened to me twice before. 5 years is a little short, but I would say at 7 to 8 years it's about time to change it anyway.

When I said gas cars need to be driven, I was referring more to the engine needing to run: the oil needs to circulate, and the fuel injectors need to cycle gas to keep them clean and remove water condensation, etc.
 
  • #66
harborsparrow said:
@yungman we are having exactly the same problem with a 2015 Honda Fit. It dies if not run maybe an hour every week. [...]
I forgot to mention: you said you commute a long distance to work; is that in the Fit? Back in the day I commuted 80 miles a day in my Ford Fiesta; I put 100K miles on it in like 3 years, and the battery died with the exact same symptom in less than 4 years. I did not know that at the time. One time I went back to Hong Kong for 2 weeks, and I had to have my friend start the car and let it idle every other day. That was stupid. I used to work on cars a lot, but I don't anymore. I wonder, do they have a battery tester or something? Ask the gas station. I absolutely haven't touched cars in years now; hell, one time I needed to open the hood and I didn't even know where the release was!
 
  • #67
@yungman the battery was replaced in May of this year, and the car has been driven very little since then.
 
  • #68
harborsparrow said:
@yungman the battery was replaced in May of this year, and the car has been driven very little since then.
Then you must have a leak somewhere. What was the reason you replaced the battery in the first place?
 
  • #69
harborsparrow said:
@yungman the battery was replaced in May of this year, and the car has been driven very little since then.
That's likely the reason the battery has gone dead. My house is about 5 miles from town, and I have three cars at the moment. I usually take each of them out for at least a trip to town about once a week.
 
  • #70
I don't know of any reason the battery should go dead from not driving that often; that's more of a gas engine issue. That's the reason I am seriously considering an all-electric car for my next one: I wouldn't have to drive it periodically to keep it alive. We have put low mileage on our cars since retiring 15 years ago, and our batteries never die even though we don't drive that much. My neighbor across the street has a few expensive cars in the garage; he seldom drives them, and I have never heard that he has to charge the batteries. They can sit there for months if nothing is draining them.

I think in her case it's either some abnormal leakage or a defective new battery. Since she bought the battery in May, take it back to the place that changed the battery and have them test it; I am sure they have a battery tester. That will eliminate half the possibilities.
 