Computer languages tend to be transient

In summary:
  • Java (tried at work but didn't stick around)
  • Swift (heard good things)
  • Scala (heard good things)
Languages come and go, but COBOL is still around and will likely continue to be for some time.
  • #36
Unoptimized compiled code dutifully replicates transient variable usage designated by the programmer and common optimization practices tend to introduce further usage :cry:
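To make the joke concrete, here is a minimal C sketch (assuming a typical compiler such as gcc or clang): built with no optimization, each named local usually gets its own stack slot with explicit stores and reloads, exactly as written, while at higher optimization levels the named temporaries often disappear into registers and the optimizer introduces short-lived temporaries of its own that never appeared in the source. Comparing `gcc -S -O0` and `gcc -S -O2` output for this file typically shows the contrast.

```c
/* Minimal illustration of "transient" locals in compiled code.
 * Unoptimized builds typically keep each named local in its own stack
 * slot and store/reload it exactly as written; optimized builds often
 * fold the named temporaries away while adding compiler-generated
 * temporaries (induction variables, spills) of their own. */
#include <stdio.h>

static double average(const double *xs, int n)
{
    double sum = 0.0;          /* programmer-designated transient */
    for (int i = 0; i < n; i++) {
        double x = xs[i];      /* another transient, likely folded away when optimizing */
        sum += x;
    }
    return n > 0 ? sum / n : 0.0;
}

int main(void)
{
    double data[] = {1.0, 2.0, 3.0, 4.0};
    printf("avg = %f\n", average(data, 4));
    return 0;
}
```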
 
  • #37
Like nearly everything else, computer languages evolve, each having its own ecological niche.
 
  • #38
Keith_McClary said:
Perhaps filling out these was "coding"?
View attachment 269464

Ha, used to fill these out at High School in Computer Studies, send them off to the local Polytechnic to get punched and run, wait a week for the errors to come back and repeat the process.
 
  • #39
Hardware and computer languages can change so fast; even hardware like the CPU and peripheral chips keeps on changing. They are not hard to learn to design and program, but they keep changing and changing. You can be an expert one day, and the next day you are obsolete. Sure, it's not hard to learn new ones, but you have to keep learning and learning. It's OK when you are young: you are excited and your brain is working great. How about when you get to 50+? Your brain slows down, you have a family to think about, you get tired... and those young kids come and kick your butt!

I started out in 1979 doing test programming on the Z80, then got into designing CPU controller stuff. By 1983, I started to see the trend. I was aggressive and hard working; I could learn and stay on top. But I was thinking: what if one day, when I get old, I am not as excited about learning? I didn't want to keep learning and learning and keep having a horse race with the young and enthusiastic people. So I changed to analog RF design. Analog and RF are very hard to learn, with a lot of overhead, BUT once you learn them, they really don't become obsolete. I switched starting in 1984 and never looked back.
 
  • #40
It would be foolish to become the world's expert on Just One Chip, but I can't help but point out the Z80 is still around. They run toasters and not personal computers these days, but they are easy enough to come by. I could have one on my desk tomorrow. They start at around $5.
 
  • #41
Vanadium 50 said:
It would be foolish to become the world's expert on Just One Chip, but I can't help but point out the Z80 is still around. They run toasters and not personal computers these days, but they are easy enough to come by. I could have one on my desk tomorrow. They start at around $5.
God, I LOVED the Z80 when it came out. I thought it was an Intel killer but that was not to be. Still, it was in really widespread use for many years and as you point out, it's still around.
 
  • #42
I was lucky: in 1968, my high school had an IBM 1130 computer, and on Saturdays we could go to the IBM data center to run programs on an IBM 360 model 30, mostly Fortran, some assembly, and a little bit of Cobol. In 1969/1970, I also spent some time with a CDC 3150. In college, most of my programming was done on an IBM 370, and also a Wang 720 with printer.

My first job in 1973 was with a multi-tasking, multi-system online database running on HP 2100 mini computers (with ten 11-platter 80 MB hard drives, each one bigger than an HP 2100 mini), where we used paper tape (via teletype) to enter or batch edit source code kept on 5 MB removable hard disks (think 14 inch diameter hard-case floppy), mostly assembly, with some Fortran used for offline processing. I also did some APL programming, starting in high school, but mostly during the 1970's at a college computer center where I was tutoring.

Most of my early work was on mini computers, but I did a few jobs using the Z80 and also the Z8 (256 registers). I didn't start C programming until 1985, when I got an Atari ST. I also did some 6502 programming for the Atari 8-bit series of "computers". Most of my PC programs are C / C++ or assembly. My last couple of jobs before I retired involved ARM V4. I'm not a fan of Java (issues with the language) or of Python's very slow speed, but I help out others at programming forums with Java and/or Python questions.
 
  • #43
hmmm27 said:
My bad; my brain tends to drop bits these days.

My point was that you actually didn't have to be able to type in order to be a programmer.
And you never had to (IBM 402 Accounting Machine - 1940's):
IBM402plugboard.Shrigley.wireside.jpg
 
  • #44
.Scott said:
And you never had to (IBM 402 Accounting Machine - 1940's):
View attachment 271881
1940's? I was using something like that in the 1980's (I think). Walked in for a temp job (I don't recall what, pink collar something), saw a couple of IBM card sorters, and just had to play with them. Repatched a few of the boards to work with the control dials up top (so the operator didn't have to remove and repatch a board for each level of sorting, 5 or 6 times per run), and demonstrated the new and improved method.

And, they said "oh, that's nice"... and immediately went right back to manually repatching everything.

Government, of course.
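For anyone who never used a card sorter: each pass through the machine orders the deck on a single column, so sorting on a multi-column key means one stable pass per column, from the least significant column to the most significant one, which is why a run took 5 or 6 passes. A rough C sketch of that idea, with five-character "cards" invented purely for illustration:

```c
/* Sketch of why multi-column card sorting needs several passes: the
 * sorter orders the deck on ONE column per pass, and a stable pass on
 * each key column, least significant first, leaves the deck fully
 * sorted (the same idea as LSD radix sort).  The "cards" here are
 * invented five-character records, purely for illustration. */
#include <stdio.h>
#include <string.h>

#define NCARDS 6
#define WIDTH  5

/* One stable pass: regroup the deck by the character in one column,
 * keeping the existing order within each "pocket". */
static void sort_on_column(char cards[NCARDS][WIDTH + 1], int col)
{
    char out[NCARDS][WIDTH + 1];
    int k = 0;
    for (int pocket = 0; pocket < 256; pocket++)
        for (int i = 0; i < NCARDS; i++)
            if ((unsigned char)cards[i][col] == pocket)
                strcpy(out[k++], cards[i]);
    memcpy(cards, out, sizeof out);
}

int main(void)
{
    char cards[NCARDS][WIDTH + 1] = {
        "31415", "27182", "16180", "14142", "17320", "22360",
    };

    /* One pass per key column, least significant column first. */
    for (int col = WIDTH - 1; col >= 0; col--)
        sort_on_column(cards, col);

    for (int i = 0; i < NCARDS; i++)
        puts(cards[i]);          /* deck now in ascending key order */
    return 0;
}
```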
 
  • #45
hmmm27 said:
1940's? I was using something like that in the 1980's (I think). Walked in for a temp job (I don't recall what, pink collar something), saw a couple of IBM card sorters, and just had to play with them. Repatched a few of the boards to work with the control dials up top (so the operator didn't have to remove and repatch a board for each level of sorting, 5 or 6 times per run), and demonstrated the new and improved method.
I was working on this equipment in 1969. The high-speed card sorters were fun, but the 402 was more "programmable". You could examine card columns and use them as criteria for accumulating values that appeared in other columns, or for determining how to print and where to print it.
I used it to prepare the 3-column-per-page absentee list for Lowell High School.
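For readers who never saw a plugboard, here is a rough modern-language analogy (in C) of that "examine a column, accumulate values from other columns, decide what and where to print" behavior: each input "card" is a fixed-width record, one field acts as the control column, another as the amount to accumulate, and a total line prints on each control break. The record layout is invented purely for illustration.

```c
/* Rough modern analogy to the 402's plugboard logic described above:
 * one card column is a control field, another is a numeric value to
 * accumulate, and a total line is printed whenever the control field
 * changes.  The fixed-width record layout is invented for illustration. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    /* Hypothetical "cards": columns 1-3 = department code, 4-9 = amount. */
    const char *cards[] = {
        "A01000125", "A01000300", "B07000050", "B07000200", "B07000075",
    };
    const int ncards = sizeof cards / sizeof cards[0];

    char current[4] = "";
    long subtotal = 0;

    for (int i = 0; i < ncards; i++) {
        char dept[4];
        char amt[7];
        memcpy(dept, cards[i], 3);      dept[3] = '\0';
        memcpy(amt, cards[i] + 3, 6);   amt[6] = '\0';

        /* "Examine a card column": on a control break, print the subtotal. */
        if (current[0] != '\0' && strcmp(dept, current) != 0) {
            printf("%-3s TOTAL %8ld\n", current, subtotal);
            subtotal = 0;
        }
        strcpy(current, dept);

        /* "Accumulate values from other columns" and print the detail line. */
        long value = strtol(amt, NULL, 10);
        subtotal += value;
        printf("%-3s        %8ld\n", dept, value);
    }
    if (current[0] != '\0')
        printf("%-3s TOTAL %8ld\n", current, subtotal);

    return 0;
}
```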
 