Up to Date on "The Second Coming"

In summary, the conversation discusses the evolution of computing and the frustration of not being able to understand the inner workings of modern computers. The participants mention articles on prophecies about the future of computing, as well as their personal opinions on the subject. They also touch on the challenge of understanding assembly language and the importance of computer science. Some mention the difficulty of understanding modern technology, while others argue that it is possible with time and effort.
  • #1
rudinreader
Up to Date on "The Second Coming"

Just reading some old (1990, 2000) articles concerning prophecies about the future of computing and the internet, etc.

The Second Coming Manifesto:
http://www.flatrock.org.nz/topics/info_and_tech/second_coming.htm

The Computer for the 21st Century:
http://www.ubiq.com/hypertext/weiser/SciAmDraft3.html

It's actually interesting to look at Weiser's home page, because the picture reads: "I have no idea what the future will be". I gather this was his way of admitting his ideas turned out pretty wrong.

My personal take is pretty much the opposite of both of these guys, but still, I could be dead wrong. I personally am pretty annoyed (or nostalgic) about the early days of the IBM PC, up until Windows 3.1. In those days it was possible to understand how the entire operating system and hardware worked (if only they gave you enough time to study it before changing the technology in the next release!), and you could also understand (via C/C++, etc.) exactly how the program you wrote would be executed.

Certainly there were lots of problems and inefficiencies back then. If your program crashed (especially in DOS), you would often have to restart your computer. Another big inefficiency was the lack of rapid application development. Unfortunately, when RAD came along, we simultaneously lost the ability to know what our programs looked like at the assembly level.

All in all, it's clear to me that the original computer actually had something in it of real interest to computer science. The modern computer only simulates computer science, in that there is no way in hell you can understand how the code is processed in the machine.

To make a long story short, I hope that technologies like this one will play a bigger role: http://www.grifo.com/SOFT/uk_bas52.htm. Or I would also be happy if Linux played a role in making interesting computer science more accessible to nonspecialists. I have "glanced" through the "Linux Device Drivers" book, and the fact that that is the starting point seems to me to be kind of a problem. "Embedded Linux" seems to me to have more hope than Linux on Intel CPUs. Ideally, in the future you would not have to be a specialist to study Linux on CPUs simple enough to understand.

Anyone know of other recent prophecies on the future of computing, or have any ideas of your own? Please post.
 
  • #2
It's called the "law of leaky abstractions" - in Java/C#/Ruby you don't have to know what the bytes are doing, but if you do something wrong you have no hope of figuring out what is really happening.
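For instance (a small Python sketch added for illustration, not part of the original post), two places where a comfortable abstraction leaks are floating-point arithmetic and string concatenation:

Code:
# Two small illustrations of the "leaky abstraction" idea in a high-level language.

# 1. Floats look like real numbers until the underlying IEEE-754 binary format leaks.
a = 0.1 + 0.2
print(a == 0.3)        # False
print(f"{a:.20f}")     # 0.30000000000000004441

# 2. Repeated string concatenation looks cheap per step, but each += may copy the
#    whole accumulated string; the hidden cost model is the leak.
chunks = ["x"] * 10_000
slow = ""
for c in chunks:
    slow += c              # may copy everything built so far
fast = "".join(chunks)     # the idiomatic, genuinely linear approach
print(slow == fast)        # True: same result, very different machinery underneath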

One prediction that is almost becoming true is:
"In ten years, computers will just be bumps in cables." (Gordon Bell, 1990)

We don't think of a phone/iPod/etc. as a computer, but we no longer go to a big beige box on a desk to play music or send a message.
 
  • #3
Computer science is primarily the study of data structures and algorithms, not assembly language code. In fact, most of what you describe (deep knowledge of the OS and hardware) is actually computer engineering, not computer science.

In fact, I could just as well argue that very high-level languages like Python permit users to work with "pure" CS concepts in a way never before possible in syntactically challenging, feature-poor languages like C++.
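As a quick illustration of that point (a hypothetical sketch, not from the original post), here is a classic textbook algorithm written in Python almost exactly as it appears in an algorithms course, with none of the memory-management noise of C++:

Code:
# Mergesort stated directly over lists, close to its textbook description.
def mergesort(xs):
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = mergesort(xs[:mid]), mergesort(xs[mid:])
    merged = []
    i = j = 0
    # Merge the two sorted halves.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(mergesort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]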

- Warren
 
  • #4
It's actually very possible to understand everything that a modern computer and OS does. It just takes some time and commitment, and less of it than you would think. But it won't come without some effort, just like anything worthwhile.
 
  • #5
I totally agree that C++ sucks..

What you claim about computer science, that it's just algorithms and data structures, is certainly a simplification. Operating systems and compilers are computer science subjects (more so than computer engineering) that require you to understand assembly language deeply, even though one of the working hypotheses is that the assembly instructions do what they are supposed to do. By its nature, that requires an understanding of computer architecture. Nonetheless, it's still computer science.

On the other hand, the claim I made above that modern computers "simulate real computer science" was also a simplification. But part of the reason for my annoyance with modern PCs is that if you are interested in the hardware-software bridge, you are better off working with Lego Mindstorms or the BASIC interpreter I mentioned above; no modern PC is well suited for that.
 
  • #6
eieio said:
It's actually very possible to understand everything that a modern computer and OS does. It just takes some time and commitment, and less of it than you would think. But it won't come without some effort, just like anything worthwhile.

Sure it's possible, if you work for Microsoft, or do what I'm probably going to have to do - read Linux Device Drivers and write a better book (OK, that's unfair, it's actually a good book. But still, what I'm saying is that more work needs to be done on Linux to make it accessible). But I totally disagree with the sentiment. (And of course, I am referring to a very small subset of what computers do, because on the whole I don't think anyone is interested in everything computers do.)
 
  • #7
Do you also find it frustrating that your doctor knows more about medicine than you do after reading Wikipedia articles? Does it bother you that Boeing consistently builds better airplanes than you can in your backyard?

- Warren
 
  • #8
chroot said:
Do you also find it frustrating that your doctor knows more about medicine than you do after reading Wikipedia articles? Does it bother you that Boeing consistently builds better airplanes than you can in your backyard?

Ha ha, funny.
 
  • #9
It's not meant to be funny -- it's an observation. Why does the complexity of modern computer software bother you less than the complexity of modern airplanes?

- Warren
 
  • #10
chroot said:
It's not meant to be funny -- it's an observation. Why does the complexity of modern computer software bother you less than the complexity of modern airplanes?

- Warren

Actually, it's a real peeve for me. Not so much the complexity, but the idea of so many things working in sequence without failing. That all goes away when I see how it's all made: huge numbers of people who've more or less dedicated their lives to it and still spend years on it at a time. But I don't want to know what one of their CAD models (Boeing's or otherwise) looks like, that's for sure. I took one look at that IBM Cell CPU layout and choked.
 
  • #11
chroot, I don't disagree with your point of view that computer science is data structures and algorithms, as long as you don't force everyone else to share that point of view - in particular me.

The complexity bothers me because it is unnecessary. Cheap computers with simple architectures and simple operating systems should be made available for $50, instead of the constant push to add features, when I'm happy with much less than what I already have.

I began this thread neutral, but in reality I must say I am most interested in Linux. It is not clear to me what the goals of that project are at the moment, but I am hoping that they don't try to compete with Microsoft by matching every feature. Instead, I would like to see Linux applied to a simple, cheap CPU, to which I can attach all kinds of scientific measurement equipment at a low price (in the future).

Chroot, you are basically saying that Microsoft, Apple, and Linux are like automobile manufacturers, and you shouldn't try to build a car in your backyard. What I am saying is that I think there's a market for machines that can be understood inside and out, but are at the same time more advanced than Lego Mindstorms.

Does that answer your question?
 
  • #12
There's nothing stopping you from using old hardware and old software until the day you die. Embedded linux is already extremely popular. You can buy 486-based single-board PCs for far less than $50. If you like that route, go for it.

The infrastructure you don't like is actually quite necessary if you want good-looking graphical programs with lots of interoperability and features to be designed in less than a human lifetime. New languages, APIs, RAD, etc. are all "power tools" which software engineers create to allow them to do more in less time. In the same way, Boeing has long since abandoned trying to build airplanes using nothing more than hammers and screwdrivers.

I'm not saying you can't or shouldn't try to build your own computer or software in your backyard -- in fact, I think you should. I just don't think you should lament the fact that the field has developed to the point where the hobbyists can't compete with the professionals. After all, every other major industry has been this way for tens or hundreds of years, and I like that my doctor knows more than Wikipedia.

- Warren
 
  • #13
Showing my age here, but I started out with CP/M and migrated to DOS, writing application programs that would run under Ashton-Tate dBase. I could carry an entire point-of-sale/inventory-control/bookkeeping suite around on a few floppies. Things were wild and wooly in the early DOS years when the open architecture allowed software to directly address hardware, and "cowboy" developers were not above stealing system resources.
 
  • #14
Again, I started this topic kind of neutral, but really my prediction is about Linux. I really hope that in the future, if you study the basics of computer science in compilers, architecture, operating systems, and networking, you will be able to understand the actual implementation of Linux from a deep point of view with a minimum of effort...

This would be geared more toward "serious hobbyists", as a general hobbyist could perhaps be satisfied playing with Mindstorms, etc.

I hope the brains behind Linux will realize that it's a more important goal to market toward scientists than toward the general desktop market... and then perhaps old CPUs like the 486 will find larger demand, though hopefully with better, simpler CPUs than the 486, in my opinion.

But that's just my theory in regard to the second coming: Linux will improve in this respect, and will thus not go away, regardless of whatever Microsoft and others are up to.
 
  • #15
There aren't enough scientists to comprise an entire market for an operating system. Furthermore, there is no incentive for anyone who is not a hobbyist to make anything hobbyist-friendly. Computer hardware and software are among the largest and fastest-moving technologies the world has ever seen. Untold billions of dollars change hands over them.

Your little fantasyland of the world being rebuilt with hobbyist-friendly 286s and toy operating systems is just that: a fantasyland. Furthermore, I can guarantee you that Linux will not evolve in any such way. I'm sorry to say, but you're basically delusional.

- Warren
 
  • #16
chroot said:
Furthermore, I can guarantee you that Linux will not evolve in any such way. I'm sorry to say, but you're basically delusional.

That just underlines the point of why fewer people study computer science. I guess I should get back to Rudin..
 
  • #17
What you're saying is basically the equivalent of lamenting how complex and indecipherable EKGs and MRIs are, and how you sure wish we could just go back to the days of drilling holes in each other's heads and sticking leeches on wounds. You know, things the average Joe can understand, because that's what is important.

- Warren
 
  • #18
That's not at all what I'm saying - are you a lawyer or something? For instance, I'm not saying the general theory of relativity is wrong because I (and the average Joe) don't understand it.

What I'm saying is that this "fastest-moving technology" you mention is really not changing that much as far as I can see. It changed a lot in the 80s and 90s, and in the meantime we went through hundreds (an exaggeration) of hardware upgrades. But since Windows 2000, the desktop environment hasn't changed much.

The fact that it has actually stood still is a singularity in comparison to the 80s and 90s. My prediction is actually simple and seems very plausible to me (yet delusional to you). So we have PCs now, and we have the internet now. As long as that stays fixed, and it probably will, it's just a matter of time before simplified operating systems on simplified CPUs show up. I'll just leave it at that for now, I guess.
 
  • #19
The desktop isn't constant at all. Considering Microsoft is deciding what everyone sees lately, you might want to take a look at what they have up their sleeve.
 
  • #20
dst said:
The desktop isn't constant at all.

No comment.
 
  • #21
chroot, I reread this discussion and also some of your quotes:

It's not meant to be funny -- it's an observation. Why does the complexity of modern computer software bother you less than the complexity of modern airplanes?
I'm sorry to say, but you're basically delusional.


After a second read, it seems to me you are kind of coming out of left field with your replies (although you are also implying that I am coming out of left field with my theories..). Although it's true that I began this discussion about how I'm "nostalgic" for the good ole days of DOS, that wasn't really the point. And I totally don't take it personally that you called me delusional, because that would be an understatement for Weiser if you read his article below (and he was actually an "influential computer scientist").

The Computer for the 21st Century:
http://www.ubiq.com/hypertext/weiser/SciAmDraft3.html

He actually claims that disposable, paper-like computers will spread through society, and people at the time took him seriously.

In contrast, what I say is pretty down to earth. The kind of prediction I am making is not really a prediction at all. It, too, is just an "observation". The modern PC behemoths, both Windows and Linux, were made in haste, in terms of very rapid development of humongous technologies to support what amounts to what a tiny CPU can accomplish: communicate text and multimedia over a phone line, display a graphical user interface with a mouse, and perform I/O on various devices of varying (sometimes enormous) complexity.

Now don't get me wrong, I am not proposing a design for such a system with a single 8-bit data register in a single CPU. I'm not an engineer, and really don't know a thing about that. But to rebut the claim you made, "I guarantee Linux won't go in that direction": it's not actually up to the brains behind Linux; it seems like it's more going to be decided by the manufacturers. The software is free, it runs on hundreds of CPUs, and a sufficient subset of Linux gets you to the internet and gives you a GUI.

As long as the current desktop computing situation doesn't change much, and it won't (that is actually the central point of what I'm saying), then more and more people will buy a $50 Linux machine when their Toshiba laptop burns out after the warranty expires.. so they can save up to buy a new laptop, and ultimately they will put off buying the laptop altogether for economic reasons, since the machine they have gives them all they need..

And the end result is that some of these systems will come with complete schematics of how the motherboard is laid out, with instruction manuals about how to do I/O using Linux on the parallel port or whatnot, and a lot of other simple things. Unlike DOS, these computers won't disappear. More and more high school students will stumble upon this as a hobby, and as such more interest will grow in computer science, or as you say, computer engineering. Well, I will stop this train of thought here, because this aspect - the ability to understand the machine itself - is a personal interest but is not the point of "the prediction".

I personally don't understand how the "ubiquitous computing" people could imagine serious computer users ever departing from a workstation. Are these conclusions delusional? Perhaps, but they still seem much more down to earth than what some well-respected computer scientists have come up with.
 
  • #22
You don't actually need any hardware or software to study computer science. All of today's machines are equivalent to a simple theoretical model called a "Turing machine". The hardware implementation of today's computers isn't that interesting if you just want to play around or study computer science, because the added complexity serves optimization, not new capabilities. The same goes for modern operating systems.
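To make that concrete, here is a toy Turing machine simulator in Python (a sketch added for illustration; the rule encoding is made up): the whole model fits in a few lines and needs no particular hardware at all.

Code:
# A toy Turing machine simulator. This particular machine flips the bits of its input.
def run_tm(tape, rules, state="start", blank="_"):
    tape = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    while state != "halt":
        symbol = tape.get(head, blank)
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Transition table: (state, read symbol) -> (write symbol, move, next state)
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_tm("0110", flip))  # -> "1001"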

In my opinion, if anything, today's technology enables students to do much more than they could before. High-level languages and libraries have reached the point where you only need to get acquainted with basic programming principles to be able to do almost anything. You are also able to learn how the hardware and OS components your program runs on work, but when you do, you'll see that such implementation-level details aren't that relevant.

You are also able to get software emulators for older hardware, so that you don't have to go back to the stone age just to study something that's more understandable.
 
  • #23
rudinreader said:
I totally agree that C++ sucks..

What you claim about computer science, that it's just algorithms and data structures, is certainly a simplification. Operating systems and compilers are computer science subjects (more so than computer engineering) that require you to understand assembly language deeply, even though one of the working hypotheses is that the assembly instructions do what they are supposed to do. By its nature, that requires an understanding of computer architecture. Nonetheless, it's still computer science.

From a computer science perspective, operating systems and compilers are only interesting in that they implement algorithms and data structures. Actual implementation details, such as what assembly language the compiler generates or which scheduling algorithm Windows implements, are only relevant to software and hardware engineering.

Computer science studies what can be done, in how much space and how much time - the implementation details of that theory are relevant mostly to engineers.
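As a small illustration of that "how much time" view (a sketch added for this writeup, not from the original post), here is the same membership question answered by two algorithms with very different costs:

Code:
import bisect, timeit

data = list(range(1_000_000))  # sorted input

def linear_search(xs, target):
    # O(n): scan until found
    return target in xs

def binary_search(xs, target):
    # O(log n) on sorted input
    i = bisect.bisect_left(xs, target)
    return i < len(xs) and xs[i] == target

for fn in (linear_search, binary_search):
    t = timeit.timeit(lambda: fn(data, 999_999), number=100)
    print(f"{fn.__name__}: {t:.4f} s for 100 lookups")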
 
  • #24
Just to let you all know my background in regard to computer science: I'm not an expert in any computer science subfield (..yet), but not a complete novice either; computer science was my undergraduate minor. (I'm planning to start a PhD math program next year.) I took Data Structures (hard, in C), Assembly Language, Computer Architecture, Operating Systems, and Theoretical Computer Science (so I know what a Turing machine is..), and I think that's it. I started to take Compilers, but the teacher was so hard it was beyond description (there were certainly no parser generators allowed in his course!), so I dropped it.

I get the impression (an understatement) that there must have been a different atmosphere, for whatever reason, in which a law student at Harvard (hint, hint..) would be interested in basic compilers, assembly code, and operating systems.. But nowadays, if you are a PhD-track math student (like myself), you get advised to avoid digital logic in much the same way an engineer might be advised not to study calculus: it's hard to comprehend, don't bother.

Is that a contradiction of all I have said up to now? You decide; I don't think so. The point is that part of my prediction is that embedded systems are going to make it possible for digital logic hobbyists to turn the hobby into a profession if they get good at it - and perhaps a very lucrative one, if you can figure out how to sell Linux on $40 LCD computers that can connect to DSL.

About your quote, "In my opinion, if anything, today's technology enables students to do much more than they could before": of course it is true. But one cannot help but sense that "modern computing" must be about as immature as thermodynamics before Clausius, not to mention Boltzmann, etc.

In particular, "students can do more than ever before", but the problem is that you have to be a bit more self-motivated than in any other scientific discipline. For instance, it's very straightforward to get a sophisticated mathematics education: read Rudin. The closest equivalent in computer science is Knuth's Art of Computer Programming (which I'm planning to read through eventually), and yet there's still so much more to read...

Anyways, just thoughts..
 
  • #25
rudinreader said:
But one cannot help but sense that "modern computing" must be about as immature as thermodynamics before Clausius, not to mention Boltzmann, etc.

Actually, if there is anywhere that I am wrong or delusional, I think it would be in my quote above. I just realized that the main thing all the crazy computer science prophets had in common was the shared belief that modern computers suck. One realizes that it is entirely possible that it may never actually get any better than it is today (relatively speaking).

**Editing the last thing I said here because it was truly a bit over the top: that "high school students would be able to read computer schematics, etc. in the future". I was lacking sleep when I initially wrote that. But perhaps what can be said is that anyone who studies embedded systems will be able to do the above, for relatively simple logic setups.
 
  • #26
I cannot even tell what your point is anymore, but I have to quibble with one thing: engineers are required to take something like four or five semesters of calculus, as it is necessary for any understanding of electronics (or physics in general). The curriculum doesn't get into real analysis, though.

- Warren
 
  • #27
chroot said:
I cannot even tell what your point is anymore

I guess to summarize... I think Linux is immature in that the associated books are not that informative and still require a great deal of self-motivation (I once had the displeasure of having to read through a GTK book, and I don't wish that on anyone). Conversely, I don't expect Microsoft to change much in their emphasis on neurocomputing desktop advances - neuropsychological software development, in my opinion, is good for developing educational children's games, but mathematicians, for one, aren't really concerned with finding better ways to manage their "friends list". I personally can do without future Microsoft releases, in the same manner that I will probably never buy a cell phone.

On the issue of "hobbyists building airplanes in their backyards", I should clarify that the manner of computing I envision** is probably inaccessible at the high school to undergraduate level, but should be accessible at the master's to PhD level in all science and engineering disciplines - in particular, not restricted to computer engineering. In the same light, there is nothing stopping electrical engineering students from reading master's- to PhD-level math.

** I don't want to suggest that it's necessary to understand all 50,000+ lines of operating system code in any case. But there are basic things, like I/O to a simple pressure sensor, whose "implementation details", scientifically speaking, should not be hidden from you. So "writing a device driver" is relevant to anyone interested in scientific measurement (including mathematicians, since after all it offers a very inexpensive laboratory), and preferably you would not need to go through the USB forest to do it.
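For what it's worth, here is a minimal sketch of the kind of sensor I/O being described, in Python on Linux, assuming a sensor exposed through the kernel's IIO subsystem; the sysfs path below is hypothetical and depends entirely on the actual sensor and driver:

Code:
from pathlib import Path

# Hypothetical sysfs node; the real name depends on the sensor and its driver.
SENSOR = Path("/sys/bus/iio/devices/iio:device0/in_pressure_input")

def read_pressure():
    """Return one raw pressure reading as a float, or None if no sensor is present."""
    try:
        return float(SENSOR.read_text().strip())
    except (OSError, ValueError):
        return None

if __name__ == "__main__":
    value = read_pressure()
    print("no sensor found" if value is None else f"pressure reading: {value}")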
 
  • #28
The number of mathematicians who need to know how to read low-level data from a digital pressure sensor is vanishingly small. The number of computer scientists who need to know how to read low-level data from a digital pressure sensor is also vanishingly small. In fact, the only group of people who would really be interested in the subject, or expected to know anything about it, would be engineers. You seem to have some grave misunderstandings about where one field ends and another field begins.

Your initial argument seemed to be that we should make computers simpler so that everyone can understand them deeply -- without any rationale for why that would be beneficial. You've now narrowed that argument to apply only to I/O drivers, but you've again failed to explain with any substance the rationale for why that would be beneficial.

In fact, it pretty much just sounds like you're a disgruntled mathematician who ran into trouble while writing a device driver. That's rather like me running into trouble building a turbine in my backyard -- it shouldn't surprise me.

- Warren
 
  • #29
Sure, nobody really needs to know anything outside their own field of specialization, but still...

I'm not disgruntled about failing at a device driver, in that I know better than to expect to complete a successful project that involves the Windows DDK, for instance. I'm more disgruntled by the fact that I must admit there is nothing substantial about all the fuss I have made here... So in terms of developing a coherent philosophical approach to computing, I have run into trouble building the turbine...

But nonetheless, philosophy is something that accumulates over time and not always in a linear way, much like the technologies currently under development that I am actually very optimistic about. I have a sense that no matter what your computing taste is, you are going to be happy in the coming years. So you won't have to go through all the trouble this guy did: http://www.frontiernet.net/~fys/index.htm , to achieve, at least in concept, whatever it is that gets you going.
 
  • #30
I find that you have a very narrow view of what people use computers for, and what they need to be able to do that.

Text messaging, surfing the internet, and reading basic inputs from a sensor are NOT the only things computers are used for! What you're describing is a modern cell phone these days. Computers are made to process huge amounts of data very quickly: video, audio, pictures. There's no way a 486 processor would be able to keep up with a 1080p video stream or lossless audio compression. Heck, they could barely keep up with decoding a single MP3 in real time! Just because a simpler processor can in theory do the same thing doesn't mean it can do it nearly as fast as a modern dual- or quad-core processor.

What you're saying is basically that computers have gotten as good as they can, and now it's time to de-evolve. Personally, I find this to be a very naive view of technology.
 
  • #31
rudinreader said:
it's hard to comprehend, don't bother.

Not everyone feels this way about calculus. Not everyone feels this way about all of the intricate workings of a modern PC.

There are plenty of people, many of them contributors to this forum, who find all manner of challenging subjects both interesting and enjoyable. I suspect you are the sort of person who has a limited taste for challenge. The people driving all of this "complex" technology most surely do not have such a limitation; and it's not likely to ever get easier, from your perspective. But it's likely to get more fun from mine.
 
  • #32
rudinreader said:
In the same light, there is nothing stopping electrical engineering students from reading master's- to PhD-level math.

Except all the terms, theorems, and notation you would need to become familiar with. There's a lot of math in CS/E, but personally I wouldn't expect to be able to read and fully understand most advanced math papers without first having to go through a lot of research to get acquainted with some of the principles.
 
  • #33
I'm not saying that we should de-evolve. Instead, I'm simply questioning the prevailing view of the term "obsolete". I'm not saying "we should do (fill in the blank)". And yes, I have a naive view of operating systems and technology; the number of people who can comprehend technology as a whole is probably around the number of people who can comprehend mathematics as a whole: "vanishingly small".

But take, for example, the fact that you can buy a Dell PC with "free DOS": http://www.dell.com/content/topics/segtopic.aspx/e510_nseries?c=us&cs=19&l=en&s=dhs

And go with it. I'm not saying you should buy it. But if simple technologies (once considered obsolete) exist, then it's not hard or expensive for manufacturers to add them to their product lines... And I think there is going to be more of this in the future. The conjecture is that when text-surfing and the high-school chemistry example of reading input sensors are all someone wants, they will be able to buy it, and it won't have to be a cell phone (again, I don't have a cell phone and probably never will).. but instead, who knows, it could be like one of those $10 Atari joysticks that plug into your TV..

But don't get me wrong, I'm not trying to "impose my view"; admittedly, I regret that I actually know so little about computers - and clearly everyone who has responded knows more about them than I do. Anyways...
 
  • #34
rudinreader said:
I'm not saying that we should de-evolve. Instead, I'm simply questioning the prevailing view of the term "obsolete".

What exactly is the "prevailing view of obsolete"? Just because a new product comes out every 6 months doesn't mean all previous products are rendered obsolete by it...

rudinreader said:
But take, for example, the fact that you can buy a Dell PC with "free DOS": http://www.dell.com/content/topics/segtopic.aspx/e510_nseries?c=us&cs=19&l=en&s=dhs

And go with it. I'm not saying you should buy it. But if simple technologies (once considered obsolete) exist, then it's not hard or expensive for manufacturers to add them to their product lines...

So are you saying it's too hard to buy older technology? Check on eBay; I bought my fiancée a laptop for $150 that surfs the internet, plays DVDs, and does everything she wants to do.

rudinreader said:
And I think there is going to be more of this in the future. The conjecture is that when text-surfing and the high-school chemistry example of reading input sensors are all someone wants, they will be able to buy it, and it won't have to be a cell phone (again, I don't have a cell phone and probably never will).. but instead, who knows, it could be like one of those $10 Atari joysticks that plug into your TV..

Text-surfing would be a terribly inefficient way to surf the internet IMO. The internet is all about MULTImedia! Pictures, videos, sound files, all seamlessly integrated together in webpages. The internet would be a terribly boring place if it was just a big text-based encyclopedia.

The goals you want to accomplish with basic sensors could be met with a TI calculator and some programming, or a BASIC chipset and some soldering, or an oscilloscope; there's nothing new or inherently exciting about it. Simple tasks like the ones you are describing can be accomplished by even the most rudimentary "computers" in our society, and the best computers in our society can accomplish things that are light-years ahead.

So let's follow your idea to its logical conclusion; we go back to basics like you are suggesting. You would like to have DOS on a computer: a simple operating system that makes you type in commands. Of course, if you were REALLY starting at the basics you would have to write this DOS program as well, but we'll assume we aren't COMPLETELY reinventing the wheel here.

We'll want some kind of file structure to store stuff, and we need to be able to find where we stored things. So we'll make a directory structure on the computer's storage.

To support the hardware of the computer, you'll need to load the drivers for each component; there could be a couple dozen of them, so the logical thing to do would be to load drivers from a text file that lists all of the drivers you need for the static hardware in your system... and we'll go ahead and do the same thing with the drivers themselves too.

Next? Well, we want an interface that allows us to browse through lots of files quickly and efficiently (dir /w just won't cut it for thousands of files...), so we make up a mouse-and-keyboard-based GUI. It will of course need a driver for the display, and some basic machinery for placing representations of files (icons) and other things like file parameters and such. Since we already have a simple system for sorting files and browsing them (DOS), let's lay our GUI over the top of it.

We'll want to be able to run programs (what good is a computer without at least that?), so we'll make a standard programming language that the GUI/DOS will run off of. This way we can run programs that control the computer from the quick comfort of our GUI. So we do this and write a few programs for things like surfing the internet, writing notes, performing tasks on the computer, etc.

So, what have we just described? Could it be we have "invented" something that perhaps already exists? Hmm?

...how about an operating system like Windows or Linux! That's really all they are: a program made to increase convenience and efficiency! We can perform tasks using the computer's hardware without having to type the code in; instead we click a button and a predetermined set of code is executed! Granted, Windows is incredibly bloated these days, but that's due to the HUGE amount of third-party hardware and software it has to support, not some inherent malfunction in the way modern computers operate. If the software the computer ran were specifically tailored to the computer's hardware (which can be done with an open-source OS like Linux), it would be a very streamlined and smooth way to accomplish complicated tasks on the computer.

Whoa, I got way out on tangents there...
 
  • #35
If you want to get back to basics in computing, why not check this out:

http://www.parallax.com/

Description of a BASIC Stamp chip

Parallax.com said:
A BASIC Stamp microcontroller is a single-board computer that runs the Parallax PBASIC language interpreter in its microcontroller. The developer's code is stored in an EEPROM, which can also be used for data storage. The PBASIC language has easy-to-use commands for basic I/O, like turning devices on or off, interfacing with sensors, etc. More advanced commands let the BASIC Stamp module interface with other integrated circuits, communicate with each other, and operate in networks. The BASIC Stamp microcontroller has prospered in hobby, lower-volume engineering projects and education due to ease of use and a wide support base of free application resources.
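As a rough sketch of how a PC-side script might collect readings from a board like this, assuming (purely for illustration) that the Stamp program prints one reading per line over its serial link - the port name, baud rate, and line-per-reading protocol are all assumptions - using the third-party pyserial package:

Code:
import serial  # pyserial

def read_samples(port="/dev/ttyUSB0", baud=9600, count=10):
    """Read and print a handful of text-line samples from the serial link."""
    with serial.Serial(port, baud, timeout=2) as link:
        for _ in range(count):
            line = link.readline().decode("ascii", errors="replace").strip()
            if line:
                print("sample:", line)

if __name__ == "__main__":
    read_samples()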
 
