Understanding Computer Languages: A Layman's Guide

  • #1
jackson6612
I'm not a computer science or science student - just a layman with some basic education in these areas. So please keep your replies simple, so that I can fully appreciate your help. Thanks.

I have quite a few questions about computer languages. I will ask them in steps.

Is it true that Boolean algebra is the basis for all computer languages?

How does a computer language function in general? A human learns language from society and has the organs to articulate it. A computer language, I suppose, is a set of symbols, just as the words of a natural language are for humans. But a computer is built from electronic devices, so how does it understand those arbitrary symbols? Is there some analogy?

Thank you for your help and time.
 
  • #2
All the computer languages I can think of rely on Boolean algebra to function, but I wouldn't say it forms their basis. A computer language is a way to write instructions for a computer: to perform a calculation, draw something, or tell a device connected to it to do something (output), possibly depending on another device connected to it (input).

A simple analogy would be a food recipe - it provides instructions for making a dish. In this case the cook is the computer "executing" the provided steps. Pushing the analogy a little, you could say that you transform the ingredients (data) using tools (software libraries and functions), and if you follow those instructions carefully the result (output) should be predictably the same every time.
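To make the recipe analogy concrete, here is a minimal sketch in C (the "ingredients", the quantities, and the steps are all made up for illustration): the variables are the data, the function body is the list of instructions, and the printed result is the output.

```c
#include <stdio.h>

/* The "recipe": a fixed list of steps the computer follows exactly. */
static double mix(double flour_g, double water_g)
{
    double dough = flour_g + water_g;  /* step 1: combine the ingredients */
    return dough * 0.9;                /* step 2: lose 10% while baking   */
}

int main(void)
{
    /* The "ingredients" (the data, or input). */
    double flour = 500.0;
    double water = 300.0;

    /* Same ingredients, same steps: the same loaf every time. */
    printf("loaf weight: %.1f g\n", mix(flour, water));
    return 0;
}
```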

Hope that helps :-)

-S
 
  • #3
jackson6612 said:
Is it true that Boolean algebra is the basis for all computer languages? [...] How does a computer language function in general?

The Wikipedia intro article is pretty good:

http://en.wikipedia.org/wiki/Computer_languages

 
  • #4
A computer doesn't understand the symbols. It just reacts to them.
Take a look at this very simple machine

It can add binary numbers but that doesn't mean that it understands what numbers are.
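For anyone who can't view the machine above, here is a rough equivalent in C (just an illustration, not the linked device): adding two one-bit numbers falls out of plain Boolean operations - XOR gives the sum bit and AND gives the carry - with no "understanding" of numbers anywhere.

```c
#include <stdio.h>

/* A one-bit "half adder" built only from Boolean operations.
   In hardware the same job is done by an XOR gate and an AND gate. */
int main(void)
{
    for (int a = 0; a <= 1; a++) {
        for (int b = 0; b <= 1; b++) {
            int sum   = a ^ b;  /* XOR: the sum bit   */
            int carry = a & b;  /* AND: the carry bit */
            printf("%d + %d -> carry %d, sum %d\n", a, b, carry, sum);
        }
    }
    return 0;
}
```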
 
  • #5
jackson6612 said:
Is it true that Boolean algebra is the basis for all computer languages? [...] How does a computer language function in general?

Hey there Jackson.

Boolean algebra is the basis of computation. With transistors in certain configurations you can build logic devices that perform Boolean operations, and you can then use those to carry out arithmetic and all the other computational operations.
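As a small illustration of that point (a sketch in C, using its bitwise operators to stand in for gates; the 4-bit width and the function name are just choices for this example), even a "decision" such as "are these two values equal?" reduces to Boolean operations:

```c
#include <stdio.h>

/* "Are two 4-bit values equal?" from Boolean operations:
   XOR each pair of bits (1 wherever they differ), mask to 4 bits,
   and report equality only if no difference bit is set.
   Hardware comparators are built from the same idea. */
static int equal4(unsigned a, unsigned b)
{
    unsigned diff = (a ^ b) & 0xFu;  /* bitwise XOR, limited to 4 bits */
    return diff == 0;                /* 1 if every bit matched, else 0 */
}

int main(void)
{
    printf("%d\n", equal4(0x9, 0x9)); /* prints 1: equal     */
    printf("%d\n", equal4(0x9, 0x8)); /* prints 0: not equal */
    return 0;
}
```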

Typically what happens is that you have what is called a platform. An example is x86, the platform used in the majority of PCs.

The platform has a specific architecture and a specific instruction set. The architecture recognizes binary machine code and executes it directly in that binary form.

Now, the first step above binary code is assembler code. Instead of having to write the code using ones and zeros, you write code that uses a symbol (mnemonic) for each instruction, labels for positions in the code, and constant numbers or strings (text); the assembler then turns your code into the binary that the computer understands.

A higher-level language such as C/C++ or BASIC lets you write code that's easier to read and understand. In assembler you write one assembler instruction for every machine instruction; with languages like C/C++ and BASIC you can write more intuitive statements, and the compiler turns them into the machine code that the CPU can recognise.
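A rough sketch of that difference (the mnemonics in the comment are only indicative x86-style assembler, not the exact output of any particular compiler):

```c
#include <stdio.h>

int main(void)
{
    int price = 100, tax = 8;

    /* One line of C ... */
    int total = price + tax;

    /* ... is turned by the compiler into a few machine instructions,
       which in assembler would be written roughly like:

           mov  eax, [price]   ; load price from memory into a register
           add  eax, [tax]     ; add tax to it
           mov  [total], eax   ; store the result back into memory

       In assembler you write one line per machine instruction; in a
       higher-level language the compiler works those out for you. */

    printf("total = %d\n", total);
    return 0;
}
```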

I don't know if this answers your question, but at the lowest level everything on the CPU is done using logic gates, because logic gates are used for everything from arithmetic to decision making.
 
  • #6
jackson6612 said:
Is it true that Boolean algebra is the basis for all computer languages?
Boolean logic is the base logic used to create the lowest-level components of a computer, such as the arithmetic unit, which performs math operations like add, subtract, multiply, and divide. Computer languages don't deal with this Boolean logic directly, but with the instruction set and operands (data) that a particular computer uses.

jackson6612 said:
How does a computer language function in general?
A computer language is translated into the native instructions and data (operands) used by a particular computer. It allows a person to specify a sequence of logical steps and define the data, inputs, and outputs to be used, without having to get into the lowest level of computer specific details.

The oldest computers had to be manually wired to perform a specific set of operations. On some old computers, a person could enter a program through a "front panel", toggling switches representing 0s and 1s to put commands and data into the computer's memory. Some types of computers used "plug boards" for programming, which were programmed by plugging in a large number of wires:

http://en.wikipedia.org/wiki/Plugboard

For machine-level programming, a very low-level assembler could read some form of human input, via paper tape or punched cards, allowing an operation code to be specified as a few letters and data operands to be written in hex or octal. The person was burdened with having to determine the locations in memory for instructions and data, and had to rewrite the program if instructions or data were moved around.

The next step up in assembler languages allowed a group of letters to be used as names for data or instruction locations (labels), and the assembler could assign the actual addresses or offsets to be used when it converted the assembly language program into machine language. This allowed instructions and data to be moved without having to rewrite programs.

More advanced assemblers included macros, where a single line of macro code expands into multiple machine-language instructions. For IBM 360 based mainframes, the simple assembler was called BAL (Basic Assembly Language), and the macro version was called ALC (assembly language compiler). Generally there was a library of common macros provided by the computer maker so that programmers wouldn't have to recreate their own set for common operations.
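Assembler macros themselves are hard to show briefly, but C's preprocessor gives a loose parallel (the SWAP_INT name is just made up for this sketch): one macro "line" expands into several statements before the code is compiled.

```c
#include <stdio.h>

/* A loose parallel to an assembler macro: the preprocessor expands
   this one "instruction" into several statements before compilation. */
#define SWAP_INT(a, b)       \
    do {                     \
        int tmp_ = (a);      \
        (a) = (b);           \
        (b) = tmp_;          \
    } while (0)

int main(void)
{
    int x = 1, y = 2;
    SWAP_INT(x, y);          /* one line in the source ...                 */
    printf("%d %d\n", x, y); /* ... but several operations after expansion */
    return 0;
}
```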

Compiled languages were the next step. Here a single line of code may consist of a mathematical expression that results in many machine level instructions.

The next step up from this is code generators, which let a person draw a user interface using the tools provided by the generator, typically "drag and drop". For example, a person selects a "dialog box" and drags it into the desired location on what will become the user interface for the program. Once this is done, the code generator creates the code that a compiler uses to produce the user-interface part of the program. The person then adds the non-user-interface code between the generated code to create the completed program. The code generator can tell the difference between the code it generated and the code added by the programmer, so updating the user-interface and/or non-interface code is not too difficult.
 
  • #7
Computers are really fast deterministic (0 and 1, false and true) calculators.

Computer science is designing ways to do things with that speed.
 

Related to Understanding Computer Languages: A Layman's Guide

1. What is a computer language?

A computer language is a set of rules and symbols used to write programs that computers can understand and execute. It is the medium through which humans communicate with computers to perform specific tasks.

2. How many computer languages are there?

There are thousands of computer languages, but the most commonly used ones include Java, Python, C++, and HTML (strictly a markup language rather than a programming language).

3. Do I need to learn multiple computer languages?

It depends on your goals and what you want to achieve. If you're interested in a specific field or job, you may have to learn the language commonly used in that field. Otherwise, learning one or two languages can be enough to get started.

4. Is learning a computer language difficult?

Learning a computer language can be challenging, but it's not impossible. It requires dedication, practice, and patience. With the right resources and approach, anyone can learn a computer language.

5. What are the benefits of understanding computer languages?

Understanding computer languages allows you to write programs, develop applications, and automate tasks. It also enhances problem-solving skills, logical thinking, and opens up career opportunities in the tech industry.
