How are programming languages created?

  • Thread starter pairofstrings
  • Start date
  • Tags
    Programming
In summary: All a compiler does is check the code for syntax. If there are errors, it should alert you about the errors. If syntax is correct, it does not necessarily make the program run the way it should. In other words, building a better compiler doesn't mean the programming language is better.
  • #1
pairofstrings
If we write a better compiler, does it mean that the programming language is getting better?
 
  • #2
All a compiler does is check the code for syntax. If there are errors, it should alert you about the errors. If syntax is correct, it does not necessarily make the program run the way it should. In other words, building a better compiler doesn't mean the programming language is better.
 
  • #3
Ivan92 said:
All a compiler does is check the code for syntax.


No, a compiler translates from a source language to a target language. The latter is often machine code or an intermediate language such as Java bytecode.

pairofstrings, programming languages are often documented first, and then a compiler is implemented. If the compiler is better because it is closer to what is specified by the document(s), then we might not say the language has changed, but rather the compiler is more compliant to the language specification. The compiler might also be better because it produces code that runs faster or is more compact.

We would probably only claim the language has been revised if there were new specifying documents.
 
Last edited:
  • #4
"better" is too ambiguous to mean anything. A compiler can be "better" in many ways: faster, more memory efficient, faster object code, size-optimized object code, etc. A programming language can be more expressive, have more features, etc.
 
  • #5
To basically just restate what's already been said (we ALL think our own way of saying things is superior ... sorry about that), a compiler is just a tool to implement a language and does NOTHING to make the language better or worse (it might make how the language WORKS better or worse, but that is in the implementation, not in the language).

To make a language work better can be a VERY good thing, so good compilers are important, but it's even more important to have languages that help humans construct algorithms that accomplish whatever it is they need to get done.

The progression from machine language to assemblers, and then to interpreters and compilers, has been an evolution of languages, not of implementations (which necessarily followed along to implement the improved languages).

So, based on what I just said, I'd say no, better compilers don't have anything to do with making better languages, but they can have a lot to do with better implementations of languages, which I see as a different thing.
 
  • #6
pairofstrings said:
If we write a better compiler, does it mean that the programming language is getting better?

While new compilers do have a habit of adding new "language" features (some of which are non-standard features specific to the compiler, like, say, compiler directives), many updates are optimizations that overcome existing compiler limits or generate better output from the same source code.
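
As a rough sketch of the sort of optimization being described (a made-up example, not the output of any particular compiler): an optimizing C compiler will typically fold computations it can finish at compile time, so the two functions below end up generating essentially the same machine code.

/* What the programmer writes: both operands are known at compile time. */
int area_as_written(void)
{
    int width  = 6;
    int height = 7;
    return width * height;
}

/* Roughly what the optimizer reduces it to: the multiplication is folded
   into the constant 42, so no arithmetic happens at run time. */
int area_after_folding(void)
{
    return 42;
}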

Also, you should be aware that different languages are created for different specific purposes. Languages like C/BASIC/Java/FORTRAN are created for specific purposes. BASIC is good for running toy programs or models where speed is not an issue. C, on the other hand, has a very close resemblance to assembler, although it is a lot easier to read (and you can even embed assembler code and use it from your C program under some environments).

Consequently there are languages that solve optimization problems, list processing languages (like LISP), and many others.

Again, the different languages are built for specific purposes, and when you understand those purposes it becomes a lot easier to understand and appreciate the language. If someone has a purpose and needs a new language (or a substantial extension to an existing one), then one is usually created.
 
  • #7
Your title asks one question, then your post asks a different question.

pairofstrings said:
How are programming languages created?
As mentioned above, one or more people create some type of document describing the language. A document describing a programming language and a compiler or interpreter for that programming language could be developed somewhat concurrently, but I don't know if this actually occurred for any specific programming language.

pairofstrings said:
If we write a better compiler, does it mean that the programming language is getting better?
As mentioned above, it depends on what you mean by "better". The language isn't changed by the compiler, but part of the choice of which programming language is best for a particular computer and program may depend on the quality of the compilers and the corresponding size or speed of the code generated by the compilers that are available for that particular computer.
 
  • #8
Question 1:

In language 1, the instruction goes like this: add x y
In language 2, the instruction goes like this: x add y

My understanding is that we need to have two different compilers which can understand the instructions of languages 1 and 2. Am I right? I mean, we cannot have the same compiler compiling instructions of both Java and C, or instructions for different operating systems. Right?

Question 2:

If we have 'C' language code like this:

#include <stdio.h>
int main() {
    int a = 1, b = 2, c;
    c = a + b;
    printf("%d\n", c);
    return 0;
}
When the above code is compiled, it might look like this (I'm not familiar with machine language):
11111000 stdio.h
110011()
10101010 01 10
001100(01)
Is it object code or machine code? I think of it as object code because machine code has to be only 0's and 1's. There shouldn't be any 'stdio.h' in the code. True?

Now.
Question 3:

If you look at the main() statement above, when it is compiled it gets converted into 11111000. Similarly, the other statements also get converted into 0's and 1's.
And every time I compile the same statements in our above 'C' program, we will get the exact same object code (in other languages, the exact same intermediate code). Right?
OK, now, that means the compiler has to know that each statement should be represented by particular binary numbers only. In this case, the main() statement is represented as 11111000. That means the compiler has to remember which statement has to be converted to which binary numbers.
That means if we are able to make/program a compiler, then we are able to create a new programming language, or make changes (adding new features, as chiro said) to existing ones. Is that right? That is my understanding. Help please.

Question 4:
How to program/make a compiler? I know it has evolved, as phinds said, but can you tell me if we use any language or something else in its creation?
Thanks.
 
Last edited:
  • #9
pairofstrings said:
Question 1:
My understanding is that we need to have two different compilers which can understand the instructions of languages 1 and 2. Am I right? I mean, we cannot have the same compiler compiling instructions of both Java and C, or instructions for different operating systems.
A single program may be able to compile two similar programming languages, but this would be unusual. Some compilers are able to produce more than one type of machine code, for example, Visual Studio can produce 32 bit or 64 bit code. Going back to the 1960's, Cobol has an "environment division" which includes a source computer (the computer the program is compiled on) and an object computer (the computer the program is to be run on), but I'm not sure how many actual implementations supported multiple computers.
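
As a small sketch of the idea in question 1 (a toy example of my own, nowhere near a real compiler): the two surface syntaxes "add x y" and "x add y" need different parsing, but both can be reduced to the same internal operation, so the front ends differ while the back end can be shared.

/* Toy front ends for two imaginary languages, sharing one "back end". */
#include <stdio.h>

static int do_add(int x, int y) { return x + y; }   /* shared meaning */

/* Front end for language 1: "add <x> <y>" */
static int parse_lang1(const char *line, int *result)
{
    int x, y;
    if (sscanf(line, "add %d %d", &x, &y) == 2) { *result = do_add(x, y); return 1; }
    return 0;
}

/* Front end for language 2: "<x> add <y>" */
static int parse_lang2(const char *line, int *result)
{
    int x, y;
    if (sscanf(line, "%d add %d", &x, &y) == 2) { *result = do_add(x, y); return 1; }
    return 0;
}

int main(void)
{
    int r;
    if (parse_lang1("add 2 3", &r)) printf("language 1: add 2 3 -> %d\n", r);
    if (parse_lang2("2 add 3", &r)) printf("language 2: 2 add 3 -> %d\n", r);
    return 0;
}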


pairofstrings said:
Question 2, 3:
If we have 'C' language code like this:

#include <stdio.h>
int main() {
    int a = 1, b = 2, c;
    c = a + b;
    printf("%d\n", c);
    return 0;
}
The issue here is that the C compiler will include machine code to call main(), and also all of the code required for printf(). It will end up being much larger than the minimal machine code required to read two numbers from memory, add them, and store the result in memory.

pairofstrings said:
Question 3, 4:
How to program/make a compiler?
This can get tricky. An initial version of a compiler will have to be written in some other language, perhaps a machine-level language like assembly, or perhaps some other high-level programming language; or a working compiler for one machine is modified to produce code for another machine; or an emulator on the other machine is created and used to emulate the machine the compiler currently works on. Once the initial version of a compiler is working, the compiler may be re-written and updated in its own language.
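
To make the "translates from a source language to a target language" idea concrete, here is a deliberately tiny compiler written in C (a toy of my own; real compilers are vastly more involved). It translates arithmetic expressions such as "1+2*3" into instructions for an imaginary stack machine rather than into real machine code.

#include <ctype.h>
#include <stdio.h>

static const char *src;   /* current position in the source text */

/* factor := number -- emit a PUSH instruction for the literal */
static void factor(void)
{
    int value = 0;
    while (isdigit((unsigned char)*src)) {
        value = value * 10 + (*src - '0');
        src++;
    }
    printf("PUSH %d\n", value);
}

/* term := factor { '*' factor } -- '*' binds tighter than '+' */
static void term(void)
{
    factor();
    while (*src == '*') {
        src++;
        factor();
        printf("MUL\n");
    }
}

/* expr := term { '+' term } */
static void expr(void)
{
    term();
    while (*src == '+') {
        src++;
        term();
        printf("ADD\n");
    }
}

int main(void)
{
    src = "1+2*3";
    printf("; compiling \"%s\" for an imaginary stack machine\n", src);
    expr();   /* prints: PUSH 1, PUSH 2, PUSH 3, MUL, ADD */
    return 0;
}

Once something this small works, the same grammar could in principle be re-implemented in the new language itself, which is the self-hosting step described above.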
 
Last edited:
  • #10
pairofstrings said:
Question 2:

If we have 'C' language code like this:

#include <stdio.h>
int main() {
    int a = 1, b = 2, c;
    c = a + b;
    printf("%d\n", c);
    return 0;
}
When the above code is compiled, it might look like this (I'm not familiar with machine language):
11111000 stdio.h
110011()
10101010 01 10
001100(01)
Is it object code or machine code? I think of it as object code because machine code has to be only 0's and 1's. There shouldn't be any 'stdio.h' in the code. True?

You do not understand what the #include directive does. Essentially, #include <stdio.h> results in the code of the file stdio.h being pasted in place of the #include line, before compilation. As such, we would not expect to see "stdio.h" in the object file.

The object files will typically be the output of the compiler. These object files may contain machine code, as well as other things such as object file headers and data sections.
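
A minimal, runnable illustration of that point (a trivial example of my own):

/* The #include directive below causes the preprocessor to paste the text of
   the system header stdio.h -- which contains, among other things, the
   prototype of printf -- into this file before compilation.  The resulting
   object file contains machine code and object-file metadata, not the text
   "stdio.h". */
#include <stdio.h>

int main(void)
{
    printf("hello\n");
    return 0;
}

On a typical GCC setup you can see the pasted source with "gcc -E example.c" (preprocess only) and inspect the generated object code with "gcc -c example.c" followed by "objdump -d example.o".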


pairofstrings said:
Question 3:

If you look at the main() statement above, when it is compiled it gets converted into 11111000. Similarly, the other statements also get converted into 0's and 1's.
And every time I compile the same statements in our above 'C' program, we will get the exact same object code (in other languages, the exact same intermediate code). Right?
OK, now, that means the compiler has to know that each statement should be represented by particular binary numbers only. In this case, the main() statement is represented as 11111000. That means the compiler has to remember which statement has to be converted to which binary numbers.
That means if we are able to make/program a compiler, then we are able to create a new programming language, or make changes (adding new features, as chiro said) to existing ones. Is that right? That is my understanding. Help please.

It's not as if statements translate into binary numbers with typical implementations of a high level language such as C. It's considerably more complicated than that.
 
  • #11
MisterX said:
It's not as if statements translate into binary numbers with typical implementations of a high level language such as C. It's considerably more complicated than that.

I agree that translation could be more complicated. But by your first sentence do you mean that the compilers are made using the 'C' language?
 
  • #12
rcgldr said:
The issue here is that the C compliler will include machine code to call main(), and also all of the code required for printf().

This is implementation specific, but in many cases, the entire code for printf is not included by the compiler. Instead, printf is part of a linked library. For example, with Linux this may be "libc" and with Windows this may be one of the "C runtime libraries."
 
  • #13
pairofstrings said:
I agree that translation could be more complicated. But by your first sentence do you mean that the compilers are made using the 'C' language?

No, that's not what I meant. But, I'm sure compilers have been made using the C language, such as gcc, which was "written primarily in C".
 
  • #14
rcgldr said:
The issue here is that the C compliler will include machine code to call main(), and also all of the code required for printf().

MisterX said:
This is implementation specific, but in many cases, the entire code for printf is not included by the compiler.
It wasn't clear to me if the original poster was asking about object modules which include external links to be resolved by the linker, or was asking about executables, which would include library code (or overlay handlers). The compiler would at least need to generate the code required to call printf().

MisterX said:
No, that's not what I meant. But, I'm sure compilers have been made using the C language, such as gcc, which was "written primarily in C".
How was the initial gcc compiler created? You'd need an existing C compiler in order to compile C code. I mentioned this above, that this is either done by cross compiling from another machine, or by creating the initial version of a compiler in a language already supported by the target machine.

As an early example, Altair Basic's roots go back to an 8008/8080 emulator that ran on a PDP-10. The paper tape loader mentioned in the wiki article had to be toggled into memory using the Altair's front panel:

http://en.wikipedia.org/wiki/Altair_BASIC
 
Last edited:
  • #15
rcgldr said:
How was the initial gcc compiler created? You'd need an existing C compiler in order to compile C code. I mentioned this above, that this is either done by cross compiling from another machine, or by creating the initial version of a compiler in a language already supported by the target machine.

Can you give me a little idea about the difference between cross compilers and bootstrapping?
I am new to computer science.
 
  • #16
pairofstrings said:
cross compilers
As a modern example of a cross compiler, note that ARM processors are often embedded into the chips that go into consumer devices and computer peripherals:

http://en.wikipedia.org/wiki/ARM_architecture

Programmers can get an ARM toolset that runs on Wintel systems (Windows running on Intel processors). This includes a compiler, linker, emulator (including its own debugger), and a debugger interface for the actual ARM processor using its JTAG interface. Since the compiler and linker run on an Intel processor but produce code for the ARM processor, that would be an example of a cross compiler.
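
A concrete way to see the difference (assuming you have both a native GCC and the GNU ARM toolchain installed; the exact commands vary by toolset): the same source file is compiled twice, once for the machine you are sitting at and once for the ARM target.

/* hello.c -- identical source for both builds.
 *
 * Native build (produces machine code for the PC it runs on):
 *     gcc -o hello hello.c
 * Cross build (runs on the PC, produces ARM machine code; bare-metal ARM
 * targets usually need extra startup and linker options as well):
 *     arm-none-eabi-gcc -o hello.elf hello.c
 */
#include <stdio.h>

int main(void)
{
    printf("written once, compiled once per target\n");
    return 0;
}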

pairofstrings said:
bootstrapping
Boot - for a PC, this is done in the BIOS, which is stored in some type of PROM. For some early mini and micro computers, the bootstrap program had to be manually entered via toggle switches on the front panel. One clever idea used in the ancient Monrobot minicomputer was like a music box, except the pins on the drum were used to toggle switches to enter the bootstrap program (in this case, to read and then run a program from paper tape). Wiki article:

http://en.wikipedia.org/wiki/Booting

There's also a more general usage of the term bootstrapping used in the computer industry of developing a more complex environment from a simpler one. The wiki article mentions this here:

http://en.wikipedia.org/wiki/Bootstrapping#Software_bootstrapping
 
Last edited:
  • #17
Mentor comment:
The questions raised in this thread fall into the category of questions that are beyond the scope of an internet forum.

pairofstrings: While you do not realize this, what you are doing is asking us to write multiple books just for you, and to teach multiple computer science classes just to you.

Language theory and compiler design are upper-level undergraduate / lower-level graduate classes in college. It takes a long time, many classes, and many books to get from the "I am new to computer science" stage to the stage where a fair answer can be given to your questions on language theory and compiler design.

What you'll get by reading wikipedia is an apparent hodgepodge of nearly incomprehensible stuff. Note: I am not disparaging wikipedia here. It is an encyclopedia; this is a generic problem with encyclopedias. Multiple books are needed to answer these questions fairly and comprehensively. An encyclopedia cannot do full justice to such questions. Nor can an internet forum.

To those helping pairofstrings: Your work so far is commendable. Continue helping if you wish. However, don't be afraid to provide the short and sweet "Don't ask me to write a book" answer if it looks like you aren't helping or if answering the question at hand would indeed require you to write a book.
 
  • #18
Can anyone suggest some books that could help me grasp this subject? I have a book on compilers: Compilers: Principles, Techniques, and Tools.
I want a book that explains how compilers, after converting source code to machine code, interact with the hardware in a computer, like a microprocessor for example.
 
  • #19
pairofstrings said:
Can anyone suggest some books that could help me grasp this subject? I have a book on compilers: Compilers: Principles, Techniques, and Tools.
I want a book that explains how compilers, after converting source code to machine code, interact with the hardware in a computer, like a microprocessor for example.

Compilers do not interact with the hardware in any way at all. Once they create the object code, they are out of the picture, and it falls to a loader to load the object code, which then operates the machine.

What you want, I guess, is a book on how machine code works, which is to say, really, a book on computer hardware architecture.

I've been out of the technical loop too long to be able to recommend a good book.
 
Last edited:
  • #20
pairofstrings said:
Can anyone suggest some books that could help me grasp this subject? I have a book on compilers: Compilers: Principles, Techniques, and Tools.
I want a book that explains how compilers, after converting source code to machine code, interact with the hardware in a computer, like a microprocessor for example.

In terms of knowing how code interacts with the hardware, this is both operating system and platform dependent.

What happens is the operating system installs a lot of its software that directly talks to hardware, and if you want to do something that accesses the hardware in any way, you call a routine which eventually either calls an interrupt, gets access to a hardware port, or reads some area of memory that the hardware uses.

This issue is very complicated.

If you are using Windows, you should get the Driver Development Kit and look at the documentation and examples. The thing is, you can't get access to hardware directly, and for the most part, this has been the case since Windows 95.

When DOS was around it was very easy to get access to hardware: you could call software interrupts that DOS installed, you could directly read and write to hardware ports, and you could read and write to memory in a free manner.

So yeah, in terms of suggestions, look at the DDK for Windows, or download the Linux kernel source code, or get a library that has software and hardware interrupts, hardware ports, and memory associated with hardware.

Again I want to point out, this is not a trivial thing and can take a little while to get used to, depending on the software (Windows, DOS, Linux, etc.) and hardware involved.
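
As a concrete sketch of the "reads some area of memory that the hardware uses" idea, here is the pattern used in embedded firmware and device drivers, written in C. The register address and the "ready" bit are made up for illustration; running this as an ordinary user program on a modern protected OS would simply crash, which is exactly the restriction described above.

#include <stdint.h>

#define STATUS_REG_ADDR 0x40001000u   /* hypothetical device register address */

int device_is_ready(void)
{
    /* volatile: the value can change behind the program's back, so the
       compiler must perform a real read of the hardware location each time. */
    volatile uint32_t *status = (volatile uint32_t *)STATUS_REG_ADDR;
    return (*status & 0x1u) != 0;     /* hypothetical "ready" bit */
}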
 
  • #21
pairofstrings said:
I want a book that explains how compilers, after converting source code to machine code, interact with the hardware in a computer, like a microprocessor for example.

I studied a microprocessor course at uni that covered all of that, but I don't have immediate access to the textbook, 'Microprocessors'.

As part of the assessment we had to write a program in assembly language to input and then multiply two two-digit numbers and output the result, all in 256 bytes of memory. The second part involved keying the hex machine code into a Motorola processor card and running the program.

http://en.wikipedia.org/wiki/Assembly_language
 
  • #22
Hi.
  1. Suppose that I have created a piece of software and I am assuming that it will run only on one particular platform. What do I have to do if I want to make it run on different platforms as well?
    Do I have to change the hardware, the compiler, or the programming language, or should I make changes to the existing software code?
  • Also, what if I have restrictions, such as: I must use only the given hardware and make the program work on different platforms. What should I do then?
  • I am restricted to using only so-and-so compiler and not another one, and must make the software work everywhere.
  • I am restricted to using only this particular language and not another one, and must make the software work everywhere.
2. Suppose I have created a piece of software and I know that it is not working as well as it works on other platforms. What changes should I make? Should I change the hardware, the compiler, or the programming language, or should I make changes to the existing software code?
  • Also, what if I must use only the given hardware and make the program work with the same performance. What should I do then?
  • I am restricted to using only so-and-so compiler and not another one, and must make the software work with the same performance.
  • I am restricted to using only this particular language and not another one, and must make the software work with the same performance.
 
  • #23
pairofstrings said:
Hi.
  1. Suppose that I have created a piece of software and I am assuming that it will run only on one particular platform. What do I have to do if I want to make it run on different platforms as well?
    Do I have to change the hardware, the compiler, or the programming language, or should I make changes to the existing software code?

The difference between platforms will be the code that is output. If the platform is different, chances are the output is going to be different. Remember, the code is represented as a sequence of bits in memory, and depending on the platform, the data will be treated differently.

If your platform changes, you have to change the output to a format that the new platform will be able to read and use.

If the architecture is significantly different in how it runs code, then that is also another issue. In this case you will probably have to implement major changes to the compiler, or you could use bootstrapping to create a compiler that converts one format to another.

Have a look at machine code by doing a google search.

Bootstrapping is probably a good idea to investigate.

  • Also, what if I have restrictions, such as: I must use only the given hardware and make the program work on different platforms. What should I do then?
  • I am restricted to using only so-and-so compiler and not another one, and must make the software work everywhere.
  • I am restricted to using only this particular language and not another one, and must make the software work everywhere.
2. Suppose I have created a piece of software and I know that it is not working as well as it works on other platforms. What changes should I make? Should I change the hardware, the compiler, or the programming language, or should I make changes to the existing software code?
  • Also, what if I must use only the given hardware and make the program work with the same performance. What should I do then?
  • I am restricted to using only so-and-so compiler and not another one, and must make the software work with the same performance.
  • I am restricted to using only this particular language and not another one, and must make the software work with the same performance.

Usually what happens is someone creates a language (like C/C++) and then, if it's useful, it becomes standardized. The standardization creates rules that implementations must follow, and compilers are created for the different platforms adhering to those rules.

The idea is that a compiler is created for each platform, but it works exactly the same way for each platform. Each implementation is adapted for that particular architecture and the output is generated for that particular architecture.

So in terms of language, it helps if it becomes standardized first where the intended behaviour of the program is thoroughly examined and specific rules are drafted that help establish all of the specifics that are required to create a decent standard.

It might help you if you look at some of the standards (ISO standards) for things like C++, and documents relating to .NET and Java.
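
A small sketch of how this looks from the programmer's side (my own trivial example): the standardized part of the language compiles unchanged on every platform, and the few genuinely platform-specific bits are isolated, for instance with predefined macros such as _WIN32 and __linux__ (the exact set of macros depends on the compiler).

#include <stdio.h>

int main(void)
{
    int total = 2 + 3;   /* plain ISO C: identical meaning on every platform */

#if defined(_WIN32)
    printf("total = %d (built for Windows)\n", total);
#elif defined(__linux__)
    printf("total = %d (built for Linux)\n", total);
#else
    printf("total = %d (built for some other platform)\n", total);
#endif
    return 0;
}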
 
  • #24
chiro said:
The difference between platforms will be the code that is output. If the platform is different, chances are the output is going to be different. Remember, the code is represented as a sequence of bits in memory, and depending on the platform, the data will be treated differently.

If your platform changes, you have to change the output to a format that the new platform will be able to read and use.

If the architecture is significantly different in how it runs code, then that is also another issue. In this case you will probably have to implement major changes to the compiler, or you could use bootstrapping to create a compiler that converts one format to another.

Have a look at machine code by doing a google search.

Bootstrapping is probably a good idea to investigate.
Usually what happens is someone creates a language (like C/C++) and then, if it's useful, it becomes standardized. The standardization creates rules that implementations must follow, and compilers are created for the different platforms adhering to those rules.

The idea is that a compiler is created for each platform, but it works exactly the same way for each platform. Each implementation is adapted for that particular architecture and the output is generated for that particular architecture.

So in terms of language, it helps if it becomes standardized first where the intended behaviour of the program is thoroughly examined and specific rules are drafted that help establish all of the specifics that are required to create a decent standard.

It might help you if you look at some of the standards (ISO standards) for things like C++, and documents relating to .NET and Java.

I think I am missing something here. I thought that when I compile a C/C++ program using a C/C++ compiler, I get a '.obj' file which contains the 'object code', and when this object code is combined with the header files (I think the 'linker' is used here to generate an executable file, linking the 'object code' and header files), an executable file, i.e. a '.exe' file, is created which can be run on another computer system that has the same platform as the system on which the '.exe' file was created. The '.exe' file is given to the microprocessor.

I also learned that when I compile a Java program using a Java compiler, I get a '.class' file which contains the 'bytecode'. This code is given to the JVM, which stores all the objects and methods in the respective memory areas of the JVM architecture, and the Execution Engine (interpreter and JIT compiler) in the JVM architecture converts the 'bytecode' into machine code, which is then given to the microprocessor.

In both cases, the object code and bytecode will be converted into 0's and 1's in the later stages, to be processed by the processor. Right?

My question is:
When I compile C/C++ programs using C/C++ compilers, I get 'object code'.
When I compile Java programs using Java compilers, I get 'bytecode'.
Will I get object code or bytecode when I compile, say, a Lisp program or a program in any other language?
 
Last edited:
  • #25
pairofstrings said:
I thought that when I compile a C/C++ program using a C/C++ compiler, I get a '.obj' file which contains the 'object code', and when this object code is combined with the header files (I think the 'linker' is used here to generate an executable file, linking the 'object code' and header files)

The linker doesn't deal with header ('.h') files, the compiler does. More precisely, the preprocessor does. Header files are simply chunks of source code (text) that the preprocessor effectively "pastes" into your code at the location of the #include directives. The preprocessor then passes the combined source code to the compiler which compiles it.

The linker links your compiled code ('.obj' or '.o' or whatever) with other object code files. Typically an object-code library file is associated with a header file that you have to #include in the source code for your program. The header file contains function prototypes etc. that the compiler uses to ensure that your program is calling those functions correctly. The linker combines the (pre-compiled) object code for those functions with your own object code.
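
Here is a minimal two-file sketch of the compile-then-link sequence described above (file names and commands are examples for a typical GCC setup):

/* add.h:
       int add(int x, int y);

   add.c:
       #include "add.h"
       int add(int x, int y) { return x + y; }

   main.c: */
#include <stdio.h>
#include "add.h"        /* pasted in by the preprocessor: just the prototype */

int main(void)
{
    printf("%d\n", add(2, 3));
    return 0;
}

/* Build in two stages:
       gcc -c add.c                  ->  add.o   (object code for add)
       gcc -c main.c                 ->  main.o  (the call to add is left as an
                                                  unresolved external reference)
       gcc main.o add.o -o program       (the linker resolves that reference)  */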

When I compile C/C++ programs using C/C++ compilers, I get 'object code'.
When I compile Java programs using Java compilers, I get 'bytecode'.
Will I get object code or bytecode when I compile, say, a Lisp program?

My understanding is that Lisp is usually interpreted, not compiled.
 
  • #26
You should also look at Instruction Set Architecture.

http://en.wikipedia.org/wiki/Instruction_set
An instruction set, or instruction set architecture (ISA), is the part of the computer architecture related to programming, including the native data types, instructions, registers, addressing modes, memory architecture, interrupt and exception handling, and external I/O. An ISA includes a specification of the set of opcodes (machine language), and the native commands implemented by a particular processor.

Instruction set architecture is distinguished from the microarchitecture, which is the set of processor design techniques used to implement the instruction set. Computers with different microarchitectures can share a common instruction set. For example, the Intel Pentium and the AMD Athlon implement nearly identical versions of the x86 instruction set, but have radically different internal designs.

Some virtual machines that support bytecode as their ISA (such as Smalltalk, the Java virtual machine, and Microsoft's Common Language Runtime) implement it by translating the bytecode for commonly used code paths into native machine code, and executing less-frequently-used code paths by interpretation; Transmeta implemented the x86 instruction set atop VLIW processors in the same fashion.
 
  • #27
jtbell said:
My understanding is that Lisp is usually interpreted, not compiled.

It's not an "either/or" situation. Efficient list-processing implementations often do compilation "on the fly" - i.e. each "statement" in the code is compiled the first time you try to execute it, rather than compiling everything up front. That preserves the ability for the program to operate on itself (in simple terms, Lisp programs can rewrite their own source code when they are running), but the compiler can optimise the code when it is compiled. For example, list-processing compilers often transform recursion into iteration - recursion is often the most efficient way to describe an algorithm in the source code, but iteration is the most efficient way to run it on a conventional computer.
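
To illustrate that recursion-to-iteration transformation in C terms (a hand-written comparison of my own, not actual compiler output): the two functions below compute the same sum, and a compiler that performs tail-call elimination will typically turn the first into a loop much like the second.

#include <stdio.h>

/* Natural recursive description, written with an accumulator so the
   recursive call is in tail position. */
long sum_recursive(long n, long acc)
{
    if (n == 0)
        return acc;
    return sum_recursive(n - 1, acc + n);   /* tail call */
}

/* What the transformed, iterative version effectively looks like. */
long sum_iterative(long n)
{
    long acc = 0;
    while (n > 0) {
        acc += n;
        n--;
    }
    return acc;
}

int main(void)
{
    printf("%ld %ld\n", sum_recursive(100, 0), sum_iterative(100));   /* 5050 5050 */
    return 0;
}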
 
  • #28
AlephZero said:
It's not an "either/or" situation. Efficient list-processing implementations often do compilation "on the fly" - i.e. each "statement" in the code is compiled the first time you try to execute it, rather than compiling everything up front.
This is a hallmark of interpreted languages - translate a section of code, execute, and repeat.
AlephZero said:
That preserves the ability for the program to operate on itself (in simple terms, Lisp programs can rewrite their own source code when they are running), but the compiler can optimise the code when it is compiled. For example, list-processing compilers often transform recursion into iteration - recursion is often the most efficient way to describe an algorithm in the source code, but iteration is the most efficient way to run it on a conventional computer.
 

FAQ: How are programming languages created?

How are programming languages designed and created?

Programming languages are designed and created by a team of computer scientists, software engineers, and linguists. They first identify the purpose and target audience of the language, then create a set of rules and syntax to define how the language will function. This is followed by writing a compiler or interpreter, which translates the code into machine-readable instructions.

What is the process of developing a programming language?

The process of developing a programming language involves several stages, including conceptualization, design, implementation, and testing. During the conceptualization stage, the purpose and features of the language are defined. In the design stage, the syntax and rules of the language are created. Implementation involves writing the compiler or interpreter, and testing ensures that the language functions as intended.

How long does it take to create a programming language?

The time it takes to create a programming language can vary greatly depending on the complexity and goals of the language. It can take anywhere from several months to several years to fully develop a programming language. Additionally, the process of refining and updating the language may continue even after it is released to the public.

What are the key factors that influence the design of a programming language?

The key factors that influence the design of a programming language include the purpose of the language, its target audience, the existing programming languages and their features, and the desired level of complexity. Additionally, the availability of resources and the potential for future updates also play a role in the design process.

Can anyone create a programming language?

Technically, anyone can create a programming language. However, it requires a deep understanding of computer science and programming concepts, as well as strong skills in software development. It also takes a significant amount of time, effort, and resources to create a functional and useful programming language.
