The first programming languages were created in the earliest days of computing. The first high-level programming language was Plankalkül, designed by Konrad Zuse in the 1940s.

Casper Beyer, A Brief Totally Accurate History Of Programming Languages.

1800

Joseph Marie Jacquard teaches the loom to read punched cards and thus creates the first multi-threaded data processor. His invention was met with hostility by the silk weavers who foresaw the birth of Skynet.

1842

Ada Lovelace got bored with her noble pursuits. She sketched in her notebook what would later become known as the first published computer program... The only inconvenience was that the computer had not yet been invented.

1936

Alan Turing invents everything, but this does not redeem him in the eyes of the British courts, which sentence him to chemical castration.

Later, the Queen pardoned him, but unfortunately he had been dead for a long time by then.

1957

John Backus creates FORTRAN, the first language used by real programmers.

1959

Grace Hopper invents the first enterprise-ready programming language and calls it "common business-oriented language", or COBOL for short.

1964

John George Kemeny and Thomas Kurtz decided that programming was too hard, so they needed to get back to basics. They call their programming language BASIC.

1970

Niklaus Wirth develops Pascal. It is one of several languages Wirth created: he loved creating languages.

He also formulated Wirth's Law, which renders Moore's Law meaningless: developers will write programs so bloated that even mainframes cannot keep up with them. This would later be proven true by the invention of Electron.js.

1972

Dennis Ritchie got bored at work at Bell Labs and decided to create C, with its curly braces, and the project was a huge success. He subsequently added segmentation faults and other developer-friendly features to improve performance.

Since he still had a couple of hours left, he and his friends at Bell Labs decided to create a sample program to demonstrate the C language. So they created an operating system called Unix.

1980

Alan Kay invents object-oriented programming and calls it Smalltalk. In Smalltalk, everything is an object, even an object is an object.

1987

Larry Wall has a religious experience, becomes a preacher, and makes Perl his doctrine.

1983

Jean Ichbiah noticed that Ada Lovelace's program had never run and decided to create a language named after her: Ada. The language doesn't run either.

1986

Brad Cox and Tom Love decided to make an unreadable version of C based on Smalltalk. They called it Objective-C, but no one can understand its syntax.

1983

Bjarne Stroustrup travels back to the future and notices that C takes too little time to compile. He adds every feature he can think of to the language and calls it C++.

Programmers all over the world agree with this because they have a brilliant excuse to watch cat videos and read xkcd at work.

1991

Guido van Rossum doesn't like curly braces, so he invents Python. When choosing the language's syntax, he was inspired by Monty Python's Flying Circus.

1993

Roberto Ierusalimschy and his friends decided that they needed a local, Brazilian scripting language. During localization a mistake was made that caused indices to start counting from 1 instead of 0. They called the language Lua.

1994

Rasmus Lerdorf creates a templating engine for his own CGI scripts for his home page and publishes its dot files on the Internet.

The world decides to use these dot files all over the place, and Rasmus furiously adds some additional database bindings to them and calls the result PHP.

1995

Yukihiro Matsumoto is not very happy and notices that other programmers are unhappy too. He creates Ruby to make them happy. After Ruby is created, "Matz" is happy, the Ruby community is happy, everyone is happy.

1995

Brendan Eich takes the day off to develop a language that will power every web browser in the world and, ultimately, Skynet. He first went to Netscape and said the language was called LiveScript, but Java gained popularity during the code review, so it was decided to use curly braces and rename the language to JavaScript.

It turned out that Java is a trademark, which could lead to problems, so JavaScript was later renamed ECMAScript, though everyone still calls it JavaScript.

1996

James Gosling invents Java, the first truly overly verbose object-oriented programming language where design patterns prevail over pragmatism.

2001

Anders Hejlsberg reinvents Java and calls it C#, because programming in C seems cooler than programming in Java. Everyone likes the new version of Java because it is completely different from Java.

2005

David Heinemeier Hansson creates a web framework and names it Ruby on Rails. People forget the two are different things.

2006

John Resig writes a helper library for JavaScript. Everyone thinks it's a language, and careers are built on copy-pasting jQuery code from the Internet.

2009

Ken Thompson and Rob Pike decide to create a language similar to C, but more marketable, with more safety features and with a gopher as its mascot.

They call the language Go, make it open source, and start selling branded knee pads and helmets featuring the gopher.

2010

Graydon Hoare also wants to create a language like C. He calls it Rust. Everyone demands that every piece of their programs be immediately rewritten in Rust. Graydon wants something shinier and starts working on Swift for Apple.

2012

Anders Hejlsberg wants to write C# in web browsers. He creates TypeScript, which is essentially JavaScript with a lot of Java in it.

2013

Jeremy Ashkenas wants to be as happy as Ruby developers, so he creates CoffeeScript, which translates to JavaScript but looks more like Ruby. Jeremy never became as truly happy as Matz and the Ruby developers.

2014

Chris Lattner creates Swift with the main goal of it not being Objective-C. In the end, the language resembles Java.

The first high-level programming language to be implemented was FORTRAN (FORmula TRANslator). It was created by a group of programmers at IBM between 1954 and 1957. Commercial distribution of Fortran began a few years after its creation; before that, programming was done either in machine code or in symbolic assemblers.

Fortran first became widespread in scientific and engineering circles, where it was used for numerical computation.

One of the main advantages of today's Fortran is the huge number of programs and subroutine libraries written in it. Among the thousands of packages in this language are packages for solving complex integral equations, performing matrix multiplication, and so on. These packages have been created over many decades and have not lost their relevance to this day. Most of these libraries are well documented, thoroughly debugged, and highly efficient, and their Fortran code is routinely converted automatically for use in modern software.

Fortran implementation history

When the new language appeared, the computing community was skeptical of Fortran. Few believed that it would make programming faster and more efficient. Over time, however, scientists came to appreciate the language's capabilities and began actively using it to write computation-intensive software. Fortran proved especially suitable for technical applications, helped greatly by its comprehensive set of data types.

Modern Fortran has been supplemented with capabilities that make it possible to take advantage of new software technologies and to program for modern computing architectures.

After Fortran's overwhelming success, European companies began to fear that IBM would dominate the computer industry. The American and German computing societies created their own committees for the development of a universal programming language, and these later merged into one. Its specialists developed a new language called the International Algorithmic Language (IAL), but since "ALGOrithmic Language" quickly became the common name for the novelty, the committee changed the official name to ALGOL.

In the 1950s, with the advent of vacuum-tube computers, the rapid development of programming languages began. Computers, which at the time cost significantly more than the development of any program, required highly efficient code, which was written by hand in assembly language. In the mid-1950s, the algorithmic programming language FORTRAN was developed at IBM under the leadership of John Backus. Although systems that converted arithmetic expressions into machine code already existed, the creation of FORTRAN (FORmula TRANslator), which made it possible to write a computation algorithm using conditional and I/O statements, became the starting point of the era of algorithmic programming languages.

FORTRAN was required to produce highly efficient code, so many of its constructs were originally designed with the architecture of the IBM 704 in mind. The success of the language led manufacturers of other computer systems to create their own versions of translators. In an attempt at unification, FORTRAN IV became the basis of the first standard, adopted in 1966 and called FORTRAN 66.

ALGOL (ALGOrithmic Language) was developed in the late 1950s as an alternative to FORTRAN, which was originally oriented towards the IBM architecture; the ALGOL 60 report was edited by Peter Naur. The main goal pursued by the developers of this language was independence from the specific architecture of the computing system. In addition, the creators of ALGOL sought a language convenient for describing algorithms, using a notation close to that of mathematics.

The FORTRAN and ALGOL languages were the first languages to focus on computational programming.

The PL/I language, whose first versions appeared in the early 1960s, was originally targeted at the IBM 360 and extended FORTRAN's capabilities with some features of the COBOL language developed in the same years. Despite a certain popularity among programmers who worked on IBM computers and ES-series machines, PL/I is today of purely theoretical interest.

In the late 1960s, Simula-67 was developed under the leadership of Nygaard and Dahl, using the concept of user-defined data types. It was, in effect, the first language to use the concept of classes.

In the early 1970s, Wirth proposed the Pascal language, which immediately came into wide use. At about the same time, at the initiative of the US Department of Defense, work began on a high-level language called Ada, in honor of Ada Lovelace, a programmer and the daughter of Lord Byron. The creation of the language began with the definition of requirements and the development of specifications. Four independent teams worked on the project, but all of them used Pascal as a basis. In the early 1980s, the first industrial Ada compiler was developed.

The general-purpose programming language C was developed in the early 1970s by Dennis Ritchie and Ken Thompson. It became a popular systems programming language and was used to write the kernel of the UNIX operating system. An ANSI working group developed the C standard in the 1980s, and the international standard for the C language was adopted in 1990. The C language formed the basis for the development of the C++ and Java programming languages.

Alongside algorithmic languages, languages intended for processing business information, as well as artificial-intelligence languages, developed in parallel. The former include COBOL (COmmon Business Oriented Language); the latter include LISP (LISt Processing) and Prolog. LISP, developed in the 1960s under the leadership of J. McCarthy, was the first functional list-processing language and found wide use in game theory.

With the advent of personal computers, languages became integral parts of integrated development environments. Languages embedded in various office programs appeared, such as VBA (Visual Basic for Applications).

In the 1990s, with the spread of the Internet, the possibilities of distributed data processing expanded, and this was reflected in the development of programming languages. Languages focused on building server-side applications emerged, such as Java, Perl and PHP, along with document description languages: HTML and XML. The traditional programming languages C++ and Pascal also underwent changes: a programming language came to mean not only the functionality of the language itself but also the class libraries provided by the programming environment. The emphasis shifted from the specification of the programming languages themselves to the standardization of mechanisms for the interaction of distributed applications. New technologies appeared, COM and CORBA, specifying the interaction of distributed objects.

Let me single out a general trend in the development of programming languages; the astute reader has probably guessed it long ago. Languages are evolving towards ever greater abstraction, and this is accompanied by a drop in efficiency. Is the abstraction worth it? Yes: raising the level of abstraction raises the reliability of programming. Poor efficiency can be fought by building faster computers; if memory requirements are too high, memory can be added. That takes time and money, but it is solvable. Errors in programs, however, can be dealt with in only one way: they must be corrected. Better still, not made at all. Better still, made as difficult as possible to make. This is exactly what research in programming languages is aimed at, and the loss of efficiency is the price to be paid.

The purpose of this review was to give the reader an idea of the whole variety of existing programming languages. Programmers often hold an opinion about the "general applicability" of a particular language (C, C++, Pascal, etc.). This opinion arises for several reasons: lack of information, habit, inertia of thinking. I have tried to offset the first factor slightly. As for the rest, I can only say that a real professional should constantly strive to improve his professional qualifications, and for this one must not be afraid to experiment. So what if everyone around writes in C / C++ / VB / Pascal / Perl / Java / ... (underline as necessary)? Why not try something new? What if it turns out to be more effective? Of course, before deciding to use a new language, you need to study all its features carefully, including the availability of an efficient implementation, the ability to interact with existing modules, and so on, and only then make a decision. There is always a risk of taking the wrong path, but only those who do nothing make no mistakes.

And one more thing. I have heard, and sometimes participated in, discussions like "language A is better than language B". I hope that after reading this review, many will be convinced of the pointlessness of such disputes. The most that can be discussed is the advantages of one language over another for solving a particular problem under particular conditions. There, indeed, there is sometimes something to argue about, and the answer is sometimes far from obvious. To argue "in general", however, is plainly pointless.

This article is also intended as a response to those who shout "language X MUST DIE". I hope the answer has turned out adequate and convincing, and that the article has cognitive as well as polemical value.

It is very important to know the general history of programming languages, both famous and obscure. This article will introduce you to it, but first let us recall what a programming language is.

A programming language is a system of notation and rules that makes it possible to write a program for solving a problem as sequential text in a form convenient for a human.

50s

In the 1950s, with the advent of vacuum-tube computers, the rapid development of programming languages began. Programming began with writing programs directly in machine instructions (in "codes", as programmers say). Computers, which at the time cost significantly more than the development of any program, required highly efficient code.

To make coding easier, a machine-oriented language, Assembler, was developed, which made it possible to write machine instructions in symbolic form. Assembly language depended on the instruction set of the particular computer. It was convenient enough for programming small tasks that required maximum execution speed.

However, large projects were difficult to develop in assembly language. The main problem was that a program written in Assembler was tied to the architecture of a particular computer and could not be ported to other machines. Whenever the computer was upgraded, all Assembler programs had to be rewritten.

High-level languages, i.e. languages that do not depend on a specific architecture, appeared almost immediately after the advent of computers. To execute a program in a high-level language, it must first be translated into machine instructions. The special program that performs this translation is called a translator, or compiler.

The translated program is then executed directly by the computer. It is also possible to translate a program into an intermediate language that does not depend on the architecture of a particular computer but is nevertheless as close as possible to machine instructions.

The intermediate-language program is then executed by a special program called an interpreter. On-the-fly (just-in-time) compilation is also possible, in which each executable fragment of the program is translated from the intermediate language into machine instructions immediately before execution.
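To make the idea of an interpreter executing an intermediate language concrete, here is a minimal sketch in C. The "intermediate language" is a toy stack machine whose instruction set (OP_PUSH, OP_ADD, OP_MUL, OP_HALT) is invented purely for illustration and corresponds to no real system:

```c
#include <assert.h>
#include <stdio.h>

/* A toy intermediate language: a stack machine with four opcodes. */
typedef enum { OP_PUSH, OP_ADD, OP_MUL, OP_HALT } OpCode;

typedef struct {
    OpCode op;
    int arg; /* used only by OP_PUSH */
} Instr;

/* The interpreter: executes intermediate-language instructions one by one. */
int interpret(const Instr *code) {
    int stack[64];
    int sp = 0; /* stack pointer */
    for (int pc = 0; ; pc++) {
        switch (code[pc].op) {
        case OP_PUSH: stack[sp++] = code[pc].arg; break;
        case OP_ADD:  sp--; stack[sp - 1] += stack[sp]; break;
        case OP_MUL:  sp--; stack[sp - 1] *= stack[sp]; break;
        case OP_HALT: return stack[sp - 1];
        }
    }
}

int main(void) {
    /* The expression (2 + 3) * 4, "compiled" into stack-machine code. */
    Instr program[] = {
        {OP_PUSH, 2}, {OP_PUSH, 3}, {OP_ADD, 0},
        {OP_PUSH, 4}, {OP_MUL, 0}, {OP_HALT, 0}
    };
    assert(interpret(program) == 20);
    printf("%d\n", interpret(program));
    return 0;
}
```

Real intermediate languages such as JVM bytecode or .NET CIL work on the same principle: a portable instruction stream that is either interpreted or translated to machine code just before execution.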

In the mid-1950s, the high-level algorithmic programming language FORTRAN was developed at IBM under the leadership of John Backus. Although systems that converted arithmetic expressions into machine code already existed, the creation of FORTRAN (FORmula TRANslator), which made it possible to write a computation algorithm using conditional and I/O statements, became the starting point of the era of high-level programming languages.

ALGOL (ALGOrithmic Language) was developed in the late 1950s as an alternative to FORTRAN, which was originally oriented towards the IBM architecture; the ALGOL 60 report was edited by Peter Naur. The main goal pursued by the developers of this language was independence from the specific architecture of the computing system.

In addition, the creators of ALGOL sought a language convenient for describing algorithms, using a notation close to that of mathematics. The FORTRAN and ALGOL languages were the first languages to focus on computational programming.

60s

In the late 1960s, Simula-67 was developed under the leadership of Nygaard and Dahl, using the concept of user-defined data types. It was, in effect, the first language to use the concept of classes.

70s

In the early 1970s, Wirth proposed the Pascal language, which immediately came into wide use. At about the same time, at the initiative of the US Department of Defense, work began on a high-level language called Ada, in honor of Ada Lovelace, a programmer and the daughter of Lord Byron.

The creation of the language began with the definition of requirements and the development of specifications. Four independent teams worked on the project, but they all used Pascal as the basis. In the early 1980s, the first industrial compiler for the Ada language was developed.

Development of C

The general-purpose programming language C was developed in the early 1970s by Dennis Ritchie and Ken Thompson. It became a popular systems programming language and was used to write the kernel of the UNIX operating system.

An ANSI working group developed the C standard in the 1980s, and the international standard for the C language was adopted in 1990. The C language formed the basis for the development of the C++ and Java programming languages.

The C language made it possible to truly get rid of Assembler when creating operating systems. For example, almost all of the UNIX operating system is written in C and thus does not depend on a particular computer.

The main advantage of C is its simplicity and the absence of overengineered constructs. The mechanism of passing parameters to a function (by value only) is simple and clearly described. A programmer writing a C program always has a clear understanding of how it will be executed.

C's concepts of pointers and of static and automatic (stack) variables reflect the structure of any modern computer as closely as possible, so C programs are efficient and easy to debug.
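The two points above, pass-by-value parameters and pointers that expose the machine's addressing directly, can be shown in a few lines of C (the function names are invented for this example):

```c
#include <assert.h>
#include <stdio.h>

/* Parameters are passed by value: the function receives a copy of x. */
void set_to_ten_by_value(int x) {
    x = 10; /* modifies only the local copy */
}

/* To let a function modify the caller's variable, pass a pointer to it. */
void set_to_ten_by_pointer(int *x) {
    *x = 10; /* writes through the address, changing the caller's variable */
}

int main(void) {
    int a = 1;
    set_to_ten_by_value(a);
    assert(a == 1);     /* unchanged: only the copy was modified */
    set_to_ten_by_pointer(&a);
    assert(a == 10);    /* changed through the pointer */
    printf("a = %d\n", a);
    return 0;
}
```

There is no hidden pass-by-reference mode to reason about: whenever a function can change the caller's data, an explicit `&` at the call site says so.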

Currently, the overwhelming majority of programs are written in C and C++. Any operating system interface (the so-called API, Application Program Interface), i.e. the set of system calls intended for developers of application programs, is as a rule a set of C functions.

Along with algorithmic languages, languages intended for processing business information, as well as artificial-intelligence languages, developed in parallel. The former include COBOL (COmmon Business Oriented Language); the latter include LISP (LISt Processing) and Prolog.

LISP, developed in the 1960s under the leadership of J. McCarthy, was the first functional list-processing language to be widely used in game theory.

90s

In the 1990s, with the spread of the Internet, the possibilities of distributed data processing expanded, which was reflected in the development of programming languages. Languages focused on creating server-side applications appeared, such as Java, Perl and PHP, along with document description languages: HTML and XML.

The traditional programming languages C++ and Pascal also underwent changes: a programming language came to mean not only the functionality of the language itself but also the class libraries provided by the programming environment.

The emphasis from the specification of the programming languages \u200b\u200bthemselves was transferred to the standardization of mechanisms for the interaction of distributed applications. New technologies have appeared - COM and CORBA, which specify the interaction of distributed objects.

Scopes of programming languages

Currently, programming languages are used in a wide variety of areas of human activity, such as:

  • scientific computing (C++, FORTRAN, Java);
  • system programming (C++, Java);
  • information processing (C++, COBOL, Java);
  • artificial intelligence (LISP, Prolog);
  • publishing (PostScript, TeX);
  • remote information processing (Perl, PHP, Java, C++);
  • description of documents (HTML, XML).

Looking at the history of programming languages, we can say that over time some languages have evolved, acquired new features and remained in demand, while others have lost their relevance and today are at best of purely theoretical interest.