
Languages



Context-free grammar

A formal grammar is considered "context-free" when its production rules can be applied regardless of the context of a nonterminal: no matter which symbols surround it, the single nonterminal on the left-hand side can always be replaced by the right-hand side. Every production rule has the form V → w, where V is a single nonterminal symbol and w is a string of terminals and/or nonterminals (w may be empty). Context-free grammars arise in linguistics, where they are used to describe the structure of sentences and words in natural language; they were in fact invented by the linguist Noam Chomsky for this purpose, though they have not really lived up to that original expectation. In computer science, by contrast, their use grew as recursively defined concepts became more common. In linguistics, some authors use the term phrase structure grammar to refer to context-free grammars, whereby phrase structure grammars are distinct from dependency grammars.
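
To make the production-rule notation concrete, here is a small illustrative sketch in Python (not part of the excerpted article): the grammar S → ( S ) S | ε generates balanced parentheses, and a short recursive-descent recognizer applies its two rules. The function and variable names are invented for the example.

```python
# Illustrative sketch only: the grammar S -> "(" S ")" S | ε for balanced
# parentheses, checked by a tiny recursive-descent recognizer. The nonterminal
# S is rewritten the same way wherever it appears, which is exactly the
# "context-free" property described above.

def matches_balanced(text: str) -> bool:
    """Return True if `text` is derivable from S in the grammar above."""

    def parse_S(i):
        # Production S -> "(" S ")" S
        if i < len(text) and text[i] == "(":
            j = parse_S(i + 1)                  # derive the inner S
            if j is None or j >= len(text) or text[j] != ")":
                return None                     # the "(" is never closed
            return parse_S(j + 1)               # derive the trailing S
        # Production S -> ε: rewrite S to the empty string
        return i

    return parse_S(0) == len(text)

if __name__ == "__main__":
    for word in ["", "()", "(())()", "(()", ")("]:
        print(repr(word), matches_balanced(word))
```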

Regular language

In theoretical computer science and formal language theory, a regular language is a formal language that can be expressed using a regular expression. (Note that the "regular expression" features provided with many programming languages are augmented with features that make them capable of recognizing languages that cannot be expressed by the formal regular expressions defined below.) Alternatively, a regular language can be defined as a language recognized by a finite automaton. In the Chomsky hierarchy, regular languages are the languages generated by Type-3 grammars (regular grammars). Regular languages are very useful in input parsing and programming language design.

Formal definition: the collection of regular languages over an alphabet Σ is defined recursively as follows: the empty language Ø is regular; for each symbol a in Σ, the singleton language {a} is regular; and if A and B are regular languages, then their union A ∪ B, their concatenation AB, and the Kleene star A* are regular. No other languages over Σ are regular. See regular expression for its syntax and semantics.

Examples: all finite languages are regular; in particular, the empty-string language {ε} = Ø* is regular.
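
As a concrete illustration (a sketch added here, not taken from the excerpt), the set of binary strings containing an even number of 0s is a regular language: it is accepted by a two-state finite automaton and, equivalently, matched by the regular expression 1*(01*01*)*. The Python below checks the two characterizations against each other; the names are invented for the example.

```python
import re

# Illustrative sketch only: one regular language defined two equivalent ways,
# by a deterministic finite automaton and by a regular expression.

# DFA over the alphabet {0, 1}: the state records whether the number of
# 0s seen so far is even or odd.
TRANSITIONS = {
    ("even", "0"): "odd",
    ("even", "1"): "even",
    ("odd", "0"): "even",
    ("odd", "1"): "odd",
}

def dfa_accepts(word: str) -> bool:
    state = "even"                      # start state
    for symbol in word:
        state = TRANSITIONS[(state, symbol)]
    return state == "even"              # the only accepting state

# Equivalent formal regular expression for "even number of 0s".
EVEN_ZEROS = re.compile(r"1*(01*01*)*\Z")

if __name__ == "__main__":
    for word in ["", "11", "010", "0", "0110"]:
        by_dfa = dfa_accepts(word)
        by_regex = bool(EVEN_ZEROS.match(word))
        assert by_dfa == by_regex       # both definitions agree
        print(repr(word), by_dfa)
```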

A Brief, Incomplete, and Mostly Wrong History of Programming Languages

1801 - Joseph Marie Jacquard uses punch cards to instruct a loom to weave "hello, world" into a tapestry. Redditers of the time are not impressed due to the lack of tail call recursion, concurrency, or proper capitalization.
1842 - Ada Lovelace writes the first program. She is hampered in her efforts by the minor inconvenience that she doesn't have any actual computers to run her code. Enterprise architects will later relearn her techniques in order to program in UML.
1936 - Alan Turing invents every programming language that will ever be but is shanghaied by British Intelligence to be 007 before he can patent them.
1936 - Alonzo Church also invents every language that will ever be but does it better.
1940s - Various "computers" are "programmed" using direct wiring and switches.
1957 - John Backus and IBM create FORTRAN.
1958 - John McCarthy and Paul Graham invent LISP.
1959 - After losing a bet with L. Ron Hubbard, Grace Hopper and several others invent COBOL.
1965 - Kemeny and Kurtz go to 1964.
1996 - James Gosling invents Java.

Speedcoding

Speedcoding or Speedcode was the first higher-level language created for an IBM computer.[1] The language was developed by John Backus in 1953 for the IBM 701 to support computation with floating-point numbers.[2] The idea arose from the difficulty of programming the IBM SSEC machine when Backus was hired to calculate astronomical positions in early 1950.[3] The Speedcoding system was an interpreter and focused on ease of use at the expense of system resources. It provided pseudo-instructions for common mathematical functions: logarithms, exponentiation, and trigonometric operations. The resident software analyzed the pseudo-instructions one by one and called the appropriate subroutine. Speedcoding was also the first implementation of decimal input/output operations.
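
Purely to illustrate the dispatch scheme described above, in which an interpreter reads pseudo-instructions one at a time and calls the matching subroutine, here is a toy Python sketch. The opcodes, instruction format, and register names are invented for the example and are not actual Speedcoding.

```python
import math

# Toy interpreter sketch: a resident dispatch table maps each pseudo-instruction
# to the subroutine that performs the floating-point work. Invented for
# illustration; these are not real Speedcoding instructions.
SUBROUTINES = {
    "ADD": lambda a, b: a + b,
    "MUL": lambda a, b: a * b,
    "LOG": lambda a, _: math.log(a),
    "EXP": lambda a, _: math.exp(a),
    "SIN": lambda a, _: math.sin(a),
}

def run(program, registers):
    """Execute pseudo-instructions of the form (opcode, src1, src2, dest)."""
    for opcode, src1, src2, dest in program:
        subroutine = SUBROUTINES[opcode]            # analyze, then dispatch
        registers[dest] = subroutine(registers[src1], registers.get(src2, 0.0))
    return registers

if __name__ == "__main__":
    registers = {"r0": 2.0, "r1": 3.0}
    program = [
        ("MUL", "r0", "r1", "r2"),   # r2 = r0 * r1 = 6.0
        ("LOG", "r2", None, "r3"),   # r3 = ln(r2)
    ]
    print(run(program, registers))
```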

Fortran

Fortran (previously FORTRAN, derived from Formula Translating System) is a general-purpose, imperative programming language that is especially suited to numeric computation and scientific computing. Originally developed by IBM in New York City[1] in the 1950s for scientific and engineering applications, Fortran came to dominate this area of programming early on and has been in continuous use for over half a century in computationally intensive areas such as numerical weather prediction, finite element analysis, computational fluid dynamics, computational physics, and computational chemistry. It is one of the most popular languages in the area of high-performance computing[2] and is the language used for programs that benchmark and rank the world's fastest supercomputers.

Fortran encompasses a lineage of versions, each of which evolved to add extensions to the language while usually retaining compatibility with previous versions. In late 1953, John W. Backus submitted a proposal to his superiors at IBM to develop a more practical alternative to assembly language for programming their IBM 704 mainframe computer.