
C Programming (Bourne Shell)


Yacc

Yacc produces only a parser (phrase analyzer); for full syntactic analysis it requires an external lexical analyzer to perform the first tokenization stage (word analysis), which is then followed by the parsing stage proper.[5] Lexical analyzer generators, such as Lex or Flex, are widely available. The IEEE POSIX P1003.2 standard defines the functionality and requirements for both Lex and Yacc. Some versions of AT&T Yacc have become open source.
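As a sketch of this division of labor, the grammar below defines a parser whose generated yyparse() obtains every token by calling yylex(), a function Yacc itself never produces. The token name NUMBER, the file name parser.y and the trivial rule are illustrative assumptions, and the stand-in yylex() at the bottom would normally be replaced by a Lex- or Flex-generated scanner.

/* parser.y -- minimal sketch: yyparse() pulls tokens by calling yylex(),
 * which Yacc does not generate; it must come from a separate lexical
 * analyzer (e.g. one produced by Lex or Flex) or, as here, be hand-written
 * so the demo is self-contained. */
%{
#include <stdio.h>
#include <ctype.h>

int yylex(void);
int yyparse(void);
void yyerror(const char *msg) { fprintf(stderr, "%s\n", msg); }
%}

%token NUMBER

%%
input : /* empty */
      | input NUMBER        { printf("parser saw a NUMBER token\n"); }
      ;
%%

/* Stand-in lexical analyzer: returns NUMBER for each digit, 0 at end of input. */
int yylex(void)
{
    int c = getchar();
    while (c == ' ' || c == '\t' || c == '\n')
        c = getchar();
    if (c == EOF)
        return 0;
    if (isdigit(c))
        return NUMBER;
    return c;               /* other characters become single-character tokens */
}

int main(void) { return yyparse(); }

A typical build (names assumed) is yacc parser.y && cc y.tab.c -o parser; with a Flex-generated scanner it becomes lex scanner.l && yacc parser.y && cc y.tab.c lex.yy.c -o parser.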

strip (Unix)

In Unix and Unix-like operating systems, the strip program removes unnecessary information from executable binary programs and object files, potentially resulting in better performance and sometimes significantly less disk space usage. This information may consist of debugging and symbol information; however, the standard leaves the scope of changes up to the implementer. The effect of strip can also be achieved directly by the compiler; in the GNU Compiler Collection, for instance, the option is "-s". The GNU Project ships an implementation of strip as part of the GNU Binutils package. strip has been ported to other operating systems including Microsoft Windows. "strip", The Single UNIX Specification, Version 2, The Open Group, 1997.
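A brief usage sketch (file names assumed): build a program with debugging information, strip it afterwards, or request the same effect from the compiler with -s.

cc -g -o demo demo.c     # build with debugging and symbol information
ls -l demo               # note the size of the unstripped binary
strip demo               # remove symbol and debugging information in place
ls -l demo               # the stripped binary is typically much smaller

cc -s -o demo demo.c     # same effect requested directly at link time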

strings (Unix)

Strings are recognized by looking for sequences of at least 4 (by default) printable characters terminating in a NUL character (that is, null-terminated strings). Some implementations provide options for determining what is recognized as a printable character, which is useful for finding non-ASCII and wide-character text. strings is part of the GNU Binary Utilities (binutils) and has been ported to other operating systems including Microsoft Windows.[2] Using strings to print sequences of characters that are at least 8 characters long (this command prints the system's BIOS information and should be run as root):

dd if=/dev/mem bs=1k skip=768 count=256 2>/dev/null | strings -n 8 | less

nm (Unix)

The GNU Project ships an implementation of nm as part of the GNU Binutils package.

/*
 * File name: test.c
 * For C code compile with:
 *     gcc -c test.c
 *
 * For C++ code compile with:
 *     g++ -c test.cpp
 */

int global_var;
int global_var_init = 26;

static int static_var;
static int static_var_init = 25;

static int static_function()
{
    return 0;
}

int global_function(int p)
{
    static int local_static_var;
    static int local_static_var_init = 5;

    local_static_var = p;
    return local_static_var_init + local_static_var;
}

int global_function2()
{
    int x;
    int y;
    return x + y;
}

#ifdef __cplusplus
extern "C"
#endif
void non_mangled_function()
{
    // I do nothing
}

int main(void)
{
    global_var = 1;
    static_var = 2;

    return 0;
}

If the previous code is compiled with the gcc C compiler, the output of the nm command differs from the output produced when the C++ compiler is used; the differences between the two also show an example of solving the name mangling problem by using extern "C" in C++ code.
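A hedged sketch of how such listings are obtained and read (exact addresses, ordering and the handling of uninitialized globals vary by toolchain):

gcc -c test.c && nm test.o      # symbols keep their plain C names
g++ -c test.cpp && nm test.o    # C++ names are mangled, except non_mangled_function

Each line of nm output shows the symbol value, a one-letter type code and the symbol name. Upper-case codes mark global (external) symbols and lower-case codes mark local ones: T/t for the text (code) section, D/d for initialized data, B/b for uninitialized (BSS) data, C for common symbols and U for undefined symbols. In listings for test.o the static functions and variables therefore carry lower-case codes and the global ones upper-case codes, and under the C++ compiler the function names, apart from main and non_mangled_function, appear in mangled form (for example _Z15global_functioni for global_function on an Itanium-ABI toolchain).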

Lex (software)

Lex is a computer program that generates lexical analyzers ("scanners" or "lexers").[1][2] Lex is commonly used with the yacc parser generator. Lex, originally written by Mike Lesk and Eric Schmidt[3] and described in 1975,[4][5] is the standard lexical analyzer generator on many Unix systems, and an equivalent tool is specified as part of the POSIX standard. Though originally distributed as proprietary software, some versions of Lex are now open source. Open-source versions of Lex, based on the original AT&T code, are now distributed with open-source systems such as OpenSolaris and Plan 9 from Bell Labs.

The structure of a Lex file is intentionally similar to that of a yacc file; files are divided into three sections, separated by lines that contain only two percent signs, as follows:

Definition section
%%
Rules section
%%
C code section
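The following is a minimal sketch of an example Lex file for the flex version of Lex; the digit-recognizing rule and the message text are illustrative assumptions, not a quotation of any particular example.

/*** Definition section ***/
%{
/* C code here is copied verbatim into the generated scanner. */
#include <stdio.h>
%}
%option noyywrap        /* scan a single input stream; no yywrap() needed */

%%
    /*** Rules section ***/
[0-9]+      { printf("Saw an integer: %s\n", yytext); /* yytext is the matched text */ }
.|\n        { /* ignore every other character */ }
%%

/*** C code section ***/
int main(void)
{
    yylex();            /* run the scanner until end of input */
    return 0;
}

Built with flex and cc (for instance flex example.l && cc lex.yy.c -o scanner, names assumed), this sketch prints Saw an integer: 123 when given input such as Abc123z.!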

Ctags

The original Ctags was introduced in BSD Unix and was written by Ken Arnold, with Fortran support by Jim Kleckner and Pascal support by Bill Joy.

Editors that support ctags

Tag index files are supported by many source code editors.

Variants of ctags

There are a few variations of the ctags program. Etags is the ctags utility that comes with Emacs. Exuberant Ctags includes support for over 40 programming languages, with the ability to add support for even more using regular expressions.

GNU cflow

History

GNU cflow was initially an implementation of the cflow UNIX utility.

C99

Cover of the C99 standards document

History

After ANSI produced the official standard for the C programming language in 1989, which became an international standard in 1990, the C language specification remained relatively static for some time, while C++ continued to evolve, largely during its own standardization effort. Normative Amendment 1 created a new standard for C in 1995, but only to correct some details of the 1989 standard and to add more extensive support for international character sets. The standard underwent further revision in the late 1990s, leading to the publication of ISO/IEC 9899:1999 in 1999, which was adopted as an ANSI standard in May 2000. The language defined by that version of the standard is commonly referred to as "C99".

Design

C99 is, for the most part, backward compatible with C89, but it is stricter in some ways.[3] In particular, a declaration that lacks a type specifier no longer has int implicitly assumed.
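A small illustrative snippet of that stricter rule (the file name and identifiers are assumptions, not taken from the standard):

/* implicit_int.c -- illustrative only. C89 allowed declarations with no type
 * specifier and assumed int; C99 removed "implicit int", so a conforming C99
 * compiler must diagnose both declarations below. */

static counter = 0;        /* C89: implicitly 'static int counter'; C99: error */

increment()                /* C89: implicitly 'int increment()';    C99: error */
{
    return ++counter;
}

int main(void)
{
    return increment();    /* returns 1 under C89 rules; a C99 build needs the two 'int's added */
}

With gcc, for example, a command such as gcc -std=c99 -pedantic-errors implicit_int.c should reject the file, while -std=c89 typically accepts it.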
