
Dissertation Research


Typography

(Image caption: a specimen of roman typefaces by William Caslon.)

Typography is the art and technique of arranging type, designing typefaces, and modifying type glyphs (symbolic figures). In traditional typography, text is composed to create a readable, coherent, and visually satisfying whole that does not distract from the content. The goal of good typography is to balance the relationship of letterforms on a page, in order to aid the reader in understanding the message being conveyed. Typography thus brings harmony between the functional and aesthetic aspects of the written alphabet. It is performed by a variety of professionals, including typesetters, compositors, typographers, graphic artists, art directors, and comic-book artists.

Introduction. The word "typography" comes from the French typographie, which derives from the Greek words τύπος typos ("dent, impression, mark, figure") and γραφία graphia ("writing").

"Typography Exists to Honor Content."

History: woodblock printing (Asia, Europe); movable type (China, Korea, Japan); roman type.

Tools - Why are Apple Macs used so much in the graphic design industry?

Fonts. Mac OS X arguably comes with better fonts out of the box, but people can argue about this. Where it has a clear advantage, though, is management and ease of use. The built-in font chooser on a Mac is leagues ahead of what you get in Windows programs, and the built-in font manager is simple and powerful (for some purposes you still need 3rd-party software, but Windows font management absolutely sucks). Font smoothing is also vastly better on a Mac. Windows font smoothing still looks blurry in a lot of places, and Linux is just best not discussed. If you're rendering text to images or PDF graphics, having it come out looking good is a huge advantage.

UI Consistency. I'm on a Windows box right now, and I have open four applications that all have subtly different UI styles (not including Windows 8 Metro apps) - all of them Microsoft programs.

This also extends to deeper concepts like changing application and system settings. Modifier Keys: this, for me, was always one of the big ones. Power.

The Mac turns 30: a visual history. In addition to everything else, the first Macintosh was funny. On January 24th, 1984, 30 years ago today, Steve Jobs first revealed the computer he'd been talking about so much onstage at the Flint Center at De Anza College in Cupertino, and he let it speak for itself. The 27-year-old Jobs was all but unrecognizable from the turtleneck-wearing, polished presenter he would become. With long black hair, a gray suit that appears too large, and a green bow tie, he looks like a hippie dressed up for a relative's wedding.

As he unzips an odd, cooler-sized bag and pulls out a Macintosh with one hand, he appears less confident than relieved. Even moments before he took the stage, Jobs was panicked, then-CEO John Sculley told CNET: "I'm scared shitless," he told Sculley. "This is the most important moment of my life." Now, with the benefit of 30 years' hindsight, Jobs may have been right. When Steve Jobs introduced the Macintosh in 1984, he wasn't just introducing a computer.

Steve Jobs introduces the Original iMac - Apple Special Event (1998). Apple Confidential 2.0: The Definitive History of the World's Most Colorful ... - Owen W. Linzmayer.

The History of the Apple Macintosh » Mac History. The Apple Macintosh revolutionized the entire computer industry in 1984.

Steve Jobs and his ingenious Macintosh team designed the computer to be usable by the ordinary "person in the street" – and not only by experts. "Insanely great" – Steve Jobs could hardly put into words his enthusiasm at the launch of the Macintosh. At the legendary annual general meeting of January 24th, 1984, in the Flint Center not far from the Apple campus in Cupertino, the Apple co-founder first quoted Bob Dylan's "The Times They Are A-Changin'" and then railed against an imminent dominance of the young computer industry by IBM.

The early 1980s. 1981 – the Apple II has become the world's most popular computer, and Apple has grown into a 300-million-dollar corporation, the fastest-growing company in American business history. With over fifty companies vying for a share, IBM enters the personal computer market in November 1981 with the IBM PC. 1983.

[See also the articles: IBM Archives: The birth of the IBM PC.] Non-IBM personal computers were available as early as the mid-1970s, first as do-it-yourself kits and then as off-the-shelf products. They offered a few applications, but none that justified widespread use. Drawing on its pioneering SCAMP (Special Computer, APL Machine Portable) prototype of 1973, IBM's General Systems Division announced the IBM 5100 Portable Computer in September 1975. Weighing approximately 50 pounds, the 5100 desktop computer was comparable to the IBM 1130 in storage capacity and performance but almost as small and easy to use as an IBM Selectric typewriter. It was followed by similar small computers such as the IBM 5110 and 5120. IBM's own Personal Computer (the IBM 5150) was introduced in August 1981, only a year after corporate executives gave the go-ahead to Bill Lowe, the lab director in the company's Boca Raton, Fla., facilities.

Don Estridge, acting lab director at the time, volunteered to head the project. In sum, the development team broke all the rules.

Microsoft MS-DOS early source code. Software Gems: The Computer History Museum Historical Source Code Series. IBM did something very unusual for their 1981 personal computer: rather than using IBM proprietary components developed for their many other computers, the IBM PC used industry-standard commercial parts. That included adopting the Intel 8088 microprocessor as the heart of the computer.

This "outsourcing" attitude extended to the software as well. Although IBM had prodigious internal software development resources, for the new PC they supported only operating systems that they did not themselves write, like CP/M-86 from Digital Research in Pacific Grove, CA, and the Pascal-based P-System from the University of California, San Diego. But their favored OS was the newly written PC DOS, commissioned by IBM from the five-year-old Seattle-based software company Microsoft. When Microsoft signed the contract with IBM in November 1980, they had no such operating system. The zip file contains four subdirectories. Acknowledgements.

Atanasoff-Berry Computer: Operation/Purpose. The Atanasoff Berry Computer, later named the ABC, was built at Iowa State University from 1939 to 1942 by physics professor Dr. John Vincent Atanasoff and his graduate student, Clifford Berry.

Operation/Purpose. Why did Atanasoff invent the ABC? What did he intend for it to do? Atanasoff was a professor of mathematics and physics, and the 1920s and 30s were a time of active discoveries and new theories for the scientific disciplines, but especially for physics. Atanasoff's Ph.D. work, The Dielectric Constant of Helium, was a study in theoretical physics, published in the Physical Review, Vol. 36(7), in 1930. Atanasoff's work required a great deal of mathematical calculation, which he performed on a Monroe calculator – at the time an advanced calculating machine, but one that still required hours and hours of calculations. Since an expert (human) computer requires about eight hours to solve a full set of eight equations in eight unknowns, k is about 1/64.

History of computers - from the Abacus to the iPhone. By Chris Woodford. Last updated: November 8, 2016.

Computers truly came into their own as great inventions in the last two decades of the 20th century. But their history stretches back more than 2500 years to the abacus: a simple calculator made from beads and wires, which is still used in some parts of the world today. The difference between an ancient abacus and a modern computer seems vast, but the principle – making repeated calculations more quickly than the human brain – is exactly the same. Read on to learn more about the history of computers, or take a look at our article on how computers work.

Photo: One of the world's most powerful computers: NASA's Pleiades ICE supercomputer consists of 112,896 processor cores made from 185 racks of Silicon Graphics (SGI) workstations.

Cogs and Calculators. It is a measure of the brilliance of the abacus, invented in the Middle East circa 500 BC, that it remained the fastest form of calculator until the middle of the 17th century. Bush and the bomb.

Brief History Of Computer. The computer as we know it today had its beginning with a 19th-century English mathematics professor named Charles Babbage. He designed the Analytical Engine, and it is on this design that the basic framework of today's computers is based. Generally speaking, computers can be classified into three generations. Each generation lasted for a certain period of time, and each gave us either a new and improved computer or an improvement to the existing computer. First generation (1937 – 1946): in 1937 the first electronic digital computer was built by Dr. John V. Atanasoff. Second generation (1947 – 1962): this generation of computers used transistors, which were more reliable, instead of vacuum tubes. Third generation (1963 – present): the invention of the integrated circuit brought us the third generation of computers.

As a result of the various improvements in the development of the computer, we have seen computers being used in all areas of life.

Kony 2012 Cover the Night fails to move from the internet to the streets. The Kony 2012 Cover the Night campaign woke up to awkward questions on Saturday after activists failed to blanket cities with posters of the wanted Ugandan warlord, Joseph Kony. The movement's phenomenal success in mobilising young people online, following last month's launch of a 29-minute documentary which went viral, did not translate into real-world action. The campaign aimed to plaster "every city, on every block" around the world with posters, stickers and murals of Kony to pressure governments into hunting down the guerrilla leader, who has waged a brutal, decades-long insurgency in central Africa.

But paltry turnouts on Friday at locations across North America, Europe and Australia left cities largely unplastered and the movement's credibility damaged. "What happened to all the fuss about Kony?" said one typical tweet. "It's just been us the entire day," one activist said on Friday.

Gutenberg's Invention. The printing trade was well established even before Gutenberg's time, using woodblock technology. A sheet of paper was placed on the inked woodblock and an impression taken by rubbing – a complex and time-consuming procedure. The genius of Gutenberg's invention was to split the text into its individual components, such as lower- and upper-case letters, punctuation marks, ligatures and abbreviations, drawing on the traditions of medieval scribes.

These individual items were then cast in quantity as mirror images and assembled to form words, lines and pages. The master for each letter was cut into the face of a steel block, resulting in a precise relief in reverse – the punch. The next step was to create a matrix by placing the punch on a rectangular block made of a softer metal – usually copper – and striking it vertically with a hammer-blow. The resulting matrix was reworked and straightened to form a right-angled cube. (Translation: John Burland.)

Printing History Timeline. The history of printing dates back to the T'ang Dynasty, when the Chinese developed woodblock printing.

Printmaking: Techniques, History, Printmakers. Stencils: another print method is stencil-printing, from which silkscreen printing (serigraphy) is derived. In this process, a design is drawn directly onto the screen, and undrawn areas are sealed with glue or varnish. Oil-based ink is then squeezed through the mesh of the silk screen onto paper. An alternative method of transferring an image to the silkscreen is the use of photo stencils. Andy Warhol (1928-87) popularized these techniques in his multiple portraits of 1960s celebrities. Contemporary printmakers often use a combination of conventional and digital techniques, as well as digital printers and photographic equipment. For details of other graphic arts, such as fine art photography, read about the Greatest Photographers (c.1880-present).

History. Following its invention in China many centuries earlier, fine art printmaking became established during the German Renaissance (1430-1580), in the early period of the Northern Renaissance.

Belle Epoque Poster Lithographs. A Brief History of the Poster: Belle Epoque to Post-Modernism | International Poster Gallery.

Design before computers ruled the universe. I'm old enough to remember working with X-Acto blades and poisonous chemicals to create layouts for printing. I'm old enough to remember printing. I'm young enough to be considered an expert at computer programs, so there's a bright side to this story, and the memories are the truth…except I'm too old to be sure they are the correct memories, but not old enough to accidentally wear my underwear outside of my pants. Where was I? Oh, yes…the short and weird history of the days before computers were in every design studio, home, car, video game, and smartphone. Fun fact: the computers in your cell phone and car are more powerful than the computers that were aboard Apollo 11.

For the young out there, Apollo 11 was the space flight that landed on the moon in 1969…and it wasn't fake…the moon is real! When I first went to art school, a required course was Paste-Up and Mechanicals. It started with a layout board. Practice made perfect! Some of the art supplies needed to be a designer.

Craft in an Age of Change.