
Ada Lovelace: Founder of Scientific Computing

Born: London, England, December 10, 1815. Died: London, England, November 27, 1852.

Ada Byron was the daughter of a brief marriage between the Romantic poet Lord Byron and Anne Isabella Milbanke, who separated from Byron just a month after Ada was born. Four months later, Byron left England forever. Lady Byron wished her daughter to be unlike her poetical father, and she saw to it that Ada received tutoring in mathematics and music, as disciplines to counter dangerous poetic tendencies. Lady Byron and Ada moved in elite London society, one in which gentlemen not members of the clergy or occupied with politics or the affairs of a regiment were quite likely to spend their time and fortunes pursuing botany, geology, or astronomy. One of the gentlemanly scientists of the era was to become Ada's lifelong friend. In 1835, Ada married William King, ten years her senior, and when King inherited a noble title in 1838, they became the Earl and Countess of Lovelace.

The History of the ENIAC Computer. Updated December 16, 2014. "...With the advent of everyday use of elaborate calculations, speed has become paramount to such a high degree that there is no machine on the market today capable of satisfying the full demand of modern computational methods." - from the ENIAC patent (U.S. #3,120,606), filed on June 26, 1947. In 1946, John Mauchly and John Presper Eckert developed the ENIAC I (Electronic Numerical Integrator And Computer). The Ballistics Research Laboratory (BRL), the branch of the military responsible for calculating artillery firing tables, heard about John Mauchly's research at the University of Pennsylvania's Moore School of Electrical Engineering. John Mauchly had previously created several calculating machines, some with small electric motors inside.
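The "elaborate calculations" behind a firing table are stepwise numerical integrations of projectile motion. A minimal sketch of that kind of computation (illustrative only, not ENIAC's actual program; the muzzle velocity and drag constant here are made-up values):

```python
import math

def trajectory_range(v0, angle_deg, drag=0.0001, dt=0.01, g=9.81):
    """Integrate projectile motion with simple air drag; return range in metres."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        vx -= drag * speed * vx * dt          # drag opposes horizontal motion
        vy -= (g + drag * speed * vy) * dt    # gravity plus drag vertically
        x += vx * dt
        y += vy * dt
    return x

# One column of a hypothetical firing table: range at several elevations.
for angle in (15, 30, 45, 60):
    print(f"{angle:2d} deg -> {trajectory_range(450.0, angle):8.0f} m")
```

A human "computer" produced each such row by hand in many hours; ENIAC's point was doing thousands of these integration steps electronically.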

Jean Sammet | National Center for Women & Information Technology. Background: Jean E. Sammet is a retired computer scientist and programmer who is best known for her work on FORMAC, the first widely used general language and system for manipulating nonnumeric algebraic expressions. Sammet supervised the first scientific programming group for Sperry Gyroscope Co. (1955-1958). She joined IBM in 1961 to organize and manage the Boston Programming Center. During the 1970s and 1980s, she worked for IBM's Federal Systems Division in various positions, emphasizing programming language issues including Ada. Sammet is the author of "Programming Languages: History and Fundamentals," which became a standard book on its topic and was called an "instant computer classic" when published in 1969. She was very active in ACM and held many positions, including President, Vice-President, Editor-in-Chief of Computing Reviews, and General and/or Program Chair for the first two SIGPLAN History of Programming Languages Conferences (HOPL) in 1978 and 1993.
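"Nonnumeric algebraic manipulation" means operating on expressions as symbols rather than evaluating them to numbers. A toy illustration of the idea (this is not FORMAC syntax): polynomials in x held as coefficient lists and multiplied symbolically.

```python
def poly_mul(p, q):
    """Multiply two polynomials given as coefficient lists (lowest power first)."""
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b   # distribute term-by-term, collect like powers
    return out

def poly_str(p):
    """Render a coefficient list as a readable expression in x."""
    terms = [f"{c}*x^{i}" if i else str(c) for i, c in enumerate(p) if c]
    return " + ".join(terms) or "0"

x_plus_1 = [1, 1]                      # represents 1 + x
square = poly_mul(x_plus_1, x_plus_1)  # (1 + x)^2, expanded symbolically
print(poly_str(square))                # -> 1 + 2*x^1 + 1*x^2
```

The result is another expression, not a number — the distinction that made FORMAC "nonnumeric" in an era of purely numerical codes.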

Socialism: Utopian and Scientific (Chapter 1). Frederick Engels. I [The Development of Utopian Socialism]. Modern Socialism is, in its essence, the direct product of the recognition, on the one hand, of the class antagonisms existing in the society of today between proprietors and non-proprietors, between capitalists and wage-workers; on the other hand, of the anarchy existing in production. But, in its theoretical form, modern Socialism originally appears ostensibly as a more logical extension of the principles laid down by the great French philosophers of the 18th century. Like every new theory, modern Socialism had, at first, to connect itself with the intellectual stock-in-trade ready to its hand, however deeply its roots lay in material economic facts. The great men who in France prepared men's minds for the coming revolution were themselves extreme revolutionists. One thing is common to all three. This historical situation also dominated the founders of Socialism.

Rediscovering Utopia | Betterhumans > Column. "Without a vision the people perish." - Proverbs 29:18. The month in which Islamic terrorists inspired by a utopian vision of a pan-Islamic Sultanate blew up 50 Turks and Britons in Istanbul might seem a strange one in which to argue for the importance of the utopian dimension in politics. But decidedly pragmatic and nonutopian militants are also killing Iraqis and Americans in Baghdad as part of a well-financed resistance to American "liberation." More often, from medieval peasant revolts to Martin Luther King, utopian visions of a freer, more equal and more united future have helped people mobilize against the crushing pragmatic acceptance of day-to-day tyranny and exploitation. Nonetheless, modern conservatives argue that all utopianism leads inexorably to totalitarianism and death camps, since utopianism equals Communism, and democratic capitalism was supposedly just the victory of common sense. But is utopianism really so bad?

Transhumanist visions. Similarly, eco activist J.P.

Renaissance Now? - Rushkoff. I first posted the embryo of this idea on a BBS called the Well back in 1991 or so. I was wondering, at the time, whether recent advances in math, physics, technology and culture constituted a new renaissance. The conversation went on for over a year, and became the basis - or at least an adjunct - for my book, Cyberia. I still find myself coming back to this notion of renaissance, whether I'm speaking about open source culture or religion. The birth of the Internet era was considered by many a revolution. I prefer to think of the proliferation of interactive media as an opportunity for renaissance: a moment when we have the opportunity to step out of the story, altogether. Take a look back at what we think of as the original Renaissance - the one we were taught in school. Likewise, calculus - another key Renaissance-era invention - is a mathematical system that allows us to derive one dimension from another. The great Renaissance was a simple leap in perspective.
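The remark about calculus "deriving one dimension from another" can be made concrete with the standard textbook example: each successive quantity is obtained from the previous one by differentiation with respect to time.

```latex
v(t) = \frac{dx}{dt}, \qquad a(t) = \frac{dv}{dt} = \frac{d^{2}x}{dt^{2}}
```

Position yields velocity, velocity yields acceleration: one dimension of description derived from another, which is the leap in perspective the passage alludes to.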

IBM - Archives - Women in technology. "If the bringing of women - half the human race - into the center of historical inquiry poses a formidable challenge to historical scholarship, it also offers sustaining energy and a source of strength." - Gerda Lerner, 1982. At IBM, women have been making contributions to the advancement of information technology for almost as long as the company has been in existence. Where many companies proudly date their affirmative action programs to the 1970s, IBM has been creating meaningful roles for female employees since the 1930s. This tradition was not the result of a happy accident. Instead, it was a deliberate and calculated initiative on the part of Thomas J. Soon IBM had so many women professionals in its ranks that the company formed a Women's Education Division. The tens of thousands of women who have been IBM employees since the 1930s have built upon that foundation, and women now comprise more than 30 percent of the total U.S. employee population. Plugboards and petaflops

Technology Timeline. History timelines provide fast facts and information about famous events, and the events detailed in the Technology Timeline precipitated significant change in world history. The timeline arranges these events in chronological order, giving the actual sequence of incidents - many of which occurred during times of crisis, evolution or change, and led to great changes in the development of world civilisation - and highlights the key dates and their historical significance in a fast-information format.

Women in Technology by Jennie Wood. A quick look at the history of technology reveals that women have been involved from the beginning. Starting with Ada Lovelace, the world's first computer programmer, women have been innovators in technology for years. Here's a closer look at some of the most important women in the history and evolution of technology.

Ada Lovelace, Mathematician, Writer, Computer Programmer. Born: December 10, 1815. Died: November 27, 1852. Born Augusta Ada Byron on December 10, 1815, the only legitimate child of poet Lord Byron and Anne Isabella Byron, Ada Lovelace was a writer and mathematician. Lord Byron separated from his wife and left England when Ada was just four months old.

Grace Hopper, Computer Scientist. Along with rising to the rank of Rear Admiral in the United States Navy, Grace Hopper was a pioneering computer scientist. The oldest of three children, Hopper was born in New York City.

Hedy Lamarr. Hedy Lamarr was an internationally known Austrian-American actor.

ARPAnet - The First Internet. By Mary Bellis. "The Internet may fairly be regarded as a never-ending worldwide conversation." - from a federal court opinion considering First Amendment rights for Internet users. On a Cold War kind of day, in swinging 1969, work began on the ARPAnet, grandfather to the Internet. One opposing view of ARPAnet's origins comes from Charles M. The first data exchange over this new network occurred between computers at UCLA and the Stanford Research Institute. Four computers were the first connected in the original ARPAnet. To send a message on the network, a computer breaks its data into IP (Internet Protocol) packets, like individually addressed digital envelopes. As non-military uses for the network increased, more and more people had access, and it was no longer safe for military purposes. In 1986, one LAN branched out to form a new competing network, called NSFnet (the National Science Foundation Network). "The Internet's pace of adoption eclipses all other technologies that preceded it."
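The "addressed digital envelopes" idea can be sketched in a few lines: a message is split into fixed-size chunks, each wrapped with a destination address and a sequence number so the pieces can travel independently and be reassembled in order on arrival. This is illustrative only - real IP headers carry many more fields, and the names here are made up.

```python
def packetize(message: bytes, dest: str, size: int = 8):
    """Split a message into numbered packets addressed to dest."""
    return [
        {"dest": dest, "seq": i, "payload": message[i * size:(i + 1) * size]}
        for i in range((len(message) + size - 1) // size)
    ]

def reassemble(packets):
    """Restore the original message regardless of arrival order."""
    return b"".join(p["payload"] for p in sorted(packets, key=lambda p: p["seq"]))

pkts = packetize(b"a never-ending worldwide conversation", "10.0.0.2")
assert reassemble(reversed(pkts)) == b"a never-ending worldwide conversation"
```

The reassembly step sorting by sequence number is why out-of-order delivery, the normal case on a packet-switched network, is harmless to the receiver.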
