
Supercomputer


How Amazon Followed Google Into the World of Secret Servers

Chris Pinkham once oversaw the hardware and software that ran Amazon.com and went on to lead the team that built the company’s seminal EC2 internet service.

Photo: Ariel Zambelich/Wired

Chris Pinkham was walking through a data center that would one day house Amazon’s seminal cloud computing service — the Elastic Compute Cloud — when he came face to face with a cage of Google machines. This was a decade ago, when Pinkham oversaw the hardware and software that ran Amazon, and the company was considering a spot in the data center, which housed machines for many web operations and other businesses. Google would later pull a curtain around its data-center hardware, moving much of it into private facilities, but in those days it was easier for competitors like Pinkham to lay eyes on Google machines.

Pinkham was struck by how different the machines looked — and how hot they were. “They were clearly not your average Dell, HP, IBM servers.” The trend also extends beyond servers.

U.S. Lab's "Titan" Named World's Fastest Supercomputer

In a breakthrough that harnesses video-game technology for solving science's most complex mysteries, the U.S. government's new Titan machine was named the world's fastest supercomputer. Deployed just two weeks ago, Titan is the fastest, most powerful, and most energy-efficient of a new generation of supercomputers that breach the bounds of "central processing unit" computing. The announcement, made November 12 in Salt Lake City, marks a return to the top of the closely watched, semiannual TOP500 list for the U.S.

Titan supercomputer debuts for open scientific research

Forecasting for weather like this week's "Frankenstorm" may become a lot more accurate with the help of the Department of Energy's Titan supercomputer, a system that launched this month for open scientific research. The computer, an update to the Jaguar system, is operated in Tennessee by Oak Ridge National Laboratory, part of the DOE's network of research labs. Researchers from academia, government labs, and various industries will be able to use Titan -- believed to be one of the two most powerful machines in the world -- to research things such as climate change and alternative fuels. "Why care about these big computers?"

2012: The year storage becomes a celebrity

News Analysis, January 30, 2012 (Computerworld): While data storage has always been a necessary building block for technology, it's rarely garnered as much attention as it has in the past two years. The reason: corporate and retail consumers are being forced to store greater amounts of data, and they need to make that data more useful -- and accessible. "Storage is going to become something everyone wants to know about," said Steve Wojtowecz, vice president of storage software development at IBM.

Data Centers Using Less Power Than Forecast, Report Says

The report, by Jonathan G. Koomey, a consulting professor in the civil and environmental engineering department at Stanford University, found that the actual number of computer servers declined significantly compared with 2010 forecasts because of lowered demand for computing, the financial crisis of 2008, and the emergence of technologies such as more efficient computer chips and server virtualization, which allows fewer servers to run more programs. The slowing of growth in consumption contradicts a 2007 forecast by the Environmental Protection Agency that the explosive expansion of the Internet and the computerization of society would lead to a doubling of power consumed by data centers from 2005 to 2010. In the new study, prepared at the request of The New York Times, Mr. Koomey found that electricity used by data centers worldwide grew significantly, but the increase was only about 56 percent from 2005 to 2010.
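The gap between Koomey's finding and the EPA forecast is easier to see in annual terms. Converting both five-year totals (56 percent actual growth versus the forecast doubling) into implied annual growth rates is simple arithmetic:

```python
# Convert five-year growth totals into implied annual growth rates.
years = 5

actual_total = 1.56        # 56% total growth, 2005-2010 (Koomey's estimate)
forecast_total = 2.00      # doubling over the same period (EPA's 2007 forecast)

actual_annual = actual_total ** (1 / years) - 1      # implied annual rate, actual
forecast_annual = forecast_total ** (1 / years) - 1  # implied annual rate, forecast

print(f"actual:   {actual_annual:.1%} per year")
print(f"forecast: {forecast_annual:.1%} per year")
```

With these inputs, actual growth works out to roughly 9% per year against a forecast of roughly 15% per year, which is why the report describes the slowdown as significant.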

Efficient computing · Google Data Centers

Providing our users with fast, innovative products requires significant computing power. Data centers – which are large facilities containing lots of computers – account for most of Google’s energy needs.

Report: Google Uses About 900,000 Servers

Photo: A Google admin works on a server inside a container in one of Google's early data centers. (Source: Google)

Have Google watchers been overestimating the number of servers in the company’s data center network? Recent guesstimates have placed Google’s server count at more than 1 million, but new data on Google’s energy use suggests that the company is probably running about 900,000 servers. Google never says how many servers are running in its data centers.
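Estimates like the 900,000 figure come from a back-of-envelope method: take an estimate of a company's total data-center power draw and divide by the average power consumed per server. A sketch of that method with purely illustrative numbers (neither value below is from the report):

```python
# Back-of-envelope server-count estimate of the kind behind headline
# figures like "about 900,000 servers". Both inputs are illustrative
# assumptions, not values from the article or from Google.

AVG_SERVER_DRAW_W = 245.0   # assumed average draw per server, including overhead
TOTAL_LOAD_MW = 220.0       # assumed aggregate power across all data centers

servers = (TOTAL_LOAD_MW * 1e6) / AVG_SERVER_DRAW_W
print(f"Estimated servers: {servers:,.0f}")
```

The point of the sketch is the sensitivity, not the answer: shifting the assumed per-server draw by a few tens of watts moves the estimate by hundreds of thousands of servers, which is why published guesses range so widely.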

Google Pours “Incredible” Computing Power into Antibody Drug Discovery With Adimab

Luke Timmerman, 2/3/10: Google is the undisputed king of Internet search and advertising, but its second act as a company might be to invent a new computing model for efficiently discovering targeted antibody drugs. “Google is committing incredible resources to it.”

New program to harness Google’s massive computing power

Today, Google announced Google Exacycle for Visiting Faculty, a new academic grant program that will provide 1 billion hours of computational core capacity to a small group of qualified researchers. These researchers are tackling a variety of problems that require massive amounts of computational power to advance their disciplines. In the future, we think that Google Exacycle could also help companies create new business opportunities in a variety of industries, including human genome sequencing in biotech, Monte Carlo simulations in financial services, and complex rendering and CGI in entertainment, as well as address other challenging issues in energy, agriculture, and manufacturing.
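The Monte Carlo workloads the announcement mentions are a good fit for billions of donated core-hours because they are embarrassingly parallel: each core runs independent random trials, and only small summaries need to come back to be merged. A minimal sketch of that pattern (estimating pi by throwing random darts; the batch structure is illustrative and says nothing about Exacycle's actual interface, which the announcement does not describe):

```python
import random

def trials(n, seed):
    """Run n independent dart throws in the unit square;
    return how many land inside the quarter circle."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return hits

# Each "worker" (on a grid, a separate core; here, a loop iteration)
# runs its own batch with its own seed; only hit counts are merged.
batches = [trials(100_000, seed) for seed in range(10)]
pi_estimate = 4 * sum(batches) / 1_000_000
print(pi_estimate)
```

Because no batch depends on any other, the same code scales from one laptop to a grant of a billion core-hours by handing each core a different seed.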

Amazon Architecture

You'll learn about how Amazon organizes their teams around services, the CAP theorem of building scalable systems, how they deploy software, and a lot more. Many new additions from the ACM Queue article have also been included. Amazon grew from a tiny online bookstore to one of the largest stores on earth. They did it while pioneering new and interesting ways to rate, review, and recommend products. Greg Linden shared is version of Amazon's birth pangs in a series of blog articles. What Happens When Apps Go On Sale?: Revenue Up 22% On iPhone, 29% On Android. In a new research report from Distimo, the app store analytics provider examined two different ways that allow mobile developers to get a bump in both their download numbers and revenue. One way, which is within the developers’ control, is putting the app on sale. Within the first day, iPhone developers see an average increase of 41% in revenue using this method, and see revenue increases of 22% by the sale’s end.

Android apps, however, rose just 7% on day one but closed out the sale with higher percentage gains than either iPhone or iPad. The second method Distimo looked into is getting the app featured in the app store, which is up to the store’s operator, such as Google or Apple. When a developer decides to put an application on sale, there is a delicate balance to strike. To examine what happens when apps go on sale, Distimo looked at the 100 top-grossing apps in the iPhone App Store, iPad App Store, and Android Market.

$5K PC Takes On $4.6M Supercomputer

Antwerp (Belgium): Recent advances in general-purpose GPU computing are beginning to shift perceptions in supercomputing applications.

Public Data Sets on Amazon Web Services (AWS)

Here are some examples of popular Public Data Sets:

NASA NEX: A collection of Earth science data sets maintained by NASA, including climate change projections and satellite images of the Earth's surface
Common Crawl Corpus: A corpus of web crawl data composed of over 5 billion web pages
1000 Genomes Project: A detailed map of human genetic variation
Google Books Ngrams: A data set containing Google Books n-gram corpuses
US Census Data: US demographic data from the 1980, 1990, and 2000 US Censuses
Freebase Data Dump: A data dump of all the current facts and assertions in the Freebase system, an open database covering millions of topics

Supercomputer Predicts Civil Unrest

In Isaac Asimov's "Foundation" series, the future of masses of people can be predicted with "psychohistory," a method of forecasting future political and social trends, using a device called the "Prime Radiant." In the 1950s, there wasn't the math or the computational power available to make such a thing a reality. Now there might be.

Prof Promises Supercomputer on Every Desktop

Virginia Tech researcher Wu Feng hopes his work on the HokieSpeed supercomputer will help make supercomputing more accessible. (Photo: Virginia Tech) When Wu Feng looks at an iPad, he sees something more than a great way to play Fruit Ninja. To him, Apple’s sleek device looks more like a compute node on a supercomputer of the future: 1.5 gigaflops of computing power just waiting to be harnessed.

Charity Engine: The Ethical Supercomputer That Can Win You $10,000

SETI@home, a massive distributed computing effort hunting for alien life, is beloved by space geeks and Jodie Foster fans everywhere. But distributed computing -- grabbing CPU time from individual computers to generate supercomputer-like abilities -- isn't limited to quests for aliens. The technology is used to assemble proteins (Rosetta@home), detect earthquakes, and more. The latest distributed computing project, dubbed Charity Engine, uses surplus PC time for all sorts of projects.

In the process, it raises cash for charity and offers participants the chance to win cold, hard cash. The idea for Charity Engine didn't come from a charity, or even from a large university seeking extra computing power. Anyone who wants to help can download the Charity Engine app, which runs in the background on your computer. Charity Engine's biggest future client is Wolfram Research, maker of Mathematica, but the system could be used for everything from climate modeling to molecular modeling.
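The model behind projects like SETI@home, Rosetta@home, and Charity Engine can be sketched in a few lines: a server splits a job into independent work units, each volunteer machine processes one in the background, and the server merges whatever comes back. A toy illustration of that flow (the function names and the "science" are invented for the example; this is not Charity Engine's actual protocol):

```python
def split_into_work_units(data, size):
    """Server side: cut one big job into independent chunks."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def volunteer_compute(unit):
    """Client side: runs in the background on a volunteer's PC.
    Here the 'science' is just summing squares."""
    return sum(x * x for x in unit)

def merge(results):
    """Server side: combine returned results; order doesn't matter,
    so slow or intermittent volunteers are fine."""
    return sum(results)

job = list(range(1, 101))                        # the full computation
units = split_into_work_units(job, 10)           # 10 independent work units
results = [volunteer_compute(u) for u in units]  # each runs on its own machine
print(merge(results))                            # same answer as computing it in one place
```

Because the units share no state, the scheduler can hand the same unit to several volunteers for redundancy, discard stragglers, and still merge a correct result, which is what makes surplus PC time usable at all.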