The Internet, peer-reviewed.
It could be one of the most important innovations on the Internet since the browser. Imagine an open-source, crowd-sourced, community-moderated, distributed platform for sentence-level annotation of the Web. In other words, a way to cut through the babble and restore some sanity and trust.

False beliefs persist, even after instant online corrections.
It seems like a great idea: provide instant corrections to web surfers when they run across obviously false information on the Internet. But a new study suggests that this type of tool may not be a panacea for dispelling inaccurate beliefs, particularly among people who already want to believe the falsehood. “Real-time corrections do have some positive effect, but it is mostly with people who were predisposed to reject the false claim anyway,” said R. Kelly Garrett, lead author of the study and assistant professor of communication at Ohio State University. “The problem with trying to correct false information is that some people want to believe it, and simply telling them it is false won’t convince them.”

Factual’s Gil Elbaz Wants to Gather the Data Universe.
Factual sells data to corporations and independent software developers on a sliding scale, based on how much the information is used. Small data feeds for things like prototypes are free; contracts with its biggest customers run into the millions. Sometimes, Factual trades data with other companies, building its resources.

Snopes.com: Urban Legends Reference Pages.

5D optical memory in glass could record the last evidence of civilization.
Using nanostructured glass, scientists at the University of Southampton have, for the first time, experimentally demonstrated the recording and retrieval of five-dimensional digital data by femtosecond laser writing.
The storage allows unprecedented parameters, including 360 TB/disc data capacity, thermal stability up to 1,000°C, and a practically unlimited lifetime. Dubbed the ‘Superman’ memory crystal, after the “memory crystals” used in the Superman films, the medium records data via self-assembled nanostructures created in fused quartz, which can store vast quantities of data for over a million years. The information is encoded in five dimensions: the size and orientation of the nanostructures in addition to their three-dimensional position.
Scientists ‘freeze’ light for an entire minute.
So, as we have learned from Fantastic Voyage, our brain’s thoughts are nothing more than little electrical signals traveling up and down our neurons. So, if we could freeze that light, we could freeze our thoughts. Then we could store them in a crystal for backup. I’m sure this will be very useful technology. However, I’m sure that, just like now, people won’t be bothered to back up their brains.

Million-Year Data Storage Disk Unveiled.
Back in 1956, IBM introduced the world’s first commercial computer capable of storing data on a magnetic disk drive.
The IBM 305 RAMAC used fifty 24-inch discs to store up to 5 MB, an impressive feat in those days. Today, however, it’s not difficult to find hard drives that can store 1 TB of data on a single 3.5-inch disk. But despite this huge increase in storage density and a similarly impressive improvement in power efficiency, one thing hasn’t changed: the lifetime over which data can be stored on magnetic discs is still about a decade.

How Quantum Computers and Machine Learning Will Revolutionize Big Data - Wired Science.
When subatomic particles smash together at the Large Hadron Collider in Switzerland, they create showers of new particles whose signatures are recorded by four detectors. The LHC captures 5 trillion bits of data, more information than all of the world’s libraries combined, every second. After the judicious application of filtering algorithms, more than 99 percent of those data are discarded, but the four experiments still produce a whopping 25 petabytes (25×10^15 bytes) of data per year that must be stored and analyzed. That is a scale far beyond the computing resources of any single facility, so the LHC scientists rely on a vast computing grid of 160 data centers around the world, a distributed network capable of transferring as much as 10 gigabytes per second at peak performance.

Curation.
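The quoted figures are worth a back-of-envelope check. The sketch below simply plugs in the article’s numbers (not official CERN accounting) to show why aggressive filtering is unavoidable and what 25 PB/year means for the grid.

```python
# Back-of-envelope check of the LHC figures quoted above; all inputs are
# the article's numbers, not official CERN accounting.

CAPTURE_BITS_PER_SEC = 5e12      # "5 trillion bits ... every second"
DISCARD_FRACTION = 0.99          # "more than 99 percent ... discarded"
STORED_PER_YEAR_BYTES = 25e15    # "25 petabytes ... per year"
GRID_PEAK_BYTES_PER_SEC = 10e9   # "10 gigabytes per second at peak"

SECONDS_PER_YEAR = 365 * 24 * 3600

# Raw captured volume if the detectors recorded continuously for a year:
raw_bytes_per_year = CAPTURE_BITS_PER_SEC / 8 * SECONDS_PER_YEAR
print(f"raw capture:  {raw_bytes_per_year / 1e15:,.0f} PB/year")

# Even keeping only 1% of that would far exceed 25 PB, so the machine does
# not record continuously; beam schedules and triggers cut it much further.
kept_bytes_per_year = raw_bytes_per_year * (1 - DISCARD_FRACTION)
print(f"1% of raw:    {kept_bytes_per_year / 1e15:,.0f} PB/year")

# Time for the grid to move one year's stored output at peak rate:
days_to_transfer = STORED_PER_YEAR_BYTES / GRID_PEAK_BYTES_PER_SEC / 86400
print(f"25 PB at 10 GB/s: about {days_to_transfer:.0f} days")
```

At peak rate the grid could ship a year’s stored output in roughly a month of continuous transfer, which is why the work is spread across 160 data centers rather than funneled through any single site.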
The Mathematical Shape of Big Science Data.
Simon DeDeo, a research fellow in applied mathematics and complex systems at the Santa Fe Institute, had a problem. He was collaborating on a new project analyzing 300 years’ worth of data from the archives of London’s Old Bailey, the central criminal court of England and Wales. Granted, there was clean data in the usual straightforward Excel spreadsheet format, including such variables as indictment, verdict, and sentence for each case. But there were also full court transcripts, containing some 10 million words recorded during just under 200,000 trials.
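The two kinds of data call for two kinds of queries. A minimal sketch, with illustrative field names and tiny made-up sample records rather than real Old Bailey data:

```python
# A minimal sketch of combining the archive's two kinds of data: clean
# per-case variables (spreadsheet-style) and free-text transcripts. The
# field names and sample records are illustrative assumptions.

import re
from collections import Counter

cases = [
    {"id": "t17740413-1", "indictment": "theft", "verdict": "guilty",
     "sentence": "transportation",
     "transcript": "The prisoner was indicted for stealing a silver watch."},
    {"id": "t17740413-2", "indictment": "assault", "verdict": "not guilty",
     "sentence": "",
     "transcript": "The prosecutor deposed that the prisoner struck him."},
]

def tokens(text):
    """Lowercase word tokens; a crude stand-in for real text processing."""
    return re.findall(r"[a-z]+", text.lower())

# Structured query: counts by verdict, exactly what a spreadsheet gives you.
verdicts = Counter(c["verdict"] for c in cases)

# Unstructured query: word frequencies across transcripts, the kind of
# signal the spreadsheet format cannot capture.
vocab = Counter(w for c in cases for w in tokens(c["transcript"]))

print(verdicts.most_common())
print(vocab.most_common(3))
```

The structured half fits in a table; the transcript half is where the “how do you analyze that” problem lives, since 10 million words of testimony have no predefined columns.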
“How the hell do you analyze that data?” Today’s big data is noisy, unstructured, and dynamic.

Data Scientist: The Sexiest Job of the 21st Century.
Artwork: Tamar Cohen, Andrew J Buboltz, 2011, silk screen on a page from a high school yearbook, 8.5" x 12"
When Jonathan Goldman arrived for work in June 2006 at LinkedIn, the business networking site, the place still felt like a start-up.

The Question to Ask Before Hiring a Data Scientist - Michael Li.
By Michael Li | 10:00 AM August 6, 2014
When hiring data scientists, there’s nothing more frustrating than making the wrong hire.
Data scientists are in notoriously high demand, hard to attract, and command large salaries, compounding the cost of a mistake. At The Data Incubator, we’ve talked to dozens of employers looking to hire data scientists from our training program, from large corporations like Pfizer and JPMorgan Chase to smaller tech startups like Foursquare and Upstart. Employers that didn’t have good hiring experiences in the past often failed to ask a key question.

For Start-Ups, Sorting the Data Cloud Is the Next Big Thing.
“My smartphone produces a huge amount of data, my car produces ridiculous amounts of really valuable data, my house is throwing off data, everything is making data,” said Erik Swan, 47, co-founder of Splunk, a San Francisco-based start-up whose software indexes vast quantities of machine-generated data into searchable links. Companies search those links, as one searches Google, to analyze customer behavior in real time. Splunk is among a crop of enterprise software start-ups that analyze big data and are establishing themselves in territory long controlled by giant business-technology vendors like Oracle and I.B.M.

How Big Data Gets Real.
The business of Big Data, which involves collecting large amounts of data and then searching it for patterns and new revelations, is the result of cheap storage, abundant sensors and new software. It has become a multibillion-dollar industry in less than a decade. Growing at that speed, it is easy to miss how much remains to be done before the industry has proven standards. Until then, lots of customers are probably wasting much of their money.
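“Indexing machine data into searchable links” boils down to an inverted index: a map from each term to the log lines containing it. The toy sketch below illustrates the general technique, not Splunk’s actual engine; the log lines are made up.

```python
# Toy inverted index over machine-generated log lines: term -> line numbers.
# An illustration of the general technique, not Splunk's actual engine.

from collections import defaultdict

logs = [
    "2014-03-01T10:00:01 host=web1 status=500 path=/checkout",
    "2014-03-01T10:00:02 host=web2 status=200 path=/home",
    "2014-03-01T10:00:03 host=web1 status=200 path=/checkout",
]

index = defaultdict(set)
for lineno, line in enumerate(logs):
    for term in line.split():
        index[term].add(lineno)

def search(*terms):
    """Return log lines containing every query term (AND semantics)."""
    hits = set.intersection(*(index.get(t, set()) for t in terms))
    return [logs[i] for i in sorted(hits)]

print(search("host=web1", "status=200"))
```

Because lookups intersect precomputed sets instead of scanning raw logs, queries stay fast as the data grows, which is what makes Google-style search over machine data feasible.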
Why the world’s governments are interested in creating hubs for open data.
Amid the tech giants and eager startups that have camped out in East London’s trendy Shoreditch neighborhood, the Open Data Institute is the rare nonprofit on the block that talks about feel-good sorts of things like “triple bottom line” and “social and environmental value.” In fact, I first met ODI’s CEO Gavin Starks because he used to run AMEE, a startup that builds software for environmental data, and he was one of our first speakers at GigaOM’s early green conferences. But ODI, which officially launched last October with funding from the U.K. government, is a private company, and philanthropy isn’t its dominant aim. ODI helps companies, entrepreneurs and governments find value in the explosion of open data, and it seems to be starting to gain commercial traction.
ODI CEO Gavin Starks, in front of art in the ODI offices in Shoreditch.

The Limits of Big Data: A Review of Social Physics by Alex Pentland.
In 1969, Playboy published a long, freewheeling interview with Marshall McLuhan in which the media theorist and sixties icon sketched a portrait of the future that was at once seductive and repellent. Noting the ability of digital computers to analyze data and communicate messages, he predicted that the machines eventually would be deployed to fine-tune society’s workings.
“The computer can be used to direct a network of global thermostats to pattern life in ways that will optimize human awareness,” he said.

Big Data, Trying to Build Better Workers.

Tears in rain: how Snapchat showed me the glory of data death.
“I’ve seen things you people wouldn’t believe. Attack ships on fire off the shoulder of Orion. I watched c-beams glitter in the dark near the Tannhäuser Gate. All those moments will be lost in time, like tears in rain.”

IBM’s Watson wants to fix America’s doctor shortage.
Words by the Millions, Sorted by Software.
Down in the Data Dumps: Researchers Inventory a World of Information.
How Companies Learn Your Secrets.
What are you revealing online? Much more than you think.
What data is being collected on you? Some shocking info.
Everything We Know About What Data Brokers Know About You.
How Facebook Uses Your Data to Target Ads, Even Offline.