
Technology


Unistellar's eVscope boosts citizen astronomy during COVID-19 lockdown.

How did Elizabeth Holmes do it? - Uncharted Waters.

Future work

Machine learning platforms comparison: Amazon, Azure, Google, IBM.

Data scientists who want to build machine learning models and put them into production have no shortage of available tools, but choosing the right one comes with some thorny decisions. The original article's chart breaks down some of the most popular machine learning platforms by their key features and price tags. Note that many open source tools are available for machine learning, as well as other vendor offerings, but we focused exclusively on vendor cloud platforms that span the entire machine learning lifecycle, from data ingestion to model development to production. The market for machine learning platforms is heating up, and all of the leading vendors are looking to nab their share.
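As a rough illustration of the ingestion-to-development-to-production span these platforms compete on, here is a minimal sketch using generic open-source tools (pandas, scikit-learn, joblib) rather than any vendor's actual platform API; the file names, the "label" column, and the logistic-regression model are assumptions made for the example only.

```python
# Minimal sketch of the lifecycle described above: ingest -> develop -> production.
# Library and file-name choices are illustrative, not any specific vendor's API.
import joblib
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# 1. Data ingestion: load a (hypothetical) CSV with feature columns and a "label" column.
df = pd.read_csv("training_data.csv")
X, y = df.drop(columns=["label"]), df["label"]

# 2. Model development: train a simple classifier and evaluate it on held-out data.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))

# 3. Production: persist the fitted model so a separate serving process can load and score with it.
joblib.dump(model, "model.joblib")
```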

Analyst firm Forrester expects this market to grow at a rate of 15% annually through 2021. Several vendors have beefed up their offerings in recent months and now offer simple, cloud-based platforms for getting started with machine learning and developing models that can quickly be put into production.

From the mundane to divine, some of the best-designed products of all time.

A well-designed product equally elevates form and function. It is pleasing to look at, easy to use and solves a common problem.

We reached out to five design professors and posed the following question: What’s the best-designed product of all time, and why? Their responses vary from cheap, everyday products to newer, more expensive ones. But all share a story of trial, error and ingenuity.

Cutting the glare - Catherine Anderson, The George Washington University

In the early 1920s, as Danish designer Poul Henningsen observed Copenhagen at night, he lamented the quality of light in people’s homes. Henningsen set out to create a new design that would mitigate this “dismal” effect; it would be “…constructed with the most difficult and noble task in mind: lighting in the home.”

“The aim is to beautify the home and those who lived there,” he wrote, “to make the evening restful and relaxing.” His approach was scientific. In 1924, the “PH lamp” was born. It’s delightful to look at.

Terminal waits - Craig M.

How the Soviets invented the internet and why it didn't work | Aeon Essays.

On the morning of 1 October 1970, the computer scientist Viktor Glushkov walked into the Kremlin to meet with the Politburo.

He was an alert man with piercing eyes rimmed in black glasses, with the kind of mind that, given one problem, would derive a method for solving all similar problems. And at that moment the Soviet Union had a serious problem. A year earlier, the United States launched ARPANET, the first packet-switching distributed computer network that would in time seed the internet as we know it. The distributed network was originally designed to nudge the US ahead of the Soviets, allowing scientists’ and government leaders’ computers to communicate even in the event of a nuclear attack.

It was the height of the tech race, and the Soviets needed to respond. Glushkov’s idea was to inaugurate an era of electronic socialism. He named the colossally ambitious project the All-State Automated System. The network was never built; the idea, however, survived.

Is Keck’s Law Coming to an End?

Since 1980, the number of bits per second that can be sent down an optical fiber has increased some 10 millionfold. That’s remarkable even by the standards of late-20th-century electronics. It’s more than the jump in the number of transistors on chips during that same period, as described by Moore’s Law.

There ought to be a law here, too. Call it Keck’s Law, in honor of Donald Keck. He’s the coinventor of low-loss optical fiber and has tracked the impressive growth in its capacity. Maybe giving the trend a name of its own will focus attention on one of the world’s most unsung industrial achievements. Moore’s Law may get all the attention, but now, as electronics faces enormous challenges to keep it alive, fiber optics is also struggling to sustain its momentum. The heart of today’s fiber-optic connections is the core: a 9-micrometer-wide strand of glass that’s almost perfectly transparent to 1.55-µm infrared light. One of the advances that kept capacity climbing was a change to the way signals are encoded.
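As a back-of-the-envelope check on the overall trend, the excerpt's 10-million-fold figure, taken from 1980 to roughly the article's era (2015 is assumed here as the end year), implies a capacity doubling time of about a year and a half; both the end year and the customary two-year Moore's Law doubling period are assumptions for this sketch.

```python
import math

# Rough check of the growth the excerpt describes. The 10-million-fold figure comes
# from the text; the 2015 end year and the ~2-year Moore's Law doubling period are
# assumptions used only for illustration.
years = 2015 - 1980          # roughly 35 years
growth = 10_000_000          # "some 10 millionfold" increase in bits per second

cagr = growth ** (1 / years) - 1
doubling_time = years * math.log(2) / math.log(growth)

print(f"implied annual growth: {cagr:.0%}")                # about 58% per year
print(f"implied doubling time: {doubling_time:.1f} years")  # about 1.5 years
```

A doubling roughly every year and a half over three and a half decades outpaces a two-year transistor doubling, which is consistent with the excerpt's claim that fiber's jump exceeds the growth in transistor counts over the same period.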

The Information Liar Paradox: A Problem for Floridi’s RSDI Definition.

Three meanings of “technology” | Rethinking Technology.

OMiLAB - OMiLAB Portal.

Technological singularity.

The technological singularity is the hypothesis that accelerating progress in technologies will cause a runaway effect wherein artificial intelligence will exceed human intellectual capacity and control, thus radically changing civilization in an event called the singularity.[1] Because the capabilities of such an intelligence may be impossible for a human to comprehend, the technological singularity is an occurrence beyond which events may become unpredictable, unfavorable, or even unfathomable.[2] The first use of the term "singularity" in this context was by mathematician John von Neumann.

Proponents of the singularity typically postulate an "intelligence explosion",[5][6] in which superintelligences design successive generations of increasingly powerful minds, a process that might occur very quickly and might not stop until the agent's cognitive abilities greatly surpass those of any human.

A year of tech industry hype in a single graph.

Tech industry trends follow a fairly predictable pattern: there's a rush of hype, an inevitable backlash, and then a long, tired slog towards a product that actually works. It eventually produces incredible things like the internal combustion engine or my Droid 4, but it can be hard to tell exactly where a given technology is on the slow journey from bullshit to reality. Luckily, the analysts at Gartner have given us a kind of roadmap, plotting out the trends of 2014 on an immutable line they call the Hype Cycle.

Gartner has been making these for 20 years, and there's always a lot of guesswork involved, but this one's particularly useful as a snapshot of the present moment. Speech recognition is just starting to be useful (hello, Siri), and virtual reality decks like the Oculus Rift are getting there. People have stopped saying "big data" as much, but "internet of things" is still ascendant, and god help us once "neurobusiness" gets going.