See how computing power has increased just about a quadrillion-fold over six decades
Posted 11.02.2011 at 4:29 pm
Charles Xavier Thomas de Colmar invented the first commercially successful mechanical calculator in 1820. It took more than a century before mechanical calculators gave way, in the 1930s, to electromechanical calculators, which in turn quickly gave way to the first general-purpose electronic computer, ENIAC, in 1946. By 1965, Gordon Moore was predicting that engineers would be able to double the number of components on a microchip every two years (and in 1968, he co-founded Intel to help them do so).
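For a feel for how fast that doubling compounds, here's a quick back-of-the-envelope sketch in Python (the two-year period is taken straight from the prediction above; the resulting factors are illustrative arithmetic, not historical transistor counts):

```python
# Back-of-the-envelope Moore's law: components double every two years.
# Starting counts and spans are illustrative, not historical data.

def moores_law_factor(years, doubling_period=2):
    """Growth factor after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

for decades in (1, 2, 3, 4):
    years = decades * 10
    print(f"{years} years -> roughly {moores_law_factor(years):,.0f}x more components")
```

Ten years of doubling buys a factor of 32; forty years buys a factor of about a million.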
Trevor Prideaux was having trouble texting. Prideaux, who was born without his left forearm, used to have to balance his smartphone on his prosthetic arm or lay it on a flat surface to text, dial, or otherwise take advantage of the technology. So with some help from the Exeter Mobility Center in Devon, UK, the 50-year-old Prideaux has become the first person to have a smartphone dock embedded in his prosthetic limb.
This morning the news came over the internet: Dennis Ritchie has died.
Dr. Ritchie doesn't have the adoring mainstream following of Steve Jobs, but he can take considerably more credit for the creation, and even the aesthetics, of the computer world we live in. It's almost impossible to find a personal computing product or paradigm that doesn't owe a direct debt to Ritchie.
Typing in Braille is tricky, requiring clunky and expensive dedicated devices--some costing as much as $6,000--with limited functionality beyond their primary purpose. But a team of researchers at Stanford, including an undergrad on loan from New Mexico State University, has created a touchscreen interface that brings the ability to write in Braille to tablet PCs.
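For the unfamiliar: every Braille character is a chord of up to six dots, arranged in two columns of three, so a touchscreen Braille keyboard ultimately has to map the set of fingertips pressed at once to a character. Here's a toy sketch of that lookup in Python, using the standard Grade 1 dot patterns for the first few letters--an illustration of the encoding, not the Stanford team's actual code:

```python
# Decode a six-dot Braille chord into a letter.
# Dots are numbered 1-3 down the left column, 4-6 down the right.
# These are the standard Grade 1 patterns for a-e; the table is a toy
# illustration, not the Stanford team's implementation.

BRAILLE = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
}

def decode_chord(pressed_dots):
    """Map a set of simultaneously pressed dot positions to a character."""
    return BRAILLE.get(frozenset(pressed_dots), "?")

print(decode_chord({1, 4}))     # -> "c"
print(decode_chord({1, 4, 5}))  # -> "d"
```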
If you want to be a part of discovering the future of solar power, you can be. You don't need any special knowledge or equipment; just let Alán Aspuru-Guzik borrow your computer when you're not using it.
Researchers on two continents are reporting two big breakthroughs in quantum computing today — a quantum system built on the familiar von Neumann processor-memory architecture, and a working digital quantum simulator built on a quantum-computer platform. Although these developments are still constrained to the lab, they’re yet another sign that a quantum leap in computing may be just around the corner.
Living in the average dorm room costs students around $5,500 a year, but that only buys you drab cinderblock walls and bad furniture. To make your digs stand out, you'll need some serious gear. We've put together a list of five dorm-room...well, certainly not essentials, but definitely gadgets that will set your space apart.
There are a lot of people out there dealing with some degree of hearing disability--one in six, by some estimates--and that audience is typically underserved when it comes to the cinematic experience. Some films are screened with subtitles, but often at odd times. But Sony is working up a fix in its UK lab: a pair of glasses that places subtitles right in the user's field of view.
Dual-core processors have been a computing mainstay for more than six years, allowing machines to handle two tasks at once without sacrificing speed in either. This year, dual-core chips have begun popping up in app-hungry phones. The next step: cameras. The Olympus PEN E-P3 is the first digital camera running on a dual-core chip, which lets it capture, retouch, and save shots nearly twice as fast as most competitors.
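The benefit is easy to picture: two independent, processor-hungry jobs can run side by side instead of taking turns. Here's a minimal Python sketch of that idea, using two worker processes so each job can claim its own core (the busywork function is a stand-in for real camera tasks, not Olympus's firmware):

```python
import time
from concurrent.futures import ProcessPoolExecutor

def busywork(n):
    """A stand-in for a CPU-bound job like retouching or encoding a shot."""
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    start = time.perf_counter()
    # Two workers, two jobs: on a dual-core chip each job gets its own core.
    with ProcessPoolExecutor(max_workers=2) as pool:
        results = list(pool.map(busywork, [20_000_000, 20_000_000]))
    print(f"{len(results)} tasks finished in {time.perf_counter() - start:.2f}s")
```

On a single core the two jobs would run back to back; on two cores they overlap, which is roughly where that "nearly twice as fast" figure comes from.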
Home theater PCs (HTPCs) have kind of fallen out of favor as simpler, more efficient media gadgets have sprung up. But as we found with the Apple TV, sometimes simpler doesn't mean better. Our friends at Sound+Vision took a second look at the HTPC and found some distinct benefits for the DIY-minded: low cost, endless possible upgrades, and lots of flexibility and power. Check it out here.
Since 2007, IBM has been working with the University of Illinois at Urbana-Champaign to construct the world’s fastest academic supercomputer. This week we learn that work has been mysteriously halted by IBM, which is taking back the parts it recently delivered to the school, giving U. of Illinois its money back, and ceasing work on the project just months before the massive computer is slated to be completed.
The biggest hack ever discovered has been exposed by McAfee, and its breadth and depth would be impressive if it weren't so disconcerting: five years, at least 72 different governments, NGOs, and other organizations (including the United Nations and the International Olympic Committee), and reams and reams of secret data. McAfee believes there is a single “state actor” behind the attacks, but the company has declined to name it.
MIT engineers have a reputation for applying their vast intellectual resources and physical energies toward solving some of mankind’s greatest challenges. And it’s fair to say this morning that at MIT’s Computer Science and Artificial Intelligence Laboratory, researchers have lived up to that expectation.
A little more than a year ago, we wrote about an Australian hobbyist named Bruce Dell who was claiming--with video evidence to back it up--that he'd created a new graphics technology that could deliver virtually unlimited detail. That is, rather than working with a limited number of polygon shapes (restricted, of course, by computing power), a graphics environment could be built from an infinite number of 3-D virtual atoms, much like the physical world.
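To make the polygons-versus-atoms distinction concrete, here's a toy Python sketch of point-based rendering: march a ray for each screen column through a cloud of 3-D points and shade the first point it gets close to. This is a naive illustration of the general approach, not Dell's closely guarded algorithm:

```python
# Toy point-based "renderer": instead of rasterizing polygons, march a
# ray per screen column through a cloud of 3-D points and stop at the
# first point within a hit radius. A naive sketch of the general idea
# of point-based rendering -- not Bruce Dell's actual algorithm.
import math

POINTS = [(x * 0.1, math.sin(x * 0.1), 5.0) for x in range(-30, 31)]  # a ribbon of points
HIT_RADIUS = 0.15

def march(origin, direction, max_dist=10.0, step=0.05):
    """Step along a ray; return distance to the first nearby point, or None."""
    t = 0.0
    while t < max_dist:
        px = origin[0] + direction[0] * t
        py = origin[1] + direction[1] * t
        pz = origin[2] + direction[2] * t
        for (x, y, z) in POINTS:
            if (px - x) ** 2 + (py - y) ** 2 + (pz - z) ** 2 < HIT_RADIUS ** 2:
                return t
        t += step
    return None

# One "row" of the image: a '#' wherever a ray hits a point.
row = "".join(
    "#" if march((0.0, 0.0, 0.0), (col * 0.05, 0.0, 1.0)) is not None else "."
    for col in range(-20, 21)
)
print(row)
```

The catch, as skeptics of the demo pointed out, is the search cost: a scene made of points only becomes practical when you can find the nearest relevant point without scanning the whole cloud, which is exactly the part Dell kept secret.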
Five amazing, clean technologies that will set us free, in this month's energy-focused issue. Also: how to build a better bomb detector, the robotic toys that are raising your children, a human catapult, the world's smallest arcade, and much more.