Now that we’ve spent this week looking at all the incredible ways data is gathered, computed, analyzed and used, we thought we’d take a look through the archives to see how we got to this data age to begin with.
The earliest goliath computers were built by the Army and the Navy, not for civilian use. Flash forward to the late 60s, when civilians could rent computers to use for their own complex calculations (read: taxes). The next step was getting computers to understand queries posed in simple English, rather than just relying on complex computing languages, making them easier for laypeople to use. We sped up the data crunching by giving computers logical reasoning, faster search skills and multiple processors, and before long, they were helping design aircraft and visualizing energy emissions from black holes (seen above). And now, we’ve gotten so good at storing and finding data that we have to be careful, because wherever we go, we leave traces of ourselves.
Learn about all this and more through PopSci’s coverage over the years in this week’s archive gallery.
The First All-Electronic Computer: April 1946
The Army’s 30-ton ENIAC (Electronic Numerical Integrator and Computer), introduced in 1946, could solve the “basic” aerodynamic problem around designing shells in 130 hours. That may seem sluggish by today’s standards, but when you consider that any other existing machine at the time would have needed at least a year to solve the problem, and that even several lifetimes wouldn’t have been enough time for a human to do it, ENIAC represented a giant leap forward in computing. Inventor John W. Mauchly realized the need for such a computer when he was slogging through mounds of geophysical data for his research. Once it was built, the data-crunching monster took up the entirety of a 30-by-50-foot room that had to be ventilated to siphon off the heat produced by its 18,000 electronic tubes. Read the full story in Lightning Strikes Mathematics
The Mark II Calculator is the World’s Biggest Brain, Not the Fastest: May 1947
After the Mark I was unseated as the world’s strongest computer by the Army’s ENIAC, the Navy, looking to catch up, created the Mark II. The second generation improved on the Mark I in several ways, including a thinner, more easily changed roll of tape to spit out the answers; three more teletypewriters; and calculations done entirely by electrical relays, without any of the Mark I’s mechanical gears. Despite the Navy’s best efforts, the ENIAC remained on top, 1,000 times faster than the Mark II. Read the full story in Inside the Biggest Man-Made Brain
Time-Sharing a Computer: May 1967
At a time when computers were largely just tools for NASA, or the Army, we invited one of our feature writers to take one home for a spin. He used it for many of the things we do today – doing taxes, playing games and calculating. Our writer was convinced that time-sharing was the future of computing in the home. You wouldn’t have to buy a computer, you could just rent its services whenever you needed number-crunching power beyond that of a calculator (around April, perhaps). “It will be far cheaper,” he wrote, “to build one monster computer with thousands or even millions of customers hooked to it than to have small, individual machines in individual homes.” How times change. Read the full story in I Used a Real Computer at Home…and So Will You
Computing in Plain English: May 1982
Computers are great tools, but only if you know how to use them. Taking on BASIC or another computing language might have been daunting for the regular Joe just trying to do his taxes. And so, a kinder, gentler era of computing was ushered in with user-friendly programs that understood queries typed in plain English. Two programs, one called Intellect and one called Savvy, appeared around the same time. Savvy worked by interpreting patterns in the sentences, while Intellect relied on an understanding of the grammatical structure of the queries and the meaning of each specific word. While the makers of both programs disagreed on what method was best for helping computers to understand language, they both agreed that speech recognition was the next natural step. Read the full story in Smart Computers – Now They Speak Our Language
Artificial Intelligence and Multiple Processors: April 1983
A Japanese computer, with a hoped-for release date in the early ’90s, integrated artificial intelligence concepts with the goal of getting the machine to help program itself, recognize relevant data, translate languages, make logical inferences and learn from experience. This fifth-generation computer would use very-large-scale-integration semiconductor chips that were much more powerful than the ones already in use. We compared this machine to 2001‘s HAL in power. The great leap forward being demonstrated in the fifth-generation computer was in the hardware (pictured here). The architecture of most computers at the time only allowed the machines to do one task at a time, but this machine (along with a couple of supercomputers already in existence) had multiple processors, allowing it to do several parts of a computation at once and greatly speed up the process. Read the full story in Fifth-Generation Computers
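The idea of splitting one computation across several processors can be sketched in a few lines of modern Python. This is purely an illustration of the principle, not the fifth-generation machine’s actual architecture: a large sum is divided into chunks, and a pool of worker processes tackles the chunks in parallel.

```python
# Minimal sketch of multi-processor computation: divide the work into
# chunks and let several worker processes handle them at once.
# (Illustrative only; not the fifth-generation machine's design.)
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum the integers in the half-open range [start, stop)."""
    start, stop = bounds
    return sum(range(start, stop))

if __name__ == "__main__":
    n = 1_000_000
    # Split [0, n) into four equal chunks.
    chunks = [(i, i + 250_000) for i in range(0, n, 250_000)]
    with Pool(4) as pool:  # four workers compute their chunks in parallel
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # same answer as summing serially: n * (n - 1) // 2
```

The key point matches the article: each processor handles one part of the problem simultaneously, so the wall-clock time shrinks even though the total amount of arithmetic is unchanged.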
Supercomputers Solve the Previously Unsolvable: May 1987
Before supercomputers, airplanes were designed largely by trial and error: throwing model planes into wind tunnels, sending up test pilots and hoping they’d come back. We tend to side with mathematician and computer whiz John von Neumann’s sentiment that “a wind tunnel is an insult.” We’re glad we have supercomputers to solve this and many other problems for us that would be impossible without their incredible calculating power. We laid out several examples of fundamental problems solved by supercomputers, including the aeroelasticity of airplane wings, the drag on a car, combustion processes inside engines (we find it a little frightening that we hadn’t already figured that one out) and energy emissions from black holes (the model of which is pictured here). Read the full story in Solving Unsolvable Problems with Supercomputers
The Fast Data Finder: December 1987
A time when we couldn’t sift through piles and piles of data at the push of a button seems long ago and far away. The search function is an integral part of our lives now, but imagine the joy you would feel if you were using it for the first time. In 1987, our writer tested out the Fast Data Finder, a system originally developed for the Department of Defense, and experienced that joy firsthand. The FDF worked like a pipeline, with data going in one end and out the other, and freezing in the middle so the system could search for whatever keywords you asked for. So how fast was it? “In less time than it takes me to find my coffee cup, a black box of computer chips called the Fast Data Finder has read every word in the AP file.” Not too shabby. Read the full story in Super Searcher
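The pipeline idea the article describes — data streaming past a fixed set of search terms, with matches flagged as they go by — can be sketched in software. This is a hypothetical toy version for illustration, not the Fast Data Finder’s actual hardware design:

```python
# Toy sketch of streaming keyword search, loosely in the spirit of the
# Fast Data Finder's pipeline: lines of text flow past, and any line
# containing a search term is flagged as it goes by.
# (Hypothetical illustration, not the FDF's real implementation.)
def stream_search(lines, keywords):
    """Yield (line_number, line) for each line containing any keyword."""
    lowered = [k.lower() for k in keywords]
    for num, line in enumerate(lines, start=1):
        text = line.lower()
        if any(k in text for k in lowered):
            yield num, line

# Example: scanning a small batch of (made-up) wire copy.
wire_copy = [
    "Navy unveils Mark II calculator",
    "Weather: clear skies expected",
    "ENIAC solves shell trajectory in hours",
]
hits = list(stream_search(wire_copy, ["ENIAC", "Mark II"]))
print(hits)  # the first and third lines match
```

Unlike an index-based search engine, nothing is precomputed here; every byte streams past the matcher, which is why the FDF’s raw throughput was the headline number.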
A Cloudless Map of Earth: December 1990
An artist and a computer graphics specialist at NASA sorted through tons and tons of satellite images of Earth until they finally found enough cloudless pictures to stitch together for this, the most detailed computer-generated map of the world ever created at that time. The composite was made up of 32 million pixels’ worth of photos, with a resolution of four kilometers, taken by satellites carrying the Advanced Very High Resolution Radiometer. Read the full story in World View
Tracking Mark’s Personal Data: July 2002
We tracked a fictional graphic designer named Mark through a day of work, ATM withdrawals, emails, IMs, phone calls, doctor’s visits and errands to see who could see his personal data – and how. The answer roughly amounted to a lot of people and a lot of ways. Your data is left in a lot of places you wouldn’t necessarily expect. For example, ATMs have access to not only your financial information, but your picture as well. Your instant messages can be scanned by your employer for suspicious words and phrases. And when you delete files from your computer, they aren’t actually gone until new data overwrites them; until then, they can be retrieved by a recovery program. All in all, it left us feeling pretty paranoid. Read the full story in All Eyes Are on You
Harvesting the Power of Idle Laptops to Assist Supercomputers: June 2003
Think only the most badass of scientists get to run supercomputers? You’re right, but we laymen can still contribute by lending our unused computer power to distributed computing programs like Climateprediction.net, which launched in 2003 and helps predict changes in Earth’s climate, or SETI@home, which searches for extraterrestrial signals. So you can feel good about leaving your computer on during your extended lunch break, because you’ll be helping look for aliens. Read the full story in Supercomputing Made Simple