Strolling New York City’s famed Fifth Avenue and the nearby garment district, we crossed the portals of a new type of store and stepped into a brand-new world. The new retailers were computer stores, which we recognized as the herald of an oncoming revolution.

For sale on the merchants’ shelves were “hobby” or home computers, technically called microcomputers and later personal computers. These were first the passion of the build-it-yourself crowd; later, armed with software like VisiCalc and WordStar, number-crunchers and wordsmiths alike began slipping the little machines past the corporate eyes of Information Systems departments.

We took readers inside such microcomputers as the Altair, considered the first commercially successful personal computer, of which about 10,000 were sold. The Altair was built around an Intel 8080 chip and ran a version of the BASIC programming language from a fledgling company, Microsoft.

We also introduced our readers to some of the important and influential personalities from the frontier days of the microcomputer. Weighing in with opinions on the new machines were Ted Nelson, a founder of the Southern California Computer Society (one of the earliest such computer groups), and David Ahl, who started Creative Computing magazine from the basement of his Morristown, New Jersey, home in 1975. We also met Ed Roberts, president of MITS, the company whose assemble-it-yourself computer was but the opening salvo in a movement that changed how America did business. At the time, such pioneers were often regarded as zealots trumpeting a fad. It was no fad.

Timeline: December 1976

In 1976 a mysterious illness, which becomes known as Legionnaires’ disease, kills 29 conventioneers in Philadelphia; IBM introduces the first inkjet printers; two space probes, Viking 1 and Viking 2, land on Mars; the first auto-focus camera is introduced; and the United States celebrates its bicentennial.

Fast-growing new hobby: Real computers you assemble yourself

Low-cost microcircuits bring tremendous computing power to thousands of homes.

By Ed Edelson

The hobby of home computing has arrived, and its prospects are as dazzling as the display that Bob Arning, who runs The Computer Store, nestled somewhat incongruously amidst the sandwich joints and lofts of New York’s garment district, has been showing me.

I could have plunked down about the same amount I would pay for a color TV set and walked out with an assemble-it-yourself kit, ready to build my own computer. Thousands of Americans have been doing that lately.

“What can you do with a home computer?” says David Ahl, the publisher of Creative Computing magazine, in response to my question. “Well, it can keep your Christmas card list for you. You can write and edit letters on it. You could use it to make out your shopping list by keeping a running record of kitchen supplies. Once you’ve had a computer for a while, you’ll ask yourself, ‘How did I ever get along without one?’”

Home computers are so useful because they are real computers, not toys. Pointing at the home computer on a table, Stan Veit, owner of The Computer Mart on New York’s Fifth Avenue, says, “That has more power than the early IBM 360 Model 30.” If you know computers, you’ll remember that the 360 was considered revolutionary when it appeared in the 1960s.

The company that usually gets the credit for starting the microcomputer revolution is Intel Corp., a California-based semiconductor outfit. In the late 1960s, Intel helped a Japanese firm develop LSI chips for pocket calculators. Instead of using several chips for the circuitry, Intel created a single chip equal to the complete central-processing unit for a computer.

The first-generation microprocessor, the Intel 4004, had the equivalent of 2,250 transistors on a chip. By 1973, Intel had the 8080, which was 20 times faster than the 4004. That was what H. Edward Roberts had been waiting for.

Ed Roberts, president of MITS, an electronics company in Albuquerque, New Mexico, had been thinking about some computer kits for a while. The 8080 had the capabilities he was after. The first MITS do-it-yourself computer (called the Altair because Roberts’ daughter pointed out that the starship Enterprise would visit a planet of that name on a current rerun of “Star Trek”) was announced in January 1975. MITS shipped 5,000 units in 1975 and the home computer industry was off and running.

Describing that industry isn’t easy, because it is exploding rather than growing. Noting that the Southern California Computer Society had begun with two members in September 1975 and had 20,000 members just one year later, Ted Nelson, a founder of the group, told a recent conference, “If the SCCS continues to grow at its present rate, in four years all mankind will be dues-paying members.” That’s just one computer club; it’s estimated that new clubs are being started at the rate of one a week.
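
Nelson’s quip is easy to check: SCCS went from 2 members to 20,000 in a single year, a growth factor of 10,000. Below is a rough back-of-the-envelope sketch of that extrapolation; the constant yearly growth factor and the 1976 world-population figure of roughly four billion are assumptions made for illustration, not numbers from the article.

```python
# Back-of-the-envelope extrapolation of Ted Nelson's quip.
# Assumptions (not from the article): the 10,000x one-year growth factor
# holds steady, and world population in 1976 is about 4 billion.

members = 20_000                   # SCCS membership after its first year (Sept. 1976)
growth_factor_per_year = 10_000    # 2 members -> 20,000 members in one year
world_population = 4_000_000_000   # assumed 1976 figure, for illustration

years = 0
while members < world_population:
    members *= growth_factor_per_year
    years += 1

print(f"At that rate, membership would pass world population in about {years} years.")
# Prints 2 -- so Nelson's "four years" was, if anything, conservative.
```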

This kind of furious growth in a field so new that major discoveries are commonplace has produced a ravenous appetite for information. Everyone with a home computer wants to know what everyone else is doing.

Ahl sees no limits for the hobby computer. “In five years,” he says, “schools will have them, homes will have them, kids will have them. If a school doesn’t have one, the kids will start bringing their own computers in.”

“It’s comparable to pocket calculators,” says Ahl. “Five years ago, if I had asked you whether you needed a pocket calculator, you wouldn’t have been able to think of a need for it. Today, everyone has a calculator.”

“This is the beginning of a movement that will have a greater impact than the calculator. A computer doesn’t just do calculations. It does anything you want it to.

“Right now, the data-processing industry regards us as a fringe movement, something out of the mainstream of interest. They’re wrong. In five years, the home computer will be the mainstream of this business.”

Photos: plug-in cables; plug-in circuits, which enable computer owners to expand the capability of most machines with additional memory.