In 1996, when Steve Jobs came back to Apple after a decade-long exile, the company’s products took a dramatic turn. The next 15 years would be a whirlwind of monstrous success after monstrous success–iMac, iPod, iTunes Music Store, Intel-based MacBook, iPhone, MacBook Air, iPad. Jobs’s resignation as CEO yesterday has led to some excessive hand-wringing about Apple’s future, near and far, but the Jobsian philosophy–in which the consumer is king, in which there is one right way to do things, in which it is always preferable to trim than to add–will hopefully have permeated Apple enough to weather his departure. It’s already had an effect on the world at large.

The Jobsian philosophy is so fundamentally different from the ethos of the other tech giants–Microsoft especially, but also Sony, Google, Facebook, and (until last week) HP–that it’s surprising Jobs came from the same place and time. The core Silicon Valley companies all sprang from the tinkerers-in-garages set, a state of mind that has remained essential to techies decades later. Jobs was a key member of that group, and his work at Apple in the company’s early years was not really so different from Microsoft’s early work, though Jobs was always less of a businessman and perhaps a bit more autocratic (especially with regard to licensing).

After he was ousted by Apple’s board in 1985, he spent a decade building another company, NeXT, from scratch. It’s tempting to chalk up his later success to some of the life changes of that period (which you can read more about in Gizmodo’s timeline)–meeting his biological family, getting married, having two children, beginning to identify as Buddhist–but the change in attitude and work habits that enabled his success might be more easily explained with simple math. The guy was barely 30 years old when he was forced out of Apple, and 40 when he came back. And it was when he came back that his vision coalesced into something tangible.

The Jobsian vision is a variation on minimalism, something completely unexpected when dealing with computers, inherently complex devices. To Jobs, computers are for real people. Not businessmen (ahem HP) or corporations (ahem Microsoft), but people. Computers should be beautiful objects. (Jobs at one point said, when resigning from Apple in 1985, “If Apple becomes a place where computers are a commodity item, where the romance is gone, and where people forget that computers are the most incredible invention that man has ever invented, I’ll feel I have lost Apple.”) Computers should be intuitive and simple, but never dull. It is the duty of the computer’s maker to discover the best way to do things, and to eliminate anything that makes that path difficult. And when you make something simple, the details become the most important thing.


The easiest comparison, to me, is to a chef. Take the best ingredients, assemble them simply but precisely, and present a finished dish the way it should be consumed. No extra garnishes, nothing superfluous. Too much is worse than too little. No optional sauces, no mix-and-match, no “add this if you want.” The chef is the expert here, not the patron.

That mentality has irked or infuriated the tinkerers, as well it should. There’s certainly a sense of smugness–the Jobsian philosophy says “I know the way this should be done.” And it has led Apple astray, sometimes. But Apple is also backed by undeniably brilliant engineers and designers (chief among them Jony Ive), which is why its products succeed more often than not. A composed dish can be amazing, or awful, but a buffet can only rise to a certain height. That’s the Jobsian philosophy, anyway.

That minimalism has had an effect just about everywhere. Apple isn’t just a gadget-maker; products spearheaded under Jobs sit in the Museum of Modern Art. They’ve inspired like-minded folks in all kinds of disparate industries, consciously or not. Apple was one of the first to fiercely embrace certain typographic ideas (especially the Helvetica typeface, now used just about everywhere, particularly all over the web). Every tech company at least tried to start its own content store, from Microsoft’s Zune Marketplace to Sony’s Connect (some more successful than others). Companies like American Apparel copied Apple’s minimalism, while just about every ad strives to hit an “Apple-like” note of innovation and hipness. Apple’s future success won’t rely on whoever’s sitting in the boss’s seat–it’ll come from hiring brilliant folks and adhering to the model already in place.

Apple isn’t like Sony, which crumbled in ability and influence after the departure of its two founders. That’s because Sony’s founders were amazing engineers and designers–but that’s it. Without its two stars, Sony struggled. Apple, by contrast, has a guiding philosophy to lead it, one that can function under all kinds of different leaders. With any luck, Apple will be just fine.

Consolidate the Desktop, 1998

The iMac, first released in 1998, was Steve Jobs’s first major product (and major success) after rejoining Apple in 1996. It was not nearly the first all-in-one desktop computer–Apple’s own machines of the early 1980s were stylistic predecessors–but new attention was paid to simplicity and approachability. The tagline, “There is no step three!”, referred to the two-step process required to get the computer connected to the internet, right out of the box. The idea: this is not a hulking box for programmers. This is a fun computer, all in one neat package, that’ll let you do what you want to do as easily as possible.

Put Your Music Collection in Your Pocket, 2001

The iPod was far from the first digital audio player–Creative and Diamond’s Rio beat it to the punch–but it was the first to use a 1.8-inch hard drive, which let it balance storage (unlike the paltry 32MB flash-based Rio) with pocketability (unlike the hubcap-sized Creative Jukebox). But even more than that, the iPod promoted an ecosystem, an idea totally foreign at the time. One company would make your portable device, the computer it synced with, and the music software it used to sync. Better still, that same company would sell you the music you’d put on the device. It turned digital music from a confusing and questionably lawful process into something simple and clean.

The Phone of the Future, 2007

Any tech writer could scrawl for days about the importance of the iPhone. But let’s focus on how, despite being the most revolutionary phone (or, hell, gadget in general) of the past 20 years, it was also stunningly simple, especially at launch. In 2007, the iPhone did things no other phone did, and a lot of its differences were subtractions. No apps, not for a whole year. No 3G, which, in these days of 4G, we forget was such a big deal in 2007. No removable battery, a decision that inspired derision and outrage from the tech press. Only a single function button–home. The iPhone is complex in lots of ways–its pioneering multitouch gestures, accelerometer, and software keyboard were all new ideas, and are now ubiquitous–but at its core, using it was simple and easy to understand.

How Simple Is Too Simple?, 2008

The MacBook Air was not so much divisive as despised when first announced. Here was an outrageously gorgeous laptop, yes–sleek and thin beyond what anyone had thought possible–but one riddled with compromises. No optical drive. A single USB port. Paltry battery life. A poor screen. An underpowered processor. And all for a very, very high asking price. This was an instance of Apple working out the kinks in the first generation. Now, a few generations later, the MacBook Air is bar none our favorite consumer laptop. The tech has caught up to the concept: now we can fit a powerful processor, all-day battery life, a respectable amount of flash storage, and a great screen into that same tiny package. Even better, the price has dropped, even below $1,000. The lack of an optical drive, which seemed so offensive in 2008, seems perfectly fine now. Who needs one, with Netflix, Hulu, and the rest? The laptop is so light, and has such great battery life, that it’s in many ways the most portable laptop we’ve ever seen. The flash storage is fast, reliable, and sturdy. This computer just works, and in 2011, it works very, very well.

The Infamous Apple Tablet, 2010

The “Apple Tablet” was the most-rumored gadget in tech for about five years. The gadget blogs worked themselves into a frenzy, documenting every insane rumor, every possible incarnation of the fabled Tablet That Would Save Publishing And Possibly Ensure World Peace. And when it came, there was a deafening roar of “…is this it?” We waited years, and what we got was a giant iPod Touch? Well, yes and no–mostly no. The iPad turned out to be simpler than we had expected. Jobs and Apple realized that the existing conception of a tablet–a laptop shrunk into a slate form–didn’t work, and probably wouldn’t ever work. What did work for touch interface was the OS Apple had already made for the iPhone–so why not blow it up and see how it fared? One button. One big screen. No removable battery, no expandable memory. Wireless connections only. Only fingers necessary. The iPad is about as simple a slate as you could imagine. And it’s the only one to date that’s worked.

No Compromises

One of the most productive (yet controversial) aspects of the Jobsian philosophy is an abhorrence of compromise. Jobs is reputed to have always decided what he thinks a product should be able to do before consulting his engineers to see if they think it’s possible. (Early on, Jobs apparently built a case to enclose a computer his Apple III team was working on–only he built it much too small. Intentionally. Then he pressured the team to make a computer that fit in the case, which they did.) That extends to any external technology that he feels compromises his vision. In recent years, that’s meant a blackball on Blu-ray from Mac computers, and on Adobe Flash from Apple’s mobile devices. Both, says Jobs, are too intensive and put a bigger strain on devices than they’re worth. Those are major, standard platforms–we’re not talking about FireWire here–and if they don’t behave the way Apple wants, bam. Lose them. No other major tech manufacturer would dream of not supporting a standard like Blu-ray or Flash.

Back to the Mac, 2011

Earlier this year, Apple revealed (and then released) the newest version of Mac OS X, named Lion. Lion hasn’t been met with universal praise (on older machines it can be buggy and unstable, for one thing), but in philosophy it’s wholly in keeping with the Jobsian ethos. Lion is inspired by iOS, a much newer and simpler way of computing, and even if it doesn’t always work, the idea behind it is solid. Reduce the number of keystroke commands, and switch them to finger gestures. Do away with disks entirely. Don’t worry about meticulously saving your documents–Apple will do it for you. Share files by drag-and-drop, with no need for complicated online lockers or extra hardware. It’s a vision of computing that’s simple, understandable, and intuitive. You can read more about Lion’s futuristic new features in our piece on the subject.