In 1971, electrical engineering professor Leon Chua proposed a theoretical basic circuit element he called the memristor. In 2008, Hewlett-Packard brought the memristor out of theory and into the real world. And today, HP announced that it has demonstrated working devices built from memristors instead of the transistors that underpin all current computer chips. Since memristors can store and process data simultaneously, stack on top of one another in three dimensions, and function at much smaller sizes than transistors, this advance could increase the power and memory of computers to nearly unimaginable proportions within only a couple of years.
"In theory we can connect thousands of layers in a very straightforward fashion," Stan Williams, and scientist at HP, told the BBC. "It could provide a way of getting a ridiculous amount of memory on a chip."
Memristors improve on transistors in three key ways. First, they allow the same device to serve as both processor and memory. Right now, computers need separate devices for storing data (such as solid-state flash memory or regular magnetic hard drives) and for processing it (the computer chip itself). By eliminating the time and energy spent shuttling data between those separate pieces of hardware, a memristor system would work far faster, and with far less energy, than a traditional computer.
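To make "storing and processing in the same place" a little more concrete, here is a minimal sketch of the kind of analog crossbar computation researchers often describe for memristor arrays: data sits in the array as conductances, and simply reading the column currents performs a matrix-vector multiply right where the data lives. This is an idealized illustration of the general idea, not a description of HP's prototype.

```python
import numpy as np

# Idealized memristor crossbar: the array both stores a matrix (as the
# conductance programmed into each crosspoint) and computes with it.
rng = np.random.default_rng(0)

G = rng.uniform(1e-6, 1e-3, size=(4, 3))   # programmed conductances in siemens (the "stored" data)
v_rows = np.array([0.2, 0.0, 0.5, 0.1])    # voltages applied to the row wires (the "input")

# Ohm's law at each crosspoint plus Kirchhoff's current law on each column
# wire means the currents flowing out of the columns equal the matrix-vector
# product G^T v -- computed in place, with no separate memory bus or ALU.
i_cols = G.T @ v_rows
print(i_cols)
```

In principle every row and column participates at once, which is the sense in which a memristor array processes data exactly where it is stored.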
Second, memristors can be much smaller than transistors. Quantum mechanics limits how tiny transistors can be, a limit that current technology is rapidly approaching. Memristors would allow computer chips to continue getting smaller past that point, all without resorting to exotic tricks like graphene chips or quantum computing.
Lastly, unlike transistor circuits, which are confined to essentially flat, two-dimensional layouts, memristors can form three-dimensional networks. This added dimension vastly expands the number of connections, and thus the power, of a memristor computer. In fact, the 3-D network capability of memristors is so profound that Leon Chua, the man who first theorized the existence of memristors in 1971, believes that this technology could enable the creation of electronic brains. "We have the right stuff now to build real brains," he told the Times.
Hewlett-Packard has already created a few simple devices that run on memristors as a proof of concept, and it thinks it can have the first working models capable of replacing some current computer parts within three years. However, with memristors enabling chip development for decades past where transistors would have hit their physical limit, the true value of this advance may not be realized for years to come.
That's amazing. Go HP!
Holy shit, HP actually did something good for once.
This could save money. Proceed.
This is an incredible advancement.
I can't wait for it to reach consumer level products.
It looks like it won't take the customary 10 years to do so either...
oh but it will. and i would love to be wrong too. hehe... i just think there'll be so many baby steps along the way before they scale it all the way up. sorry for the pessimism. i hope i'm wrong.
Hehe, well they discovered this two years ago, so another three (five total) to make at least a small appearance in consumer technology isn't all that unexpected, especially considering how monumental a breakthrough this is.
Electronic brains designed by a human...
Sounds kind of scary.
@Vega_Obscura I was thinking the same thing.
That's great! Does it also mean that HP will sell their systems with better hard drives now? Maybe stop using those old 5800 rpm HDDs.
Funny, I was just watching Blade Runner last night. Great movie.
Electronic brains! We'll have "skinjobs" walkin' the streets... sweeet
Yet another step closer to my cybernetic body. There actually isn't much new to this story though. Except the brain part. Which reminds me. I need to go copyright the name iBrain.
Did you read the article? They're talking about true systems on a chip, so depending on the device, no hard drive at all may be needed.
The real question is what they're made of. Silicon? Marshmallows? The only thing they ruled out was graphene. Also, what adaptations will be needed on the software side? Most importantly, is it gonna be cheap?
This has been one of my thoughts about AI: our brains work as memory and CPU at the same time, while every computer we try to build AI on has separate memory and a separate CPU. If we could combine the two into one, maybe we could get AI to work a lot better.
Nice! Way to go HP!
I'm not sure that anyone totally gets what kind of breakthrough this really is...
We're saying that my desktop pc components could all be replaced by this little chip. Video card, sound card, ram, hard drive, everything.
Might need connections to plug things in, or read discs, but the bulk of the computer 'guts' would be gone.
I would love to have a hard-drive-sized processor with processing capability to scale; it would be like a 1,500 GHz processor. I could run Crysis on full graphics with that =)
"I'm not sure that anyone totally gets what kind of breakthrough this really is... We're saying that my desktop pc components could all be replaced by this little chip. Video card, sound card, ram, hard drive, everything."
True, although there would inevitably be other, smaller controllers for specific hardware reasons, and separate graphics and sound devices would still make sense for all of the reasons they do now.
With that said, a network of processing-but-also-memory nodes is, as the article notes, much more like a brain than like a present-day computer. It's a completely different architecture that would necessitate major changes in the design of whatever runs on top of it. I mean, think of a dual-core processor, or the switch from 32 to 64 bit. Major architecture changes, right? Now consider that neither of those things would have any meaning whatsoever in a memristor-based processor. There's no bus; there's a vast number of "cores." Instead of talking about computations per second, you'd talk about all of the data being processed simultaneously.
And, I mean, technically speaking, just like the brain, you'd only ever be using a small percentage of your "processing power" for any one task ... so the "scaling" comment above is a bit misleading.
I think people (including the writer) are confused by the bit about it being able to both store and process data. It's not at all meant to replace a hard drive. It would just combine the CPU cache with the processing core.
Well, after reading the *other* memristor article, it makes more sense - there are two possibilities.
One is a chip with its own memory that actually uses the computing power of the individual circuits. That's the "brain" layout folks are talking about, and it's entirely theoretical, nothing to do with ordinary computer science, etc.
The other is RAM that doesn't flush when the power goes off. *Shrug* And that's the one that folks are actually considering using. In that situation, nothing's combining with the CPU - which is still the same 2-4 cored linear number cruncher it's always been. Instead, RAM gets combined with the hard drive. That means data access much faster than even an SSD and a computer that, ideally, only ever has to "boot" once. = )
I'm betting the first iterations of this tech won't be very fast compared to some of the fastest home computers out there right now... it'll take some time before they optimize the hardware and the software running on it.
Just like when multi-core machines came out, basically no one had software capable of taking advantage of multiple threads, and most still don't. It'll take a long time before the software optimizations catch up and fully utilize the power of the hardware.
Whilst reading this I kept hearing "My CPU is a neural-net processor, a learning computer" playing in my head...
I knew this would happen someday..
Imagine storing every song, movie, book, and piece of software that *ever* existed on one chip.
I think RIAA/MPAA might give up then lol.
The Matrix is now non-fiction... Mobius ftw!
You still need to back up and share your data even when your CPU is offline or not connected to the network. So some of these will only be used as RAM. As long as they are faster than HDDs and can be RAIDed, there is a significant opportunity there.
As far as video cards, sound cards, etc. go... you may be able to virtualize all the CPUs and memory of your specialized cards. Want a new sound card? Segment off a piece of the CPU and make the new sound card there. The only thing you gain by separating out the other peripherals is speed. If this chip design delivers the speed, then there is no reason to have external devices.
Ports - you will need lots of ports. Sound card ports, mic ports, etc... Maybe everything (except video) will be standardized as either wireless or USB.
Specialized hardware: video capture, Wi-Fi transmitter/receiver, cellular transmitter/receiver, built-in speakers, touch screens (iPad)... you can't get away from it.
An awesome concept.
I do believe that it will only take a few years to see these put into systems like the article says, just not in homes so much. With it requiring special software, if it shows up in computers it will be in servers for big companies that can afford to have software created for it. Computers for the home will definitely be 10 years or so out. Any home use of this in the near future will more likely be in handheld-type devices: practice making the chips at the smaller sizes before scaling up to larger ones, practice writing software, and with all the phone app makers nowadays, we can get everybody out there practicing soon enough. I don't know, I'm rambling because I'm so tired. I go sleepy now. My thoughts are no longer clear.
If supergenius computers take over the world, will they make us pay income tax?
Give me liberty, or give me...MY MONEY BACK!
So a processor and memory on the same chip. It's gonna be expensive, plus the export controls needed to keep this kind of computing power out of the bad guys' hands will have to be extreme.
BTW: Does this mean when the processor or memory crashes (and you know it will), I'll have to buy a whole new computer?
Apple should spend their $300 billion to buy HP and use these chips in Macs.
Apple would take twice as long to develop the technology so they could milk us for every penny. HP needs to hurry up and finish development so they can gain market share. I choose the way that gets it to me faster.
April Fools', you goofy geeks. The pic is a closeup of a Twinkie! What a bunch of yim yams!!