16 June 2008

"Fast bipolar nonvolatile switching," and why it changes everything


Yet more memristor news from HP Labs. An eye-blink after their stunning announcement of the memristor's existence as the fourth fundamental passive electronic circuit element comes Nature Nanotech's advance online publication of significant progress in the practical fabrication of these devices.

Authors Yang, Pickett, Li, Ohlberg, Stewart and Williams--all of HP Labs in Palo Alto--state:

We have built micro- and nanoscale TiO2 junction devices with platinum electrodes that exhibit fast bipolar nonvolatile switching. We demonstrate that switching involves changes to the electronic barrier at the Pt/TiO2 interface due to the drift of positively charged oxygen vacancies under an applied electric field. Vacancy drift towards the interface creates conducting channels that shunt, or short-circuit, the electronic barrier to switch ON. The drift of vacancies away from the interface annihilates such channels, recovering the electronic barrier to switch OFF. Using this model we have built TiO2 crosspoints with engineered oxygen vacancy profiles that predictively control the switching polarity and conductance.

The keywords are "engineered oxygen vacancy profiles" and "predictively control." They indicate that memristors are hurtling from laboratory success story to the everyday IC design toolkit, right before our eyes. Remarkable!

More remarkable is the unusual behavior of these devices. First, there's no doubt that they'll be used as replacements for flash memory and hard disks: they're faster, they use less energy, they appear to be denser (50 nm × 50 nm, in the versions discussed in Nature Nanotech), they operate in parallel, and scalability seems to be a real strength. Given the world's exponentially growing appetite for information and information storage, this is all fine news for iPhones and laptops and other good things. But memristors' capability is not limited to storing 1s and 0s. That would be so 21st century. No, memristors can store values in between. They are analog devices, and they will facilitate parallel analog computers. Those are common enough: you have one between your ears.
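For the tinkerers, here is a minimal sketch, in Python, of the linear ionic-drift model HP published with the original memristor announcement (Strukov et al., Nature, May 2008). The parameter values and pulse widths below are illustrative choices of mine, not figures from the Nature Nanotech paper, but the behavior is the point: the longer the write pulse, the lower the resistance left behind, and the device simply keeps whatever value it was left with.

# Minimal sketch of the linear ionic-drift memristor model (Strukov et al.,
# Nature 2008).  Parameter values are illustrative, not HP's measurements.

R_ON, R_OFF = 100.0, 16e3      # ohms: fully doped vs. fully undoped film
D = 10e-9                      # m: thickness of the TiO2 film
MU_V = 1e-14                   # m^2/(V*s): oxygen-vacancy mobility

def write_pulse(r_start, volts, seconds, dt=1e-5):
    """Apply a constant-voltage pulse and return the resulting resistance.
    State x = w/D is the normalized width of the conducting (doped) region:
        M(x)  = R_ON*x + R_OFF*(1 - x)
        dx/dt = MU_V * R_ON / D**2 * i(t)
    """
    x = (R_OFF - r_start) / (R_OFF - R_ON)       # recover state from resistance
    t = 0.0
    while t < seconds:
        m = R_ON * x + R_OFF * (1.0 - x)         # instantaneous memristance
        i = volts / m                            # Ohm's law
        x += MU_V * R_ON / D**2 * i * dt         # vacancy drift moves the boundary
        x = min(max(x, 0.0), 1.0)                # boundary stays inside the film
        t += dt
    return R_ON * x + R_OFF * (1.0 - x)

# Longer (or opposite-polarity) pulses leave different resistances behind,
# and the device holds that value with no power applied: analog and nonvolatile.
r = 14000.0                                      # some initial state, in ohms
for ms in (50, 100, 200):
    r_after = write_pulse(r, +1.0, ms * 1e-3)
    print(f"{ms:3d} ms at +1 V from {r:.0f} ohms -> {r_after:.0f} ohms")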

HP Labs' Jamie Beckett spoke with the researchers:
"A conventional device has just 0 and 1 – two states – this can be 0.2 or 0.5 or 0.9," says Yang. That in-between quality is what gives the memristor its potential for brain-like information processing... Any learning a computer displays today is the result of software. What we're talking about is the computer itself – the hardware – being able to learn."

Beckett elaborates,
...[Such a computer] could gain pattern-matching abilities [or could] adapt its user interface based on how you use it. These same abilities make it ideal for such artificial intelligence applications as recognizing faces or understanding speech.

Another benefit would be that such an architecture could be inherently self-optimizing. Presented with a repetitive task, or one requiring parallel processing, such a computer could be designed to route subtasks internally in increasingly efficient ways or using increasingly efficient algorithms. Tired of PCs that seem slower after a few months' use than when they were new? This computer would do just the opposite.

Beckett continues:
"When John Von Neumann first proposed computing machines 60 years ago, he proposed they function the way the brain does," says Stewart. "That would have meant analog parallel computing, but it was impossible to build at that time. Instead, we got digital serial computers."
Now it may be possible to build large-scale analog parallel computing machines, he says. "The advantage of those machines is that intrinsically they can do more learning."
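To make "the hardware learning" a little more concrete, here is a toy sketch of my own, not anything HP has published: treat each memristor in a crossbar as an analog weight, let the array's summed currents perform the vector-matrix multiply, and "train" by nudging conductances up or down (which, in hardware, would be exactly the write pulses described above). The array sizes, learning rule, and names are all hypothetical.

# Toy sketch (hypothetical, not HP's design): a crossbar of analog
# conductances as a weight matrix, trained by small conductance updates.

import random

ROWS, COLS = 4, 3                       # 4 input lines, 3 output lines
G = [[random.uniform(0.1, 0.9) for _ in range(COLS)] for _ in range(ROWS)]

def crossbar_output(volts):
    """Each output line sums the currents I = G*V flowing into it:
    a vector-matrix multiply the analog array performs 'for free'."""
    return [sum(G[r][c] * volts[r] for r in range(ROWS)) for c in range(COLS)]

def train_step(volts, target, rate=0.05):
    """Delta-rule update: nudge each conductance to shrink the output error.
    In a memristive crossbar, each nudge would be a short write pulse."""
    out = crossbar_output(volts)
    for c in range(COLS):
        err = target[c] - out[c]
        for r in range(ROWS):
            G[r][c] = min(max(G[r][c] + rate * err * volts[r], 0.0), 1.0)

pattern = [1.0, 0.0, 1.0, 0.0]          # an input the device sees repeatedly
target  = [1.0, 0.0, 0.5]               # the response we want it to learn
for _ in range(200):
    train_step(pattern, target)
print("learned response:", [round(y, 2) for y in crossbar_output(pattern)])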

Not science fiction. Science fact. And soon--sooner than any of us imagined--it will be up to the engineers and entrepreneurs to leverage this in fantastical ways, using memristors as routinely as they do transistors, invented just sixty years ago.


09 June 2008

Putting the "Lab" back in "LabVIEW" -- while spreading LabVIEW out of the lab


As my panel on "Breakthrough Innovation" at NI Week last year discussed, National Instruments and its LabVIEW programming environment are both an important part of the nanotechnology ecosystem and a fine case study of the commercial benefits of ecosystem-building and customer-driven, recombinant innovation.

So some media coverage of National Instruments caught my eye recently, such as:

National Instruments Announces Global Multi-core Programming Workshop

National Instruments (Nasdaq: NATI) today announced an initiative sponsored by Intel Corporation to deliver free, hands-on multi-core programming workshops based on the NI LabVIEW graphical programming language to engineers and scientists around the globe. The Multi-core Programming with NI LabVIEW Hands-On Workshop will be presented in 18 U.S. and Canadian cities beginning in May and 15 international cities this fall...

and


EETimes: LabView poised for parallel role


James Truchard thinks he may have one of the keys to the multicore era.
The chief executive of National Instruments believes the company's flagship LabView environment offers many of the tools parallel programmers need today. NI has plugged into parallel research efforts at Intel Corp. and Berkeley to make sure LabView evolves with the times...
LabView can automate the assignment of jobs to different cores and threads. It also can generate C code and offers tools to manually optimize CPU loading, debugging and the use of cache in multicore chips...
"This is going to be pretty painful for people," said Truchard. "Today's programming languages really weren't developed for parallel programming" and the new mechanisms Intel and others are developing to plug the holes "add a lot of complexity to the programming environment," he said...
Truchard notes that high school students in the annual First Robotics contest used LabView to program FPGAs. Last year NI set up a lab at Berkeley so students there could use its tools to prototype embedded system designs.

(How about that. High school and college students programming their own silicon. Wow.)

The program is one of several at NI aimed at keeping LabView in the forefront of parallel programming research, currently one of the hottest topics in computer science, thanks to the move to multicore processors...

As you can deduce, there has been angst among analysts and engineers alike about the scarcity of software architectures supporting multiprocessing and parallelism.

Hardware has gotten ahead of software. But NI has had parallelism nailed in the architecture of LabVIEW for more than two decades, at least for scientific instrumentation and process automation. With the introduction of LabVIEW FPGA three years ago and LabVIEW 8.5 for multicore processors last year, support for true parallelism became not only possible but easy.
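The reason it's "easy" is dataflow: on a LabVIEW block diagram, any two nodes with no wire between them are free to execute concurrently, and the runtime maps them onto threads and cores without the programmer asking. The nearest textual equivalent is to spell that concurrency out by hand. Here is a rough Python analogue of mine (the function names and workloads are invented; this is not LabVIEW or NI code):

# Two branches of a "diagram" with no data dependency between them.
# LabVIEW would run these in parallel automatically; in a textual
# language the concurrency has to be requested explicitly.

from concurrent.futures import ProcessPoolExecutor

def filter_channel(samples):
    """Stand-in for one independent branch (say, a costly DSP step)."""
    return [0.25 * s for s in samples]

def mean_power(samples):
    """Stand-in for another branch that shares no wires with the first."""
    return sum(s * s for s in samples) / len(samples)

if __name__ == "__main__":
    chan_a = [float(n) for n in range(200_000)]
    chan_b = [float(n) for n in range(200_000, 400_000)]
    # No dependency between the two calls, so they may occupy two cores.
    with ProcessPoolExecutor() as pool:
        f1 = pool.submit(filter_channel, chan_a)
        f2 = pool.submit(mean_power, chan_b)
        filtered, power = f1.result(), f2.result()
    print(len(filtered), round(power, 1))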

Now comes this "initiative sponsored by Intel."

The thing that raised my antennae is that the coverage underplays LabVIEW's traditional market focus on scientific instrumentation and process automation.

Is NI staging a breakout into general computing? Might LabVIEW be the new C++? Is NI taking the "Lab" out of "LabVIEW"?

Technically, there's no reason it couldn't happen. It is a fully-fledged programming environment.

If, say, The MathWorks issued a press release about Matlab, touting sponsorship by Intel, extolling multiprocessing support, commencing a worldwide tour that spotlights the ease with which parallelism can be leveraged and managed on its platform... and avoiding mention of its foundational numerical-computing capabilities... then we might all agree that would be news. And so is this.

So. Can they take LabVIEW out of the lab? Is that what this is about, even? Maybe I'm just running a fever, but that's how it hit me.

If I were to venture a suggestion, I'd advise that NI consider seeding the programming world with a version of LabVIEW that omits its instrumentation and heavy-duty numerical modules. Maybe even a free one, or one targeted at students. Include its current facilities for integrating traditional code, and watch what happens when folks realize how straightforward programming for multiprocessing can be.

Arguing against the notion is the dictum from page 102 of Scott's Big Book of Pithy Pronouncements: "The Great Unfunded Liability of technology is: Support." On the other hand, NI's experience with Lego Robotics might suggest an affordable upper bound for the incremental support-cost obligation.

The time is right for Dr. T to stake his claim as the Moses of Multiprocessing. He already has the stone tablets. (He got 'em from Jeff Kodosky.)


Happily, research customers can revel in another new initiative at NI, aimed at strengthening its traditional ties to the academic and research marketplaces. (After all, another dictum is, "Dance with the girl that brought ya.") I don't know all the details yet, but what I've heard is exciting for researchers at colleges and universities. It includes an NI Week Academic Forum, a focused day of presentations and discussions for research customers and industry partners, which extends NI Week one day forward. More at http://www.ni.com/niweek/academic.htm --and it appears you'll be hearing more about the broader aspects of this new outreach initiative in the coming weeks.




P.S. I learned today that my "Breakthrough Innovations" co-panelist, Dr. Andrew Hargadon, Director of the Center for Entrepreneurship and the Energy Efficiency Center at the University of California, Davis, will present the closing keynote address at this summer's NI Week. Excellent!

----------

UPDATE: 2 July 2008: The Mystery of the Exploding Hit-Meter has been solved, thanks to a kindly email from NI's Vincent Carpentier. Seems this post is linked in NI News for July '08. The whole newsletter is well worth visiting-- lots of exciting developments and interesting applications, with webcasts and other resources of interest to both LabVIEW users and just-plain-folks interested in the latest in computing and instrumentation.