Lawrence Livermore National Laboratory (LLNL) has bought a first-of-a-kind brain-inspired supercomputing platform for deep learning inference developed by IBM Research.
Based on a breakthrough neurosynaptic computer chip called IBM TrueNorth, the scalable platform will process the equivalent of 16 million neurons and 4 billion synapses and consume the energy equivalent of a tablet computer – a mere 2.5 watts of power for the 16 TrueNorth chips.
The brain-like, neural network design of the IBM Neuromorphic System can perform inference for complex cognitive tasks such as pattern recognition and integrated sensory processing far more efficiently than conventional chips.
The new system will be used to explore new computing capabilities important to the National Nuclear Security Administration’s (NNSA) missions in cybersecurity, stewardship of the nation’s nuclear deterrent and non-proliferation. NNSA’s Advanced Simulation and Computing (ASC) program will evaluate machine learning applications, deep learning algorithms and architectures and conduct general computing feasibility studies. ASC is a cornerstone of NNSA’s Stockpile Stewardship Program to ensure the safety, security and reliability of the nation’s nuclear deterrent without underground testing.
“Neuromorphic computing opens very exciting new possibilities and is consistent with what we see as the future of the high performance computing and simulation at the heart of our national security missions,” says Jim Brase, LLNL deputy associate director for Data Science. “The potential capabilities neuromorphic computing represents, and the machine intelligence these will enable, will change how we do science.”
The technology represents a fundamental departure from the computer design that has been prevalent for the past 70 years, and could be a powerful complement in the development of next-generation supercomputers able to perform at exascale speeds – roughly 50 times faster than today’s most advanced petaflop (quadrillion floating-point operations per second) systems. Like the human brain, neurosynaptic systems require significantly less electrical power and volume.
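The 50-times figure can be sanity-checked with simple arithmetic. As a hedged sketch, it assumes "today's most advanced" systems peak at around 20 petaflops (the approximate class of machines such as LLNL's Sequoia at the time); the exact baseline is not stated in the article.

```python
# Rough scale comparison between an exascale machine and a
# leading petaflop-class system. The 20-petaflop baseline is an
# assumption, not a figure from the article.
EXAFLOP = 1e18                 # floating-point operations per second
petaflop_system_peak = 20e15   # assumed peak of a top petaflop system

speedup = EXAFLOP / petaflop_system_peak
print(f"Exascale is ~{speedup:.0f}x a 20-petaflop system")
```

With a smaller baseline (say, 1 petaflop) the gap would be three orders of magnitude, which is why the multiplier quoted for "exascale" varies with the machine chosen for comparison.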
“The low power consumption of these brain-inspired processors reflects the industry’s desire and a creative approach to reducing power consumption in all components for future systems as we set our sights on exascale computing,” says Michel McCoy, LLNL program director for Weapon Simulation and Computing.
“The delivery of this advanced computing platform represents a major milestone as we enter the next era of cognitive computing,” said Dharmendra Modha, IBM Fellow and chief scientist: brain-inspired computing at IBM Research – Almaden. “We value our relationships with the national labs. In fact, prior to design and fabrication, we simulated the IBM TrueNorth processor using LLNL’s Sequoia supercomputer. This collaboration will push the boundaries of brain-inspired computing to enable future systems that deliver unprecedented capability and throughput, while helping to minimise the capital, operating and programming costs – keeping our nation at the leading edge of science and technology.”
A single TrueNorth processor consists of 5.4 billion transistors wired together to create an array of 1 million digital neurons that communicate with one another via 256 million electrical synapses. At 0.8 volts, it consumes 70 milliwatts of power running in real time and delivers 46 giga-synaptic operations per second – orders of magnitude less energy than a conventional computer running inference on the same neural network.
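The quoted per-chip figures imply an energy cost per synaptic operation, and they scale up to the 16-chip totals described elsewhere in the article. A minimal back-of-the-envelope sketch, using only the numbers stated above:

```python
# Derived figures from the quoted TrueNorth specs; all inputs
# come from the article, the derivations are illustrative only.
chip_power_w = 70e-3       # 70 milliwatts per chip
synaptic_ops_per_s = 46e9  # 46 giga-synaptic operations per second
neurons_per_chip = 1_000_000
synapses_per_chip = 256_000_000
chips = 16

# Energy per synaptic operation, in picojoules.
energy_per_op_pj = chip_power_w / synaptic_ops_per_s * 1e12
print(f"~{energy_per_op_pj:.1f} pJ per synaptic operation")

# System totals for the 16-chip platform.
print(f"{chips * neurons_per_chip:,} neurons")
print(f"{chips * synapses_per_chip:,} synapses")
```

Note that 16 chips at 70 mW each account for only about 1.1 W of the platform's quoted 2.5 W; the remainder presumably covers the surrounding board and I/O, though the article does not break this down.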
TrueNorth was originally developed under the auspices of Defense Advanced Research Projects Agency’s (DARPA) Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) program in collaboration with Cornell University.
Under the terms of the contract, LLNL will receive a 16-chip TrueNorth system representing a total of 16 million neurons and 4 billion synapses. LLNL will also receive an end-to-end ecosystem to help create and program energy-efficient machines that mimic the brain’s abilities for perception, action and cognition. The ecosystem consists of a simulator; a programming language; an integrated programming environment; a library of algorithms and applications; firmware; tools for composing neural networks for deep learning; a teaching curriculum; and cloud enablement.
Lawrence Livermore computer scientists will collaborate with IBM Research, partners across the Department of Energy complex, and universities to expand the frontiers of neurosynaptic architecture, system design, algorithms and software ecosystem.