If you’ve ever asked a car mechanic how long a part will last until it breaks, odds are they shrugged their shoulders. They know how long parts last on average, and they can see when one is close to breaking. But knowing how many miles are left is extremely difficult, even using a supercomputer, because the exact moment a belt snaps or a battery dies is to some extent random.
Scientists at Sandia National Laboratories are creating a concept for a new kind of computer for solving complex probability problems like this. They propose that a “probabilistic computer” could not only create smarter maintenance schedules but also help scientists analyze subatomic shrapnel inside particle colliders, simulate nuclear physics experiments and process images faster and more accurately than is possible with conventional computers.
As part of a new microelectronics codesign research program, the Department of Energy’s Office of Science recently awarded the project $6 million over the next three years to develop the idea. Sandia will be working with Oak Ridge National Laboratory, New York University, the University of Texas at Austin and Temple University in Philadelphia.
A codesign microelectronics project involves multidisciplinary collaboration that takes into account the interdependencies among materials, physics, architectures and software. Researchers also will look at ways to incorporate machine learning methods.
The concept for a probabilistic computer runs opposite to how computers are normally built and programmed, Sandia scientist Brad Aimone said. Instead of making one that is perfectly predictable, Sandia wants one with built-in randomness that computes information differently every time.
“To a large degree, and at a great energy cost, we engineer computers to eliminate randomness. What we want to do in this project is to leverage randomness. Instead of fighting it, we want to use it,” said Aimone, who leads the project he and his team call COINFLIPS (short for CO-designed Improved Neural Foundations Leveraging Inherent Physics Stochasticity).
“What if, when I’m communicating with you, I flip a coin?” Aimone said. “If heads, you act on my message; if tails, you ignore it. We want to discover how you can use randomness like this to solve problems where probability is important.”
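Aimone's coin flip can be sketched in a few lines of code. This is our illustration of the idea, not the project's design; the function name and 50/50 probability are assumptions made for the example.

```python
import random

def receive(message, p_heads=0.5, rng=random):
    """Hypothetical sketch of coin-flip communication: the receiver
    acts on the message only if a virtual coin lands heads."""
    if rng.random() < p_heads:   # heads: act on the message
        return message
    return None                  # tails: ignore it

# Over many messages, roughly half are acted on.
acted_on = sum(receive("ping") is not None for _ in range(10_000))
```

Run many times, the count of acted-on messages clusters around half the total; any single run is unpredictable, which is exactly the property a probabilistic computer would exploit rather than suppress.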
Concept modeled after unpredictable connections between brain cells
Aimone is an expert in technology that mimics the brain, including machine learning. He got his idea for a probabilistic computer from how brain cells talk to each other.
Inside your brain, billions of cells called neurons pass information across trillions of cell-to-cell connections called synapses, Aimone said. Whenever one neuron has a message, it sends a signal to many other neurons at the same time, but only a random fraction of the receiving synapses carry the message on to more cells. Neuroscientists don’t agree on why, but Aimone thinks this could be one reason brains do some tasks better than computers, such as learning and adapting, or why they use less energy.
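The synaptic behavior described above can be modeled as a toy simulation. This is our simplification for illustration, not Sandia's hardware or model; the synapse count and transmission probability are assumptions.

```python
import random

def stochastic_broadcast(n_synapses=1000, p_transmit=0.2, seed=None):
    """Toy model of stochastic synaptic transmission: one neuron
    signals n_synapses downstream connections, but each synapse
    relays the spike only with probability p_transmit."""
    rng = random.Random(seed)
    return sum(rng.random() < p_transmit for _ in range(n_synapses))

# Only a random fraction of the 1,000 synapses carry the message on.
relayed = stochastic_broadcast(seed=42)
```

Each call with a different seed relays a different subset of the signal, yet the fraction hovers near p_transmit; scaling this to trillions of synapses is what makes the randomness budget so demanding.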
To imitate this brain behavior, scientists need to figure out how to generate trillions of random numbers at a time. Producing that much randomness is too complex and consumes too much power on conventional computers, said Sandia’s Shashank Misra, who leads the COINFLIPS hardware team.
“We will need to get creative with new approaches, including new materials, atomic-scale control and machine learning-driven designs to generate the sheer volume of randomness needed and to make it useful for computation,” Misra said.
Conventional computers can look at the optical illusion on the left and normally only see a vase or two faces. Sandia National Laboratories is laying the groundwork for a computer that, like our brains, can glance many times and see both. Credit: Laura Hatfield, Sandia National Laboratories
COINFLIPS will also identify tasks that benefit from randomness.
Probabilistic computers are part of a larger effort at Sandia to explore what computers in the future might look like. Researchers around the world have recognized that the rate at which computers are improving is slowing down, Aimone said. To break past the apparent limits of computers, scientists are looking at new, original ways of designing them.
Conrad James, the Sandia manager of the COINFLIPS team, said, “Several of us at Sandia have been exploring brain-inspired computing and new design approaches for years. Encouraging more communication between mathematicians, algorithm developers and device physicists led to the formation of this team and research proposal.”
Sandia adds to other efforts to rethink computers
COINFLIPS was one of only 10 proposals selected nationwide to receive funding to design new, energy-efficient microelectronics. Separately, Sandia is lending its expertise in nanotechnology and computer modeling to another selected project led by Lawrence Berkeley National Laboratory.
These researchers will be redesigning nanosized sensors used in communications, imaging, remote sensing and surveillance technologies to be more compact, efficient and integrated into a computer processor.
“The photon absorption, the transduction to an electrical event and the measurement will all be part of one quantum system,” said Sandia physicist François Léonard, who is a member of the collaboration.
They'll also use sophisticated materials like carbon nanotubes, which are hollow carbon straws 100,000 times thinner than a strand of hair, to improve the sensors.
A third Sandia team, led by researchers Alec Talin and Matt Marinella, will assist Oak Ridge National Laboratory with another selected project. Their findings could help improve the energy efficiency of sensor processing in driverless vehicles, mobile gadgets and satellites.
According to Talin, the majority of a computer chip’s time and energy is spent moving data between where it is stored and where it is processed. By combining storage and processing in one place and employing brain-inspired devices developed at Sandia, it may be feasible to reduce the amount of power computers need.
“The important concept is that memory and logic (processing) are co-located in the same basic constituent in the brain, the neuron,” Talin explained.
Fast, energy-efficient systems could possibly execute complicated tasks like image recognition and language translation in real time on mobile devices like smartphones without relying on cloud computing, according to Talin.