Today’s artificial intelligence systems consume enormous amounts of energy, so developing low-energy AI is essential for a sustainable world.
According to researchers at Imperial College London, artificial intelligence could be performed using tiny nanomagnets that communicate much like neurons in the brain. Their new technique may significantly cut the energy cost of AI, which is currently doubling roughly every 3.5 months worldwide.
In a paper in Nature Nanotechnology, the international team published the first proof that networks of nanomagnets can be used to perform AI-like processing. The researchers showed that nanomagnets can handle ‘time-series prediction’ tasks, such as predicting and regulating the insulin levels of diabetes patients.
Artificial intelligence based on ‘neural networks’ mimics the way the brain’s neurons communicate with one another to process and store information. The mathematics behind neural networks was originally devised by physicists to describe how magnets interact, but at the time it was too difficult to use magnets directly for computation, as researchers did not know how to input data or read out results.
Instead, software running on conventional silicon-based computers was used to simulate these magnet-like interactions. Advances made by the team now make it possible to use magnets directly to process and store data, eliminating the middleman of a software simulation and potentially saving a great deal of energy.
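The shared mathematics the article refers to is the Hopfield/Ising picture, in which binary magnet-like ‘spins’ store patterns through their pairwise couplings. As a minimal sketch (illustrative sizes; this is the kind of software simulation the nanomagnets would replace, not the authors’ actual system):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hopfield-style network: binary "spins" (+1/-1) store a pattern in a
# symmetric coupling matrix J. This is the same math used to describe
# interacting magnets (an Ising-type energy).
pattern = rng.choice([-1, 1], size=16)           # one stored pattern
J = np.outer(pattern, pattern).astype(float)     # Hebbian couplings
np.fill_diagonal(J, 0.0)                         # no self-coupling

# Start from a corrupted copy of the pattern and let the network relax:
# each spin repeatedly aligns with the field produced by its neighbors.
state = pattern.copy()
flip = rng.choice(16, size=4, replace=False)
state[flip] *= -1                                # corrupt 4 of 16 spins

for _ in range(5):                               # asynchronous updates
    for i in range(16):
        state[i] = 1 if J[i] @ state >= 0 else -1

print(np.array_equal(state, pattern))            # → True (pattern recovered)
```

The network ‘computes’ simply by relaxing toward a low-energy configuration, which is why physical magnets can in principle do the same job without any simulation.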
Neural networks are already used across many fields; researchers at ETH Zurich, for example, have developed one that can read tree heights from satellite images.
The state of the nanomagnets can be altered with a magnetic field: applying a field to a network of nanomagnets flips individual magnets depending on the properties of the input field and the states of neighboring magnets.
Researchers in Imperial’s Department of Physics then devised a method to count how many magnets end up in each state once the field has passed through, allowing them to read out the result.
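Conceptually, this input-then-count scheme resembles reservoir computing: an input ‘field’ drives the magnet array, the resulting states are read out, and only a simple linear layer is trained on top. A hedged toy version, with made-up couplings and sizes (the real experiment reads state populations from physical magnets; here each toy magnet’s state is kept as a feature):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 64                                            # number of toy "nanomagnets"
J = rng.normal(0, 1 / np.sqrt(N), (N, N))
J = (J + J.T) / 2                                 # symmetric neighbor couplings
w_in = rng.normal(0, 1.0, N)                      # how the input field couples in

def drive(u_seq):
    """Apply a sequence of input 'fields' and record the magnet states
    after each step (plus a constant bias feature)."""
    s = np.ones(N)
    features = []
    for u in u_seq:
        h = J @ s + w_in * u                      # local field: neighbors + input
        s = np.where(h >= 0, 1.0, -1.0)           # each magnet aligns with its field
        features.append(np.concatenate([s, [1.0]]))
    return np.array(features)

# Toy time-series task: predict the next value of a sine wave.
t = np.arange(300)
u = np.sin(0.1 * t)
X = drive(u[:-1])
y = u[1:]
w = np.linalg.lstsq(X[:200], y[:200], rcond=None)[0]  # train linear readout only
err = np.abs(X[200:] @ w - y[200:]).mean()            # held-out prediction error
print(f"mean prediction error: {err:.3f}")
```

The appeal of the scheme is that the hard nonlinear processing happens in the physics of the array itself; training touches only the cheap linear readout.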
“We’ve been trying to crack the problem of how to input data, ask a question, and get an answer out of magnetic computing for a long time. Now we’ve proven it can be done, it paves the way for getting rid of the computer software that does the energy-intensive simulation,” said Dr. Jack Gartside, the co-first author of the study.
Co-first author Kilian Stenning added: “How the magnets interact gives us all the information we need; the laws of physics themselves become the computer.”
“It has been a long-term goal to realize computer hardware inspired by the software algorithms of Sherrington and Kirkpatrick. It was not possible using the spins on atoms in conventional magnets, but by scaling up the spins into nanopatterned arrays we have been able to achieve the necessary control and readout,” explained Dr. Will Branford, the team leader.
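The Sherrington-Kirkpatrick model Dr. Branford mentions is a classic ‘spin glass’: every pair of spins interacts with a random strength, and computing means finding low-energy configurations. A tiny brute-force illustration (sizes and coupling scale are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8
# Sherrington-Kirkpatrick spin glass: every pair (i, j) of spins is coupled
# by a random strength J_ij. For a configuration s in {-1, +1}^n the energy is
#   E(s) = -(1/sqrt(n)) * sum_{i<j} J_ij * s_i * s_j
J = np.triu(rng.normal(size=(n, n)), 1)           # keep only i < j couplings

def energy(s):
    return -(s @ J @ s) / np.sqrt(n)

# Brute-force the ground state over all 2^n configurations.
states = [np.array([1 if b & (1 << i) else -1 for i in range(n)])
          for b in range(2 ** n)]
best = min(states, key=energy)
print("ground-state energy:", energy(best))
```

Brute force is only feasible for toy sizes; the point of hardware like the nanomagnet arrays is that the physics explores these rugged energy landscapes natively.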
AI technology is now used in many settings, from voice recognition to self-driving cars, but training AI for even relatively simple tasks can require enormous amounts of energy. Training an AI to solve a Rubik’s Cube, for example, consumed as much energy as two nuclear power stations running for an hour. Low-energy AI systems could perform the same tasks far more sustainably.
One of the main drawbacks of conventional silicon-chip computing is that much of its energy is wasted in the inefficient shuttling of electrons during processing and memory storage. Nanomagnets, by contrast, do not rely on physically transporting particles: they process and transfer information in the form of ‘magnon’ waves.
Such low-energy AI systems could process and store data simultaneously, rather than one after the other as in conventional computers, potentially making nanomagnetic computing up to 100,000 times more efficient than current methods.
How can low-energy AI be utilized?
The team will next train the system on real-world data, such as ECG signals, with the aim of turning it into a working computing device. Magnetic systems could eventually be integrated into conventional computers to improve the energy efficiency of intensive computing tasks.
Low-energy AI systems could also run on renewable energy and process data where it is collected, for instance at weather stations in Antarctica, rather than sending it back to major data centers.
The approach could likewise be used to develop wearable sensors that convert biometric data collected on the body into useful signals, such as predicting and regulating insulin levels in people with diabetes or detecting abnormal heartbeats.