New Error Correction Could Reduce AI Hallucinations

By John Lister

A physical limitation in traditional computers could be harnessed to power artificial intelligence, according to a start-up business. The company has demonstrated what it calls a "thermodynamic computer."

The research is all about the way that computers use a physical process to carry out a digital, mathematical operation. All traditional computing boils down to a circuit board using electronic switches to represent and process data as either a 1 (for "true") or a 0 (for "false").

The physical aspect can lead to problems, however. New Scientist gives the example of a component warming up and causing a physical change, and in turn, an error in the data. Most computers deal with this through a combination of design to reduce the physical effects and error correction to spot where a bit of data has likely been changed.

Feel The Noise

However, New York-based Normal Computing has instead embraced the "noise" to carry out more powerful calculations. This includes using electrical currents that have random variations rather than flowing consistently.

The effect is complicated, but it takes advantage of the fact that the circuit will naturally correct for these errors. This correction can be used to carry out a calculation more efficiently than normal in some scenarios. While this is only a proof of concept, the idea is to use the same approach to take advantage of natural thermodynamic fluctuations such as tiny temperature swings.

The company says the particular benefits of this approach happen to work well for artificial intelligence operations, specifically in calculating uncertainty. This could make it easier for AI systems to assess how likely they are to have interpreted data correctly.
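To give a rough sense of how randomness can yield an uncertainty estimate, here is a minimal software sketch: it runs a toy "model" many times with injected random noise and reports both an average answer and the spread of answers. This is an illustrative analogy only, not Normal Computing's actual hardware technique; the function names and noise level are invented for the example.

```python
import random
import statistics

def noisy_model(x, noise=0.1):
    # Toy "model": a deterministic function plus injected random noise,
    # loosely standing in for thermodynamic fluctuations.
    # (Hypothetical example -- not the company's real method.)
    return 2.0 * x + random.gauss(0.0, noise)

def predict_with_uncertainty(x, samples=1000):
    # Run the noisy model repeatedly; the mean is the prediction and
    # the standard deviation is a crude uncertainty estimate.
    # A wide spread signals the system is less sure of its answer.
    outputs = [noisy_model(x) for _ in range(samples)]
    return statistics.mean(outputs), statistics.stdev(outputs)

mean, spread = predict_with_uncertainty(3.0)
print(f"prediction ~= {mean:.2f}, uncertainty ~= {spread:.2f}")
```

The appeal of the thermodynamic approach is that the hardware's own physical noise could supply this randomness directly, rather than a processor spending energy simulating it as above.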

No More "AI Hallucinations"

In practical terms, that could mean, for example, that an AI tool generating text would be more likely to "know" whether the information conveyed in that text was meaningful and genuine. The thermodynamic approach doesn't necessarily change how well this process works, but rather how energy-efficient it is. That in turn could make it more financially viable to improve AI or use it more widely.

However, the company says it may be five years before it can turn its proof of concept into a full-scale computing device.

What's Your Opinion?

Does this process sound plausible? Is it a worthwhile area to explore? Would more accurate AI-generated material be useful?
