Extropic releases a new thermodynamic computer

Company founder Gill Verdon said: "I'm very happy to finally share more about what Extropic is building: a full-stack hardware platform that leverages the natural fluctuations of matter as a computing resource for generative AI."

What does this novel computing paradigm actually mean for the world?

  • Extend hardware scaling far beyond the limits of digital computing
  • Make AI accelerators many orders of magnitude faster and more energy-efficient than digital processors (CPUs/GPUs/TPUs/FPGAs)
  • Unlock powerful probabilistic AI algorithms that are infeasible on digital processors
A short Litepaper (below) gives a preliminary overview of our technology. We hope it leaves you excited about the journey ahead. Join us in accelerating the path to a thermodynamic future of intelligence.

  - Gill and Trev

The core idea of Extropic is to use the physical randomness inherent in nature as a direct resource for computation.

The demand for computing power in the AI era is growing at an unprecedented exponential rate. Fortunately, over the past several decades CMOS transistors have been miniaturized in accordance with Moore's Law, and the resulting gains in computer efficiency have largely made this exponential growth possible.

Unfortunately, Moore's Law has begun to slow. The reason is rooted in basic physics: transistors are approaching the atomic scale, where effects such as thermal noise begin to prohibit strictly digital operation.

As a result, the energy demands of modern AI have begun to take off. The major players have proposed extreme measures, such as building nuclear-reactor-powered data centers dedicated to large-scale model training and inference. Sustaining this expansion for decades would require infrastructure engineering on an unprecedented scale, and it represents a difficult path toward scaling machine intelligence.

Biology, on the other hand, is neither rigid nor digital, and its computing circuitry is far more efficient than anything humans have built to date. Computation in biological systems is driven by networks of chemical reactions inside cells. Because cells are small, the number of reactant molecules in these networks is countably small [6, 7], so individual reaction events are discrete and inherently random. The relative effect of this intrinsic randomness shrinks as the number of reactant molecules grows, so in these small systems fluctuations often dominate the dynamics.
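
This scaling of fluctuations can be sketched with a standard Gillespie simulation of a toy birth-death reaction network (illustrative only; the model and rates are hypothetical, not drawn from [6, 7]): with few molecules the copy number jitters wildly, while the same dynamics at high copy number look almost deterministic.

```python
import math
import random

def gillespie_birth_death(birth_rate, decay_rate, x0, t_end, seed=0):
    """Gillespie simulation of a birth-death process:
    null -> X at rate `birth_rate`, X -> null at rate `decay_rate * x`.
    Returns copy numbers sampled after each event past a burn-in period."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    samples = []
    while t < t_end:
        total = birth_rate + decay_rate * x
        t += rng.expovariate(total)           # time to the next reaction
        if rng.random() < birth_rate / total:
            x += 1                             # a birth event
        else:
            x -= 1                             # a decay event
        if t > t_end / 2:                      # discard burn-in
            samples.append(x)
    return samples

def rel_fluctuation(xs):
    """Standard deviation divided by the mean."""
    mean = sum(xs) / len(xs)
    var = sum((v - mean) ** 2 for v in xs) / len(xs)
    return math.sqrt(var) / mean

# Few molecules (mean ~10): fluctuations are a large fraction of the mean.
small = gillespie_birth_death(birth_rate=10.0, decay_rate=1.0, x0=10, t_end=400.0)
# Many molecules (mean ~10000): the same chemistry looks nearly smooth.
large = gillespie_birth_death(birth_rate=10000.0, decay_rate=1.0, x0=10000, t_end=20.0)

print(rel_fluctuation(small))   # ~ 1/sqrt(10)    ~ 0.3
print(rel_fluctuation(large))   # ~ 1/sqrt(10000) ~ 0.01
```

The steady state of this process is Poisson, so the relative fluctuation falls as one over the square root of the mean copy number, which is why noise dominates only at cellular scales.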

From this we can say with confidence that the constraints of digital logic are not fundamental limits on the efficiency of computing devices. The engineering challenge is clear: how do we design a complete AI hardware and software system from scratch so that it thrives in an inherently noisy environment?

Energy-based models (EBMs) provide a clue to a potential solution, because the concept arises in both thermodynamic physics and foundational probabilistic machine learning. In physics they are called parameterized thermal states, arising as the steady states of systems with adjustable parameters. In machine learning they are known as exponential families.

Exponential families are, in a precise sense, the most parameter-efficient way to represent probability distributions, requiring a minimal amount of data to uniquely determine their parameters. As a result they perform well in low-data regimes, including mission-critical applications where tail events must be modeled, as shown in Figure 1. They achieve this by filling the gaps in the data with noise: they maximize entropy while matching the statistics of the target distribution. This process of hallucinating every possibility not covered by the dataset, and penalizing such events appropriately, consumes large amounts of randomness during both training and inference.
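
A minimal sketch of this "match the statistics, maximize the entropy" principle, using the simplest exponential family: among all distributions with a given mean and variance, the Gaussian has maximum entropy, and its parameters are fixed directly by those two statistics (an illustrative example, not Extropic's training procedure):

```python
import math
import random

# Hypothetical dataset: draws from an unknown source we want to model.
rng = random.Random(0)
data = [rng.gauss(2.0, 0.5) for _ in range(10000)]

# Moment matching: the sufficient statistics (mean, variance) uniquely
# determine the maximum-entropy model consistent with them.
mean = sum(data) / len(data)
var = sum((x - mean) ** 2 for x in data) / len(data)

def p(x):
    """Density of the fitted maximum-entropy (Gaussian) model."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

print(mean, var)
```

No iterative optimization is needed: two numbers computed from the data pin down the entire distribution, which is the parameter efficiency the text describes.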

At the microscopic scale, the behavior of matter, such as the motion of electrons in a conductor, is shaped by thermal noise and other random effects. Conventional computing treats this randomness as noise to be minimized; Extropic's design instead uses it directly to drive the computation.

This sampling requirement has been a major factor limiting the development and use of EBMs. The root cause is that sampling from a general energy landscape is very hard on digital hardware, which must burn large amounts of energy to generate and shape the entropy that the sampling process needs. From a hardware perspective, digital sampling seems contrived: why spend so much energy building ever more elaborate, pristinely deterministic digital computers when the most common and computationally demanding algorithms are stochastic and awash in noise?
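
To see where the cost comes from, here is a sketch of Gibbs sampling from a toy two-spin EBM with p(s) proportional to exp(-E(s)) (a hypothetical toy model, not Extropic's hardware): every sample consumes pseudo-random draws and transcendental-function evaluations that a physical device would get for free from its own thermal dynamics.

```python
import math
import random

rng = random.Random(1)

# Toy 2-spin Ising-style energy: E(s) = -J * s0 * s1, spins in {-1, +1}.
J = 1.0

def gibbs_step(s):
    """One full Gibbs sweep: resample each spin from its conditional."""
    for i in (0, 1):
        other = s[1 - i]
        # p(s_i = +1 | other) under the Boltzmann distribution exp(-E).
        p_up = 1.0 / (1.0 + math.exp(-2.0 * J * other))
        s[i] = 1 if rng.random() < p_up else -1
    return s

s = [1, -1]
agree = 0
n = 20000
for _ in range(n):
    s = gibbs_step(s)       # each sweep: 2 RNG draws + 2 exp() calls
    agree += (s[0] == s[1])

# With J = 1 the spins align with probability e^2 / (e^2 + e^-2) ~ 0.88.
print(agree / n)
```

Even this two-variable toy needs tens of thousands of arithmetic-heavy sweeps for accurate statistics; realistic energy landscapes multiply that cost by the number of variables and by mixing time, which is the digital overhead the text describes.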

Extropic is eliminating this inefficiency, and unlocking the full potential of generative AI, by implementing EBMs directly as parameterized stochastic analog circuits. For the runtime and energy efficiency of algorithms based on sampling complex energy landscapes, Extropic accelerators promise improvements of multiple orders of magnitude over digital computers.

An Extropic accelerator works much like Brownian motion. In Brownian motion, macroscopic but lightweight particles suspended in a fluid experience random forces from countless collisions with microscopic fluid molecules. These collisions make the particles diffuse randomly around their container. Now imagine tethering the Brownian particles to the container walls and to one another with springs, as shown in Figure 2(a). The springs resist the random forces, so the particles tend to dwell in some parts of the container more than others. If you repeatedly sample the particle positions, waiting long enough between samples (as in Figure 2(b)), you will find that they follow a predictable steady-state probability distribution. Change the stiffness of the springs and the distribution changes. This simple mechanical system is a source of programmable randomness.
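
The tethered-particle picture can be sketched as an overdamped Langevin simulation (a generic physics sketch in assumed dimensionless units, not a model of Extropic's circuits): the spring stiffness sets the steady-state position variance (roughly temperature divided by stiffness), so tuning the spring programs the sampled distribution.

```python
import math
import random

def sample_positions(stiffness, temperature=1.0, dt=0.01, n_steps=200000, seed=2):
    """Euler-Maruyama integration of an overdamped Langevin particle:
    a spring pulls it toward the origin while thermal noise kicks it around."""
    rng = random.Random(seed)
    x, samples = 0.0, []
    noise = math.sqrt(2.0 * temperature * dt)
    for step in range(n_steps):
        x += -stiffness * x * dt + noise * rng.gauss(0.0, 1.0)
        if step % 100 == 0:            # wait between samples, as in Figure 2(b)
            samples.append(x)
    return samples

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((v - m) ** 2 for v in xs) / len(xs)

# A softer spring lets the particle wander more; a stiffer spring confines it.
soft = variance(sample_positions(stiffness=1.0))    # ~ 1.0
stiff = variance(sample_positions(stiffness=4.0))   # ~ 0.25
print(soft, stiff)
```

Here the "program" is just the stiffness parameter: changing it reshapes the steady-state distribution the sampler draws from, which is the sense in which the mechanical system provides programmable randomness.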


Litepaper: https://www.extropic.ai/future
