Use of Analog Computers in Artificial Intelligence (AI)

Analog computers are a class of devices in which physical quantities such as electrical voltage, mechanical motion, or fluid pressure are set up to be analogous to the corresponding quantities in the problem to be solved.

Here is a simple example of an analog computer: if we turn the black and white wheels by certain amounts, the gray wheel shows the sum of the two rotations.

One of the earliest known analog computers is the Antikythera Mechanism, constructed around 100-200 B.C. A series of interlocking bronze gears moved certain dials in a way analogous to the motions of the sun and the moon, and the device could even predict eclipses decades in advance.

Advantages & Disadvantages of Analog Computers

To add two eight-bit numbers digitally, around 50 transistors are required. With an analog computer, however, we can add two currents simply by connecting two wires. Similarly, multiplying two numbers digitally takes thousands of transistors, whereas passing a current of I amperes through a resistor of R ohms produces a potential drop of I*R volts across the resistor, i.e., the product of the two numbers.
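The two analog primitives above can be sketched as ordinary arithmetic. This is a minimal, idealized Python illustration (no noise or component tolerances); the function names and example values are hypothetical, chosen only to mirror the physics described in the text.

```python
def analog_add(i1_amps, i2_amps):
    """Joining two wires sums their currents (Kirchhoff's current law)."""
    return i1_amps + i2_amps

def analog_multiply(current_amps, resistance_ohms):
    """Ohm's law: the voltage drop across a resistor is V = I * R."""
    return current_amps * resistance_ohms

# A 2 A and a 3 A current merged on one wire carry 5 A in total,
# and 2 A through a 4-ohm resistor drops 8 V across it:
print(analog_add(2.0, 3.0))       # 5.0
print(analog_multiply(2.0, 4.0))  # 8.0
```

The point of the sketch is that each "operation" corresponds to a single physical junction or component, rather than tens or thousands of transistors.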

Analog computers are powerful, fast, and energy-efficient. However, digital computers replaced them because analog machines are single-purpose and inexact, and because their inputs are continuous, it is difficult to repeat a computation exactly.

Analog Computers & AI

In AI, analog computers were used for various tasks, including pattern recognition, decision-making, and control. For example, they were used to train neural networks, which are machine-learning models inspired by the human brain’s structure and function. Analog computers were also used to implement rule-based AI systems, which apply explicit rules to make decisions or take actions.

Despite their widespread use in the past, analog computers are no longer as common in AI and machine learning, largely due to the advent of digital computers. Digital computers are much faster and more reliable than analog computers, and they can store and process much larger amounts of data. Additionally, digital computers are easier to program and maintain, which has made them the preferred choice for most AI and machine learning applications.

The Rise in the Use of Analog Computers in AI

There is a growing trend toward using larger neural networks in machine learning and artificial intelligence applications. This trend is driven by the need to improve performance on increasingly complex tasks and the availability of more data, hardware, and algorithms to support the training of larger networks. However, there are certain challenges associated with this increased demand.

  • Training a large neural network requires an amount of energy equivalent to the average yearly consumption of around three households.
  • Every modern computer stores data in memory and accesses it as needed. But when neural networks perform huge matrix multiplications, most of the energy goes into fetching the weight values rather than performing the calculations.
  • According to Moore’s Law, the number of transistors on a chip has traditionally doubled every two years. However, we are now approaching the point where the size of a transistor is approaching the size of an atom, which presents significant physical challenges to further miniaturization.

As digital computers approach their limits, neural networks have gained widespread popularity, with much of their computation centered on matrix multiplication. Additionally, neural networks do not require the precise calculations of digital computers: a 98% or 95% confidence in classifying an image as a dog is sufficient. These factors present a prime opportunity for analog computers to take on a much larger role in AI.

Case Study: Mythic AI

Mythic AI is an analog computing startup that creates analog chips to run neural networks. Different AI algorithms, such as motion detection, depth estimation, and object classification, run in the analog domain.

Mythic has repurposed digital flash storage cells to make this possible. These cells are normally used for memory and hold either a one or a zero: a positive voltage applied to the control gate lets electrons pass through an insulating barrier and become trapped on the floating gate. When the voltage is removed, the electrons remain on the floating gate for a long time, preventing current from flowing through the cell.

      Source: https://www.youtube.com/watch?v=GVsUOuSjvcg&t=1128s

The stored value can be determined by applying a small voltage. No current will flow if there are electrons on the floating gate, indicating a zero. If there are no electrons, current will flow, meaning a one.


Mythic’s idea is to use these cells not as on/off switches but as variable resistors. They do this by putting a specific number of electrons on each floating gate instead of all or nothing: the more electrons, the higher the channel’s resistance. When a small voltage is applied, the current that flows equals V/R. Equivalently, you can think of this as voltage times conductance, where conductance is just the reciprocal of resistance. So a single flash cell can multiply two values together: voltage times conductance.
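A single cell's behavior can be sketched in a few lines. This is an idealized model, not Mythic's actual circuitry; the function name and values are hypothetical.

```python
def cell_current(voltage, conductance):
    """One flash cell as a multiplier: I = V * G, where the programmed
    conductance G = 1/R encodes a stored weight (set by the number of
    electrons trapped on the floating gate)."""
    return voltage * conductance

# A cell programmed to R = 4 ohms has conductance G = 0.25 siemens;
# applying 2 V yields a current of 2 * 0.25 = 0.5 A, the product V * G:
print(cell_current(2.0, 0.25))  # 0.5
```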

To run an artificial neural network this way, they first write all the weights into the flash cells as conductances. Then they apply the activation values as voltages across the cells. The resulting current is voltage times conductance, i.e., activation times weight. The cells are wired so that the currents from the individual multiplications add together, completing the matrix multiplication.


Their chip can perform 25 trillion math operations per second while drawing only 3 W of power. By comparison, newer digital systems can perform 20-100 trillion math operations per second, but they cost thousands of dollars and consume 50-100 W of power.

There have been suggestions to use analog circuitry in smart home speakers specifically to detect wake words such as “Alexa” or “Siri.” This approach would consume less power and allow the device’s digital circuitry to be woken quickly and reliably.

To sum up, it is uncertain whether analog computers will ever become as prevalent as digital ones. However, they are better suited to many of the tasks we now want computers to perform, and perhaps the power of analog could help machines achieve true intelligence.


References:

  • https://www.britannica.com/technology/analog-computer
  • https://www.youtube.com/watch?v=GVsUOuSjvcg&t=1128s
  • https://www.youtube.com/watch?v=IgF3OX8nT0w