Is Analog Computing Making a Comeback in AI Research?

In summary, the thread discusses the use of analog computing for self-learning AI. Although the idea is popular, it is not the most practical option: digital processors are faster at emulating the necessary AI algorithms. Building an analog computer starts with modelling the circuit in SPICE and optimising it before implementation. Analog computing also has serious drawbacks and is no longer the most efficient solution.
  • #1
The Art of DHT
TL;DR Summary
Anyone with experience building electronic analog computers?
I am researching self-learning AI, and one of the most talked-about solutions is returning to analog computing. Analog computing uses real-time changes in voltage and frequency to run calculations or perform functions. Does anyone have any experience building electronic analog computers based on the op-amp?
 
  • #2
Hello @The Art of DHT ,

:welcome:!

Yes, lots of folks have lots of experience :biggrin:.

That may answer your question, but my guess is that it doesn't help you very much :smile:.

Google is your friend. Have you tried some search terms?

What is it you want to do?
 
  • #3
The Art of DHT said:
I am researching self-learning AI, and one of the most talked-about solutions is returning to analog computing.
That is the talk, but not the practice. It is faster to emulate the AI algorithm on a digital processor. Op-amps are good at linear operations, but AI requires non-linear functions, which are emulated more quickly and more accurately by the multiply-accumulate function in a digital processor.
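As a rough illustration of that multiply-accumulate point (a minimal sketch with arbitrary layer sizes and a tanh non-linearity, not anything specific from this thread):

```python
import numpy as np

# One "neuron layer" as a digital processor sees it: a chain of
# multiply-accumulate (MAC) operations followed by a non-linear activation.
# Sizes and the tanh non-linearity are illustrative choices only.
rng = np.random.default_rng(0)
weights = rng.normal(size=(4, 8))   # 4 outputs, 8 inputs
bias = rng.normal(size=4)
x = rng.normal(size=8)              # input vector

# The matrix-vector product is nothing but repeated MACs:
# acc[i] = sum_j weights[i, j] * x[j] + bias[i]
acc = weights @ x + bias

# The non-linear part that plain op-amp summers and integrators do not provide:
y = np.tanh(acc)
print(y)
```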

The first step in building an analog computer is to model the analog circuit in SPICE. Only once the algorithm and circuit have been optimised should a parallel analog solution be implemented.
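As an illustration of that model-first workflow (an idealised sketch with arbitrary resistor values, not a substitute for the SPICE model itself), the intended gains of an inverting op-amp summer can be sanity-checked numerically before the netlist is even drawn:

```python
# Ideal inverting summing amplifier: Vout = -Rf * (V1/R1 + V2/R2 + ...)
# A quick numeric check of the intended gains before committing to a SPICE netlist.
# All component values below are arbitrary examples.

def summing_amp(v_inputs, r_inputs, r_feedback):
    """Ideal op-amp inverting summer (virtual ground at the inverting input)."""
    return -r_feedback * sum(v / r for v, r in zip(v_inputs, r_inputs))

R_f = 10e3                      # feedback resistor, ohms
R_in = [10e3, 5e3]              # input resistors: gains of -1 and -2
for v1, v2 in [(0.5, 0.25), (1.0, -0.5)]:
    print(v1, v2, "->", summing_amp([v1, v2], R_in, R_f))
```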
 
  • #4
The Art of DHT said:
AI ... one of the most talked-about solutions is returning to analog computing.
The AI-related analog computing (the kind I know about) is not exactly the op-amp kind, and honestly, I would regard it as a hybrid or special-purpose solution rather than 'real' analog.

Analog computing had some strong points way back, but there were also serious drawbacks.
By now it is simply more pain than gain.

It's just like the gold standard. Everybody knows why it was phased out, but some people still can't stop flirting with the idea.
 

Related to Is Analog Computing Making a Comeback in AI Research?

What is analog computing, and how does it differ from digital computing?

Analog computing involves using continuous signals to represent information, as opposed to the discrete binary signals used in digital computing. Analog computers perform calculations using physical quantities such as voltage, current, or mechanical movements, which can be advantageous for certain types of problems, particularly those involving differential equations and real-time processing.
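As a concrete, deliberately idealised illustration of that difference, consider the first-order equation dV/dt = -V/RC: an ideal op-amp RC integrator realises it continuously, while a digital machine approximates it with discrete time steps. The component values and step size below are arbitrary assumptions for the sketch:

```python
import math

# dV/dt = -V / (R*C): an ideal op-amp RC integrator realises this continuously;
# a digital computer approximates it with discrete time steps (forward Euler here).
R, C = 10e3, 1e-6          # arbitrary example values: 10 kOhm, 1 uF -> tau = 10 ms
tau = R * C
V0, dt, steps = 1.0, 1e-4, 200

v_digital = V0
for n in range(steps):
    v_digital += dt * (-v_digital / tau)        # discrete update, one step at a time

t = steps * dt
v_analog = V0 * math.exp(-t / tau)              # what the continuous circuit settles to
print(f"t = {t*1e3:.1f} ms  digital = {v_digital:.4f}  analog = {v_analog:.4f}")
```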

Why is there renewed interest in analog computing for AI research?

Renewed interest in analog computing for AI research stems from its potential to offer significant energy efficiency and speed advantages over traditional digital computing. As AI models grow increasingly complex, the power consumption and heat generation of digital systems become critical issues. Analog computing can potentially address these challenges by providing more efficient computation methods.

What are the potential benefits of using analog computing in AI applications?

Potential benefits of using analog computing in AI applications include reduced power consumption, faster processing speeds, and the ability to naturally handle certain types of computations, such as those involving continuous data and differential equations. These advantages can make analog computing particularly suitable for edge computing and real-time AI applications.

What are the main challenges facing the adoption of analog computing in AI?

The main challenges facing the adoption of analog computing in AI include the difficulty of designing and fabricating reliable analog hardware, the lack of mature development tools and frameworks, and the need for new algorithms and software that can fully leverage the capabilities of analog systems. Additionally, analog systems can be more susceptible to noise and less precise than digital systems, which can be a hurdle for certain applications.
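A rough sketch of that noise and precision hurdle: treat the weights of a matrix-vector product as analog values perturbed by a small random error and compare against the exact digital result. The 1% error figure is an arbitrary assumption, not a measured device specification:

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=(64, 64))          # "ideal" weights
x = rng.normal(size=64)

y_exact = W @ x                        # exact digital reference result

# Model analog non-ideality as roughly 1% random deviation in each stored weight
# (an illustrative figure, not a real device spec).
W_analog = W * (1 + 0.01 * rng.normal(size=W.shape))
y_analog = W_analog @ x

rel_err = np.linalg.norm(y_analog - y_exact) / np.linalg.norm(y_exact)
print(f"relative output error from 1% weight noise: {rel_err:.3%}")
```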

Are there any current examples of analog computing being used in AI research or applications?

Yes, there are current examples of analog computing being used in AI research and applications. One notable example is the development of neuromorphic computing systems, which mimic the architecture and functioning of the human brain using analog circuits. These systems are being explored for their potential to efficiently perform tasks such as pattern recognition, sensory processing, and decision-making. Other examples include research into analog accelerators for neural networks and the use of memristors for analog computation in machine learning.
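As an idealised sketch of how such analog accelerators perform a matrix-vector multiply (ignoring wire resistance, device non-linearity, and noise, all of which matter in real hardware), a memristor crossbar programmed with conductances G produces column currents I = G^T V from row voltages V via Ohm's and Kirchhoff's laws; the values below are arbitrary:

```python
import numpy as np

# Idealised memristor crossbar: column currents I = G^T @ V, where G[i, j] is the
# programmed conductance at row i / column j and V is the vector of row voltages.
# All device values are arbitrary illustrative numbers.
rng = np.random.default_rng(2)
G = rng.uniform(1e-6, 1e-4, size=(4, 3))   # conductances in siemens
V = np.array([0.2, -0.1, 0.3, 0.05])       # row drive voltages in volts

I_columns = G.T @ V                        # Kirchhoff's current law summed per column
print(I_columns)                           # one analog multiply-accumulate per column
```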
