Is Analog Computing Making a Comeback in AI Research?

In summary, the thread discusses the use of analog computing in self-learning AI. Although it is a much-discussed option, it is rarely practical: digital processors emulate the necessary algorithms faster. Building an analog computer starts with modeling the circuit in SPICE and optimising it before implementation. Analog computing also carries serious drawbacks and is generally not the more efficient solution.
  • #1
The Art of DHT
TL;DR Summary
Anyone with experience building electronic analog computers?
I am researching self-learning AI, and one of the most talked-about approaches is a return to analog computing. Analog computing uses real-time changes in voltage and frequency to run calculations or perform functions. Does anyone have experience building electronic analog computers based on the op-amp?
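The classic building block of an op-amp analog computer is the inverting integrator, which obeys V_out(t) = -(1/RC) ∫ V_in dt. As a minimal sketch of that behaviour (the component values and input signal below are purely illustrative, not from any particular design), a simple Euler loop reproduces the ideal ramp:

```python
# Ideal op-amp integrator: V_out(t) = -(1/(R*C)) * integral of V_in dt.
# R, C, and the constant 1 V input are illustrative values only.

R = 10e3      # 10 kohm input resistor
C = 1e-6      # 1 uF feedback capacitor
dt = 1e-5     # 10 us time step
steps = 1000  # simulate 10 ms total

v_in = 1.0    # constant 1 V input
v_out = 0.0
for _ in range(steps):
    v_out -= v_in / (R * C) * dt   # Euler step of the integrator equation

# With a constant 1 V input the output ramps at -1/(R*C) = -100 V/s,
# so after 10 ms the ideal output is about -1 V.
print(round(v_out, 2))
```

Chaining integrators, summers, and inverters like this is how classic analog computers solved differential equations in real time.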
 
  • #2
Hello @The Art of DHT ,

:welcome:!

Yes, lots of folks have lots of experience :biggrin:.

That may answer your question, but my guess is that it doesn't help you very much :smile:.

Google is your friend. Have you tried some search terms?

What is it you want to do?

 
  • #3
The Art of DHT said:
I am researching self-learning AI, and one of the most talked-about approaches is a return to analog computing.
That is the talk, but not the practice. It is faster to emulate the AI algorithm on a digital processor. Op-amps are good at linear operations, but AI requires non-linear functions, which are emulated more quickly and more accurately by the multiply-accumulate function in a digital processor.
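That multiply-accumulate-plus-nonlinearity pattern is what a single artificial neuron computes. A minimal sketch (the weights, inputs, and ReLU activation below are illustrative assumptions, not tied to any real model) shows why the non-linear step matters:

```python
# A neuron as a multiply-accumulate (MAC) followed by a non-linear
# activation -- the operation digital processors emulate efficiently.
# Weights, inputs, and bias here are illustrative values only.

def neuron(inputs, weights, bias):
    acc = bias
    for x, w in zip(inputs, weights):
        acc += x * w          # the multiply-accumulate step
    return max(0.0, acc)      # ReLU: the non-linear part op-amps struggle with

out = neuron([0.5, -1.0, 2.0], [0.3, 0.8, -0.1], bias=0.2)
# 0.2 + 0.15 - 0.8 - 0.2 = -0.65, which ReLU clips to 0.0
print(out)
```

The linear MAC part maps naturally onto op-amp summers, but the clipping non-linearity (and training, which perturbs the weights) is where digital emulation pulls ahead.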

The first step in building an analog computer is to model the analog circuit in SPICE. Only once the algorithm and circuit have been optimised should a parallel analog solution be implemented.
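At its core, what SPICE does for a DC operating point is nodal analysis: build a conductance matrix and solve G·v = i for the node voltages. A toy version for a two-resistor divider (5 V source, then a hypothetical R1 = 1 kΩ to node A, then R2 = 4 kΩ to ground; all values illustrative) reduces to one unknown:

```python
# Toy nodal analysis, the DC core of a SPICE simulation.
# Circuit (illustrative values): 5 V source -> R1 = 1k -> node A -> R2 = 4k -> ground.

V_SRC, R1, R2 = 5.0, 1e3, 4e3

# Single unknown node A: sum the conductances at the node and the
# Norton-equivalent current injected through R1 by the source.
G = 1.0 / R1 + 1.0 / R2    # total conductance at node A
i = V_SRC / R1             # current injection from the source branch
v_a = i / G                # solve G * v_a = i

print(v_a)  # matches the divider formula: 5 * R2/(R1+R2) = 4.0 V
```

A real SPICE run assembles the same kind of matrix for every node, adds non-linear device models, and iterates; this is why circuit optimisation happens in simulation long before any hardware is built.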
 
  • #4
The Art of DHT said:
AI ... one of the most talked-about approaches is a return to analog computing.
The AI-related analog computing (the kind I know about) is not exactly the op-amp kind; honestly, I would regard it as a hybrid or special-purpose solution rather than 'real' analog.

Analog computing had some strong points way back, but there were also serious drawbacks.
By now it's simply more pain than gain.

It's just like the gold standard. Everybody knows why it was phased out, but still, some people just can't stop flirting with the idea.
 
