Finding the Right ADC for 5V Differential Input

  • Thread starter Jdo300
  • Tags: ADC
In summary, the conversation was about finding the right analog-to-digital converter IC for a circuit that detects a voltage input and converts it to a digital value a microcontroller can read. The voltage divider is designed to handle an input range of 0 to 1000V, divided down 200:1 so that the ADC sees at most 5V. There were discussions about protecting the circuit from spikes and over-voltages, and about using a step-down transformer or a rectifier for this purpose. Other requirements for the ADC included 12-bit resolution, isolation from the ground line, and a serial interface. There was also a mention of using a sample-and-hold circuit before the ADC. Ultimately, the conversation turned to stepping the AC input down with a transformer and rectifying it, with isolation amplifiers or optical isolation suggested for the DC case.
  • #1
Jdo300
Hello All,

I'm working on a circuit that can detect a voltage input and convert it to a digital value that my micro-controller can read. But I'm having some trouble finding the right Analog to Digital converter IC to use. I want the circuit to be able to detect a positive or negative voltage difference of 5V across a resistor in a voltage divider.

The voltage divider is designed to handle an input range of 0 to 1000V, so I am dividing it down 200:1; the maximum voltage I need to detect is then 5V (see attached image).

Ideally, I want the voltage divider circuit to be completely isolated from the circuit ground in case there are any spikes or other noise that could booger up my circuit, so I'm trying to find an ADC that can measure a differential input that is isolated from the ground line. Other requirements are that it be 12-bit, come in a DIP-8 package (so I can socket it and replace it if/when I blow it up), and that it have a serial interface to connect to my controller. Also, it doesn't need to be blazingly fast, since I am using a BASIC Stamp to clock the data from it (I think the serial interface on those runs at only 14kHz). It would be great if someone here could give me some pointers in the right direction, as there are a dizzying number of different ADCs out there and I've only just begun my search for the right one.
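A minimal Python sketch of the scaling described above, using only the numbers stated (0-1000V input, 200:1 divider, 12-bit ADC over a 5V span):

```python
# Divider and resolution numbers from the post: 0-1000 V in, 200:1 divider, 12-bit ADC.
V_IN_MAX = 1000.0          # maximum input voltage (V)
DIVIDER_RATIO = 200.0      # 200:1 divider
ADC_BITS = 12
V_ADC_FULL_SCALE = 5.0     # ADC full-scale input (V)

v_adc_max = V_IN_MAX / DIVIDER_RATIO              # 5.0 V at the ADC pin
lsb_adc = V_ADC_FULL_SCALE / (2 ** ADC_BITS)      # ~1.22 mV per ADC count
lsb_input = lsb_adc * DIVIDER_RATIO               # ~0.24 V per count referred to the input

print(f"ADC sees up to {v_adc_max:.2f} V")
print(f"1 LSB = {lsb_adc*1e3:.2f} mV at the ADC, ~{lsb_input:.2f} V at the 1000 V input")
```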

Thanks,
Jason O
 

Attachments

  • Voltage Sensor.GIF
  • #2
Update,

Ok, I'm simplifying my life by not caring about the polarity of the voltage. I attached a new circuit diagram below showing my new idea. Basically, I'm just rectifying, dividing down, and filtering the input voltage so I can read it safely. I also added a zener diode to clamp any spikes or over-voltages that may occur, just in case the filter doesn't catch everything.

To protect the ground line from current surges, I added a 1M resistor between the negative ADC input and the circuit ground. Does this look like it would be good enough to safely protect the circuit from the input?
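As a rough bound on the protection idea (a sketch, assuming the full 1000V input from post #1 is the worst case across that resistor):

```python
# Worst-case current the 1 Mohm ground-side resistor would allow, assuming
# the full 1000 V input somehow appears across it (a pessimistic bound).
V_FAULT = 1000.0     # worst-case input voltage (V)
R_GROUND = 1.0e6     # resistor between the ADC negative input and circuit ground (ohms)

i_fault = V_FAULT / R_GROUND
print(f"Fault current limited to ~{i_fault*1e3:.1f} mA")   # ~1 mA
```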

Thanks,
Jason O
 

Attachments

  • ADC Circuit.GIF
  • #3
Yoiks. First, I'd recommend using a step-down transformer from the AC high voltage down to a lower voltage to be digitized. That would be the traditional way to do this, so that UL safety regulations could be met. If that is not possible for some reason, then you should put your voltage divider first (before the rectifier that you show), and shield the hot parts (HV stuff) of the circuit from any possible human contact, and only expose the low voltage, low current stuff that goes into your ADC.
 
  • #4
Hi Berkeman,

I would go with a step-down transformer, but this circuit must be able to measure both DC and AC input. As for the rectifier: since I'm ultimately dividing a maximum input of 1000V down to 5V, I'm very concerned about the voltage drop that the diodes will present to the ADC inputs. If they drop 0.7V, wouldn't that mean that all measured values below 140V would get cut out? If I leave the diodes on the HV side of the divider, I don't have to worry as much about the drop, since a 12-bit ADC gives me better than 1-volt resolution through the whole range. Also, I'm not expecting to measure 1000V in regular use; I just wanted the circuit to be able to handle it if the input ever gets that high. Realistically, I will be dealing with voltages between 50 and 300V. As far as shielding goes, are you referring to isolating the input from the circuit?
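A quick check of the two figures in this post (a sketch, assuming a typical 0.7V silicon diode drop and the 200:1 divider from post #1):

```python
DIVIDER_RATIO = 200.0
DIODE_DROP = 0.7           # assumed typical silicon diode forward drop (V)
ADC_BITS = 12
V_IN_MAX = 1000.0

# A 0.7 V drop on the low side of a 200:1 divider hides inputs below:
cutoff = DIODE_DROP * DIVIDER_RATIO               # 140 V
# 12-bit resolution referred to the 1000 V input:
step = V_IN_MAX / (2 ** ADC_BITS)                 # ~0.24 V per count

print(f"A rectifier after the divider would mask inputs below ~{cutoff:.0f} V")
print(f"12 bits over 0-1000 V gives ~{step:.2f} V per count")
```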

Thanks,
Jason O
 
  • #5
What is the maximum AC bandwidth? That will have a bearing on the resistors that you use for the dividers, and on how much parasitic capacitance you can tolerate.

It sounds like the way to go is to use a large divide ratio (with resistors big enough to limit the current significantly), and use an ADC with split supplies so that you can digitize both positive and negative voltages. Your sample and hold circuit before the ADC (if it's not built into the ADC you are using) will also need to run off of the split supplies.
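A rough sketch of how the divider resistance and stray capacitance set the usable bandwidth; the 1M/5.03k resistors and 10pF node capacitance below are illustrative assumptions, not values fixed in the thread:

```python
import math

# Assumed, illustrative values -- the thread does not pin these down.
R_TOP = 1e6        # divider top resistor (ohms)
R_BOT = 5.03e3     # divider bottom resistor (ohms)
C_NODE = 10e-12    # stray + ADC input capacitance at the tap node (farads)

# The divider's Thevenin resistance at the tap, together with the node
# capacitance, forms a single-pole low-pass that limits usable bandwidth.
r_th = (R_TOP * R_BOT) / (R_TOP + R_BOT)
f_3db = 1.0 / (2 * math.pi * r_th * C_NODE)
print(f"Thevenin resistance ~ {r_th/1e3:.1f} kOhm, -3 dB ~ {f_3db/1e6:.1f} MHz")
```

Larger divider resistors or more capacitance scale that corner frequency down proportionally.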
 
  • #6
Hi Berkeman,

Ultimately the AC bandwidth is not important since its intended use is primarily for measuring DC voltages. But in practice, most of the voltages will likely look like noisy pulsed DC voltage, which could have AC components and ringing in it. So, ideally I just want to filter out some of the noise and get a fairly decent DC reading. It doesn't have to be totally accurate but just good enough to give me a good idea of the voltage I am measuring. I am mainly interested in the magnitude of the voltage, though in a perfect world, it would be nice to know what the polarity is too.

I wouldn't mind finding a nice ADC that can measure ±V on the input, but I want to avoid adding a negative supply to my board since I am cramped for space. If possible, it would be nice if they made an ADC that measures ±V and runs on 5V. The other concern that comes into play is resolution: I would need something with 24-bit resolution so that I can still get the 1-volt resolution that I would like for both positive and negative values. Do you know of any ADCs that would meet these requirements? I've been looking all over, but I'm sure there's got to be something like that out there. The last constraint is that I want something that comes in an 8-pin DIP so I can socket it (and replace it) if/when I blow it up, or, if surface-mount, something I can solder onto an adapter board and plug into an 8-pin DIP socket for easy replacement.

Thanks,
Jason O
 
  • #7
As Berkeman mentioned, use a transformer to step down the AC input and then rectify.
If you don't want the polarity, you can step down your AC considerably and use an RMS-to-DC converter (AD736); it will save you some board space. You can connect the output of the RMS-to-DC converter to your ADC. Texas Instruments makes 24-bit ADCs if I remember correctly. They also send free samples.

If you want to know the polarity, here's my idea:
Step it down (say 1000 times, with a transformer plus attenuator), so the maximum voltage is 1V.
Connect one terminal of the transformer secondary to +2.5V.
Now your signal is shifted (offset) by 2.5V.
Connect the other terminal of the transformer to an op-amp buffer.
Use a single-supply op amp to feed the signal to the ADC.
Vref of your ADC should be 2.5V.

The output of the op amp is connected to the ADC and to a comparator.
Connect the other input of the comparator to 2.5V. If the comparator output is high, the input AC voltage is positive; if the comparator output is low, the input AC voltage is negative.

Try it if you think it's interesting.

Also, to drive ADCs you can use differential line driver op amps (http://www.analog.com/en/subCat/0,2879,759%255F842%255F0%255F%255F0%255F,00.html#319)
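A minimal sketch of how a microcontroller could turn the offset ADC reading back into a signed input voltage; the 12-bit, 0-5V ADC span and 1000:1 attenuation below are assumptions for illustration, not fixed by the post:

```python
# Sketch of reading back a signed input from the offset scheme above.
# Assumptions: 12-bit unipolar ADC spanning 0-5 V, signal offset to 2.5 V
# (mid-scale), and a 1000:1 step-down before the offset.
ADC_BITS = 12
V_FULL_SCALE = 5.0
V_OFFSET = 2.5
ATTENUATION = 1000.0

def code_to_input_volts(code: int) -> float:
    """Convert a raw ADC code back to the signed high-voltage input."""
    v_adc = code * V_FULL_SCALE / (2 ** ADC_BITS)   # voltage at the ADC pin
    return (v_adc - V_OFFSET) * ATTENUATION         # remove offset, undo attenuation

print(code_to_input_volts(2867))   # ~ +1000 V (3.5 V at the ADC pin)
print(code_to_input_volts(1229))   # ~ -1000 V (1.5 V at the ADC pin)
```

The comparator output described in the post gives the same sign information directly, without doing the subtraction in software.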
 
  • #8
Hi, thanks for the tips on the op-amp polarity checker. But I mentioned earlier that I can't use a transformer because I am expecting the input voltage to be mainly DC. With the transformer, I could capture some of the AC part (which will not be sinusoidal), but then I would lose the DC component that I want to measure.

- Jason O
 
  • #9
If your input is DC, you can use an isolation amplifier for isolation. Texas Instruments and Analog Devices make them; I think they provide up to 4 kV of isolation.
Another way would be to use optical isolation: convert the voltage to light and then back to voltage.
Add clamping diodes at the input of the ADC, or at the input of the amp that drives the ADC. For example, if the maximum input voltage (after scaling down) is 1V, then add two back-to-back high-voltage diodes at the +ve input of the ADC. If the resistors in the voltage divider open, the diodes clamp the high voltage.
One clamping diode is enough at the -ve input.

Just curious, what's the reason for the 1M resistor from the ADC -ve input to ground?
 
  • #10
Hi,

Actually, I spoke to someone about that 1M resistor and it doesn't need to be there. My original thought was to limit any currents that could get onto the ground line from the input (HV spikes or whatever).

As for the isolated amplifier, that would be great, but are there any isolation methods out there that don't require me to have a separate, isolated power supply to power the isolated side? It would be great if there was some passive way to isolate the input and couple it into the op-amp/ADC. So far, all of the iso-amps I've seen need a power supply on the isolated end as well as the grounded end.

I really like the optical isolation idea. Do you know of any good ICs out there that I can use? So far the only ones I've found are little opto-couplers, but they need more current than the resistor divider may be able to supply.

By the way, I also realized that I will need a bit more current to flow through the resistor divider so that the rectifier diodes will turn on, so I lowered the resistor values to 1M and 5.03k to give a current draw of roughly 1mA at 1000V (which is just the max input). Realistically, I will usually be around the 50-100V range for what I'll be experimenting with, in which case the current draw will be much lower. Are there any optical isolation devices that operate on extremely low currents?
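A quick check of the revised divider values (a sketch; the ~1W figure is only the worst case at the full 1000V input):

```python
# Check of the revised divider values from this post (1 MOhm + 5.03 kOhm).
R_TOP = 1.0e6
R_BOT = 5.03e3
V_IN = 1000.0     # worst-case input (V)

i_divider = V_IN / (R_TOP + R_BOT)            # ~1 mA at 1000 V
v_tap = V_IN * R_BOT / (R_TOP + R_BOT)        # ~5 V at the ADC side
p_top = i_divider ** 2 * R_TOP                # ~1 W in the 1 MOhm resistor at 1000 V

print(f"Divider current: {i_divider*1e3:.2f} mA")
print(f"Tap voltage:     {v_tap:.2f} V")
print(f"Top-resistor dissipation at 1000 V: {p_top:.2f} W")
```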

Thanks,
Jason O
 
  • #11
I can't think of any optical isolation devices that operate on extremely low currents; the reason being that an LED has to be turned on, which usually requires at least 4mA of current.
You can use a transistor to drive the LED, but the transistor would need an additional power supply.
If adding another power supply is difficult, you can use a 9V battery to power the ADC side of the circuit.
I vaguely remember seeing iso amps with their own isolated power supply. I think they were made by Analog Devices.
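Putting the two rough numbers side by side (the ~1mA divider current from post #10 and the ~4mA LED figure above):

```python
# Why a plain optocoupler is awkward here: the divider can't supply the LED.
I_DIVIDER_MAX = 1.0e-3   # divider current at the full 1000 V input (post #10), in amps
I_LED_MIN = 4.0e-3       # rough minimum LED drive current quoted above, in amps

print("Divider can drive the LED directly:", I_DIVIDER_MAX >= I_LED_MIN)  # False
```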
 
  • #12
Hi,

I checked out Analog's website and found the iso-amps you were mentioning. Unfortunately, they are $50+, which is way over my budget. So for now, I'll just try to add adequate filtering and voltage suppression to my circuit to keep any noise and hash off the inputs.

Thanks,
Jason O
 

FAQ: Finding the Right ADC for 5V Differential Input

What is an ADC and why is it important for 5V differential input?

An ADC, or analog-to-digital converter, is a device that converts analog signals (such as voltage) into digital values that can be processed by a computer or microcontroller. For a 5V differential input, the ADC must be able to accept two inputs whose difference spans up to 5V and convert that difference accurately into digital data.

How do I determine the appropriate ADC for 5V differential input?

To find the right ADC for 5V differential input, you should consider factors such as resolution (the number of bits used to represent the input signal), sampling rate (how often the ADC takes a measurement), and input voltage range (the range of voltages that can be accurately measured). You should also consider the specific requirements of your project and choose an ADC that meets those needs.
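As a rough sketch of that sizing step (the 5V range and 1mV target below are illustrative values, not from any particular part):

```python
# Sketch: picking a resolution from the input range and the smallest step you care about.
import math

v_range = 5.0          # differential input range (V)
smallest_step = 0.001  # smallest voltage difference you need to resolve (V)

bits_needed = math.ceil(math.log2(v_range / smallest_step))
lsb = v_range / (2 ** bits_needed)
print(f"{bits_needed} bits needed; 1 LSB = {lsb*1e3:.3f} mV")
```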

Can any ADC handle 5V differential input?

No, not all ADCs are designed to handle 5V differential input. Some ADCs may only be able to handle lower voltages, while others may be able to handle a wider range of voltages. It is important to carefully review the specifications of an ADC before choosing it for 5V differential input.

What is the difference between single-ended and differential input for an ADC?

Single-ended input means that the ADC measures the voltage between a single input and a reference voltage. Differential input means that the ADC measures the voltage between two inputs. Differential input can provide better accuracy and noise rejection, making it a good choice for applications where a precise measurement is needed.
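A small illustration of the noise-rejection point, with made-up example voltages:

```python
# Illustrative only: how the two input schemes treat the same pair of signals.
v_plus, v_minus = 3.1, 2.9        # the two input pins (V)
v_common_noise = 0.2              # noise riding on both pins equally (V)

# Single-ended: measures one pin against ground, so common-mode noise adds directly.
single_ended = v_plus + v_common_noise                                   # 3.3 V reported
# Differential: measures the difference, so equal noise on both pins cancels.
differential = (v_plus + v_common_noise) - (v_minus + v_common_noise)    # 0.2 V reported

print(f"{single_ended:.2f} V single-ended vs {differential:.2f} V differential")
```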

Are there any other important factors to consider when choosing an ADC for 5V differential input?

In addition to resolution, sampling rate, and input voltage range, other important factors to consider include the number of channels (inputs) the ADC has, the power consumption, and the interface (such as SPI or I2C). It is also important to consider the overall quality and reliability of the ADC, as well as any additional features that may be useful for your specific application.
