Can Ohm's Law be applied to all types of devices and power sources?

In summary: a smaller supply wire has a higher resistance, so it is the wire that tends to overheat and burn, not the device it feeds. A device draws more current than it is rated for when it is supplied with a higher voltage than it was designed for, or when it develops an internal low-resistance fault. Motors are an exception: lowering the voltage on a loaded motor makes it draw more current, which can burn out its windings.
  • #1
jeff davis
Hello,
I am pondering a topic and I can't quite find the answer that I am looking for. It concerns amp draw. I understand Ohm's law and the relationships it entails. I am confused by the idea that people say a device will draw more amps than it is supposed to and cause things to burn up. I understand the idea of putting too much power into something and burning it up, but I am just making sure that my thought pattern is correct for the context of "drawing too much current".
If you have a given voltage and a given resistance, then you can only draw a certain amount of current, correct? Just because you have a smaller wire (which will increase the resistance) does not mean that you are pulling any more amps or volts; rather, you are decreasing them, right? So the wire will burn up, and not the device it is connected to?
That was pretty mushed together because I am having a hard time explaining what I want to know.

I guess I could ask directly:
1.) Can you burn up a device by using too small a power cord (other than for a motor)?
2.) When considering power (P = IV), how can you possibly put less voltage on something and have it draw far more current? For example, if you had something rated 120 V, 400 W, would it burn up if you only hooked 2 V to it? The current would then be 200 A instead of 10/3 A? This idea goes against my understanding of Ohm's law.
 
  • #2
I am going to take a crack at your questions but hopefully someone else can give a better answer or even correct mine.

1) Using a smaller wire will increase the resistance of that wire (modeling the wire as a simple resistor), as you have said. This will cause the wire to burn out before the device it is connected to does.

2) If the source delivers constant power (so P is fixed rather than the voltage), then by the equation you provided (P = IV), decreasing the voltage increases the current. This may cause the device to burn. (A quick numerical sketch of both cases follows below.)
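To put some numbers on the two cases above, here is a minimal Python sketch. The 120 V / 400 W rating comes from the original question; the constant-power load is an idealization used only for illustration.

```python
# Sketch: how current responds to a lower supply voltage for two idealized loads.
# Values are illustrative only, based on the 120 V / 400 W rating in the question.

RATED_V = 120.0   # volts
RATED_P = 400.0   # watts

def current_fixed_resistance(v):
    """A purely resistive load: R is fixed by the rating, so I = V / R (Ohm's law)."""
    r = RATED_V ** 2 / RATED_P          # R = V^2 / P = 36 ohms
    return v / r

def current_constant_power(v):
    """An idealized constant-power load: P stays at the rating, so I = P / V."""
    return RATED_P / v

for v in (120.0, 60.0, 2.0):
    print(f"{v:6.1f} V  resistive: {current_fixed_resistance(v):7.2f} A   "
          f"constant-power: {current_constant_power(v):7.2f} A")

# The resistive load draws LESS current at lower voltage, while the
# constant-power load draws MORE -- the distinction behind answer 2).
```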
 
  • #3
Hi there Jeff

jeff davis said:
Hello,
I am pondering a topic and i can't quite find the answer that i am looking for. It concerns amp draw. I understand ohms law and the relationships it entails. I am confused with the idea that people say a device will draw more amps than it is supposed to and cause things to burn up. I understand the idea of putting too much power to something and burning it up, but i am just making sure that my thought pattern is correct for the contex of "drawing too much current"

There are two main reasons why a device will draw more current than it is supposed to (a short numerical sketch follows the list):

1) You supply a higher voltage than what it was designed for = more current drawn (Ohm's law).
2) There is a low-resistance / short-circuit fault in the device. This will result in a higher current being drawn, even though the device is receiving the correct voltage.
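A minimal Python sketch of those two cases, using Ohm's law with made-up values (a hypothetical device designed for 12 V with about 6 ohms of internal resistance):

```python
# Ohm's law: I = V / R.  Hypothetical device: designed for 12 V, ~6 ohm resistance.
DESIGN_V = 12.0
DESIGN_R = 6.0
design_current = DESIGN_V / DESIGN_R            # 2 A, the intended draw

# Case 1: over-voltage -- same device resistance, higher supply voltage.
over_voltage_current = 18.0 / DESIGN_R          # 3 A

# Case 2: internal fault -- correct supply voltage, but part of the load is
# shorted, so the effective resistance drops.
fault_r = 1.5
fault_current = DESIGN_V / fault_r              # 8 A

print(f"design: {design_current:.1f} A, over-voltage: {over_voltage_current:.1f} A, "
      f"fault: {fault_current:.1f} A")
```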
If you have a given voltage and a given resistance, then you can only draw a certain amount of current, correct?

Correct
Just because you have a smaller wire (which will increase the resistance) does not mean that you are pulling any more amps or volts; rather, you are decreasing them, right? So the wire will burn up, and not the device it is connected to?

Correct. You will get to a point where the wire cannot carry the current required by the device and it will burn up (fuse).

This is how a fuse protects something: its diameter and makeup are designed to carry only a certain current.
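As a small sketch of the fuse idea, the snippet below compares the current a device actually draws (I = P / V) with the rating of the fuse or wire feeding it. The device figures and the 5 A fuse are hypothetical examples.

```python
# Does the normal operating current stay below the fuse rating?
def device_current(power_w, voltage_v):
    """Current drawn by a device of the given power at the given supply voltage (I = P / V)."""
    return power_w / voltage_v

FUSE_RATING_A = 5.0                    # hypothetical fuse in the supply lead

for name, p, v in [("400 W heater on 120 V", 400.0, 120.0),
                   ("1500 W heater on 120 V", 1500.0, 120.0)]:
    i = device_current(p, v)
    status = "ok" if i <= FUSE_RATING_A else "fuse blows"
    print(f"{name}: {i:.2f} A -> {status}")
```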


I guess I could ask directly:
1.) Can you burn up a device by using too small a power cord (other than for a motor)?

No, see the above comments.
2.) When considering power (P = IV), how can you possibly put less voltage on something and have it draw far more current? For example, if you had something rated 120 V, 400 W, would it burn up if you only hooked 2 V to it? The current would then be 200 A instead of 10/3 A? This idea goes against my understanding of Ohm's law.

No, not for purely resistive loads (a light globe, a heater element, etc.), but motors are an exception. When the voltage is lowered on a motor under load, it will draw more current, heat up, and burn out the wire windings.
This is because motors present a much more complex load to the power supply.

Cheers
Dave
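To make the resistive-versus-motor distinction concrete, here is a rough Python sketch. The heater figures come from the 120 V / 400 W example in the thread; the motor winding resistance and stall voltage are made-up values for illustration only.

```python
# Purely resistive load (heater): resistance stays roughly constant,
# so the current falls with the voltage.
rated_v, rated_p = 120.0, 400.0
heater_r = rated_v ** 2 / rated_p               # 36 ohms
print(f"heater at 120 V: {rated_v / heater_r:.2f} A")
print(f"heater at   2 V: {2.0 / heater_r:.3f} A   (far below rated current)")

# Motor under load: when the voltage is too low it stalls, produces no back-EMF,
# and the only thing limiting current is the small winding resistance.
winding_r = 0.5                                  # hypothetical winding resistance, ohms
stall_v = 60.0                                   # assumed under-voltage at which the motor stalls
print(f"stalled motor at {stall_v:.0f} V: {stall_v / winding_r:.0f} A  (windings overheat)")
```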
 
  • #4
Undersized wires cause a voltage drop at the appliance under load. You could design a circuit that would burn up if the input voltage gets too low. One would hope that engineers do the opposite and consider what will happen to their design if the voltage drops.

Undersized wires are a problem unto themselves. They don't need any reason to be avoided other than the fact that they are a fire hazard.
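The voltage-drop point can be modeled as a simple series circuit: the cord resistance and the appliance resistance form a divider. The cord resistances below are assumptions chosen only to show the trend; the 36-ohm appliance corresponds to the 400 W heater discussed above.

```python
# Series model: supply -> cord resistance -> appliance -> back through the cord.
SUPPLY_V = 120.0
APPLIANCE_R = 36.0                     # e.g. the 120 V / 400 W heater from the thread

def appliance_voltage(cord_r):
    """Voltage actually seen by the appliance when fed through a cord of total resistance cord_r."""
    i = SUPPLY_V / (cord_r + APPLIANCE_R)
    return i * APPLIANCE_R

for cord_r in (0.1, 1.0, 5.0):         # adequately sized -> badly undersized cord
    v = appliance_voltage(cord_r)
    print(f"cord resistance {cord_r:4.1f} ohm -> {v:6.1f} V at the appliance")
```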
 
  • #5
Undersized wires cause a voltage drop at the appliance under load.

Yes, that is correct, particularly where high currents are involved, as with my transceiver radio gear, where I'm hauling >20 amps at 13.8 V.
BUT the voltage drop will not cause damage to the transmitter; it will just mean the transmitter cannot produce full output power.

As a result, I use a suitably sized conductor that is still easily flexible, and I keep the lead as short as practical.

The DC power lead that comes with the radio is usually ~6-8 ft long. Most of us cut that down to ~4 ft, and this can save up to a 2 V drop during transmit.

Dave
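A rough check of those numbers: the drop scales with the round-trip length of the lead (out on the positive conductor and back on the negative). The wire gauge below is an assumption, not the actual lead on the radio, and real leads also add connector and fuse-holder resistance on top of the bare-wire figure.

```python
# Voltage drop in a DC power lead: V_drop = I * R_wire, with R = resistivity * length / area.
RHO_COPPER = 1.68e-8          # ohm-metres, copper at room temperature
CURRENT_A = 20.0              # transmit current from the post
AREA_M2 = 1.31e-6             # assumed 16 AWG conductor (~1.31 mm^2)

def drop_volts(one_way_ft):
    length_m = one_way_ft * 0.3048 * 2      # round trip: out on the + lead, back on the - lead
    r = RHO_COPPER * length_m / AREA_M2
    return CURRENT_A * r

for feet in (8, 4):
    print(f"{feet} ft lead: about {drop_volts(feet):.2f} V dropped at {CURRENT_A:.0f} A")
```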
 
  • #6
Thanks, guys, for your help. The idea is a lot clearer now. I suppose that when something is labeled 120 V, 400 W, that is more a ratio (and a rating, I am sure) than anything, right? It means that when you put 120 V on it you get 400 W of work? So if I connected 2 V to it instead, and if it even worked at all, I would only get 20/3 W?

Thanks again!
 
  • #7
jeff davis said:
Thanks, guys, for your help. The idea is a lot clearer now. I suppose that when something is labeled 120 V, 400 W, that is more a ratio (and a rating, I am sure) than anything, right? It means that when you put 120 V on it you get 400 W of work? So if I connected 2 V to it instead, and if it even worked at all, I would only get 20/3 W?

Thanks again!

This is only correct if the device happens to be a heater (i.e. resistive and not operating at high temperature). If a motor is supplied with very low volts, it will not manage to turn at all and will therefore produce no back EMF. It will consequently take much more current than the simple formula would suggest. Likewise, for a filament lamp, the low-temperature resistance may be 1/10 of the normal resistance when it is glowing. There are other devices that will take virtually no current on a low-voltage supply (very high resistance).
Basically, only resistive devices follow Ohm's law, and you should remember that Ohm's law describes the behaviour at constant temperature. Ohm's law is, strictly, not the 'definition' of resistance; it simply states that resistance is constant at constant temperature (subtle difference there!).
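A small sketch of the filament-lamp point: with a cold resistance of roughly 1/10 the hot resistance (the figure quoted above), the same lamp draws very different currents depending on its temperature. The 120 V / 60 W lamp here is an assumed example, not one from the thread.

```python
# A 120 V, 60 W filament lamp (assumed example).
V_RATED, P_RATED = 120.0, 60.0
hot_r = V_RATED ** 2 / P_RATED        # 240 ohms when glowing
cold_r = hot_r / 10.0                 # roughly 1/10 of the hot resistance, per the post

print(f"hot  resistance: {hot_r:.0f} ohm, running current {V_RATED / hot_r:.2f} A")
print(f"cold resistance: {cold_r:.0f} ohm, switch-on surge {V_RATED / cold_r:.2f} A")

# On a very low-voltage supply the filament never heats up, so it stays near its
# cold resistance and draws I = V / cold_r -- more than the hot resistance predicts.
print(f"at 2 V (stays cold): {2.0 / cold_r:.3f} A")
```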
 

Related to Can Ohm's Law be applied to all types of devices and power sources?

1. What causes small wires to burn up devices?

An undersized wire has a higher resistance and a lower safe current-carrying capacity, so the current a device draws can overheat the wire itself. The overheated wire can melt its insulation, start a fire, or fail, which in turn can damage or disable the device it feeds.

2. Can the type of wire used affect the likelihood of burning up devices?

Yes. The gauge and quality of the wire have a significant impact. Thin wires have higher resistance and a lower safe current-carrying capacity, so they are more prone to overheating and burning, while thicker wires can handle higher currents without burning up.

3. Is it safe to use small wires for high-power devices?

No, it is not safe to use small wires for high-power devices. The high amount of current required for such devices can easily exceed the capacity of small wires, leading to overheating and potential damage to the device.

4. Can faulty wiring be a contributing factor to small wires burning up devices?

Yes, faulty wiring can definitely contribute. Loose connections add contact resistance that heats up locally, and damaged insulation or incorrect wiring can cause short circuits that draw excessive current; either situation can overheat the wiring and damage the device.

5. How can I prevent small wires from burning up my devices?

To prevent small wires from burning up devices, it is important to use the right gauge and type of wire for the specific device and its power requirements. Regularly checking for loose or damaged wiring and ensuring proper insulation can also help prevent wire overheating and damage to devices.
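As a rough illustration of matching wire gauge to the load, the sketch below compares a device's current draw with typical copper-wire ampacities. The ampacity figures are ballpark residential values and the 1500 W heater is an assumed example; always defer to the applicable electrical code.

```python
# Very rough ampacity guide for common copper building wire (ballpark figures only;
# always check the local electrical code and cable datasheets).
TYPICAL_AMPACITY = {14: 15.0, 12: 20.0, 10: 30.0}   # AWG -> amps

def smallest_adequate_gauge(power_w, voltage_v):
    """Return the thinnest listed gauge whose ampacity covers the device's current, or None."""
    i = power_w / voltage_v
    adequate = [awg for awg, amps in TYPICAL_AMPACITY.items() if amps >= i]
    return (max(adequate), i) if adequate else (None, i)   # larger AWG number = thinner wire

gauge, amps = smallest_adequate_gauge(1500.0, 120.0)       # e.g. a 1500 W space heater
print(f"1500 W at 120 V draws {amps:.1f} A -> smallest listed gauge: {gauge} AWG")
```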
