Defining Ampere: Length of Wire Explained

  • Thread starter: lionely
  • Tags: Ampere
In summary, the conversation discusses the definition of the ampere and how it has changed over time. The familiar description is a flow of one coulomb per second, but the SI definition at the time of the thread involved two infinitely long wires exerting a force on each other. There is an ongoing effort to overhaul the International System of Units (SI) and define current in terms of a flow of electrons per second. The conversation also touches on the history of the ampere's definition and the challenges of teaching E&M to students.
  • #1
lionely

Homework Statement
This may be a stupid question but... I don't understand why the length of the wire has to be infinite in defining the ampere... could someone explain this to me?
 
  • #2
lionely said:

This may be a stupid question but... I don't understand why the length of the wire has to be infinite in defining the ampere... could someone explain this to me?

Uh ... huh? Where did you get that definition?
 
  • #3
My textbook says... One ampere is that steady current which, when flowing in each of two infinitely long, straight, parallel wires of negligible cross-section placed one metre apart in a vacuum, causes each wire to exert a force of 2 × 10^-7 newtons on each metre length of the other wire.
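The stated force per metre follows from the field of a long straight wire, B = μ₀I/(2πd), combined with F = BIl. A quick numerical check (a sketch added here for illustration, not part of the thread):

```python
import math

mu0 = 4 * math.pi * 1e-7   # permeability of free space, H/m (exact in the pre-2019 SI)
I = 1.0                    # current in each wire, amperes
d = 1.0                    # wire separation, metres

B = mu0 * I / (2 * math.pi * d)   # field of one wire at the position of the other, teslas
force_per_metre = B * I * 1.0     # F = B * i * l, evaluated for l = 1 metre of wire

print(force_per_metre)  # 2e-07, i.e. the 2 x 10^-7 N per metre the textbook quotes
```

Note that the definition works in both directions: μ₀ was fixed at exactly 4π × 10⁻⁷ H/m, which is precisely what forced the defining force to come out as 2 × 10⁻⁷ N/m.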
 
  • #4
lionely said:
My textbook says... One ampere is that steady current which, when flowing in each of two infinitely long, straight, parallel wires of negligible cross-section placed one metre apart in a vacuum, causes each wire to exert a force of 2 × 10^-7 newtons on each metre length of the other wire.

Weird. I've never seen that one before. An amp is simply a flow of one coulomb per second. Are you quoting this exactly? That is, are you sure that it is presented as "THE" definition of the amp, or as simply a result of the definition of the amp?
 
  • #5
The definition of the ampere has changed over time. See, for example, this entry at NIST.
 
  • #6
lionely said:
I don't understand why the length of the wire has to be infinite in defining the ampere... could someone explain this to me?
That makes the "end effects" vanishingly small. :smile: You can ignore them because they are not there.

So here they are defining the Ampere using the relation: F = B i l
where B = μ H

 
  • #7
Oh I see, so it is just to ignore the force on the end of the wires...
 
  • #8
phinds said:
An amp is simply a flow of one coulomb per second.

Those were the good times, but these pesky metrologists have to complicate everything.
 
  • #9
phinds said:
Weird. I've never seen that one before. An amp is simply a flow of one coulomb per second. Are you quoting this exactly? That is, are you sure that it is presented as "THE" definition of the amp, or as simply a result of the definition of the amp?
That's been the definition for a long, long time. Not an infinitely long time, but quite long nonetheless. The end effects are getting vanishingly small. (You have to be over 80 to remember the old definition.) The ampere currently is a base unit in the metric system and the coulomb is a derived unit, an ampere-second.
 
  • #10
The definition WILL change to "a flow of X electrons per second" in a few years, but current electron pumps (as the devices are called) are not quite good enough to replace the present definition; the pump accuracy needs to improve by a factor of about 5-10.
This change will happen at the same time as defining the value of e (and perhaps a few more constants, e.g. h) to be exact, similar to what has already been done for c.
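Once e is fixed exactly, a pump that shuttles one electron per clock cycle delivers a current of simply e times the frequency. A rough sketch (the 1 GHz clock is illustrative, not a real device spec):

```python
e = 1.602176634e-19   # elementary charge in coulombs (made exact in the 2019 SI revision)
f = 1.0e9             # pump clock frequency in Hz: one electron transferred per cycle

I = e * f             # resulting current in amperes
print(I)              # ~1.6e-10 A -- which is why pump currents are tiny and accuracy is hard
```

The smallness of e·f at achievable clock rates is exactly the practical obstacle the post describes: nanoamp-scale currents are hard to compare against existing standards.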
 
  • #11
f95toli said:
The definition WILL change to "flow of X number of electrons per second"
That's right. There's an ongoing effort to completely overhaul the International System of units (SI).

The ampere is currently defined the way it is because of those "pesky metrologists." Measuring charge is a touchy business. Measuring current is a better way to go. The standard presently has the ampere as a base unit and the coulomb as a derived unit. The reason is that experiments that measure electromagnetic induction currently provide significantly greater accuracy and precision than do experiments that measure the electrostatic force.

The movement that is underfoot is to completely reconcile the quantum and relativistic scale worlds with our everyday macroscopic world. One consequence of this is that current will be defined in terms of a flow of a number of electrons per second. The ampere will be a "base unit" in name only. When the CIPM is finished with the overhaul, the only true base unit will be the second. All other units will flow from the second via defined physical constants.
 
  • #12
D H said:
That's been the definition for a long, long time. Not an infinitely long time, but quite long nonetheless. The end effects are getting vanishingly small. (You have to be over 80 to remember the old definition.) The ampere currently is a base unit in the metric system and the coulomb is a derived unit, an ampere-second.

Bolded part is not true. I took EE 50 years ago and coulomb/sec was the definition then.
 
  • #13
phinds said:
Bolded part is not true. I took EE 50 years ago and coulomb/sec was the definition then.
No, it wasn't. Fifty years ago was 1963. The current definition of the ampere and coulomb had already been in effect for 15 years. Your teacher at that time was old enough to remember the old definition and wasn't paying attention in 1948 when the ampere and coulomb were redefined.
 
  • #14
The NIST page linked earlier in the thread states that the definition changed in 1948. I was taught C/s somewhere in the mid-seventies (1975 or '76?), and I don't think my teacher was old enough to remember the old definition; she was in her thirties or forties.

My bet is C/s was easier to swallow for 13/14-year-olds than two infinite wires attracting each other with 10^-7 N per metre.
 
  • #15
Borek said:
My bet is C/s was easier to swallow for 13/14-year-olds than two infinite wires attracting each other with 10^-7 N per metre.
Not just to 13/14 year olds, but also to the teachers of 13/14 year olds. I don't know about Poland, but here in the US, primary school science teachers do not have to take the classical electricity and magnetism (E&M) class that vexes so many students of physics.
 
  • #16
D H said:
No, it wasn't. Fifty years ago was 1963. The current definition of the ampere and coulomb had already been in effect for 15 years. Your teacher at that time was old enough to remember the old definition and wasn't paying attention in 1948 when the ampere and coulomb were redefined.

OK, I'll give you that. He was an OLD guy and never was able to understand transistors. :smile:
 
  • #17
I suspect that many instructors (and textbooks) still teach amperes as being coulombs/second. My very old Halliday & Resnick (1967) does not. It explains why current rather than charge was chosen as the base unit, "for practical reasons having to do with the accuracy of measurements."

On a conceptual level, that charge is basic makes a lot more sense than current being basic. However, it's the other way around on the basis of what is more observable experimentally. Currently. Sorry for the bad puns.
 
  • #18
It is perhaps worth mentioning that no one actually uses the definition of the ampere directly; the realisation is far too complicated even for metrology labs, so no one has performed the actual experiment since the 1980s.
For the past 20 years or so, devices that use the ampere (ammeters etc.) have been calibrated "indirectly" by creating a known current using the realisations of the volt (Josephson voltage) and the ohm (quantum Hall effect). The reason is that we can measure these with much, much higher accuracy than the ampere in its "real" realisation (the volt can be realized to one part in 10^15, since it relies on the second); another reason is simply that most NMIs maintain systems that can realize the volt and the ohm anyway.
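A sketch of that indirect route, using the (now exact) SI values of h and e to form the Josephson and von Klitzing constants; the microwave frequency and plateau index below are illustrative values, not from the thread:

```python
e = 1.602176634e-19    # elementary charge, C (exact in the revised SI)
h = 6.62607015e-34     # Planck constant, J s (exact in the revised SI)

f = 70e9               # microwave drive frequency, Hz (illustrative)
n = 1                  # Shapiro step index on the Josephson junction
i = 2                  # quantum Hall plateau index

V = n * f * h / (2 * e)   # Josephson voltage: V = n f / K_J with K_J = 2e/h  (~145 uV here)
R = h / (i * e**2)        # quantum Hall resistance: R = R_K / i with R_K = h/e^2 (~12.9 kOhm)

I = V / R                 # the "indirectly" realized current, ~11 nA in this example
print(I)
```

Both effects tie an electrical quantity to a frequency or to fundamental constants alone, which is why this chain beats any direct force-between-wires experiment on accuracy.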

This is of course not an "official" way of doing it, since one needs values for the Josephson and von Klitzing constants, which, since they are not defined, are not exact; but the CODATA values are so good that this is not really a practical problem. Hence, if you have a calibration certificate for your multimeter, this is what the current specs are ultimately traceable to (via some intermediate secondary standards).
That said, this is of course not a satisfactory situation, which is why so much effort is going into finding a practical method for realising the ampere that is somehow connected to time*.



*Note that this might NOT turn out to be an electron pump that "counts" electrons; an alternative method would be to use a quantum phase-slip (QPS) junction, which is the "current analogue" of the Josephson junction (used for voltage calibration).
A QPS junction should, when irradiated by microwaves, generate current steps (again analogous to the Shapiro voltage steps in a Josephson junction) that could be used to realize the ampere with extremely high precision; that is, assuming we can get the steps flat enough and the current magnitude high enough to be useful. This is very much work in progress.
 
  • #19
The C/s definition has one important advantage, especially in chemistry: it shows the link between reactions in redox systems (where electrons are transferred between atoms/ions/molecules) and the current. Quite often I see students who don't see the link, and I often wonder if it is because they are not aware that current is nothing but charge transfer.

But perhaps that's not a matter of definition used, but just of the general lack of knowledge and inability to connect the dots to see a picture.
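The redox link described above is just arithmetic with the Faraday constant: charge is current times time, and dividing by F gives moles of electrons. A minimal illustration (the current and duration are made-up example values):

```python
F = 96485.0    # Faraday constant, coulombs per mole of electrons

I = 0.5        # electrolysis current in amperes (illustrative)
t = 3600.0     # duration in seconds (one hour)

Q = I * t      # total charge transferred, coulombs
n_e = Q / F    # moles of electrons passed through the cell
print(Q, n_e)  # 1800 C, ~0.0187 mol of electrons
```

Seen this way, an ammeter reading is a direct rate of chemical change at the electrodes, which is the connection the post says students miss.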
 
  • #20
The definition is going to change next year.

IMHO :biggrin: :

epenguin said:
this question gives me the chance to point out something I have rarely seen mentioned. The laws according to Wiki are:


Faraday's 1st Law of Electrolysis - The mass of a substance altered at an electrode during electrolysis is directly proportional to the quantity of electricity transferred at that electrode. Quantity of electricity refers to the quantity of electrical charge, which is measured in coulombs.

Faraday's 2nd Law of Electrolysis - For a given quantity of D.C. electricity (electric charge), the mass of an elemental material altered at an electrode is directly proportional to the element's equivalent weight.



In a manner of speaking you could say that Faraday's laws have nothing to do with electricity, and that the first law is superfluous.

To explain: the quantity of charge transferred is measured in coulombs. For a long time the coulomb was defined electrochemically. If you look in old textbooks you will find the coulomb defined as the amount of charge that causes some ridiculous number of grams of silver to be deposited in an electrochemical cell.* If that defines the quantity of electricity, then the first law is contained in the second.

And together they could be restated something like “chemical equivalents in electrochemistry are the same as in ordinary chemistry”.

The historical fact (I do not know the history very well) is that the same Faraday who pioneered electrochemistry also pioneered electromagnetism. So no doubt he measured the currents in his electrochemical experiments with an electromagnetic device, and formulated the laws that way. He was deep into the unity of all these things. But someone else might have done just electrochemistry alone quite well and could have formulated the law chemically without ever seeing an ammeter, and we would probably, to this day, be teaching the subject slightly differently.

The silver deposition lends itself to fairly accurate measurement, which is why it became the standard. But at some point it proved possible to measure better (more precisely, I suppose) via the electromagnetic force of a current, and they made that the standard for the ampere, with the coulomb being that multiplied by a time. That is what it is now.

But don’t invest too much into it because next year they are going to change it into something more chemical again! :biggrin: http://en.wikipedia.org/wiki/New_SI_definitions

*The ampere was then that amount per second (or perhaps, at one time, per hour).
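That "ridiculous number of grams" is easy to reconstruct: silver deposits one atom per electron (Ag⁺ + e⁻ → Ag), so the mass per coulomb is just the molar mass over the Faraday constant. A quick sketch:

```python
M_Ag = 107.8682    # molar mass of silver, g/mol
F = 96485.3        # Faraday constant, C/mol

# One electron deposits one silver atom, so grams deposited per coulomb is M/F.
m_per_coulomb = M_Ag / F
print(m_per_coulomb)   # ~0.001118 g per coulomb
```

This ~1.118 mg/C figure is indeed the one behind the old "international ampere," defined via silver deposited per second.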
 
  • #21
I was taught that you can't use the relationship V=IR to define I. My teacher says V/I can be used to define R, and that the equation cannot define anything else. V is the potential difference, which is defined in terms of work done per unit charge... and he said we couldn't define I at that point, not UNTIL we did magnetic flux. (By the way, I'm in Upper Sixth Form.)
 

FAQ: Defining Ampere: Length of Wire Explained

What is an ampere?

An ampere is the SI unit of electric current, measuring the rate of flow of electric charge in a circuit. It is named after the French physicist André-Marie Ampère.

How is the length of wire related to the ampere?

The pre-2019 definition refers to two infinitely long parallel wires. The infinite length is an idealization: it makes end effects vanish, so the force per metre of wire is the same everywhere along the wires and the definition is unambiguous.

Why must the wires in the definition be infinitely long?

For wires of finite length, the field and hence the force per metre vary near the ends, so the stated force of 2 × 10^-7 newtons per metre length would not hold exactly. Idealizing to infinite length removes that ambiguity.

Can the finite length of real wires affect measurements?

Yes. Practical apparatus uses conductors and coils of finite size, so end effects do not vanish exactly and must be corrected for through the geometry of the experiment.

How do scientists realize the ampere in practice?

In practice the ampere has long been realized indirectly, from the volt (via the Josephson effect) and the ohm (via the quantum Hall effect). Since the 2019 SI revision, the ampere is defined by fixing the numerical value of the elementary charge.
