Rasalhague
Homework Statement
If the energy flux associated with a light beam of wavelength 3 × 10^-7 m is 10 W m^-2, estimate how long it would take, classically, for sufficient energy to arrive at a potassium atom of radius 2 × 10^-10 m in order that an electron be ejected.
Homework Equations
[tex]E_x = h \nu - \phi[/tex]
where [itex]E_x[/itex] is "the maximum electron energy" (which I interpret to mean the maximum kinetic energy of any individual emitted electron), [itex]h[/itex] is Planck's constant, taken here as 6.4 × 10^-34 J s to be consistent with the book's numbers (the accepted value is about 6.63 × 10^-34 J s), [itex]\nu[/itex] is the frequency of the light (the speed of light divided by the wavelength), and [itex]\phi[/itex] is "the minimum energy needed to free an electron", which in this case is 1.9 eV, as calculated in the previous problem and confirmed by the back of the book.
The Attempt at a Solution
Exposed area of atom = [itex]\pi r^2 = 4\pi \times 10^{-20}\ \text{m}^2[/itex].
Time in seconds = (minimum energy) / (flux × area)
[tex]= \frac{1.9 \times 1.6 \times 10^{-19}\ \text{J}}{10\ \text{W m}^{-2} \times 4\pi \times 10^{-20}\ \text{m}^2} = \frac{3.04}{4\pi} \approx 0.24\ \text{s}.[/tex]
The book makes the same calculation except that, instead of the minimum energy, it uses the maximum energy, 2.1 eV, giving 0.27 s. I get the same answer as the book if I substitute the maximum for the minimum energy.
My question: Why is the maximum energy used here, rather than the minimum? The question only asks how long, according to classical theory, until an electron is emitted, not how long until an electron with the maximum energy is emitted.
It's not altogether clear to me which parts of section 1.1 describe classical theory and which describe quantum theory. Perhaps this is the source of my confusion.