No vacuum anywhere vs the measurement of "c"

In summary: Welcome. Here's the deal. If you make a claim like "but there isn't anywhere a real vacuum possible in the universe", then you need to document it, and "(see Wikipedia)" is insufficient. You need to up your game a bit if you want serious discussion.
  • #36
Let me try a little. The speed of light in vacuum is well defined within our framework of physics. It is a number that appears over and over, and therefore for practical purposes it behooves us to measure (or otherwise adjudicate) its value accurately relative to the other fundamental quantities. It does not matter that we cannot measure it directly and exactly ("exactly" is a meaningless concept); it matters only that we can measure it well enough for our purposes. And, by the way, it makes much more practical sense to use that value, together with an atomic clock, to define the length unit than to store an oxidizing piece of metal in a vault. Nothing is exact and inviolate.
The fact that there are no perfect realizations of "light in a vacuum" in no way voids the concept, although you seem greatly worried.

 
Last edited:
  • Like
Likes Dangoe, Lord Jestocost and topsquark
  • #37
Dangoe said:
We defined the second and hence the meter by the speed of light in vacuum
No, we didn't. The SI defines the second in terms of a hyperfine transition in cesium atoms. The SI meter is then defined so that the speed of light in vacuum (really a misnomer; a better term would be "the finite invariant speed, aka the unit conversion factor between time and distance, that relativity theory tells us must exist") has a fixed value. However, that definition is just a unit definition and does not depend on making any measurements involving light in vacuum. So the fact that in practice we cannot achieve a 100% vacuum is irrelevant.
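
To spell out the chain (both numbers are fixed by the SI definitions and are exact by construction, not measured):
$$\Delta\nu_{\rm Cs} := 9\,192\,631\,770\ {\rm Hz} \quad\Rightarrow\quad 1\ {\rm s} = 9\,192\,631\,770 \ \text{periods of the corresponding radiation}$$
$$c := 299\,792\,458\ {\rm m/s} \quad\Rightarrow\quad 1\ {\rm m} = c \times \frac{1}{299\,792\,458}\ {\rm s}$$
Nothing here requires producing a perfect vacuum: the definitions fix the numbers, and a real apparatus only has to approximate the ideal well enough for the precision at hand.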
 
  • Like
Likes dextercioby, Dangoe and topsquark
  • #38
hutchphd said:
than to store an oxidizing piece of metal in a vault
And flaking. Don't forget flaking.
 
  • Haha
Likes Dangoe
  • #39
Dangoe said:
Still having a hard time accepting the given language, though.
Well, in a sense you don’t have to accept it. You can use “vacuum” to mean the unobtainable ideal vacuum you think of. You just need to understand that other people use it differently. That way you can understand what they say (even if you wouldn’t say it that way) and you can clarify how you mean the term (so that they can understand what you say).

That said, it is easier to just use a word the way others use it.

Dangoe said:
Those measurements are obviously showing uncertainties there.
Yes, that was before the modern definition of the meter, so at that time there was indeed still some experimental uncertainty in ##c## using SI units.
 
Last edited:
  • Like
Likes Dangoe and topsquark
  • #40
Thanks for the further explanations as well; they clarify things even more.

I feel this is all rather trivial - hardly rocket science - but for me it's very informative. Honestly, I imagine, or at least hope, that other interested people at a similar level may find this thread helpful as well. I feel much better now, for sure; trying to understand things like this was itching me (even somehow blocking me), and now it's gone.

I don't want to just swallow other people's or textbooks' facts; I want to understand them - way more fun - and ... okay, I'll leave it at that.

Let me try to summarize the lessons learned in my own words (subject of the next post).
After that, some thoughts on some quotes I have in mind as well.
Dangoe
 
  • #41
Summarizing, more or less briefly:

"We measured c historically by the reference meter stored in Paris until we could achieve very fine-tuned measurements in a vacuum better than needed [here the point - seems not to hurt] while getting more precise clocks (atomic clocks, Cäsium-based, even future improvements ongoing today) for the second, better say for time measurements.

And since a speed or velocity is distance per time, and since c is the upper, maximal limit - a natural constant that we managed to measure very accurately with the given means and several techniques - we're able to somehow reverse engineer things: set the meter close to the old one (and to the old second as well), rely on ever more precise time measurement abilities to fine-tune the second, and then fix c once and for all, by definition.

And tadaa, we have a unit definition fixing c, m, and s. Good enough for our purposes - very good, in fact.
The speed of light in vacuum is always the same; in any given medium there are corrections.

As we can't define distances without c, we're left with more precise time measurements as the only way to accomplish that. Because c is fixed by reality - a constant - we can rely on it to figure out a distance by fine-tuning the time with a good clock (Cs).

Nah, even if I described it without major flaws, this still feels a bit shaky - either because of my own issues or because real issues are lurking, the latter being way less probable. ;)

There I have a fallback: again, the initial measurements, by whatever means, are crucial. That leads back to my opening question: where and how was it measured? And therein, again, the vacuum question - okay, less important than I thought. Sorry - that happens while writing. Let me see you again tomorrow; maybe I'll write a better summary. Or maybe I actually have a point (of course not). I imagine most of this is kind of off-topic to reply to, but I can't summarize my actual current understanding without encompassing much more, sorry. I can probably do better, like breaking long sentences down into smaller ones. I'm not running away; this is it for now, and the next try may be better.

Dangoe
 
  • #42
I think I need some time off - this is driving me crazy and consuming way too much of my free time. I'll be back; until then!
 
  • #43
Dangoe said:
Where and how was it measured?
Where and how was what measured?
 
  • #44
Dale said:
Where and how was what measured?
c
I mean, it wasn't guessed in the first place, so it was measured somehow - we saw the uncertainties earlier in the historical context. Things are still hidden from me behind a paywall; I can ask my local uni, or just pay for access, or try zlib. Anyway, at some point they were measuring, getting better, and only then did we get the unit definitions. This didn't fall from the sky.
 
  • #45
Dangoe said:
I mean, it wasn't guessed in the first place, so it was measured somehow - we saw the uncertainties earlier in the historical context. Things are still hidden from me behind a paywall; I can ask my local uni, or just pay for access, or try zlib. Anyway, at some point they were measuring, getting better, and only then did we get the unit definitions. This didn't fall from the sky.
I think about it a bit differently.

We try to measure things in the most accurate way possible. If we measure distance, we use the most accurate distance measuring tools we can make. If we want to measure time, we use the most accurate clocks we can build.

But measurement by itself is not enough. We want to be able to compare my measurement over here against your measurement over there. We need standards. Standard units. Comparisons. Reliable, repeatable, traceable comparisons. [There is a term for this: "metrology"].

We want a consistent repeatable standard against which we can all compare our meter sticks.
We want a consistent repeatable standard against which we can all compare our wrist watches.
We want a consistent repeatable standard against which we can compare our standard lab weights.

If the most consistently repeatable (not necessarily the most accurate) way we have of establishing a standard distance is to examine two scratches on a particular metal bar supported in a particular way, then that is what we will use as a standard of distance. And we did, for a while.

If the most consistently repeatable way we have of establishing a standard time is as a fraction of the duration of a tropical year, then that is what we will use as a standard of time. And we did, for a short while.

We can judge consistency and repeatability by doing repeated comparisons and seeing that they come out the same way every time within some tolerances. We can get statistics on how rough the tolerances are.
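
As a toy illustration of that last paragraph (a minimal sketch; the deviation values below are invented for the example, not real calibration data):

```python
import statistics

# Hypothetical repeated comparisons of a lab meter stick against a
# reference standard, recorded as deviations in millimetres. The spread
# of the results, not any single value, tells us how repeatable the
# comparison procedure is.
deviations_mm = [0.12, 0.09, 0.11, 0.13, 0.10, 0.12, 0.08, 0.11, 0.12, 0.10]

mean_dev = statistics.mean(deviations_mm)
repeatability = statistics.stdev(deviations_mm)  # sample standard deviation

print(f"mean deviation: {mean_dev:.3f} mm")
print(f"repeatability (1 sigma): {repeatability:.3f} mm")
```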

But technology moves on. If a distance measuring apparatus based on interferometry turns out to deliver more accurate comparisons than a bar with scratches, while still maintaining consistency and reliability, then we will change our standard definition for distance units accordingly.

But we also do not want to tie our definition to one exactingly precise description of a specific measuring device. So we do not define our standard in terms of a "Hewlett-Packard 5710-A dual-column gas chromatograph with flame ionization detectors". Instead, we word the standard in vendor-neutral terms involving physical laws, according to our well-verified understanding of how the universe works. [There is a certain amount of distaste for anchoring definitions to physical artifacts when well-verified physical laws are available instead.]

This, even though the whole point of the definition is to be able to realize the definition by building a device to perform reliable, repeatable, consistent measurements that match, within reason, what a hypothetical ideal device would measure.
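
For instance, one way the meter is realized in practice is time-of-flight ranging: measure a round-trip light travel time with a good clock and multiply by the exact, defined value of c. A minimal sketch (the round-trip time below is a made-up input, not a real measurement):

```python
# The SI meter is realized from a time measurement plus the exact,
# defined value of c; no artifact bar is involved.
C = 299_792_458  # m/s, exact by definition in SI units

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """One-way distance from a measured round-trip light travel time."""
    return C * round_trip_seconds / 2

# A pulse that returns after about 6.67 nanoseconds has travelled
# roughly two meters, i.e. a one-way distance of about one meter.
print(distance_from_round_trip(6.671e-9))  # ~1.0 m
```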
 
Last edited:
  • Like
Likes pinball1970, russ_watters and topsquark
  • #46
@jbriggs444: You are basically explaining why we invented units and why a Rolex is better than a sundial. I mostly agree, except at the end:

jbriggs444 said:
...physical laws according to our well-verified understanding of how the universe works.
That's relative. From overwhelmingly good to not at all. We are not in a Hollywood movie.
 
  • Skeptical
Likes Motore and jbriggs444
  • #47
Dangoe said:
@jbriggs444: You are basically explaining why we invented units and why a Rolex is better than a sundial. I mostly agree, except at the end: That's relative. From overwhelmingly good to not at all. We are not in a Hollywood movie.
If you want to claim that our understanding of the universe is wrong, you are in the wrong forum. Our unit definitions are on solid ground. Very solid.

If you want to claim that our understanding is as ludicrous as Hollywood physics... Politeness requires that I shut up now.
 
  • Like
Likes pinball1970, russ_watters, phinds and 2 others
  • #48
Dangoe said:
I mean, it wasn't guessed in the first place, so it was measured somehow - we saw the uncertainties earlier in the historical context.
It was initially guessed to be infinite (I would document this, but why should I bother?). Then it was measured (most notably, perhaps, by Rømer).
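
Rømer's estimate amounts to a single division: the extra eclipse delay of Jupiter's moon Io as Earth crosses its orbit, divided into the orbit's diameter. A rough back-of-the-envelope sketch (using the ~22-minute delay traditionally attributed to Rømer's data and the modern value of the astronomical unit):

```python
# Roemer-style order-of-magnitude estimate of the speed of light.
AU = 1.496e11        # meters, modern mean Earth-Sun distance
delay_s = 22 * 60    # ~22 minutes, the delay traditionally attributed to Roemer

c_estimate = 2 * AU / delay_s  # light crosses the orbit's diameter in that delay
print(f"{c_estimate:.3g} m/s")  # ~2.27e8 m/s: low, but the right order of magnitude
```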
This is my last response, because I prefer thoughtful people. If nobody responds to you, that may be a clue.
 
  • Like
Likes russ_watters and berkeman
  • #49
Dangoe said:
I mean, it wasn't guessed in the first place, so it was measured somehow - we saw the uncertainties earlier in the historical context. Things are still hidden from me behind a paywall; I can ask my local uni, or just pay for access, or try zlib. Anyway, at some point they were measuring, getting better, and only then did we get the unit definitions. This didn't fall from the sky.
Have you tried Wikipedia? Or YouTube, for that matter.
 
  • #50
Dangoe said:
We are not in a Hollywood movie.
Exactly. Which means you should not be making off-the-cuff claims that you cannot back up.

The OP question has been sufficiently addressed. Thread closed.
 
  • #51
Dangoe said:
Things are still hidden from me behind a paywall
I have no idea what you are talking about. There is copious information available online about various ways that ##c## has been measured over the past few centuries, and how the value that is now fixed in the SI unit definitions was arrived at.

If you have questions about something you read in the course of looking through all of that copious information available online, by all means start a new thread here to ask them. But to claim that you are unable to find any useful information online about the measurement of ##c## is simply not credible.

Thread will remain closed.
 