It seems that there is a "Chronology Protection Agency" which prevents
the appearance of closed timelike curves and so makes the Universe safe for historians.   Stephen Hawking (1942-2018), 1992
In other words,
"time" is defined as the independent variable which makes the equations of mechanics take on a simple form.
This is an operational definition which was designed in a simpler era of "classical" physics.
It still holds for nonrelativistic quantum theory, where time remains an old-fashioned "independent variable".
However, at a deeper level of understanding, time cannot be simply such an "independent" parameter
against which events are recorded. Instead, it's a component of spacetime
(to a degree, time and space can be traded for each other).
This has profound implications for our modern descriptions of the physical world, especially in the quantum realm.
Tom H (Yahoo! 2007-07-08)
The Beginning of Time
What was there before the Big Bang took place?
Time is just another coordinate of spacetime, so it has to unfold together with the other dimensions.
Time is created with the rest of space; there was simply no "before".
There was no "instant" of creation and there was no "location" for the primordial explosion either.
The center was everywhere. It still is.
A geometrical analogy might help:
Think of the surface of a sphere and imagine latitude is "time".
There's nothing north of the North Pole, is there?
This analogy with a sphere has other nice features.
In particular, the North Pole is not very different from nearby points;
it's just an artifact created from the way we measure things.
So too, the "instant" of creation is not well defined; it depends on the speed and location of the vantage point from which the (theoretical) measurement is made.
All of this is without even considering the quantum aspects
which nobody really understands (yet?).
Does this blow your mind? Well, it should.
It blows everybody's mind.
Concerning the "stuff" the Universe was made from, the answer is also weird...
The key remark is that gravitation has more negative energy when everything is packed tight.
Think about everyday experience: energy is released when an object is dropped,
so there's less energy (more "negative energy") when the Earth and the object are closer together.
At the scale of the entire Universe, the numbers are mind-boggling:
The positive energy in the Universe today (the energy of radiation and matter, according to E = mc²) seems to balance exactly the negative energy of gravitation.
Therefore, it looks like the Universe could have been created from zero energy,
from absolutely nothing!
Come to think of it, it MUST have been so,
or else how would you explain the "manufacture" of the original stuff itself?
This framework makes the Universe explainable
(in principle, at least)
without violating physical laws.
Ultimately, we can hope to be left with only one question:
"Why?" or "What caused it?"
That last question, however, is not a scientific one
(no matter how interesting it might be).
blue22op (Yahoo! 2007-05-15)
Unavoidable Time Machines
Is there a time machine in the process of being made?
I grieved to think how brief the dream of the human intellect had been.   H.G. Wells (1866-1946), The Time Machine (1895)
Like perpetual motion, time travel is
both unavoidable and impossible.
Microscopically, time-travel is unavoidable.
Elementary particles routinely go backward in time;
there's no difference between a particle
moving forward in time and its
antiparticle moving backward in time.
A particle-antiparticle creation or annihilation may also be described as a particle reversing its direction in time.
Now, can we harness this basic mechanism to make coherent systems
consisting of many particles
(and carrying definite information with them) go back in time?
The answer is as much of a "no" as what applies to the related question of
whether it's possible to transform brownian motion into coherent motion
(that would be called perpetual motion "of the second kind").
If you don't believe in one, you don't believe in the other...
Of course, science is not supposed to be about beliefs, but it is (to some degree).
It's a much more productive belief
(from a scientific standpoint)
to assume that perpetual motion can't exist than the opposite...
In one case, you'll refine the basic laws of thermodynamics.
In the other case, you may waste your life on doomed tinkering.
Similarly, the impossibility of time-travel imposes useful constraints
on the very laws of fundamental physics we are aiming to formulate.
To put it in provocative terms, it's almost certainly the more useful of the two possible beliefs.
This does not mean you can't have fun thinking about the paradoxes of time-travel.
However, those very paradoxes should be an indication that attempts at building
an actual time-machine are as doomed as attempts to build a perpetual motion machine.
Or vice-versa.
(2003-11-03) Laplace's Demon
In a predictable Universe, the past and the future are alike.
[For an intellect which would know all positions and velocities] nothing
could be uncertain and the future,
just like the past, would be present before its eyes. Pierre Simon de Laplace (1749-1827)
The "intellect" so introduced by the Marquis de Laplace
(Essai Philosophique sur les probabilités, 1814)
is now dubbed Laplace's Demon.
Its existence, within this world or outside of it, would make the Universe
frozen in spacetime, like a movie already filmed.
What Laplace envisioned was a God who could compute the past and the future
from a snapshot of the present (according to Newtonian mechanics,
perfect knowledge of all positions and velocities at one instant makes the
future entirely predictable and allows the deduction of what the past was exactly like).
Modern quantum mechanics precludes that.
Perfect knowledge of everything that ever was and ever will be simply cannot be deduced from anything but prior knowledge of the same.
Not even the past is certain because of the unavoidable existence
of a minute influence of the future on the past.
Laplace's Demon is a deep fallacy.
Time is an emergent phenomenon which comes from our ignorance of the details.   Alain Connes (1947-)  (interview, 2014-02-05)
Alain Connes considers that the compact operators on the Hilbert spaces used in quantum mechanics may play a rôle similar to the infinitesimals used in a bygone era to embody the continuity of space or time. In the noncommutative case, Connes found the emergence of a one-parameter group which can be construed as the passage of thermodynamical time, even when that notion isn't postulated a priori as a fundamental parameter.
The abstract structures now called von Neumann algebras were introduced in 1929 by John von Neumann (1903-1957),
in the special case of rings of operators
acting on Hilbert spaces.
Originally, much of the focus was on the commutative case.
(2014-06-12) GPS Time for the Hobbyist
Ultra-precise real-time clock (RTC) and frequency standard (10 MHz).
The Global Positioning System (GPS) is based on a network of at least 24 active satellites with atomic clocks (cesium or rubidium) onboard. Each satellite keeps broadcasting its own time and space coordinates.
If a receiver gets at least 4 such signals simultaneously, it can work out its own location in space and also synchronize its clock with the broadcast time.
This last function is our main concern here...
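For the record, here's a sketch of the system being solved (the notation is ours, not that of the GPS literature). Each satellite i broadcasts its own position r_i and the emission time t_i of its signal. A receiver at unknown position r, whose clock is off from GPS time by an unknown bias b, records the reception time t of each signal; the speed of light c then yields one equation per satellite:

    || r - r_i ||  =  c (t - t_i) - c b          (i = 1, 2, 3, 4)

Four such equations determine the four unknowns, namely the three coordinates of r and the clock bias b. Solving for b is precisely what synchronizes the receiver's clock with GPS time.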
Since 1972, all standard time systems have been synchronized with one another to within a whole number of seconds.
Legal times normally differ from UTC by a whole number of hours
(in a few rare cases, half-hours or even quarters of an hour).
Normal time-zones are identified by a letter of the alphabet,
often pronounced with the conventions used in two-way radio transmissions.
Thus, "Z" or Zulu time is UTC itself.
"A" (Alpha) is UTC+1, "B" (Bravo) is UTC+2, and so forth (skipping J) thru "M" (Mike) for UTC+12.
For time zones west of Greenwich, the letters used are "N" (November) for UTC-1 thru "Y" (Yankee) for UTC-12. The letters "M" and "Y" thus correspond to the same time-zone, but with different conventions for the change of dates.
The letter "J" (Julie, Juliet or Juliett) is reserved for local time when there's no ambiguity.
Increasingly, that convention is used only for Zulu time (UTC+0)
except in military circles.
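As a quick illustration, the lettering rule is simple enough to code in a few lines of C (a minimal sketch; the function name is ours):

    #include <stdio.h>

    /* Military time-zone letters:  A..M (skipping J) = UTC+1 .. UTC+12,
       N..Y = UTC-1 .. UTC-12,  Z = UTC+0.  'J' denotes local time.     */
    int zone_offset(char c)          /* returns 99 when no fixed offset */
    {
        if (c == 'Z') return 0;
        if (c >= 'A' && c <= 'I') return c - 'A' + 1;     /* +1 to +9   */
        if (c >= 'K' && c <= 'M') return c - 'K' + 10;    /* +10 to +12 */
        if (c >= 'N' && c <= 'Y') return -(c - 'N' + 1);  /* -1 to -12  */
        return 99;                   /* 'J' (local time) or not a zone  */
    }

    int main(void)
    {
        /* "M" and "Y" straddle the date line: */
        printf("M = UTC%+d,  Y = UTC%+d\n", zone_offset('M'), zone_offset('Y'));
        return 0;
    }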
As time goes on, the above table will always remain correct, except for the number of seconds separating UTC from TAI (see link provided for the latest update) because of the so-called leap seconds, which are inserted into UTC at certain dates to keep it in step with the rotation of the Earth, as deduced from astronomical observations.
Coordinated Universal Time (UTC) differs from International Atomic Time (TAI = Temps Atomique International) by a whole number of seconds, adjusted to avoid drifting away from solar time. Mean solar time is based on astronomical observations only. Our best estimate of it is UT1; the current difference between UT1 and UTC (called DUT1) is published by the IERS to a resolution of 100 ms (it's actually known to a precision of 2 ms or so).
Since 1972, UTC has been kept within 900 ms of UT1 ("solar time")
by the insertion of leap seconds at the end of certain predetermined days
(published at least 6 months in advance).
In the UTC system, the last minute of December 31 or June 30 may last 61 seconds.
The statutes would also allow that to happen at the end of March or September but this
possibility hasn't been needed yet.
Likewise, short minutes (59 seconds) are legally possible should the
need ever arise to reflect an increased rate in the rotation of the Earth.
Thus, the current system would allow for an adjustment of up to 4 seconds per year in either
direction, which is much more than what's needed to account for the observed irregularities
in the rotation of the Earth...
The decision to insert a leap second or not at those dates is based on astronomical observations, and making that call is the official duty of the International Earth Rotation and Reference Systems Service (IERS). So far, leap seconds have been inserted at the end of the following UTC months:
UTC months ending with a 61-second minute  (J = June, D = December)

                 0    1    2    3    4    5    6    7    8    9   Total
    1970-1979         D   JD    D    D    D    D    D    D    D     10
    1980-1989         J    J    J         J         D         D      6
    1990-1999    D         J    J    J    D         J    D           7
    2000-2009                             D              D           2
    2010-2019              J              J    D                     3
    2020-2029

(The December 1971 entry records the step adjustment that ushered in the modern UTC system; every subsequent entry is a true leap second.)
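For programs that must convert between UTC and TAI, the history recorded above can simply be hardcoded. Here's a minimal sketch in C (the function name is ours; the list must be extended whenever the IERS announces a new leap second):

    #include <stdio.h>

    /* UTC months (encoded YYYYMM) whose last minute had 61 seconds: */
    static const int leap[] = {
        197206, 197212, 197312, 197412, 197512, 197612, 197712, 197812, 197912,
        198106, 198206, 198306, 198506, 198712, 198912,
        199012, 199206, 199306, 199406, 199512, 199706, 199812,
        200512, 200812, 201206, 201506, 201612
    };

    int tai_minus_utc(int yyyymm)    /* valid from 1972 onward, in seconds */
    {
        int i, n = 10;               /* TAI-UTC was 10 s on 1972-01-01     */
        for (i = 0; i < (int)(sizeof leap / sizeof *leap); i++)
            if (leap[i] < yyyymm) n++;
        return n;
    }

    int main(void)
    {
        printf("TAI - UTC = %d s\n", tai_minus_utc(202401));  /* prints 37 */
        return 0;
    }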
One great unsung feature of some satellite positioning receivers is an ultra-precise pulse-per-second (PPS) output (10% duty cycle).
It's available, in particular, on version 3 of Adafruit's breakout board
featuring MediaTek's MT3339 satellite positioning receiver on a chip.
The long-term stability of this signal matches that of the GPS itself, which is
now synchronized with the network of atomic clocks that provides mankind's official time.
For metrological purposes, we can't rely directly on the pulses in that signal,
because of the jitter described below.
Instead, we'll use it to train a good oven controlled crystal oscillator (OCXO)
monitored over long periods of time (minutes, hours, days, months or even years)
to cancel all known sources of frequency drift.
This setup is known as a GPS Disciplined Oscillator (GPSDO).
One advantage of using a microcontroller for this task is the ability to log
long-term data to monitor the aging of the ovenized crystal itself,
via the long-term evolution of the control voltage necessary
to maintain the OCXO synchronized with GPS time.
The MT3339 receiver is able to take advantage of some augmentations of GPS, including the regional system most relevant to Japan:
The Quasi-Zenith Satellite System
(QZSS)
is a regional satellite based augmentation system
(SBAS)
of three or four geosynchronous satellites proposed by Japan to provide a variety of satellite services,
including an improvement of the positioning performance of GPS
with greater satellite availability in a region covering Japan, Asia and Oceania.
To take full advantage of the PPS signal from this receiver, we must first understand
what it really is. If all you want is to blink an LED with it, all you need
to know is that it's a 1 Hz digital signal (3.3 V logic)
with a 10 % duty cycle
(i.e., 100 ms positive pulses one second apart).
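On an Arduino-class board, for instance, that simple use requires almost no code (a sketch in Arduino-style C; the pin assignments are ours):

    /* Mirror the 100 ms PPS pulse on an LED, once per second.
       Assumes the receiver's PPS output (3.3 V logic) is wired
       to digital pin 2 and an LED (with resistor) to pin 13.   */
    #define PPS_PIN 2
    #define LED_PIN 13

    void setup()
    {
        pinMode(PPS_PIN, INPUT);
        pinMode(LED_PIN, OUTPUT);
    }

    void loop()
    {
        digitalWrite(LED_PIN, digitalRead(PPS_PIN));
    }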
For metrological applications, we examine the timing much more precisely:
The yellow trace at left is the rising edge of the PPS signal,
which is used to trigger the scope (whose screen is thus updated once per second).
The blue trace is a 10 MHz local oven-controlled crystal oscillator
(OCXO) trimmed to be GPS-synchronized in the long run.
Here, we may regard this OCXO as a rock-stable time reference.
What's observed is that the blue trace wanders about slowly across the screen
but never for long... It jumps back and forth in either direction as needed to keep its
origin within an interval of about half a division (that's 10 ns).
Therefore, the yellow trace has a jitter of about 10 ns (which is observed as a jitter of the blue trace only because we had to trigger on the yellow one).
This jitter is clearly due to the fact that the PPS signal is produced by a digital
system with a clock rate of 100 MHz or so (or a multiple thereof).
Triggering on the falling edge causes exactly the same jitter, presumably because the pulse width is a whole number of clock cycles.
Even the best calibrations are never perfect, but the above setup can easily be used
to measure the imperfection of my former manual OCXO calibration
(before automating the process).
Beyond those jitter-induced jumps, the origin of the blue trace may also drift slowly across the screen.
When I first measured that, it took between 282 and 298 seconds
to travel 10 divisions to the left (most of the uncertainty in timing comes from that
10 ns jitter which makes the origin of the trace move back and forth across the "finish line").
This means my OCXO was then too fast by a relative amount equal to the drift
divided by the time it took to achieve it (say, 290 s).
Thus, the frequency of my reference oscillator could be determined in just a
couple of minutes with superb precision (0.03 ppb) using a lowly stopwatch.
The best relative precision we can achieve over short periods of time
is half the jitter (5 ns) divided by the measurement duration.
For longer measurement periods, the precision becomes limited by
the stability of the local OCXO and/or by the precision of the GPS itself.
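The arithmetic behind those numbers is worth spelling out (taking a full division to be 20 ns, since half a division was said above to be 10 ns):

    #include <stdio.h>

    int main(void)
    {
        double drift = 10 * 20e-9;            /* 10 divisions of 20 ns = 200 ns */
        double slow  = drift / 298.0 * 1e9;   /* slowest estimate, in ppb       */
        double fast  = drift / 282.0 * 1e9;   /* fastest estimate, in ppb       */
        printf("OCXO fast by %.2f to %.2f ppb\n", slow, fast);
        /* Prints:  OCXO fast by 0.67 to 0.71 ppb,
           i.e., about 0.69 ppb, uncertain by a few hundredths of a ppb. */
        return 0;
    }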
From a design standpoint, I find it particularly appealing to use the 10 MHz signal
from the OCXO to generate the system clock of the microcontroller which
keeps the OCXO synchronized with GPS.
This way, the frequency of the OCXO itself can be measured with calibrated
software that counts the number of clock cycles between PPS pulses.
For best accuracy, we don't actually count individual clock cycles; instead, we sample the PPS signal at intervals of exactly 10,000,000 clock cycles and determine which way the sampled edge drifts.
That information can then be used to control a digital-to-analog converter
(DAC) whose output will adjust the frequency of the OCXO until the period of
the PPS signal doesn't drift away from 10 million clock cycles for a long time.
This way, the frequency of the OCXO can be readily estimated with a precision equal to 55 ns (half the sum of the jitter and the clock period) divided by the duration of the measurement. The precision is thus 1 ppb after just one minute, and could potentially be 1440 times finer over a whole day (at which point the slight instabilities of the OCXO come into play).
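In outline, the firmware's measure-and-steer loop could look like this (a sketch only; capture_pps_count and dac_write stand for whatever timer-capture and DAC drivers the chosen microcontroller actually provides, and the loop gains are arbitrary placeholders):

    #include <stdint.h>

    extern uint32_t capture_pps_count(void); /* clock cycles between PPS edges */
    extern void     dac_write(int code);     /* 10-bit DAC trimming the OCXO   */

    #define NOMINAL 10000000L                /* 10 MHz counted over 1 second   */

    void on_pps(void)                        /* called once per PPS pulse      */
    {
        static int32_t dac = 512;            /* start at mid-scale             */
        static int64_t accum = 0;            /* accumulated error, in cycles   */

        int32_t err = (int32_t)capture_pps_count() - NOMINAL;
        accum += err;                        /* integral term tracks phase     */

        /* The sign of the correction must match the OCXO's tuning slope,
           which the firmware learns by trial and error (as noted below).
           Here we assume more voltage means higher frequency.            */
        dac -= err + (int32_t)(accum / 16);
        if (dac < 0)    dac = 0;             /* saturated: time to request     */
        if (dac > 1023) dac = 1023;          /* a manual trim (see below)      */
        dac_write(dac);
    }

Called from the PPS interrupt, such a loop nudges the DAC until both the frequency error and the accumulated phase error vanish on average; the gains shown would need tuning to the actual tuning slope of the OCXO.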
A common low-precision DAC (10-bit wide) can be used in this high-precision application,
by making digital trimming a mere correction to good manual trimming.
To do so, perform the manual trimming with a potentiometer whose wiper is connected, via a fairly large resistor, to the output of the DAC set at the center of its range (or else temporarily connect the resistor to whatever voltage corresponds to that setting).
The larger the resistor (compared to the value of the trimming potentiometer), the smaller the adjustment per step will be.
The microcontroller should be able to determine, by trial and error,
whether an increase in voltage increases frequency or decreases it.
It should also be able to request manual trimming when the needed control voltage falls outside the DAC's range. When that condition occurs because of aging, the microcontroller will know in which direction the crystal ages, and can set the DAC near the opposite end before requesting a manual trim (that will make future trim requests less frequent).
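The per-step resolution of that network is easy to estimate. A sketch with made-up component values (3.3 V 10-bit DAC, 1 MΩ series resistor, 10 kΩ trimming potentiometer):

    #include <stdio.h>

    int main(void)
    {
        double vstep = 3.3 / 1024;       /* one step of a 10-bit, 3.3 V DAC   */
        double r     = 1.0e6;            /* large series resistor             */
        double rpot  = 10.0e3 / 4;       /* pot seen from its centered wiper: */
                                         /* two halves in parallel, at most   */
                                         /* a quarter of its total value      */
        double dv = vstep * rpot / (r + rpot);
        printf("about %.1f microvolts per DAC step\n", dv * 1e6);  /* ~8.0 uV */
        return 0;
    }

With an OCXO tuning slope of, say, a few hertz per volt (a typical order of magnitude, not a datasheet value), such microvolt steps translate into frequency increments far below a part per billion, which is why a lowly 10-bit DAC is good enough for this high-precision application.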