The New Science of Clocks Prompts Questions About the Nature of Time, and Finds Limits on Their Accuracy

“It occurred to us that actually a clock is a thermal machine,” […] Like an engine, a clock harnesses the flow of energy to do work, producing exhaust in the process. Engines use energy to propel; clocks use it to tick.

Over the past five years, through studies of the simplest conceivable clocks, the researchers have discovered the fundamental limits of timekeeping. They’ve mapped out new relationships between accuracy, information, complexity, energy and entropy — the quantity whose incessant rise in the universe is closely associated with the arrow of time.

These relationships were purely theoretical until this spring, when the experimental physicist Natalia Ares and her team at the University of Oxford reported measurements of a nanoscale clock that strongly support the new thermodynamic theory.


The first thing to note is that pretty much everything is a clock. Garbage announces the days with its worsening smell. Wrinkles mark the years. “You could tell time by measuring how cold your coffee has gotten on your coffee table.”


Huber, Erker and their colleagues realized that a clock is anything that undergoes irreversible changes: changes in which energy spreads out among more particles or into a broader area. Energy tends to dissipate — and entropy, a measure of its dissipation, tends to increase — simply because there are far, far more ways for energy to be spread out than for it to be highly concentrated. This numerical asymmetry, and the curious fact that energy started out ultra-concentrated at the beginning of the universe, are why energy now moves toward increasingly dispersed arrangements, one cooling coffee cup at a time.

Not only do energy’s strong spreading tendency and entropy’s resulting irreversible rise seem to account for time’s arrow, but, according to Huber and company, they also account for clocks. “The irreversibility is really fundamental,” Huber said. “This shift in perspective is what we wanted to explore.”

Coffee doesn’t make a great clock. As with most irreversible processes, its interactions with the surrounding air happen stochastically. This means you have to average over long stretches of time, encompassing many random collisions between coffee and air molecules, in order to accurately estimate a time interval. This is why we don’t refer to coffee, or garbage or wrinkles, as clocks.
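The statistics behind this point can be sketched directly. Below is a minimal simulation (not from the article; the rate and trial counts are illustrative assumptions) of a "clock" whose events occur at random, exponentially distributed intervals. Averaging over more events tightens the time estimate roughly as one over the square root of the event count, which is why a stochastic process like cooling coffee makes a poor clock on short timescales.

```python
import random
import statistics

def estimate_spread(rate, n_events, trials=2000):
    """Relative spread of a time estimate built by counting n_events
    random (exponentially distributed) events at the given rate."""
    estimates = []
    for _ in range(trials):
        # Total elapsed time over n_events random waiting times,
        # normalized so a perfect clock would always report 1.0.
        total = sum(random.expovariate(rate) for _ in range(n_events))
        estimates.append(total * rate / n_events)
    return statistics.stdev(estimates)

random.seed(0)
# The spread shrinks roughly as 1/sqrt(n): you need many random
# molecular collisions before coffee tells time at all accurately.
for n in (10, 100, 1000):
    print(n, round(estimate_spread(2.0, n), 3))
```

The inverse-square-root scaling is the generic penalty any purely stochastic timekeeper pays, independent of the details of the underlying process.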

We reserve that name, the clock thermodynamicists realized, for objects whose timekeeping ability is enhanced by periodicity: some mechanism that spaces out the intervals between the moments when irreversible processes occur. A good clock doesn’t just change. It ticks.

The more regular the ticks, the more accurate the clock. In their first paper, published in Physical Review X in 2017, Erker, Huber and co-authors showed that better timekeeping comes at a cost: The greater a clock’s accuracy, the more energy it dissipates and the more entropy it produces in the course of ticking.

“A clock is a flow meter for entropy,” said Milburn.

They found that an ideal clock — one that ticks with perfect periodicity — would burn an infinite amount of energy and produce infinite entropy, which isn’t possible. Thus, the accuracy of clocks is fundamentally limited.

Indeed, in their paper, Erker and company studied the accuracy of the simplest clock they could think of: a quantum system consisting of three atoms. A “hot” atom connects to a heat source, a “cold” atom couples to the surrounding environment, and a third atom that’s linked to both of the others “ticks” by undergoing excitations and decays. Energy enters the system from the heat source, driving the ticks, and entropy is produced when waste energy gets released into the environment.


The researchers calculated that the ticks of this three-atom clock become more regular the more entropy the clock produces. This relationship between clock accuracy and entropy “intuitively made sense to us,” Huber said, in light of the known connection between entropy and information.

In precise terms, entropy is a measure of the number of possible arrangements that a system of particles can be in. These possibilities grow when energy is spread more evenly among more particles, which is why entropy rises as energy disperses. Moreover, in his 1948 paper that founded information theory, the American mathematician Claude Shannon showed that entropy also inversely tracks with information: The less information you have about, say, a data set, the higher its entropy, since there are more possible states the data can be in.
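Shannon's formula makes this inverse relationship concrete. The sketch below (an illustration, not code from any of the papers discussed) computes the entropy of a probability distribution in bits: a fair coin, about which you know nothing in advance, has maximal entropy, while a heavily biased coin, whose outcome is nearly certain, has far less.

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), in bits. Higher entropy means less
    prior information about which state the system is in."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(shannon_entropy([0.99, 0.01]))  # biased coin: ~0.081 bits
print(shannon_entropy([0.25] * 4))    # four equal states: 2.0 bits
```

Spreading probability evenly over more states raises the entropy, mirroring how spreading energy over more particles raises thermodynamic entropy.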

“There’s this deep connection between entropy and information,” Huber said, and so any limit on a clock’s entropy production should naturally correspond to a limit of information — including, he said, “information about the time that has passed.”

In another paper published in Physical Review X earlier this year, the theorists expanded on their three-atom clock model by adding complexity — essentially extra hot and cold atoms connected to the ticking atom. They showed that this additional complexity enables a clock to concentrate the probability of a tick happening into narrower and narrower windows of time, thereby increasing the regularity and accuracy of the clock.

In short, it’s the irreversible rise of entropy that makes timekeeping possible, while both periodicity and complexity enhance clock performance. But until 2019, it wasn’t clear how to verify the team’s equations, or what, if anything, simple quantum clocks had to do with the ones on our walls.


The nanoscale clock that Ares and her team built centers on a vibrating membrane. It isn’t a quantum system, but it’s small and simple enough to allow precise tracking of its motion and energy use. “We can tell from the energy dissipation in the circuit itself how much the entropy changes,” Ares said.

She and her team set out to test the key prediction from Erker and company’s 2017 paper: that there should be a linear relationship between entropy production and accuracy. It was unclear whether the relationship would hold for a larger, classical clock, like the vibrating membrane. But when the data rolled in, “we saw the first plots [and] we thought, wow, there is this linear relationship,” Huber said.

The regularity of the membrane clock’s vibrations directly tracked with how much energy entered the system and how much entropy it produced. The findings suggest that the thermodynamic equations the theorists derived may hold universally for timekeeping devices.


One major aspect of the mystery of time is the fact that it doesn’t play the same role in quantum mechanics as other quantities, like position or momentum; physicists say there are no “time observables” — no exact, intrinsic time stamps on quantum particles that can be read off by measurements. Instead, time is a smoothly varying parameter in the equations of quantum mechanics, a reference against which to gauge the evolution of other observables.

Physicists have struggled to understand how the time of quantum mechanics can be reconciled with the notion of time as the fourth dimension in Einstein’s general theory of relativity, the current description of gravity. Modern attempts to reconcile quantum mechanics and general relativity often treat the four-dimensional space-time fabric of Einstein’s theory as emergent, a kind of hologram cooked up by more abstract quantum information. If so, both time and space ought to be approximate concepts.

The clock studies are suggestive, in showing that time can only ever be measured imperfectly. The “big question,” said Huber, is whether the fundamental limit on the accuracy of clocks reflects a fundamental limit on the smooth flow of time itself — in other words, whether stochastic events like collisions of coffee and air molecules are what time ultimately is.


Source: The New Science of Clocks Prompts Questions About the Nature of Time | Quanta Magazine
