Tuesday, June 16, 2015

Physics in flux

With the daily news so often grim, and the 2016 presidential campaign already seeming endless, I find it uplifting, from time to time, to reground myself in less well covered -- but more meaningful and exciting -- dispatches from the frontiers of science. Today: reports and speculations from the frontiers of physics.

A chip off the Moore's Law block
We'll begin with applied physics. Since the Sixties, we've been in a pell-mell race to keep upping the density (read: decreasing the cost and/or size) of electronics. It's that steady progress that brings us such goodies as HDTV and smartphones.

Regularly the prognostication is made that Moore's Law (the doubling of component densities every two or so years) must soon come to a screeching halt. Intuitively, that makes sense -- over the decades, some features of state-of-the-art chips have shrunk to a mere 14 nm across. (Interatomic spacing in crystals is a few tenths of a nanometer -- we're not talking about many atoms across such tiny features.) Technologists, fortunately, keep coming up with clever approaches to sustain the trend. See "Intel: Moore's Law will continue through 7nm chips."
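(A back-of-the-envelope illustration of my own, not from the Intel article: a simple doubling model, seeded with the roughly 2,300 transistors of Intel's 1971-vintage 4004, lands in the same multi-billion-transistor ballpark as 2015's biggest chips.)

    # A toy Moore's Law projection: transistor count doubling every two years.
    def projected_transistors(start_count, start_year, year, doubling_years=2.0):
        return start_count * 2 ** ((year - start_year) / doubling_years)

    # ~2,300 transistors in 1971 compounds to roughly 9.6 billion by 2015 --
    # the right ballpark for the largest chips actually shipping that year.
    print(round(projected_transistors(2300, 1971, 2015)))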

How long can such progress go on? Perhaps until single molecules serve as electronic components. See "Molecular Electronics Takes Large Stride Forward" for the example of single-molecule diodes.

Which are we?
Meanwhile, information technology keeps pushing its virtual nose under more and more metaphorical tents -- even into theories about the nature of the universe. Not the least weird aspect of one theory is that the seemingly 3-D (ignoring time) universe may, in fact, be a 2-D hologram. See "Forget Space-Time: Information May Create the Cosmos."

In the cosmological mainstream, astronomers concluded in recent years that the pace of the universe's expansion since the Big Bang is accelerating, a phenomenon attributed to (unexplained) dark energy permeating ... everything.

How can one know anything about the expansion of a universe billions of years old and billions of light-years across? Indirectly! Light being a universal speed limit, anything seen at a distance of billions of light-years originated in an event billions of years in the past. Using a "standard candle" (the logical equivalent of a light bulb of known brightness -- by comparing that known brightness with the bulb's apparent brightness, one can deduce its distance), astronomers measure vast distances. The standard candle relevant to this discussion is a particular type of bright-and-violent explosion: the Type Ia supernova.
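(To make the logic concrete -- my own sketch, with made-up numbers, not anything from the supernova surveys: the inverse-square law says apparent brightness falls with the square of distance, so a known luminosity plus a measured flux pins down how far away the source is.)

    import math

    # Inverse-square law: flux = L / (4 * pi * d^2), so d = sqrt(L / (4 * pi * flux)).
    def distance_from_brightness(luminosity_watts, observed_flux_w_per_m2):
        """Distance (in meters) to a source of known luminosity and measured flux."""
        return math.sqrt(luminosity_watts / (4 * math.pi * observed_flux_w_per_m2))

    # A hypothetical 100 W bulb whose light arrives at a feeble 1e-9 W/m^2
    # works out to roughly 89,000 m (89 km) away. Swap in the known peak output
    # of a Type Ia supernova and a measured flux, and the same arithmetic yields
    # distances of billions of light-years.
    print(distance_from_brightness(100, 1e-9))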

In brief, the accelerating expansion of the universe has been inferred from observations of standard candles at various deduced distances. But what if the standard ... isn't? If our calibrations are off, our conclusions may also be. See "Is dark energy becoming marginalized?" from which the key quote is: 
... the researchers’ work suggests that the evidence for acceleration is nowhere near as strong as previously suggested – it is closer to 3σ rather than 5σ, and allows for expansion at a constant velocity. Nielsen et al. have come to this conclusion after studying a much larger database of type Ia supernovae – 50 of which were studied in the original work, while this study looks at 740 – that are used as “standard candles” to detect cosmic acceleration.
Type Ia supernova in the making
This doesn't rise to the level of disproving accelerating expansion, but -- if confirmed -- it would surely raise doubts.
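(For readers wondering what those sigmas mean: a rough, simplified reading -- my own gloss, not the researchers' -- treats a sigma level as the odds of the signal being a statistical fluke. Three sigma and five sigma are worlds apart.)

    import math

    # One-sided tail probability of a normal distribution at a given sigma level.
    def fluke_probability(sigma):
        return 0.5 * math.erfc(sigma / math.sqrt(2))

    for sigma in (3, 5):
        print(sigma, fluke_probability(sigma))  # ~1.3e-3 at 3 sigma vs. ~2.9e-7 at 5 sigma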

From the very large to the very small -- but sticking with matters of cosmological significance -- particle physicists continue to study the underlying nature of things. In 2013, you'll recall, physicists at the Large Hadron Collider confirmed their discovery of the Higgs boson. That particle is a manifestation of the Higgs field, which accounts for how many elementary particles acquire their mass. That same year, the LHC was turned off for a major upgrade.

A small part of the (very big) LHC
The LHC has recently been turned back on ("Large Hadron Collider turns on 'data tap' "). With the latest improvements, it's capable of achieving particle-collision energies of 13 trillion electron volts (13 TeV), well above the 8 TeV of its first run. One hope for upcoming experiments is that "super-symmetric" particles will be detected among the higher-energy subatomic-particle debris. In a moment we'll loop back to the expansion of the universe ...

Super-symmetric particles (if they exist) are partners of a sort for the familiar particles, like electrons and quarks, of the Standard Model of particle physics. Confirmation of (some) current theories of supersymmetry (shortened by many physicists to SUSY) might resolve the "cosmological constant paradox." As Wikipedia puts it: "There is no known natural way to derive the tiny cosmological constant used in cosmology from particle physics."

It'll all sort out in the end ...
More of a Band-aid than an explanation, the cosmological constant is a parameter within general relativity. Einstein added the CC to his GR theory to "explain" why gravitation didn't cause the universe to collapse upon itself. (Expansion of the universe had yet to be inferred; a start to time and space in a Big Bang wasn't yet understood.) For the mathematically inclined:  the CC is a constant of integration -- which means no particular value is required to solve GR's system of simultaneous equations.
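(For those same mathematically inclined readers, here is the standard textbook form of the field equations -- not something unique to this post -- with the CC, written Λ, riding along on the geometry side as one extra term:)

    R_{\mu\nu} - \tfrac{1}{2} R g_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8 \pi G}{c^4} T_{\mu\nu}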

Think of the CC as a measure of the energy density of otherwise empty space. Sounds like dark energy, doesn't it? With the right value, the CC "explains" any possible rate of cosmic expansion.

If the LHC finds SUSY particles -- and equally, if it doesn't -- we're in for some interesting new physics.
