The term “ether” is unique in the history of physics not only because of the many different meanings in which it has been used but also because it is the only term that has been eliminated and subsequently reinstated, though with a different connotation, by one and the same physicist [Einstein]. – Max Jammer, in the Foreword to Einstein and the Ether (2000)
Physics is considered the hardest of the sciences, in terms of its degree of precision and certainty regarding its data and theories. Physics is not, however, as “hard” a science as most people like to think. In fact, physical theories, like all theories, are human creations that approximate reality and, as such, are never the whole story. There is always more than one theory that can explain the available evidence – far more than one, in fact. The difficulty is finding a theory that is self-consistent, explains the evidence at issue, and fits within a broader explanatory framework.
In reading books like Brian Greene’s The Fabric of the Cosmos and The Elegant Universe, or Lee Smolin’s The Trouble With Physics, we realize that the one certainty in physics is this: our physical theories will continue to change over time. We will never – literally – have a complete description of the universe and its workings because we simply don’t know the full extent of what we don’t know. And we will never know the full extent of what we don’t know.
It’s important to keep in mind also that all theories, including physical theories, rest on assumptions about the nature of reality, generally called “postulates” or “principles.” If the assumptions turn out to be wrong, the theory will very likely be wrong. Finally, theories can never be proven – they can only be supported by experimental evidence, or disproved by evidence that contradicts them. The degree to which theories are rejected as invalid depends on the degree to which experimental evidence disproves key features of the theory.
For example, even though there is a fairly strong consensus among cosmologists that the universe is expanding at a faster rate than previously predicted by general relativity (the prevailing theory of gravity), very few physicists were willing, based on this difficulty alone, to reject general relativity as a theory. Rather, various fixes, including the now widespread concept of “dark energy,” have been developed to reconcile general relativity with the unexpected data about accelerating expansion (which led to the 2011 Nobel Prize in physics) and other gravitational anomalies. Dark matter is a similar patch to general relativity, for which there is even less evidence.
There are many other possible views, however, that can explain the data as well or better. Reg Cahill, my interviewee for this first installment of a new series on alternative cosmologies, for example, has pointed out that the supernovae data relied upon for the accelerating expansion model of the universe can equally support the view of a universe expanding at a constant rate. A later interviewee, Jayant Narlikar, a supporter of the quasi steady-state cosmology, believes that we may be fundamentally misinterpreting Hubble’s law on redshift, which is the basis for the prevailing view that the universe is expanding.
While even scientists and science journalists often speak sloppily about theories being proven, it simply is not the case that any theory – even a theory that rises to the level of a “law,” due to very strong experimental support – is ever proven. Richard Feynman, a Nobel Prize-winning physicist, stated: “[E]ven those ideas which have been held for a very long time and which have been very accurately verified might be wrong …. [W]e now have a much more humble point of view of our physical laws – everything can be wrong!”
There remains a chorus of discontent over the state of physics today. David Gross, another Nobelist and a physics professor at UC Santa Barbara, ended a 2005 conference on string theory, the dominant research interest for most theoretical physicists over the last two decades, by saying: “We don’t know what we are talking about … The state of physics today is like it was when we were mystified by radioactivity … They were missing something absolutely fundamental. We are perhaps missing something as profound as they were back then.”
Reg Cahill is a professor of physics at Flinders University in Adelaide, Australia, who has for many years challenged the mainstream physics consensus in many ways. He’s published widely, but generally in “dissident” physics journals (yes, there are dissident journals in physics) because he challenges ideas that are generally considered to be settled. While Cahill is a proud maverick in his field, he has nevertheless received recognition for his achievements: he was awarded a Gold Medal in 2010 by the Telesio-Galilei Academy of Science for his development of the “process physics” model that is his signature achievement.
I had the pleasure of interviewing Reg, by email, on his process physics, why he believes Einstein got it wrong on gravity, and why a new understanding of the nature of mind is key for a more accurate physics. I am also working on a book about Reg, his physical theories and process philosophy as an alternative to the prevailing materialism of our era. I am highly intrigued by Reg’s ideas, but I am not equipped at this time to make my own determination about their validity, let alone their superiority to the mainstream views. Reg seems to me to be overly categorical in his pronouncements at times, and can tend toward the grandiose at other times. It is, however, clear to me that Reg’s voice should be part of today’s discussions about cosmology and physics more generally.
What led you into physics?
At a very young age, maybe five or six, I was fascinated by questions of how things worked – my first case was the radio. I thought it was an astounding process. Others, it seemed to me, didn’t ask such questions. From there my interest in physics simply developed. On leaving school I was initially planning on being an electrical engineer, but dropped that very quickly when the University of New South Wales (in Australia) offered me a scholarship to do a physics degree, and then stay to do a PhD. During the latter years of school and early years of university I was involved with radios, etc., as a hobby – and built my own oscilloscope, and later modified a military aircraft scope to work on mains voltage.
Who is the most influential thinker with respect to your own worldview, or your own brand of physics?
I don’t think there was any one person. My research over the years has simply gone deeper into fundamental issues – first doing low-energy nuclear physics, then high-energy particle physics: quarks and gluon theory, and finally now into developing new understandings of space and matter – at the “process physics” level. So this was an ongoing process of following the clues deeper and deeper. I came at the process physics point of view independently, and then was amazed to learn that Alfred North Whitehead, and other similar process thinkers, including Heraclitus of ancient Greece, had arrived at the same sort of thinking, but by very different routes.
Could you describe briefly your “process physics” and how it differs from mainstream views in physics?
My process physics perspective is fleshed out in a number of papers and my 2005 book, Process Physics. Basically, conventional physics uses a syntax-based model – where symbols stand for things: matter, space, time … The physical laws are encoded in rules of manipulations of these symbols [as described by the equations of various physical theories]. In process physics, to the contrary, one is modeling, at the deepest levels, that nature is all about process – processes involving the generation of patterns, their interaction by means of pattern recognition, and change – so this is an information-based model.
It is not, however, about our knowledge or information about reality; rather, it is about interacting patterns, where the structure of the patterns determines their interaction and evolution over time. This is a semantic information theory, whereas conventional physics uses syntactical information. Also, process physics does not begin by assuming the existence of space, matter, etc. It assumes only a cosmic-indexing type of time, which is emergent. These phenomena, in my theory, emerge from the more fundamental level of reality.
Physics has been in the news a lot lately with the discovery of evidence supporting the existence of the Higgs boson. Could you explain this finding and how it impacts, if at all, your process physics?
The standard model [of particle physics] starts by assuming that the equations have a certain symmetry. That symmetry requires all particles to be massless – which they are not. To avoid that outcome, a new field is introduced which, in an ad hoc way, forces most particles to have mass. So the whole procedure lacks elegance. This field in turn results in the supposed existence of a new particle – the Higgs boson. Given the manifest inelegance of this model I would be very surprised if the claimed discovery of the Higgs boson survives scrutiny. As for process physics, I doubt the new Higgs field data has any significance.
Is the Higgs field a modern name for what was previously called the “ether” and perhaps wrongly dismissed? How does your work relate to ether-based theories of physics?
No. The “ether” in process physics has been replaced by the term “dynamical space”. In conventional physics, space is a geometrical “container,” to the extent that its existence is even acknowledged. The 19th Century notion of the “ether” was considered to exist in the “container” of space. The “dynamical space” is, however, a complex fractal system, which only manifests geometrical properties at the higher level. Dynamical space is not just a concept. It has been detected repeatedly for more than 120 years without being widely acknowledged. Contrary to the widespread views on this issue, the speed of light is in fact anisotropic [not constant for all observers], when measured by an observer moving through the dynamical space. For example, the famous Michelson-Morley experiment did, in fact, when analyzed correctly, find evidence for light anisotropy. And the dynamics of dynamical space have also been discovered. I would expect that it is the dynamics of this new type of space – in particular its detected fractal texture – which causes particles to have mass. I am working on this conjecture.
There are many notions of the “ether,” and ultimately terminology is far less important than the concepts they convey. However, you write in your recent paper, suggesting that Einstein’s special relativity theory has been falsified, and Lorentz’s competing theory of relativity supported, the following as your introduction: “Physics has failed, from the early days of Galileo and Newton, to consider the existence of space as a structured, detectable and dynamical system, and one that underpins all phenomena…” This sounds to me like the ether that Lorentz himself advocated, so is it not fair to call your neo-Lorentzian theory a type of ether theory?
Aether theories are [generally] dualistic – they have both a space and an aether embedded in that space. Indeed, physicists find it almost impossible to abandon this dualism, except in special relativity and general relativity where both space and aether were abandoned in favor of space-time [a four-dimensional reality that views time as akin to an additional spatial dimension]. Lorentzian relativity is also a dualistic theory with an aether embedded in a space, but with time a separate phenomenon. In neo-Lorentzian relativity [which is a fair characterization of my process physics] we abandon this dualism by positing a structured dynamical space [as the fundamental level of reality]. This dynamical space appears to be fractally textured – according to experiment and theory. This dynamical space is different both from the older notion of space (as a perfect geometrical system) and from an aether, as some form of particulate system embedded in and possibly moving through the geometrical space. In neo-Lorentzian relativity, the “geometry” of the dynamical space is emergent – including its three-dimensionality [and other properties].
More generally, hasn’t physics come around in recent decades to the idea of space as a real entity and not simply a vessel for matter and energy? Mainstream publications like Brian Greene’s book The Fabric of the Cosmos focus on the fact that empty space has certain properties. Einstein also later repudiated his own suggestion, in his seminal 1905 paper on special relativity, that space has no properties.
To the contrary, conventional physics focuses on spacetime, not space as a separate entity. The very concept of “space” is actually rejected by special relativity and general relativity, although sloppy language often confuses the issue. So referring, in special relativity and general relativity, to empty space as having properties is actually misleading. In other words, what one observer identifies as a spatial part of spacetime is different from another observer’s space part of that same spacetime.
Why is relativity theory so hard to challenge in mainstream physics journals? Are physicists generally group-thinkers who are highly resistant to challenges from the fringes, as respected thinkers like Lee Smolin and Thomas Kuhn have suggested?
In my view, few physicists actually understand special or general relativity. Most physicists’ complete belief in these theories is just that: belief without deep understanding – and they defend that belief with ferocity. Indeed, most physicists appear not to accept the scientific method – namely that ongoing experiments should decide whether a theory survives or not. Of course, special relativity, in particular, has been the foundation of physics for more than 100 years – and most physicists would say that its falsification would be incredibly unlikely. However, my recent paper on neo-Lorentzian relativity (“Dynamical 3-Space: neo-Lorentz Relativity”) shows just that – that special relativity is exactly derivable from Galilean relativity, and special relativity does not do the job claimed for it – meaning that its predictions are inconsistent with experiment.
Can you describe briefly your recent work suggesting that Einstein’s Special Relativity has been falsified?
Special relativity, rather than being a fundamentally new theory, is exactly derivable from Galilean relativity by an exact linear change of space and time coordinates, which mixes the Galilean space and time coordinates. So it turns out that there is no new physics in special relativity that is not already in Galilean relativity. In particular, the various so-called relativistic effects (length contraction, time dilation …) are merely coordinate artifacts. Actual physical phenomena cannot emerge from a mere change of coordinates.
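For reference (this is standard textbook material, not from the interview), the Lorentz transformation is exactly the kind of linear mixing of space and time coordinates described above, for an observer moving at speed v along the x-axis:

```latex
x' = \gamma\,(x - vt), \qquad
t' = \gamma\left(t - \frac{vx}{c^{2}}\right), \qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
```

Cahill’s claim is that because these relations are a linear, invertible change of coordinates, the effects they encode are coordinate artifacts rather than new physics.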
One can also show experimentally that these supposed “relativistic effects” are not those actually detected in experiments. One example is that the length contraction effect in neo-Lorentzian relativity is determined by the speed of an object relative to the dynamical space (which is some 500 km/s for an object at rest on earth), whereas the special relativity length contraction is determined by the object’s speed with respect to the observer, which in most experiments is 0 km/s. This extreme contrast in predictions can be directly checked by comparing results from Michelson interferometer experiments with spacecraft earth-flyby Doppler shift data: The outcome is that the special relativity prediction is falsified, and the neo-Lorentzian relativity prediction is confirmed.
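To give a rough sense of the scale of the contrast Cahill describes, here is a small numerical sketch (my illustration, not from the interview) using the textbook Lorentz factor and the ~500 km/s figure quoted above:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s (CODATA value)

def gamma(v: float) -> float:
    """Standard Lorentz factor for speed v in m/s."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# Speed of an earthbound object through the dynamical space,
# per the ~500 km/s figure quoted in the interview.
v_space = 500e3

# Fractional length contraction implied at that speed; at the
# 0 km/s observer-relative speed of most lab experiments it is zero.
contraction = 1.0 - 1.0 / gamma(v_space)
print(f"gamma(500 km/s) = {gamma(v_space):.9f}")
print(f"fractional contraction = {contraction:.2e}")  # on the order of 1e-6
```

A parts-per-million effect is tiny, but it is the kind of difference interferometer experiments are designed to resolve, which is why the two predictions can be discriminated at all.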
Your process physics aligns well with Alfred North Whitehead’s work in philosophy, mathematics, and physics. Whitehead was a well-known panpsychist in that he believed that all matter has some mind associated with it, such that as matter complexifies so mind complexifies. How important is panpsychism in your process physics? Is this idea captured in your notion of “semantic information”?
I developed process physics before I became aware of Whitehead’s philosophy – I was more aware of the work of Heraclitus at that time. Nevertheless, I was happy to acknowledge the philosophical ideas of these and other process philosophers, when I became aware of them, and I now have an ongoing working relationship with various process philosophers. One should note that these philosophers of course had no detailed mathematical implementation, theory, or model for their philosophies – and this is what my process physics attempts to provide. I also suspect that panpsychism is a valid property of reality, and yes, it is an aspect of “semantic information.”
John Archibald Wheeler made famous the notion that information may be fundamental to reality with his phrase “it from bit.” Do you agree with this idea and if not how would you modify it?
I agree, although Wheeler did not have an implementation mechanism or [detailed] model. In any case one must carefully distinguish between syntactical information, which I suspect is what Wheeler was referring to, and semantic information.
What do you mean by “semantic information” and how does this idea relate to the philosophical view known as panpsychism?
Process physics is about self-generated patterns – and how these patterns interact. So the theory and the reality it models are about active information – information has meaning for the system, and so is called semantic information. Syntactical information is that stored by way of symbols, and then “interactions” are by way of rules, i.e., equations. Equations always presuppose some a priori syntactical rules, and so cannot be fundamental. Semantic information, being active, suggests that the universe is self-aware in some manner, and at all levels. This is my preferred concept of panpsychism.
At the risk of beating this horse to death, another question on the nature of the ether and ether theories: I see the ether concept, or what you call dynamical space, as central to the development of a more ideal future physics, and this is one of the main reasons I was intrigued by your process physics when I first came across it. Einstein stated in a 1919 letter to Lorentz that “with the word ether we say nothing else than that space has to be viewed as a carrier of physical qualities.” Do you agree with Einstein here? Would you agree that your “dynamical space” could be described in the same way, as a carrier of physical qualities, which are necessary for an accurate view of nature?
The (new) ether is, in my theory, a dynamical system, the “dynamical space” I’ve described above – which has a very complicated structure at the deepest level. I describe it as a “quantum foam,” meaning that at a deep level the dynamical space is describable by a wave-function whose time evolution is described by a Schrödinger-type theory. On larger scales the dynamical space can be described as being somewhat geometrical, i.e., having three dimensions, etc. It is this aspect that we use as spatial coordinates x, y, z. I would agree with Einstein’s above statement except that dynamical space is not the “carrier” of these properties; rather, disturbances of the dynamical space are in fact what we generally call “physical stuff.”
You stated above that you are working on the concept that the fractal nature of dynamical space may be the underlying reason that particles have mass. Can you flesh out this idea and contrast it with the Higgs field concept that has gained some recent support?
Wave functions propagating through a fractal space will have their energy changed. My conjecture is that this is equivalent to giving “mass” to the wave function. In the Higgs model there is no such structured space, only a smooth space-time, and so the Higgs field is, incorrectly in my view, constructed to provide mass to massive particles.
You have been a consistent critic of Einstein’s relativity theories. Can your process physics be viewed as a full substitution for Einstein’s theory of gravity, general relativity? If so, what kind of real world/technological changes would this substitution lead to?
This is a complete change: Einstein’s special relativity and general relativity have failed in almost every case to explain the observed data – and this contrary evidence grows stronger every day. The process physics perspective could lead to a fundamental revolution in physics – and there isn’t much that will not be changed if these ideas are adopted. Process physics will also have impacts outside of physics – such as in providing a firmer theoretical basis for non-local interactions [“entanglement”], which are often denied by physics at present, and for a broader interconnectedness of the universe that is not currently acknowledged.