Speculations on Cosmology

I am not educated in cosmology. I learned a bit about General Relativity (the field of physics most relevant to cosmology) in grad school, but I confess that I never really understood any of the math. Characterizing my grasp of cosmology on a scale of 0 to 10, with 0 representing the average American, 1 representing the average college-educated American, 2 representing the average science geek, and 10 representing Stephen Hawking, I’d put myself at 3.

I’ve been reading a book about cosmology (“On the Origin of Time: Stephen Hawking’s Final Theory”, by Thomas Hertog) and wrote up some comments on its notions of life, but continued reading inspires a general sense of intellectual unease. I recognize that I am grossly ignorant of the subject matter, but all my scientific intuitions are offended by the reasoning offered in the book. This unease induced me to “put my money where my mouth is”: I need to formulate my own thoughts on some of these issues. Only then can I identify the flaw in my thinking, or find something better.

Foundations
I begin by stating the foundations of my thinking, because I approach some aspects of physics in rather unconventional ways. For example, I have never accepted Schroedinger’s Equation as the essence of quantum mechanics. I see it as more akin to an engineering formula: you can use it to successfully predict all sorts of things about the universe, but the ability to predict a phenomenon is not the same as understanding it. For me, the essence of understanding quantum mechanics lies in the Uncertainty Principle, which is expressed most commonly as:

∆x ∆p ≥ h

In English, this says that, if you attempt to measure the motion of a particle, the uncertainty in your measurement of its position, multiplied by the uncertainty in your measurement of its momentum, will always be at least 6.62 × 10^-34 joule-seconds. (Garsh, it’s been nearly 50 years since I last used that 6.62 number, yet I still remember it, even though I have difficulty remembering people’s names.)
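To get a feel for the size of that limit, here is a little back-of-the-envelope sketch of my own in Python (the numbers are standard order-of-magnitude values, and I’m using the crude ∆x ∆p ≥ h form above rather than anything more refined):

h = 6.62e-34           # Planck's constant, in joule-seconds
m_electron = 9.1e-31   # mass of an electron, in kilograms
dx = 1e-10             # position uncertainty: roughly the diameter of an atom, in meters

dp = h / dx            # smallest allowed momentum uncertainty, in kg*m/s
dv = dp / m_electron   # the corresponding velocity uncertainty, in m/s
print(dv)              # roughly 7 million meters per second

Pin an electron down to atomic dimensions and its velocity becomes uncertain by a few percent of the speed of light. For anything much bigger than an atom, the limit is far too small to notice, which is why we never trip over it in daily life.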

The idea behind the Uncertainty Principle is that the universe is not absolutely deterministic; there is an inherent uncertainty in everything. (Sorry, Martin Luther, you were wrong. God has not determined in advance who goes to heaven and who goes to hell, because he instituted the Uncertainty Principle.)

But there’s another way to interpret the Uncertainty Principle: it means that there’s a fundamental limit on the amount of information we can have about the motion of any particle. I’ll be building on this concept later.

By the way, many years ago I noticed an odd similarity: there are only two fundamental laws of physics that are expressed as inequalities (that is, using the ‘≥’ sign instead of the ‘=’ sign). Those two fundamental laws are the Uncertainty Principle and the Second Law of Thermodynamics. This got me thinking, and I eventually nailed it down: the Second Law of Thermodynamics is a Quantum Mechanical Effect.

The second unconventional idea is an extension of a profound realization by Albert Einstein. He started with the prediction that gravitational fields bend the path of light. That is, the path of light from a distant star that passes very close to the sun will be slightly bent, and so we will see the star in an apparently different position. This prediction was verified in 1919 by observations made during a solar eclipse. Einstein noted that we use the path of a ray of light to gauge whether something is straight.

[Figure: a curved ruler]

Nope, that ruler isn’t straight, is it?

In other words, the path of a ray of light DEFINES the straightness of space. The geometry of space is defined by the path of light. Far out, huh?

I extended Einstein’s idea (without obtaining his approval). I assert that the width of space is defined by objects. Think of it this way: you’re an astronaut floating all alone in space. You want to measure a length of space. But you can’t do this without at least two material objects. You must put down a marker at one place and another marker at another place. To put it in simple terms, you cannot measure the distance between A and B if there’s nothing at A and nothing at B. Objects DEFINE the span of space.

In the same way, time is measured by events. Clocks go “tick - tock”. Without the tick and the tock, you don’t have any way to measure time. It simply doesn’t exist without events to mark it.

A Grand Leap
Now I’m going to take these foundational ideas and exalt them into grand laws:

1. The fundamental constituent of the universe is information, as measured in bits.

No, the fundamental particles of the Standard Model are not the most basic components of the universe. Photons, electrons, quarks, and all the rest are not the most basic things in the universe; bits are.

Of course, any person even half-sane will immediately ask, “Oh, yeah? And how are those bits transformed into other stuff?” That’s a good question, and I don’t have an answer; I instead declare that there must exist some process by which bits are manifested as the components that we observe. Newton didn’t try to explain where gravity comes from; he merely showed what it does. In weakly similar fashion, I don’t attempt to explain how patterns of bits are manifested as observable components of the universe.

Here’s one way of thinking about it. Here are some bits: 01000001. Just plain old bits, right? But wait! In the ASCII code, those bits mean “A”. Here are some more bits: 00011000001001000100001001111110010000100100001001000010. Gee, that’s a LOT of plain old bits. But let me organize them another way:

00011000
00100100
01000010
01111110
01000010
01000010
01000010

Let me represent the bits with blocks of black and white:

[Figure: the seven rows of bits rendered as black and white blocks, forming a blocky letter A]

Those bits manifest themselves this way as a clunky “A”. Do you know how a computer transforms those bits inside memory into an “A” on the screen? I don’t either. I know some of how it does it, but not all. 
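If you’d like to watch that manifestation happen on your own machine, here is a tiny Python sketch of my own (nothing canonical about it) that turns the first byte into the ASCII “A” and the seven bytes into the blocky picture:

bits = "01000001"
print(chr(int(bits, 2)))    # interpret the byte as ASCII: prints A

rows = [
    "00011000",
    "00100100",
    "01000010",
    "01111110",
    "01000010",
    "01000010",
    "01000010",
]
for row in rows:
    # print a '#' for every 1 bit and a '.' for every 0 bit
    print("".join("#" if bit == "1" else "." for bit in row))

The computer’s own path from bits in memory to glowing pixels is longer and stranger than this, of course, but the principle is the same: some process takes the bits and manifests them as something we recognize.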

In fact, almost all of what a computer does involves using bits to represent things like how much money you have in the bank, where Pike’s Peak is on a map, what color a lemon is, the shape of a car, the voltage in a battery, and a million other things. The computer can represent anything in the real world as a collection of bits. That representation can be clunky, like our “A”, or it can be pretty accurate. The important thing is that the computer takes bits and “manifests” them into things from the real world.

Here comes that half-sane person again: “So, you’re saying that we really do live inside The Matrix, and reality is all just a big computer simulation?”

No, I’m not saying that. I’m saying that Reality has two fundamental types of components: bits and processes. The bits are absolutely real; the processes are absolutely real. The processes use the bits to make Reality. It isn’t a simulation: it’s real. There are no agents, no Merovingian, no Zion, no Architect. It’s just real, in the same fashion that dirt is real.

2. The number of bits defining reality is finite and unchangeable.
Physicists will recognize this as a “conservation law”. “But why,” they will ask, “must information be conserved? Why couldn’t it increase or decrease over the course of time?”

Let’s start with a simple example. Let’s suppose that we desire to be able to predict the future position of a moving particle in space with high precision and accuracy. So we make an initial measurement of its position and momentum (momentum is just the particle’s mass multiplied by its velocity). We use that initial measurement to predict its future position, say, a million years in the future. When the appointed hour arrives, we measure it again. We know how far it traveled, and we divide that immense distance by a million years, yielding an extremely precise determination of its true velocity, right?

Wrong! The uncertainty in its velocity as measured the first time means that our prediction of its position a million years into the future will probably be way off. We’ll likely never find that particle. The point here is that, if we COULD carry out such a measurement, we could obtain immense amounts of information. In other words, without the Uncertainty Principle, information would in effect “draw interest” over the course of time: the longer we wait, the more bits we get.
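To put some rough numbers on this (they are my own illustrative choices, nothing more), imagine that the particle is a single hydrogen atom and that our initial measurement pins down its position to within a micron:

h = 6.62e-34            # Planck's constant, in joule-seconds
mass = 1.7e-27          # mass of a hydrogen atom, in kilograms
dx = 1e-6               # initial position uncertainty: one micron, in meters
seconds = 1e6 * 3.15e7  # one million years, expressed in seconds

dv = h / (mass * dx)    # velocity uncertainty forced on us by the initial measurement
drift = dv * seconds    # how far our prediction can wander over a million years
print(dv, drift)        # about 0.4 m/s, and a drift of roughly 10**13 meters

That drift is dozens of times the distance from the Earth to the Sun, all from a measurement that sounded perfectly respectable. Good luck finding the atom.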

There are endless variations on this basic experiment, using chemical reactions or nuclear explosions or objects moving at close to the speed of light. The basic concept is universal: you cannot generate more bits by just waiting. 

Yes, it is possible to obtain lots of bits in the measurement if you expend lots of bits in supporting measurements. If you stationed millions of observers making millions of observations of the moving particle as it traversed the universe, you could end up with a very precise AVERAGE velocity. But this would require the expenditure of lots more bits by the additional observers, and wouldn’t tell you anything more about the FUTURE motion of the particle than is permitted by the Uncertainty Principle.

“But why can’t the total amount of information in the universe decline with time? After all, that’s what the Second Law of Thermodynamics says.”

Nope, that’s not what the Second Law says. Let’s use the textbook example of the Second Law at work: a thermally isolated, closed container with a bunch of gas molecules all crowded together in one corner. We release the molecules and they immediately spread all through the container. The entropy of the system increases dramatically. That’s the Second Law at work.

But the information content of the container has not changed. Consider: how would you specify the state of the gas in bits? Well, you’d want six numbers for each molecule: the three spatial coordinates and the three components of its velocity. (Let’s ignore finer points like rotation, internal energy levels, and so forth.) You’d need those six numbers for each and every molecule when you start the experiment. At the end of the experiment, when the molecules are all randomly distributed through the container, you STILL need six numbers to specify the state of each molecule. In terms of entropy, we have seen a huge increase. In terms of information, nothing has changed.
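Here is a toy version of that bookkeeping in Python, just to make the accounting concrete (five make-believe molecules in a unit box, with arbitrary units, purely for illustration):

import random

N = 5    # a handful of make-believe molecules

# "Start": every molecule crowded into one tiny corner of the box
start = [(random.uniform(0, 0.01), random.uniform(0, 0.01), random.uniform(0, 0.01),
          random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]

# "End": the same molecules spread throughout the whole box
end = [(random.uniform(0, 1), random.uniform(0, 1), random.uniform(0, 1),
        random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]

# Either way, the state takes exactly six numbers per molecule to write down
print(len(start) * len(start[0]), len(end) * len(end[0]))    # prints 30 30

The entropy of the “end” state is enormously higher, but the ledger of numbers needed to describe it is exactly the same length.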

The Big Bang: “In for a penny, in for a pound”
Since I’ve gone wild with these speculations, I might as well keep going with even more grand crazy speculations. Everybody wonders about how the Big Bang started. Non-physicists always ask “What came before the Big Bang?”, to which physicists just shrug their shoulders. They wouldn’t understand anyway.

For me, the striking thing is the kinship of the Big Bang with black holes. Physicists are also taken with this kinship, and in fact the latest theories of cosmology attempt to unite these two phenomena. I, however, have a much simpler (more naive) way of perceiving their kinship. I note that the Big Bang inserted lots of stuff (I say it’s information) into the universe, and black holes remove lots of stuff (again, information, according to Crawford) from the universe. I combine the two in the most simplistic way possible: black holes cause Big Bangs.

Yes, yes, there are plenty of objections to this notion. But let me explain it first. We’ll follow the causal sequence. When a black hole first forms, it suddenly removes a large amount of mass-energy from the universe. That stuff is gone and is never coming back. Where does it go? At first, the answer was “Into the singularity at the center of the black hole.” But of late there appear to have been numerous whispers that there’s got to be more to it than that. My whispering (‘mumbling’ is probably the better verb) is that all that stuff forms a new universe inside the singularity.

How can that be? Harken back to the place above where I wrote that “I assert that the width of space is defined by objects.” You see, since (by my claim) the real constituent of the universe is information (bits), the black hole squishes all those electrons, mesons, photons, etc. into nothingness, but the bits remain untouched. All those zillions of bits remain unperturbed inside the singularity, where they remanifest into something different, something that fits inside the singularity. That stuff is the same primordial stuff that existed at the first instant of our own Big Bang: ylem? concentrated chocolate pudding? But the instant it becomes real, the Uncertainty Principle asserts itself. If we think of this super-hyper-mega-concentrated stuff as energy, then we apply the energy version of the Uncertainty Principle:

∆E ∆t ≥ h

This form of the Uncertainty Principle says that, for any spot in space, the uncertainty in the amount of energy contained in that spot, multiplied by the uncertainty in the amount of time consumed in our measurement, is greater than or equal to that constant. The concentrated stuff represents so much energy that it can remain in that state for only an extremely tiny period of time. The blob of concentrated chocolate pudding must get really big really, really fast. This is what cosmologists call “inflation”. It happened with our own Big Bang, and they’re working on the details of how it happened.
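To see just how tiny that period is, here is a crude order-of-magnitude sketch (the mass figure is a commonly quoted ballpark for the observable universe, used here purely for illustration):

h = 6.62e-34      # Planck's constant, in joule-seconds
c = 3e8           # speed of light, in meters per second
mass = 1e53       # rough order-of-magnitude mass of the observable universe, in kilograms

dE = mass * c**2  # treat all of that mass-energy as the energy uncertainty, in joules
dt = h / dE       # the characteristic time scale that goes with so large an uncertainty
print(dt)         # roughly 10**-103 seconds

Whatever that number means physically, it is not a state of affairs that can sit around waiting; hence the violence of inflation.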

The inflation creates nasty problems for my wild speculations. Why did it stop? My hypothesis posits that the amount of energy doesn’t change, so the uncertainty in the energy shouldn’t change either, right? If so, then the energy flavor of the Uncertainty Principle applies just as fiercely today, 14 billion years later, as it did at the very beginning. Oops.

My guess is that the original concentrated chocolate pudding was all energy, but as the universe expanded, that energy was converted into something else: dark energy, perhaps? dark matter? concentrated butterscotch? I don’t know. There are a lot of possibilities here and the speculations are piling on top of each other like exponents in an equation.

But there’s yet another objection our half-sane friend might raise: that stuff can’t exist inside the singularity because the singularity is just too tiny. Harken back to the paragraph ending with “Objects DEFINE the span of space.” The objects that are manifestations of the bits are themselves what defines the span of the space. This leads me to conclude that, yes, there is an entire universe inside the singularity. That universe is rather like the TARDIS in the science fiction television show “Doctor Who”, which is bigger on the inside than on the outside. Inside the singularity of the black hole is a full-size universe.

Thus, each and every black hole generates its own universe. There is a catch: that universe contains only as much information as went into the black hole. Therefore, every universe is a subset of the universe containing the black hole that spawned it. This means that, as the sequence proceeds, new universes get smaller and smaller. However, there is a lower limit to the size of almost all black holes, which in turn means that each new universe has a minimum amount of information. Is such a limitation due to the effect or to the cause? I don’t know.

So much for now. I’m sure that readers will find much in this essay that is preposterous. I offer it only as a stimulant for thought. Can you think of a clean, clear argument that invalidates this wacko concept? You can find my email address above.