Overview
- If the universe is only 6,000–10,000 years old, as young-earth creationism claims, light from galaxies billions of light-years away should not yet have reached Earth—yet we observe it routinely, creating what is known as the distant starlight problem.
- Multiple independent methods—stellar parallax, Cepheid variable stars, Type Ia supernovae, and cosmological redshift—converge on cosmic distances measured in billions of light-years, and no proposed young-earth solution has survived scientific scrutiny.
- Proposed solutions including c-decay, white hole cosmology, anisotropic synchrony conventions, and the mature creation hypothesis each fail on empirical, theoretical, or theological grounds, and the problem remains widely acknowledged as one of the most intractable challenges for young-earth models.
One of the most straightforward challenges to young-earth creationism comes not from biology or geology but from astronomy. The observable universe spans approximately 93 billion light-years in diameter, and telescopes routinely detect light from galaxies and quasars billions of light-years away.1, 16 A light-year is the distance light travels in one year—roughly 9.46 trillion kilometers. If the universe is only 6,000 to 10,000 years old, as young-earth models propose, then light from any object more than about 10,000 light-years away should not yet have reached Earth. Yet astronomers observe objects at distances of billions of light-years, and those observations carry detailed physical information—spectra, time-dilated supernovae, evolved galaxy morphologies—that consistently indicate vast ages. This contradiction is known as the distant starlight problem, or colloquially as the “starlight and time” problem.
The problem is not merely that distant objects exist. It is that we observe physical processes unfolding across cosmic time in a manner fully consistent with the light-travel distances involved. When astronomers detect a supernova in a galaxy 100 million light-years away, they observe the explosion’s light curve stretching over weeks, its spectrum evolving through well-understood nuclear physics, and its brightness declining along a predictable curve. They are watching an event that, by any conventional reading of the physics, occurred 100 million years ago. The starlight problem asks young-earth proponents to explain how such observations are possible in a universe less than ten thousand years old.
Establishing cosmic distances
The foundation of the distant starlight problem rests on the reliability of cosmic distance measurements. If those distances were uncertain or contested within mainstream astronomy, the problem might be dismissed as resting on shaky evidence. In fact, astronomical distances are established through a chain of independent, overlapping methods known as the cosmic distance ladder, where each rung calibrates the one above it and cross-checks from multiple techniques provide internal consistency.
The first rung uses stellar parallax—the apparent shift in a nearby star’s position against the background of more distant stars as Earth orbits the Sun. This is pure geometry, requiring no assumptions about the nature of light or stellar physics. The European Space Agency’s Gaia mission has measured parallaxes for nearly two billion stars with precisions reaching tens of microarcseconds, directly confirming distances out to tens of thousands of light-years.2 For objects within the Milky Way, the distances are a matter of triangulation, not theory.
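As a rough illustration of the geometry (the parallax angles below are representative values, not entries from the Gaia catalog), the conversion from a measured parallax to a distance is a one-line calculation:

```python
# Parallax distance: pure triangulation, with no assumptions about starlight physics.
# d [parsecs] = 1 / p [arcseconds]; 1 parsec ~ 3.2616 light-years.
PC_TO_LY = 3.2616

def parallax_distance_ly(parallax_arcsec: float) -> float:
    """Distance in light-years implied by an annual parallax angle in arcseconds."""
    return (1.0 / parallax_arcsec) * PC_TO_LY

# Representative parallax angles (100, 1, and 0.1 milliarcseconds):
for p in [0.1, 0.001, 0.0001]:
    print(f"parallax {p * 1000:.1f} mas -> {parallax_distance_ly(p):,.0f} ly")
# Even a 0.1-milliarcsecond parallax corresponds to roughly 33,000 light-years,
# already beyond the horizon a 10,000-year-old universe would allow.
```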
Beyond the range of parallax, astronomers use Cepheid variable stars—pulsating giants whose intrinsic brightness correlates tightly with their pulsation period. This period-luminosity relationship, discovered by Henrietta Swan Leavitt in 1912, allows astronomers to determine a Cepheid’s true luminosity from its observed pulsation rate and then calculate its distance by comparing intrinsic brightness with apparent brightness.3 Cepheids have been observed in galaxies tens of millions of light-years away, most recently by the James Webb Space Telescope, and they serve as the primary calibrator for the next rung of the distance ladder.17, 20
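A sketch of how the period-luminosity relation yields a distance is shown below; the Leavitt-law coefficients and the example star are illustrative rather than values drawn from any particular survey:

```python
import math

PC_TO_LY = 3.2616

def cepheid_distance_ly(period_days: float, apparent_mag: float) -> float:
    """Distance from a Cepheid's pulsation period and apparent magnitude.

    Illustrative V-band Leavitt law: M = -2.43 * (log10(P) - 1) - 4.05
    (representative coefficients; published calibrations differ slightly).
    Distance modulus: m - M = 5 * log10(d_pc) - 5.
    """
    absolute_mag = -2.43 * (math.log10(period_days) - 1.0) - 4.05
    distance_pc = 10 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)
    return distance_pc * PC_TO_LY

# Hypothetical Cepheid with a 30-day period, observed at apparent magnitude 26:
print(f"{cepheid_distance_ly(30.0, 26.0):.2e} ly")   # ~6e7, i.e. tens of millions of ly
```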
For distances reaching billions of light-years, astronomers rely on Type Ia supernovae as standard candles. These thermonuclear explosions of white dwarf stars have a well-characterized relationship between their peak brightness and the rate at which they fade, allowing their intrinsic luminosity to be determined from their light curves alone.4 Observations of Type Ia supernovae in the late 1990s by two independent research teams not only confirmed distances of billions of light-years but also revealed the accelerating expansion of the universe—a discovery that earned Saul Perlmutter, Brian Schmidt, and Adam Riess the 2011 Nobel Prize in Physics.5, 6
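The same distance-modulus arithmetic, applied with a standardized Type Ia peak brightness (the value of about −19.3 used below is a commonly quoted post-standardization figure and should be read as illustrative), turns a single measured peak magnitude into a distance of billions of light-years:

```python
PC_TO_LY = 3.2616
M_PEAK = -19.3   # standardized Type Ia peak absolute magnitude (illustrative)

def sn_ia_distance_ly(apparent_peak_mag: float) -> float:
    """Luminosity distance implied by a Type Ia supernova's observed peak magnitude."""
    distance_pc = 10 ** ((apparent_peak_mag - M_PEAK + 5.0) / 5.0)
    return distance_pc * PC_TO_LY

# Hypothetical high-redshift supernova peaking at apparent magnitude 23:
print(f"{sn_ia_distance_ly(23.0):.2e} ly")
# ~9e9 light-years (a luminosity distance, ignoring K-corrections and
# the cosmology-dependent conversion to other distance measures).
```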
Complementing these geometric and photometric methods, cosmological redshift provides an independent distance indicator. As light travels through expanding space, its wavelength stretches, shifting spectral features toward the red end of the spectrum. The relationship between redshift and distance, codified in Hubble’s law, was first observed by Edwin Hubble in 1929 and has since been calibrated with exquisite precision.7 The most distant galaxies observed have redshifts exceeding 10, corresponding to light emitted when the universe was less than 500 million years old.1
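For modest redshifts, Hubble’s law alone gives a rough distance. The sketch below uses the small-redshift approximation d ≈ cz/H₀ with a round illustrative value of the Hubble constant; at redshifts approaching 10 a full cosmological model is required instead:

```python
C_KM_S = 299_792.458      # speed of light, km/s
H0 = 70.0                 # Hubble constant, km/s/Mpc (round illustrative value)
MPC_TO_LY = 3.2616e6      # light-years per megaparsec

def hubble_distance_ly(z: float) -> float:
    """Low-redshift (z << 1) distance estimate from Hubble's law, d ~ c * z / H0."""
    return (C_KM_S * z / H0) * MPC_TO_LY

# Even a modest redshift of z = 0.05 implies a distance near 700 million light-years:
print(f"{hubble_distance_ly(0.05):.2e} ly")
```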
Crucially, these methods are not independent guesses that happen to agree. Each one is calibrated against the others in overlapping distance ranges. Parallax calibrates Cepheids. Cepheids calibrate supernovae. Supernovae calibrate the Hubble expansion. The cosmic microwave background provides an entirely independent check on the expansion history, and it agrees with the local measurements to within several percent—a residual discrepancy known as the Hubble tension, which is itself the subject of intense research, but which concerns whether the expansion rate is closer to 67 or 73 kilometers per second per megaparsec, not whether the universe is billions versus thousands of years old.1, 19, 20
The problem stated precisely
The distant starlight problem can be stated in formal terms. The speed of light in a vacuum is a measured constant: 299,792,458 meters per second, or approximately 9.461 × 10¹² kilometers per year.10 The age and size of the observable universe have been measured with high precision: 13.787 ± 0.020 billion years old, with a comoving diameter of approximately 93 billion light-years (the diameter exceeds twice the age in light-years because space itself has been expanding throughout cosmic history).1, 16
If the universe were instead 6,000 to 10,000 years old, light could have traveled at most 10,000 light-years since creation—well short of even the roughly 26,000 light-years that separate Earth from the center of the Milky Way. Every galaxy beyond the Milky Way, every quasar, every distant supernova, every feature of the cosmic microwave background (which originates from a shell of space 46 billion light-years away in comoving distance) should be invisible.1, 16 Yet we see all of these things, and we see them in exquisite detail.
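The arithmetic of the conflict is elementary; the comparison below uses round, illustrative distances for a handful of well-known classes of objects:

```python
MAX_REACH_LY = 10_000   # farthest light could have traveled in a 10,000-year-old universe

# Round, illustrative distances in light-years:
objects = {
    "Center of the Milky Way": 26_000,
    "Andromeda galaxy": 2_500_000,
    "A distant quasar": 10_000_000_000,
}

for name, distance_ly in objects.items():
    status = "visible" if distance_ly <= MAX_REACH_LY else "should be invisible"
    print(f"{name}: {distance_ly:,} ly -> {status}")
```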
The problem deepens when the information carried by distant light is considered. Light from distant objects does not simply arrive as a featureless beam. It carries absorption spectra revealing the chemical composition of intervening gas clouds, time dilation signatures in supernova light curves consistent with their measured redshifts, and detailed records of physical processes—stellar evolution, galaxy mergers, nuclear reactions—that unfold over timescales vastly exceeding ten thousand years.5, 6, 22 A supernova observed at redshift z = 1, for instance, shows its light curve stretched by exactly a factor of (1 + z) = 2, precisely as predicted by the expanding universe model. This time dilation is not an isolated curiosity but a systematic effect observed across thousands of supernovae at varying redshifts, and it is exactly what one would expect if the light genuinely traveled for billions of years through expanding space.5, 22
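The (1 + z) stretch is simple to state quantitatively; the sketch below applies it to an illustrative 20-day rest-frame light-curve width:

```python
def observed_duration_days(rest_frame_days: float, z: float) -> float:
    """Observed duration of an event at redshift z, stretched by a factor of (1 + z)."""
    return rest_frame_days * (1.0 + z)

# Illustrative 20-day rest-frame light-curve width:
for z in [0.0, 0.5, 1.0]:
    print(f"z = {z}: observed width ~ {observed_duration_days(20.0, z):.0f} days")
# At z = 1 the light curve appears twice as wide as in the supernova's rest frame,
# the factor of (1 + z) = 2 described above.
```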
The c-decay hypothesis
One of the earliest young-earth attempts to resolve the starlight problem was the proposal that the speed of light was dramatically faster in the recent past. If light traveled millions of times faster shortly after creation, then photons from the most distant galaxies could have reached Earth within a few thousand years. This hypothesis, known as c-decay or the Setterfield hypothesis, was advanced most prominently by Barry Setterfield and Trevor Norman in a 1987 report commissioned by SRI International (formerly the Stanford Research Institute).8
Setterfield and Norman compiled historical measurements of the speed of light from the seventeenth century onward and argued that these measurements showed a systematic decline in the value of c over time. They proposed that the speed of light was effectively infinite at the moment of creation and decayed exponentially to its present value. If true, this would allow light from any distance to arrive at Earth almost instantaneously in the early universe.8
The hypothesis attracted immediate criticism from both mainstream scientists and fellow young-earth creationists. The Creation Research Society published Gerald Aardsma’s analysis showing that the apparent decline in historical measurements of c was entirely consistent with improving measurement precision over time—early measurements had large error bars, and the supposed trend vanished once measurement uncertainties were properly accounted for.23
More fundamentally, the speed of light is not an isolated constant. It is woven into the fabric of physics through relationships with other fundamental constants. The fine structure constant, α ≈ 1/137, governs the strength of the electromagnetic interaction and depends on the speed of light, Planck’s constant, and the elementary charge. If c had been dramatically different in the past, the fine structure constant would also have been different, which would have altered atomic spectra, nuclear reaction rates, and the chemistry of the early universe in detectable ways.10
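The dependence of α on c can be made explicit. The snippet below computes α from CODATA values of the constants and shows, as a purely illustrative scaling that holds the other constants fixed, how drastically it would shift if c were a million times larger:

```python
import math

# CODATA values in SI units:
e    = 1.602176634e-19     # elementary charge, C
eps0 = 8.8541878128e-12    # vacuum permittivity, F/m
hbar = 1.054571817e-34     # reduced Planck constant, J*s
c    = 299_792_458.0       # speed of light, m/s

def fine_structure_constant(speed_of_light: float) -> float:
    """alpha = e^2 / (4 * pi * eps0 * hbar * c)."""
    return e**2 / (4.0 * math.pi * eps0 * hbar * speed_of_light)

print(f"alpha today        = 1/{1.0 / fine_structure_constant(c):.3f}")      # ~1/137.036
# Illustrative only: with c a million times larger and the other constants fixed,
# alpha shrinks by the same factor, which would rewrite atomic spectra wholesale.
print(f"alpha with 1e6 * c = 1/{1.0 / fine_structure_constant(1e6 * c):.3e}")
```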
Observations of distant quasars provide a direct test. The absorption spectra of gas clouds between Earth and distant quasars are imprinted with atomic transition lines whose precise wavelengths depend on the fine structure constant. By comparing these absorption lines at high redshift (corresponding to the distant past) with laboratory measurements, physicists can measure whether α has changed over billions of years. Multiple independent studies have constrained any variation in α to less than a few parts per million over the last 10 billion years.11 Similarly, measurements of molecular hydrogen absorption lines in distant quasar spectra constrain the proton-to-electron mass ratio to have been stable to within 10 parts per million over 7 billion years.18, 21
These constraints are devastating to the c-decay hypothesis. A speed of light millions of times larger than its present value would require the fine structure constant to have been radically different, which would leave unmistakable signatures in the spectra of distant objects. No such signatures are observed. The physics recorded in ancient light is the same physics operating today, to extraordinary precision.10, 11
Additionally, a changing speed of light would alter the energy-mass equivalence relation E = mc². If c were vastly larger in the past, the energy released in nuclear reactions would have been correspondingly greater, producing a Sun so luminous that Earth’s surface would have been sterilized. The c-decay hypothesis, in attempting to solve the starlight problem, creates a cascade of secondary problems that are arguably worse than the original.9, 10
White hole cosmology
In 1994, the young-earth physicist D. Russell Humphreys proposed a radically different cosmological model in his book Starlight and Time. Drawing on general relativity, Humphreys argued that the universe could be modeled as a bounded sphere of matter with Earth near its center. During the expansion of this sphere, Humphreys proposed, a white hole—the time-reverse of a black hole—formed, creating an event horizon that swept outward through the cosmos. Near the center of the white hole, gravitational time dilation would have been extreme: while billions of years elapsed in the outer regions of the universe (allowing light to traverse vast distances), only days passed near Earth.12
The appeal of this model is that it appears to use legitimate physics—general relativity genuinely predicts gravitational time dilation, and the mathematics of white holes is formally valid. Humphreys argued that his model allowed the universe to be simultaneously young (from Earth’s reference frame) and old (from the reference frame of distant galaxies), thereby reconciling the biblical timeline with astronomical observations.12
The model has been rejected by physicists on multiple grounds. First, it requires Earth to occupy a privileged position at or very near the center of the universe. Modern observations, including the near-perfect isotropy of the cosmic microwave background, indicate that the universe has no center—it is homogeneous and isotropic on large scales, consistent with the cosmological principle.1, 19 A universe with Earth at its center would produce detectable asymmetries in the CMB and in the distribution of distant galaxies, neither of which is observed.
Second, the time dilation required by the model is not a minor correction but an enormous effect: the ratio of time elapsed at Earth versus time elapsed at the edge of the universe would need to exceed a factor of one million. General relativity does predict time dilation near massive objects and within gravitational wells, but the magnitude required by Humphreys’s model is not supported by any self-consistent solution to Einstein’s field equations for a matter distribution resembling the observed universe.12
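The scale of the required effect can be illustrated with the standard Schwarzschild time-dilation factor (a simplification, since Humphreys’s model uses a different metric; the numbers below are meant only to convey orders of magnitude):

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0      # speed of light, m/s

def dilation_factor(mass_kg: float, radius_m: float) -> float:
    """Schwarzschild factor d(tau)/dt = sqrt(1 - 2GM / (r c^2)) for a static observer."""
    rs = 2.0 * G * mass_kg / C**2      # Schwarzschild radius
    return math.sqrt(1.0 - rs / radius_m)

# Ordinary gravitational time dilation is tiny: at the Sun's surface,
# clocks run slow by only about two parts per million.
print(f"Sun's surface: {dilation_factor(1.989e30, 6.957e8):.8f}")

# A factor-of-a-million slowdown (d(tau)/dt = 1e-6) requires 1 - rs/r = 1e-12,
# i.e. a static observer parked within about one part in 10^12 of an event horizon,
# nothing like the situation of Earth in the observed universe.
```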
Third, the model requires a white hole—an object that has never been observed and that most physicists regard as physically unrealizable. While white holes are valid mathematical solutions to general relativity, they are thermodynamically unstable: any small perturbation would collapse them into a black hole. A white hole large enough to encompass the observable universe and persist for the duration required by Humphreys’s scenario has no known physical mechanism of formation or stability.
Fourth, and perhaps most critically, the model fails to account for the detailed physics encoded in distant light. If Earth experienced only days while the outer universe experienced billions of years, then light arriving at Earth from distant galaxies should carry information about billions of years of physical processes—which it does. But this concedes the very point the model was designed to avoid: the physical history of the universe is billions of years long. The model does not make the universe young in any physically meaningful sense; it merely asserts that Earth’s clock ran differently, while the rest of the cosmos aged normally.12
The anisotropic synchrony convention
A more recent proposal, advanced by the young-earth astrophysicist Jason Lisle in 2010, takes a different approach entirely. Rather than proposing new physics, Lisle argued that the distant starlight problem arises from a particular convention about how clocks are synchronized across space, and that an alternative convention dissolves the problem.13
The argument rests on a genuine subtlety in the physics of special relativity: the one-way speed of light has never been independently measured. All laboratory measurements of the speed of light measure the round-trip (two-way) speed—a pulse of light is sent to a mirror and reflected back, and the total distance divided by the total time gives c. The two-way speed is unambiguously 299,792,458 m/s. However, determining the one-way speed requires synchronized clocks at both the source and the detector, and synchronizing those clocks requires either transporting a clock (which introduces time-dilation corrections) or using a light signal (which assumes the very quantity being measured). This is known as the conventionality of simultaneity, recognized by Hans Reichenbach and others in the philosophy of physics.14
Lisle proposed an anisotropic synchrony convention (ASC) in which light traveling toward an observer moves instantaneously (infinite one-way speed), while light traveling away from the observer moves at c/2 (half the conventional speed). Under this convention, the round-trip speed of light remains c, as measured, but incoming light arrives the instant it is emitted. If adopted, light from any distance would arrive at Earth at the moment of its emission, and the starlight problem would disappear by definition.13
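Because only the round trip is ever measured, an outgoing leg at c/2 combined with an instantaneous return leg reproduces the standard two-way value exactly; the sketch below verifies the arithmetic:

```python
C = 299_792_458.0   # measured two-way speed of light, m/s

def two_way_speed(distance_m: float, out_speed: float, back_speed: float) -> float:
    """Round-trip speed: total path length divided by total travel time."""
    total_time = distance_m / out_speed + distance_m / back_speed
    return 2.0 * distance_m / total_time

d = 1_000.0  # any baseline; the result does not depend on it

print(two_way_speed(d, C, C))                    # Einstein convention -> 299792458.0
print(two_way_speed(d, C / 2.0, float("inf")))   # ASC: away at c/2, back instantly -> 299792458.0
```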
The proposal is technically correct in one narrow sense: the conventionality of the one-way speed of light is acknowledged in the philosophy of physics, and the ASC does not violate any experimental measurement of the two-way speed.14 However, several problems render the proposal scientifically vacuous as a solution to the distant starlight problem.
First, a synchrony convention is a choice of coordinate labeling, not a statement about physical reality. Choosing a convention in which incoming light is labeled as “instantaneous” does not change the physical fact that the light interacted with matter, underwent absorption and emission processes, and carried time-dependent information across the intervening space. The supernova light curves, the spectral evolution of variable stars, and the time-dependent structure of the CMB all record physical durations that are independent of how one labels the moment of arrival.5, 22
Second, the ASC creates severe difficulties when applied to observed astrophysical phenomena. Under the ASC, events that are observed simultaneously from Earth are assigned the same time of occurrence regardless of distance. This means a supernova observed today in a galaxy 100 million light-years away and a supernova observed today in a galaxy 10 billion light-years away would both be assigned to the present moment. But these supernovae show different redshifts, different time dilations, and different host galaxy morphologies that are entirely consistent with occurring at different cosmic epochs under standard physics. Under the ASC, these systematic correlations between distance and observed properties become unexplained coincidences.5, 6
Third, the ASC does not actually address the physical content of the starlight problem. The problem is not merely that photons have arrived at Earth but that the photons carry information about billions of years of physical processes. Relabeling the moment of arrival does not eliminate the physical history recorded in the light. A galaxy observed at a redshift of z = 8 shows a stellar population, chemical enrichment pattern, and morphology consistent with being observed roughly 600 million years after the Big Bang. No synchrony convention changes the physical state of that galaxy as recorded in its emitted light.1
The mature creation hypothesis
The oldest proposed solution to the distant starlight problem predates modern astronomy. The idea, sometimes called the omphalos hypothesis after Philip Henry Gosse’s 1857 book Omphalos, proposes that God created the universe in a mature, fully functioning state—including light already in transit from distant stars.15 Just as Adam, according to the Genesis account, was created as an adult rather than an infant, so the universe was created with the appearance of age. Under this view, light from a galaxy one billion light-years away did not travel for one billion years; it was created already en route, as part of a functioning cosmos.
The mature creation hypothesis is not a scientific theory in the conventional sense, as it makes no testable predictions and cannot be falsified by any observation. Any evidence of an old universe can be attributed to the appearance of age built into the creation. For this reason, it lies outside the domain of empirical science.
More significantly, the hypothesis encounters a severe theological and philosophical objection that has been raised by both secular critics and Christian theologians. The light arriving from distant objects does not carry a blank signal. It carries detailed records of physical events: supernovae exploding, galaxies colliding, stars forming from gas clouds, binary star systems spiraling inward and merging. If this light was created in transit, then these events never actually occurred. The supernova whose light curve we track over weeks, whose spectrum we analyze for nucleosynthesis products, whose remnant we observe at the correct distance—none of it happened. The light would be, in effect, a fabricated recording of fictional events, embedded in the universe at the moment of creation.15
This implication troubled Gosse’s contemporaries and continues to trouble thoughtful advocates of young-earth models. If God created light in transit bearing detailed records of events that never occurred, the creation would carry systematic false information. Supernova SN 1987A, observed in the Large Magellanic Cloud approximately 168,000 light-years away, was accompanied by a burst of neutrinos detected independently by three neutrino observatories on Earth, arriving a few hours before the light—exactly as supernova physics predicts, since the neutrinos escape the collapsing core before the shock reaches the star’s surface. Under the mature creation hypothesis, both the photons and the neutrinos were created in transit, carrying coordinated false evidence of an explosion that never happened. Many theologians, including those sympathetic to creationism, have argued that a God who embeds systematic false information in the fabric of creation would be deceptive, a characterization incompatible with the theological traditions that motivate young-earth belief in the first place.15
Other proposed solutions
Several other young-earth proposals have been offered with less development or acceptance than the four major models discussed above. Some propose that space itself was stretched during creation, carrying light along with it in a manner analogous to cosmic inflation but compressed into the creation week. Others suggest that gravitational time dilation from a massive water canopy or some other structure near Earth could account for the discrepancy. Still others appeal to Riemannian geometry arguments in which the topology of the universe allows light to take shortcuts.
None of these proposals has been developed into a quantitative model capable of reproducing the detailed observations that standard cosmology explains. The cosmic microwave background alone contains millions of data points describing the temperature, polarization, and spectral properties of radiation from the early universe, all of which fit the standard cosmological model with extraordinary precision.1, 19 Any alternative model must account for this data, and no young-earth cosmology has attempted to do so.
An acknowledged difficulty
The distant starlight problem holds a distinctive status among challenges to young-earth creationism: it is widely acknowledged as unsolved even by young-earth physicists and astronomers themselves. Unlike debates over radiometric dating or the fossil record, where young-earth organizations maintain confident public positions, the starlight problem has generated internal disagreement and candid admissions of difficulty. The proliferation of mutually incompatible proposals—c-decay, white hole cosmology, the anisotropic synchrony convention, mature creation, and various hybrid models—reflects the absence of any consensus solution within the young-earth community itself.
The difficulty is compounded by the fact that cosmic distances are not established by a single method subject to a single set of assumptions. The cosmic distance ladder relies on independent techniques that cross-check one another: parallax requires only geometry, Cepheids require only the period-luminosity relation calibrated by parallax, Type Ia supernovae require only the Phillips relation calibrated by Cepheids, and the cosmic microwave background requires only well-understood plasma physics and general relativity.2, 3, 4, 17 To reject the conclusion that the universe is billions of years old, one must reject not a single measurement but an interlocking web of independent measurements from different branches of physics, all converging on the same answer.
The starlight and time problem remains, by the assessment of both mainstream cosmology and candid young-earth commentators, one of the most formidable challenges facing any model that proposes a universe thousands rather than billions of years old. The light that fills the night sky carries within it not merely photons but a detailed physical record of cosmic history—a record that, by every available measure, stretches back billions of years.
References
Observational evidence from supernovae for an accelerating universe and a cosmological constant
Expanding confusion: common misconceptions of cosmological horizons and the superluminal expansion of the universe
Constraint on a varying proton-to-electron mass ratio from molecular hydrogen absorption toward quasar PKS 1830−211
Nine-year Wilkinson Microwave Anisotropy Probe (WMAP) observations: cosmological parameter results
A comprehensive measurement of the local value of the Hubble constant with 1 km/s/Mpc uncertainty from the Hubble Space Telescope and the SH0ES team
Constraints on the constancy of the proton-to-electron mass ratio from quasar absorption lines