Short answer: Without tunnelling, stars like the Sun would never reach nuclear fusion temperatures; stars less massive than around $5M_{\odot}$ would become "hydrogen white dwarfs" supported by electron degeneracy pressure. More massive objects would contract to around a tenth of a solar radius and commence nuclear fusion. They would be hotter than "normal" stars of a similar mass, but my best estimate is that they would have similar luminosities. Thus it would not be possible to get a stable nuclear-burning star with 1 solar luminosity. Objects of 1 solar luminosity could exist, but they would be on cooling tracks, much like brown dwarfs are in the real universe.
A very interesting hypothetical question. What would happen to a star if you "turned off" tunnelling? I think the answer is that the pre-main-sequence stage would become significantly longer. The star would continue to contract, releasing gravitational potential energy in the form of radiation and by heating its core. The virial theorem tells us that the central temperature is roughly proportional to $M/R$ (mass/radius), so for a fixed mass, as the star contracts, its core gets hotter.
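To sketch where that scaling comes from (a standard order-of-magnitude argument, dropping all factors of order unity): hydrostatic equilibrium gives a central pressure $P_c \sim GM^2/R^4$, while an ideal gas has $P_c = \rho_c k T_c/(\mu m_u)$ with $\rho_c \sim M/R^3$, where $\mu$ is the number of mass units per particle and $m_u$ an atomic mass unit. Combining these,
$$ kT_c \sim \frac{G \mu m_u M}{R}. $$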
There are then (at least) two possibilities.
(1) The core becomes hot enough for protons to overcome the Coulomb barrier and begin nuclear fusion. For this to happen, the protons need to get within about a nuclear radius of each other, let's say $10^{-15}$ m. The potential energy is
$e^2/(4\pi \epsilon_0 r) = 1.44$ MeV or $2.3\times 10^{-13}$ J.
The protons in the core will have a mean kinetic energy of $3kT/2$, but some small fraction will have energies much higher than this according to a Maxwell-Boltzmann distribution. Let's say (and this is a weak point in my calculation that I may need to revisit when I have more time) that fusion will take place when protons with energies of $10 kT$ exceed the Coulomb potential energy barrier. There will be a small numerical uncertainty on this, but because the reaction rate would be highly temperature sensitive it will not be an order of magnitude out. This means that fusion would not begin until the core temperature reached about $1.5\times 10^{9}$ K.
In the Sun, fusion happens at around $1.5\times 10^7$ K, so the virial theorem result tells us that stars would need to contract by about a factor of 100 for this to happen.
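For anyone who wants to check these numbers, here is a minimal Python sketch; the $10^{-15}$ m approach distance and the $10 kT$ ignition criterion are just the rough assumptions made above, not precise physics:

```python
import math

# SI constants
e = 1.602176634e-19      # elementary charge (C); also 1 eV in joules
eps0 = 8.8541878128e-12  # vacuum permittivity (F/m)
kB = 1.380649e-23        # Boltzmann constant (J/K)

r = 1e-15                # assumed closest approach (m), about a nuclear radius
E_barrier = e**2 / (4 * math.pi * eps0 * r)
print(E_barrier, E_barrier / e / 1e6)  # ~2.3e-13 J, i.e. ~1.44 MeV

# Rough ignition criterion from the text: protons at ~10 kT reach the barrier
T_ignite = E_barrier / (10 * kB)
print(T_ignite)                        # ~1.7e9 K, the ~1.5e9 K quoted above

# Virial scaling T_c ~ M/R at fixed mass: required contraction factor
print(T_ignite / 1.5e7)                # ~100, relative to the real Sun's core
```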
Because the gravity and density of such a star would be much higher than the Sun's, hydrostatic equilibrium would demand a very high pressure gradient, but the temperature gradient would be limited by convection, so there would need to be an extremely centrally concentrated core with a fluffy envelope. Working through some simple proportionalities, I think the luminosity would be almost unchanged (see the luminosity-mass relation, but consider how luminosity depends on radius at a fixed mass), which means the surface temperature would have to be hotter by a factor of the square root of the radius contraction factor: since $L = 4\pi R^2 \sigma T_{\rm eff}^4$, fixed $L$ gives $T_{\rm eff} \propto R^{-1/2}$, so a factor of 100 contraction makes the surface $\sqrt{100} = 10$ times hotter. However, this could be academic, since we need to consider the second possibility.
(2) As the star shrinks, the electrons become degenerate and contribute degeneracy pressure. This becomes important when the phase space occupied by each electron approaches $h^3$. There is a standard bit of bookwork, which I am not going to repeat here (you can find it in something like "The Physics of Stars" by Phillips), which shows that degeneracy sets in when
$$\frac{4\pi \mu_e}{3h^3}\left(\frac{6G R\mu m_e}{5}\right)^{3/2} m_u^{5/2} M^{1/2} = 1,$$
where $\mu_e$ is the number of mass units per electron, $\mu$ is the number of mass units per particle, $m_e$ is the electron mass and $m_u$ is an atomic mass unit. If I've done my sums right, this means for a hydrogen gas (let's assume) with $\mu_e=1$ and $\mu = 0.5$ that degeneracy sets in when
$$\left(\frac{R}{R_{\odot}}\right) \simeq 0.18 \left(\frac{M}{M_{\odot}}\right)^{-1/3}.$$
In other words, when the star shrinks to roughly the size of Jupiter, its interior will be governed by electron degeneracy pressure, not by perfect gas pressure. The significance of this is that electron degeneracy pressure is only weakly dependent on temperature (or independent of it, for a completely degenerate gas). This means that the star can cool whilst only decreasing its radius very slightly. The central temperature would never reach the high temperatures required for nuclear burning, and the "star" would become a hydrogen white dwarf with a final radius of a few hundredths of a solar radius (or a bit smaller for more massive stars).
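As a quick numerical check of the degeneracy condition above (the constants below are standard SI values; the exact coefficient you get depends a little on rounding):

```python
import math

h = 6.62607015e-34        # Planck constant (J s)
G = 6.67430e-11           # gravitational constant (SI)
m_e = 9.1093837015e-31    # electron mass (kg)
m_u = 1.66053906660e-27   # atomic mass unit (kg)
M_sun, R_sun = 1.989e30, 6.957e8   # solar mass (kg) and radius (m)

mu_e, mu = 1.0, 0.5       # ionised hydrogen, as assumed above

def degeneracy_radius(M):
    # Invert (4 pi mu_e / 3h^3) (6 G R mu m_e / 5)^(3/2) m_u^(5/2) M^(1/2) = 1 for R
    coeff = 4 * math.pi * mu_e / (3 * h**3) * m_u**2.5 * math.sqrt(M)
    return coeff ** (-2.0 / 3.0) * 5 / (6 * G * mu * m_e)

print(degeneracy_radius(M_sun) / R_sun)  # ~0.23 with these constants, close to the ~0.18 above
```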
The second possibility must be the fate of something with the mass of the Sun. However, there is a cross-over point in mass where the first possibility becomes viable. To see this, note that the radius at which degeneracy sets in scales as $M^{-1/3}$, whereas the radius to which the star needs to shrink in order to begin nuclear burning is proportional to $M$. The cross-over takes place somewhere in the range 5-10 $M_{\odot}$. So stars more massive than this could commence nuclear burning at radii of about a tenth of a solar radius, without their cores being degenerate. An interesting possibility is that at a few solar masses there should be a class of object that contracts sufficiently that nuclear ignition is reached when the core is substantially degenerate. This might lead to a runaway "hydrogen flash", depending on whether the temperature dependence of the reaction rate is extreme enough.
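And the matching back-of-the-envelope for the cross-over, using the two scalings just described (the $0.18\,(M/M_{\odot})^{-1/3}$ degeneracy radius from above, and an ignition radius of $\sim 0.01\,(M/M_{\odot})$ solar radii from the factor-of-100 contraction):

```python
# R_deg/R_sun ~ 0.18 (M/M_sun)^(-1/3)   (degeneracy onset, from above)
# R_ign/R_sun ~ 0.01 (M/M_sun)          (T_c ~ M/R with the ignition T fixed)
# The two are equal when M^(4/3) = 0.18/0.01 = 18.
M_cross = 18 ** 0.75
print(M_cross)              # ~8.7 M_sun, within the 5-10 range quoted
print(0.01 * M_cross)       # ignition radius there: ~0.09 R_sun
```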
Best question of the year so far. I do hope that someone has run some simulations to test these ideas.
Edit: As a postscript, it is of course inconsistent to neglect a quantum effect like tunnelling whilst at the same time relying on degeneracy pressure to support the star! If one were to neglect quantum effects entirely and allow a star like the Sun to collapse, then the end result would surely be a classical black hole.
A further point that would need consideration is to what extent radiation pressure would offer support in stars that were smaller but much hotter.