Nathan K. Johnson-McDaniel, Benjamin J. Owen
Robust arguments predict that a hadron-quark mixed phase may exist in the cores of some "neutron" stars. Such a phase forms a crystalline lattice with a shear modulus higher than that of the crust due to the high density and charge separation, even allowing for the effects of charge screening. This may lead to strong continuous gravitational-wave emission from rapidly rotating neutron stars and gravitational-wave bursts associated with magnetar flares and pulsar glitches. We present the first detailed calculation of the shear modulus of the mixed phase. We describe the quark phase using the bag model plus first-order quantum chromodynamics corrections and the hadronic phase using relativistic mean-field models with parameters allowed by the most massive pulsar. Most of the calculation involves treating the "pasta phases" of the lattice via dimensional continuation, and we give a general method for computing dimensionally continued lattice sums including the Debye model of charge screening. We compute all the shear components of the elastic modulus tensor and angle average them to obtain the effective (scalar) shear modulus for the case where the mixed phase is a polycrystal. We include the contributions from changing the cell size, which are necessary for the stability of the lower-dimensional portions of the lattice. Stability also requires a minimum surface tension, generally tens of MeV/fm^2 depending on the equation of state. We find that the shear modulus can be a few times 10^33 erg/cm^3, two orders of magnitude higher than the first estimate, over a significant fraction of the maximum mass stable star for certain parameter choices.
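The paper's lattice sums are dimensionally continued and use the Debye model of charge screening; as a minimal illustrative sketch (not the authors' actual method, and in dimensionless lattice units), screening replaces the bare Coulomb interaction with a Yukawa form, which makes an otherwise conditionally convergent Madelung-type sum absolutely convergent. The simple-cubic lattice and the direct real-space summation below are assumptions chosen for brevity:

```python
import math

def screened_lattice_sum(kappa, nmax=8):
    """Sum exp(-kappa*r)/r over a simple-cubic lattice (r in units of
    the lattice constant), excluding the origin. kappa is the inverse
    Debye screening length; the exponential cutoff makes the sum
    absolutely convergent, unlike the bare Coulomb (kappa = 0) case."""
    total = 0.0
    for i in range(-nmax, nmax + 1):
        for j in range(-nmax, nmax + 1):
            for k in range(-nmax, nmax + 1):
                if i == j == k == 0:
                    continue  # skip the self-interaction at the origin
                r = math.sqrt(i * i + j * j + k * k)
                total += math.exp(-kappa * r) / r
    return total

# Stronger screening (larger kappa) suppresses distant shells,
# so the sum decreases and converges after only a few shells:
print(screened_lattice_sum(2.0), screened_lattice_sum(4.0))
```

Because the summand decays exponentially, enlarging `nmax` beyond a few screening lengths changes the result negligibly, which is why a brute-force real-space sum suffices here where the unscreened case would require Ewald-type techniques.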
View original: http://arxiv.org/abs/1110.4650