Mass-Angular Regimes for Certain Instabilities of a Compact, Rotating Stellar Core
Wiita, Paul J.; Press, William H.
Equilibrium states and domains of instability for compact, rapidly rotating stellar cores are investigated. Using a one-zone average-pressure model based on Maclaurin spheroids, regimes corresponding to degenerate dwarfs and neutron stars in the mass-angular-momentum (M, J) plane are tested for their stability to radial and to certain nonradial modes. Both cold catalyzed and noncatalyzed equations of state are considered. Results for upper mass limits and the onset of secular and dynamical instabilities agree reasonably well with recent work on viscous, differentially rotating white dwarfs. Some effects of "non-zero" temperatures on neutrino luminosity and nondegenerate pressures are briefly considered. Scenarios for the subsequent evolution of different regions in the (M, J) plane, e.g., gravitational radiation versus viscous dissipation in triaxial configurations, binary fission, "fizzlers," and collapse to black holes, are sketched.
Subject headings: stars: interiors - stars: neutron - stars: rotation - stars: white dwarfs
Publication: The Astrophysical Journal, September 1976. DOI: 10.1086/154635. Bibcode: 1976ApJ...208..525W
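For readers unfamiliar with the secular and dynamical instabilities mentioned above, the standard bar-mode thresholds for Maclaurin spheroids are usually quoted in terms of the ratio of rotational kinetic to gravitational potential energy. The values below are the classical textbook figures, not numbers drawn from this paper.

```latex
% Classical Maclaurin-spheroid bar-mode thresholds (textbook values, for context only):
\[
  \beta \equiv \frac{T}{|W|}, \qquad
  \beta_{\mathrm{sec}} \simeq 0.14 \ \text{(secular: viscosity or gravitational radiation)}, \qquad
  \beta_{\mathrm{dyn}} \simeq 0.27 \ \text{(dynamical)}.
\]
```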
Götz, Druckmüller, and, independently, Brady have defined a discrete Radon transform (DRT) that sums an image's pixel values along a set of aptly chosen discrete lines, complete in slope and intercept. The transform is fast, O(N² log N) for an N × N image; it uses only addition, not multiplication or interpolation, and it admits a fast, exact algorithm for the adjoint operation, namely backprojection. This paper shows that the transform additionally has a fast, exact (although iterative) inverse. The inverse reproduces to machine accuracy the pixel-by-pixel values of the original image from its DRT, without artifacts or a finite point-spread function. Fourier or fast Fourier transform methods are not used. The inverse can also be calculated from sampled sinograms and is well conditioned in the presence of noise. Also introduced are generalizations of the DRT that combine pixel values along lines by operations other than addition. For example, there is a fast transform that calculates median values along all discrete lines and is able to detect linear features at low signal-to-noise ratios in the presence of pointlike clutter features of arbitrarily large amplitude.
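To make the geometry concrete, here is a minimal brute-force sketch in Python (NumPy assumed) of combining pixel values along digital lines of every slope and intercept. The function name discrete_line_transform and the line parameterization are illustrative choices; the O(N^3) triple loop is emphatically not the fast O(N² log N) recursion of the DRT itself. Passing np.median instead of np.sum mimics the median generalization mentioned above.

```python
import numpy as np

def discrete_line_transform(image, combine=np.sum):
    """Combine pixel values along digital lines of every slope and intercept.

    Brute-force O(N^3) illustration of what the transform computes:
    combine=np.sum gives Radon-like line sums, combine=np.median gives the
    median generalization.  The fast O(N^2 log N) two-scale recursion of
    the actual DRT is not reproduced here.
    """
    n = image.shape[0]                     # assume a square N x N image
    slopes = list(range(-(n - 1), n))      # integer rise across the full width
    out = np.zeros((n, len(slopes)))       # rows: left-edge intercepts, cols: slopes
    for k, s in enumerate(slopes):
        for h in range(n):
            vals = []
            for x in range(n):
                y = h + (round(s * x / (n - 1)) if n > 1 else 0)
                if 0 <= y < n:
                    vals.append(image[y, x])
            out[h, k] = combine(vals) if vals else 0.0
    return out

# A faint diagonal line buried in noise produces a clear peak in the line sums.
rng = np.random.default_rng(0)
img = rng.normal(size=(32, 32))
img[np.arange(32), np.arange(32)] += 2.0
sinogram = discrete_line_transform(img)
print(np.unravel_index(np.argmax(sinogram), sinogram.shape))
```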
A review is given of black hole formation and astrophysical processes near black holes. The following topics are discussed: gravitational collapse, interactions of the hole with its environment, black holes in binary systems, and exotic possibilities. (200 references)
From the Publisher:
This is the revised and greatly expanded Second Edition of the hugely popular Numerical Recipes: The Art of Scientific Computing. The product of a unique collaboration among four leading scientists in academic research and industry, Numerical Recipes is a complete text and reference book on scientific computing. In a self-contained manner it proceeds from mathematical and theoretical considerations to actual practical computer routines. With over 100 new routines (now well over 300 in all), plus upgraded versions of many of the original routines, this book is more than ever the most practical, comprehensive handbook of scientific computing available today. The book retains the informal, easy-to-read style that made the first edition so popular, with many new topics presented at the same accessible level. In addition, some sections of more advanced material have been introduced, set off in small type from the main body of the text. Numerical Recipes is an ideal textbook for scientists and engineers and an indispensable reference for anyone who works in scientific computing.

Highlights of the new material include a new chapter on integral equations and inverse methods; multigrid methods for solving partial differential equations; improved random number routines; wavelet transforms; the statistical bootstrap method; a new chapter on less-numerical algorithms including compression coding and arbitrary-precision arithmetic; band-diagonal linear systems; linear algebra on sparse matrices; Cholesky and QR decomposition; calculation of numerical derivatives; Padé approximants and rational Chebyshev approximation; new special functions; Monte Carlo integration in high-dimensional spaces; globally convergent methods for sets of nonlinear equations; an expanded chapter on fast Fourier methods; spectral analysis of unevenly sampled data; Savitzky-Golay smoothing filters; and two-dimensional Kolmogorov-Smirnov tests. All this is in addition to material on the basic topics covered in the first edition.
As electronic medical records enable increasingly ambitious studies of treatment outcomes, ethical issues previously important only to limited clinical trials become relevant to unlimited whole populations. For randomized clinical trials, adaptive assignment strategies are known to expose substantially fewer patients to avoidable treatment failures than strategies with fixed assignments (e.g., equal sample sizes). An idealized adaptive case--the two-armed Bernoulli bandit problem--can be exactly optimized for a variety of ethically motivated cost functions that embody principles of duty-to-patient, but the solutions have been thought computationally infeasible when the number of patients in the study (the "horizon") is large. We report numerical experiments that yield a heuristic approximation that applies even to very large horizons, and we propose a near-optimal strategy that remains valid even when the horizon is unknown or unbounded, thus applicable to comparative effectiveness studies on large populations or to standard-of-care recommendations. For the case in which the economic cost of treatment is a parameter, we give a heuristic, near-optimal strategy for determining the superior treatment (whether more or less costly) while minimizing resources wasted on any inferior, more expensive, treatment. Key features of our heuristics can be generalized to more complicated protocols.
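For a concrete sense of what "exactly optimized" means here, the sketch below (Python; the function name bandit_value and the uniform Beta(1,1) priors are illustrative assumptions) computes the textbook finite-horizon dynamic program for the two-armed Bernoulli bandit under the simplest duty-to-patient objective, maximizing expected treatment successes. Its state space grows rapidly with the horizon, which is exactly the infeasibility the paper's heuristic approximation is meant to sidestep; this is not the paper's method.

```python
from functools import lru_cache

def bandit_value(horizon):
    """Exact finite-horizon dynamic program for the two-armed Bernoulli bandit.

    State: (s1, f1, s2, f2) = observed successes/failures on each arm,
    with independent uniform Beta(1,1) priors.  Value = maximum expected
    number of successes over the remaining trials.  Feasible only for
    modest horizons; large horizons motivate heuristic approximations.
    """
    @lru_cache(maxsize=None)
    def value(s1, f1, s2, f2, remaining):
        if remaining == 0:
            return 0.0
        p1 = (s1 + 1) / (s1 + f1 + 2)   # posterior mean success rate, arm 1
        p2 = (s2 + 1) / (s2 + f2 + 2)   # posterior mean success rate, arm 2
        # Expected payoff of assigning the next patient to each arm,
        # then continuing optimally with the updated counts.
        v1 = (p1 * (1 + value(s1 + 1, f1, s2, f2, remaining - 1))
              + (1 - p1) * value(s1, f1 + 1, s2, f2, remaining - 1))
        v2 = (p2 * (1 + value(s1, f1, s2 + 1, f2, remaining - 1))
              + (1 - p2) * value(s1, f1, s2, f2 + 1, remaining - 1))
        return max(v1, v2)

    return value(0, 0, 0, 0, horizon)

# Expected successes under the optimal adaptive policy for a small horizon.
print(bandit_value(20))
```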