Thursday, July 2, 2009
Hubble Space Telescope
The Hubble Space Telescope (HST) is a space telescope that was carried into orbit by the space shuttle in April 1990. It is named after the American astronomer Edwin Hubble. Although not the first space telescope, the Hubble is one of the largest and most versatile, and is well-known as both a vital research tool and a public relations boon for astronomy. The HST is a collaboration between NASA and the European Space Agency, and is one of NASA's Great Observatories, along with the Compton Gamma Ray Observatory, the Chandra X-ray Observatory, and the Spitzer Space Telescope.
Space telescopes were proposed as early as 1923. The Hubble was funded in the 1970s, with a proposed launch in 1983, but the project was beset by technical delays, budget problems, and the Challenger disaster. When finally launched in 1990, scientists found that the main mirror had been ground incorrectly, severely compromising the telescope's capabilities. However, after a servicing mission in 1993, the telescope was restored to its intended quality. Hubble's orbit outside the distortion of Earth's atmosphere allows it to take extremely sharp images with almost no background light. Hubble's Ultra Deep Field image, for instance, is the most detailed visible-light image ever made of the universe's most distant objects. Many Hubble observations have led to breakthroughs in astrophysics, such as accurately determining the rate of expansion of the universe.
The Hubble is the only telescope ever designed to be serviced in space by astronauts. There have been five servicing missions, the last occurring in May 2009. Servicing Mission 1 took place in December 1993 when Hubble's imaging flaw was corrected. Servicing missions 2, 3A, and 3B repaired various sub-systems and replaced many of the observing instruments with more modern and capable versions. However, following the 2003 Space Shuttle Columbia accident, the fifth servicing mission was canceled on safety grounds. After spirited public discussion, NASA reconsidered this decision, and administrator Mike Griffin approved one final Hubble servicing mission. STS-125 was launched in May 2009, and installed two new instruments and made numerous repairs. Assuming testing and calibration of the new equipment goes well, the Hubble should resume routine operation in September 2009.
The latest servicing should allow the telescope to function until at least 2014, when its successor, the James Webb Space Telescope (JWST), is due to be launched. The JWST will be far superior to Hubble for many astronomical research programs, but will only observe in infrared, so it will complement (not replace) Hubble's ability to observe in the visible and ultraviolet parts of the spectrum.
Proposals and precursors
In 1923, German scientist Hermann Oberth, considered—along with Robert Goddard and Konstantin Tsiolkovsky—one of the three fathers of modern rocketry, published Die Rakete zu den Planetenräumen ("The Rocket into Planetary Space"), which mentioned how a telescope could be propelled into Earth orbit by a rocket.
The history of the Hubble Space Telescope can be traced back as far as 1946, when the astronomer Lyman Spitzer wrote the paper "Astronomical advantages of an extraterrestrial observatory". In it, he discussed the two main advantages that a space-based observatory would have over ground-based telescopes. First, the angular resolution (the smallest separation at which objects can be clearly distinguished) would be limited only by diffraction, rather than by the turbulence in the atmosphere, which causes stars to twinkle and is known to astronomers as seeing. At that time ground-based telescopes were limited to resolutions of 0.5–1.0 arcseconds, compared to a theoretical diffraction-limited resolution of about 0.05 arcsec for a telescope with a mirror 2.5 m in diameter. Second, a space-based telescope could observe infrared and ultraviolet light, which are strongly absorbed by the atmosphere.
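These figures follow from the standard Rayleigh criterion for the diffraction limit of a circular aperture of diameter D at wavelength λ; taking λ ≈ 500 nm for visible light and D = 2.5 m:

\[
\theta \approx 1.22\,\frac{\lambda}{D} = 1.22 \times \frac{5\times10^{-7}\ \mathrm{m}}{2.5\ \mathrm{m}} \approx 2.4\times10^{-7}\ \mathrm{rad} \approx 0.05\ \text{arcsec},
\]

an order of magnitude sharper than the seeing limit of ground-based telescopes at the time.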
Spitzer devoted much of his career to pushing for a space telescope to be developed. In 1962 a report by the United States National Academy of Sciences recommended the development of a space telescope as part of the space program, and in 1965 Spitzer was appointed as head of a committee given the task of defining the scientific objectives for a large space telescope.
Space-based astronomy had begun on a very small scale following World War II, as scientists made use of developments that had taken place in rocket technology. The first ultraviolet spectrum of the Sun was obtained in 1946, and NASA launched the Orbiting Solar Observatory (OSO) to obtain UV, X-ray, and gamma-ray spectra in 1962. An orbiting solar telescope was launched in 1962 by the United Kingdom as part of the Ariel space program, and in 1966 the National Aeronautics and Space Administration (NASA) launched the first Orbiting Astronomical Observatory (OAO) mission. OAO-1's battery failed after three days, terminating the mission. It was followed by OAO-2, which carried out ultraviolet observations of stars and galaxies from its launch in 1968 until 1972, well beyond its original planned lifetime of one year.
The OSO and OAO missions demonstrated the important role space-based observations could play in astronomy, and 1968 saw the development by NASA of firm plans for a space-based reflecting telescope with a mirror 3 m in diameter, known provisionally as the Large Orbiting Telescope or Large Space Telescope (LST), with a launch slated for 1979. These plans emphasized the need for manned maintenance missions to the telescope to ensure such a costly program had a lengthy working life, and the concurrent development of plans for the reusable space shuttle indicated that the technology to allow this was soon to become available.
Construction and engineering
Once the Space Telescope project had been given the go-ahead, work on the program was divided among many institutions. Marshall Space Flight Center (MSFC) was given responsibility for the design, development, and construction of the telescope, while the Goddard Space Flight Center was given overall control of the scientific instruments and ground-control center for the mission. MSFC commissioned the optics company Perkin-Elmer to design and build the Optical Telescope Assembly (OTA) and Fine Guidance Sensors for the space telescope. Lockheed was commissioned to construct the spacecraft in which the telescope would be housed.
Optical Telescope Assembly (OTA)
Optically, the Hubble is a Cassegrain reflector of Ritchey-Chrétien design, as are most large professional telescopes. This design, with two hyperbolic mirrors, is known for good imaging performance over a wide field of view, with the disadvantage that the mirrors have shapes that are hard to fabricate and test. The mirror and optical systems of the telescope determine the final performance, and they were designed to exacting specifications. Optical telescopes typically have mirrors polished to an accuracy of about a tenth of the wavelength of visible light, but the Space Telescope was to be used for observations into the ultraviolet (shorter wavelengths) and was specified to be diffraction limited to take full advantage of the space environment. Therefore its mirror needed to be polished to an accuracy of 10 nanometres, or about 1/65 of the wavelength of red light.
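The "1/65 of the wavelength of red light" figure is simple arithmetic: taking red light to have a wavelength of roughly 650 nm,

\[
\frac{650\ \mathrm{nm}}{65} = 10\ \mathrm{nm},
\]

which is the specified surface accuracy of the mirror.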
Perkin-Elmer intended to use custom-built and extremely sophisticated computer-controlled polishing machines to grind the mirror to the required shape. However, in case their cutting-edge technology ran into difficulties, NASA demanded that PE sub-contract to Kodak to construct a back-up mirror using traditional mirror-polishing techniques. (The team of Kodak and Itek had also bid on the original mirror-polishing work; their bid called for the two companies to double-check each other's work, which would almost certainly have caught the polishing error that later caused such problems.) The Kodak mirror is now on permanent display at the Smithsonian Institution. An Itek mirror built as part of the effort is now used in the 2.4 m telescope at the Magdalena Ridge Observatory.
Construction of the Perkin-Elmer mirror began in 1979, starting with a blank manufactured by Corning from their ultra-low expansion glass. To keep the mirror's weight to a minimum, it consisted of inch-thick top and bottom plates sandwiching a honeycomb lattice. Perkin-Elmer simulated microgravity by supporting the mirror on both sides with 138 rods that exerted varying amounts of force. This ensured that the mirror's final shape would be correct and to specification when finally deployed. Mirror polishing continued until May 1981. NASA reports at the time questioned Perkin-Elmer's managerial structure, and the polishing began to slip behind schedule and over budget. To save money, NASA halted work on the back-up mirror and put the launch date of the telescope back to October 1984. The mirror was completed by the end of 1981; it was washed using 2,400 gallons of hot, deionized water and then received a reflective coating of aluminium 65 nm thick and a protective coating of magnesium fluoride 25 nm thick.
Doubts continued to be expressed about Perkin-Elmer's competence on a project of this importance as their budget and timescale for producing the rest of the OTA continued to inflate. In response to a schedule described as "unsettled and changing daily", NASA postponed the launch date of the telescope until April 1985. Perkin-Elmer's schedules continued to slip at a rate of about one month per quarter, and at times delays reached one day for each day of work. NASA was forced to postpone the launch date until first March and then September 1986. By this time the total project budget had risen to US$1.175 billion.
Spacecraft systems
The spacecraft in which the telescope and instruments were to be housed was another major engineering challenge. It would have to adequately withstand frequent passages from direct sunlight into the darkness of Earth's shadow, which would generate major changes in temperature, while being stable enough to allow extremely accurate pointing of the telescope. A shroud of multi-layer insulation keeps the temperature within the telescope stable, and surrounds a light aluminum shell in which the telescope and instruments sit. Within the shell, a graphite-epoxy frame keeps the working parts of the telescope firmly aligned. Because graphite composites are hygroscopic, there was a risk that water vapor absorbed by the truss while in Lockheed's clean room would later outgas in the vacuum of space, coating the telescope's instruments with ice. To reduce that risk, a nitrogen gas purge was performed before the telescope was launched into space.
While construction of the spacecraft in which the telescope and instruments would be housed proceeded somewhat more smoothly than the construction of the OTA, Lockheed still experienced some budget and schedule slippage, and by the summer of 1985, construction of the spacecraft was 30% over budget and three months behind schedule. An MSFC report said that Lockheed tended to rely on NASA directions rather than take their own initiative in the construction.
Ground support
The Space Telescope Science Institute (STScI) is responsible for the scientific operation of the telescope and delivery of data products to astronomers. STScI is operated by the Association of Universities for Research in Astronomy (AURA) and is physically located in Baltimore, Maryland on the Homewood campus of Johns Hopkins University, one of the 33 US universities and 7 international affiliates that make up the AURA consortium. STScI was established in 1983 after something of a power struggle between NASA and the scientific community at large. NASA had wanted to keep this function "in-house", but scientists wanted it to be based in an academic establishment. The Space Telescope European Coordinating Facility (ST-ECF), established at Garching bei München near Munich in 1984, provides similar support for European astronomers.
One rather complex task that falls to STScI is scheduling observations for the telescope. Hubble is situated in a low-Earth orbit so that it can be reached by the space shuttle for servicing missions, but this means that most astronomical targets are occulted by the Earth for slightly less than half of each orbit. Observations cannot take place when the telescope passes through the South Atlantic Anomaly, due to elevated radiation levels, and there are also sizable exclusion zones around the Sun (precluding observations of Mercury), the Moon, and the Earth. The solar avoidance angle is about 50°, specified to keep sunlight from illuminating any part of the OTA. Earth and Moon avoidance keeps bright light out of the Fine Guidance Sensors (FGSs) and scattered light out of the instruments. If the FGSs are turned off, however, the Moon and Earth can be observed; Earth observations were used very early in the program to generate flat-fields for the WFPC1 instrument. There is a so-called continuous viewing zone (CVZ), at roughly 90 degrees to the plane of Hubble's orbit, in which targets are not occulted for long periods. Due to the precession of the orbit, the location of the CVZ moves slowly over a period of eight weeks. Because the limb of the Earth is always within about 30° of regions within the CVZ, the brightness of scattered earthshine may be elevated for long periods during CVZ observations.
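The occultation figure follows from simple orbital geometry. The sketch below is a rough estimate only, assuming a circular orbit at about 560 km altitude and a target lying exactly in the orbital plane (both assumptions for illustration, not values from the text above):

```python
import math

R_EARTH_KM = 6371.0   # mean Earth radius
ALTITUDE_KM = 560.0   # approximate Hubble altitude (assumed value)

# Half-angle subtended by the Earth's disc as seen from the telescope.
half_angle = math.asin(R_EARTH_KM / (R_EARTH_KM + ALTITUDE_KM))

# A target in the orbital plane is hidden while the telescope lies within
# that angular range on the far side of the Earth, i.e. for 2*half_angle
# out of the full 2*pi of each orbit.
occulted_fraction = (2 * half_angle) / (2 * math.pi)

print(f"Earth half-angle from orbit: {math.degrees(half_angle):.1f} degrees")
print(f"Worst-case occulted fraction of an orbit: {occulted_fraction:.0%}")
```

This gives roughly 37% for an in-plane target; targets farther from the orbital plane are hidden for less of each orbit, and those near the poles of Hubble's orbit (the continuous viewing zone) are not occulted at all.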
Because Hubble orbits in the upper atmosphere, its orbit changes over time in a way that is not accurately predictable. The density of the upper atmosphere varies according to many factors, and this means that Hubble's predicted position for six weeks' time could be in error by up to 4,000 km. Observation schedules are typically finalized only a few days in advance, as a longer lead time would mean there was a chance that the target would be unobservable by the time it was due to be observed.
Engineering support for the Hubble is provided by NASA and contractor personnel at the Goddard Space Flight Center in Greenbelt, Maryland, 48 km south of the STScI. Hubble's operation is monitored 24 hours per day by four teams of flight controllers who make up Hubble's Flight Operations Team.
Flawed mirror
Within weeks of the launch of the telescope, the images returned showed that there was a serious problem with the optical system. Although the first images appeared to be sharper than ground-based images, the telescope failed to achieve a final sharp focus, and the best image quality obtained was drastically lower than expected. Images of point sources spread out over a radius of more than one arcsecond, instead of having a point spread function (PSF) concentrated within a circle 0.1 arcsec in diameter as had been specified in the design criteria. STScI published graphs illustrating the detailed performance, comparing the mis-figured PSFs with post-correction and ground-based PSFs.
Analysis of the flawed images showed that the cause of the problem was that the primary mirror had been ground to the wrong shape. Although it was probably the most precisely figured mirror ever made, with variations from the prescribed curve of no more than 1/65 of the wavelength of visible light, it was too flat at the edges. The mirror was only about 2.2 micrometres out from the required shape, but the difference was catastrophic, introducing severe spherical aberration, a flaw in which light reflecting off the edge of a mirror focuses on a different point from the light reflecting off its center.
The effect of the mirror flaw on scientific observations depended on the particular observation: the core of the aberrated PSF was sharp enough to permit high-resolution observations of bright objects, and spectroscopy was largely unaffected. However, the loss of light to the large, out-of-focus halo severely reduced the usefulness of the telescope for faint objects or high-contrast imaging. This meant that nearly all of the cosmological programs were essentially impossible, since they required observation of exceptionally faint objects. NASA and the telescope became the butt of many jokes, and the project was popularly regarded as a white elephant. (For instance, in the movie The Naked Gun 2½: The Smell of Fear, the Hubble was pictured with the Titanic, the Hindenburg, and the Edsel.) Nonetheless, during the first three years of the Hubble mission, before the optical corrections, the telescope still carried out a large number of productive observations. The error was well characterized and stable, enabling astronomers to optimize the results obtained using sophisticated image processing techniques such as deconvolution.
Origin of the problem
A commission headed by Lew Allen, director of the Jet Propulsion Laboratory, was established to determine how the error could have arisen. The Allen Commission found that the main null corrector, a device used to measure the exact shape of the mirror, had been incorrectly assembled: one lens was wrongly spaced by 1.3 mm. During the polishing of the mirror, Perkin-Elmer had analyzed its surface with two other null correctors, both of which correctly indicated that the mirror was suffering from spherical aberration. The company ignored these test results as it believed that the two null correctors were less accurate than the primary device that was reporting that the mirror was perfectly figured.
The commission blamed the failings primarily on Perkin-Elmer. Relations between NASA and the optics company had been severely strained during the telescope construction due to frequent schedule slippage and cost overruns. NASA found that Perkin-Elmer did not review or supervise the mirror construction adequately, did not assign its best optical scientists to the project (as it had for the prototype), and in particular did not involve the optical designers in the construction and verification of the mirror. While the commission heavily criticized Perkin-Elmer for these managerial failings, NASA was also criticized for not picking up on the quality control shortcomings such as relying totally on test results from a single instrument.
Design of a solution
The design of the telescope had always incorporated servicing missions, and astronomers immediately began to seek potential solutions to the problem that could be applied at the first servicing mission, scheduled for 1993. While Kodak and Itek had each ground back-up mirrors for Hubble, it would have been impossible to replace the mirror in orbit, and too expensive and time-consuming to bring the telescope temporarily back to Earth for a refit. Instead, the fact that the mirror had been ground so precisely to the wrong shape led to the design of new optical components with exactly the same error but in the opposite sense, to be added to the telescope at the servicing mission, effectively acting as "spectacles" to correct the spherical aberration.
The first step was a precise characterization of the error in the main mirror. Working backwards from images of point sources, astronomers determined that the conic constant of the mirror was −1.01324, instead of the intended −1.00230. The same number was also derived by analyzing the null corrector used by Perkin-Elmer to figure the mirror, as well as by analyzing interferograms obtained during ground testing of the mirror.
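The roughly 2.2 micrometre departure mentioned earlier is consistent with this conic-constant error. Expanding the sag of a conic mirror as z(r) ≈ r²/(2R) + (1+K)r⁴/(8R³) + ..., a conic-constant error ΔK produces a shape error of about ΔK·r⁴/(8R³) at radius r. Taking the primary's semi-diameter as 1.2 m and its radius of curvature as roughly 11 m (an assumed figure, not given in the text above):

\[
\Delta z \approx \frac{\Delta K\, r^{4}}{8R^{3}} \approx \frac{0.011 \times (1.2\ \mathrm{m})^{4}}{8 \times (11\ \mathrm{m})^{3}} \approx 2\ \mu\mathrm{m}
\]

at the edge of the mirror, in line with the measured departure from the intended shape.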
Because of the way the Hubble's instruments were designed, two different sets of correctors were required. The design of the Wide Field and Planetary Camera 2, already planned to replace the existing WF/PC, included relay mirrors to direct light onto the eight separate CCD chips making up its two cameras. An inverse error built into their surfaces could completely cancel the aberration of the primary. However, the other instruments lacked any intermediate surfaces that could be figured in this way, and so required an external correction device.
The system designed to correct the spherical aberration for light focused at the FOC, FOS, and GHRS was called the "Corrective Optics Space Telescope Axial Replacement" (COSTAR) and consisted essentially of two mirrors in the light path, one of which would be figured to correct the aberration. To fit the COSTAR system onto the telescope, one of the other instruments had to be removed, and astronomers selected the High Speed Photometer to be sacrificed.
By 2002 all of the original instruments requiring COSTAR had been replaced by instruments with their own corrective optics, rendering it redundant; COSTAR was removed and returned to Earth in 2009, its space taken by the Cosmic Origins Spectrograph.
Servicing Mission 4
Servicing Mission 4 (SM4), flown as shuttle mission STS-125 in May 2009, was the final servicing mission to the Hubble Space Telescope. It was first planned for October 14, 2008. However, on 27 September 2008, the Science Instrument Command and Data Handling (SI C&DH) unit on HST failed. All science data passes through this unit before it can be transmitted to Earth. Although it has a backup, if the backup were also to fail, Hubble's useful life would be over. Therefore, on 29 September 2008, NASA announced that the launch of SM4 would be postponed until 2009 so this unit could be replaced as well. SM4, carrying a replacement SI C&DH unit, was launched aboard Space Shuttle Atlantis on May 11, 2009.
During SM4, astronauts installed two new instruments over the course of five spacewalks: Wide Field Camera 3 (WFC3) and the Cosmic Origins Spectrograph (COS). WFC3 will increase Hubble's observational capabilities in the ultraviolet and visible spectral ranges by up to 35 times, thanks to its higher sensitivity and wider field of view. The telephone-booth-sized COS assembly replaced the Corrective Optics Space Telescope Axial Replacement (COSTAR), which had been installed in 1993 to correct Hubble's spherical aberration; COSTAR was no longer needed once the last instruments lacking built-in correction had been replaced. COS will make observations in the ultraviolet parts of the spectrum, complementing the measurements made by the repaired STIS. The servicing mission also repaired two instruments that had failed, the Advanced Camera for Surveys (ACS) and the Space Telescope Imaging Spectrograph (STIS). The astronauts replaced other components as well, including: all three Rate Sensor Units (each containing two gas-bearing gyroscopes); one of the three Fine Guidance Sensor (FGS) units used to maintain pointing accuracy and platform stability; the SI C&DH unit; all six of the 125-pound (57 kg) nickel-hydrogen batteries that provide Hubble's electrical power during the night portion of its orbit; and three New Outer Blanket Layer (NOBL) thermal insulation blankets. The batteries had never been replaced and were more than 13 years over their original design life.[71] Assuming testing and calibration of the new equipment goes well, the Hubble should resume routine operation in September 2009. These efforts should keep the telescope fully functioning at least into 2014 and hopefully longer.
Hubble was originally designed to be returned to Earth on board a shuttle. With the retirement of the shuttle fleet, this will no longer be possible. NASA engineers therefore developed the Soft Capture and Rendezvous System (SCRS), a ring-like device attached to Hubble's aft bulkhead that will enable a future crewed or robotic mission to rendezvous with, capture, and safely dispose of the telescope. Atlantis released the Hubble Space Telescope back into space on May 19, 2009, after all repairs had been successfully made. The final mission to Hubble will be to deorbit it at the end of its service life.
Important discoveries
Hubble has helped to resolve some long-standing problems in astronomy, as well as turning up results that have required new theories to explain them. Among its primary mission goals was to measure distances to Cepheid variable stars more accurately than ever before, and thus to constrain the value of the Hubble constant, the measure of the rate at which the universe is expanding, which is also related to its age. Before the launch of HST, estimates of the Hubble constant typically had errors of up to 50%, but Hubble measurements of Cepheid variables in the Virgo Cluster and other distant galaxy clusters provided a measured value with an accuracy of 10%, which is consistent with other more accurate measurements made since Hubble's launch using other techniques.
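The connection between the Hubble constant and the age of the universe can be seen from the "Hubble time", the reciprocal of H0. Using the value of roughly 72 km/s per megaparsec from the HST Key Project:

\[
t_{H} = \frac{1}{H_{0}} \approx \frac{3.09\times10^{19}\ \mathrm{km\,Mpc^{-1}}}{72\ \mathrm{km\,s^{-1}\,Mpc^{-1}}} \approx 4.3\times10^{17}\ \mathrm{s} \approx 13.6\ \text{billion years},
\]

close to current estimates of the universe's age, although the true age also depends on how the expansion rate has changed over cosmic history.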
While Hubble helped to refine estimates of the age of the universe, it also cast doubt on theories about its future. Astronomers from the High-z Supernova Search Team and the Supernova Cosmology Project used the telescope to observe distant supernovae and uncovered evidence that, far from decelerating under the influence of gravity, the expansion of the universe may in fact be accelerating. This acceleration was later measured more accurately by other ground-based and space-based telescopes that confirmed Hubble's finding, but the cause of this acceleration is currently very poorly understood.
The high-resolution spectra and images provided by the Hubble have been especially well-suited to establishing the prevalence of black holes in the nuclei of nearby galaxies. While it had been hypothesized in the early 1960s that black holes would be found at the centers of some galaxies, and work in the 1980s identified a number of good black hole candidates, it fell to work conducted with the Hubble to show that black holes are probably common to the centers of all galaxies.The Hubble programs further established that the masses of the nuclear black holes and properties of the galaxies are closely related. The legacy of the Hubble programs on black holes in galaxies is thus to demonstrate a deep connection between galaxies and their central black holes.
The collision of Comet Shoemaker-Levy 9 with Jupiter in 1994 was fortuitously timed for astronomers, coming just a few months after Servicing Mission 1 had restored Hubble's optical performance. Hubble images of the planet were sharper than any taken since the passage of Voyager 2 in 1979, and were crucial in studying the dynamics of the collision of a comet with Jupiter, an event believed to occur once every few centuries.
Other major discoveries made using Hubble data include proto-planetary disks (proplyds) in the Orion Nebula; evidence for the presence of extrasolar planets around sun-like stars; and the optical counterparts of the still-mysterious gamma ray bursts.[84] HST has also been used to study objects in the outer reaches of the Solar System, including the dwarf planets Pluto and Eris.
A unique legacy of Hubble is the pair of Hubble Deep Field and Hubble Ultra Deep Field images, which utilized Hubble's unmatched sensitivity at visible wavelengths to create images of small patches of sky that are the deepest ever obtained at optical wavelengths. The images reveal galaxies billions of light years away, and have generated a wealth of scientific papers, providing a new window on the early Universe.
The non-standard object SCP 06F6 was discovered by the Hubble Space Telescope (HST) in February 2006.
Impact on astronomy
Many objective measures show the positive impact of Hubble data on astronomy. Over 4,000 papers based on Hubble data have been published in peer-reviewed journals, and countless more have appeared in conference proceedings. Looking at papers several years after their publication, about one-third of all astronomy papers have no citations, while only 2% of papers based on Hubble data have no citations. On average, a paper based on Hubble data receives about twice as many citations as papers based on non-Hubble data. Of the 200 papers published each year that receive the most citations, about 10% are based on Hubble data.
Although the HST has clearly had a significant impact on astronomical research, the financial cost of this impact has been large. A study on the relative impacts on astronomy of different sizes of telescopes found that while papers based on HST data generate 15 times as many citations as a 4 m ground-based telescope such as the William Herschel Telescope, the HST costs about 100 times as much to build and maintain.
Making the decision between investing in ground-based versus space-based telescopes in the future is complex. Even before Hubble was launched, specialized ground-based techniques such as aperture masking interferometry had obtained higher-resolution optical and infrared images than Hubble would achieve, though restricted to targets about 10^8 (100 million) times brighter than the faintest targets observed by Hubble. Since then, advances in adaptive optics have extended the high-resolution imaging capabilities of ground-based telescopes to the infrared imaging of faint objects. The usefulness of adaptive optics versus HST observations depends strongly on the particular details of the research questions being asked. In the visible bands, adaptive optics can only correct a relatively small field of view, whereas HST can conduct high-resolution optical imaging over a wide field. Only a small fraction of astronomical objects are accessible to high-resolution ground-based imaging; in contrast, Hubble can perform high-resolution observations of any part of the night sky, and on objects that are extremely faint.
Usage
Anyone can apply for time on the telescope; there are no restrictions on nationality or academic affiliation. Competition for time on the telescope is intense, and the ratio of time requested to time available (the oversubscription ratio) typically ranges between 6 and 9.
Calls for proposals are issued roughly annually, with time allocated for a cycle lasting approximately one year. Proposals are divided into several categories; 'general observer' proposals are the most common, covering routine observations. 'Snapshot observations' are those in which targets require only 45 minutes or less of telescope time, including overheads such as acquiring the target; snapshot observations are used to fill in gaps in the telescope schedule that cannot be filled by regular GO programs.
Astronomers may make 'Target of Opportunity' proposals, in which observations are scheduled if a transient event covered by the proposal occurs during the scheduling cycle. In addition, up to 10% of the telescope time is designated Director's Discretionary (DD) Time. Astronomers can apply to use DD time at any time of year, and it is typically awarded for study of unexpected transient phenomena such as supernovae. Other uses of DD time have included the observations that led to the production of the Hubble Deep Field and Hubble Ultra Deep Field, and in the first four cycles of telescope time, observations carried out by amateur astronomers.
Black hole
In general relativity, a black hole is a region of space in which the gravitational field is so powerful that nothing, including light, can escape its pull. The black hole has a one-way surface, called an event horizon, into which objects can fall, but out of which nothing can come. It is called "black" because it absorbs all the light that hits it, reflecting nothing, just like a perfect blackbody in thermodynamics. Quantum analysis shows that black holes nonetheless possess a temperature and emit Hawking radiation.
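The temperature in question is the Hawking temperature, which for a non-rotating black hole is inversely proportional to its mass:

\[
T_{H} = \frac{\hbar c^{3}}{8\pi G M k_{B}} \approx 6\times10^{-8}\ \mathrm{K} \times \frac{M_{\odot}}{M},
\]

so for stellar-mass and larger black holes the radiation is far colder than the cosmic microwave background and has never been directly observed.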
Despite its invisible interior, a black hole can reveal its presence through interaction with other matter. A black hole can be inferred by tracking the movement of a group of stars that orbit a region in space which looks empty. Alternatively, one can see gas falling into a relatively small black hole, from a companion star. This gas spirals inward, heating up to very high temperature and emitting large amounts of radiation that can be detected from earthbound and earth-orbiting telescopes. Such observations have resulted in the scientific consensus that, barring a breakdown in our understanding of nature, black holes do exist in our universe.
History
The idea of a body so massive that even light could not escape was put forward by the geologist John Michell in a letter written in 1783 to Henry Cavendish of the Royal Society:
If the semi-diameter of a sphere of the same density as the Sun were to exceed that of the Sun in the proportion of 500 to 1, a body falling from an infinite height towards it would have acquired at its surface greater velocity than that of light, and consequently supposing light to be attracted by the same force in proportion to its vis inertiae, with other bodies, all light emitted from such a body would be made to return towards it by its own proper gravity.
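In modern terms, Michell's argument is an escape-velocity calculation: for a sphere of fixed density, the mass grows as the cube of the radius, so the escape velocity v = √(2GM/R) grows in direct proportion to the radius. Since the escape velocity at the Sun's surface is about 618 km/s, a body of solar density but 500 times the Sun's radius would have

\[
v_{\mathrm{esc}} \approx 500 \times 618\ \mathrm{km/s} \approx 3.1\times10^{5}\ \mathrm{km/s} \approx c,
\]

which is why Michell chose the ratio of 500 to 1.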
In 1796, mathematician Pierre-Simon Laplace promoted the same idea in the first and second editions of his book Exposition du système du Monde (it was removed from later editions). Such "dark stars" were largely ignored in the nineteenth century, since light was then thought to be a massless wave and therefore not influenced by gravity. Unlike the modern black hole concept, the object behind the horizon was assumed to be stable against collapse.
In 1915, Albert Einstein developed his general theory of relativity, having earlier shown that gravity does in fact influence light's motion. A few months later, Karl Schwarzschild gave the solution for the gravitational field of a point mass and a spherical mass, showing that a black hole could theoretically exist. The Schwarzschild radius is now known to be the radius of the event horizon of a non-rotating black hole, but this was not well understood at the time; Schwarzschild himself, for example, thought the solution was not physical. Johannes Droste, a student of Hendrik Lorentz, independently gave the same solution for the point mass a few months after Schwarzschild and wrote more extensively about its properties.
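For a non-rotating mass M, the Schwarzschild radius is

\[
r_{s} = \frac{2GM}{c^{2}} \approx 3\ \mathrm{km} \times \frac{M}{M_{\odot}},
\]

so, for example, the Sun would have to be compressed to a radius of about 3 km to lie within its own event horizon.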
In 1930, astrophysicist Subrahmanyan Chandrasekhar calculated using general relativity that a non-rotating body of electron-degenerate matter above 1.44 solar masses (the Chandrasekhar limit) would collapse. His arguments were opposed by Arthur Eddington, who believed that something would inevitably stop the collapse. Eddington was partly correct: a white dwarf slightly more massive than the Chandrasekhar limit will collapse into a neutron star. But in 1939, Robert Oppenheimer and others predicted that stars above approximately three solar masses (the Tolman-Oppenheimer-Volkoff limit) would collapse into black holes for the reasons presented by Chandrasekhar.
Oppenheimer and his co-authors used Schwarzschild's system of coordinates (the only coordinates available in 1939), which produced mathematical singularities at the Schwarzschild radius; in other words, some of the terms in the equations became infinite at the Schwarzschild radius. This was interpreted as indicating that the Schwarzschild radius was the boundary of a bubble in which time stopped. This is a valid point of view for external observers, but not for infalling observers.
Because of this property, the collapsed stars were briefly known as "frozen stars": an outside observer would see the surface of the star frozen in time at the instant its collapse took it inside the Schwarzschild radius. This is a genuine property of black holes, but it must be emphasized that the light from the surface of the frozen star becomes redshifted very quickly, so the object turns black almost immediately. Many physicists could not accept the idea of time standing still at the Schwarzschild radius, and there was little interest in the subject for over 20 years.
In 1958, David Finkelstein introduced the concept of the event horizon by presenting Eddington-Finkelstein coordinates, which enabled him to show that "The Schwarzschild surface r = 2 m is not a singularity, but that it acts as a perfect unidirectional membrane: causal influences can cross it in only one direction". This did not strictly contradict Oppenheimer's results, but extended them to include the point of view of infalling observers. All theories up to this point, including Finkelstein's, covered only non-rotating black holes.
In 1963, Roy Kerr found the exact solution for a rotating black hole. The rotating singularity of this solution was a ring, and not a point. A short while later, Roger Penrose was able to prove that singularities occur inside any black hole.
In 1967, astronomers discovered pulsars and within a few years could show that the known pulsars were rapidly rotating neutron stars. Until that time, neutron stars were also regarded as just theoretical curiosities. So the discovery of pulsars awakened interest in all types of ultra-dense objects that might be formed by gravitational collapse.
Physicist John Wheeler is widely credited with coining the term black hole in his 1967 public lecture Our Universe: the Known and Unknown, as an alternative to the more cumbersome "gravitationally completely collapsed star." However, Wheeler insisted that someone else at the conference had coined the term and he had merely adopted it as useful shorthand. The term was also cited in a 1964 letter by Anne Ewing to the AAAS:
According to Einstein’s general theory of relativity, as mass is added to a degenerate star a sudden collapse will take place and the intense gravitational field of the star will close in on itself. Such a star then forms a "black hole" in the universe.