Peter Lobner, updated 7 April 2020 & 19 January 2024
The first image of a black hole was released on 10 April 2019 at a press conference in Washington D.C. held by the Event Horizon Telescope (EHT) team and the National Science Foundation (NSF). The subject of the image is the supermassive black hole known as M87* located near the center of the Messier 87 (M87) galaxy. This black hole is about 55 million light years from Earth and is estimated to have a mass 6.5 billion times greater than our Sun. The image shows a glowing circular emission ring surrounding the dark region (shadow) containing the black hole. The brightest part of the image may also have captured a bright relativistic jet of plasma that appears to be streaming away from the black hole at nearly the speed of light, beaming generally in the direction of Earth.
The EHT is not one physical telescope. Rather, it is an array of millimeter and sub-millimeter wavelength radio telescopes located around the world. The following map shows the eight telescopes that participated in the 2017 observations of M87. Three additional telescopes joined the EHT array in 2018 and later.
All of the EHT telescopes are used on a non-dedicated basis by an EHT team of more than 200 researchers during a limited annual observing cycle. The image of the M87* black hole was created from observations made during a one week period in April 2017.
The long baselines between the individual radio telescopes give the “synthetic” EHT the resolving power of a physical radio telescope with a diameter approximately equal to the diameter of the Earth. A technique called very long-baseline interferometry (VLBI) is used to combine the data from the individual telescopes to synthesize the image of a black hole. EHT Director Shep Doeleman referred to VLBI as “the ultimate in delayed gratification among astronomers.” The magnifying power of the EHT becomes real only when the data from all of the telescopes are brought together and properly combined and processed. This takes time.
At a nominal operating wavelength of about 1.3 mm (frequency of 230 GHz), EHT angular resolution is about 25 microarcseconds (μas), which is sufficient to resolve nearby supermassive black hole candidates on scales that correspond to their event horizons. The EHT team reports that the M87* bright emission disk subtends an angle of 42 ± 3 microarcseconds.
For comparison, the resolution of a human eye in visible light is about 60 arcseconds (1/60th of a degree; there are 3,600 arcseconds in one degree) and the 2.4-meter diameter Hubble Space Telescope has a resolution of about 0.05 arcseconds (50,000 microarcseconds).
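These resolution figures follow from the diffraction limit, θ ≈ 1.22 λ/D. The short Python sketch below reproduces them; the 550 nm visible wavelength assumed for Hubble and the exact Earth-diameter baseline used for the EHT are illustrative values, not figures from the EHT papers.

```python
import math

ARCSEC_PER_RAD = math.degrees(1) * 3600   # ~206,265 arcseconds per radian

def diffraction_limit_uas(wavelength_m, aperture_m):
    """Rayleigh criterion theta ~ 1.22 * lambda / D, in microarcseconds."""
    return 1.22 * wavelength_m / aperture_m * ARCSEC_PER_RAD * 1e6

# EHT: 1.3 mm observing wavelength, baseline ~ Earth's diameter (12,742 km)
eht = diffraction_limit_uas(1.3e-3, 1.2742e7)
# Hubble: ~550 nm visible light (assumed), 2.4 m mirror
hubble = diffraction_limit_uas(550e-9, 2.4)

print(f"EHT:    ~{eht:.0f} microarcseconds")
print(f"Hubble: ~{hubble / 1e6:.3f} arcseconds")
```

Both results land close to the values quoted above, about 25 microarcseconds for the EHT and about 0.05 arcseconds for Hubble.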
You can read five open access papers on the first M87* Event Horizon Telescope results written by the EHT team and published on 10 April 2019 in the Astrophysical Journal Letters here:
Congratulations to the EHT Collaboration for their extraordinary success in creating the first-ever image of a black hole shadow.
7 April 2020 Update: EHT observations were complemented by multi-spectral (multi-messenger) observations by NASA spacecraft
On 10 April 2019, NASA reported on its use of several orbiting spacecraft to observe M87 in different wavelengths during the period of the EHT observation.
“To complement the EHT findings, several NASA spacecraft were part of a large effort, coordinated by the EHT’s Multiwavelength Working Group, to observe the black hole using different wavelengths of light. As part of this effort, NASA’s Chandra X-ray Observatory, Nuclear Spectroscopic Telescope Array (NuSTAR) and Neil Gehrels Swift Observatory space telescope missions, all attuned to different varieties of X-ray light, turned their gaze to the M87* black hole around the same time as the EHT in April 2017. NASA’s Fermi Gamma-ray Space Telescope was also watching for changes in gamma-ray light from M87* during the EHT observations.”
“NASA space telescopes have previously studied a jet extending more than 1,000 light-years away from the center of M87*. The jet is made of particles traveling near the speed of light, shooting out at high energies from close to the event horizon. The EHT was designed in part to study the origin of this jet and others like it.”
NASA’s Neutron star Interior Composition Explorer (NICER) experiment on the International Space Station also contributed to the multi-spectral observations of M87*, which were coordinated by EHT’s Multiwavelength Working Group.
On April 25, 2019, NASA released the following composite image showing the M87 galaxy, the position of the M87* black hole and large relativistic jets of matter being ejected from the black hole. These infrared images were made by NASA’s orbiting Spitzer Space Telescope.
19 January 2024 Update: Results of the second M87* black hole EHT observation campaign
The original image of the M87* black hole released in April 2019 was derived from data collected during the April 2017 EHT observation campaign. In January 2024, the EHT Collaboration published the results of a second M87* black hole observation campaign, which took place in April 2018 with an improved global EHT array, wider frequency coverage, and increased bandwidth. This paper shows that the M87* black hole has maintained a similar size in the two images and that the brightest part of the ring surrounding the black hole has rotated about 30 degrees.
Original M87* black hole image (left) & an image from data collected one year later (right). Source: EHT Collaboration via Astronomy & Astrophysics (Jan 2024)
The EHT Collaboration concluded, “The perennial persistence of the ring and its diameter robustly support the interpretation that the ring is formed by lensed emission surrounding a Kerr black hole with a mass ∼6.5 × 10⁹ M⊙ (mass of the Sun). The significant change in the ring brightness asymmetry implies a spin axis that is more consistent with the position angle of the large-scale jet.”
For more information:
See the following sources for more information on the EHT and imaging the M87* black hole:
In my 6 August 2016 post, “Lunar Lander XCHALLENGE and Lunar XPrize are Paving the way for Commercial Lunar Missions,” I reported on the status of the Google Lunar XPrize, which was created in 2007 to “incentivize space entrepreneurs to create a new era of affordable access to the Moon and beyond,” and actually deliver payloads to the Moon. In addition, the lunar payloads were tasked with moving 500 meters (1,640 feet) after landing and transmitting high-definition photos and video back to Earth. Any additional science data would be a plus. In January 2018, after concluding that none of the remaining competitors could meet the extended 31 March 2018 deadline for landing on the Moon, the Google Lunar XPrize competition was cancelled, with the $30M in prizes remaining unclaimed. You can read this post here:
One of the competing Lunar XPrize teams was SpaceIL from Israel, which was developing a small lunar spacecraft named Beresheet (originally named Sparrow) that was designed to hitch a ride into an elliptical Earth orbit as a secondary payload on a SpaceX Falcon 9 commercial launch vehicle, then transfer itself to a lunar orbit and finally land on the Moon.
The SpaceIL lunar landing program continued after cancellation of the Lunar XPrize competition. You’ll find details on the SpaceIL lunar program here:
As completed by SpaceIL and Israel Aerospace Industries (IAI), the Beresheet spacecraft has a launch mass of 600 kg (1,323 pounds) and a landing mass of about 180 kg (397 pounds). The lander carries imagers, a magnetometer, a laser retro-reflector array (LRA) provided by the U.S. National Aeronautics and Space Administration (NASA), and a time capsule of cultural and historical Israeli artifacts.
After landing on the Moon, the Beresheet spacecraft electronic systems are expected to remain operational for only a few days. The original Lunar XPrize plan to demonstrate mobility and move the spacecraft after landing on the Moon has been dropped. The laser retro-reflectors will enable the spacecraft to serve as a fixed geographic reference point on the lunar surface long after the mission ends. While not designed for a long lunar surface mission, Beresheet is intended to demonstrate advances in technology that enable low-cost, privately-funded missions to another body in the solar system. Beresheet was developed and constructed for about $100 million. You’ll find more information on the Beresheet spacecraft here:
Beresheet was launched from Cape Canaveral, FL on 21 Feb 2019 into an initial elliptical Geosynchronous Transfer Orbit (GTO) that was dictated by the requirements for the Falcon 9 booster’s primary payload. Once in GTO, Beresheet used its small rocket engine to gradually raise its orbit to a 400,000 km (248,548 mile) apogee to intersect the Moon’s circular orbit, and phase its orbit so the spacecraft passed close to the Moon and could maneuver into a transfer orbit and be captured by the Moon’s gravity. This mission profile is illustrated below.
You can watch a short video with an animation of this mission profile here:
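The orbit-raising strategy can be illustrated with the vis-viva equation. The sketch below estimates the total perigee delta-v needed to push the apogee from a standard GTO out to 400,000 km; the 200 km perigee altitude and geostationary-radius GTO apogee are illustrative assumptions, not figures from the mission.

```python
import math

MU_EARTH = 3.986e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.378e6     # Earth's radius, m

def perigee_speed(r_perigee, r_apogee):
    """Vis-viva speed at perigee of an elliptical orbit (inputs in m, output m/s)."""
    a = (r_perigee + r_apogee) / 2   # semi-major axis
    return math.sqrt(MU_EARTH * (2 / r_perigee - 1 / a))

rp = R_EARTH + 200e3   # assumed 200 km perigee altitude
ra_gto = 42.164e6      # GTO apogee ~ geostationary radius
ra_target = 400e6      # ~400,000 km apogee to intersect the Moon's orbit

dv = perigee_speed(rp, ra_target) - perigee_speed(rp, ra_gto)
print(f"approximate total perigee delta-v: {dv:.0f} m/s")
```

The result is on the order of 700 m/s. In practice, Beresheet split this into a series of smaller perigee burns over multiple orbits, which is why the transfer took weeks rather than days.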
On 4 April, SpaceIL tweeted: “Critical lunar orbit capture took place successfully. #Beresheet is now entering an elliptical course around the #moon, as we get closer to the historical landing #11.4″
After circularizing its lunar orbit, Beresheet is scheduled to land on the Moon on 11 April 2019. NASA is providing communications support during the mission.
On 28 March, X Prize founder and Executive Chairman Peter Diamandis announced that, if the lunar landing is successful, the Foundation would award a $1 million “Moonshot Award” to Beresheet’s builders. Diamandis noted, “SpaceIL’s mission represents the democratization of space exploration.”
Best wishes to the SpaceIL team for a successful lunar landing. If successful, Israel will become the 4th nation, after Russia (Soviet Union), the USA and China, to land a spacecraft on the Moon.
Update 12 April 2019: Beresheet spacecraft crashed during Moon landing attempt
The Beresheet spacecraft successfully initiated its descent from lunar orbit on 11 April 2019. Initial telemetry indicated that the landing profile was proceeding as planned.
Communication with the spacecraft was lost when Beresheet was about 489 feet (149 meters) above the moon’s surface. Opher Doron, the general manager of IAI, reported during the live broadcast, “We had a failure in the spacecraft; we unfortunately have not managed to land successfully.”
X Prize founder and Executive Chairman Peter Diamandis announced that SpaceIL and IAI will receive the $1 million Moonshot Award despite failing to make the planned soft landing on the Moon.
Update 14 May 2019: Preliminary failure analysis
On 17 April 2019, SpaceIL announced that its preliminary failure analysis indicated that a software command uploaded to restart a failed inertial measuring unit (IMU) may have started a sequence of events that ultimately shut down the main engines prematurely during the landing attempt, resulting in the crash of the Beresheet spacecraft.
Morris Kahn, SpaceIL’s primary source of funding, pledged that the team will try again for a Moon landing with a new spacecraft dubbed “Beresheet 2.0,” which will incorporate lessons learned from the first lunar landing attempt.
For more information on the Beresheet mission, see The Planetary Society mission report at the following link:
On 3 April 2019, Verizon reported that it turned on its 5G networks in parts of Chicago and Minneapolis, becoming the first wireless carrier to deliver 5G service to U.S. customers with compatible wireless devices in selected urban areas. In its initial 5G service, Verizon is offering an average data rate of 450 Mbps (Megabits per second), with plans to achieve higher speeds as the network rollout continues and service matures. Much of the 5G hype has been on delivering data rates at or above 1 Gbps (Gigabits per second = 1,000 Megabits per second).
In comparison, Verizon reports that it currently delivers 4G LTE service in 500 markets. This service is “able to handle download speeds between 5 and 12 Mbps … and upload speeds between 2 and 5 Mbps, with peak download speeds approaching 50 Mbps.” Clearly, even Verizon’s initial 5G data rate is a big improvement over 4G LTE.
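To put those rates in perspective, here is a simple transfer-time comparison for a hypothetical 2 GB download (ignoring protocol overhead and real-world network variability):

```python
def download_seconds(file_gigabytes, rate_mbps):
    """Transfer time for a file, ignoring overhead (1 GB = 8e9 bits, decimal)."""
    return file_gigabytes * 8e9 / (rate_mbps * 1e6)

file_gb = 2.0  # a hypothetical ~2 GB HD movie
for label, rate_mbps in [("4G LTE typical (12 Mbps)", 12),
                         ("4G LTE peak (50 Mbps)", 50),
                         ("Verizon initial 5G (450 Mbps)", 450)]:
    print(f"{label}: {download_seconds(file_gb, rate_mbps):.0f} s")
```

At typical 4G LTE rates the download takes over 20 minutes; at Verizon’s initial 5G rate it takes well under a minute.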
At the present time, only one mobile phone works with Verizon’s initial 5G service: the Moto Z3 with an attachment called the 5G Moto Mod. It is anticipated that Samsung’s Galaxy S10 5G smartphone will be the first all-new 5G mobile phone to hit the market, likely later this spring. You’ll find details on this phone here:
Other U.S. wireless carriers, including AT&T, Sprint and T-Mobile US, have announced that they plan to start delivering 5G service later in 2019.
5G technology standards
Wireless carriers and suppliers with a stake in 5G are engaged in the processes for developing international standards. However, with no firm 5G technology standard truly in place at this time, the market is still figuring out what 5G features and functionalities will be offered, how they will be delivered, and when they will be ready for commercial introduction. The range of 5G functionalities being developed is shown in the following ITU diagram.
Verizon’s initial 5G mobile phone promotion is focusing on data speed and low latency.
The primary 5G standards bodies involved in developing the international standards are the 3rd Generation Partnership Project (3GPP), the Internet Engineering Task Force (IETF), and the International Telecommunication Union (ITU). A key international standard, 5G/IMT-2020, is expected to be issued in (as you might expect) 2020.
You’ll find a good description of 5G technology by ITU in a February 2018 presentation, “Key features and requirements of 5G/IMT-2020 networks,” which you will find at the following link:
In my 6 June 2016 post, I reported on SC2, which eventually could benefit 5G service by:
“…developing a new wireless paradigm of collaborative, local, real-time decision-making where radio networks will autonomously collaborate and reason about how to share the RF (radio frequency) spectrum.”
SC2 is continuing into 2019. Fourteen teams have qualified for Phase 3 of the competition, which will culminate in the Spectrum Collaboration Challenge Championship Event, which will be held on 23 October 2019 in conjunction with the 2019 Mobile World Congress in Los Angeles, CA. You can follow SC2 news here:
Since late August 2017, the US LIGO Observatories in Washington and Louisiana and the European Gravitational Observatory (EGO), Virgo, in Italy, have been off-line for updating and testing. These gravitational wave observatories were set to start Observing Run 3 (O3) on 1 April 2019 and conduct continuous observations for one year. All three of these gravitational wave observatories have improved sensitivities and are capable of “seeing” a larger volume of the universe than in Observing Run 2 (O2).
Later in 2019, the Japanese gravitational wave observatory, KAGRA, is expected to come online for the first time and join O3. By 2024, a new gravitational wave observatory in India is expected to join the worldwide network.
On the advent of this next gravitational wave detection cycle, here is a brief summary of the status of worldwide gravitational wave observatories.
Advanced LIGO
The following upgrades were implemented at the two LIGO observatories since Observing Run 2 (O2) concluded in 2017:
Laser power has been doubled, increasing the detectors’ sensitivity to gravitational waves.
Upgrades were made to LIGO’s mirrors at both locations, with five of eight mirrors being swapped out for better-performing versions.
Upgrades have been implemented to reduce levels of quantum noise. Quantum noise arises from random fluctuations of photons, which introduce measurement uncertainty that can mask faint gravitational wave signals. By employing a technique called quantum “squeezing” (vacuum squeezing), researchers can shift the uncertainty in the laser light photons around, making their amplitudes less certain and their phases, or timing, more certain. The timing of photons is what is crucial for LIGO’s ability to detect gravitational waves. This technique initially was developed for gravitational wave detectors at the Australian National University, and has been matured and routinely used since 2010 at the GEO600 gravitational wave detector in Hannover, Germany.
In comparison to its capabilities in 2017 during O2, the twin LIGO detectors have a combined increase in sensitivity of about 40%, more than doubling the volume of the observable universe.
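The arithmetic behind “more than doubling” is straightforward: detection range for a given source scales linearly with strain sensitivity, and the surveyed volume scales with the cube of the range.

```python
sensitivity_gain = 1.40        # ~40% combined increase in strain sensitivity
range_gain = sensitivity_gain  # detection range scales linearly with sensitivity
volume_gain = range_gain ** 3  # surveyed volume scales with range cubed
print(f"volume increase: x{volume_gain:.2f}")
```

A 40% sensitivity gain thus yields roughly a factor of 2.7 in observable volume.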
You’ll find more news and information on the LIGO website at the following link:
GEO600 is a modest-size laser interferometric gravitational wave detector (600 meter / 1,969 foot arms) located near Hannover, Germany. It was designed and is operated by the Max Planck Institute for Gravitational Physics, along with partners in the United Kingdom.
In mid-2010, GEO600 became the first gravitational wave detector to employ quantum “squeezing” (vacuum squeezing) and has since been testing it under operating conditions using two lasers: its standard laser, and a “squeezed-light” laser that just adds a few entangled photons per second but significantly improves the sensitivity of GEO600. In a May 2013 paper entitled, “First Long-Term Application of Squeezed States of Light in a Gravitational Wave Observatory,” researchers reported the following results of operational tests in 2011 and 2012.
“During this time, squeezed vacuum was applied for 90.2% (205.2 days total) of the time that science-quality data were acquired with GEO600. A sensitivity increase from squeezed vacuum application was observed broadband above 400 Hz. The time average of gain in sensitivity was 26% (2.0 dB), determined in the frequency band from 3.7 to 4.0 kHz. This corresponds to a factor of 2 increase in the observed volume of the Universe for sources in the kHz region (e.g., supernovae, magnetars).”
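The quoted figures are internally consistent: strain sensitivity is an amplitude quantity, so a 2.0 dB gain corresponds to a ratio of 10^(dB/20), and cubing that ratio gives the quoted factor-of-2 volume increase.

```python
gain_db = 2.0                            # reported squeezing gain
amplitude_gain = 10 ** (gain_db / 20)    # strain is an amplitude quantity
volume_gain = amplitude_gain ** 3        # volume scales with range cubed
print(f"sensitivity gain: {(amplitude_gain - 1) * 100:.0f}%")
print(f"volume gain:      x{volume_gain:.1f}")
```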
While GEO600 has conducted observations in coordination with LIGO and Virgo, GEO600 has not reported detecting gravitational waves. At high frequencies GEO600 sensitivity is limited by the available laser power. At the low frequency end, the sensitivity is limited by seismic ground motion.
You’ll find more information on GEO600 at the following link:
Advanced Virgo, the European Gravitational Observatory (EGO)
At Virgo, the following upgrades were implemented since Observing Run 2 (O2) concluded in 2017:
The steel wires used during the O2 observation campaign to suspend the four main mirrors of the interferometer have been replaced. The 42 kg (92.6 pounds) mirrors now are suspended with thin fused-silica (glass) fibers, which are expected to increase the sensitivity in the low-medium frequency region. The mirrors in Advanced LIGO have been suspended by similar fused-silica fibers since those two observatories went online in 2015.
A more powerful laser source has been installed, which should improve sensitivity at high frequencies.
Quantum “squeezing” has been implemented in collaboration with the Albert Einstein Institute in Hannover, Germany. This should improve the sensitivity at high frequencies.
In comparison to its capabilities in 2017 during O2, Virgo sensitivity has been improved by a factor of about 2, increasing the volume of the observable universe by a factor of about 8.
You’ll find more information on Virgo at the following link:
KAGRA is a cryogenically-cooled laser interferometer gravitational wave detector that is sited in a deep underground cavern in Kamioka, Japan. This gravitational wave observatory is being developed by the Institute for Cosmic Ray Research (ICRR) of the University of Tokyo. The project website is at the following link:
The cryogenic mirror cooling system is intended to cool the mirror surfaces to about 20 kelvin (–253 °C) to minimize the motion of molecules (jitter) on the mirror surface and improve measurement sensitivity. KAGRA’s deep underground site is expected to be “quieter” than the LIGO and Virgo sites, which are on the surface and have experienced effects from nearby vehicles, weather and some animals.
The focus of work in 2018 was on pre-operational testing and commissioning of various systems and equipment at the KAGRA observatory. In December 2018, the KAGRA Scientific Congress reported that, “If our schedule is kept, we expect to join (LIGO and VIRGO in) the latter half of O3…” You can follow the latest news from the KAGRA team here:
IndIGO, the Indian Initiative in Gravitational-wave Observations, describes itself as an initiative to set up advanced experimental facilities, with appropriate theoretical and computational support, for a multi-institutional Indian national project in gravitational wave astronomy. The IndIGO website provides a good overview of the status of efforts to deploy a gravitational wave detector in India. Here’s the link:
On 22 January 2019, T. V. Padma reported on the Nature website that India’s government had given “in-principle” approval for a LIGO gravitational wave observatory to be built in the western India state of Maharashtra.
“India’s Department of Atomic Energy and its Department of Science and Technology signed a memorandum of understanding with the US National Science Foundation for the LIGO project in March 2016. Under the agreement, the LIGO Laboratory — which is operated by the California Institute of Technology (Caltech) in Pasadena and the Massachusetts Institute of Technology (MIT) in Cambridge — will provide the hardware for a complete LIGO interferometer in India, technical data on its design, as well as training and assistance with installation and commissioning for the supporting infrastructure. India will provide the site, the vacuum system and other infrastructure required to house and operate the interferometer — as well as all labor, materials and supplies for installation.”
India’s LIGO observatory is expected to cost about US$177 million. Full funding is expected in 2020 and the observatory currently is planned for completion in 2024. India’s Inter-University Centre for Astronomy and Astrophysics (IUCAA), also in Maharashtra state, will lead the project’s gravitational-wave science and the new detector’s data analysis.
Using only the two US LIGO detectors, it is not possible to localize the source of gravitational waves beyond a broad sweep through the sky. On 1 August 2017, Virgo joined LIGO during the second Observation Run, O2. While the LIGO-Virgo three-detector network was operational for only three-and-a-half weeks, five gravitational wave events were observed. As shown in the following figure, the spatial resolution of the source was greatly improved when a triple detection was made by the two LIGO observatories and Virgo. These events are labeled with the suffix “HLV”.
The greatly reduced areas of the triple event localizations demonstrate the capabilities of the current global gravitational wave observatory network to resolve the source of a gravitational-wave detection. The LIGO and Virgo Collaboration reports that it can send Open Public Alerts within five minutes of a gravitational wave detection.
With timely notification and more precise source location information, other land-based and space observatories can collaborate more rapidly and develop a comprehensive, multi-spectral (“multi-messenger”) view of the source of the gravitational waves.
When KAGRA and LIGO-India join the worldwide gravitational wave detection network, it is expected that source localizations will become 5 to 10 times more accurate than can be accomplished with just the LIGO and Virgo detectors.
For more background information on gravitational-wave detection, see the following Lyncean posts:
In my 19 December 2016 post, “What to do with Carbon Dioxide,” I provided an overview of the following three technologies being developed for underground storage (sequestration) or industrial utilization of carbon dioxide:
Store in basalt formations by making carbonate rock
In the past two years, significant progress has been made in the development of processes to convert gaseous carbon dioxide waste streams into useful products. This post is intended to highlight some of the advances being made and provide links to additional current sources of information on this subject.
1. Carbon XPrize: Transforming carbon dioxide into valuable products
The NRG COSIA Carbon XPrize is a $20 million global competition to develop breakthrough technologies that will convert carbon dioxide emissions from large point sources like power plants and industrial facilities into valuable products such as building materials, alternative fuels and other items used every day. You’ll find details on this competition on the XPrize website at the following link:
The competition is now in the testing and certification phase. Each team is expected to scale up their pilot systems by a factor of 10 for the operational phase, which starts in June 2019 at the Wyoming Integrated Test Center and the Alberta (Canada) Carbon Conversion Technology Center.
The teams will be judged by the amount of carbon dioxide converted into usable products and the value of those products. We’ll have to wait until the spring of 2020 for the results of this competition.
2. World’s largest post-combustion carbon capture project
Post-combustion carbon capture refers to capturing carbon dioxide from flue gas after a fossil fuel (e.g., coal, natural gas or oil) has been burned and before the flue gas is exhausted to the atmosphere. You’ll find a 2016 review of post-combustion carbon capture technologies in the paper by Y. Wang, et al., “A Review of Post-combustion Carbon Dioxide Capture Technologies from Coal-fired Power Plants,” which is available on the ScienceDirect website here:
In January 2017, NRG Energy reported the completion of the Petra Nova post-combustion carbon capture project, which is designed to remove 90% of the carbon dioxide from a 240 MW “slipstream” of flue gas at the existing W. A. Parish generating plant Unit 8. The “slipstream” represents 40% of the total flue gas flow from the coal-fired 610 MW Unit 8. To date, this is the largest post-combustion carbon capture project in the world. Approximately 1.4 million metric tons of carbon dioxide will be captured annually using a process jointly developed by Mitsubishi Heavy Industries, Ltd. (MHI) and the Kansai Electric Power Co. The US Department of Energy (DOE) supported this project with a $190 million grant.
The DOE reported: “The project will utilize a proven carbon capture process, which uses a high-performance solvent for carbon dioxide absorption and desorption. The captured carbon dioxide will be compressed and transported through an 80 mile pipeline to an operating oil field where it will be utilized for enhanced oil recovery (EOR) and ultimately sequestered (in the ground).”
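As a rough sanity check on these numbers, the implied CO2 emission intensity of the 240 MW slipstream can be back-calculated. The 80% capacity factor below is an assumption not stated in the source; with it, the result lands near the ~0.9-1.0 metric tons per MWh typical of coal-fired plants.

```python
captured_t = 1.4e6        # metric tons of CO2 captured per year
capture_fraction = 0.90   # 90% of slipstream CO2 is captured
slipstream_mw = 240
capacity_factor = 0.80    # assumed value; not stated in the source

emitted_t = captured_t / capture_fraction      # CO2 in the full slipstream
mwh = slipstream_mw * 8760 * capacity_factor   # annual generation, slipstream share
intensity = emitted_t / mwh
print(f"implied emission intensity: {intensity:.2f} t CO2 per MWh")
```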
You’ll find more information on the Petra Nova project at the following links:
3. Pilot-scale projects to convert carbon dioxide to synthetic fuel
Thyssenkrupp pilot project for conversion of steel mill gases into methanol
In September 2018, Thyssenkrupp reported that it had “commenced production of the synthetic fuel methanol from steel mill gases. It is the first time anywhere in the world that gases from steel production – including the carbon dioxide they contain – are being converted into chemicals. The start-up was part of the Carbon2Chem project, which is being funded to the tune of around 60 million euros by Germany’s Federal Ministry of Education and Research (BMBF) … ‘Today the Carbon2Chem concept is proving its value in practice,’ said Guido Kerkhoff, CEO of Thyssenkrupp. ‘Our vision of virtually carbon dioxide-free steel production is taking shape.’”
Berkeley Laboratory developing a copper catalyst that yields high efficiency carbon dioxide-to-fuels conversion
The DOE Lawrence Berkeley National Laboratory (Berkeley Lab) has been engaged for many years in creating clean chemical manufacturing processes that can put carbon dioxide to good use. In September 2017, Berkeley Lab announced that its scientists had developed a new electrocatalyst composed of copper nanoparticles that can directly convert carbon dioxide into multi-carbon fuels and alcohols (e.g., ethylene, ethanol, and propanol) using record-low inputs of energy. For more information, see the Global Energy World article here:
The term negative emissions technology (NET) refers to an industrial process designed to remove and sequester carbon dioxide directly from the ambient atmosphere rather than from a large point source of carbon dioxide generation (e.g., the flue gas from a fossil-fueled power generating station or a steel mill). Think of a NET facility as a carbon dioxide removal “factory” that can be sited independently of the sources of carbon dioxide generation.
The Swiss firm Climeworks is in the business of developing carbon dioxide removal factories using the following process:
“Our plants capture atmospheric carbon with a filter. Air is drawn into the plant and the carbon dioxide within the air is chemically bound to the filter. Once the filter is saturated with carbon dioxide it is heated (using mainly low-grade heat as an energy source) to around 100 °C (212 °F). The carbon dioxide is then released from the filter and collected as concentrated carbon dioxide gas to supply to customers or for negative emissions technologies. Carbon dioxide-free air is released back into the atmosphere. This continuous cycle is then ready to start again. The filter is reused many times and lasts for several thousand cycles.”
This process is shown in the following Climeworks diagram:
You’ll find more information on Climeworks on their website here:
In 2017, Climeworks began operation in Iceland of their first pilot facility to remove carbon dioxide from ambient air and produce concentrated carbon dioxide that is injected into underground basaltic rock formations, where the carbon dioxide gets converted into carbonate minerals in a relatively short period of time (1 – 2 years) and remains fixed in the rock. Climeworks uses waste heat from a nearby geothermal generating plant to help run their carbon capture system. This process is shown in the following diagram.
This small-scale pilot facility is capable of removing only about 50 tons of carbon dioxide from the atmosphere per year, but can be scaled up to a much larger facility. You’ll find more information on this Climeworks project here:
In October 2018, Climeworks began operation in Italy of another pilot-scale NET facility designed to remove carbon dioxide from the atmosphere. This facility is designed to remove 150 tons of carbon dioxide from the atmosphere per year and produce a natural gas product stream from the atmospheric carbon dioxide, water, and electricity. You’ll find more information on this Climeworks project here:
5. Consensus reports on waste stream utilization and negative emissions technologies (NETs)
The National Academies Press (NAP) recently published a consensus study report entitled, “Gaseous Carbon Waste Streams Utilization, Status and Research Needs,” which examines the following processes:
Mineral carbonation to produce construction material
Chemical conversion of carbon dioxide into commodity chemicals and fuels
Biological conversion (photosynthetic & non-photosynthetic) of carbon dioxide into commodity chemicals and fuels
Methane and biogas waste utilization
The authors note that, “previous assessments have concluded that … > 10 percent of the current global anthropogenic carbon dioxide emissions … could feasibly be utilized within the next several decades if certain technological advancements are achieved and if economic and political drivers are in place.”
You can download a free pdf copy of this report here:
Also on the NAP website is a prepublication report entitled, “Negative Emissions Technologies and Reliable Sequestration.” The authors note that NETs “can have the same impact on the atmosphere and climate as preventing an equal amount of carbon dioxide from being emitted from a point source.”
You can download a free pdf copy of this report here:
In this report, the authors note that recent analyses found that deploying NETs may be less expensive and less disruptive than reducing some emissions at the source, such as a substantial portion of agricultural and land-use emissions and some transportation emissions. For example, NETs could be a means for mitigating the methane generated from enteric fermentation in the digestive systems of very large numbers of ruminant animals (e.g., in the U.S., primarily beef and dairy cattle). For more information on this particular matter, please refer to my 31 December 2016 post, “Cow Farts Could be Subject to Regulation Under a New California Law,” which you’ll find here:
A 15 February 2019 article by Meilan Solly in the Smithsonian online magazine describes a recently released interactive map of the world that shows how the Earth’s continents have moved over the past 750 million years. With your cursor, you can zoom in and rotate the globe in any direction. Using a pull-down menu at the top center of the screen, you can see the relative positioning of the landmasses at the point in time you selected. A similar selection box in the upper right corner of the screen allows you to select a particular geological or evolutionary milestone (e.g., first land animals) in Earth’s development. Even better, you can enter an address in the text box in the upper-left corner of the screen and then see how your selected location has migrated as you explore through the ages.
Following are screenshots showing what’s happened to the Lyncean Group’s meeting site in San Diego during the past 750 million years.
I hope you enjoy the interactive globe, with visualization created and maintained by Ian Webster, plate tectonic and paleogeographic maps by C.R. Scotese, and the address search tool by LocationIQ.
On 24 January 2016, I posted the article, “Where in the Periodic Table Will We Put Element 119?”, which reviews the development of the modern periodic table of chemical elements since it was first formulated in 1869 by Russian chemist Dmitri Mendeleev, through the completion of Period 7 with the naming of element 118 in 2016. You can read this post here:
2019 is the 150th anniversary of Dmitri Mendeleev’s periodic table of elements. To commemorate this anniversary, the United Nations General Assembly and the United Nations Educational, Scientific and Cultural Organization (UNESCO) have proclaimed 2019 as the International Year of the Periodic Table of Chemical Elements (IYPT). You’ll find more information on the IYPT here:
A brief animated “visualization” entitled “Setting the Table,” created by J. Yeston, N. Desai and E. Wang, provides a good overview of the history and configuration of the periodic table. Check it out here:
The prospects for extending the periodic table beyond element 118 (into a new Period 8) are discussed in a short 2018 video from Science Magazine entitled “Where does the periodic table end?,” which you can view here:
The next phase in the hunt for new superheavy elements is about to start in Russia
The Flerov Laboratory of Nuclear Reactions (FLNR) of the Joint Institute for Nuclear Research (JINR) in Dubna is the leading laboratory in Russia, and perhaps the world, in the search for superheavy elements. The FLNR website is here:
FLNR is the home of several accelerators and other experimental setups for nuclear research, including the U400 accelerator, which has been the laboratory’s basic tool for the synthesis of new elements since being placed in operation in 1979. You can take a virtual tour of U400 on the FLNR website.
On 30 May 2012 the International Union of Pure and Applied Chemistry (IUPAC) honored the work done by FLNR when it approved the name Flerovium (Fl) for superheavy element 114.
Yuri Oganessian, the Scientific Leader of FLNR, has contributed greatly to extending the periodic table through the synthesis of new superheavy elements. On 30 November 2016, IUPAC recognized his personal contributions by naming superheavy element 118 after him: Oganesson (Og).
FLNR has built a new $60 million accelerator facility, dubbed the Superheavy Element Factory (SHEF), which is expected to be capable of synthesizing elements beyond 118. The SHEF building and the DC-280 cyclotron that will be used to synthesize superheavy elements are shown in the photos below.
The 2016 paper, “Status and perspectives of the Dubna superheavy element factory,” by S. Dmitriev, M. Itkis and Y. Oganessian, presents an overview of the DC-280 cyclotron design, including the following diagram showing the general arrangement of the major components.
For insights into the processes for synthesizing superheavy elements, I recommend that you view the following March 2018 video in which FLNR Director Sergey Dmitriev describes the design of SHEF and the planned process of synthesizing superheavy elements 119 and 120. This is a rather long (23 min) video, but I think it will be worth your time.
On 26 December 2018, the DC-280 cyclotron produced its first beam of accelerated heavy ions. The hunt for new superheavy elements using DC-280 is scheduled to begin in the spring of 2019.
A good overview of FLNR, as it prepares to put its Superheavy Element Factory into operation, is available in the article by Sam Kean, entitled “A storied Russian lab is trying to push the periodic table past its limits—and uncover exotic new elements,” which was posted on 30 January 2019 on the Science Magazine website. You’ll find this article at the following link:
The next few years may yield exciting new discoveries of the first members of Period 8 of the periodic table. I think Dmitri Mendeleev would be impressed.
Additional reading:
H. Kragh, “The Search for Superheavy Elements: Historical and Philosophical Perspectives,” Niels Bohr Institute https://arxiv.org/pdf/1708.04064.pdf
Ian Fraser is an award-winning journalist, commentator and broadcaster who writes about business, finance, politics and economics. In 2018, under the banner of WawamuStats, he started posting a series of short videos that help visualize trends that are hard to see in voluminous numerical data, but become apparent (even a bit stunning) in a dynamic graphical format. On its Facebook page, WawamuStats explains:
“Historical data are fun, but reading them is tedious. This page makes these tedious data into a dynamic timeline, which shows historical data.”
Regarding the GDP data used for the dynamic visualizations, WawamuStats states:
“Gross Domestic Product (GDP) is a monetary measure of the market value of all the final goods and services produced in a period of time, often annually or quarterly. Nominal GDP estimates are commonly used to determine the economic performance of a whole country or region, and to make international comparisons.”
Here are the three WawamuStats GDP videos I think you will enjoy.
Top 10 Country GDP Ranking History (1960-2017)
This dynamic visualization shows the top 10 countries with the highest GDP from 1960 to 2017. At the start, most of the top 10 countries are from Europe and North America. You’ll see the rapid rise of Japan’s economy followed decades later by the rapid rise of China’s economy.
Top 10 Country GDP Per Capita Ranking History (1962-2017)
This dynamic visualization shows the top 10 countries with the highest GDP per capita from 1962 to 2017. As you will see, most of the top 10 countries are from developed regions in Europe, North America, and Asia. Since 2017, Luxembourg has been regarded as the richest country in terms of GDP per capita.
Future Top 10 Country Projected GDP Ranking (2018-2100)
This dynamic visualization shows how Asian economies are expected to grow and eventually dominate the world economy, with China’s economy, and later India’s economy, exceeding the US economy in terms of GDP, and several European economies dropping out of the top 10 ranking. While the specific national GDP values are only projections, the macro trends, with a strong shift toward Asian economies, are probably correct.
You can find additional dynamic video timelines on the WawamuStats Facebook page here:
Peter Lobner, updated 24 Jan 2019, 12 Nov 2019 and 17 October 2023
The National Aeronautics and Space Administration’s (NASA) durable New Horizons spacecraft made its close flyby of Pluto on 14 July 2015, passing 7,800 mi (12,500 km) above the surface of that dwarf planet and returning a remarkable trove of photos and data. Since then, the spacecraft has been continuing its journey out of our solar system and now is flying through the Kuiper Belt, which is a very large, diffuse region beyond the orbit of Neptune containing millions of small bodies in distant orbits around the Sun. These Kuiper Belt Objects (KBOs) are believed to be “leftovers” (i.e., they never coalesced into planets) from the formation of the early solar system. You can read more about the Kuiper Belt on the NASA website here:
On 28 August 2015, NASA announced that it had selected the next destination for New Horizons after the Pluto flyby: a small KBO designated 2014 MU69, later nicknamed Ultima Thule, about 1 billion miles (1.6 billion km) beyond Pluto. The spacecraft’s trajectory from Earth to Ultima Thule is shown in the following NASA diagram.
On 1 January 2019, the New Horizons spacecraft made a close flyby of 2014 MU69, at a range of 2,200 miles (3,500 km) and a relative speed of 14 kilometers per second (31,317 mph). At a distance of 4.1 billion miles (6.6 billion km) from the Earth, radio signals took 6 hours and 6 minutes traveling at the speed of light to traverse the distance between the spacecraft and Earth during the encounter. On 1 January 2019, NASA released the following blurry image, which was taken at long range.
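As a quick sanity check on the light travel time reported above, the one-way delay is just the spacecraft's distance divided by the speed of light. A minimal sketch, using only the 4.1 billion mile figure from the paragraph above:

```python
# Light travel time from New Horizons to Earth at the Arrokoth flyby.
# Distance (4.1 billion miles) is the figure reported above; the speed
# of light is the defined value 299,792,458 m/s.

MILES_TO_KM = 1.609344          # exact, by definition of the statute mile
C_KM_PER_S = 299_792.458        # speed of light in km/s

distance_km = 4.1e9 * MILES_TO_KM          # ~6.6 billion km
seconds = distance_km / C_KM_PER_S

hours = int(seconds // 3600)
minutes = int((seconds % 3600) // 60)
print(f"One-way light travel time: {hours} h {minutes} min")
```

This reproduces the 6 hour 6 minute figure quoted in the text.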
NASA reported: “At left is a composite of two images taken by New Horizons’ high-resolution Long-Range Reconnaissance Imager (LORRI), which provides the best indication of Ultima Thule’s size and shape so far. Preliminary measurements of this Kuiper Belt object suggest it is approximately 20 miles long by 10 miles wide (32 kilometers by 16 kilometers). An artist’s impression at right illustrates one possible appearance of Ultima Thule, based on the actual image at left. The direction of Ultima’s spin axis is indicated by the arrows.”
In the weeks following the flyby, New Horizons will be downloading all of the higher-resolution photos and data acquired during its close encounter with 2014 MU69 and we’ll be getting a much more detailed understanding of this KBO.
It appears that NASA has the opportunity to target one or more additional KBOs for future New Horizons flybys in the 2020s. The spacecraft’s electric power source, a plutonium (Pu-238)-fueled radioisotope thermoelectric generator (RTG), is capable of providing power well into the 2030s, albeit at gradually reducing power levels. In addition, the spacecraft has significant hydrazine fuel remaining for course correction and attitude control en route to a future KBO flyby.
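The gradual power reduction mentioned above is driven by the radioactive decay of the RTG's Pu-238 heat source, which has a half-life of 87.7 years. A minimal sketch of that decay curve; the ~245 W launch power and 2006 launch year are illustrative assumptions, and the real electrical output falls somewhat faster because the thermocouples also degrade:

```python
import math

# Radioactive decay of the Pu-238 heat source (half-life 87.7 years).
# The ~245 W launch power is an illustrative assumption, not a figure
# from the article; actual electrical output declines faster because
# the RTG's thermocouples degrade as well.

HALF_LIFE_YR = 87.7
P0_WATTS = 245.0      # assumed output at the 2006 launch

def rtg_power(years_since_launch: float) -> float:
    """Decay-limited power output after the given elapsed time."""
    return P0_WATTS * math.exp(-math.log(2) * years_since_launch / HALF_LIFE_YR)

for year in (2006, 2019, 2035):
    print(year, round(rtg_power(year - 2006), 1), "W")
```

The long half-life means fuel decay alone costs only a few percent per decade, which is why the RTG can still power the spacecraft well into the 2030s.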
On 2 January 2019, NASA released the following photo taken on the inbound leg of the flyby, still 18,000 miles (28,000 km) from 2014 MU69.
You’ll find more information on NASA’s New Horizons mission here:
24 January 2019 update: Latest photo shows 2014 MU69 surface to be unusually smooth
Today, NASA released the following photo of 2014 MU69, taken at a distance of 4,200 miles (6,700 kilometers) on 1 January 2019, just seven minutes before closest approach. Principal Investigator Alan Stern, of the Southwest Research Institute, reported, “Over the next month there will be better color and better resolution images that we hope will help unravel the many mysteries of Ultima Thule.”
12 November 2019 update: 2014 MU69 renamed
NASA announced that 2014 MU69 was formally renamed “Arrokoth”, which NASA says “means ‘sky’ in the language of the Powhatan people, a Native American tribe indigenous to Maryland. The state is home to New Horizons mission control at the Johns Hopkins University Applied Physics Laboratory in Laurel.” Here’s a colorized view of Arrokoth.
17 October 2023 update: Analysis of Arrokoth’s structure
In September 2023, a research team from the Southwest Research Institute (SwRI) in Boulder, Colorado reported the results of their detailed analysis of Arrokoth, focusing on 12 mounds identified on the larger lobe, named Wenu, and tentatively, three on the smaller lobe, named Weeyo.
As reported by EarthSky, “The data from New Horizons suggest that the mounds have a common origin. In this case, that common origin would date back to when Arrokoth first formed billions of years ago. As with planets and moons, the Kuiper Belt object formed when various chunks and bits of material collided together, creating the larger planetesimal now known as Arrokoth.” This formation process is shown in the following diagram.
Data from New Horizons suggests that Arrokoth formed from material that collided together at slow speeds. Source: New Horizons/ NASA/ JHUAPL/ SwRI/ James Tuttle Keane.
The locations and structure of the mounds on Arrokoth are shown in the following graphic.
The mound structures dominate Arrokoth’s larger lobe (named Wenu). In addition, there are a few tentative mounds identified on the smaller lobe (named Weeyo). Source: SwRI via EarthSky
On 19 January 1942, US President Franklin D. Roosevelt approved the production of an atomic bomb. At that time, most of the technology for producing an atomic bomb still needed to be developed and the US had very little infrastructure in place to support that work.
The Manhattan Engineer District (MED, aka the “Manhattan Project”) was responsible for the research, design, construction and operation of the early US nuclear weapons complex and for delivering atomic bombs to the US Army during World War II (WW II) and in the immediate post-war period. The Manhattan Project existed for just five years. In 1943, 75 years ago, the Manhattan Project transitioned from planning to construction and initial operation of the first US nuclear weapons complex facilities. Here’s a very brief timeline for the Manhattan Project.
13 August 1942: The Manhattan Engineer District was formally created under the leadership of U.S. Army Colonel Leslie R. Groves.
2 December 1942: A team led by Enrico Fermi achieved the world’s first self-sustaining nuclear chain reaction in a graphite-moderated, natural uranium fueled reactor known simply as Chicago Pile-1 (CP-1).
1943 – 1946: The Manhattan Project managed the construction and operation of the entire US nuclear weapons complex.
16 July 1945: The first nuclear device was successfully tested at the Trinity site near Alamogordo, NM, less than three years after the Manhattan Project was created.
6 & 9 August 1945: Atomic bombs were employed by the US against Japan, contributing to ending World War II.
1 January 1947: The newly formed, civilian-led Atomic Energy Commission (AEC) took over management and operation of all research and production facilities from the Manhattan Engineer District.
25 August 1947: The Manhattan Engineer District was abolished.
The WW II nuclear weapons complex was the foundation for the early US post-war nuclear weapons infrastructure that evolved significantly over time to support the US mutually-assured destruction strategy during the Cold War with the Soviet Union. Today, the US nuclear weapons complex continues to evolve as needed to perform its critical role in maintaining the US nuclear deterrent capability.
2. A Closer Look at the Manhattan Project Timeline
You’ll find a comprehensive, interactive timeline of the Manhattan Project on the Department of Energy’s (DOE) OSTI website at the following link:
The Atomic Heritage Foundation is dedicated to “supporting the Manhattan Project National Historical Park and capturing the memories of the people who harnessed the energy of the atom.” Their homepage is here:
The Manhattan Project National Historical Park was authorized by Congress in December 2014 and subsequently was approved by the President to commemorate the Manhattan Project. The Manhattan Project National Historical Park is an extended “park” that currently comprises three distinct DOE sites that each had different missions during WW II:
Los Alamos, New Mexico: Nuclear device design, test and production
Oak Ridge, Tennessee: Enriched uranium production
Hanford, Washington: Plutonium production
On 10 November 2015, a memorandum of agreement between DOE and the National Park Service (NPS) established the park and the respective roles of DOE and NPS in managing the park and protecting and presenting certain historic structures to the public.
You’ll find the Manhattan Project National Historical Park website here:
Following is a brief overview of the three sites that currently comprise the Manhattan Project National Historical Park.
3.1. Los Alamos, New Mexico
Los Alamos Laboratory was established 75 years ago, in early 1943, as MED Site Y, under the direction of J. Robert Oppenheimer. This was the Manhattan Project’s nuclear weapons laboratory, which was created to consolidate in one secure, remote location most of the research, design, development and production work associated with producing usable nuclear weapons for the US Army during WW II.
The first wave of scientists began arriving at Los Alamos Laboratory in April 1943. Just 27 months later, on 16 July 1945, the world’s first nuclear device was detonated 200 miles south of Los Alamos at the Trinity Site near Alamogordo, NM. This was the plutonium-fueled, implosion-type device code named “Gadget.”
During WW II, the Los Alamos Laboratory produced three atomic bombs:
One uranium-fueled, gun-type atomic bomb code named “Little Boy” was produced. This was the atomic bomb dropped on Hiroshima, Japan on 6 August 1945, making it the first nuclear weapon used in warfare. This atomic bomb design was not tested before it was used operationally.
Two plutonium-fueled, implosion-type atomic bombs code named “Fat Man” were produced. These bombs were very similar to Gadget. One of the Fat Man bombs was dropped on Nagasaki, Japan on 9 August 1945. The second Fat Man bomb could have been used during WW II, but it was not needed after Japan announced its surrender on 15 August 1945.
The highly-enriched uranium for the Little Boy bomb was produced by the enrichment plants at Oak Ridge. The plutonium for Gadget and the two Fat Man bombs was produced by the production reactors at Hanford.
Three historic sites are on Los Alamos National Laboratory property and currently are not open to the public:
Gun Site Facilities: three bunkered buildings (TA-8-1, TA-8-2, and TA-8-3), and a portable guard shack (TA-8-172).
V-Site Facilities: TA-16-516 and TA-16-517 V-Site Assembly Building
Pajarito Site: TA-18-1 Slotin Building, TA-18-2 Battleship Control Building, and the TA-18-29 Pond Cabin.
You’ll find information on the Manhattan Project National Historical Park sites at Los Alamos here:
3.2. Oak Ridge, Tennessee
Land acquisition was approved in 1942 for planned uranium “atomic production plants” in the Tennessee Valley. The selected site officially became the Clinton Engineer Works (CEW) in January 1943 and was given the MED code name Site X. This is where MED and its contractors managed the wartime deployment of the following three different uranium enrichment technologies in three separate, large-scale industrial process facilities:
Liquid thermal diffusion process, based on work by Philip Abelson at Naval Research Laboratory and the Philadelphia Naval Yard. This process was implemented at S-50, which produced uranium enriched to < 2 at. % U-235.
Gaseous diffusion process, based on work by Harold Urey at Columbia University. This process was implemented at K-25, which produced uranium enriched to about 23 at. % U-235 during WW II.
Electromagnetic separation process, based on Ernest Lawrence’s invention of the cyclotron at the University of California Berkeley in the early 1930s. This process was implemented at Y-12 where the final output was weapons-grade uranium.
The Little Boy atomic bomb used 92.6 pounds (42 kg) of highly enriched uranium produced at Oak Ridge with contributions from all three of these processes.
The nearby township was named Oak Ridge in 1943, but the nuclear site itself was not officially renamed Oak Ridge until 1947.
The three Manhattan Project National Historical Park sites at Oak Ridge are:
X-10 Graphite Reactor National Historic Landmark
K-25 complex
Y-12 complex: Buildings 9731 and 9204-3
The S-50 Thermal Diffusion Plant was dismantled in the late 1940s. This site is not part of the Manhattan Project National Historical Park.
Following is a brief overview of X-10, K-25 and Y-12 historical sites. There’s much more information on the Manhattan Project National Historical Park sites at Oak Ridge here:
X-10 was the world’s second nuclear reactor (after the Chicago Pile, CP-1) and the first reactor designed and built for continuous operation. It was intended to produce the first significant quantities of plutonium, which were used by scientists at Los Alamos to characterize plutonium and develop the design of a plutonium-fueled atomic bomb.
X-10 was a large graphite-moderated, natural uranium fueled reactor that originally had a continuous design power rating of 1.0 MWt, which later was raised to 3.5 MWt. Originally, it was intended to be a prototype for the much larger plutonium production reactors being planned for Hanford. The selection of air cooling for X-10 enabled this reactor to be deployed more rapidly, but limited its value as a prototype for the future water-cooled plutonium production reactors.
The X-10 reactor core was composed of graphite blocks arranged into a cube measuring 24 feet (7.3 meters) on each side. The core was surrounded by several feet of high-density concrete and other material to provide radiation shielding. The core and shielding were penetrated by 1,248 horizontal channels arranged in 36 rows. Each channel served to position up to 54 fuel slugs in the core and provide passages for forced air cooling of the core. Each fuel slug was an aluminum clad, metallic natural uranium cylinder measuring 4 inches (10.16 cm) long x 1.1 inches (2.79 cm) in diameter. New fuel slugs were added manually at the front face (the loading face) of the reactor and irradiated slugs were pushed out through the back face of the reactor, dropping into a cooling water pool. The reactor was controlled by a set of vertical control rods.
The basic geometry of the X-10 reactor is shown below.
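The channel and slug dimensions given above are enough to estimate the core's maximum fuel inventory. A rough sketch; the uranium metal density (~19.05 g/cm³) is a standard handbook value rather than a figure from this article, and the thin aluminum cladding is ignored:

```python
import math

# Rough fuel inventory of the X-10 core, from the figures above:
# 1,248 channels, up to 54 slugs each, each slug 4 in long x 1.1 in dia.
# Uranium metal density (~19.05 g/cm^3) is a standard value, not from
# the article; the thin aluminum cladding is neglected.

CHANNELS = 1_248
SLUGS_PER_CHANNEL = 54
SLUG_LEN_CM = 10.16
SLUG_DIA_CM = 2.79
U_DENSITY_G_CM3 = 19.05

max_slugs = CHANNELS * SLUGS_PER_CHANNEL
slug_vol_cm3 = math.pi * (SLUG_DIA_CM / 2) ** 2 * SLUG_LEN_CM
slug_mass_kg = slug_vol_cm3 * U_DENSITY_G_CM3 / 1000

print(f"Maximum slug count: {max_slugs:,}")
print(f"Uranium per slug:  ~{slug_mass_kg:.2f} kg")
print(f"Full-core maximum: ~{max_slugs * slug_mass_kg / 1000:.0f} tonnes of uranium")
```

In practice only a fraction of the channels were fueled at any one time, so this is an upper bound, not the actual wartime loading.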
Site construction work started 75 years ago, on 27 April 1943. Initial criticality occurred less than seven months later, on 4 November 1943.
Plutonium was recovered from irradiated fuel slugs in a pilot-scale chemical separation line at Oak Ridge using the bismuth phosphate process. In April 1944, the first sample (grams) of reactor-bred plutonium from X-10 was delivered to Los Alamos. Analysis of this sample led Los Alamos scientists to eliminate one candidate plutonium bomb design (the “Thin Man” gun-type device) and focus their attention on the Fat Man implosion-type device. X-10 operated as a plutonium production reactor until January 1945, when it was turned over to research activities. X-10 was permanently shut down on 4 November 1963, and was designated a National Historic Landmark on 15 October 1966.
K-25 Gaseous Diffusion Plant
Preliminary site work for the K-25 gaseous diffusion plant began 75 years ago, in May 1943, with work on the main building starting in October 1943. The six-stage pilot plant was ready for operation on 17 April 1944.
The K-25 gaseous diffusion plant feed material was uranium hexafluoride gas (UF6) from natural uranium and slightly enriched uranium from both the S-50 liquid thermal diffusion plant and the first (Alpha) stage of the Y-12 electromagnetic separation plant. During WW II, the K-25 plant was capable of producing uranium enriched up to about 23 at. % U-235. This product became feed material for the second (Beta) stage of the Y-12 electromagnetic separation process, which continued the enrichment process and produced weapons-grade U-235.
As experience with the gaseous diffusion process improved and additional cascades were added, K-25 became capable of delivering highly-enriched uranium after WW II.
You can take a virtual tour of K-25, including its decommissioning and cleanup, here:
Construction on the second Oak Ridge gaseous diffusion plant, K-27, began on 3 April 1945. This plant became operational after WW II. By 1955, the K-25 complex had grown to include gaseous diffusion buildings K-25, K-27, K-29, K-31 and K-33 that comprised a multi-building, enriched uranium production chain collectively known as the Oak Ridge Gaseous Diffusion Plant (ORGDP). Operation of the ORGDP continued until 1985.
Additional post-war gaseous diffusion plants based on the technology developed at Oak Ridge were built and operated in Paducah, KY (1952 – 2013) and Portsmouth, OH (1954 – 2001).
Y-12 Electromagnetic Separation Plant
In 1941, Ernest Lawrence modified the 37-inch (94 cm) cyclotron in his laboratory at the University of California Berkeley to demonstrate the feasibility of electromagnetic separation of uranium isotopes using the same principle as a mass spectrograph.
The initial industrial-scale design, agreed upon in 1942, was called an Alpha (α) calutron, which was designed to enrich natural uranium (@ 0.711 at.% U-235) to >10 at.% U-235. The later Beta (β) calutron was designed to further enrich the output of the Alpha calutrons, as well as the outputs from the K-25 and S-50 processes, and produce weapons-grade uranium at >88 at.% U-235.
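The assays above can be tied together with the standard enrichment material balance, which is not stated in the article but follows from conservation of total mass and U-235 mass: feed F for product P is F = P(xP − xT)/(xF − xT). A rough sketch, using an assumed tails assay of 0.3% for illustration and treating atom percent as weight percent for this estimate:

```python
# Standard enrichment material balance (a textbook relation, not from
# the article): the feed mass F needed to yield product mass P is
#     F = P * (xP - xT) / (xF - xT)
# where xF, xP, xT are the feed, product, and tails assays.
# The 0.3% tails assay is an illustrative assumption.

def feed_required(product_kg: float, x_p: float, x_f: float, x_t: float) -> float:
    """Feed mass needed for a given product mass and assays (as fractions)."""
    return product_kg * (x_p - x_t) / (x_f - x_t)

# Feed for the ~42 kg of weapons-grade uranium used in Little Boy:
feed_kg = feed_required(42.0, 0.88, 0.00711, 0.003)
print(f"~{feed_kg:,.0f} kg of natural uranium feed")
```

Roughly nine tonnes of natural uranium feed per bomb's worth of product, which gives a sense of the industrial scale the enrichment plants had to support.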
The calutrons required large magnet coils to establish the strong electromagnetic field needed to separate the uranium isotopes U-235 and U-238. The shape of the magnet coils for both the Alpha and Beta calutrons resembled a racetrack, with many individual calutron modules (aka “tanks”) arranged side-by-side around the racetrack. At Y-12, there were nine Alpha calutron “tracks” (5 x Alpha-1 and 4 x Alpha-2 tracks), each with 96 calutron modules (tanks), for a total of 864 Alpha calutrons. In addition, there were eight Beta calutron tracks, each with 36 calutron modules, for a total of 288 Beta calutrons, only 216 of which ever operated.
Due to wartime shortages of copper, the Manhattan Project arranged a loan from the Treasury Department of about 300 million Troy ounces (10,286 US tons) of silver for use in manufacturing the calutron magnet coils. A general arrangement of a Beta calutron module (tank) is shown in the following diagram, which also shows the isotope flight paths from the uranium tetrachloride (UCl4) ion source to the ion receivers. Separated uranium was recovered by burning the graphite ion receivers and extracting the metallic uranium from the ash.
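The silver loan figures above are easy to verify, since the troy ounce and the US short ton are exactly defined units:

```python
# Checking the silver loan conversion quoted above. The troy ounce
# (31.1034768 g) and US short ton (907.18474 kg) are exact definitions.

TROY_OZ_G = 31.1034768
SHORT_TON_KG = 907.18474

silver_kg = 300e6 * TROY_OZ_G / 1000
silver_tons = silver_kg / SHORT_TON_KG
print(f"{silver_tons:,.0f} US tons")
```

This reproduces the 10,286 US ton figure cited in the text.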
Construction of Buildings 9731 and 9204-3 at the Y-12 complex began 75 years ago, in February 1943. By February 1944, initial operation of the Alpha calutrons had produced only 0.44 pounds (0.2 kg) of U-235 @ 12 at.%. By August 1945, the Y-12 Beta calutrons had produced the 92.6 pounds (42 kg) of weapons-grade uranium needed for the Little Boy atomic bomb.
After WW II, the silver was recovered from the calutron magnet coils and returned to the Treasury Department.
3.3. Hanford, Washington
On 16 January 1943, General Leslie Groves officially endorsed Hanford as the proposed plutonium production site, which was given the MED code name Site W. The plan was to construct three large graphite-moderated, water-cooled plutonium production reactors, designated B, D, and F, along the Columbia River. The Hanford site also would include a facility for manufacturing the new uranium fuel slugs for the reactors as well as chemical separation plants and associated facilities to recover and process plutonium from the irradiated uranium slugs.
After WW II, six more plutonium production reactors were built at Hanford along with additional plutonium and nuclear waste processing and storage facilities.
The Manhattan Project National Historical Park sites at Hanford are:
B Reactor, which has been a National Historic Landmark since 19 August 2008
The previous Hanford High School in the former Town of Hanford and Hanford Construction Camp Historic District
Bruggemann’s Agricultural Warehouse Complex
White Bluffs Bank and Hanford Irrigation District Pump House
A brief overview of the B Reactor and the other Hanford production reactors is provided below. There’s more information on the Manhattan Project National Historical Park sites at Hanford here:
The Manhattan Project National Historical Park does not include the Hanford chemical separation plants and associated plutonium facilities in the 200 Area, the uranium fuel production plant in the 300 Area, or the other eight plutonium production reactors that were built in the 100 Area. Information on all Hanford facilities, including their current cleanup status, is available on the Hanford website here:
The B Reactor at the Hanford Site was the world’s first full-scale reactor and the first of three plutonium production reactors of the same design that became operational at Hanford during WW II. B Reactor and the similar D and F Reactors were significantly larger graphite-moderated reactors than the X-10 Graphite Reactor at Oak Ridge. The rectangular reactor core measured 36 feet (11 m) wide x 36 feet (11 m) tall x 28 feet (8.53 m) deep, surrounded by radiation shielding. These reactors were fueled by aluminum clad, metallic natural uranium fuel slugs measuring 8 inches (20.3 cm) long x 1.5 inches (3.8 cm) in diameter. As with the X-10 Graphite Reactor, new fuel slugs were inserted into process tubes (fuel channels) at the front face of the reactor. The irradiated fuel slugs were pushed out of the fuel channels at the back face of the reactor, falling into a water pool to allow the slugs to cool before further processing for plutonium recovery.
Reactor cooling was provided by the once-through flow of filtered and processed fresh water drawn from the Columbia River. The heated water was discharged from the reactor into large retention basins that allowed some cooling time before the water was returned to the Columbia River.
Construction of B Reactor began 75 years ago, in October 1943, and fuel loading started 11 months later, on 13 September 1944. Initial criticality occurred on 26 September 1944, followed shortly by operation at the initial design power of 250 MWt.
B Reactor was the first reactor to experience the effects of xenon poisoning due to the accumulation of xenon (Xe-135) in the uranium fuel. Xe-135 is a decay product of the relatively short-lived (6.7 hour half-life) fission product iodine (I-135). With its very high neutron absorption cross-section, Xe-135 absorbed enough neutrons to significantly, and unexpectedly, reduce B Reactor power. Fortunately, DuPont had added more process tubes (a total of 2,004) than called for in the original design of B Reactor. After the xenon poisoning problem was understood, additional fuel was loaded, providing the core with enough excess reactivity to override the neutron poisoning effects of Xe-135.
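The I-135 to Xe-135 decay chain behind this effect can be sketched with the standard Bateman solution. A minimal illustration of the post-shutdown xenon transient, using the 6.7 hour I-135 half-life cited above and the standard ~9.2 hour Xe-135 half-life; initial inventories are arbitrary relative units, and neutron burnup of Xe-135 (which matters while the reactor is running) is ignored:

```python
import math

# I-135 -> Xe-135 decay chain after shutdown (Bateman solution).
# I-135 half-life of 6.7 h is the figure cited above; the ~9.2 h
# Xe-135 half-life is a standard value. Inventories are in arbitrary
# relative units; in-core neutron burnup of Xe-135 is neglected.

LAM_I = math.log(2) / 6.7     # I-135 decay constant, per hour
LAM_XE = math.log(2) / 9.2    # Xe-135 decay constant, per hour

def xenon(t_h: float, i0: float = 1.0, xe0: float = 0.1) -> float:
    """Xe-135 inventory t_h hours after shutdown, given initial I and Xe."""
    ingrowth = (LAM_I * i0 / (LAM_XE - LAM_I)) * (
        math.exp(-LAM_I * t_h) - math.exp(-LAM_XE * t_h)
    )
    return xe0 * math.exp(-LAM_XE * t_h) + ingrowth

# Xe-135 rises for several hours after shutdown before decaying away:
for t in (0, 4, 8, 24, 48):
    print(f"t = {t:2d} h   Xe-135 = {xenon(t):.3f}")
```

Because the iodine parent keeps feeding xenon faster than the xenon decays at first, the poison inventory rises for hours after a power reduction, which is exactly the unexpected behavior that startled the B Reactor operators.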
On 3 February 1945, the first batch of B Reactor plutonium was delivered to Los Alamos, just 10 months after the first small plutonium sample from the X-10 Graphite Reactor had been delivered.
Regular plutonium deliveries from the Hanford production reactors provided the plutonium needed for the first ever nuclear device (the Gadget) tested at the Trinity site near Alamogordo, NM on 16 July 1945, as well as for the Fat Man atomic bomb dropped on Nagasaki, Japan on 9 August 1945 and an unused second Fat Man atomic bomb. These three devices each contained about 13.7 pounds (6.2 kilograms) of weapons-grade plutonium produced in the Hanford production reactors.
From March 1946 to June 1948, B Reactor was shut down for maintenance and modifications. In March 1949, B Reactor began the first tritium production campaign, irradiating targets containing lithium and producing tritium for hydrogen bombs.
By 1963, B Reactor was permitted to operate at a maximum power level of 2,090 MWt. B Reactor continued operation until 29 January 1968, when it was ordered shut down by the Atomic Energy Commission. Because of its historical significance, B Reactor was given special status that allows it to be open for public tours as part of the Manhattan Project National Historical Park.
The Other WW II Production Reactors at the Hanford Site: D & F
During WW II, three plutonium reactors of the same design were operational at Hanford: B, D and F. All had an initial design power rating of 250 MWt and by 1963 all were permitted to operate at a maximum power level of 2,090 MWt.
D Reactor: This was the world’s second full-scale nuclear reactor. It became operational in December 1944, but experienced operational problems early in life due to growth and distortion of its graphite core. After developing a process for controlling graphite distortion, D Reactor operated successfully through June 1967.
F Reactor: This was the third of the original three production reactors at Hanford. It became operational in February 1945 and ran for more than twenty years until it was shut down in June 1965.
D and F Reactors currently are in “interim safe storage,” which commonly is referred to as “cocooned.” These reactor sites are not part of the Manhattan Project National Historical Park.
Post-war Production Reactors at Hanford: H, DR, C, K-West, K-East & N
After WW II, six additional plutonium production reactors were built and operated at Hanford. The first three, named H, DR and C, were very similar in design to the B, D and F Reactors. The next two, K-West and K-East, were of similar design, but significantly larger than their predecessors. The last reactor, named N, was a one-of-a-kind design.
H Reactor: This was the first plutonium production reactor built at Hanford after WW II. It became operational in October 1949 with a design power rating of 400 MWt and by 1963 was permitted to operate at a maximum power level of 2,090 MWt. It operated for 15 years before being permanently shut down in April 1965.
DR Reactor: This reactor originally was planned as a replacement for the D Reactor and was built adjacent to the D Reactor site. DR became operational in October 1950 with an initial design power rating of 250 MWt. It operated in parallel with D Reactor for 14 years, and by 1963 was permitted to operate at the same maximum power level of 2,090 MWt. DR was permanently shut down in December 1964.
C Reactor: Construction started in June 1951 and was completed in November 1952, with the reactor operating initially at a design power of 650 MWt. By 1963, C Reactor was permitted to operate at a maximum power level of 2,310 MWt. It operated for sixteen years before being shut down in April 1969. In 1998, C Reactor became the first reactor at Hanford to be placed in interim safe storage.
K-West & K-East Reactors: These larger reactors differed from their predecessors mainly in the size of the moderator stack; the number, size and type of process tubes (3,220 in all); the type of shielding and other materials employed; and the addition of a process heat recovery system to heat the facilities. These reactors were built side-by-side and became operational within four months of each other in 1955: K-West in January and K-East in April. They initially had a design power of 1,800 MWt and by 1963 were permitted to operate at a maximum power level of 4,400 MWt, before an administrative limit of 4,000 MWt was imposed by the Atomic Energy Commission. The two reactors ran for more than 15 years. K-West was permanently shut down in February 1970, followed by K-East in January 1971.
N Reactor: This was the last of Hanford’s nine plutonium production reactors and the only one designed as a dual-purpose reactor, capable of serving as a production reactor while also generating electric power for distribution to the external power grid. The N Reactor had a reactor design power rating of 4,000 MWt and was capable of generating 800 MWe. The N Reactor also was the only Hanford production reactor with a closed-loop primary cooling system. Plutonium production began in 1964, two years before the power generating part of the plant was completed in 1966. N Reactor operated for 24 years until 1987, when it was shut down for routine maintenance. However, it never restarted; it was placed in standby status by DOE and later retired.
Four of these reactors (H, DR, C and N) are in interim safe storage while the other two (K-West and K-East) are being prepared for interim safe storage. None of these reactor sites are part of the Manhattan Project National Historical Park.
The Federation of American Scientists (FAS) reported that the nine Hanford production reactors produced 67.4 metric tons of plutonium, including 54.5 metric tons of weapons-grade plutonium, through 1987, when the last Hanford production reactor (N Reactor) was shut down.
4. Other Manhattan Project Sites
There are many MED sites that are not yet part of the Manhattan Project National Historical Park. You’ll find details on all of the MED sites on the Atomic Heritage Foundation website, which you can browse at the following link:
Another site worth browsing is the interactive world map created by the ALSOS Digital Library for Nuclear Issues on Google Maps to show the locations and provide information on offices, mines, mills, plants, laboratories, and test sites of the US nuclear weapons complex from World War II to 2016. The map includes over 300 sites, including the Manhattan Project sites. I think you’ll enjoy exploring this interactive map.
Greene, Sherrell R., “A diamond in Dogpatch: The 75th anniversary of the Graphite Reactor – Part 2: The Postwar Years,” American Nuclear Society, December 2018 www.ans.org/pubs/magazines/download/a_1139
“Uranium Enrichment Processes Directed Self-Study Course, Module 5.0: Electromagnetic Separation (Calutron) and Thermal Diffusion,” US Nuclear Regulatory Commission Technical Training Center, 9/08 (Rev 3) https://www.nrc.gov/docs/ML1204/ML12045A056.pdf
“Uranium Enrichment Processes Directed Self-Study Course, Module 2.0: Gaseous Diffusion,” US Nuclear Regulatory Commission Technical Training Center, 9/08 (Rev 3) https://www.nrc.gov/docs/ML1204/ML12045A050.pdf
Hanford site, plutonium production reactors and processing facilities:
“Hanford Site Historical District: History of the Plutonium Production Facilities 1943-1990,” DOE/RL-97-1047, Department of Energy, Hanford Cultural and Historical Resources Program, June 2002 https://www.osti.gov/servlets/purl/807939
“Operating Limits – Hanford Production Reactors,” HW-76327, Research and Engineering Operation, Irradiation Processing Department, 5 November 1963 https://www.osti.gov/servlets/purl/10189795
“Hanford’s Historic B Reactor – Presentation to PNNL Open World Forum March 20, 2009,” HNF-40918-VA, Department of Energy, 2009 https://www.osti.gov/servlets/purl/951760