Atacama Large Millimeter / submillimeter Array (ALMA) Provides a Unique Window on the Universe

Peter Lobner

The Atacama Large Millimeter / submillimeter Array (ALMA) is a single telescope composed of 66 high-precision, 12-meter antennas. ALMA operates at wavelengths of 0.3 to 9.6 millimeters. As shown in the following chart, this puts ALMA’s observing range around the boundary between microwave and infrared.

Electromagnetic spectrum chart. Source: physics.tutorvista.com

This enables ALMA’s users to examine “cold” regions of the universe, which are optically dark but radiate brightly in the millimeter / submillimeter portions of the electromagnetic spectrum. In that frequency range, ALMA is a complete astronomical imaging and spectroscopic instrument with resolution better than that of the Hubble Space Telescope.
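As a cross-check, the observing band can be restated in frequency terms using f = c/λ. The following sketch derives the frequencies from the wavelengths quoted above (my own arithmetic, not figures from ALMA documentation):

```python
# Convert ALMA's quoted observing wavelengths (0.3 - 9.6 mm) to frequencies.
C = 299_792_458.0  # speed of light in m/s

def wavelength_mm_to_ghz(wavelength_mm: float) -> float:
    """Return the frequency in GHz for a wavelength given in millimeters."""
    return C / (wavelength_mm * 1e-3) / 1e9

# The long-wavelength end lands in the microwave region, the short end near 1 THz.
print(f"9.6 mm -> {wavelength_mm_to_ghz(9.6):.1f} GHz")
print(f"0.3 mm -> {wavelength_mm_to_ghz(0.3):.1f} GHz")
```

This works out to roughly 31 GHz to 1,000 GHz (1 THz), consistent with the chart’s placement of ALMA around the boundary between the microwave and infrared bands.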

The ALMA Array Operations Site (AOS) is located on the Chajnantor plateau (which in the local Atacameño language, Kunza, means “place of departure”), at an elevation of about 5,000 meters (16,400 feet) above sea level in northern Chile.

View of the AOS. Source: ESO

On 30 September 2013 the last of the 66 antennas, each of which weighs more than 100 tons, was delivered to the AOS on the giant transporter named Otto (one of two available for the task) and handed over to the ALMA Observatory. The 12 meter antennas have reconfigurable baselines ranging from 15 meters to 18 km. Depending on what is being observed, the transporters can move ALMA antennas to establish the desired array. The transporters carry power generators to maintain the cryogenic systems needed to ensure that the antenna continues functioning during transport.

Left: ALMA antenna on a transporter. Right: ALMA antennas. Source: ESO

ALMA is managed by an international partnership of the European Southern Observatory (ESO), the U.S. National Science Foundation (NSF) and the National Institutes of Natural Sciences (NINS) of Japan, together with NRC (Canada), NSC and ASIAA (Taiwan), and KASI (Republic of Korea), in cooperation with the Republic of Chile.

The ALMA telescope is operated from the Operations Support Facilities (OSF), which is located at a considerable distance from the telescope at an elevation of about 2,900 meters (9,500 feet). The OSF also served as the Assembly, Integration, Verification, and Commissioning (AIVC) station for all the antennas and other high technology equipment before they were moved to the AOS.

The ALMA website is at the following link:

http://www.almaobservatory.org

You’ll find many downloadable ALMA-related documents on the Publications tab of this website. A good overview of the ALMA telescope and the design of the individual antennas is available at:

http://www.almaobservatory.org/images/pdfs/alma_brochure_explore_2007.pdf

ALMA press releases, with details on many of the interesting observations being made at the observatory, are available at the following link:

http://www.almaobservatory.org/en/press-room/press-releases

An example of the type of remarkable observations being made with ALMA is described in the 16 July 2016 press release, “ALMA Observes First Protoplanetary Water Snow Line Thanks to Stellar Outburst.”

“This line marks where the temperature in the disk surrounding a young star drops sufficiently low for snow to form. A dramatic increase in the brightness of the young star V883 Orionis flash heated the inner portion of the disk, pushing the water snow line out to a far greater distance than is normal for a protostar, and making it possible to observe it for the first time.”

ALMA was looking in the right place at the right time. An artist’s impression of the water-snow line around V883 Orionis is shown in the ESO image below.

Credit: A. Angelich (NRAO/AUI/NSF)/ALMA (ESO/NAOJ/NRAO)

You can read this ALMA press release and view a short video simulation of the event at the following link:

http://www.eso.org/public/usa/news/eso1626/

No doubt ALMA’s unique capabilities will continue to expand our knowledge of the universe in the millimeter / submillimeter portions of the electromagnetic spectrum. In collaboration with great land-based and space-based observatories operating in other portions of the spectrum, ALMA will help create a more comprehensive understanding of our universe. See my 6 March 2016 post, “Remarkable Multispectral View of Our Milky Way Galaxy,” to see how different a portion of the night sky can look in different portions of the electromagnetic spectrum.

Is it Possible to Attribute Specific Extreme Weather Events to Global Climate Change?

Peter Lobner

On 7 September 2016, the National Oceanic and Atmospheric Administration (NOAA) reported that climate change increased the chance of record rains in Louisiana by at least 40%. This finding was based on a rapid assessment conducted by NOAA and partners after unusually severe and prolonged rains affected a broad area of Louisiana in August 2016. You can read this NOAA news release at the following link:

http://www.noaa.gov/media-release/climate-change-increased-chances-of-record-rains-in-louisiana-by-at-least-40-percent

NOAA reported that models indicated the following:

  • The return period for extreme rain events of the magnitude of the mid-August 2016 downpour in Louisiana has decreased from an average of 50 years to 30 years.
  • A typical 30-year event in 1900 would have had 10% less rain than a similar event today; for example, 23 inches instead of 25 inches.

NOAA notes that “return intervals” are statistical averages over long periods of time, which means that it’s possible to have more than one “30-year event” in a 30-year period.
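NOAA’s caution about return intervals can be made concrete with a simple binomial model. Treating a “30-year event” as having a 1-in-30 chance of occurring in any given year, independent from year to year (an idealized assumption for illustration only), the chance of seeing two or more such events in a single 30-year period is substantial:

```python
from math import comb

def prob_at_least(k_min: int, years: int, annual_p: float) -> float:
    """Probability of at least k_min occurrences in `years` independent years,
    each with per-year probability `annual_p` (binomial model)."""
    return sum(
        comb(years, k) * annual_p**k * (1 - annual_p)**(years - k)
        for k in range(k_min, years + 1)
    )

# Chance of two or more "30-year events" within one 30-year window.
p = prob_at_least(2, years=30, annual_p=1 / 30)
print(f"P(2 or more 30-year events in 30 years) = {p:.1%}")
```

Under these assumptions the probability comes out to roughly 26%, which is why a single 30-year period can easily contain more than one “30-year event.”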

Source: NOAA

In their news release, NOAA included the following aerial photos of Denham Springs, Louisiana. The photo on the left was taken at the height of the flooding on August 15, 2016. The photo on the right was taken three days later, when floodwaters had receded.

Source: NOAA / National Geodetic Survey

World Weather Attribution (WWA) is an international effort “designed to sharpen and accelerate the scientific community’s ability to analyze and communicate the possible influence of climate change on extreme-weather events such as storms, floods, heat waves and droughts.” Their website is at the following link:

https://wwa.climatecentral.org

WWA attempts to address the question: “Did climate change have anything to do with this?” but on their website, WWA cautions:

“Scientists are now able to answer this for many types of extremes. But the answer may vary depending on how the question is framed… it is important for every extreme event attribution study to clearly define the event and state the framing of the attribution question.”

To get a feeling for how they applied this principle, you can read the WWA report, “Louisiana Downpours, August 2016,” at the following link:

https://wwa.climatecentral.org/analyses/louisiana-downpours-august-2016/

I find this report quite helpful in putting the Louisiana extreme precipitation event in perspective. However, I object to the report’s reference to “human-caused climate change” because the findings should apply regardless of the source of the observed change in climate between 1900 and 2016.

On the WWA website, you can easily navigate to several other very interesting analyses of extreme weather events, and much more.

The National Academies Press (NAP) recently published the following two reports on extreme weather attribution, both of which are worth your attention.

The first NAP report, “Attribution of Extreme Weather Events in the Context of Climate Change,” applies to the type of rapid assessment performed by NOAA after the August 2016 extreme precipitation event in Louisiana. The basic premise of this report is as follows:

“The media, the public, and decision makers increasingly ask for results from event attribution studies during or directly following an extreme event. To meet this need, some groups are developing rapid and/or operational event attribution systems to provide attribution assessments on faster timescales than the typical research mode timescale, which can often take years.”

Source: NAP

If you have established a free NAP account, you can download a pdf copy of this report for free at the following link:

http://www.nap.edu/catalog/21852/attribution-of-extreme-weather-events-in-the-context-of-climate-change

The second NAP report, “Frontiers in Decadal Climate Variability,” addresses a longer-term climate issue. This report documents the results of a September 2015 workshop convened by the National Academies of Sciences, Engineering, and Medicine to examine variability in Earth’s climate on decadal timescales, which they define as 10 to 30 years.

Source: NAP

This report puts the importance of understanding decadal climate variability in the following context:

“Many factors contribute to variability in Earth’s climate on a range of timescales, from seasons to decades. Natural climate variability arises from two different sources: (1) internal variability from interactions among components of the climate system, for example, between the ocean and the atmosphere, and (2) natural external forcing (functions), such as variations in the amount of radiation from the Sun. External forcing (functions) on the climate system also arise from some human activities, such as the emission of greenhouse gases (GHGs) and aerosols. The climate that we experience is a combination of all of these factors.

Understanding climate variability on the decadal timescale is important to decision-making. Planners and policy makers want information about decadal variability in order to make decisions in a range of sectors, including for infrastructure, water resources, agriculture, and energy.”

While decadal climate variability is quite different from specific extreme weather events, decadal variability establishes the underlying climate patterns on which extreme weather events may occur.

You can download a pdf copy of this report for free at the following link:

http://www.nap.edu/catalog/23552/frontiers-in-decadal-climate-variability-proceedings-of-a-workshop

I think it’s fair to say that, in the future, we will be seeing an increasing number of “quick response” attributions of extreme weather events to climate change. Each day in the financial section of the newspaper (Yes, I still get a printed copy of the daily newspaper!), there is an attribution from some source about why the stock market did what it did the previous day. Some days these financial attributions seem to make sense, but other days they’re very much like reading a fortune cookie or horoscope, offering little more than generic platitudes.

Hopefully there will be real science behind attributions of extreme weather events to climate change and the attributors will heed WWA’s caution:

“…it is important for every extreme event attribution study to clearly define the event and state the framing of the attribution question.”

Modernizing the Marine Corps Amphibious Landing Capabilities

Peter Lobner

Updated 7 January 2019 and 15 December 2020

1.  Introduction

The U.S. Marine Corps is taking a two-pronged approach to ensure its readiness to conduct forcible amphibious landing operations: (1) modernize the fleet of existing Assault Amphibious Vehicles (AAVs), the AAV7A1, and (2) select the contractor for the next-generation Amphibious Combat Vehicle (ACV). The firms involved in these programs are Science Applications International Corporation (SAIC) and BAE Systems.

Both the existing Marine AAVs and the new ACVs are capable of open-ocean ship launch and recovery operations from a variety of the Navy’s amphibious warfare ships, such as a landing ship dock (LSD) or landing platform dock (LPD). These ships may be as much as 12 miles (19 km) offshore. After traveling like a small boat toward the shore, maneuvering through the surf line, and landing on the beach, the AAVs and new ACVs operate as land vehicles to deliver troops, cargo, or perform other missions.

Current-generation AAV7A1s in an LPD well deck. Source: Wikimedia Commons / U.S. Navy. Current-generation AAV7A1 disembarking from an LPD well deck into the open ocean. Source: U.S. Navy

The Marine Corps plans to maintain the ability to put 10 amphibious battalions ashore during a forcible landing operation.

Let’s take a more detailed look at the Marine Corps AAV7A1 modernization program and the new ACV competition.

2.  The modernized AAV SU

The AAV SU is an upgraded version of the existing, venerable Marine Corps AAV7A1, which can carry 25 embarked Marines. The AAV SU incorporates the following modernized systems and survivability upgrades:

  • armor protection on its flat underbelly
  • buoyant ceramic armor on the flanks
  • blast-resistant seats replacing legacy bench seating
  • new engine & transmission; greater horsepower & torque
  • improved water jet propulsors yielding higher speed at sea
  • external fuel tanks, and
  • upgraded vehicle controls and driver interface

Current-generation AAV7A1 after landing on a beach. Source: okrajoe. Unveiling the AAV SU. Source: SAIC

In January 2016, SAIC unveiled the modernized AAV SU at its facility in Charleston SC and delivered the first prototype for testing at U.S. Marine Corps Base Quantico, VA on 4 March 2016. A total of 10 AAV SUs will be tested before the Marine Corps commits to upgrading its entire fleet of 392 AAVs.

Even after ACV deployment, the Marine Corps plans to maintain enough AAV SUs to equip four amphibious battalions.

You can view a Marine Corps video on the AAV survivability upgrade program at the following link:

3. The Next-generation ACV

On 24 November 2015, BAE Systems and SAIC were down-selected from a field of five competitors and awarded contracts to build engineering and manufacturing development prototypes of their respective next-generation ACVs. Both of the winning firms are offering large, eight-wheel drive vehicles that are designed to be more agile and survivable on land than the current AAV, with equal performance on the water.  The ACV is air-transportable in a C-130 Hercules or larger transport aircraft.

Under contracts valued at more than $100 million, BAE Systems and SAIC each will build 16 ACVs to be delivered in the January – April 2017 time frame for test and evaluation. It is expected that a winner will be selected in 2018 and contracted to deliver 204 ACVs starting in 2020. The new ACVs will form six Marine amphibious battalions that are all scheduled to be operational by the summer of 2023.

At the following link, you can view a Marine Corps video on the ACV program and its importance to the Marine’s “service defining” mission of making amphibious landings in contested areas:

BAE Systems ACV: Super AV

In 2011, BAE Systems teamed with the Italian firm Iveco to offer a variant of the Italian 8-wheeled Super AV amphibious vehicle to the Marine Corps.

The BAE version of this diesel-powered vehicle has a top speed of 65 mph (105 kph) on paved roads and 6 knots (6.9 mph, 11 kph) in the water. Its range is 12 miles (19 km) at sea followed by 200 miles (322 km) on land. Two small shrouded propellers provide propulsion at sea. On land, the “H-drive” system provides power to individual wheels, so the vehicle can continue operating if an individual wheel is damaged or destroyed.

The armored passenger and crew compartments are protected by a V-shaped hull. Individuals are further protected from blast effects by shock-mounted seats.

On 27 September 2016, BAE Systems unveiled their 34-ton Super AV ACV, which normally will carry a crew of three and 11 embarked Marines, with a capability to carry two more for a total of 13 (i.e., a full Marine squad).

BAE Super AV unveiledBAE Super AV ACV unveiled. Source: BAE Systems

You can view a 2014 BAE Systems video on their Super AV at the following link:

https://www.youtube.com/watch?v=9QK7xUtzjA4

SAIC ACV: Terrex 2

SAIC partnered with ST Kinetics, which developed the Terrex amphibious vehicle currently in use by Singapore’s military. This vehicle currently is configured for a crew of three and 11 embarked Marines.

The basic configuration of SAIC’s Terrex 2 is similar to the BAE Super AV: a V-shaped hull, shock-mounted seats and other occupant protection, propeller-driven in the water and independently wheel-driven on land, with similar mobility. SAIC’s Terrex 2 can reach speeds of 55 mph (88.5 kph) on paved roads and 7 knots (8 mph, 12.9 kph) in the open ocean. A Remote Weapon System (machine guns and cannon) and 10 “fusion cameras” allow closed-hatch missions with day/night 360-degree situational awareness.

SAIC Terrex 2 landing on a beach. Source: SAIC. SAIC ACV. Source: SAIC

You can see a short 2014 SAIC video on their AAV SU upgrade program and their Terrex 2 ACV at the following link:

7 January 2019 Update:  BAE won the ACV competition in June 2018

On 19 June 2018, the Marine Corps announced that it had selected BAE to build the next-generation Amphibious Combat Vehicle and had awarded BAE a $198 million contract for the first 30 ACVs. These vehicles are due to be delivered in the fall of 2019 for use in Initial Operational Test & Evaluation (IOT&E). A decision to begin full-rate production of the ACV is expected in 2020.

You’ll find more information on the ACV selection and BAE contract award on the Breaking Defense website here:

https://breakingdefense.com/2018/06/bae-beats-upstart-saic-to-build-marine-amphibious-combat-vehicle/

15 December 2020 Update:  BAE Set to Begin Full-Rate Production of the Marines’ New Amphibious Combat Vehicles 

In December 2020, the Marine Corps awarded BAE Systems a contract valued at almost $185 million to start full-rate production of the ACV and deliver the first 36 amphibious combat vehicles. BAE expects that this first-lot order will increase to 72 vehicles in early 2021.  In following years, the Marines have options to order 80 vehicles annually over five years.

The Marines’ new BAE ACV on the beach at Marine Corps Base Camp Pendleton, California. Source: Andrew Cortez / U.S. Marine Corps

You’ll find more information at the following link:

https://www.military.com/daily-news/2020/12/14/marines-new-amphibious-combat-vehicles-set-begin-full-rate-production.html?ESRC=eb_201215.nl

Deadline – Espionage or Innocent Coincidence?

Peter Lobner

The March 1944 issue of Astounding Science Fiction magazine contained a short story by Cleve Cartmill entitled “Deadline” that may, or may not, have revealed secrets related to the Manhattan Project. The magazine was edited by MIT-educated John W. Campbell, Jr.

March 1944 cover. Source: Astounding Science Fiction

Cleve Cartmill’s notoriety after the publication of Deadline is described in The Encyclopedia of Science Fiction (http://www.sf-encyclopedia.com/entry/cartmill_cleve):

“He is best remembered in the field for one famous (but untypical) story, “Deadline” (March 1944 Astounding), which described the atomic bomb a year before it was dropped: in this near-future fable, the evil Sixa (i.e., Axis) forces are prevented from dropping the Bomb, and the Seilla (Allies) decline to do so, justly fearing its dread potential. US Security subsequently descended on Astounding, but was persuaded (truthfully) by John W. Campbell Jr that Cartmill had used for his research only material available in public libraries. Cartmill’s prediction made sf fans enormously proud, and the story was made a prime exhibit in the arguments about prediction in sf.”

I’ve been unable to find an online source for the full-text of Deadline, but here’s a sample of the March 1944 text:

“U-235 has been separated in quantity sufficient for preliminary atomic-power research and the like. They get it out of uranium ores by new atomic isotope separation methods; they now have quantities measured in pounds… But they have not brought it together, or any major portion of it. Because they are not at all sure that, once started, it would stop before all of it had been consumed… They could end the war overnight with controlled U-235 bombs… So far, they haven’t worked out any way to control the explosion.”

The status of the Manhattan Project’s nuclear weapons infrastructure at the time that Deadline was published in March 1944 is outlined below.

  • The initial criticality at the world’s first nuclear reactor, the CP-1 pile in Chicago, occurred on 2 December 1942.
  • The initial criticality at the world’s second nuclear reactor, the X-10 Graphite Reactor in Oak Ridge (also known as the Clinton pile and the X-10-pile), and the first reactor designed for continuous operation, occurred 4 November 1943. X-10 produced its first plutonium in early 1944.
  • The initial criticality of the first large-scale production reactor, Hanford B, occurred in September 1944. This was followed by Hanford D in December 1944, and Hanford F in February 1945.
  • Initial operation of the first production-scale thermal diffusion plant (S-50 at Oak Ridge) began in January 1945, delivering 0.8 – 1.4% enriched uranium initially to the Y-12 calutrons, and later to the K-25 gaseous diffusion plant.
  • The first production-scale gaseous diffusion plant (K-25 at Oak Ridge) began operation in February 1945, delivering uranium enriched up to about 23% to the Y-12 calutrons.
  • The Y-12 calutrons began operation in February 1945 with feed from S-50, and later from K-25. The calutrons provided uranium at the enrichment needed for the first atomic bombs.
  • The Trinity nuclear test occurred on 16 July 1945.
  • The Little Boy uranium bomb was dropped on Hiroshima on 6 August 1945.
  • The Fat Man plutonium bomb was dropped on Nagasaki on 9 August 1945.

You can read more about Deadline, including the reaction at Los Alamos to this short story, on Wikipedia at the following link:

https://en.wikipedia.org/wiki/Deadline_(science_fiction_story)

You also can download, “The Astounding Investigation: The Manhattan Project’s Confrontation With Science Fiction,” by Albert Berger at the following link:

https://www.gwern.net/docs/1984-berger.pdf

This investigation, which focused on Astounding Science Fiction, identified a number of sci-fi stories from 1934 to 1944 that included references to atomic weapons in their story lines, so Deadline was not the first to do so. Regarding the source of the technical information used in Deadline, the investigation report notes:

“However, when questioned as to the source of the technical material in “Deadline,” the references to U-235 separation, and to bomb and fuse design, Cartmill ‘explained that he took the major portion of it directly from letters sent to him by John Campbell…and a minor portion of it from his own general knowledge.’”

While Deadline may have angered many senior security officers in the Manhattan Project’s Military Intelligence organization, neither Cartmill nor Campbell was ever charged with a crime. The investigation noted that stories like Deadline could cause unwanted public speculation about actual classified projects. In addition, such stories might help people working in compartmented classified programs gain a better understanding of the broader context of their work.

I don’t think there was any espionage involved, but, for its time, Deadline provided very interesting insights into a fictional nuclear weapons project. What do you think?

The Universe is Isotropic

Peter Lobner, Updated 12 January 2021

The concepts of up and down appear to be relatively local conventions that can be applied at the levels of subatomic particles, planets and galaxies. However, the universe as a whole apparently does not have a preferred direction that would allow the concepts of up and down to be applied at such a grand scale.

A 7 September 2016 article entitled, “It’s official: You’re lost in a directionless universe,” by Adrian Cho, provides an overview of research that demonstrates, with a high level of confidence, that the universe is isotropic. The research was based on data from the Planck space observatory. In this article, Cho notes:

“Now, one team of cosmologists has used the oldest radiation there is, the afterglow of the big bang, or the cosmic microwave background (CMB), to show that the universe is “isotropic,” or the same no matter which way you look: There is no spin axis or any other special direction in space. In fact, they estimate that there is only a one-in-121,000 chance of a preferred direction—the best evidence yet for an isotropic universe. That finding should provide some comfort for cosmologists, whose standard model of the evolution of the universe rests on an assumption of such uniformity.”

The European Space Agency (ESA) developed the Planck space observatory to map the CMB in microwave and infrared frequencies at unprecedented levels of detail. Planck was launched on 14 May 2009 and was placed in a Lissajous orbit around the L2 Lagrange point, which is 1,500,000 km (930,000 miles) directly behind the Earth. L2 is a quiet place, with the Earth shielding Planck from noise from the Sun. The approximate geometry of the Earth-Moon-Sun system and a representative spacecraft trajectory (not Planck, specifically) to the L2 Lagrange point is shown in the following figure.

Source: Abestrobi / Wikimedia Commons

The Planck space observatory entered service on 3 July 2009. At the end of its service life, Planck departed its valuable position at L2, was placed in a heliocentric orbit, and was deactivated on 23 October 2013. During more than four years in service, Planck performed its CMB mapping mission with much greater resolution than NASA’s Wilkinson Microwave Anisotropy Probe (WMAP), which operated from 2001 to 2010. Planck was designed to map the CMB with an angular resolution of 5-10 arc minutes and a sensitivity of a millionth of a degree.

One key result of the Planck mission is the all-sky survey shown below.

Planck 2013 all-sky survey CMB temperature map, showing anisotropies in the temperature of the CMB at the full resolution obtained by Planck. Source: ESA / Planck Collaboration

ESA characterizes this map as follows:

“The CMB is a snapshot of the oldest light in our Universe, imprinted on the sky when the Universe was just 380,000 years old. It shows tiny temperature fluctuations that correspond to regions of slightly different densities, representing the seeds of all future structure: the stars and galaxies of today.”

The researchers who reported that the universe was isotropic noted that an anisotropic universe would leave telltale patterns in the CMB. However, these researchers found that the actual CMB shows only random noise and no signs of such patterns.

In 2015, the ESA / Planck Collaboration used CMB data to estimate the age of the universe at 13.813 ± 0.038 billion years. This was slightly higher than, but within the uncertainty band of, an estimate derived in 2012 from nine years of data from NASA’s WMAP spacecraft.

In July 2018, the ESA / Planck Collaboration published the “Planck Legacy” release of their results, which included the following two additional CMB sky survey maps.

Planck all-sky survey: smoothed CMB temperature map (top) and smoothed temperature + polarization map (bottom). Source: ESA / Planck Collaboration

The ESA/Planck Collaboration described these two new maps as follows:

  • (In the top map), “the temperature anisotropies have been filtered to show mostly the signal detected on scales around 5º on the sky. The lower view shows the filtered temperature anisotropies with an added indication of the direction of the polarized fraction of the CMB.”
  • “A small fraction of the CMB is polarized – it vibrates in a preferred direction. This is a result of the last encounter of this light with electrons, just before starting its cosmic journey. For this reason, the polarization of the CMB retains information about the distribution of matter in the early Universe, and its pattern on the sky follows that of the tiny fluctuations observed in the temperature of the CMB” (in the 2013 map, above).

Using Planck CMB data, the ESA / Planck Collaboration team has estimated the value of the Hubble constant. Their latest estimate, in 2018, was 67.4 km / second / megaparsec with an uncertainty of less than 1%.  This is lower than the value derived from astrophysical measurements: 73.5 km / second / megaparsec with an uncertainty of 2%.
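The gap between these two numbers, often called the “Hubble tension,” can be expressed in standard deviations. The sketch below uses the quoted central values and takes the percentage uncertainties at face value (1% and 2%), assuming independent Gaussian errors; this is an illustrative calculation of my own, not one from the cited sources:

```python
from math import hypot

# Hubble constant estimates quoted above, in km/s/Mpc.
h_planck, h_astro = 67.4, 73.5
sigma_planck = 0.01 * h_planck  # "uncertainty of less than 1%" taken as 1%
sigma_astro = 0.02 * h_astro    # "uncertainty of 2%"

# Tension = difference divided by combined (root-sum-square) uncertainty.
tension = (h_astro - h_planck) / hypot(sigma_planck, sigma_astro)
print(f"Difference: {h_astro - h_planck:.1f} km/s/Mpc, about {tension:.1f} sigma")
```

Under these assumptions the discrepancy is close to 4 sigma, large enough that it is unlikely to be a statistical fluke, which is why the two estimates attract so much attention.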

You’ll find more details on the Planck mission and scientific results on the ESA’s website at the following link: http://www.esa.int/Our_Activities/Space_Science/Planck

For more information:

New Catalyst Could Greatly Reduce the Cost of Splitting Water

Peter Lobner

Splitting water (H2O) is the process of separating the water molecule into its constituent parts: hydrogen (H2) and oxygen (O2). A catalyst is a substance that speeds up a chemical reaction, or lowers the energy required to get the reaction started, without itself being consumed in the reaction.

Water molecule. Source: Laguna Design, Getty Images

A new catalyst, created as a thin-film crystal composed of one layer of iridium oxide (IrOx) and one layer of strontium iridium oxide (SrIrO3), is described in a September 2016 article by Umair Irfan entitled “How Catalyst Could Split Water Cheaply.” This article is available on the Scientific American website at the following link:

http://www.scientificamerican.com/article/new-catalyst-could-split-water-cheaply/?utm_source=howtogeek&utm_medium=email&utm_campaign=newsletter

The new catalyst, which is the only known catalyst of its kind to work in acid, applies to the oxygen evolution reaction, the slower half of the water-splitting process.
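For context, in acidic conditions the overall water-splitting process divides into two half-reactions; the oxygen evolution reaction (OER), the step this catalyst targets, is the four-electron half:

```latex
\begin{align*}
\text{OER (anode):}\quad & 2\,\mathrm{H_2O} \rightarrow \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^- \\
\text{HER (cathode):}\quad & 4\,\mathrm{H^+} + 4\,e^- \rightarrow 2\,\mathrm{H_2} \\
\text{Overall:}\quad & 2\,\mathrm{H_2O} \rightarrow 2\,\mathrm{H_2} + \mathrm{O_2}
\end{align*}
```

The four electrons and four protons that must be transferred per O2 molecule are a key reason the OER is the kinetically slower half of the process.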

Author Irfan notes that, “Many of the artificial methods of making hydrogen and oxygen from water require materials that are too expensive, require too much energy or break down too quickly in real-world conditions…” The availability of a stable catalyst that can significantly improve the speed and economics of water splitting could help promote the shift toward more widespread use of clean, renewable fuels. The potential benefits include:

  • May significantly improve hydrogen fuel economics
  • May allow water splitting to compete with other technologies (i.e., batteries and pumped storage) for energy storage. See my 4 March 2016 posting on the growing need for grid energy storage.
  • May improve fuel cells

At this point, it is not clear exactly how the IrOx / SrIrO3 catalyst works, so more research is needed before the practicality of its use in industrial processes can be determined.

The complete paper, “A highly active and stable IrOx/SrIrO3 catalyst for the oxygen evolution reaction,” by Seitz, L. et al., is available to subscribers on the Science magazine website at the following link:

http://science.sciencemag.org/content/353/6303/1011.full

Space-based Gravity Wave Detection System to be Deployed by ESA

Peter Lobner

The first detection of gravitational waves occurred on 14 September 2015 at the land-based Laser Interferometer Gravitational-Wave Observatory (LIGO). Using optical folding techniques, LIGO has an effective baseline of 1,600 km (994 miles). See my 16 December 2015 and 11 February 2016 posts for more information on LIGO and other land-based gravitational wave detectors.

Significantly longer baselines, and theoretically greater sensitivity, can be achieved with gravitational wave detectors in space. Generically, such a space-based detector has become known as a Laser Interferometer Space Antenna (LISA). Three projects associated with space-based gravitational wave detection are:

  • LISA (the project name predated the current generic usage of LISA)
  • LISA Pathfinder (a space-based gravitational wave detection technology demonstrator, not a detector)
  • Evolved LISA (eLISA)

These projects are discussed below.

The science being addressed by space-based gravitational wave detectors is discussed in the eLISA white paper, “The Gravitational Universe.” You can download this whitepaper, a 1-page summary, and related gravitational wave science material at the following link:

https://www.elisascience.org/whitepaper/

LISA

The LISA project originally was planned as a joint European Space Agency (ESA) and National Aeronautics & Space Administration (NASA) project to detect gravitational waves using a very long baseline, triangular interferometric array of three spacecraft.

Each spacecraft was to contain a gravitational wave detector sensitive at frequencies between 0.03 mHz and 0.1 Hz and have the capability to precisely measure its distances to the other two spacecraft forming the array. The equilateral triangular array, which was to measure about 5 million km (3.1 million miles) on a side, was expected to be capable of measuring gravitational-wave induced strains in space-time by precisely measuring changes of the separation distance between pairs of test masses in the three spacecraft. In 2011, NASA dropped out of this project because of funding constraints.
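The scale of the measurement challenge follows directly from the basic strain relation ΔL = h · L. A minimal sketch, using the 5 million km arm length from the text and an assumed, illustrative strain amplitude (the strain value below is not from the original LISA specification):

```python
# Displacement produced by a gravitational-wave strain over the LISA baseline.
# The arm length is from the text; the strain amplitude is an assumed,
# representative value for illustration only.
def strain_displacement(strain, arm_length_m):
    """Return the change in separation (meters) for a given dimensionless strain."""
    return strain * arm_length_m

ARM_LENGTH_M = 5.0e9   # 5 million km, per the planned LISA array
STRAIN = 1.0e-20       # assumed, illustrative low-frequency source strength

delta_l = strain_displacement(STRAIN, ARM_LENGTH_M)
print(f"Displacement to resolve: {delta_l:.1e} m")  # tens of picometers
```

Even with million-kilometer arms, the detector must resolve displacements of picometer order, which is why the test-mass isolation and metrology technologies had to be proven first.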

LISA Pathfinder

The LISA Pathfinder (LPF) is a single spacecraft intended to validate key technologies for space-based gravitational wave detection. It does not have the capability to detect gravitational waves.

This mission was launched by ESA on 3 December 2015 and the spacecraft took station in a Lissajous orbit around the Sun-Earth L1 Lagrange point on 22 January 2016. L1 is directly between the Earth and the Sun, about 1.5 million km (932,000 miles) from Earth. An important characteristic of a Lissajous orbit is that the spacecraft will follow the L1 point without requiring any propulsion. This is important for minimizing external forces on the LISA Pathfinder experiment package. The approximate geometry of the Earth-Moon-Sun system and a representative spacecraft (not LPF, specifically) stationed at the L1 Lagrange point is shown in the following figure.

L1 Lagrange point. Source: Wikimedia Commons
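The ~1.5 million km figure for the L1 distance can be checked with the standard restricted three-body approximation r ≈ R (m / 3M)^(1/3). This is a rough sketch; the Earth/Sun mass ratio below is a standard textbook value, not a figure from the post:

```python
# Approximate Sun-Earth L1 distance using the standard Hill-sphere-style
# approximation r ~ R * (m / (3 M))**(1/3), where m/M is the
# Earth-to-Sun mass ratio (textbook value, an assumption here).
EARTH_SUN_DISTANCE_KM = 1.496e8   # 1 astronomical unit
MASS_RATIO = 3.003e-6             # Earth mass / Sun mass

r_l1_km = EARTH_SUN_DISTANCE_KM * (MASS_RATIO / 3.0) ** (1.0 / 3.0)
print(f"L1 distance from Earth: {r_l1_km:.2e} km")  # ~1.5e6 km, matching the text
```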

The LISA Pathfinder’s mission is to validate the technologies used to shield two free-floating metal cubes (test masses), which form the core of the experiment package, from all internal and external forces that could contribute to noise in the gravitational wave measurement instruments. The on-board measurement instruments (inertial sensors and a laser interferometer) are designed to measure the relative position and orientation of the test masses, which are 38 cm (15 inches) apart, to an accuracy of less than 0.01 nanometers (10⁻¹¹ meters). This measurement accuracy is believed to be adequate for detecting gravitational waves using this technology on ESA’s follow-on mission, eLISA.

The first diagram below is an artist’s impression of the LISA Pathfinder technology package, showing the inertial sensors housing the test masses (gold) and the laser interferometer (middle platform). The second diagram provides a clearer view of the test masses and the laser interferometer.

LPF technology package 1. Source: ESA/ATG medialab, August 2015

LPF technology package 2. Source: ESA LISA Pathfinder briefing, 7 June 2016

You’ll find more general information in an ESA LISA Pathfinder overview, which you can download from NASA’s LISA website at the following link:

http://lisa.nasa.gov/Documentation/LISA-LPF-RP-0001_v1.1.pdf

LISA Pathfinder was commissioned and ready for scientific work on 1 March 2016. In a 7 June 2016 briefing, ESA reported very favorable performance results from LISA Pathfinder:

  • LPF successfully validated the technologies used in the local (in-spacecraft) instrument package (test masses, inertial sensors and interferometer).
  • LPF interferometer noise was a factor of 100 lower than achieved on the ground.
  • The measurement instruments can sense femtometer-scale motion of the test masses (the LPF goal was picometer-scale).
  • Performance is essentially at the level needed for the follow-on eLISA mission.
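The margin between the picometer design goal and the reported femtometer-level sensitivity is easy to quantify; the sketch below uses only the SI prefix definitions, not any new LPF data:

```python
# Compare LPF's design goal (picometer-scale) with the reported
# femtometer-scale resolution. These are just SI prefix definitions.
PICOMETER_M = 1e-12   # LPF goal
FEMTOMETER_M = 1e-15  # reported resolution scale

margin = PICOMETER_M / FEMTOMETER_M
print(f"Reported resolution exceeds the goal by a factor of {margin:.0f}")  # 1000
```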

You can watch this full (1+ hour) ESA briefing at the following link:

http://www.esa.int/Our_Activities/Space_Science/Watch_LISA_Pathfinder_briefing

eLISA

Evolved LISA, or eLISA, is ESA’s modern incarnation of the original LISA program described previously. ESA’s eLISA website home page is at the following link:

https://www.elisascience.org

As shown in the following diagrams, three eLISA spacecraft will form a very long baseline interferometric array that is expected to directly observe gravitational waves from sources anywhere in the universe. In essence, this array will be a low frequency microphone listening for the sounds of gravitational waves as they pass through the array.

eLISA constellation 1. Source: ESA

eLISA constellation 2. Source: ESA

As discussed previously, gravitational wave detection depends on the ability to very precisely measure the distance between test masses that are isolated from their environment but subject to the influence of passing gravitational waves. Measuring the relative motion of a pair of test masses is considerably more complex for eLISA than it was for LPF. The relative motion measurements needed for a single leg of the eLISA triangular array are:

  • Test mass 1 to Spacecraft 1
  • Spacecraft 1 to Spacecraft 2
  • Spacecraft 2 to Test Mass 2

This needs to be done for each of the three legs of the array.
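Because each leg's test-mass-to-test-mass distance is assembled from three separate measurements, independent noise contributions combine in quadrature. A minimal sketch of that combination; the per-segment noise levels below are placeholders, not actual eLISA requirements:

```python
import math

# One eLISA leg is measured as three segments:
#   test mass 1 -> spacecraft 1, spacecraft 1 -> spacecraft 2,
#   spacecraft 2 -> test mass 2.
# Uncorrelated measurement noise adds as a root-sum-square.
def leg_noise(*segment_noises_m):
    """Root-sum-square of independent per-segment noise levels (meters)."""
    return math.sqrt(sum(n * n for n in segment_noises_m))

# Placeholder noise levels, NOT actual mission requirements:
local_noise = 1e-15      # test-mass-to-spacecraft link (LPF-validated)
long_arm_noise = 1e-11   # spacecraft-to-spacecraft link over 5 million km

total = leg_noise(local_noise, long_arm_noise, local_noise)
print(f"Combined per-leg noise: {total:.2e} m")
```

With placeholder numbers like these, the long spacecraft-to-spacecraft link dominates the total, which is consistent with where the remaining development work lies.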

LPF validated the technology for making the test mass to spacecraft measurement. Significant development work remains to be done on the spacecraft-to-spacecraft laser system that must take precise measurements at very long distances (5 million km, 3.1 million miles) of the relative motion between each pair of spacecraft.

In the 7 June 2016 LISA Pathfinder briefing, LPF and ESA officials indicated that an eLISA launch date is expected in the 2029 – 2032 time frame. When it reaches its assigned position in a trailing heliocentric orbit, eLISA will be a remarkable collaborative technical achievement and a new window on our universe.

2016 Arctic Sea Ice Minimum Was Second Lowest on Record

Peter Lobner

On 15 September 2016, the National Snow and Ice Data Center (NSIDC) in Boulder, CO reported their preliminary assessment that the Arctic sea ice minimum for this year was reached on 10 September 2016.

Arctic sea ice minimum, 10 September 2016. Source: NSIDC

The minimum extent of the Arctic sea ice on 10 September 2016 was 4.14 million square kilometers (1.60 million square miles). This is the white area in the map above. The orange line on this map shows the 1981 to 2010 median extent of the Arctic sea ice for that day.

  • There were extensive areas of open water on the Northern Sea Route along the Arctic coast of Russia (the Beaufort and Chukchi seas, and in the Laptev and East Siberian seas).
  • In contrast, there was much less open water on parts of the Northwest Passage along the Arctic coast of Canada (around Banks and Victoria Islands).

The 2016 minimum tied with 2007 for the second lowest Arctic sea ice minimum on record.

The record Arctic sea ice minimum, which occurred in 2012, was 3.39 million square kilometers (1.31 million square miles). That is 750,000 square kilometers (290,000 square miles), or about 18%, less than the 2016 minimum.
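The 2012-vs-2016 comparison can be verified directly from the two extents quoted above:

```python
# Verify the comparison between the 2012 record minimum and the 2016
# minimum, using the extents quoted in the text (millions of km^2).
EXTENT_2016 = 4.14
EXTENT_2012 = 3.39

difference = EXTENT_2016 - EXTENT_2012            # 0.75 million km^2
percent_less = 100.0 * difference / EXTENT_2016   # relative to 2016
print(f"2012 was {difference:.2f} million km^2 ({percent_less:.0f}%) below 2016")
```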

You can read the NSIDC preliminary report on the 2016 Arctic sea ice minimum at the following link:

https://nsidc.org/arcticseaicenews/

An historic event in the Arctic occurred in September 2016 when the commercial cruise liner Crystal Serenity, escorted by the RRS Ernest Shackleton, made the first transit of the Northwest Passage by a cruise liner. The voyage originated in Vancouver, Canada and arrived in New York City on 16 September 2016. The timing of this Arctic cruise coincided well with this year’s minimum sea ice conditions. See my 30 August 2016 post for more details on the Crystal Serenity’s historic Arctic voyage.

Floating Wave-powered Generators Offer the Potential for Commercial-scale Energy Harvesting From the Ocean

Peter Lobner

The idea of extracting energy from wave motion in the open ocean is not a new one. This energy source is renewable and relatively persistent in comparison to wind and solar power. However, no commercial-scale wave power generator currently is in operation anywhere in the world. The primary issues hindering deployment of this technology are:

  • the complexity of harnessing wave power
  • the long-term impact of the harsh ocean environment (storms, constant pounding from the sea, corrosive effects of salt water) on the generating equipment
  • the high cost of generating electricity from wave power relative to almost all other energy sources, including wind and solar
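The resource itself is substantial. The energy flux carried by deep-water waves, per meter of wave crest, is given by the standard linear-wave-theory result P = ρ g² H² T / (64π). A sketch of that formula; the sample sea state (H, T) is an illustrative assumption, not a figure from this post:

```python
import math

# Deep-water wave energy flux per meter of wave crest:
#   P = rho * g^2 * H^2 * T / (64 * pi)
# Standard linear (Airy) wave theory result. The sample sea state
# below is an assumption for illustration.
def wave_power_kw_per_m(height_m, period_s, rho=1025.0, g=9.81):
    """Energy flux in kW per meter of crest (significant height H, period T)."""
    watts = rho * g**2 * height_m**2 * period_s / (64.0 * math.pi)
    return watts / 1000.0

p = wave_power_kw_per_m(height_m=2.0, period_s=8.0)
print(f"~{p:.0f} kW per meter of wave crest")
```

Because the flux scales with the square of wave height, moderate seas already carry tens of kilowatts per meter of crest, which is why the resource remains attractive despite the engineering obstacles listed above.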

In April 2014, Dave Levitan posted an article entitled, “Why Wave Power Has Lagged Far Behind as Energy Source,” on the Environment360 website. You can read this article at the following link:

http://e360.yale.edu/feature/why_wave_power_has_lagged_far_behind_as_energy_source/2760/

You’ll find a June 2014 presentation entitled, “Wave Energy Technology Brief,” by the International Renewable Energy Agency (IRENA) at the following link:

http://www.irena.org/documentdownloads/publications/wave-energy_v4_web.pdf

The general consensus seems to be that the wave energy industry is at about the same level of maturity as the wind and solar energy industries were about 30 years ago, in the 1980s.

Several U.S. firms offer autonomous floating devices that are capable of extracting energy from the motion of ocean waves and generating usable, persistent, renewable electric power. Two of the leaders in this field are Ocean Power Technologies, Inc. (OPT) in Pennington, NJ (with subsidiaries in the UK and Australia) and Northwest Energy Innovations, LLC (NWEI) in Portland, OR. Let’s take a look at their products.

Ocean Power Technologies, Inc. (OPT)

OPT (http://www.oceanpowertechnologies.com) is the developer of the PowerBuoy®, which is a moored ocean buoy that extracts energy from the heave (vertical motion) of ocean waves and converts this into electrical energy for marine applications (e.g., offshore oil, gas, scientific and military applications) or for distribution to onshore facilities and/or connection to an onshore electric power grid. OPT currently offers PowerBuoy® in two power output ranges: up to 350 watts and up to 15 kW.

PowerBuoy. Source: OPT

The modest output from individual PowerBuoys® can be combined via an Undersea Substation Pod into a scalable wave farm to deliver significant power output to the intended user.

OPT wave farm concept. Source: OPT

You’ll find a description of PowerBuoy® design and operation on the OPT website at the following link:

http://www.oceanpowertechnologies.com/powerbuoy/

OPT describes their PowerBuoy® as follows:

“The PowerBuoy consists of a float, spar, and heave plate as shown in the (following) schematic… The float moves up and down the spar in response to the motion of the waves. The heave plate maintains the spar in a relatively stationary position. The relative motion of the float with respect to the spar drives a mechanical system contained in the spar that converts the linear motion of the float into a rotary one. The rotary motion drives electrical generators that produce electricity for the payload or for export to nearby marine applications using a submarine electrical cable. This high performance wave energy conversion system generates power even in moderate wave environments.

The PowerBuoy’s power conversion and control systems provide continuous power for these applications under the most challenging marine conditions. The spar contains space for additional battery capacity if required to ensure power is provided to a given application even under extended no wave conditions.”

PowerBuoy diagram. Source: OPT

On the OPT website, you’ll find several technical presentations on the PowerBuoy® at the following link:

http://www.oceanpowertechnologies.com/technology/

Northwest Energy Innovations, LLC (NWEI)

NWEI (http://azurawave.com) is the developer of the Azura™ wave energy device, which is a moored ocean buoy that extracts power from both the heave (vertical motion) and surge (horizontal motion) of waves to maximize energy extraction. Electric power is generated by the relative motion of a rotating / oscillating float and the hull of the Azura™ wave energy device.

Hull-Float-Pod. Source: NWEI

You can see a short video on the operating principle of the Azura™ wave energy device at the following link:

http://azurawave.com/technology/

In 2012, the Azura prototype was fabricated and deployed at the Northwest National Marine Renewable Energy Center (NNMREC) ocean test site offshore from Newport, OR.

NNMREC site map. Source: flickr / Oregon State University

On 30 May 2015, under a Department of Energy (DOE) and U.S. Navy sponsored program, NWEI deployed the improved Azura™ prototype at the Navy’s Wave Energy Test Site at the Marine Corps Base, Kaneohe Bay, Oahu, Hawaii. The Azura prototype extends 12 feet above the surface and 50 feet below the surface. It generates up to 18 kW of electricity.

NWETS site photo. Source: NWEI

You can view a short video on the Azura being installed at the offshore site in Kaneohe Bay at the following link:

https://www.youtube.com/watch?v=LAqNOTSoNHs

In September 2016, the Azura™ prototype reached a notable milestone when it became the first wave-powered generator connected to a U.S. commercial power grid.

Conclusions

I think we can all agree that the technology for wave-generated power still is pretty immature. The cost of wave-generated power currently is very high in comparison to most alternatives, including wind and solar power. Nonetheless, there is a lot of energy in ocean waves and the energy density can be higher than wind or solar. As the technology matures, this is an industry worth watching, but you’ll have to be patient.

Improving Heavy Tractor-Trailer Aerodynamics

This 26 September 2016 post was replaced on 3 April 2020 with my updated and expanded post, “SuperTrucks – Revolutionizing the Heavy Tractor-Trailer Freight Industry with Science,” which is available at the following link:  https://lynceans.org/all-posts/supertrucks-revolutionizing-the-heavy-tractor-trailer-freight-industry-with-science/

I hope you’ll find the new post to be informative, useful and different from any other single source on the subject.

Best regards,

Peter Lobner

3 April 2020