Large Autonomous Vessels will Revolutionize the U.S. Navy

Peter Lobner

In this post, I will describe two large autonomous vessels that are likely to revolutionize the way the U.S. Navy operates. The first is the Sea Hunter, originally sponsored by the Defense Advanced Research Projects Agency (DARPA), and the second is Echo Voyager, developed by Boeing.

DARPA Anti-submarine warfare (ASW) Continuous Trail Unmanned Vessel (ACTUV)

ACTUV concept. Source: DARPA

DARPA explains that the program is structured around three primary goals:

  • Demonstrate the performance potential of a surface platform conceived originally as an unmanned vessel.
    • This new design paradigm reduces constraints on conventional naval architecture elements such as layout, accessibility, crew support systems, and reserve buoyancy.
    • The objective is to produce a vessel design that exceeds state-of-the art manned vessel performance for the specified mission at a fraction of the vessel size and cost.
  •  Advance the technology for unmanned maritime system autonomous operation.
    • Enable independently deploying vessels to conduct missions spanning thousands of kilometers of range and months of duration under a sparse remote supervisory control model.
    • This includes autonomous compliance with maritime laws and conventions for safe navigation, autonomous system management for operational reliability, and autonomous interactions with an intelligent adversary.
  • Demonstrate the capability of an ACTUV vessel to use its unique sensor suite to achieve robust, continuous track of the quietest submarine targets over their entire operating envelope.

While DARPA states that the ACTUV vessel is intended to detect and trail the quiet diesel-electric submarines, including air-independent propulsion (AIP) submarines, that are rapidly proliferating among the world’s navies, that detect-and-track capability also should be effective against quiet nuclear submarines. The ACTUV vessel also will have capabilities to conduct counter-mine missions.

The ACTUV program is consistent with the Department of Defense (DoD) “Third Offset Strategy,” which is intended to maintain U.S. military technical supremacy over the next 20 years in the face of increasing challenges from Russia and China. An “offset strategy” identifies particular technical breakthroughs that can give the U.S. an edge over potential adversaries. In the “Third Offset Strategy”, the priority technologies include:

  • Robotics and autonomous systems: capable of assessing situations and making decisions on their own, without constant human monitoring
  • Miniaturization: enabled by taking the human being out of the weapons system
  • Big data: data fusion, with advanced, automated filtering / processing before human involvement is required.
  • Advanced manufacturing: including composite materials and additive manufacturing (3-D printing) to enable faster design / build processes and to reduce traditionally long supply chains.

You can read more about the “Third Offset Strategy” at the following link:

http://breakingdefense.com/2014/11/hagel-launches-offset-strategy-lists-key-technologies/

You also may wish to read my 19 March 2016 post on Arthur C. Clarke’s short story “Superiority.” You can decide for yourself if it relates to the “Third Offset Strategy.”

Leidos (formerly SAIC) is the prime contractor for the ACTUV technology demonstrator vessel, Sea Hunter. In August 2012, Leidos was awarded a contract valued at about $58 million to design, build, and operationally test the vessel.

In 2014, Leidos used a 32-foot (9.8 meter) surrogate vessel to demonstrate the prototype maritime autonomy system designed to control all maneuvering and mission functions of an ACTUV vessel. The first voyage of 35 nautical miles (64.8 km) was conducted in February 2014. A total of 42 days of at-sea demonstrations were conducted to validate the autonomy system.

Sea Hunter is an unarmed, 145-ton full load displacement, diesel-powered, twin-screw, 132 foot (40 meter) long trimaran that is designed to operate in a wide range of sea conditions. It is designed to be operational up to Sea State 5 [moderate waves to 6.6 feet (2 meters) height, winds 17 – 21 knots] and to be survivable in Sea State 7 [rough weather with heavy waves up to 20 feet (6 meters) height]. The vessel is expected to have a range of about 3,850 miles (6,200 km) without maintenance or refueling and to be able to deploy on missions lasting 60 – 90 days.

Sea Hunter side view. Source: DARPA

Raytheon’s Modular Scalable Sonar System (MS3) was selected as the primary search and detection sonar for Sea Hunter. MS3 is a medium frequency sonar that is capable of active and passive search, torpedo detection and alert, and small object avoidance. In the case of Sea Hunter, the sonar array is mounted in a bulbous housing at the end of a fin that extends from the bottom of the hull, looking a bit like a modern, high-performance sailboat’s keel.

Sea Hunter will include sensor technologies to facilitate the correct identification of surface ships and other objects on the sea surface. See my 8 March 2015 post on the use of inverse synthetic aperture radar (ISAR) in such maritime surveillance applications.

During a mission, an ACTUV vessel will not be limited by its own sensor suite. The ACTUV vessel will be linked via satellite to the Navy’s worldwide data network, enabling it to be in constant contact with other resources (i.e., other ships, aircraft, and land bases) and to share data.

Sea Hunter was built at the Vigor Shipyard in Portland, Oregon. Construction price of the Sea Hunter is expected to be in the range from $22 to $23 million. The target price for subsequent vessels is $20 million.

You can view a DARPA time-lapse video of the construction and launch of Sea Hunter at the following link:

http://www.darpa.mil/attachments/ACTUVTimelapseandWalkthrough.mp4

Sea Hunter launch 1. Source: DARPA

Sea Hunter launch 2. Source: DARPA

In the above photo, you can see on the bottom of the composite hull, just forward of the propeller shafts, what appears to be a hatch. I’m just speculating, but this may be the location of a retractable sonar housing, which is shown in the first and second pictures, above.

You can get another perspective of the launch and the subsequent preliminary underway trials in the Puget Sound in the DARPA video at the following link:

http://www.darpa.mil/attachments/ACTUVTimelapseandWalkthrough.mp4

During the speed run, Sea Hunter reached a top speed of 27 knots. Following the preliminary trials, Sea Hunter was christened on 7 April 2016. Now the vessel starts an operational test phase to be conducted jointly by DARPA and the Office of Naval Research (ONR). This phase is expected to run through September 2018.

DARPA reported that it expects an ACTUV vessel to cost about $15,000 – $20,000 per day to operate. In contrast, a manned destroyer costs about $700,000 per day to operate.

The autonomous ship "Sea Hunter," developed by DARPA, is shown docked in Portland, Oregon before its christening ceremony. Source: DARPA

You can find more information on the ACTUV program on the DARPA website at the following link:

http://www.darpa.mil/news-events/2016-04-07

If ACTUV is successful in demonstrating the expected search and track capabilities against quiet submarines, it will become the bane of submarine commanders anywhere in the world. Imagine the frustration of a submarine commander who is unable to break the trail of an ACTUV vessel during peacetime. During a period of conflict, an ACTUV vessel may quickly become a target for the submarine being trailed. The Navy’s future conduct of operations may depend on having lots of ACTUV vessels.

28 July 2016 update: Sea Hunter ACTUV performance testing

On 1 May 2016, Sea Hunter arrived by barge in San Diego and then started initial performance trials in local waters.

ACTUV in San Diego Bay. Source: U.S. Navy

You can see a video of Sea Hunter in San Diego Bay at the following link:

https://news.usni.org/2016/05/04/video-navys-unmanned-sea-hunter-arrives-in-san-diego

On 26 July 2016, Leidos reported that it had completed initial performance trials in San Diego and that the ship met or surpassed all performance objectives for speed, maneuverability, stability, seakeeping, acceleration, deceleration and fuel consumption. These tests were the first milestone in the two-year test schedule.

Leidos indicated that upcoming tests will exercise the ship’s sensors and autonomy suite with the goals of demonstrating maritime collision regulations compliance capability and proof-of-concept for different Navy missions.

4 October 2018 update:  DARPA ACTUV program completed.  Sea Hunter testing and development is being continued by the Office of Naval Research

In January 2018, DARPA completed the ACTUV program and the Sea Hunter was transferred to the Office of Naval Research (ONR), which is continuing to operate the technology demonstration vessel under its Medium Displacement Unmanned Surface Vehicle (MDUSV) program.  You can read more about the transition of the DARPA program to ONR here:
 
 
It appears that ONR is less interested in the original ACTUV mission and more interested in a general-purpose “autonomous truck” that can be configured for a variety of missions while using the basic autonomy suite demonstrated on Sea Hunter.  In December 2017, ONR awarded Leidos a contract to build the hull structure for a second autonomous vessel that is expected to be an evolutionary development of the original Sea Hunter design.  You can read more about this ONR contract award here:
 

Echo Voyager Unmanned Underwater Vehicle (UUV)

Echo Voyager - front quarter view. Source: Boeing

Echo Voyager - top open. Source: Boeing

Echo Voyager is the third in a family of UUVs developed by Boeing’s Phantom Works. The first two are:

  • Echo Ranger (circa 2002): 18 feet (5.5 meters) long, 5 tons displacement; maximum depth 10,000 feet; maximum mission duration about 28 hours
  • Echo Seeker (circa 2015): 32 feet (9.8 meter) long; maximum depth 20,000 feet; maximum mission duration about 3 days

Both Echo Ranger and Echo Seeker are battery powered and require a supporting surface vessel for launch and recovery at sea and for recharging the batteries. They successfully have demonstrated the ability to conduct a variety of autonomous underwater operations and to navigate safely around obstacles.

Echo Voyager, unveiled by Boeing in Huntington Beach, CA on 10 March 2016, is a much different UUV. It is designed to deploy from a pier, autonomously conduct long-duration, long-distance missions and return by itself to its departure point or some other designated destination. Development of Echo Voyager was self-funded by Boeing.

Echo Voyager is a 50-ton displacement, 51 foot (15.5 meters) long UUV that is capable of diving to a depth of 11,000 feet (3,352 meters). It has a range of about 6,500 nautical miles (12,038 km), and is expected to be capable of autonomous operations for three months or more. The vessel is designed to accommodate various “payload sections” that can extend the length of the vessel up to a maximum of 81 feet (24.7 meters).

You can view a Boeing video on the Echo Voyager at the following link:

https://www.youtube.com/watch?v=L9vPxC-qucw

The propulsion system is a hybrid diesel-electric rechargeable system. Batteries power the main electric motor, enabling a maximum speed of about 8 knots. Electrically powered auxiliary thrusters can be used to precisely position the vessel at slow speed. When the batteries require recharging, Echo Voyager will rise toward the surface, extend a folding mast as shown in the following pictures, and operate the diesel engine with the mast serving as a snorkel. The mast also contains sensors and antennae for communications and satellite navigation.

Echo Voyager - mast extending. Source: screenshot from Boeing video at link above

Echo Voyager - snorkeling. Source: screenshot from Boeing video at link above

The following image, also from the Boeing video, shows deployment of a payload onto the seabed.

Echo Voyager - emplacing a payload on the seabed. Source: screenshot from Boeing video at link above

Initial sea trials off the California coast were conducted in mid-2016.

Boeing currently does not have a military customer for Echo Voyager, but foresees the following missions as being well-suited for this type of UUV:

  • Surface and subsurface intelligence, surveillance, and reconnaissance (ISR)
  • ASW search and barrier patrol
  • Submarine decoy
  • Critical infrastructure protection
  • Mine countermeasures
  • Weapons platform

Boeing also expects civilian applications for Echo Voyager in offshore oil and gas, marine engineering, hydrography and other scientific research.

4 October 2018 update:  Progress in Echo Voyager development

Echo Voyager is based at a Boeing facility in Huntington Beach, CA.  In June 2018, Boeing reported that Echo Voyager had returned to sea for a second round of testing.  You can read more on Echo Voyager current status and the Navy’s plans for future large UUVs here:

http://www.latimes.com/business/la-fi-boeing-echo-voyager-20180623-story.html

Echo Voyager operating near the surface with mast extended. Source: Boeing

The Invisible Man may be Blind!

Peter Lobner

Metamaterials are a class of materials engineered to have properties that don’t occur naturally.

The first working demonstration of an “invisibility cloak” was achieved in 2006 at the Duke University Pratt School of Engineering using the complex metamaterial-based cloak shown below.

Duke 2006 metamaterial cloak. Source: screenshot from YouTube link below.

The cloak deflected an incoming microwave beam around an object and reconstituted the wave fronts on the downstream side of the cloak with little distortion. To a downstream observer, the object inside the cloak would be hidden.

Effect of Duke metamaterial cloak. Source: screenshot from YouTube link below.

You can view a video of this Duke invisibility cloak at the following link:

https://www.youtube.com/watch?v=Ja_fuZyHDuk

In a paper published in the 18 September 2015 issue of Science, researchers at UC Berkeley reported creating an ultra-thin, metamaterial-based optical cloak that was successful in concealing a small-scale, three-dimensional object. The abstract of this paper, “An ultrathin invisibility skin cloak for visible light,” by Ni et al., is reproduced below.

“Metamaterial-based optical cloaks have thus far used volumetric distribution of the material properties to gradually bend light and thereby obscure the cloaked region. Hence, they are bulky and hard to scale up and, more critically, typical carpet cloaks introduce unnecessary phase shifts in the reflected light, making the cloaks detectable. Here, we demonstrate experimentally an ultrathin invisibility skin cloak wrapped over an object. This skin cloak conceals a three-dimensional arbitrarily shaped object by complete restoration of the phase of the reflected light at 730-nanometer wavelength. The skin cloak comprises a metasurface with distributed phase shifts rerouting light and rendering the object invisible. In contrast to bulky cloaks with volumetric index variation, our device is only 80 nanometer (about one-ninth of the wavelength) thick and potentially scalable for hiding macroscopic objects.”

If you have a subscription to Science, you can read the full paper at the following link:

http://science.sciencemag.org/content/349/6254/1310

Eric Grundhauser writes on the Atlas Obscura website about an interesting quandary for users of an optical invisibility cloak.

“Since your vision is based on the light rays that enter your eyes, if all of these rays were diverted around someone under an invisibility cloak, the effect would be like being covered in a thick blanket. Total darkness.”

So, the Invisible Man is likely to be less of a threat than he appeared in the movies. You should be able to locate him as he stumbles around a room, bumping into everything he can’t see at visible light frequencies. However, he may be able to navigate and sense his adversary at other electromagnetic and/or audio frequencies that are less affected by his particular invisibility cloak.

You can read Eric Grundhauser’s complete article, “The Problem With Invisibility is Blindness,” at the following link:

http://www.atlasobscura.com/articles/the-problem-with-invisibility-is-the-blindness?utm_source=howtogeek&utm_medium=email&utm_campaign=newsletter

Recognizing this inconvenient aspect of an invisibility cloak, researchers from Yunnan University, China, have been investigating the concept of a “reciprocal cloak,” which they describe as, “an intriguing metamaterial device, in which a hidden antenna or a sensor can receive electromagnetic radiation from the outside but its presence will not be detected.” One approach is called an “open cloak,” which includes a means to, “open a window on the surface of a cloak, so that exchanging information and matter with the outside can be achieved.”

You can read the complete 2011 paper, “Electromagnetic Reciprocal Cloak with Only Axial Material Parameter Spatially Variant,” by Yang et al., at the following link:

http://www.hindawi.com/journals/ijap/2012/153086/

An all-aspect, broadband (wide range of operational frequencies) invisibility cloak is likely to remain in the realm of fantasy and science fiction. A 10 March 2016 article entitled, “Invisibility cloaks can never hide objects from all observers,” by Lisa Zyga, explains:

“….limitations imposed by special relativity mean that the best invisibility cloaks would only be able to render objects partially transparent because they would suffer from obvious visible distortions due to motion. The result would be less Harry Potter and more like the translucent creatures in the 1987 movie Predator.”

You can read the complete article at the following link:

http://phys.org/news/2016-03-invisibility-cloaks.html

Further complications are encountered when applying an invisibility cloak to a very high-speed vessel. A 28 January 2016 article, also by Lisa Zyga, explains:

“When the cloak is moving at high speeds with respect to an observer, relativistic effects shift the frequency of the light arriving at the cloak so that the light is no longer at the operational frequency. In addition, the light emerging from the cloak undergoes a change in direction that produces a further frequency shift, causing further image distortions for a stationary observer watching the cloak zoom by.”

You can read the complete article, “Fast-moving invisibility cloaks become visible,” at the following link:

http://phys.org/news/2016-01-fast-moving-invisibility-cloaks-visible.html

So, there you have it! The Invisible Man may be blind, the Predator’s cloak seems credible even when he’s moving, and a really fast-moving cloaked Klingon battlecruiser is vulnerable to detection.

Internet Archive: a Great Access Point to Many Web Resources and Vintage Science Fiction

Peter Lobner

Internet Archive is a non-profit library of millions of free books, audio books, movies, music, software and more, which you can access at the following link:

https://archive.org

It’s hard to navigate this site to find out what’s there. The home page presents icons for the “Top Collections in the Archive,” but you have to scroll through many pages to view hundreds of these icons, each of which links to a corresponding collection. Interesting collections I found include:

  • American Libraries
  • The Library of Congress
  • The LibriVox Free Audiobook Collection
  • Software Library: MS-DOS Games
  • Computer Magazine Archives
  • Television Archive
  • Grateful Dead
  • Metropolitan Museum of Art Gallery Images
  • Khan Academy

Archive icons

There’s a Pulp Magazine Archive at the following link:

https://archive.org/details/pulpmagazinearchive

Once there, select Topic: “science fiction”, or use the following direct link:

https://archive.org/details/pulpmagazinearchive?and%5B%5D=subject%3A%22science+fiction%22&sort=-downloads&page=2

Then you’re on your way to libraries of vintage science fiction.  Below are results from my own searches.

Galaxy Science Fiction:

Galaxy Science Fiction was an American digest-size science fiction magazine published from 1950 to 1980. It was founded by an Italian company, World Editions, to help it break into the American market. World Editions hired as editor H. L. Gold, who rapidly made Galaxy the leading science fiction magazine of its time, focusing on stories about social issues rather than technology.

The Galaxy Science Fiction archive, with 361 results, is located at the following link:

https://archive.org/details/galaxymagazine

Galaxy SF archive pic

If:

If was an American science fiction magazine launched in March 1952 by Quinn Publications. The magazine was moderately successful, though it was never regarded as one of the first rank of science fiction magazines. It achieved its greatest success under editor Frederik Pohl, winning the Hugo Award three years running from 1966 to 1968. If was merged into Galaxy Science Fiction after the December 1974 issue, its 175th issue overall.

The If science fiction archive, with 176 results, is located at the following link:

https://archive.org/details/ifmagazine

If SF archive pic

Amazing Stories

Amazing Stories was an American science fiction magazine launched in April 1926 as the first magazine devoted solely to science fiction. Amazing Stories was published, with some interruptions, for almost 80 years. Although Amazing Stories was not considered an influential magazine in the genre, it was nominated for the Hugo Award three times in the 1970s. It ceased publication in 2005.

The Amazing Stories archive, with 160 results, is located at the following link:

https://archive.org/details/pulpmagazinearchive?and%5B%5D=amazing+stories&sort=-downloads&page=2

Amazing SF archive pic

The Skylark of Space is one of the earliest novels of interstellar travel and is considered a classic of pulp science fiction. Originally serialized in 1928, it is available as a 9-hour audiobook at the following link:

https://archive.org/details/skylark_space_2_1012_librivox

Skylark of Space

Good luck navigating the Internet Archive website. I hope you find some interesting things.

Is Arthur C. Clarke’s 1953 Short Story “Superiority” a Parable for Today?

Peter Lobner

Sir Arthur Charles Clarke was a British science fiction writer, science writer and futurist who became recognized worldwide for his great many short stories and novels, which have captivated readers since the early 1950s. You might know him best as the author of “Childhood’s End” and “2001: A Space Odyssey.”

Sir Arthur C. Clarke. Source: http://amazingstoriesmag.com

In the short story “Superiority,” which was published in his 1953 story collection, Expedition to Earth, Clarke describes a spacefaring federation of planets involved in a protracted war with a distant adversary, with both sides using comparable weaponry. The allure of advanced weaponry and “a revolution in warfare” led one side to allocate their resources away from traditional weaponry and invest instead in fewer vessels with advanced weapons systems that were sure to turn the tide of the war: the Sphere of Annihilation, the Battle Analyzer, and the Exponential Field.

As you might guess, the outcome was somewhat different, because:

  • The new systems were “almost perfected in the laboratory”
  • There were unforeseen complications and delays during development of the operational systems
  • There were unforeseen support and training requirements that compromised the operational use of the new systems and introduced new vulnerabilities
  • The new systems failed to deliver the expected “force multiplier” effect
  • There were unforeseen consequences from the operational use of some new weaponry

The adversary won the war with a numerically superior fleet using obsolete weapons based on inferior science.

Take time now to read this short story at the following link:

http://www.mayofamily.com/RLM/txt_Clarke_Superiority.html

Bill Sweetman has written an interesting commentary on Arthur C. Clarke’s “Superiority” in the 14 March 2016 issue of Aviation Week and Space Technology. His commentary, entitled “Timeless Insight Into Why Military Programs Go Wrong – The history of defense program failures was foretold in 1953,” finds stunning parallels between the story line in “Superiority” and the history of many real-world defense programs from WW II to the present day. You can read Bill Sweetman’s commentary at the following link:

http://aviationweek.com/defense/opinion-timeless-insight-why-military-programs-go-wrong

Considering SAIC’s long-term, significant role in supporting many U.S. advanced war-fighting and intelligence system programs, many of us were the real-world analogs of the thousands of scientists, engineers, and managers working for Professor-General Norden, the Chief of the Research Staff, in “Superiority.” In Bill Sweetman’s commentary, he asks, “Is ‘Superiority’ a parable?” Based on your own experience at SAIC and elsewhere in the military – industrial complex, what do you think?

If you still haven’t read “Superiority,” please do it now. It’s worth your time.

Science is not Driving the Climate Change Debate

Peter Lobner

Thanks to Paul Fleming for sending me a thought-provoking, well-documented paper entitled “Global Warming and the Irrelevance of Science,” which was posted online on 17 February 2016 by Richard S. Lindzen, Alfred P. Sloan Professor of Atmospheric Sciences (Emeritus), Massachusetts Institute of Technology. This paper is the text of a lecture delivered on 20 August 2015 to the 48th Session of the Erice International Seminars on Planetary Emergencies.

The basic premise of this paper is that, in many fields such as climate research, governments have a monopoly on the support of scientific research, and, through government-funded research contracts, influence the outcome of the very research being funded.

Lindzen starts his paper by observing that,

“Unfortunately, as anticipated by Eisenhower in his farewell speech from January 17, 1961 (the one that also warned of the military-industrial complex), ‘Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity.’

Rather, the powers that be invent the narrative independently of the views of even cooperating scientists. It is, in this sense, that the science becomes irrelevant.”

Lindzen uses the term “iron triangle” to describe this closed-loop vicious cycle:

  • Vertex 1: Scientists perform research and make meaningless or ambiguous statements about the research (IPCC WG1)
  • Vertex 2: Advocates and media ‘translate’ these statements into alarmist declarations [IPCC WG2 (impacts) & WG3 (mitigation), some politicians]
  • Vertex 3: Politicians respond to alarm by feeding more money to the scientists in the first vertex

The net result is poor environmental decision-making that is not supportable by credible climate science. On this matter, Lindzen notes:

“The situation may have been best summarized by Mike Hulme, director of the Tyndall Centre at the University of East Anglia (a center of concern for global warming): “To state that climate change will be ‘catastrophic’ hides a cascade of value-laden assumptions, which do not emerge from empirical or theoretical science.”

Lindzen characterized the following three different narratives related to the global warming debate:

  • Narrative 1 – IPCC WG1:
    • Broadly supportive of the proposition that increasing greenhouse gas concentrations are a serious concern
    • Relatively open about the uncertainties and even contradictions associated with this position
    • Public pronouncements tend to be vague with ample room for denial, carefully avoiding catastrophist hyperbole while also avoiding outright rejection of such hyperbole
  • Narrative 2 – Skeptics:
    • Regard the fact that virtually all models ‘run hot’ (i.e., their projections for the most part greatly exceed observed warming) as strongly supporting the case for low climate sensitivity
    • Generally believe in testing the physics underlying the positive feedbacks in sensitive models rather than averaging models
    • Much more open to the numerous known causes of climate change (including long period ocean circulations, solar variability, impacts of ice, etc.), and do not regard CO2 as the climate’s ultimate ‘control knob’
    • Openly oppose catastrophism
  • Narrative 3 – Political promoters of climate alarm (including IPCC WG2 & WG3, many environmental NGOs and mass media)
    • Emphasize alleged consequences of the worst case scenarios presented by WG1
    • Claim virtually unanimous support
    • It is this narrative for which the science is largely irrelevant.

Lindzen notes that, “Unfortunately, for most people, the third narrative is all they will see.“

You can read Richard S. Lindzen’s complete paper at the following link:

http://euanmearns.com/global-warming-and-the-irrelevance-of-science/

Thanks also to Mike Spaeth for sending me the following link to an informative document entitled, “A Primer on Carbon Dioxide and Climate,” prepared by a recently formed organization known as the CO2 Coalition.

http://co2coalition.org/primer-carbon-dioxide-climate/

The CO2 Coalition, formed in 2015, represents itself as, “a new and independent, non-profit organization that seeks to engage thought leaders, policy makers, and the public in an informed, dispassionate discussion of how our planet will be affected by CO2 released from the combustion of fossil fuel.” Hopefully, they can help make some headway with the mass media, general public, and politicians that currently are entrenched in Narrative 3. Even cartoonists know that this will be an uphill battle.

Research & critical thinking

Source: http://www.gocomics.com/nonsequitur/2016/02/16

Simulating Extreme Spacetimes

Peter Lobner

Thanks to Dave Groce for sending me the following link to the Caltech-Cornell Numerical Relativity collaboration, Simulating eXtreme Spacetimes (SXS):

http://www.black-holes.org

Caltech SXS. Source: SXS

From the actual website (not the image above), click on the yellow “Admit One” ticket and you’re on your way.

Under the “Movies” tab, you’ll find many video simulations that help visualize a range of interactions between two black holes and between a black hole and a neutron star. Following is a direct link:

http://www.black-holes.org/explore/movies

A movie visualizing GW150914, the first ever gravitational wave detection on 14 September 2015, is at the following SXS link:

https://www.black-holes.org/gw150914

At the above link, you also can listen to the sound of the GW150914 “in-spiral” event (two black holes spiraling in on each other).  You can read more about the detection of GW150914 in my 11 February 2016 post.

On the “Sounds” tab on the SXS website, you’ll find that different types of major cosmic events are expected to emit gravitational waves with waveforms that will help characterize the original event. You can listen to the expected sounds from a variety of extreme cosmic events at the following SXS link:

http://www.black-holes.org/explore/sounds

Have fun exploring SXS.

Synthetic Aperture Radar (SAR) and Inverse SAR (ISAR) Enable an Amazing Range of Remote Sensing Applications

Peter Lobner

SAR Basics

Synthetic Aperture Radar (SAR) is an imaging radar that operates at microwave frequencies and can “see” through clouds, smoke and foliage to reveal detailed images of the surface below in all weather conditions. Below is a SAR image superimposed on an optical image with clouds, showing how a SAR image can reveal surface details that cannot be seen in the optical image.

Example SAR image. Source: Cassidian radar, Eurimage optical

SAR systems usually are carried on airborne or space-based platforms, including manned aircraft, drones, and military and civilian satellites. Doppler shifts from the motion of the radar relative to the ground are used to electronically synthesize a longer antenna, where the synthetic length (L) of the aperture is equal to: L = v x t, where “v” is the relative velocity of the platform and “t” is the time period of observation. Depending on the altitude of the platform, “L” can be quite long. The time-multiplexed return signals from the radar antenna are electronically recombined to produce the desired images in real-time or post-processed later.
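As a rough numerical illustration of the relationship above (a simplified sketch in Python with assumed, hypothetical platform parameters, not the specifications of any particular radar), the synthetic aperture length and the corresponding azimuth resolution can be estimated as follows:

    # Simplified SAR geometry sketch -- illustrative numbers only
    wavelength = 0.03        # meters (X-band, ~10 GHz)
    velocity = 200.0         # platform ground speed, m/s
    obs_time = 2.0           # observation (integration) time, s
    slant_range = 50_000.0   # range to the imaged area, m

    # Synthetic aperture length: L = v x t
    L = velocity * obs_time

    # Classic azimuth resolution estimate for a synthetic aperture:
    # delta_az ~ (wavelength x range) / (2 x L)
    delta_az = wavelength * slant_range / (2 * L)

    print(f"Synthetic aperture length: {L:.0f} m")
    print(f"Approximate azimuth resolution: {delta_az:.1f} m")

With these assumed numbers, a 400 meter synthetic aperture yields azimuth resolution on the order of 2 meters, far finer than the beam footprint of a physical antenna of practical size at the same range.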

SAR principle

Source: Christian Wolff, http://www.radartutorial.eu/20.airborne/pic/sar_principle.print.png

This principle of SAR operation was first identified in 1951 by Carl Wiley and patented in 1954 as “Simultaneous Buildup Doppler.”

SAR Applications

There are many SAR applications, so I’ll just highlight a few.

Boeing E-8 JSTARS: The Joint Surveillance Target Attack Radar System is an airborne battle management, command and control, intelligence, surveillance and reconnaissance platform, the prototypes of which were first deployed by the U.S. Air Force during the 1991 Gulf War (Operation Desert Storm). The E-8 platform is a modified Boeing 707 with a 27 foot (8 meter) long, canoe-shaped radome under the forward fuselage that houses a 24 foot (7.3 meters) long, side-looking, multi-mode, phased array antenna that includes a SAR mode of operation. The USAF reports that this radar has a field of view of up to 120-degrees, covering nearly 19,305 square miles (50,000 square kilometers).

E-8 JSTARS. Source: USAF

Lockheed SR-71: This Mach 3 high-altitude reconnaissance jet carried the Advanced Synthetic Aperture Radar System (ASARS-1) in its nose. ASARS-1 had a claimed 1 inch resolution in spot mode at a range of 25 to 85 nautical miles either side of the flight path.  This SAR also could map 20 to 100 nautical mile swaths on either side of the aircraft with lesser resolution.

SR-71. Source: http://www.wvi.com/~sr71webmaster/sr_sensors_pg2.htm

Northrop RQ-4 Global Hawk: This is a large, multi-purpose, unmanned aerial vehicle (UAV) that can simultaneously carry out electro-optical, infrared, and synthetic aperture radar surveillance as well as high and low band signal intelligence gathering.

Global Hawk. Source: USAF

Below is a representative RQ-4 2-D SAR image that has been highlighted to show passable and impassable roads after severe hurricane damage in Haiti. This is an example of how SAR data can be used to support emergency management.

Global Hawk Haiti post-hurricane image. Source: USAF

NASA Space Shuttle: The Shuttle Radar Topography Mission (SRTM) used the Space-borne Imaging Radar (SIR-C) and X-Band Synthetic Aperture Radar (X-SAR) to map 140 mile (225 kilometer) wide swaths, imaging most of Earth’s land surface between 60 degrees north and 56 degrees south latitude. Radar antennae were mounted in the Space Shuttle’s cargo bay, and at the end of a deployable 60 meter mast that formed a long-baseline interferometer. The interferometric SAR data was used to generate very accurate 3-D surface profile maps of the terrain.

Shuttle SRTM. Source: NASA / Jet Propulsion Laboratory

An example of SRTM image quality is shown in the following X-SAR false-color digital elevation map of Mt. Cotopaxi in Ecuador.

Shuttle SRTM image. Source: NASA / Jet Propulsion Laboratory

You can find more information on SRTM at the following link:

https://directory.eoportal.org/web/eoportal/satellite-missions/s/srtm

ESA’s Sentinel satellites: Refer to my 4 May 2015 post, “What Satellite Data Tell Us About the Earthquake in Nepal,” for information on how the European Space Agency (ESA) assisted earthquake response by rapidly generating a post-earthquake 3-D ground displacement map of Nepal using SAR data from multiple orbits (i.e., pre- and post-earthquake) of the Sentinel-1A satellite.  You can find more information on the ESA Sentinel SAR platform at the following link:

http://www.esa.int/Our_Activities/Observing_the_Earth/Copernicus/Sentinel-1/Introducing_Sentinel-1

You will find more general information on space-based SAR remote sensing applications, including many high-resolution images, in a 2013 European Space Agency (ESA) presentation, “Synthetic Aperture Radar (SAR): Principles and Applications”, by Alberto Moreira, at the following link:

https://earth.esa.int/documents/10174/642943/6-LTC2013-SAR-Moreira.pdf

ISAR Basics

ISAR technology uses the relative movement of the target, rather than the movement of the emitter, to create the synthetic aperture. The ISAR antenna can be mounted on an airborne platform. Alternatively, ISAR also can be used by one or more ground-based antennae to generate a 2-D or 3-D radar image of an object moving within the field of view.

ISAR Applications

Maritime surveillance: Maritime surveillance aircraft commonly use ISAR systems to detect, image and classify surface ships and other objects in all weather conditions. Because the sea, hull, superstructure, and masts have different radar reflection characteristics as a vessel moves on the surface, vessels usually stand out clearly in ISAR images. There can be enough radar information derived from ship motion, including pitching and rolling, to allow the ISAR operator to manually or automatically determine the type of vessel being observed. The U.S. Navy’s new P-8 Poseidon patrol aircraft carry the AN/APY-10 multi-mode radar system that includes both SAR and ISAR modes of operation.

The principles behind ship classification are described in detail in the 1993 MIT paper, “An Automatic Ship Classification System for ISAR Imagery,” by M. Menon, E. Boudreau and P. Kolodzy, which you can download at the following link:

https://www.ll.mit.edu/publications/journal/pdf/vol06_no2/6.2.4.shipclassification.pdf

You can see in the following example ISAR image of a vessel at sea that vessel classification may not be obvious to the casual observer. I can see why an automated vessel classification system would be very useful.

Ship ISAR image

Source: Blanco-del-Campo, A. et al., http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=5595482&url=http%3A%2F%2Fieeexplore.ieee.org%2Fiel5%2F7361%2F5638351%2F05595482.pdf%3Farnumber%3D5595482

Imaging Objects in Space: Another ISAR (also called “delayed Doppler”) application is the use of one or more large radio telescopes to generate radar images of objects in space at very long ranges. The process for accomplishing this was described in a 1960 MIT Lincoln Laboratory paper, “Signal Processing for Radar Astronomy,” by R. Price and P.E. Green.

Currently, there are two powerful ground-based radars in the world capable of investigating solar system objects: the National Aeronautics and Space Administration (NASA) Goldstone Solar System Radar (GSSR) in California and the National Science Foundation (NSF) Arecibo Observatory in Puerto Rico. News releases on China’s new FAST radio telescope have not revealed if it also will be able to operate as a planetary radar (see my 18 February 2016 post).

The 230 foot (70 meter) GSSR has an 8.6 GHz (X-band) radar transmitter powered by two 250 kW klystrons. You can find details on GSSR and the techniques used for imaging space objects in the article, “Goldstone Solar System Radar Observatory: Earth-Based Planetary Mission Support and Unique Science Results,” which you can download at the following link:

http://echo.jpl.nasa.gov/asteroids/Slade_Benner_Silva_IEEE_Proceedings.pdf
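To give a feel for why the delayed Doppler technique works, the following sketch (Python, with an assumed, hypothetical asteroid; this is not data from an actual GSSR observation) estimates the limb-to-limb Doppler spread of a rotating asteroid observed at GSSR’s 8.6 GHz transmit frequency. Resolving this small frequency spread, together with the echo time delay, is what allows a planetary radar to build a two-dimensional image of the target.

    import math

    # Delay-Doppler sketch -- assumed asteroid parameters, not real data
    c = 299_792_458.0        # speed of light, m/s
    f_tx = 8.6e9             # GSSR transmit frequency, Hz
    wavelength = c / f_tx    # ~3.5 cm

    diameter = 1_000.0       # assumed asteroid diameter, m
    period = 6 * 3600.0      # assumed rotation period, s

    # Line-of-sight speed of the approaching and receding limbs (equator-on view)
    v_limb = math.pi * diameter / period

    # Two-way radar Doppler shift of each limb: f = 2v / wavelength
    f_limb = 2 * v_limb / wavelength

    # Limb-to-limb Doppler bandwidth across the target
    bandwidth = 2 * f_limb
    print(f"Limb-to-limb Doppler bandwidth: {bandwidth:.1f} Hz")

For this hypothetical 1-km object rotating once every six hours, the entire echo is spread over only about 17 Hz on an 8.6 GHz carrier, which is why planetary radar imaging demands extremely stable transmitters and long coherent integration times.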

The 1,000 foot (305 meter) Arecibo Observatory has a 2.38 GHz (S-band) radar transmitter, originally rated at 420 kW when it was installed in 1974, and upgraded in 1997 to 1 MW along with other significant upgrades to improve radio telescope and planetary radar performance. You will find details on the design and upgrades of Arecibo at the following link:

http://www.astro.wisc.edu/~sstanimi/Students/daltschuler_2.pdf

The following examples demonstrate the capabilities of Arecibo Observatory to image small bodies in the solar system.

  • In 1999, this radar imaged the Near-Earth Asteroid 1999 JM 8 at a distance of about 5.6 million miles (9 million km) from Earth. The ISAR images of this 1.9 mile (3 km) sized object had a resolution of about 49 feet (15 meters).
  • In November 1999, Arecibo Observatory imaged the tumbling Main-Belt Asteroid 216 Kleopatra. The resulting ISAR images, which made the cover of Science magazine, showed a dumbbell-shaped object with an approximate length of 134.8 miles (217 kilometers) and varying diameters up to 58.4 miles (94 kilometers).

Asteroid image. Source: Science

More details on the use of Arecibo Observatory to image planets and other bodies in the solar system can be found at the following link:

http://www.naic.edu/general/index.php?option=com_content&view=article&id=139&Itemid=474

The NASA / Jet Propulsion Laboratory Asteroid Radar Research website also contains information on the use of radar to map asteroids and includes many examples of asteroid radar images. Access this website at the following link:

http://echo.jpl.nasa.gov

Miniaturization

In recent years, SAR units have become smaller and more capable as hardware is miniaturized and better integrated. For example, Utah-based Barnard Microsystems offers a miniature SAR for use in lightweight UAVs such as the Boeing ScanEagle. The firm claimed that their two-pound “NanoSAR” radar, shown below, weighed one-tenth as much as the smallest standard SAR (typically 30 – 200 pounds; 13.6 – 90.7 kg) at the time it was announced in March 2008. Because of power limits dictated by the radar circuit boards and power supply limitations on small UAVs, the NanoSAR has a relatively short range and is intended for tactical use on UAVs flying at a typical ScanEagle UAV operational altitude of about 16,000 feet.

Barnard NanoSAR. Source: Barnard Microsystems

ScanEagle UAV. Source: U.S. Marine Corps.

Nanyang Technological University, Singapore (NTU Singapore) recently announced that its scientists had developed a miniaturized SAR on a chip, which will allow SAR systems to be made a hundred times smaller than current ones.

Source: NTU

NTU reports:

“The single-chip SAR transmitter/receiver is less than 10 sq. mm (0.015 sq. in.) in size, uses less than 200 milliwatts of electrical power and has a resolution of 20 cm (8 in.) or better. When packaged into a 3 X 4 X 5-cm (0.9 X 1.2 X 1.5 in.) module, the system weighs less than 100 grams (3.5 oz.), making it suitable for use in micro-UAVs and small satellites.”

NTU estimates that it will be 3 to 6 years before the chip is ready for commercial use. You can read the 29 February 2016 press release from NTU at the following link:

http://media.ntu.edu.sg/NewsReleases/Pages/newsdetail.aspx?news=c7aa67e7-c5ab-43ae-bbb3-b9105a0cd880

With such a small and hopefully low cost SAR that can be integrated with low-cost UAVs, I’m sure we’ll soon see many new and useful radar imaging applications.

Remarkable Multispectral View of Our Milky Way Galaxy

Peter Lobner, updated 18 August 2023

Moody Blues album cover, “In Search of the Lost Chord.” Album cover art credit: Deram Records

Some of you may recall the following lyrics from the 1968 Moody Blues song, “The Word,” by Graeme Edge, from the album “In Search of the Lost Chord”:

This garden universe vibrates complete

Some, we get a sound so sweet

 Vibrations reach on up to become light

And then through gamma, out of sight

Between the eyes and ears there lie

The sounds of color and the light of a sigh

And to hear the sun, what a thing to believe

But it’s all around if we could but perceive

 To know ultraviolet, infrared and X-rays

Beauty to find in so many ways

On 24 February 2016, the European Southern Observatory (ESO) Consortium announced that it has completed the ATLASGAL Survey of the Milky Way. The survey mapped the entire galactic plane visible from the southern hemisphere at sub-millimeter wavelengths, between infrared light and radio waves, using the Atacama Pathfinder EXperiment (APEX) telescope located at 5,100 meters (16,732 ft.) above sea level in Chile’s Atacama region. The southern sky is particularly important because it includes the galactic center of our Milky Way. The Milky Way in the northern sky has already been mapped by the James Clerk Maxwell Telescope, which is a sub-millimeter wavelength telescope at the Mauna Kea Observatory in Hawaii.

The new ATLASGAL maps cover an area of sky 140 degrees long and 3 degrees wide. ESO stated that these are the sharpest maps yet made, and they complement those from other land-based and space-based observatories. The principal space-based observatories are the following:

  • European Space Agency’s (ESA) Planck satellite: Mission ended in 2013; mapped anisotropies of the cosmic microwave background at microwave and infrared frequencies.
  • ESA’s Herschel Space Observatory: Mission ended in 2013; conducted sky surveys in the far-infrared and sub-millimeter frequencies.
  • National Aeronautics and Space Administration (NASA) Spitzer Space Telescope: Mission on-going, conducting infrared observations and mapping as described in my 1 April 2015 post.
  • NASA’s Hubble Space Telescope: Mission on-going, observing and mapping at ultraviolet, optical, and infrared frequencies.
  • NASA’s Chandra X-Ray Observatory: Mission on-going, observing and mapping X-ray sources.
  • NASA’s Compton Gamma Ray Observatory: Mission ended in 2000. Observed and mapped gamma ray and x-ray sources.

ESO reported that the combination of Planck and APEX data allowed astronomers to detect emission spread over a larger area of sky and to estimate from it the fraction of dense gas in the inner galaxy. The ATLASGAL data were also used to create a complete census of cold and massive clouds where new generations of stars are forming.

You can read the ESO press release at the following link:

https://www.eso.org/public/news/eso1606/

Below is a composite ESO photograph that shows the same central region of the Milky Way observed at different wavelengths.

ESO multispectral view of the Milky Way. Source: ESO/ATLASGAL consortium/NASA/GLIMPSE consortium/VVV Survey/ESA/Planck/D. Minniti/S. Guisard. Acknowledgement: Ignacio Toledo, Martin Kornmesser

  • The top panel shows compact sources of sub-millimeter radiation detected by APEX as part of the ATLASGAL survey, combined with complementary data from ESA’s Planck satellite, to capture more extended features.
  • The second panel shows the same region as seen in shorter, infrared wavelengths by the NASA Spitzer Space Telescope
  • The third panel shows the same part of sky again at even shorter wavelengths, the near-infrared, as seen by ESO’s VISTA infrared survey telescope at the Paranal Observatory in Chile. Regions appearing as dark dust tendrils in the third panel show up brightly in the ATLASGAL view (top panel).
  • The bottom panel shows the more familiar view in visible light, where most of the more distant structures are hidden from view

NASA’s Goddard Space Flight Center also  created a multispectral view of the Milky Way, which  is shown in the following composite photograph of the same central region of the Milky Way observed at different wavelengths.

NASA Goddard multispectral view. Source: NASA Goddard Space Flight Center

Starting from the top, the ten panels in the NASA image cover the following wavelengths.

  • Radio frequency (408 MHz)
  • Atomic hydrogen
  • Radio frequency (2.5 GHz)
  • Molecular hydrogen
  • Infrared
  • Mid-infrared
  • Near-infrared
  • Optical
  • X-ray
  • Gamma ray

The Moody Blues song, “The Word,” ends with the following lyrics:

 Two notes of the chord, that’s our full scope

But to reach the chord is our life’s hope

And to name the chord is important to some

So they give it a word, and the word is “Om”

While “Om” (pronounced or hummed “ahh-ummmm”) traditionally is a sacred mantra of Hindu, Jain and Buddhist religions, it also may be the mantra of astronomers as they unravel new secrets of the Milky Way and, more broadly, the Universe. I suspect that completing the ATLASGAL Survey of the Milky Way was an “Om” moment for the many participants in the ESO Consortium effort.

For more information

VBB-3, the World’s Most Powerful Electric Car, will Challenge the Land Speed Record in 2016

Peter Lobner

Updated 2 January 2017

Venturi Buckeye Bullet-3 (VBB-3) is an all-electric, four wheel drive, land speed record (LSR) car that has been designed to exceed 400 mph (643.7 km/h). The organizations involved in this project are:

  • Venturi Automobiles:

This Monaco-based company is a leader in the field of high performance electric vehicles. Read more at the Venturi website at the following link:

http://en.venturi.fr/challenges/world-speed-records

  • Ohio State University (OSU) Center for Automotive Research (CAR):

OSU’s CAR has been engaged in all-electric LSR development and testing since 2000. On 3 October 2004 at the Bonneville Salt Flats in Utah, the original nickel-metal hydride (NiMH) battery-powered Buckeye Bullet reached a top speed of 321.834 mph (517.942 km/h).

In an on-going program known as Mission 01, started in 2009, OSU partnered with Venturi to develop, test, and conduct the land speed record runs of the hydrogen fuel cell-powered VBB-2, the battery-powered VBB-2.5, and the more powerful battery-powered VBB-3.  Read more at the OSU / CAR website at following link:

https://car.osu.edu/search/node/VBB-3

 The Venturi – OSU team’s accomplishments to date are:

  • 2009:  The team’s first world land speed record was achieved on the Bonneville Salt Flats with hydrogen fuel cell-powered VBB-2 at 303 mph (487 km/h).
  •  2010:  The team returned to the salt flats with the 700 hp lithium-ion battery powered VBB-2.5 which set another world record at 307 mph (495 km/h); with a top speed at 320 mph (515 km/h).
  •  2013:  The 3,000 hp lithium iron phosphate battery-powered VBB-3 was unveiled. Due to the flooding of the Bonneville Salt Flats, the FIA and the organizers of the world speed records program cancelled the 2013 competition.
  •  2014:  Poor track conditions at Bonneville persisted after flooding from a summer storm. Abbreviated test runs by VBB-3 yielded a world record in its category (electric vehicle over 3.5 metric tons) with an average speed of 212 mph (341 km/h) and a top speed of 270 mph (435 km/h).
  •  2015:  Poor track conditions at Bonneville persisted after flooding from a summer storm. Abbreviated test runs by VBB-3 yielded a world record in its category (electric vehicle over 3.5 metric tons) with an average speed of 212 mph (341 km/h) and a top speed of 270 mph (435 km/h).

You will find a comparison of the VBB-2, VBB-2.5 and VBB-3 vehicles at the following link:

http://en.vbb3.venturi.fr/about/the-car

VBB-3 has a 37.2 ft. (11.35 meter) long, slender, space frame chassis that houses eight battery packs with a total of 2,000 cells, two 1,500 hp AC induction motors developed by Venturi for driving the front and rear wheels, a coolant system for the power electronics, disc brakes and a braking parachute, and a small cockpit for the driver. The basic internal arrangement of these components in the VBB-3 chassis is shown in the following diagram.

VBB-3 internal arrangement. Source: Venturi

You can see a short video of a test drive of VBB-3 without its external skin at the following link:

http://en.vbb3.venturi.fr

The exterior aerodynamic carbon fiber shell was designed with the aid of the OSU Supercomputer Center to minimize vehicle drag and lift.

VBB-3 skin. Source: Venturi

The completed VBB-3 with members of the project team is shown below.

VBB-3 complete. Source: Venturi

A good video showing the 2010 VBB-2.5 record run and a 2014 test run of VBB-3 is at the following link:

https://www.youtube.com/watch?v=KLn07Y-t1Xc&ebc=ANyPxKqkVxPKQWnYXzUemRbE5WWlRIJUbaXA-UN6XPNoiDZG1O4NsFq8RE08QlrfdbfkxKmE32MEf5g2Qw0_WQbFXBvKYz9qwg

VBB-3 currently is being prepared in the OSU / CAR workshop in Columbus, Ohio, for another attempt at the land speed record in summer 2016. A team of about 25 engineers and students are planning to be at the Bonneville Salt Flats in summer 2016 with the goal of surpassing 372 mph (600 km/h).

You can subscribe to Venturi news releases on VBB-3 at the following link:

http://en.venturi.fr/news/the-vbb-3-gets-ready

VBB-3 at Bonneville. Source: Venturi

Update 2 January 2017: VBB-3 sets new EV land speed record

On 19 September 2016, VBB-3 set an electric vehicle (Category A Group VIII Class 8) land-speed record of 341.4 mph (549 kph), during a two-way run within one hour on the Bonneville salt flats in Utah. You can read the OSU announcement at the following link:

https://news.osu.edu/news/2016/09/21/ohio-states-all-electric-venturi-buckeye-bullet-3-sets-new-landspeed-record/

You also can watch a short video of VBB-3’s record run at the following link:

https://www.youtube.com/watch?v=rIqT4qLtGcY

Certification of this EV speed record by the Federation Internationale de l’Automobile (FIA) is still pending.

The Venturi-OSU team believes VBB-3 has the capability to achieve 435 mph (700 kph) in the right conditions, so we can expect more record attempts in the future.

Dispatchable Power from Energy Storage Systems Helps Maintain Grid Stability

Peter Lobner

On 3 March 2016, Mitsubishi Electric Corporation announced the delivery of the world’s largest energy storage system, which has a rated output of 50 MW and a storage capacity of 300 MWh. The battery-based system is installed in Japan at Kyushu Electric Power Company’s Buzen Power Plant as part of a pilot project to demonstrate the use of high-capacity energy storage systems to balance supply and demand on a grid that has significant, weather-dependent (intermittent), renewable power sources (i.e., solar and/or wind turbine generators). This system offers energy-storage and dispatch capabilities similar to those of a pumped hydro facility. You can read the Mitsubishi press release at the following link:

http://www.mitsubishielectric.com/news/2016/pdf/0303-b.pdf

The energy storage system and associated electrical substation installation at Buzen Power Plant are shown below. The energy storage system is comprised of 63 4-module units, where each module contains sodium-sulfur (NaS) batteries with a rated output of 200 kW. The modules are double stacked to reduce the facility’s footprint and cost.

Buzen Power Plant, Japan. Source: Mitsubishi

The following simplified diagram shows how the Mitsubishi grid supervisory control and data acquisition (SCADA) system matches supply with variable demand on a grid with three dispatchable energy sources (thermal, pumped hydro and battery storage) and one non-dispatchable (intermittent) energy source (solar photovoltaic, PV). As demand varies through the day, thermal power plants can maneuver (within limits) to meet increasing load demand, supplemented by pumped hydro and battery storage to meet peak demands and to respond to the short-term variability of power from PV generators. A short-term power excess is used to recharge the batteries. Pumped hydro typically is recharged over night, when the system load demand is lower.

Mitsubishi SCADA

Above diagram: Mitsubishi BLEnDer® RE Battery SCADA System (Source: Mitsubishi)
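The balancing behavior sketched in the diagram above can be illustrated with a toy, hour-by-hour dispatch loop (Python, with hypothetical demand and solar profiles and a battery rated like the 50 MW / 300 MWh Buzen system; this is only a sketch of the general merit-order idea, not Mitsubishi’s BLEnDer® algorithm):

    # Toy hour-by-hour grid balancing sketch -- illustrative numbers only.
    # Dispatch order: solar (must-take) -> thermal -> battery; surplus recharges the battery.
    # With a 1-hour timestep, MW and MWh are numerically interchangeable here.

    BATT_POWER_MW = 50.0       # rated output, like the Buzen system
    BATT_ENERGY_MWH = 300.0    # storage capacity
    THERMAL_MAX_MW = 450.0     # assumed thermal fleet limit

    demand_mw = [380, 360, 400, 480, 520, 500, 470, 430]   # assumed load, MW
    solar_mw  = [  0,  40, 120, 150,  90,  30,   0,   0]   # assumed PV output, MW

    soc_mwh = 150.0            # battery state of charge
    for hour, (load, pv) in enumerate(zip(demand_mw, solar_mw)):
        net_load = load - pv                      # load remaining after must-take PV
        thermal = min(max(net_load, 0.0), THERMAL_MAX_MW)
        residual = net_load - thermal             # >0 shortfall, <0 surplus

        if residual > 0:                          # discharge battery to cover the peak
            discharge = min(residual, BATT_POWER_MW, soc_mwh)
            soc_mwh -= discharge
            residual -= discharge
        else:                                     # use any surplus to recharge
            charge = min(-residual, BATT_POWER_MW, BATT_ENERGY_MWH - soc_mwh)
            soc_mwh += charge
            residual += charge

        print(f"hour {hour}: load={load} MW, PV={pv} MW, thermal={thermal:.0f} MW, "
              f"battery SOC={soc_mwh:.0f} MWh, unbalanced={residual:.0f} MW")

In a real system, the SCADA layer would also schedule pumped hydro, respect thermal ramp-rate limits, and forecast PV output, but the same basic logic applies: flexible resources absorb the difference between demand and intermittent generation.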

Battery storage is only one of several technologies available for grid-connected energy storage systems. You can read about the many other alternatives in the December 2013 Department of Energy (DOE) report, “Grid Energy Storage”, which you can download at the following link:

http://www.sandia.gov/ess/docs/other/Grid_Energy_Storage_Dec_2013.pdf

This 2013 report includes the following figure, which shows the rated power of U.S. grid storage projects, including announced projects.

US 2013 grid storage projects. Source: DOE

As you can see, battery storage systems, such as the Mitsubishi system at Buzen Power Plant, comprise only a small fraction of grid-connected energy storage systems, which currently are dominated in the U.S. by pumped hydro systems. DOE reported that, as of August 2013, there were 202 energy storage systems deployed in the U.S. with a total installed power rating of 24.6 GW. Energy storage capacity (i.e., GWh) was not stated. In contrast, total U.S. installed generating capacity in 2013 was over 1,000 GW, so fully-charged storage systems can support about 2.4% of the nation’s load demand for a short period of time.

Among DOE’s 2013 strategic goals for grid energy storage systems are the following cost goals:

  • Near-term energy storage systems:
    • System capital cost: < $1,750/kW; < $250/kWh
    • Levelized cost: < 20¢ / kWh / cycle
    • System efficiency: > 75%
    • Cycle life: > 4,000 cycles
  • Long-term energy storage systems:
    • System capital cost: < $1,250/kW; < $150/kWh
    • Levelized cost: < 10¢ / kWh / cycle
    • System efficiency: > 80%
    • Cycle life: > 5,000 cycles

Using the DOE near-term cost goals, we can estimate the cost of the energy storage system at the Buzen Power Plant to be in the range from $75 – 87.5 million. DOE estimated that the storage devices contributed 30 – 40% of the cost of an energy storage system.  That becomes a recurring operating cost when the storage devices reach their cycle life limit and need to be replaced.
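(Working through the arithmetic behind that estimate: sizing on power, 50 MW × $1,750/kW ≈ $87.5 million; sizing on energy, 300 MWh × $250/kWh ≈ $75 million. The two figures bound the $75 – 87.5 million range cited above, with the binding constraint depending on whether the system is power-limited or energy-limited.)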

The Energy Information Agency (EIA) defines capacity factor as the ratio of a generator’s actual generation over a specified period of time to its maximum possible generation over that same period of time. EIA reported the following installed generating capacities and capacity factors for U.S. wind and solar generators in 2015:

US renewable power 2015
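As a simple illustration (hypothetical numbers), a 100 MW wind farm that generated 289,000 MWh over the 8,760 hours in a year would have a capacity factor of 289,000 / (100 × 8,760) ≈ 0.33, or about 33%.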

Currently there are 86 GW of intermittent power sources connected to the U.S. grid and that total is growing year-on-year. As shown below, EIA expects 28% growth in solar generation and 16% growth in wind generation in the U.S. in 2016.

Source: EIA

The reason we need dispatchable grid storage systems is because of the proliferation of grid-connected intermittent generators and the need for grid operators to manage grid stability regionally and across the nation.

California’s Renewables Portfolio Standard (RPS) Program has required that utilities procure 33% of their electricity from “eligible renewable energy resources” by 2020. On 7 October 2015, Governor Jerry Brown signed into law a bill (SB 350) that increased this goal to 50% by 2030. There is no concise definition of “eligible renewable energy resources,” but you can get a good understanding of this term in the 2011 California Energy Commission guidebook, “Renewables Portfolio Standard Eligibility – 4th Edition,” which you can download at the following link:

http://www.energy.ca.gov/2010publications/CEC-300-2010-007/CEC-300-2010-007-CMF.PDF

The “eligible renewable energy resources” include solar, wind, and other resources, several of which would not be intermittent generators.

In 2014, the installed capacity of California’s 1,051 in-state power plants (greater than 0.1 megawatts – MW) was 86.9 GW. These plants produced 198,908 GWh of electricity in 2014. An additional 97,735 GWh (about 33%) was imported from out-of-state generators, yielding a 2014 statewide total electricity consumption of almost 300,000 GWh of electricity. By 2030, 50% of total generation is mandated to be from “eligible renewable energy resources,” and a good fraction of those resources will be operating intermittently at average capacity factors in the range from 22 – 33%.
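To get a rough sense of what the 2030 mandate implies, the following back-of-the-envelope sketch (Python, using the consumption figure and capacity factor range quoted above, and assuming for simplicity that all of the eligible renewable energy comes from intermittent sources) estimates the nameplate capacity that would be needed:

    # Back-of-the-envelope estimate of intermittent renewable capacity for 2030.
    # Illustrative only: assumes all eligible renewable energy is intermittent.

    annual_consumption_gwh = 300_000.0   # approximate CA consumption (2014 basis)
    renewable_share = 0.50               # SB 350 target for 2030
    hours_per_year = 8_760.0

    required_gwh = renewable_share * annual_consumption_gwh   # 150,000 GWh

    for cf in (0.22, 0.33):
        capacity_gw = required_gwh / (hours_per_year * cf)
        print(f"At a {cf:.0%} capacity factor: about {capacity_gw:.0f} GW of nameplate capacity")

The result, roughly 50 – 80 GW of intermittent nameplate capacity, is comparable to the state’s entire 2014 in-state installed capacity of 86.9 GW, which gives a sense of the scale of the variability that dispatchable resources, including grid-connected storage, will have to manage.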

The rates we pay as electric power customers in California already are among the highest in the nation, largely because of the Renewables Portfolio Standard (RPS) Program. With the higher targets for 2030, we soon will be paying even more for the deployment, operation and maintenance of massive new grid-connected storage infrastructure that will be needed to keep the state and regional grids stable.