Internet Archive: a Great Access Point to Many Web Resources and Vintage Science Fiction

Peter Lobner

Internet Archive is a non-profit library of millions of free books, audio books, movies, music, software and more, which you can access at the following link:

https://archive.org

It’s hard to navigate this site to find out what’s there. The home page presents icons for the “Top Collections in the Archive,” but you have to scroll through many pages to view hundreds of these icons, each of which links to a corresponding collection. Interesting collections I found include:

  • American Libraries
  • The Library of Congress
  • The LibriVox Free Audiobook Collection
  • Software Library: MS-DOS Games
  • Computer Magazine Archives
  • Television Archive
  • Grateful Dead
  • Metropolitan Museum of Art Gallery Images
  • Khan Academy

Archive icons

There’s a Pulp Magazine Archive at the following link:

https://archive.org/details/pulpmagazinearchive

Once there, select Topic: “science fiction”, or use the following direct link:

https://archive.org/details/pulpmagazinearchive?and%5B%5D=subject%3A%22science+fiction%22&sort=-downloads&page=2

Then you’re on your way to libraries of vintage science fiction.  Below are results from my own searches.
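If you want to script this kind of filtered search, note that the direct link above is just the collection URL plus three query parameters. Below is a minimal Python sketch, using only the standard library, that rebuilds such links. The query keys are copied from the link shown above; any subject other than “science fiction” is an untested assumption.

```python
# Build a filtered Internet Archive collection link like the one shown above.
# A minimal sketch using only the Python standard library.
from urllib.parse import urlencode

BASE = "https://archive.org/details/pulpmagazinearchive"

def filtered_url(subject: str, page: int = 1) -> str:
    """Return the pulp archive URL filtered to one subject, sorted by downloads."""
    params = urlencode({
        "and[]": f'subject:"{subject}"',  # topic filter, as in the link above
        "sort": "-downloads",             # most-downloaded items first
        "page": page,
    })
    return f"{BASE}?{params}"

print(filtered_url("science fiction"))
# -> https://archive.org/details/pulpmagazinearchive?and%5B%5D=subject%3A%22science+fiction%22&sort=-downloads&page=1
```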

Galaxy Science Fiction:

Galaxy Science Fiction was an American digest-size science fiction magazine published from 1950 to 1980. It was founded by an Italian company, World Editions, to help it break into the American market. World Editions hired H. L. Gold as editor; he rapidly made Galaxy the leading science fiction magazine of its time, focusing on stories about social issues rather than technology.

The Galaxy Science Fiction archive, with 361 results, is located at the following link:

https://archive.org/details/galaxymagazine

Galaxy SF archive pic

If:

If was an American science fiction magazine launched in March 1952 by Quinn Publications. The magazine was moderately successful, though it was never regarded as one of the first rank of science fiction magazines. It achieved its greatest success under editor Frederik Pohl, winning the Hugo Award three years running from 1966 to 1968. If was merged into Galaxy Science Fiction after the December 1974 issue, its 175th issue overall.

The If science fiction archive, with 176 results, is located at the following link:

https://archive.org/details/ifmagazine

If SF archive pic

Amazing Stories:

Amazing Stories was an American science fiction magazine launched in April 1926 as the first magazine devoted solely to science fiction. It was published, with some interruptions, for almost 80 years. Although it was not considered an influential magazine in the genre, it was nominated for the Hugo Award three times in the 1970s. It ceased publication in 2005.

The Amazing Stories archive, with 160 results, is located at the following link:

https://archive.org/details/pulpmagazinearchive?and%5B%5D=amazing+stories&sort=-downloads&page=2

Amazing SF archive pic

The Skylark of Space is one of the earliest novels of interstellar travel and is considered a classic of pulp science fiction. Originally serialized in 1928, it is available as a 9-hour audiobook at the following link:

https://archive.org/details/skylark_space_2_1012_librivox

Skylark of Space

Good luck navigating the Internet Archive website. I hope you find some interesting things.

Is Arthur C. Clarke’s 1953 Short Story “Superiority” a Parable for Today?

Peter Lobner

Sir Arthur Charles Clarke was a British science fiction writer, science writer and futurist who became recognized worldwide for his many short stories and novels, which have captivated readers since the early 1950s. You might know him best as the author of “Childhood’s End” and “2001: A Space Odyssey.”

Sir Arthur C. Clarke. Source: http://amazingstoriesmag.com

In the short story “Superiority,” which was published in his 1953 story collection, Expedition to Earth, Clarke describes a spacefaring federation of planets involved in a protracted war with a distant adversary, with both sides using comparable weaponry. The allure of advanced weaponry and “a revolution in warfare” led one side to shift resources away from traditional weaponry and invest instead in fewer vessels with advanced weapons systems that were sure to turn the tide of the war: the Sphere of Annihilation, the Battle Analyzer, and the Exponential Field.

As you might guess, the outcome was somewhat different, because:

  • The new systems were only “almost perfected in the laboratory”
  • There were unforeseen complications and delays during development of the operational systems
  • There were unforeseen support and training requirements that compromised the operational use of the new systems and introduced new vulnerabilities
  • The new systems failed to deliver the expected “force multiplier” effect
  • There were unforeseen consequences from the operational use of some new weaponry

The adversary won the war with a numerically superior fleet using obsolete weapons based on inferior science.

Take time now to read this short story at the following link:

http://www.mayofamily.com/RLM/txt_Clarke_Superiority.html

Bill Sweetman has written an interesting commentary on Arthur C. Clarke’s “Superiority” in the 14 March 2016 issue of Aviation Week and Space Technology. His commentary, entitled “Timeless Insight Into Why Military Programs Go Wrong – The history of defense program failures was foretold in 1953,” finds stunning parallels between the story line in “Superiority” and the history of many real-world defense programs from WW II to the present day. You can read Bill Sweetman’s commentary at the following link:

http://aviationweek.com/defense/opinion-timeless-insight-why-military-programs-go-wrong

Considering SAIC’s long-term, significant role in supporting many U.S. advanced war-fighting and intelligence system programs, many of us were the real-world analogs of the thousands of scientists, engineers, and managers working for Professor-General Norden, the Chief of the Research Staff, in “Superiority.” In Bill Sweetman’s commentary, he asks, “Is ‘Superiority’ a parable?” Based on your own experience at SAIC and elsewhere in the military-industrial complex, what do you think?

If you still haven’t read “Superiority,” please do it now. It’s worth your time.

Science is not Driving the Climate Change Debate

Peter Lobner

Thanks to Paul Fleming for sending me a thought-provoking, well-documented paper entitled “Global Warming and the Irrelevance of Science,” which was posted online on 17 February 2016 by Richard S. Lindzen, Alfred P. Sloan Professor of Atmospheric Sciences (Emeritus), Massachusetts Institute of Technology. This paper is the text of a lecture delivered on 20 August 2015 to the 48th Session of the Erice International Seminars on Planetary Emergencies.

The basic premise of this paper is that, in many fields such as climate research, governments have a monopoly on the support of scientific research, and, through government-funded research contracts, influence the outcome of the very research being funded.

Lindzen starts his paper by observing that,

“Unfortunately, as anticipated by Eisenhower in his farewell speech from January 17, 1961 (the one that also warned of the military-industrial complex), ‘Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity.’

Rather, the powers that be invent the narrative independently of the views of even cooperating scientists. It is, in this sense, that the science becomes irrelevant.”

Lindzen uses the term “iron triangle” to describe this closed-loop vicious cycle:

  • Vertex 1: Scientists perform research and make meaningless or ambiguous statements about the research (IPCC WG1)
  • Vertex 2: Advocates and media ‘translate’ these statements into alarmist declarations [IPCC WG2 (impacts) & WG3 (mitigation), some politicians]
  • Vertex 3: Politicians respond to alarm by feeding more money to the scientists in the first vertex

The net result is poor environmental decision-making that is not supportable by credible climate science. On this matter, Lindzen notes:

“The situation may have been best summarized by Mike Hulme, director of the Tyndall Centre at the University of East Anglia (a center of concern for global warming): ‘To state that climate change will be “catastrophic” hides a cascade of value-laden assumptions, which do not emerge from empirical or theoretical science.’”

Lindzen characterized the following three different narratives related to the global warming debate:

  • Narrative 1 – IPCC WG1:
    • Broadly supportive of the proposition that increasing greenhouse gas concentrations are a serious concern
    • Relatively open about the uncertainties and even contradictions associated with this position
    • Public pronouncements tend to be vague with ample room for denial, carefully avoiding catastrophist hyperbole while also avoiding outright rejection of such hyperbole
  • Narrative 2 – Skeptics:
    • Regard the fact that virtually all models ‘run hot’ (i.e., their projections for the most part greatly exceed observed warming) as strongly supporting the case for low climate sensitivity
    • Generally believe in testing the physics underlying the positive feedbacks in sensitive models rather than averaging models
    • Much more open to the numerous known causes of climate change (including long period ocean circulations, solar variability, impacts of ice, etc.), and do not regard CO2 as the climate’s ultimate ‘control knob’
    • Openly oppose catastrophism
  • Narrative 3 – Political promoters of climate alarm (including IPCC WG2 & WG3, many environmental NGOs and mass media):
    • Emphasize alleged consequences of the worst case scenarios presented by WG1
    • Claim virtually unanimous support
    • It is this narrative for which the science is largely irrelevant.

Lindzen notes that, “Unfortunately, for most people, the third narrative is all they will see.”

You can read Richard S. Lindzen’s complete paper at the following link:

http://euanmearns.com/global-warming-and-the-irrelevance-of-science/

Thanks also to Mike Spaeth for sending me the following link to an informative document entitled, “A Primer on Carbon Dioxide and Climate,” prepared by a recently formed organization known as the CO2 Coalition.

http://co2coalition.org/primer-carbon-dioxide-climate/

The CO2 Coalition, formed in 2015, represents itself as, “a new and independent, non-profit organization that seeks to engage thought leaders, policy makers, and the public in an informed, dispassionate discussion of how our planet will be affected by CO2 released from the combustion of fossil fuel.” Hopefully, they can help make some headway with the mass media, general public, and politicians that currently are entrenched in Narrative 3. Even cartoonists know that this will be an uphill battle.

Research & critical thinking

Source: http://www.gocomics.com/nonsequitur/2016/02/16

Simulating Extreme Spacetimes

Peter Lobner

Thanks to Dave Groce for sending me the following link to the Caltech-Cornell numerical relativity collaboration, Simulating eXtreme Spacetimes (SXS):

http://www.black-holes.org

Caltech SXS website. Source: SXS

From the actual website (not the image above), click on the yellow “Admit One” ticket and you’re on your way.

Under the “Movies” tab, you’ll find many video simulations that help visualize a range of interactions between two black holes and between a black hole and a neutron star. Following is a direct link:

http://www.black-holes.org/explore/movies

A movie visualizing GW150914, the first-ever gravitational wave detection, made on 14 September 2015, is at the following SXS link:

https://www.black-holes.org/gw150914

At the above link, you also can listen to the sound of the GW150914 “inspiral” event (two black holes spiraling in on each other). You can read more about the detection of GW150914 in my 11 February 2016 post.

On the “Sounds” tab on the SXS website, you’ll find that different types of major cosmic events are expected to emit gravitational waves with waveforms that will help characterize the original event. You can listen to the expected sounds from a variety of extreme cosmic events at the following SXS link:

http://www.black-holes.org/explore/sounds

Have fun exploring SXS.

Synthetic Aperture Radar (SAR) and Inverse SAR (ISAR) Enable an Amazing Range of Remote Sensing Applications

Peter Lobner

SAR Basics

Synthetic Aperture Radar (SAR) is an imaging radar that operates at microwave frequencies and can “see” through clouds, smoke and foliage to reveal detailed images of the surface below in all weather conditions. Below is a SAR image superimposed on an optical image with clouds, showing how a SAR image can reveal surface details that cannot be seen in the optical image.

Example SAR image. Source: Cassidian radar, Eurimage optical

SAR systems usually are carried on airborne or space-based platforms, including manned aircraft, drones, and military and civilian satellites. Doppler shifts from the motion of the radar relative to the ground are used to electronically synthesize a longer antenna, where the synthetic length (L) of the aperture is equal to: L = v x t, where “v” is the relative velocity of the platform and “t” is the time period of observation. Depending on the altitude of the platform, “L” can be quite long. The time-multiplexed return signals from the radar antenna are electronically recombined to produce the desired images in real-time or post-processed later.
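To make the aperture relationship concrete, here is a minimal Python sketch of the L = v x t calculation, with the standard synthetic-aperture azimuth-resolution approximation (resolution ≈ wavelength x range / 2L) added for illustration. All of the input values are made-up examples, not the parameters of any particular radar.

```python
# Worked example of the synthetic aperture relationship L = v * t described
# above, plus an approximate azimuth resolution. Illustrative numbers only.

def synthetic_aperture_length(v_m_s: float, t_s: float) -> float:
    """Synthetic aperture length in meters: L = v * t."""
    return v_m_s * t_s

def azimuth_resolution(wavelength_m: float, slant_range_m: float, l_m: float) -> float:
    """Approximate azimuth resolution: delta ~ (wavelength * range) / (2 * L)."""
    return wavelength_m * slant_range_m / (2.0 * l_m)

L = synthetic_aperture_length(v_m_s=200.0, t_s=10.0)  # aircraft at 200 m/s for 10 s
print(f"synthetic aperture L = {L:.0f} m")            # 2,000 m
print(f"azimuth resolution ~ {azimuth_resolution(0.03, 50_000.0, L):.2f} m")
# X-band (3 cm wavelength) at 50 km slant range -> roughly 0.4 m
```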

SAR principle

Source: Christian Wolff, http://www.radartutorial.eu/20.airborne/pic/sar_principle.print.png

This principle of SAR operation was first identified in 1951 by Carl Wiley and patented in 1954 as “Simultaneous Buildup Doppler.”

SAR Applications

There are many SAR applications, so I’ll just highlight a few.

Boeing E-8 JSTARS: The Joint Surveillance Target Attack Radar System is an airborne battle management, command and control, intelligence, surveillance and reconnaissance platform, the prototypes of which were first deployed by the U.S. Air Force during the 1991 Gulf War (Operation Desert Storm). The E-8 platform is a modified Boeing 707 with a 27 foot (8 meter) long, canoe-shaped radome under the forward fuselage that houses a 24 foot (7.3 meter) long, side-looking, multi-mode, phased array antenna that includes a SAR mode of operation. The USAF reports that this radar has a field of view of up to 120 degrees, covering nearly 19,305 square miles (50,000 square kilometers).

E-8 JSTARS. Source: USAF

Lockheed SR-71: This Mach 3 high-altitude reconnaissance jet carried the Advanced Synthetic Aperture Radar System (ASARS-1) in its nose. ASARS-1 had a claimed 1 inch resolution in spot mode at a range of 25 to 85 nautical miles either side of the flight path.  This SAR also could map 20 to 100 nautical mile swaths on either side of the aircraft with lesser resolution.

SR-71. Source: http://www.wvi.com/~sr71webmaster/sr_sensors_pg2.htm

Northrop RQ-4 Global Hawk: This is a large, multi-purpose, unmanned aerial vehicle (UAV) that can simultaneously carry out electro-optical, infrared, and synthetic aperture radar surveillance as well as high and low band signal intelligence gathering.

Global Hawk. Source: USAF

Below is a representative RQ-4 2-D SAR image that has been highlighted to show passable and impassable roads after severe hurricane damage in Haiti. This is an example of how SAR data can be used to support emergency management.

Global Hawk Haiti post-hurricane image. Source: USAF

NASA Space Shuttle: The Shuttle Radar Topography Mission (SRTM) used the Space-borne Imaging Radar (SIR-C) and X-Band Synthetic Aperture Radar (X-SAR) to map 140 mile (225 kilometer) wide swaths, imaging most of Earth’s land surface between 60 degrees north and 56 degrees south latitude. Radar antennae were mounted in the Space Shuttle’s cargo bay, and at the end of a deployable 60 meter mast that formed a long-baseline interferometer. The interferometric SAR data was used to generate very accurate 3-D surface profile maps of the terrain.

Shuttle SRTM. Source: NASA / Jet Propulsion Laboratory

An example of SRTM image quality is shown in the following X-SAR false-color digital elevation map of Mt. Cotopaxi in Ecuador.

Shuttle SRTM image. Source: NASA / Jet Propulsion Laboratory

You can find more information on SRTM at the following link:

https://directory.eoportal.org/web/eoportal/satellite-missions/s/srtm

ESA’s Sentinel satellites: Refer to my 4 May 2015 post, “What Satellite Data Tell Us About the Earthquake in Nepal,” for information on how the European Space Agency (ESA) assisted earthquake response by rapidly generating a post-earthquake 3-D ground displacement map of Nepal using SAR data from multiple orbits (i.e., pre- and post-earthquake) of the Sentinel-1A satellite.  You can find more information on the ESA Sentinel SAR platform at the following link:

http://www.esa.int/Our_Activities/Observing_the_Earth/Copernicus/Sentinel-1/Introducing_Sentinel-1

You will find more general information on space-based SAR remote sensing applications, including many high-resolution images, in a 2013 European Space Agency (ESA) presentation, “Synthetic Aperture Radar (SAR): Principles and Applications”, by Alberto Moreira, at the following link:

https://earth.esa.int/documents/10174/642943/6-LTC2013-SAR-Moreira.pdf

ISAR Basics

ISAR technology uses the relative movement of the target, rather than of the emitter, to create the synthetic aperture. The ISAR antenna can be mounted on an airborne platform. Alternatively, ISAR can be implemented with one or more ground-based antennae to generate a 2-D or 3-D radar image of an object moving within the field of view.

ISAR Applications

Maritime surveillance: Maritime surveillance aircraft commonly use ISAR systems to detect, image and classify surface ships and other objects in all weather conditions. Because the sea, hull, superstructure, and masts have different radar reflection characteristics as a vessel moves on the surface, vessels usually stand out in ISAR images. There can be enough radar information derived from ship motion, including pitching and rolling, to allow the ISAR operator to manually or automatically determine the type of vessel being observed. The U.S. Navy’s new P-8 Poseidon patrol aircraft carry the AN/APY-10 multi-mode radar system that includes both SAR and ISAR modes of operation.

The principles behind ship classification are described in detail in the 1993 MIT paper, “An Automatic Ship Classification System for ISAR Imagery,” by M. Menon, E. Boudreau and P. Kolodzy, which you can download at the following link:

https://www.ll.mit.edu/publications/journal/pdf/vol06_no2/6.2.4.shipclassification.pdf

You can see in the following example ISAR image of a vessel at sea that vessel classification may not be obvious to the casual observer. It is easy to see why an automated vessel classification system would be very useful.

Ship ISAR image

Source: Blanco-del-Campo, A. et al., http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=5595482&url=http%3A%2F%2Fieeexplore.ieee.org%2Fiel5%2F7361%2F5638351%2F05595482.pdf%3Farnumber%3D5595482

Imaging Objects in Space: Another ISAR (also called “delay-Doppler”) application is the use of one or more large radio telescopes to generate radar images of objects in space at very long ranges. The process for accomplishing this was described in a 1960 MIT Lincoln Laboratory paper, “Signal Processing for Radar Astronomy,” by R. Price and P.E. Green.

Currently, there are two powerful ground-based radars in the world capable of investigating solar system objects: the National Aeronautics and Space Administration (NASA) Goldstone Solar System Radar (GSSR) in California and the National Science Foundation (NSF) Arecibo Observatory in Puerto Rico. News releases on China’s new FAST radio telescope have not revealed if it also will be able to operate as a planetary radar (see my 18 February 2016 post).

The 230 foot (70 meter) GSSR has an 8.6 GHz (X-band) radar transmitter powered by two 250 kW klystrons. You can find details on GSSR and the techniques used for imaging space objects in the article, “Goldstone Solar System Radar Observatory: Earth-Based Planetary Mission Support and Unique Science Results,” which you can download at the following link:

http://echo.jpl.nasa.gov/asteroids/Slade_Benner_Silva_IEEE_Proceedings.pdf

The 1,000 foot (305 meter) Arecibo Observatory has a 2.38 GHz (S-band) radar transmitter, originally rated at 420 kW when it was installed in 1974, and upgraded in 1997 to 1 MW along with other significant upgrades to improve radio telescope and planetary radar performance. You will find details on the design and upgrades of Arecibo at the following link:

http://www.astro.wisc.edu/~sstanimi/Students/daltschuler_2.pdf

The following examples demonstrate the capabilities of Arecibo Observatory to image small bodies in the solar system.

  • In 1999, this radar imaged the Near-Earth Asteroid 1999 JM8 at a distance of about 5.6 million miles (9 million km) from Earth. The ISAR images of this 1.9 mile (3 km) sized object had a resolution of about 49 feet (15 meters).
  • In November 1999, Arecibo Observatory imaged the tumbling Main-Belt Asteroid 216 Kleopatra. The resulting ISAR images, which made the cover of Science magazine, showed a dumbbell-shaped object with an approximate length of 134.8 miles (217 kilometers) and varying diameters up to 58.4 miles (94 kilometers).

Asteroid image. Source: Science

More details on the use of Arecibo Observatory to image planets and other bodies in the solar system can be found at the following link:

http://www.naic.edu/general/index.php?option=com_content&view=article&id=139&Itemid=474

The NASA / Jet Propulsion Laboratory Asteroid Radar Research website also contains information on the use of radar to map asteroids and includes many examples of asteroid radar images. Access this website at the following link:

http://echo.jpl.nasa.gov

Miniaturization

In recent years, SAR units have become smaller and more capable as hardware is miniaturized and better integrated. For example, Utah-based Barnard Microsystems offers a miniature SAR for use in lightweight UAVs such as the Boeing ScanEagle. The firm claimed that their two-pound “NanoSAR” radar, shown below, weighed one-tenth as much as the smallest standard SAR (typically 30 – 200 pounds; 13.6 – 90.7 kg) at the time it was announced in March 2008. Because of power limits dictated by the radar circuit boards and power supply limitations on small UAVs, the NanoSAR has a relatively short range and is intended for tactical use on UAVs flying at a typical ScanEagle UAV operational altitude of about 16,000 feet.

Barnard NanoSAR. Source: Barnard Microsystems

ScanEagle UAV. Source: U.S. Marine Corps.

Nanyang Technological University, Singapore (NTU Singapore) recently announced that its scientists had developed a miniaturized SAR on a chip, which will allow SAR systems to be made a hundred times smaller than current ones.

Miniaturized SAR chip. Source: NTU

NTU reports:

“The single-chip SAR transmitter/receiver is less than 10 sq. mm (0.015 sq. in.) in size, uses less than 200 milliwatts of electrical power and has a resolution of 20 cm (8 in.) or better. When packaged into a 3 X 4 X 5-cm (0.9 X 1.2 X 1.5 in.) module, the system weighs less than 100 grams (3.5 oz.), making it suitable for use in micro-UAVs and small satellites.”

NTU estimates that it will be 3 to 6 years before the chip is ready for commercial use. You can read the 29 February 2016 press release from NTU at the following link:

http://media.ntu.edu.sg/NewsReleases/Pages/newsdetail.aspx?news=c7aa67e7-c5ab-43ae-bbb3-b9105a0cd880

With such a small and, hopefully, low-cost SAR that can be integrated with low-cost UAVs, I’m sure we’ll soon see many new and useful radar imaging applications.

Remarkable Multispectral View of Our Milky Way Galaxy

Peter Lobner, updated 18 August 2023

Moody Blues album cover, “In Search of the Lost Chord.” Album cover art credit: Deram Records

Some of you may recall the following lyrics from the 1968 Moody Blues song, “The Word,” by Graeme Edge, from the album “In Search of the Lost Chord”:

This garden universe vibrates complete

Some, we get a sound so sweet

Vibrations reach on up to become light

And then through gamma, out of sight

Between the eyes and ears there lie

The sounds of color and the light of a sigh

And to hear the sun, what a thing to believe

But it’s all around if we could but perceive

To know ultraviolet, infrared and X-rays

Beauty to find in so many ways

On 24 February 2016, the European Southern Observatory (ESO) Consortium announced that it had completed the ATLASGAL Survey of the Milky Way. The survey mapped the entire galactic plane visible from the southern hemisphere at sub-millimeter wavelengths, between infrared light and radio waves, using the Atacama Pathfinder EXperiment (APEX) telescope, located 5,100 meters (16,732 ft.) above sea level in Chile’s Atacama region. The southern sky is particularly important because it includes the galactic center of our Milky Way. The Milky Way in the northern sky has already been mapped by the James Clerk Maxwell Telescope, a sub-millimeter wavelength telescope at the Mauna Kea Observatory in Hawaii.

The new ATLASGAL maps cover an area of sky 140 degrees long and 3 degrees wide. ESO stated that these are the sharpest maps yet made, and they complement those from other land-based and space-based observatories. The principal space-based observatories are the following:

  • European Space Agency’s (ESA) Planck satellite: Mission ended in 2013. Mapped anisotropies of the cosmic microwave background at microwave and infrared frequencies.
  • ESA’s Herschel Space Observatory: Mission ended in 2013. Conducted sky surveys in the far-infrared and sub-millimeter frequencies.
  • National Aeronautics and Space Administration (NASA) Spitzer Space Telescope: Mission on-going, conducting infrared observations and mapping as described in my 1 April 2015 post.
  • NASA’s Hubble Space Telescope: Mission on-going, observing and mapping at ultraviolet, optical, and infrared frequencies.
  • NASA’s Chandra X-Ray Observatory: Mission on-going, observing and mapping X-ray sources.
  • NASA’s Compton Gamma Ray Observatory: Mission ended in 2000. Observed and mapped gamma ray and x-ray sources.

ESO reported that the combination of Planck and APEX data allowed astronomers to detect emission spread over a larger area of sky and to estimate from it the fraction of dense gas in the inner galaxy. The ATLASGAL data were also used to create a complete census of cold and massive clouds where new generations of stars are forming.

You can read the ESO press release at the following link:

https://www.eso.org/public/news/eso1606/

Below is a composite ESO photograph that shows the same central region of the Milky Way observed at different wavelengths.

ESO multispectral view of the Milky Way. Source: ESO/ATLASGAL consortium/NASA/GLIMPSE consortium/VVV Survey/ESA/Planck/D. Minniti/S. Guisard. Acknowledgement: Ignacio Toledo, Martin Kornmesser

  • The top panel shows compact sources of sub-millimeter radiation detected by APEX as part of the ATLASGAL survey, combined with complementary data from ESA’s Planck satellite to capture more extended features.
  • The second panel shows the same region as seen at shorter, infrared wavelengths by the NASA Spitzer Space Telescope.
  • The third panel shows the same part of sky again at even shorter wavelengths, the near-infrared, as seen by ESO’s VISTA infrared survey telescope at the Paranal Observatory in Chile. Regions appearing as dark dust tendrils in the third panel show up brightly in the ATLASGAL view (top panel).
  • The bottom panel shows the more familiar view in visible light, where most of the more distant structures are hidden from view.

NASA’s Goddard Space Flight Center also created a multispectral view of the Milky Way, which is shown in the following composite photograph of the same central region of the Milky Way observed at different wavelengths.

NASA Goddard multispectral view. Source: NASA Goddard Space Flight Center

Starting from the top, the ten panels in the NASA image cover the following wavelengths.

  • Radio frequency (408 MHz)
  • Atomic hydrogen
  • Radio frequency (2.5 GHz)
  • Molecular hydrogen
  • Infrared
  • Mid-infrared
  • Near-infrared
  • Optical
  • X-ray
  • Gamma ray

The Moody Blues song, “The Word,” ends with the following lyrics:

Two notes of the chord, that’s our full scope

But to reach the chord is our life’s hope

And to name the chord is important to some

So they give it a word, and the word is “Om”

While “Om” (pronounced or hummed “ahh-ummmm”) traditionally is a sacred mantra of Hindu, Jain and Buddhist religions, it also may be the mantra of astronomers as they unravel new secrets of the Milky Way and, more broadly, the Universe. I suspect that completing the ATLASGAL Survey of the Milky Way was an “Om” moment for the many participants in the ESO Consortium effort.


VBB-3, the World’s Most Powerful Electric Car, will Challenge the Land Speed Record in 2016

Peter Lobner

Updated 2 January 2017

Venturi Buckeye Bullet-3 (VBB-3) is an all-electric, four wheel drive, land speed record (LSR) car that has been designed to exceed 400 mph (643.7 km/h). The organizations involved in this project are:

  • Venturi Automobiles:

This Monaco-based company is a leader in the field of high performance electric vehicles. Read more at the Venturi website at the following link:

http://en.venturi.fr/challenges/world-speed-records

  • Ohio State University (OSU) Center for Automotive Research (CAR):

OSU’s CAR has been engaged in all-electric LSR development and testing since 2000. On 3 October 2004 at the Bonneville Salt Flats in Utah, the original nickel-metal hydride (NiMH) battery-powered Buckeye Bullet reached a top speed of 321.834 mph (517.942 km/h).

In an on-going program known as Mission 01, started in 2009, OSU partnered with Venturi to develop, test, and conduct the land speed record runs of the hydrogen fuel cell-powered VBB-2, the battery-powered VBB-2.5, and the more powerful battery-powered VBB-3. Read more at the OSU / CAR website at the following link:

https://car.osu.edu/search/node/VBB-3

The Venturi – OSU team’s accomplishments to date are:

  • 2009:  The team’s first world land speed record was achieved on the Bonneville Salt Flats with the hydrogen fuel cell-powered VBB-2 at 303 mph (487 km/h).
  • 2010:  The team returned to the salt flats with the 700 hp lithium-ion battery-powered VBB-2.5, which set another world record at 307 mph (495 km/h), with a top speed of 320 mph (515 km/h).
  • 2013:  The 3,000 hp lithium iron phosphate battery-powered VBB-3 was unveiled. Due to flooding of the Bonneville Salt Flats, the FIA and the organizers of the world speed records program cancelled the 2013 competition.
  • 2014 – 2015:  Poor track conditions at Bonneville persisted after flooding from summer storms. Abbreviated test runs by VBB-3 yielded a world record in its category (electric vehicle over 3.5 metric tons) with an average speed of 212 mph (341 km/h) and a top speed of 270 mph (435 km/h).

You will find a comparison of the VBB-2, VBB-2.5 and VBB-3 vehicles at the following link:

http://en.vbb3.venturi.fr/about/the-car

VBB-3 has a 37.2 ft. (11.35 meter) long, slender, space frame chassis that houses eight battery packs with a total of 2,000 cells, two 1,500 hp AC induction motors developed by Venturi for driving the front and rear wheels, a coolant system for the power electronics, disc brakes and a braking parachute, and a small cockpit for the driver. The basic internal arrangement of these components in the VBB-3 chassis is shown in the following diagram.

VBB-3 internal arrangement. Source: Venturi

You can see a short video of a test drive of VBB-3 without its external skin at the following link:

http://en.vbb3.venturi.fr

The exterior aerodynamic carbon fiber shell was designed with the aid of the OSU Supercomputer Center to minimize vehicle drag and lift.

VBB-3 skin. Source: Venturi

The completed VBB-3 with members of the project team is shown below.

VBB-3 complete. Source: Venturi

A good video showing the 2010 VBB-2.5 record run and a 2014 test run of VBB-3 is at the following link:

https://www.youtube.com/watch?v=KLn07Y-t1Xc&ebc=ANyPxKqkVxPKQWnYXzUemRbE5WWlRIJUbaXA-UN6XPNoiDZG1O4NsFq8RE08QlrfdbfkxKmE32MEf5g2Qw0_WQbFXBvKYz9qwg

VBB-3 currently is being prepared in the OSU / CAR workshop in Columbus, Ohio, for another attempt at the land speed record in summer 2016. A team of about 25 engineers and students is planning to be at the Bonneville Salt Flats in summer 2016 with the goal of surpassing 372 mph (600 km/h).

You can subscribe to Venturi news releases on VBB-3 at the following link:

http://en.venturi.fr/news/the-vbb-3-gets-ready

VBB-3 at Bonneville. Source: Venturi

Update 2 January 2017: VBB-3 sets new EV land speed record

On 19 September 2016, VBB-3 set an electric vehicle (Category A, Group VIII, Class 8) land speed record of 341.4 mph (549 km/h), during a two-way run within one hour on the Bonneville Salt Flats in Utah. You can read the OSU announcement at the following link:

https://news.osu.edu/news/2016/09/21/ohio-states-all-electric-venturi-buckeye-bullet-3-sets-new-landspeed-record/

You also can watch a short video of VBB-3’s record run at the following link:

https://www.youtube.com/watch?v=rIqT4qLtGcY

Certification of this EV speed record by the Fédération Internationale de l’Automobile (FIA) is still pending.

The Venturi-OSU team believes VBB-3 has the capability to achieve 435 mph (700 km/h) in the right conditions, so we can expect more record attempts in the future.

Dispatchable Power from Energy Storage Systems Helps Maintain Grid Stability

Peter Lobner

On 3 March 2016, Mitsubishi Electric Corporation announced the delivery of the world’s largest energy storage system, which has a rated output of 50 MW and a storage capacity of 300 MWh. The battery-based system is installed in Japan at Kyushu Electric Power Company’s Buzen Power Plant as part of a pilot project to demonstrate the use of high-capacity energy storage systems to balance supply and demand on a grid that has significant, weather-dependent (intermittent), renewable power sources (i.e., solar and/or wind turbine generators). This system offers energy-storage and dispatch capabilities similar to those of a pumped hydro facility. You can read the Mitsubishi press release at the following link:

http://www.mitsubishielectric.com/news/2016/pdf/0303-b.pdf

The energy storage system and associated electrical substation installation at Buzen Power Plant are shown below. The energy storage system is comprised of 63 four-module units, where each module contains sodium-sulfur (NaS) batteries with a rated output of 200 kW. The modules are double-stacked to reduce the facility’s footprint and cost.

Buzen Power Plant, Japan. Source: Mitsubishi

The following simplified diagram shows how the Mitsubishi grid supervisory control and data acquisition (SCADA) system matches supply with variable demand on a grid with three dispatchable energy sources (thermal, pumped hydro and battery storage) and one non-dispatchable (intermittent) energy source (solar photovoltaic, PV). As demand varies through the day, thermal power plants can maneuver (within limits) to meet increasing load demand, supplemented by pumped hydro and battery storage to meet peak demands and to respond to the short-term variability of power from PV generators. A short-term power excess is used to recharge the batteries. Pumped hydro typically is recharged overnight, when the system load demand is lower.

Mitsubishi SCADA

Above diagram: Mitsubishi BLEnDer® RE Battery SCADA System (Source: Mitsubishi)
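The balancing behavior in the diagram can be sketched as a simple hourly dispatch loop: ramp-limited thermal generation follows the load net of PV, and battery storage absorbs the residual, discharging to cover shortfalls and recharging on excess. This toy model only illustrates the principle; it is not Mitsubishi’s BLEnDer algorithm, and every number in it is made up.

```python
# Toy hourly dispatch loop: thermal follows demand net of PV within a ramp
# limit; the battery covers the residual within its power and energy limits.
# Illustration of the balancing principle only -- all values are invented.

def dispatch(demand_mw, pv_mw, thermal_max=300.0, ramp=50.0,
             batt_power=50.0, batt_energy_mwh=300.0):
    thermal = 250.0                        # initial thermal output (MW)
    soc = batt_energy_mwh / 2.0            # battery starts half-charged (MWh)
    for d, pv in zip(demand_mw, pv_mw):
        net = d - pv                                       # load left after PV
        target = min(max(net, 0.0), thermal_max)
        thermal += max(-ramp, min(ramp, target - thermal)) # ramp-rate limit
        residual = net - thermal                           # + shortfall, - excess
        batt = max(-batt_power, min(batt_power, residual)) # power limit (MW)
        batt = min(batt, soc)                              # can't discharge below empty
        batt = max(batt, soc - batt_energy_mwh)            # can't charge past full
        soc -= batt                                        # 1-hour step: MW ~ MWh
        # any remaining shortfall would be unserved load in this toy model
        print(f"demand={d:5.0f}  pv={pv:4.0f}  thermal={thermal:5.0f}  "
              f"battery={batt:+5.0f}  soc={soc:5.0f} MWh")

dispatch(demand_mw=[300, 350, 420, 400, 200],   # made-up hourly profile (MW)
         pv_mw=[0, 60, 120, 40, 150])
```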

Battery storage is only one of several technologies available for grid-connected energy storage systems. You can read about the many other alternatives in the December 2013 Department of Energy (DOE) report, “Grid Energy Storage”, which you can download at the following link:

http://www.sandia.gov/ess/docs/other/Grid_Energy_Storage_Dec_2013.pdf

This 2013 report includes the following figure, which shows the rated power of U.S. grid storage projects, including announced projects.

U.S. 2013 grid storage projects. Source: DOE

As you can see, battery storage systems, such as the Mitsubishi system at Buzen Power Plant, comprise only a small fraction of grid-connected energy storage systems, which currently are dominated in the U.S. by pumped hydro systems. DOE reported that, as of August 2013, there were 202 energy storage systems deployed in the U.S. with a total installed power rating of 24.6 GW. Energy storage capacity (i.e., GWh) was not stated. In contrast, total U.S. installed generating capacity in 2013 was over 1,000 GW, so fully-charged storage systems can support about 2.4% of the nation’s load demand for a short period of time.

Among DOE’s 2013 strategic goals for grid energy storage systems are the following cost goals:

  • Near-term energy storage systems:
    • System capital cost: < $1,750/kW; < $250/kWh
    • Levelized cost: < 20¢ / kWh / cycle
    • System efficiency: > 75%
    • Cycle life: > 4,000 cycles
  • Long-term energy storage systems:
    • System capital cost: < $1,250/kW; < $150/kWh
    • Levelized cost: < 10¢ / kWh / cycle
    • System efficiency: > 80%
    • Cycle life: > 5,000 cycles

Using the DOE near-term cost goals, we can estimate the cost of the energy storage system at the Buzen Power Plant to be in the range from $75 – 87.5 million. DOE estimated that the storage devices contributed 30 – 40% of the cost of an energy storage system.  That becomes a recurring operating cost when the storage devices reach their cycle life limit and need to be replaced.
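For reference, here is the bounding arithmetic behind that $75 – 87.5 million estimate, applying the DOE near-term capital cost goals to the Buzen system’s 50 MW / 300 MWh rating (the actual project cost was not published):

```python
# Reproduce the cost range above from the DOE near-term capital cost goals
# (< $1,750/kW and < $250/kWh) and the Buzen system rating.
power_kw = 50_000         # 50 MW rated output
energy_kwh = 300_000      # 300 MWh storage capacity

power_based = power_kw * 1_750    # -> $87.5 million
energy_based = energy_kwh * 250   # -> $75.0 million

print(f"power-based estimate:  ${power_based / 1e6:.1f} million")
print(f"energy-based estimate: ${energy_based / 1e6:.1f} million")
```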

The Energy Information Agency (EIA) defines capacity factor as the ratio of a generator’s actual generation over a specified period of time to its maximum possible generation over that same period of time. EIA reported the following installed generating capacities and capacity factors for U.S. wind and solar generators in 2015:

U.S. renewable power capacity and capacity factors, 2015. Source: EIA
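EIA’s definition reduces to a one-line formula: capacity factor = actual generation / (rated power x hours in the period). The following sketch applies it to a made-up 100 MW wind farm; the numbers are illustrative and are not taken from the EIA table above.

```python
# Capacity factor per the EIA definition above. Illustrative numbers only.
def capacity_factor(actual_generation_mwh: float,
                    rated_power_mw: float,
                    hours: float) -> float:
    """Actual generation divided by maximum possible generation."""
    return actual_generation_mwh / (rated_power_mw * hours)

# A 100 MW wind farm producing 280,000 MWh over one year (8,760 hours):
print(f"capacity factor = {capacity_factor(280_000, 100, 8_760):.1%}")  # ~32%
```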

Currently there are 86 GW of intermittent power sources connected to the U.S. grid and that total is growing year-on-year. As shown below, EIA expects 28% growth in solar generation and 16% growth in wind generation in the U.S. in 2016.

Source: EIA

We need dispatchable grid storage systems because of the proliferation of grid-connected intermittent generators and the need for grid operators to manage grid stability regionally and across the nation.

California’s Renewables Portfolio Standard (RPS) Program has required that utilities procure 33% of their electricity from “eligible renewable energy resources” by 2020. On 7 October 2015, Governor Jerry Brown signed into law a bill (SB 350) that increased this goal to 50% by 2030. There is no concise definition of “eligible renewable energy resources,” but you can get a good understanding of this term in the 2011 California Energy Commission guidebook, “Renewables Portfolio Standard Eligibility – 4th Edition,” which you can download at the following link:

http://www.energy.ca.gov/2010publications/CEC-300-2010-007/CEC-300-2010-007-CMF.PDF

The “eligible renewable energy resources” include solar, wind, and other resources, several of which would not be intermittent generators.

In 2014, the installed capacity of California’s 1,051 in-state power plants (greater than 0.1 megawatts, MW) was 86.9 GW. These plants produced 198,908 GWh of electricity in 2014. An additional 97,735 GWh (about 33%) was imported from out-of-state generators, yielding a 2014 statewide total consumption of almost 300,000 GWh. By 2030, 50% of total generation is mandated to be from “eligible renewable energy resources,” and a good fraction of those resources will be operating intermittently at average capacity factors in the range from 22 – 33%.

The rates we pay as electric power customers in California already are among the highest in the nation, largely because of the Renewables Portfolio Standard (RPS) Program. With the higher targets for 2030, we soon will be paying even more for the deployment, operation and maintenance of massive new grid-connected storage infrastructure that will be needed to keep the state and regional grids stable.

How Long Does it Take to Certify a Commercial Airliner?

Peter Lobner

After designing, developing, and manufacturing a new commercial airliner, I’m sure the airframe manufacturer has a big celebration on the occasion of the first flight. The ensuing flight test and ground static test programs are intended to validate the design, operating envelope, and maintenance practices, and to satisfy the requirements of the national certifying body, which in the U.S. is the Federal Aviation Administration (FAA). Meanwhile, airlines that have ordered the new aircraft are planning for its timely delivery and introduction into scheduled revenue service.

The time between first flight and first delivery of a new commercial airliner is not a set period of time. As you can see in the following chart, which was prepared by Brian Bostick (http://aviationweek.com/thingswithwings), there is great variability in the time it takes to get an airliner certified and delivered.

Time to certify an airliner

In this chart, the Douglas DC-9 has the record for the shortest certification period (205 days) with certification in November 1965. The technologically advanced supersonic Concorde had one of the longest certification periods (almost 2,500 days), with authorization in February 1976 to conduct a 16-month demonstration period with flights between Europe and the U.S. before starting regular commercial service.

The record for the longest certification period goes to the Chinese Comac ARJ21 twin-jet airliner, which is the first indigenous airliner produced in China. The first ARJ21 was delivered to a Chinese airline in November 2015. The ARJ21 is based on the DC-9 and reuses tooling provided by McDonnell Douglas for the licensed production of the MD-80 (a DC-9 variant) in China. I suspect that the very long certification period is a measure of the difficulty in establishing the complete aeronautical infrastructure needed to deliver an indigenous commercial airliner with an indigenous jet engine.

In the chart, compare the certification times for the following similar commercial airliners:

  • Four-engine, single aisle, long-range airliners: Boeing 707 (shortest), Douglas DC-8, Convair CV-880, Vickers VC-10, De Havilland Comet (longest)
  • Three-engine, single aisle, medium range airliners: Boeing 727 (shorter), Hawker Siddeley Trident (longer)
  • Two-engine, single aisle airliners: Douglas DC-9 (shortest), Boeing 737, Boeing 757, Airbus A320, British Aircraft Corporation BAC 1-11, Dassault Mercure, Caravelle (longest)
  • Two-engine, single aisle, short range regional jets: Embraer ERJ 145 (shortest), Bombardier CRJ-100, BAe 146, Fokker F-28, ERJ 170, Bombardier CS Series, Mitsubishi MRJ, Sukhoi Superjet, VFW-614, Comac ARJ21 (longest)
  • Four-engine, wide-body, long-range airliners: Boeing 747, Airbus A340, Airbus A380 (longest)
  • Three-engine, wide-body, long-range airliners: Douglas DC-10 (shorter), Lockheed L-1011 (longer)
  • Two-engine, wide-body airliners: Boeing 767 (shortest), Boeing 777, Airbus A350, Airbus A300, Boeing 787 (longest)

Time is money, so there is tremendous economic value in minimizing the time between first flight and first delivery. The first 16 aircraft at the top of the chart all enjoyed relatively short certification periods. This group, which includes many aircraft that appeared in the 1960s – 70s, averaged about 400 days between first flight and first delivery.

More modern aircraft (blue bars in the chart representing aircraft appearing in 2000 or later) have been averaging about 800 days between first flight and first delivery (excluding ARJ21).

Solar Impulse 2 Preparing for the Next Leg of its Around-the-World Journey

Peter Lobner

In my 10 March 2015 post, I provided basic information about the remarkable Solar Impulse 2 aircraft and its mission to be the first aircraft to fly around the world on solar power. On 10 July 2015, I posted a summary of the first eight legs of the around-the-world flight, which started in Abu Dhabi on 9 March 2015 and ended on 3 July 2015 at Kalaeloa, a small airport outside Honolulu, Hawaii.

After arriving in Hawaii, the Solar Impulse team determined that the batteries had been damaged due to overheating on the first day of the Leg 8 flight and would have to be replaced. Solar Impulse reported the following root cause for the overheating:

“Since the plane had been exposed to harsh weather conditions from Nanjing to Nagoya, we decided to do a test flight before leaving for Hawaii. Having to perform a test flight followed by a mission flight had not been taken into account in the design process of the battery system, which did not allow the batteries to cool down in between the two” (flights).

By November 2015, the Solar Impulse engineers had upgraded the design of the whole battery system and integrated a battery cooling system. You can read the details on the Solar Impulse website at the following link:

http://blog.solarimpulse.com/post/133346944960/cool-batteries-solarimpulse

A further delay in starting Leg 9 was caused by the seasonal shortening of daylight hours in the Northern hemisphere. The late autumn and winter daylight hours weren’t long enough to allow the batteries to be fully recharged during the day along the planned route to the U.S. mainland and back to Abu Dhabi.

Solar Impulse 2 routeSource: Solar Impulse

On 26 February 2016, the upgraded Solar Impulse 2 made a successful “maintenance” flight in Hawaii. The flight lasted 93 minutes, reached an altitude of 8,000 feet (2,400 meters), and included tests of the stabilization and battery cooling systems.

Solar Impulse is planning to restart its around-the-world journey on 20 April 2016.

Solar Impulse composite photo over HawaiiSource: Solar Impulse

You can subscribe to news releases from the Solar Impulse team at the following link:

http://www.solarimpulse.com/subscribe