Many LLNL Atmospheric Nuclear Test Videos Declassified

Peter Lobner

Lawrence Livermore National Laboratory (LLNL) has posted 64 declassified videos of nuclear weapons tests on YouTube. LLNL reports:

“The U.S. conducted 210 atmospheric nuclear tests between 1945 and 1962, with multiple cameras capturing each event at around 2,400 frames per second. But in the decades since, around 10,000 of these films sat idle, scattered across the country in high-security vaults. Not only were they gathering dust, the film material itself was slowly decomposing, bringing the data they contained to the brink of being lost forever.

For the past five years, Lawrence Livermore National Laboratory (LLNL) weapon physicist Greg Spriggs and a crack team of film experts, archivists and software developers have been on a mission to hunt down, scan, reanalyze and declassify these decomposing films. The goals are to preserve the films’ content before it’s lost forever, and provide better data to the post-testing-era scientists who use computer codes to help certify that the aging U.S. nuclear deterrent remains safe, secure and effective.”

Operation Hardtack-1 – Nutmeg 51538. Source: LLNL

Here’s the link:

https://www.youtube.com/playlist?list=PLvGO_dWo8VfcmG166wKRy5z-GlJ_OQND5

Update 7 July 2018:

LLNL has posted more than 250 declassified videos of nuclear weapons tests on YouTube.  The newly digitized videos document several of the U.S. government’s 210 nuclear weapons tests carried out between 1945 and 1962.  You’ll find these videos at the following link:

https://www.youtube.com/user/LivermoreLab/videos

The Event Horizon Telescope

Peter Lobner

The Event Horizon Telescope (EHT) is a huge synthetic array for Very Long Baseline Interferometry (VLBI), which is created through the collaboration of millimeter / submillimeter wave radio telescopes and arrays around the world. The goal of the EHT “is to directly observe the immediate environment of a black hole with angular resolution comparable to the event horizon.”

The primary target for observation is Sagittarius A* (Sgr A*), the supermassive black hole at the center of our Milky Way galaxy. This target is of particular interest to the EHT team because it “presents the largest apparent event horizon size of any black hole candidate in the Universe.” The Sgr A* event horizon is estimated to have a Schwarzschild radius of 12 million kilometers (7.46 million miles), or a diameter of 24 million km (14.9 million miles). The galactic core (and hence Sgr A*) is estimated to be 7.6 to 8.7 kiloparsecs (about 25,000 to 28,000 light years, or 1.47 to 1.64e+17 miles) from Earth. At that distance, the Sgr A* black hole subtends an angle of about 2e-5 arcseconds (20 microarcseconds).
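
As a sanity check on that last figure, the small-angle approximation (angle in radians ≈ diameter / distance) reproduces it directly from the numbers above. Here is a minimal sketch in Python, using the near end of the quoted distance range:

```python
import math

LY_TO_KM = 9.461e12   # kilometers per light year

def subtended_angle_arcsec(diameter_km, distance_km):
    """Small-angle approximation: theta (radians) ~ diameter / distance."""
    theta_rad = diameter_km / distance_km
    return math.degrees(theta_rad) * 3600.0   # radians -> arcseconds

horizon_diameter_km = 24e6        # twice the Schwarzschild radius quoted above
distance_km = 25000 * LY_TO_KM    # near end of the quoted distance range

print(f"{subtended_angle_arcsec(horizon_diameter_km, distance_km):.1e} arcsec")
# -> ~2.1e-05 arcsec, i.e., about 20 microarcseconds
```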

Another EHT target of interest is a much more distant black hole in the Messier 87 (M87) galaxy.

The member arrays and telescopes supporting EHT are:

  • Arizona Radio Observatory / Submillimeter Wave Telescope (ARO/SMT, Arizona, USA)
  • Atacama Pathfinder EXperiment (APEX, Chile)
  • Atacama Submillimeter Telescope Experiment (ASTE, Chile)
  • Combined Array for Research in Millimeter-wave Astronomy (CARMA, California, USA)
  • Caltech Submillimeter Observatory (Hawaii, USA)
  • Institut de Radioastronomie Millimétrique (IRAM, Spain)
  • James Clerk Maxwell Telescope (JCMT, Hawaii)
  • Large Millimeter Telescope Alfonso Serrano (LMT, Mexico)
  • The Submillimeter Array (Hawaii, USA)

The following arrays and telescopes are expected to join the EHT collaboration:

  • Atacama Large Millimeter / submillimeter Array (ALMA, Chile)
  • Northern Extended Millimeter Array (NOEMA, France)
  • South Pole Telescope (SPT, Antarctica)

Collectively, the arrays and telescopes forming the EHT provide a synthetic aperture that is almost equal to the diameter of the Earth (12,742 km, 7,918 miles).

EHT array size. Source: graphics adapted by A. Cuadra / Science; data from Event Horizon Telescope

Technical improvements to the member telescopes and arrays are underway with the goal of systematically improving EHT performance. These improvements include development and deployment of:

  • Submillimeter dual-polarization receivers (energy content of cosmic radiation is split between two polarizations)
  • Highly stable frequency standards to enable VLBI at frequencies from 230 to 450 GHz (wavelengths of 1.3 mm – 0.6 mm)
  • Higher-bandwidth digital VLBI backends and recorders

In operations to date, EHT has been observing the Sgr A* and M87 black holes at 230 GHz (1.3 mm) with only some of the member arrays and telescopes participating. These observations have yielded angular resolutions better than 60 microarcseconds. Significantly finer angular resolution, approaching 15 microarcseconds, is expected from the mature EHT operating at higher observing frequencies and with longer baselines.

Coordinating observing time among all of the EHT members is a challenge, since participation in EHT is not a dedicated mission for any site. Site-specific weather also is a factor, since water in the atmosphere absorbs radiation in the EHT observing frequency bands. The next observing opportunity is scheduled for 5 – 14 April 2017. Processing the data from this observing run will take time, so results are not expected until later this year.

For more information on EHT, see the 2 March 2017 article by Daniel Clery entitled, “This global telescope may finally see the event horizon of our galaxy’s giant black hole,” at the following link:

http://www.sciencemag.org/news/2017/03/global-telescope-may-finally-see-event-horizon-our-galaxys-giant-black-hole?utm_campaign=news_daily_2017-03-02&et_rid=215579562&et_cid=1194555

Much more information is available on the EHT website at the following link:

http://www.eventhorizontelescope.org

Radio telescope resolution

An article on the Las Cumbres Observatory (LCO) website explains how the angular resolution of radio telescopes, including VLBI arrays, is determined. In this article, the author, D. Stuart Lowe, states that “an array of radio telescopes of 217 km in diameter can produce an image with a resolution equivalent to the Hubble Space Telescope.” You’ll find this article here:

https://lco.global/spacebook/radio-telescopes/

The Hubble Space Telescope has an angular resolution of 1/10th of an arcsecond (1e-1 arcsecond).

A VLBI array with the diameter of the Earth (1.27e+7 meters) operating in the EHT’s millimeter / submillimeter wavelength band (1.3e-3 to 6.0e-4 meters) has a theoretical angular resolution of 2.6e-5 to 1.2e-5 arcseconds (about 26 to 12 microarcseconds).
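
These figures follow from the Rayleigh diffraction limit, θ ≈ 1.22 λ/D. Here is a short Python sketch; the 1.22 factor is my assumption, chosen because it reproduces the numbers quoted above:

```python
import math

RAD_TO_ARCSEC = math.degrees(1.0) * 3600.0   # ~206,265 arcseconds per radian

def resolution_arcsec(wavelength_m, aperture_m):
    """Rayleigh diffraction limit for a filled aperture or VLBI baseline."""
    return 1.22 * wavelength_m / aperture_m * RAD_TO_ARCSEC

earth_diameter_m = 1.27e7                 # longest Earth-bound baseline, m
for wavelength_m in (1.3e-3, 6.0e-4):     # EHT band: 1.3 mm and 0.6 mm
    theta = resolution_arcsec(wavelength_m, earth_diameter_m)
    print(f"{wavelength_m * 1e3:.1f} mm -> {theta:.1e} arcsec")
# -> 2.6e-05 and 1.2e-05 arcsec (about 26 and 12 microarcseconds)

# Implied wavelength behind the LCO claim that a 217 km array matches
# Hubble's 0.1 arcsec resolution (the LCO article does not state it):
print(f"{(0.1 / RAD_TO_ARCSEC) * 217e3 / 1.22:.3f} m")   # ~0.09 m (radio)
```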

EHT should be capable of meeting its goal of angular resolution comparable to a black hole’s event horizon.

X-ray observation of Sgr A*

Combining infrared images from the Hubble Space Telescope with X-ray images from the Chandra X-ray Observatory, NASA created the following composite image showing the galactic core in the vicinity of Sgr A*. NASA reports:

“The large image contains X-rays from Chandra in blue and infrared emission from the Hubble Space Telescope in red and yellow. The inset shows a close-up view of Sgr A* in X-rays only, covering a region half a light year wide. The diffuse X-ray emission is from hot gas captured by the black hole and being pulled inwards.”

This image gives you a perspective on the resolution of Sgr A* possible at X-ray frequencies with current equipment. EHT will have much higher resolution in its radio frequency bands.

NASA Sgr A* pic. Source: X-Ray: NASA/UMass/D.Wang et al., IR: NASA/STScI

More details on this image are available at the following NASA link:

https://www.nasa.gov/mission_pages/chandra/multimedia/black-hole-SagittariusA.html

Animation of Sgr A* effects on nearby stars

See my 24 January 2017 post, “The Black Hole at our Galactic Center is Revealed Through Animations,” for more information on how teams of astronomers are developing a better understanding of the unseen Sgr A* black hole through long-term observations of the relative motions of nearby stars that are under the influence of this black hole.  These observations have been captured in a very interesting animation.

The First Test of Standard and Holographic Cosmology Models Ends in a Draw

Peter Lobner

Utrecht University (Netherlands) Professor Gerard ’t Hooft was the first to propose the “holographic principle,” in which all information about a volume of space can be thought of as being encoded on a lower-dimensional “boundary” of that volume.

Stanford Professor Leonard Susskind was one of the founders of string theory and, in 1995, developed the first string-theory application of the holographic principle to black holes. Dr. Susskind’s analysis showed that, consistent with quantum theory, information is not lost when matter falls into a black hole. Instead, it is encoded on a lower-dimensional “boundary” of the black hole, namely the event horizon.
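
The quantitative statement behind this idea is the Bekenstein-Hawking entropy, a standard result (not derived in the videos linked below) in which a black hole’s information content scales with the area A of its event horizon rather than with the enclosed volume:

$$
S_{BH} = \frac{k_B c^3 A}{4 G \hbar} = \frac{k_B A}{4 \ell_P^2},
\qquad
\ell_P = \sqrt{\frac{G \hbar}{c^3}} \approx 1.6 \times 10^{-35}\ \text{m}
$$

Roughly speaking, the horizon stores about one bit per patch of Planck area, which is the sense in which the information about the interior can be encoded on the two-dimensional boundary.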

Black hole event horizon. Source: screenshot from video, “Is the Universe a Hologram?”

Extending the holographic principle to the universe as a whole, a lower-dimensional “boundary,” or “cosmic horizon,” around the universe can be thought of as a hologram of the universe. Quantum superposition suggests that this hologram is indistinguishable from the volume of space within the cosmic horizon.

You can see a short (15:49) 2015 video interview of Dr. Susskind, “Is The Universe A Hologram?” at the following link:

https://www.youtube.com/watch?v=iNgIl-qIklU

If you have the time, also check out the longer (55:26) video lecture by Dr. Susskind entitled, “Leonard Susskind on The World As Hologram.” In this video, he explains the meaning of “information” and how information on an arbitrary volume of space can be encoded in one less dimension on a surface surrounding the volume.

https://www.youtube.com/watch?v=2DIl3Hfh9tY

You also might enjoy the more detailed story in Dr. Susskind’s 2008 book, “The Black Hole War: My Battle with Stephen Hawking to Make the World Safe for Quantum Mechanics.”

Leonard Susskind book cover. Source: Little, Brown and Company

In my 28 September 2016 post, “The Universe is Isotropic,” I reported on a conclusion reached by researchers using data from the Planck spacecraft’s all-sky survey of the cosmic microwave background (CMB). The researchers noted that an anisotropic universe would leave telltale patterns in the CMB. However, these researchers found that the actual CMB shows only random noise and no signs of such patterns.

More recently, a team of researchers from Canada, the UK and Italy, also using the Planck spacecraft’s CMB data set, has offered an alternative view that the universe may be a hologram. You’ll find the abstract for the 27 January 2017 original research paper by N. Afshordi, et al., “From Planck Data to Planck Era: Observational Tests of Holographic Cosmology,” in Physical Review Letters at the following link:

http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.118.041301

The authors note:

“We test a class of holographic models for the very early Universe against cosmological observations and find that they are competitive to the standard cold dark matter model with a cosmological constant (Λ CDM) of cosmology.”

“Competitive” means that neither model disproves the other.  So, we have a draw.

If you are a subscriber to Physical Review Letters, you can download the complete paper by N. Afshordi, et al. from the Physical Review Letters site.

5G Wireless Defined

Peter Lobner

In my 20 April 2016 post, “5G is Coming, Slowly,” I discussed the evolution of mobile communications technology and the prospects for the deployment of the next generation: 5G. The complexity of 5G service relative to current generation 4G (LTE) service is daunting because of rapidly increasing technical demands that greatly exceed LTE core capabilities. Examples of technical drivers for 5G include the population explosion in the Internet of Things (IoT), the near-term deployment of operational self-driving cars, and the rise of virtual and augmented reality mobile applications.

Progress toward 5G is steadily being made. Here’s a status update.

1. International Telecommunications Union (ITU) technical performance requirements

The ITU is responsible for international standardization of mobile communications technologies. On 23 February 2017, the ITU released a draft report containing their current consensus definition of the minimum technical performance requirements for 5G wireless (IMT-2020) radio service.

The ITU authors note:

“….the capabilities of IMT-2020 are identified, which aim to make IMT-2020 more flexible, reliable and secure than previous IMT when providing diverse services in the intended three usage scenarios, including enhanced mobile broadband (eMBB), ultra-reliable and low-latency communications (URLLC), and massive machine type communications (mMTC).”

The ITU’s draft technical performance requirements report is a preliminary product of the second stage of the ITU’s standardization process for 5G wireless deployment, which is illustrated below:

ITU-IMT2020 roadmap crop

Source: ITU

The draft technical performance requirements report provides technical definitions and performance specifications in each of the following categories (a short summary of several headline values follows the list):

  • Peak data rate
  • Peak spectral efficiency (bits per hertz of spectrum)
  • User experience data rate
  • 5th percentile user spectral efficiency
  • Average spectral efficiency
  • Area traffic capacity
  • Latency
  • Connection density
  • Energy efficiency
  • Reliability
  • Mobility
  • Mobility interruption time
  • Bandwidth
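
Here is a compact summary of several headline minimum values from the draft. The first three are quoted in the Ars Technica article linked below; the last two are my reading of the ITU draft and should be verified against the source document:

```python
# Selected headline minimums from the draft IMT-2020 requirements.
# The first three values are quoted in the Ars Technica article below;
# the last two are my reading of the ITU draft (verify at the source).
IMT2020_MINIMUMS = {
    "peak_data_rate_downlink_gbps": 20,         # per device, ideal conditions
    "user_plane_latency_urllc_ms": 1,           # ultra-reliable low latency
    "connection_density_per_km2": 1_000_000,    # massive machine-type comms
    "user_experienced_data_rate_dl_mbps": 100,  # dense urban scenario
    "mobility_kmh": 500,                        # e.g., high-speed rail
}

for requirement, value in IMT2020_MINIMUMS.items():
    print(f"{requirement}: {value:,}")
```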

You’ll find a good overview of the ITU’s draft performance requirements in an article by Sebastian Anthony entitled, “5G Specs Announced: 20 Gbps download, 1 ms latency, 1M devices per square km,” at the following link:

https://arstechnica.com/information-technology/2017/02/5g-imt-2020-specs/?utm_source=howtogeek&utm_medium=email&utm_campaign=newsletter

You can download the ITU’s draft report, entitled “DRAFT NEW REPORT ITU-R [IMT-2020 TECH PERF REQ] – Minimum requirements related to technical performance for IMT-2020 radio interface(s),” at the following link:

https://www.itu.int/md/R15-SG05-C-0040/en

In the ITU standardization process diagram, above, you can see that their final standardization documents will not be available until 2019 – 2020.

2. Industry 5G activities

Meanwhile, the wireless telecommunications industry isn’t waiting for the ITU to finalize IMT-2020 before developing and testing 5G technologies and making initial 5G deployments.

3rd Generation Partnership Project (3GPP)

In February 2017, the organization 5G Americas summarized the work by 3GPP as follows:

“As the name implies the IMT-2020 process is targeted to define requirements, accept technology proposals, evaluate the proposals and certify those that meet the IMT-2020 requirements, all by the 2020 timeframe. This however, requires that 3GPP start now on discussing technologies and system architectures that will be needed to meet the IMT-2020 requirements. 3GPP has done just that by defining a two phased 5G work program starting with study items in Rel-14 followed by two releases of normative specs spanning Rel-15 and Rel-16 with the goal being that Rel-16 includes everything needed to meet IMT-2020 requirements and that it will be completed in time for submission to the IMT-2020 process for certification.”

The 2016 3GPP timeline for development of technologies and system architectures for 5G is shown below.

3GPP roadmap 2016

Source: 3GPP / 5G Americas White Paper

Details are presented in the 3GPP / 5G Americas white paper, “Wireless Technology Evolution Towards 5G: 3GPP Releases 13 to Release 15 and Beyond,” which you can download at the following link:

http://www.5gamericas.org/files/6814/8718/2308/3GPP_Rel_13_15_Final_to_Upload_2.14.17_AB.pdf

Additional details are in a February 2017 3GPP presentation, “Status and Progress on Mobile Critical Communications Standards,” which you can download here:

http://www.3gpp.org/ftp/Information/presentations/Presentations_2017/CCE-2017-3GPP-06.pdf

In this presentation, you’ll find the following diagram that illustrates the many functional components that will be part of 5G service. The “Future IMT” in the pyramid below is the ITU’s IMT-2020.

ITU 5G functions

Source: 3GPP presentation

AT&T and Verizon plan initial deployments of 5G technology

In November 2016, AT&T and Verizon indicated that their initial deployment of 5G technologies would be in fixed wireless broadband services. In this deployment concept, a 5G wireless cell would replace IEEE 802.11 wireless or wired routers in a small coverage area (i.e., a home or office) and connect to a wired / fiber terrestrial broadband system. Verizon CEO Lowell McAdam referred to this deployment concept as “wireless fiber.” You’ll find more information on these initial 5G deployment plans in the article, “Verizon and AT&T Prepare to Bring 5G to Market,” on the IEEE Spectrum website at the following link:

http://spectrum.ieee.org/telecom/wireless/verizon-and-att-prepare-to-bring-5g-to-market

Under Verizon’s current wireless network densification efforts, additional 4G nodes are being added to better support high-traffic areas. These nodes are closely spaced (likely 500 – 1,000 meters apart) and also may be able to support early demonstrations of a commercial 5G system.

Verizon officials previously have talked about an initial launch of 5G service in 2017, but they also have cautioned investors that this may not occur until 2018.

DARPA Spectrum Collaboration Challenge 2 (SC2)

In my 6 June 2016 post, I reported on SC2, which eventually could benefit 5G service by:

“…developing a new wireless paradigm of collaborative, local, real-time decision-making where radio networks will autonomously collaborate and reason about how to share the RF (radio frequency) spectrum.”

SC2 is continuing into 2019. Fourteen teams have qualified for Phase 3 of the competition, which will culminate in the Spectrum Collaboration Challenge Championship Event, to be held on 23 October 2019 in conjunction with the 2019 Mobile World Congress in Los Angeles, CA. You can follow SC2 news here:

https://www.spectrumcollaborationchallenge.com/media/

If SC2 is successful and can be implemented commercially, it would enable more efficient use of the RF bandwidth assigned for use by 5G systems.

3. Conclusion

Verizon’s and AT&T’s plans for early deployment of a subset of 5G capabilities are symptomatic of an industry in which the individual players are trying hard to position themselves for a future commercial advantage as 5G moves into the mainstream of wireless communications. This commercial momentum is outpacing ITU’s schedule for completing IMT-2020. The recently released draft technical performance requirements provide a more concrete (interim) definition of 5G that should remove some uncertainty for the industry.

3 April 2019 Update:  Verizon became the first wireless carrier to deliver 5G service in the U.S.

Verizon reported that it turned on its 5G networks in parts of Chicago and Minneapolis today, becoming the first wireless carrier to deliver 5G service to customers with compatible wireless devices in selected urban areas.  Other U.S. wireless carriers, including AT&T, Sprint and T-Mobile US, have announced that they plan to start delivering 5G service later in 2019.

Perspective on the Detection of Gravitational Waves

Peter Lobner

On 14 September 2015, the U.S. Laser Interferometer Gravitational-Wave Observatory (LIGO) became the first observatory to detect gravitational waves. With two separate detector sites (Livingston, Louisiana, and Hanford, Washington) LIGO was able to define an area of space from which the gravitational waves, dubbed GW150914, are likely to have originated, but was not able to pinpoint the source of the waves. See my 11 February 2016 post, “NSF and LIGO Team Announce First Detection of Gravitational Waves,” for a summary of this milestone event.

You’ll find a good overview on the design and operation of LIGO and similar laser interferometer gravity wave detectors in the short (9:06) Veritasium video, “The Absurdity of Detecting Gravitational Waves,” at the following link:

https://www.youtube.com/watch?v=iphcyNWFD10

The LIGO team reports that the Advanced LIGO detector is optimized for “a range of frequencies from 30 Hz to several kHz, which covers the frequencies of gravitational waves emitted during the late inspiral, merger, and ringdown of stellar-mass binary black holes.”

First observing run (O1) of the Advanced LIGO detector

The LIGO team defines O1 as starting on 12 September 2015 and ending on 19 January 2016. During that period, the LIGO team reported that it had, “unambiguously identified two signals, GW150914 and GW151226, with a significance of greater than 5σ,” and also identified a third possible signal, LVT151012. The following figure shows the time evolution of the respective gravitational wave signals from when they enter the LIGO detectors’ sensitive band at 30 Hz.

LIGO GW signals screenshot

Source: B. P. Abbott et al., PHYS. REV. X 6, 041015 (2016)
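
For a feel for the time scales in this figure, the leading-order (Newtonian) chirp approximation gives the time a compact binary spends in band from a given gravitational-wave frequency until merger. Here is a minimal sketch; the GW150914-like chirp mass of 30 solar masses is my illustrative assumption, not a value taken from this post:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def time_to_merger_s(f_gw_hz, chirp_mass_msun):
    """Leading-order (Newtonian) time from GW frequency f_gw to merger."""
    tau = G * chirp_mass_msun * M_SUN / C**3   # chirp mass in time units, s
    return (5.0 / 256.0) * tau**(-5.0 / 3.0) * (math.pi * f_gw_hz)**(-8.0 / 3.0)

# Assumed GW150914-like chirp mass of ~30 solar masses (illustration only):
print(f"{time_to_merger_s(30.0, 30.0):.2f} s")   # ~0.26 s in band from 30 Hz
```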

The second detection of gravitational waves, GW151226, occurred on 26 December 2015. You’ll find the 16 June 2016 LIGO press release for this event at the following link:

https://www.ligo.caltech.edu/news/ligo20160615

At the following link, you can view a video showing a simulation of GW151226, starting at a frequency of 35 Hz and continuing through the last 55 gravitational-wave cycles before the binary black holes merge:

https://www.ligo.caltech.edu/video/ligo20160615v3

GW151226 simulation screenshot. Source: Max Planck Institute for Gravitational Physics / Simulating eXtreme Spacetimes (SXS) project

In their GW151226 press release, the LIGO team goes out on a limb and makes the following estimate:

“….we can now start to estimate the rate of black hole coalescences in the Universe based not on theory, but on real observations. Of course with just a few signals, our estimate has big uncertainties, but our best right now is somewhere between 9 and 240 binary black hole coalescences per cubic Gigaparsec per year, or about one every 10 years in a volume a trillion times the volume of the Milky Way galaxy!”

More details on the GW151226 detection are available in the paper “GW151226: Observation of Gravitational Waves from a 22-Solar-Mass Binary Black Hole Coalescence,” at the following link:

https://dcc.ligo.org/public/0124/P151226/013/LIGO-P151226_Detection_of_GW151226.pdf

LIGO releases its data to the public. Analyses of the LIGO public data already are yielding puzzling results. In December 2016, researchers reported finding “echoes” in the gravitational wave signals detected by LIGO. If further analysis indicates that the “echoes” are real, they may indicate a breakdown of Einstein’s general theory of relativity at or near the “edge” of a black hole. You can read Zeeya Merali’s 9 December 2016 article, “LIGO black hole echoes hint at general relativity breakdown,” at the following link:

http://www.nature.com/news/ligo-black-hole-echoes-hint-at-general-relativity-breakdown-1.21135

Second observing run (O2) of the Advanced LIGO detector is in progress now

Following a 10-month period when they were off-line for modifications, the Advanced LIGO detectors returned to operation on 30 November 2016 with a 10% improvement in the sensitivity of their interferometers. The LIGO team intends to further improve this sensitivity by a factor of two during the next few years.
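
These sensitivity gains pay off cubically: for sources distributed uniformly in space, improving amplitude sensitivity by a factor k extends the detection range by k, and the surveyed volume, and hence the expected event rate, by k³. A quick check under that standard assumption:

```python
def event_rate_gain(sensitivity_factor):
    """Event-rate gain for a uniform source population: volume ~ range^3."""
    return sensitivity_factor ** 3

print(event_rate_gain(1.10))  # the 10% O2 improvement -> ~1.33x the event rate
print(event_rate_gain(2.0))   # the planned factor of two -> 8x the event rate
```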

VIRGO will add the capability to triangulate the source of gravitational waves

In my 16 December 2015 post, “100th Anniversary of Einstein’s General Theory of Relativity and the Advent of a New Generation of Gravity Wave Detectors,” I reported on other international laser interferometer gravitational wave detectors. The LIGO team has established a close collaboration with their peers at the European Gravitational Observatory, which is located near Pisa, Italy. Their upgraded detector, VIRGO, in collaboration with the two LIGO detectors, is expected to provide the capability to triangulate gravitational wave sources. With better location information on the source of gravitational waves, other observatories can be promptly notified to join the search using other types of detectors (i.e., optical, infrared and radio telescopes).

VIRGO is expected to become operational in 2017, but technical problems, primarily with the mirror suspension system, may delay startup. You’ll find a 16 February 2017 article on the current status of VIRGO at the following link:

http://www.sciencemag.org/news/2017/02/european-gravitational-wave-detector-falters

Perspective on gravitational wave detection

Lyncean member Dave Groce recommends the excellent video interview of Caltech Professor Kip Thorne (one of the founders of LIGO) by “Einstein” biographer Walter Isaacson. This 2 November 2016 video provides a great perspective on LIGO’s first detection of gravitational waves and on the development of gravitational wave detection capabilities. You’ll find this long (51:52) but very worthwhile video at the following link:

https://www.youtube.com/watch?v=mDFF27Nr-EU

Dr. Thorne noted that, at the extremely high sensitivity of the Advanced LIGO detectors, we are beginning to see the effects of quantum fluctuations in “human sized objects,” in particular, the 40 kg (88.2 pound) mirrors in the LIGO interferometers. In each mirror, the center of mass (the average position of all the mass in the mirror) fluctuates due to quantum physics at just the level of the Advanced LIGO noise.

In the interview, Dr. Thorne also discusses several new observatories that will become available in the following decades to expand the spectrum of gravitational waves that can be detected. These are shown in the following diagram.

Spectrum for gravitational wave detection. Source: screenshot from Kip Thorne / Walter Isaacson interview

  • LISA = Laser Interferometer Space Antenna
  • PTA = Pulsar Timing Array
  • CMB = Cosmic microwave background

See my 27 September 2016 post, “Space-based Gravity Wave Detection System to be Deployed by ESA,” for additional information on LISA.

Clearly, we’re just at the dawn of gravitational wave detection and analysis. With the advent of new and upgraded gravitational wave observatories during the next decade, there will be tremendous challenges to align theories with real data. Through this process, we’ll get a much better understanding of our Universe.

Long-duration Space Missions May Affect the Human Gut Microbiome

Peter Lobner

On 23 November 2016, Dr. Stanley Maloy gave the presentation, “Beneficial Microbes and Harmful Antibiotics,” (Talk #107) to the Lyncean Group. The focus of this presentation was on the nature of the human gut microbiome, its relationship to personal health, disruption of the gut microbiome by antibiotics and other causes, and how to restore a disrupted gut microbiome. You can find his presentation on the Past Meetings tab on the Lyncean home page or use the following direct link:

https://lynceans.org/talk-107-11232016/

In a story that’s related to Dr. Maloy’s presentation, a 3 February 2017 article by Megan Fellman entitled, “Changes in astronaut’s gut bacteria attributed to spaceflight,” provides the first results of a comparative analysis by Northwestern University researchers on changes in the gut microbiomes of NASA astronaut identical twins Scott and Mark Kelly. As part of a NASA experiment to examine the effects of long-duration space missions on humans, Scott Kelly was continuously in orbit on the International Space Station (ISS) for 340 days during 2015 – 2016, while Mark Kelly remained on Earth and served as the control subject.

Mark & Scott KellyMark (left) and Scott Kelly (right). Source: NASA

The key points reported by Northwestern University researchers were:

  • There was a shift in the balance between the two dominant groups of bacteria (Firmicutes and Bacteroidetes) in Scott Kelly’s gastrointestinal (GI) tract when he was in space. The balance returned to pre-flight levels when Scott Kelly returned to Earth.
  • Fluctuations in the same bacterial groups were seen in Mark Kelly, the control on Earth, but the fluctuations were not as great as those seen in Scott Kelly in space.
  • The surprise finding was that an expected change in diversity of gut microbes (the number of different species) was not observed in Scott Kelly while in space.
  • “Right now, we do not see anything alarming….”

You can read the complete article on this Northwestern University research at the following link:

https://news.northwestern.edu/stories/2017/february/change-in-astronauts-gut-bacteria-attributed-to-spaceflight/

So far, it looks like the human gut microbiome may not be a limiting factor in long-duration spaceflight.

Architect David Fisher’s Dynamic Skyscraper

Peter Lobner

David Fisher is the leading proponent of dynamic architecture and the inventor of the shape-changing dynamic skyscraper. The shape-changing feature clearly differentiates the dynamic skyscraper from earlier symmetrical rotating high-rise buildings like Suite Vollard, the first rotating high-rise building, a unique residential tower that opened in Brazil in 2001.

David Fisher. Source: costruzionipallotta.it

GE.DI Group

The GE.DI Group is an Italian construction firm that has become a leading proponent of new construction systems, including David Fisher’s dynamic architecture.

“GE.DI. Group (GEstione DInamica, which stands for Dynamic Management) decided in 2008 to embark on a new era of architecture: Dynamic Architecture, a project of the architect David Fisher for rotating towers that evolve continually: dynamic, ecological, made with industrial systems.”

“Fisher’s revolution puts an end to the era of static and immutable architecture and inaugurates a new one, marked by dynamism and a new lifestyle. These buildings will become the symbol of a new philosophy that will change the image of our cities and the concept of living.”

More information on GE.DI Group is available at the following link:

http://www.costruzionipallotta.it/dynamic_architecture_en.htm

Concept of a Dynamic Skyscraper

Dynamic skyscraper concept. Shape-changing rotating skyscraper. Source: costruzionipallotta.it

Three unique features of the dynamic skyscraper are:

1. Building exterior shape changes continuously: Each floor can rotate slowly through 360 degrees independently of the other floors, with control over speed and direction of rotation. Coordination of the rotating floors to produce the artistic building shapes shown above may not be implemented in some applications. Nonetheless, the building’s exterior shape now has a fourth dimension: time. The artistic possibilities of the dynamic skyscraper are shown (in time lapse) in the following 2011 video.

https://www.youtube.com/watch?v=QR2HukuFkQo

2. Prefabricated construction, except for the reinforced concrete core: After the reinforced concrete core has been completed and building services have been installed inside the core, factory manufactured prefabricated units will be transported to the construction site completely finished and will be hung from the central core. Connecting each rotating floor to electrical and water services in the stationary core will be an interesting engineering challenge. The extensive use of prefabricated construction (about 85% of total construction) greatly reduces site labor requirements, construction environmental impacts, and overall construction time. Read more on plans for prefabrication at the following link:

http://www.costruzionipallotta.it/prefabrication.htm

Building plan - dynamic skyscraper

Assembly plan for a dynamic skyscraper. Source: costruzionipallotta.it

Modular unit installation. Prefabricated modules being lifted into place. Source: costruzionipallotta.it

3. Generates its own electric power: Horizontal wind turbine generators installed in the approximately two-foot gap between the rotating floors will be the building’s primary source of power. Roof-mounted solar panels on each floor also will be employed. Surplus power will be delivered to the grid, enough to supply about five similarly sized buildings in the vicinity. Read more on the energy generating and energy saving features of the dynamic skyscraper at the following link:

http://www.costruzionipallotta.it/green_building.htm

Wind turbine installation. Source: costruzionipallotta.it

The first dynamic skyscraper may be built in Dubai

In a 14 February 2017 article entitled, “Dubai Will Have the World’s First Rotating Skyscraper by 2020,” Madison Margolin reported on the prospects for an 80-story mixed-use (office, hotel, residential) rotating skyscraper in Dubai. You can read the complete article on the Motherboard website at the following link:

https://motherboard.vice.com/en_us/article/dubai-will-have-the-worlds-first-rotating-skyscraper-by-2020?utm_source=howtogeek&utm_medium=email&utm_campaign=newsletter

Da Vinci rotating tower, Dubai. Source: http://www.slideshare.net/swapnika15/dynamic-da-vincirotating-tower

Each floor of the 420 meter (1,378 ft.) Da Vinci Tower will consist of 40 factory-built modules hung from the load-bearing 22-meter (72.2 ft.) diameter reinforced concrete core. Each module will be cantilevered up to 15 meters (49.2 ft.) from the core.

Cantilevered rotating floors. Source: costruzionipallotta.it
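
Taken at face value, those dimensions imply a rotating floor up to roughly 52 meters across. A quick check, assuming a circular floor and the full 15-meter cantilever all the way around (the actual floor plans may vary):

```python
import math

core_diameter_m = 22.0   # load-bearing reinforced concrete core
cantilever_m = 15.0      # maximum module reach beyond the core

floor_diameter_m = core_diameter_m + 2 * cantilever_m   # 52 m
floor_area_m2 = math.pi * (floor_diameter_m / 2) ** 2   # ~2,124 m^2

print(f"floor diameter: {floor_diameter_m:.0f} m, area: {floor_area_m2:,.0f} m^2")
```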

The lower retail / office floors of the Da Vinci Tower will not rotate. The upper hotel and residential floors will rotate and each will require about 4 kW of power to rotate. Each residential floor can be configured into several individual apartments or a single “villa.” You’ll find a concept for a “luxury penthouse villa” at the following link:

http://www.costruzionipallotta.it/lifestyle.htm

You’ll find more details on the Da Vinci Tower in a slideshow at the following link:

http://www.slideshare.net/swapnika15/dynamic-da-vincirotating-tower

If it is built, the Da Vinci Tower will be the world’s first dynamic skyscraper. It also will be David Fisher’s first skyscraper.

Energy Literacy

Peter Lobner

I was impressed in 2007 by the following chart in Scientific American, which shows where our energy in the U.S. comes from and how the energy is used in electricity generation and in four consumer sectors. One conclusion is that more than half of our energy is wasted, which is clearly shown in the bottom right corner of the chart. However, this result shouldn’t be surprising.

2007 USA energy utilization. Source: Scientific American / Jen Christiansen, using LLNL & DOE 2007 data

The waste energy primarily arises from the efficiencies of the various energy conversion cycles being used. For example, the following 2003 chart shows the relative generating efficiencies of a wide range of electric power sources. You can see in the chart that there is a big plateau at 40% efficiency for many types of thermal cycle power plants. That means that 60% of the energy they use is lost as waste heat. The latest combined cycle plants have demonstrated net efficiencies as high as 62.22% (Bouchain, France, 2016; see details in my updated 17 March 2015 post, “Efficiency in Electricity Generation”).

Comparative generation efficiencies, Eurelectric 2003. Source: Eurelectric and VGB PowerTech, July 2003

Another source of waste is line loss in electricity transmission and distribution from generators to the end-users. The U.S. Energy Information Administration (EIA) estimates that electricity transmission and distribution losses average about 6% of the electricity that is transmitted and distributed.
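
Chaining the conversion efficiency and line losses together shows why so much primary energy never reaches an end user as electricity. A back-of-the-envelope sketch using the figures above:

```python
def delivered_fraction(plant_efficiency, line_loss):
    """Fraction of primary fuel energy delivered to end users as electricity."""
    return plant_efficiency * (1.0 - line_loss)

print(f"{delivered_fraction(0.40, 0.06):.0%}")    # typical thermal plant: ~38%
print(f"{delivered_fraction(0.6222, 0.06):.0%}")  # best combined cycle: ~58%
```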

There is an expanded, interactive, zoomable map of U.S. energy data that goes far beyond the 2007 Scientific American chart shown above. You can access this interactive map at the following link:

http://energyliteracy.com

The interactivity in the map is impressive, and the way it’s implemented encourages exploration of the data in the map. You can drill down on individual features and you can explore particular paths in much greater detail than you could in a physical chart containing the same information. Below are two example screenshots. The first screenshot is a top-level view. As in the Scientific American chart, energy sources are on the left and final disposition as energy services or waste energy is on the right. Note that waste energy is on the top right of the interactive map.

Energy literacy map 1

The second screenshot is a more detailed view of natural gas production and utilization.

Energy literacy map 2

As reported by Lulu Chang on the digitaltrends.com website, this interactive map was created by Saul Griffith at the firm Otherlab (https://otherlab.com). You can read her post at the following link:

http://www.digitaltrends.com/home/otherlab-energy-chart/

I hope you enjoy exploring the interactive energy literacy map.

Protocol for Reporting UFO Sightings

Peter Lobner

The United States Air Force began investigating unidentified flying objects (UFOs) in the fall of 1947 under a program called Project Sign, which later became Project Grudge, and in January 1952 became Project Blue Book. As you might expect, the USAF developed a reporting protocol for these projects.

Starting in 1951, the succession of Air Force documents that provided UFO reporting guidance is summarized below:

Headquarters USAF Letter AFOIN-C/CC-2

This letter, entitled, “Reporting of Information on Unidentified Flying Objects,” dated 19 December 1951, may be the original guidance document for UFO reporting. So far, I have been unable to find a copy of this document. The Project Blue Book archives contain examples of UFO reports from 1952 citing AFOIN-C/CC-2.

Air Force Letter AFL 200-5

The first reporting protocol I could find was Air Force Letter AFL 200-5, “Unidentified Flying Objects Reporting,” dated 29 April 1952, which was issued on behalf of the Secretary of the USAF by Hoyt S. Vandenberg, Chief of Staff of the USAF.

  • Defines UFOs as, “any airborne object which by performance, aerodynamic characteristics, or unusual features, does not conform to any presently known aircraft or missile type.”
  • UFO reporting is treated as an Intelligence activity (denoted by the 200-series document number)
  • Provides brief guidance on report content, which was to be submitted on AF Form 112, “Air Intelligence Information Report,” and not classified higher than RESTRICTED.
  • The local Commanding Officer is responsible for forwarding FLYOBRPTs to the appropriate agencies. FLYOBRPT is an acronym for FLYing OBject RePorT.
  • Responsibility for investigating UFOs was assigned to the Air Technical Intelligence Center (ATIC) at Wright Patterson Air Force Base, Ohio. ATIC was a field activity of the Directorate of Intelligence in USAF Headquarters.
  • AFL 200-5 does not indicate that it superseded any prior USAF UFO reporting guidance document, but it is likely that it replaced USAF letter AFOIN-C/CC-2, dated 19 December 1951.

Download AFL 200-5 at the following link:

http://www.cufon.org/cufon/AFL_200-5.pdf

How to Make FLYOBRPTs

In 1953, ATIC issued “How to Make FLYOBRPTs,” dated 25 July 1953, to help improve the reporting required by AFL 200-5.

Figure 1 from How to Make a FLYOBRPT

Source: USAF

This guidance document provides an interesting narrative about UFOs through 1953, explains how to collect information on a UFO sighting, including interacting with the public during the investigation, and how to complete a FLYOBRPT using four detailed data collection forms.

  • Ground Observer’s Information Sheet (9 pages)
  • Electronics Data Sheet (radar) (5 pages)
  • Airborne Observer’s Data Sheet (9 pages) and,
  • Supporting Data form (8 pages)

This report showed that the USAF had a sense of humor about UFO reporting.

Figure 2 from How to Make a FLYOBRPT. Source: USAF

Download “How to Make FLYOBRPTs” at the following link:

http://www.cufon.org/cufon/FLYOBRPT.pdf

Air Force Regulation AFR 200-2

In 1953, the Secretary of the Air Force, Harold E. Talbott, issued the original Air Force Regulation AFR 200-2, “Unidentified Flying Objects Reporting”, dated 26 August 1953.

  • Superseded AFL 200-5, dated 29 April 1952
  • Defines procedures for reporting UFOs and restrictions on public discussion by Air Force personnel
  • Change 200-2A was issued on 2 November 1953
  • Between 1954 and 1962, the USAF issued several subsequent versions of AFR 200-2, as listed below.

AFR 200-2, “Unidentified Flying Objects Reporting (Short Title: FLYOBRPT)”, dated 12 August 1954.

  • Superseded AFR 200-2 dated 26 August 1953 and Change 200-2A
  • Identifies the USAF interest in UFOs as follows: “Air Force interest in unidentified flying objects is twofold: First as a possible threat to the security of the United States and its forces, and secondly, to determine technical aspects involved.”
  • Defines an expected report format that is less comprehensive than the guidance in “How to Make FLYOBRPTs.”
  • Clarifies that Headquarters USAF will release summaries of evaluated data to the public. Also notes that it is permissible to respond to local inquiries when the object is positively identified as a “familiar object” (not a UFO). In other cases, the only response is that ATIC will analyze the data.
  • Download this version of AFR 200-2 at the following link:

http://www.cufon.org/cufon/afr200-2.htm

AFR 200-2, “Unidentified Flying Objects (UFO),” dated 5 February 1958

  • Supersedes the version dated 12 August 1954
  • Broadens the USAF interest in UFOs: “First as a possible threat to the security of the United States and its forces; second, to determine the technical or scientific characteristics of any such UFOs; third, to explain or identify all UFO sightings…”
  • Updates report formats and provides additional guidance on reporting
  • Download this version from the CIA website at the following link:

https://www.cia.gov/library/readingroom/docs/CIA-RDP81R00560R000100040072-9.pdf

AFR 200-2, “Unidentified Flying Objects (UFO),” dated 14 September 1959

  • Supersedes the version dated 5 February 1958

AFR 200-2, “Unidentified Flying Objects (UFO),” dated 20 July 1962

  • Supersedes the version dated 14 September 1959
  • Superseded by AFR 80-17

Air Force Regulation AFR 80-17

In 1966, the USAF issued AFR 80-17, “Unidentified Flying Objects (UFO),” dated 19 September 1966

  • Supersedes AFR 200-2 dated 20 July 1962.
  • Two changes were issued:
    • AFR 80-17, Change 80-17A, dated 8 November 1966
    • AFR 80-17, Change 1, dated 26 October 1968, superseded AFR 80-17A, 8 November 1966
  • No longer considers UFO reporting as an intelligence activity, as denoted by the 80-series number assigned to the AFR
  • Places UFO reporting under the Research and Development Command. This is consistent with recasting ATIC into the Foreign Technology Division (FTD) of the Air Force Systems Command at Wright-Patterson AFB.
  • Broadly redefines UFO as “any aerial phenomenon which is unknown or appears out of the ordinary to the observer.”
  • Orders all Air Force bases to provide an investigative capability
  • Change 80-17A assigned University of Colorado to conduct an independent scientific investigation of UFOs. Physicist Edward U. Condon would direct this work.

Download AFR 80-17, with change 80-17A and change 1 here:

http://www.cufon.org/cufon/afr80-17.htm

Project Blue Book’s final report

In late October 1968, the University of Colorado’s final report was completed and submitted for review by a panel of the National Academy of Sciences. The panel approved of the methodology and concurred with Edward Condon’s conclusion:

“That nothing has come from the study of UFOs in the past 21 years that has added to scientific knowledge. Careful consideration of the record as it is available to us leads us to conclude that further extensive study of UFOs probably cannot be justified in the expectation that science will be advanced thereby.”

In January 1969, a 965-page paperback version of the report was published under the title, “Scientific Study of Unidentified Flying Objects.”

On 17 December 1969, Air Force Secretary Robert C. Seamans, Jr., announced the termination of Project Blue Book.

Additional resources

You’ll find a good history of the U.S. Air Force UFO programs, written by Thomas Tulien, at the following link:

http://sohp.us/history-of-the-usaf-ufo-programs/8-turning-point.php

Doomsday Clock Reset

Peter Lobner

This year is the 70th anniversary of the Doomsday Clock, which the Bulletin of the Atomic Scientists describes as follows:

“The Doomsday Clock is a design that warns the public about how close we are to destroying our world with dangerous technologies of our own making. It is a metaphor, a reminder of the perils we must address if we are to survive on the planet.”

You’ll find an overview on the Doomsday Clock here:

http://thebulletin.org/overview

The Clock was last changed in 2015 from five to three minutes to midnight. In January 2016, the Doomsday Clock’s minute hand did not change.

On 26 January 2017, the Bulletin of the Atomic Scientists Science and Security Board, in consultation with its Board of Sponsors, which includes 15 Nobel Laureates, decided to reset the Doomsday Clock to 2-1/2 minutes to midnight. This is the closest it has been to midnight in 64 years, since the early days of above ground nuclear device testing.

Two and a half minutes to midnight

The Science and Security Board warned:

“In 2017, we find the danger to be even greater (than in 2015 and 2016), the need for action more urgent. It is two and a half minutes to midnight, the Clock is ticking, global danger looms. Wise public officials should act immediately, guiding humanity away from the brink. If they do not, wise citizens must step forward and lead the way.”

You can read the Science and Security Board’s complete statement at the following link:

http://thebulletin.org/sites/default/files/Final%202017%20Clock%20Statement.pdf

Their rationale for resetting the clock is not based on a single issue, but rather, the aggregate effects of the following issues, as described in their statement:

A dangerous nuclear situation on multiple fronts

  • Stockpile modernization by current nuclear powers, particularly the U.S. and Russia, has the potential to grow rather than reduce worldwide nuclear arsenals
  • Stagnation in nuclear arms control
  • Continuing tensions between nuclear-armed India and Pakistan
  • North Korea’s continuing nuclear development
  • The Iran nuclear deal has been successful in accomplishing its goals in its first year, but its future is in doubt under the new U.S. administration
  • Careless rhetoric about nuclear weapons is destabilizing; for example, the U.S. administration’s suggestion that South Korea and Japan acquire their own nuclear weapons to counter North Korea

The clear need for climate action

  • The Paris Agreement went into effect in 2016
  • Continued warming of the world was measured in 2016
  • The U.S. administration needs to make a clear, unequivocal statement that it accepts climate change, caused by human activity, as a scientific reality

Nuclear power: An option worth careful consideration

  • Nuclear power is a tempting part of the solution to the climate change problem
  • The scale of new nuclear power plant construction does not match the need for clean energy
  • In the short to medium term, governments should discourage the premature closure of existing reactors that are safe and economically viable
  • In the longer term, deploy new types of reactors that can be built quickly and are at least as safe as the commercial nuclear plants now operating
  • Deal responsibly with safety issues and with the commercial nuclear waste problem

Potential threats from emerging technologies

  • Technology continues to outpace humanity’s capacity to control it
  • Cyber attacks can undermine belief in representative government, thereby endangering humanity as a whole
  • Autonomous machine systems open up a new set of risks that require thoughtful management
  • Advances in synthetic biology, including the Crispr gene-editing tool, have great positive potential, but also can be misused to create bioweapons and other dangerous manipulations of genetic material
  • Potentially existential threats posed by a host of rapidly emerging technologies need to be monitored, and to the extent possible anticipated and managed.

Reducing risk: Expert advice

  • The Board is extremely concerned about the willingness of governments around the world— including the incoming U.S. administration—to ignore or discount sound science and considered expertise during their decision-making processes

Prior to the formal decision on the 2017 setting of the Doomsday Clock, the Bulletin took a poll to determine public sentiment on what the setting should be. Here are the results of this public poll.

Results of The Bulletin Public Poll

How would you have voted?