Thursday, October 31, 2019

3-2-1-Cookoff! Astronauts to bake cookies with new test oven



Credit: CC0 Public Domain

Forget reheated, freeze-dried space grub. Astronauts are about to get a new test oven for baking chocolate chip cookies from scratch.

The next delivery of supplies for the International Space Station—scheduled for liftoff this weekend—includes the Zero G Oven. Chocolate chip cookie dough is already up there, waiting to pop into this small electric oven designed for zero gravity.


As a tantalizing incentive, sample cookies baked just this week are also launching Saturday from Virginia on Northrop Grumman's Cygnus capsule, for the six station astronauts.


The experiment explores the possibility of making freshly baked goods for space travelers. With NASA eyeing trips to the moon and Mars, homemade food takes on heightened importance. What's in orbit now are essentially food warmers.


Run by a New York couple, Zero G Kitchen aims to create a kitchen in space one appliance at a time, starting with the oven.


"You're in space. I mean, you want to have the smell of cookies," said Zero G Kitchen's Jordana Fichtenbaum, a social media specialist for hotels and restaurants. "The kitchen is really sort of the heart of the home to me, and the oven is kind of where it's at. So just to make (space) more comfortable and make it more pleasant, more delicious."


Out-of-this-world baking can also entice the public and make space exploration more relatable, according to her husband, Ian Fichtenbaum, who works in the space business.


Also collaborating on this first-of-its-kind space bake: Texas-based Nanoracks, which designed and built the oven and arranged the flight, and DoubleTree, which supplied the same cookie dough used by the hotel chain for welcome cookies.


"That's the beauty of this to me," Jordana Fichtenbaum said by phone earlier this week. "It's the same recipe and the same thing that you get on Earth."


Previous station crews have created their own pizzas using flatbread and warmed them in the galley. Astronauts have attempted other creative cuisine, mixing and heating chopped onions and garlic, for instance, and whipping up salads from station-grown greens. Results have been mixed.


The cookie baking will be slow going—the oven can bake just one cookie at a time, and it could be weeks before the astronauts have time to try it out.


Five raw cookies have been in a space station freezer since the summer. Each is in its own individual clear silicone pouch and, according to Ian Fichtenbaum, resembles a frozen hockey puck. The oven's maximum heat is 350 F (177 C), double the temperature of the U.S. and Russian food warmers aboard the space station. The cylindrical oven uses electric heating elements.


Nanoracks manager Mary Murphy anticipates a baking time of 15 to 20 minutes per cookie at about 325 F (163 C). The aroma of baking cookies should fill the lab each time a cookie comes out of the oven and is placed on an attached cooling rack, she said.
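As a quick aside on the numbers above, the Fahrenheit-to-Celsius conversions are easy to verify; the short sketch below is illustrative only, and the halved-temperature line simply restates the article's "double the temperature" comparison rather than a published food-warmer specification.

```python
# Quick check of the temperatures quoted above (illustrative only).

def f_to_c(temp_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

print(round(f_to_c(350)))       # 177 C -- the Zero G Oven's stated maximum
print(round(f_to_c(325)))       # 163 C -- the planned per-cookie baking temperature
print(round(f_to_c(350 / 2)))   # ~79 C -- implied food-warmer temperature (about 175 F)
```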


The first cookie will be the real test; it could end up looking like a blob or a mini pancake in the absence of gravity. Three of the space-baked cookies will be returned to Earth for analysis.


"Baking doesn't always go according to plan, even on the ground," said Murphy.







© 2019 The Associated Press. All rights reserved.






Citation:
3-2-1-Cookoff! Astronauts to bake cookies with new test oven (2019, October 31)
retrieved 31 October 2019
from https://phys.org/news/2019-10-cookoff-astronauts-cookies-oven.html











#Space | https://sciencespies.com/space/3-2-1-cookoff-astronauts-to-bake-cookies-with-new-test-oven/

Air Force claims accomplishments in Space C2 program in wake of critical report

The Air Force has submitted an acquisition strategy for the Space C2 program which is now being reviewed by the Pentagon


WASHINGTON — Following the release of a critical congressional report on the Space Command and Control program, the Air Force says it has been working with the Pentagon to address the concerns raised by the Government Accountability Office.


GAO in an Oct. 30 report criticized the program known as Space C2 for lacking an acquisition strategy and called for greater Pentagon oversight. The Air Force launched the program just over a year ago to replace the long-troubled Joint Space Operations Center Mission System, or JMS.


Col. Jennifer Krolikowski, senior materiel leader for Space C2 at the Air Force Space and Missile Systems Center, told SpaceNews Oct. 31 that she did not disagree with the GAO assessment but insisted that the Air Force already has moved to fix the problems highlighted in the report.


Under the Space C2 program, the Air Force is developing software applications used by military commanders and analysts to monitor and understand what is happening in space. JMS failed because it was managed like a traditional monolithic DoD software development program, and was chronically late and over budget. Space C2 uses commercial agile software development methods where engineers make rapid changes, ask for user feedback and adjust the software for the next increment.


Krolikowski, who has been in charge of Space C2 since August 2018, said new software applications are being delivered every few months to commanders and analysts at military space command centers at Schriever Air Force Base, Colorado, and at Vandenberg Air Force Base, California.


She said GAO had no issues with the actual software or the agile development methods but questioned the lack of a formal acquisition strategy, which is required for every DoD program.


“We have an acquisition strategy already written and in coordination with the Air Force and the Office of the Secretary of Defense,” Krolikowski said. Air Force officials will be meeting Nov. 26 with Undersecretary of Defense for Acquisition and Sustainment Ellen Lord to discuss the strategy, she said.


Lord has been a proponent of agile software development and had led a DoD-wide review of software acquisition practices. “My program is a pathfinder for them to help other programs and identify problems,” said Krolikowski.


An acquisition strategy lays out procedures for contracting, testing and setting program requirements. Although agile software development has been commonplace in the private sector for years, it is a relatively new practice in DoD. The acquisition strategy for Space C2 has to explain how those practices will be implemented.


Krolikowski highlighted some new products delivered by Space C2:


• A High-Interest Event Tracker that allows operators to monitor, track, and display information on potential satellite conjunctions, space launches, satellite de-orbits, re-entries, and other high-interest space events. A similar tracker was pursued under JMS, but it was too complex and had hardly any users. Krolikowski said that after the release of the new version, 600 users signed up.


• A Radio Frequency Deconfliction tool that provides a consolidated workflow that reduces the time it takes to process radio frequency deconfliction requests.


• A software tool for intelligence analysts that helps them manage data collection taskings for ground-based radars and other sensors. Krolikowski called this a “dynamic sensor planning tool” designed to produce better intelligence about the space environment.









#Space | https://sciencespies.com/space/air-force-claims-accomplishments-in-space-c2-program-in-wake-of-critical-report/

Massive wildfires hit southern Brazil's Pantanal



Fires have ravaged the Pantanal marshland area in Brazil's Mato Grosso do Sul state

Wildfires are raging across the Pantanal tropical wetlands in southern Brazil, one of the most biodiverse areas in the world and a major tourist destination, regional authorities said Thursday.


The governor's office in the state of Mato Grosso do Sul said the fires were "bigger than anything seen before" in the region.


So far, more than 50,000 hectares (nearly 125,000 acres) have been affected.


The blazes follow other wildfires that ravaged millions of hectares in the Amazon rainforest in August.


The statement from the governor's office said the situation was "critical," with blazes ravaging three towns in the Pantanal, a popular eco-tourism spot.


"Intense flames and reddish smoke have disrupted traffic" on the highways, the statement said.


The coordinator of the National Risk Management Center, Paulo Barbosa de Souza, said the blaze—fed by wind and dry vegetation—was causing "logistical difficulties."


Satellite images from the INPE space institute showed there were nearly 8,500 fires in the Pantanal area between January and October this year.


That was the highest number recorded since 2007.




The fires were threatening three towns in Brazil's Pantanal









© 2019 AFP






Citation:
Massive wildfires hit southern Brazil's Pantanal (2019, October 31)
retrieved 31 October 2019
from https://phys.org/news/2019-10-massive-wildfires-southern-brazil-pantanal.html











#Environment | https://sciencespies.com/environment/massive-wildfires-hit-southern-brazils-pantanal/

Czech lab grows mustard plants for Mars



Scientists check plants inside of an aeroponic growing chamber system as an experiment called Marsonaut at Prague University of Life Sciences in Prague, Czech Republic, October 30, 2019. REUTERS/David W Cerny


PRAGUE (Reuters) - Czech scientists have opened a lab to experiment with growing food for environments with extreme conditions and a lack of water, such as Mars.


The “Marsonaut” experiment by scientist Jan Lukacevic, 29, and his team at the Prague University of Life Sciences is based on aeroponics - growing plants in the air, without soil, and limiting water use to a minimum.


The plants grow horizontally from a vertical unit and are stacked one above the other to minimize space. Researchers experiment with light and temperature changes, Lukacevic said.


The team has already succeeded in growing mustard plants, salad leaves, radishes and herbs like basil and mint.


Scientists ate their first harvest last week. 


“They taste wonderful, because they grow in a controlled environment and we supply them with bespoke nutrients,” said Lukacevic.


Strawberries are the next crop planned.


The main benefit of the growing method is that it uses 95 percent less water than normal plant cultivation and also saves space, which could boost agricultural yields in areas hit by urbanization and climate change.


Reporting by Jiri Skacel; Writing by Jan Lopatka; Editing by Alexandra Hudson and Dan Grebler







#News | https://sciencespies.com/news/czech-lab-grows-mustard-plants-for-mars/

Boeing and SpaceX preparing for commercial crew abort tests

WASHINGTON — Boeing and SpaceX are on schedule to perform two critical tests of their commercial crew vehicles in the next week with hopes that both vehicles will be ready to carry astronauts by early next year.


In an Oct. 30 presentation to the NASA Advisory Council’s Human Exploration and Operations committee, Kathy Lueders, manager of NASA’s commercial crew program, said that Boeing was still working towards a Nov. 4 pad abort test of its CST-100 Starliner spacecraft that the company announced three weeks earlier.


In that test, at the White Sands Missile Range (WSMR) in New Mexico, the Starliner will fire the abort engines in its service module to simulate escaping a launch vehicle on the pad. The Starliner will fly about 1.5 kilometers high, landing 1.5 kilometers downrange 90 seconds later.


“The vehicle is stacked up on the stand, getting ready to go,” Lueders said. “We, right now, are on the range at WSMR for next Monday morning to do this check-out. It’s a huge, huge test for us.”


Besides testing the abort motors themselves, she said key areas of interest for the test will be the separation of the Starliner’s crew module from its service module after the motors shut down, as well as the deployment of parachutes for the crew module.


The Starliner that will fly an uncrewed orbital test flight, called the Orbital Flight Test (OFT) by Boeing, doesn’t have an abort system, but Lueders said the pad abort test was critical for it nonetheless. “The way the system separates will reflect on our OFT progress,” she said. “It’s critical for us to get this test going, and that we understand it prior to us doing rollout of the spacecraft” for the orbital test flight.


That OFT mission is scheduled for launch Dec. 17 on a United Launch Alliance Atlas 5 from Cape Canaveral. That rollout of the Starliner from a Boeing facility at the Kennedy Space Center to a ULA processing center for integration onto the Atlas 5 will take place about a week after the pad abort test, she said.


The Boeing pad abort test will take place just days before SpaceX performs a static-fire test of the SuperDraco abort engines on its Crew Dragon spacecraft. Lueders said that test, in Florida, is expected to occur in the middle of next week.


An explosion took place during preparations for a similar static-fire abort test in April, destroying the Crew Dragon spacecraft that flew the Demo-1 uncrewed mission to the International Space Station in March and was being readied for an in-flight abort test planned for the summer. An investigation, still being wrapped up, implicated a leaky valve that allowed nitrogen tetroxide (NTO) oxidizer into part of the propulsion system; when the system was pressurized, the oxidizer was driven into a titanium check valve, igniting it.


“Having something like that happen is a big wakeup call for the team, that they have to be diligent and careful about this,” Lueders said, noting that even NASA wasn’t aware of the “compatibility issue” between NTO and titanium components at those conditions.


She added both NASA and SpaceX were fortunate the accident took place on the ground during a test with no one on board, and with access to video and other telemetry to aid the investigation. “It would have been a bad thing for us to have found out on orbit.”


If the static-fire test is successful, SpaceX will be ready to perform an in-flight abort test using that Crew Dragon spacecraft in early December. That test will involve the spacecraft, which was originally built for the Demo-2 crew test, escaping a Falcon 9 nearly 90 seconds after liftoff from the Kennedy Space Center.


Both Boeing and SpaceX, Lueders said, could be ready for crewed test flights to the ISS in early 2020. The Starliner for Boeing’s Crew Test Flight is expected to be completed by the end of the first quarter of 2020, while the new Crew Dragon spacecraft for Demo-2 should be completed and ready to ship to Florida by late December of this year.


Setting a launch date for those crewed test flights, though, will depend on the completion of the upcoming tests as well as other work to qualify the vehicles for carrying astronauts. That includes completion of parachute testing, a milestone neither company has achieved according to a chart Lueders showed in her briefing.


Boeing officials have previously said that they have completed testing of their parachutes, but that final certification of them is pending the outcomes of the pad abort test and Orbital Flight Test. SpaceX recently announced it was testing a new version of the Crew Dragon parachutes, called Mark 3, that have higher safety margins than earlier versions, which suffered at least one failure in a test earlier this year.


Lueders said little about either company’s parachute work in her presentation, beyond a passing reference to SpaceX work. “SpaceX guys did 12 chute tests in a week as we’re working to perfect the Mark 3 design,” she said. “We’re continuing to work with them on what that schedule is and finalizing that.”


Both SpaceX and NASA have provided few details about that Mark 3 parachute work, which SpaceX Chief Executive Elon Musk and NASA Administrator Jim Bridenstine emphasized when they met Oct. 10 at SpaceX’s headquarters in California. “The highest priority has been the parachutes,” Bridenstine said then. “Elon has told me, and he’s showed me, that that’s where their priority is. They’re putting as much resources and manpower as they can to getting those parachutes ready.”


A SpaceX official, speaking on background several days after that event, confirmed that testing of the Mark 3 parachute was underway, but didn’t respond to questions about the number of tests completed to date and whether all the tests were successful. A NASA spokesperson, asked about SpaceX parachute testing Oct. 25, referred questions to SpaceX.


The lack of information about the status of parachute testing stands in contrast to comments by Musk at the media appearance with Bridenstine, where he said the company would be more transparent about Crew Dragon testing. “We’ll be doing a lot of tests of the Mark 3 parachutes,” he said. “We’ll keep the public informed, so you’ll know what goes wrong and what goes right, and what we’re doing about it.”









#Space | https://sciencespies.com/space/boeing-and-spacex-preparing-for-commercial-crew-abort-tests/

WFIRST will add pieces to the dark matter puzzle



Entangled among the galaxies in this Hubble image are mysterious-looking arcs of blue light. These are actually distorted images of remote galaxies behind the cluster. The collective gravity of all the normal and dark matter trapped inside the cluster warps space-time and affects light traveling through the cluster toward Earth. Credit: NASA, ESA, and J. Lotz and the HFF Team (STScI)

The true nature of dark matter is one of the biggest mysteries in the universe. Scientists are trying to determine what exactly dark matter is made of so they can detect it directly, but our current understanding has so many gaps, it's difficult to know just what we're looking for. WFIRST's ability to survey wide swaths of the universe will help us figure out what dark matter could be made of by exploring the structure and distribution of both matter and dark matter across space and time.

Why is dark matter such a perplexing topic? Scientists first suspected its existence over 80 years ago when Swiss-American astronomer Fritz Zwicky observed that galaxies in the Coma cluster were moving so quickly they should have been flung away into space—yet they remained gravitationally bound to the cluster by unseen matter. Then in the 1970s, American astronomer Vera Rubin discovered the same type of problem in individual spiral galaxies. Stars toward the edge of the galaxy move too fast to be held in by the galaxy's luminous matter—there must be much more matter than we can see in these galaxies to hold the stars in orbit. Ever since these discoveries, scientists have been trying to piece together the puzzle using sparse clues.


There is currently a wide range of dark matter candidates. We don't even have a very good idea what the mass of dark matter particles might be, which makes it difficult to work out how best to search for them. WFIRST's wide-field surveys will provide a comprehensive look at the distribution of galaxies and galaxy clusters across the universe in the most detailed dark matter studies ever undertaken, thanks to dark matter's gravitational effects. These surveys will yield new insight into the fundamental nature of dark matter, which will enable scientists to hone their searching techniques.


Most theories of the nature of dark matter particles suggest they almost never interact with normal matter. Even if someone dropped a huge chunk of dark matter on your head, you would probably perceive nothing. You wouldn't have any means of detecting its presence—all of your senses are moot when it comes to dark matter. You wouldn't even stop it from hurtling straight through your body and on toward Earth's core.


This doesn't happen to regular matter, such as cats or people, because forces between the atoms in the ground and the atoms in our bodies prevent us from falling through Earth's surface, but dark matter behaves strangely. Dark matter is so inconspicuous it is even invisible to telescopes that observe the cosmos in forms of light our eyes can't see, from radio waves to high-energy gamma rays.




This Hubble Space Telescope mosaic shows a portion of the immense Coma galaxy cluster -- containing more than 1,000 galaxies -- located 300 million light-years away. The rapid motion of its galaxies was the first clue that dark matter existed. Credit: NASA, ESA, J. Mack (STScI) and J. Madrid (Australian Telescope National Facility)


"Lensing" dark matter



If dark matter is invisible, how do we know it exists? While dark matter doesn't interact with normal matter in most cases, it does affect it gravitationally (which is how it was first discovered decades ago), so we can map its presence by looking at clusters of galaxies, the most massive structures in the universe.


Light always travels in a straight line, but space-time—the fabric of the universe—is curved by concentrations of mass within it. So when light passes by a mass, its path curves as well: a straight line in a curved space. Light that would normally pass near a galaxy instead bends toward and around it, producing intensified—and sometimes multiple—images of the background source. This process, called strong gravitational lensing, transforms galaxy clusters into colossal natural telescopes that give us a glimpse of distant cosmic objects that would normally be too faint to be visible.
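For readers who want the quantitative version, the standard point-mass lensing relations (textbook formulas, not anything specific to WFIRST's pipeline) capture this bending:

\[
\hat{\alpha} = \frac{4GM}{c^{2} b}, \qquad
\theta_{E} = \sqrt{\frac{4GM}{c^{2}} \, \frac{D_{ls}}{D_{l} D_{s}}}
\]

Here \(\hat{\alpha}\) is the deflection of a light ray passing a mass \(M\) at impact parameter \(b\), and \(\theta_{E}\) is the Einstein radius of the ring or arcs formed when source, lens, and observer are nearly aligned, with \(D_{l}\), \(D_{s}\), and \(D_{ls}\) the angular diameter distances to the lens, to the source, and between the two. Because the arc geometry depends on \(M\), measuring it constrains the cluster's total mass, visible plus dark.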


Since more matter leads to stronger lensing effects, gravitational lensing observations provide a way to determine the location and quantity of matter in galaxy clusters. Scientists have discovered that all of the visible matter we see in galaxy clusters isn't nearly enough to create the observed warping effects. Dark matter provides the surplus gravity.


Scientists have confirmed earlier observations by measuring how much matter in the very early universe is "normal" and how much is "dark" using experiments like NASA's Wilkinson Microwave Anisotropy Probe (WMAP). Even though normal matter makes up everything we can see, the universe must contain more than five times as much dark matter to fit the observations.
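As a rough check, with representative WMAP-era density fractions (approximate values used here for illustration, not quoted in the article) of about 4.6 percent ordinary matter and 24 percent dark matter, the ratio is

\[
\frac{\Omega_{\mathrm{dm}}}{\Omega_{\mathrm{b}}} \approx \frac{0.24}{0.046} \approx 5.2,
\]

consistent with the "more than five times" figure above.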


WFIRST will build on previous dark matter studies by using so-called weak gravitational lensing that tracks how smaller clumps of dark matter warp the apparent shapes of more distant galaxies. Observing lensing effects on this more refined scale will enable scientists to fill in more of the gaps in our understanding of dark matter.


The mission will measure the locations and quantities of both normal matter and dark matter in hundreds of millions of galaxies. Throughout cosmic history, dark matter has driven how stars and galaxies formed and evolved. If dark matter consists of heavy, sluggish particles, it would clump together readily and WFIRST should see galaxy formation early in cosmic history. If dark matter is made up of lighter, faster-moving particles, it should take longer to settle into clumps and for large-scale structures to develop.


WFIRST's gravitational lensing studies will allow us to peer back in time to trace how galaxies and galaxy clusters formed under the influence of dark matter. If astronomers can narrow down the candidates for dark matter particles, we'll be one step closer to finally detecting them directly in experiments on Earth.











Citation:
WFIRST will add pieces to the dark matter puzzle (2019, October 31)
retrieved 31 October 2019
from https://phys.org/news/2019-10-wfirst-pieces-dark-puzzle.html











#Space | https://sciencespies.com/space/wfirst-will-add-pieces-to-the-dark-matter-puzzle/

NASA microgap-cooling technology immune to gravity effects and ready for spaceflight



The microgap-cooling technology developed by Goddard technologist Franklin Robinson and University of Maryland professor Avram Bar-Cohen was tested twice on a Blue Origin New Shepard rocket. Credit: NASA/Franklin Robinson

A groundbreaking technology that would allow NASA to effectively cool tightly packed instrument electronics and other spaceflight gear is unaffected by weightlessness, and could be used on a future spaceflight mission.

During two recent flights aboard Blue Origin's New Shepard rocket, Principal Investigator Franklin Robinson, an engineer at NASA's Goddard Space Flight Center in Greenbelt, Maryland, and Co-Investigator Avram Bar-Cohen, a University of Maryland professor, proved that their microgap-cooling technology not only removed large amounts of heat, but also carried out this all-important job in low- and high-gravity environments with nearly identical results.


The demonstrations, funded by NASA's Flight Opportunities program within the Space Technology Mission Directorate, open the door to the technology's use on a future spaceflight mission, Robinson said. The technology development was also supported by the agency's Center Innovation Fund.


"Gravity effects are a big risk in this type of cooling technology," Robinson said. "Our flights proved that our technology works under all conditions. We think this system represents a new thermal-management paradigm."


With microgap cooling, heat generated by tightly packed electronics is removed by flowing a coolant—in this case, a fluid called HFE 7100 that doesn't conduct electricity—through embedded, rectangular-shaped microchannels within or between heat-generating devices. As the coolant flows through these tiny gaps, it boils on the heated surfaces, producing vapor. This two-phase process offers a higher rate of heat transfer, which keeps high-power devices cool and less likely to fail due to overheating.
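To make that "higher rate of heat transfer" concrete, the sketch below estimates how much heat a small two-phase flow can absorb: sensible heating of the subcooled liquid plus the latent heat of the fraction that boils. The property values are approximate published figures for HFE-7100 and the flow conditions are invented for illustration; none of these numbers come from the NASA experiment.

```python
# Back-of-the-envelope estimate of heat absorbed by a boiling coolant flow.
# Approximate HFE-7100 properties (assumed, not from the article):
H_FG = 112e3    # latent heat of vaporization, J/kg
CP   = 1183.0   # liquid specific heat, J/(kg*K)

def heat_removed_w(mass_flow_kg_s, inlet_subcooling_k, exit_vapor_quality):
    """Heat load = sensible heating to saturation + latent heat of the vapor fraction."""
    sensible = mass_flow_kg_s * CP * inlet_subcooling_k
    latent = mass_flow_kg_s * H_FG * exit_vapor_quality
    return sensible + latent

# Example: 1 g/s of coolant entering 10 K below saturation and leaving 30% vapor
print(f"{heat_removed_w(1e-3, 10.0, 0.3):.0f} W")   # roughly 45 W
```

The point of the example is only that boiling carries far more heat per gram of coolant than liquid heating alone, which is why a two-phase approach suits densely packed electronics.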


The embedded cooling approach represents a significant departure from more traditional cooling technologies. With more conventional approaches, designers create a "floor plan." They keep the heat-generating circuits and other hardware as far apart as possible. The heat travels into the printed circuit board, where it is directed eventually to a spacecraft-mounted radiator.




Ground crew recover experiments that launched on the reusable New Shepard rocket on which the microgap-cooling technology flew twice. Credit: Blue Origin


Designed Initially for 3-D Circuitry


Robinson and Bar-Cohen began developing the microgap technology about four years ago to assure that NASA could take advantage of next-generation 3-D circuitry when it became available.


Unlike more traditional circuits, 3-D circuits are created by literally stacking one chip atop another. Interconnects link each level to its adjacent neighbors, much like how elevators connect one floor to the next in a skyscraper. With shorter wiring linking the chips, data can move quickly both horizontally and vertically, improving bandwidth, computational speed and performance, all while consuming less power and occupying less space.



Despite its advantages, 3-D circuitry presents a particular challenge for potential users both on Earth and in space: the smaller the space between the circuits, the harder it is to remove the heat, jeopardizing performance due to overheating. Because not all of the chips are in contact with a circuit board, traditional cooling techniques wouldn't work with 3-D circuitry. Microgap cooling avoids this problem by running coolant within and between the stacked chips.


Although originally conceived for use in 3-D circuitry, microgap cooling could help a host of spaceflight electronic devices, including power electronics and laser heads. They, too, are shrinking in size and need an effective system for removing heat from tightly packed spaces. "We see an application for microgap cooling in any power-dense electronic device used in space," Robinson said.


Prior to the two flights, Robinson and Bar-Cohen had tested their cooling technology at various orientations in a laboratory. However, they needed to certify the technology's operation in space and under varying gravity environments. With the successful demonstration, Robinson believes the cooling technology is ready for primetime. "I think we're now at the right technology-readiness level to implement embedded cooling on flight projects," he said.











Citation:
NASA microgap-cooling technology immune to gravity effects and ready for spaceflight (2019, October 31)
retrieved 31 October 2019
from https://phys.org/news/2019-10-nasa-microgap-cooling-technology-immune-gravity.html











#Space | https://sciencespies.com/space/nasa-microgap-cooling-technology-immune-to-gravity-effects-and-ready-for-spaceflight/

Dark matter experiment's central component takes a deep dive—nearly a mile underground



The LUX-ZEPLIN time projection chamber, the experiment's main detector, is pictured here in a clean room at the Sanford Underground Research Facility before it was wrapped up and delivered underground. Credit: Matthew Kapust/Sanford Underground Research Facility

Q: How do you get a 5,000-pound, 9-foot-tall particle detector, designed to hunt for dark matter, nearly a mile underground?

A: Very carefully.


Last week, crews at the Sanford Underground Research Facility (SURF) in South Dakota strapped the central component of LUX-ZEPLIN (LZ) – the largest direct-detection dark matter experiment in the U.S. – below an elevator and s-l-o-w-l-y lowered it 4,850 feet down a shaft formerly used in gold-mining operations.


This final journey of LZ's central detector on Oct. 21 to its resting place in a custom-built research cavern required extensive planning and involved two test moves of a "dummy" detector to ensure its safe delivery.


"This was the most challenging move of a detector system that I have ever done in decades of working on experiments," said Jeff Cherwinka, the LZ chief engineer from the University of Wisconsin, who led the planning effort for the move along with SURF engineers and other support.


Jake Davis, a SURF mechanical engineer who worked on the move, said, "Between the size of the device, the confines of the space, and the multiple groups involved in the move, the entire process required rigorous attention to both the design and the scheduling. Prior to rigging the detector under the cage, we did testing with other cranes to see how it would react when suspended. We also completed analysis and testing to ensure it would remain nice and straight in the shaft."


He added, "The ride was slow, right around 100 feet per minute. The ride to the 4,850-foot level typically takes 13-15 minutes. Today, it took close to 45 minutes. I rode in the cage, watching it through an inspection port in the floor. There was a huge sigh of relief after the move, but there's still a lot of work ahead to finish LZ."
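The descent figures in that quote are self-consistent; here is a quick sketch using only the numbers above:

```python
# Sanity check of the elevator-ride figures quoted above.
depth_ft = 4850          # depth of the level where LZ now sits
move_speed = 100         # ft per minute during the detector move
print(depth_ft / move_speed)                      # 48.5 minutes, close to the ~45 reported
print([round(depth_ft / t) for t in (13, 15)])    # ~373 and ~323 ft/min on a typical ride
```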



This video chronicles the move of the LUX-ZEPLIN central detector, known as the time projection chamber, nearly a mile underground to the research cavern where it will be used to hunt for dark matter. Credit: Matthew Kapust, Erin Broberg, and Nick Hubbard/Sanford Underground Research Facility

Theresa Fruth, a postdoctoral research fellow at University College London who works on LZ's central detector, said that keeping LZ well-sealed from any contaminants during its journey was a high priority—even the slightest traces of dust and dirt could ultimately affect its measurements.


"From a science perspective, we wanted the detector to come down exactly as it was on the surface," she said. "The is incredibly important, but so is the cleanliness, because we've been building this detector for 10 months in a clean room. Before the move, the detector was bagged twice, then inserted in the transporter structure. Then, the transporter was wrapped with another layer of strong plastic. We also need to move all our equipment underground so we can do the rest of the installation work underground."



The central detector, known as the LZ cryostat and time projection chamber, will ultimately be filled with 10 tons of liquid xenon that will be chilled to minus 148 degrees Fahrenheit. Scientists hope to see telltale signals of dark matter particles that are produced as they interact with the heavy xenon atoms in this cryostat.


The liquid form of xenon, a very rare element, is so dense that a chunk of granite can float atop its surface. It is this density, owing to the heavy atomic weight of xenon, that makes it a good candidate for capturing particle interactions.
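For context, the densities involved are roughly (approximate handbook values, not quoted in the article):

\[
\rho_{\mathrm{LXe}} \approx 2.9\ \mathrm{g/cm^{3}} \;>\; \rho_{\mathrm{granite}} \approx 2.7\ \mathrm{g/cm^{3}},
\]

so a slab of granite would indeed float, and the full 10-ton xenon charge occupies only about 3.5 cubic meters.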


The cryostat, a large tank assembled from ultrapure titanium, is about 5.5 feet in diameter. It contains systems with a total of 625 photomultiplier tubes positioned at its top and bottom. These tubes are designed to capture flashes of light produced in particle interactions.


Pawel Majewski of the Rutherford Appleton Laboratory in the U.K., who led the design, fabrication, cleaning, and delivery of LZ's inner cryostat vessel for the U.K. Science and Technology Facilities Council, said, "Now it is extremely gratifying to see it … holding the heart of the experiment and resting in its final place in the Davis Campus, one mile underground."


LZ is designed to hunt for theorized particles called WIMPs, or weakly interacting massive particles. Dark matter makes up about 27 percent of the universe, though we don't yet know what it's made of and have only detected it through its gravitational effects on normal matter.




Crews at the Sanford Underground Research Laboratory in Lead, South Dakota, begin to lower the LUX-ZEPLIN central detector. Its nearly mile-long descent down an elevator shaft, and its delivery to a research cavern where it will hunt for dark matter, were successfully carried out last week. Credit: Nick Hubbard/Sanford Underground Research Facility


It is 100 times more sensitive than its predecessor experiment, called LUX, which operated in the same underground space. Placing LZ deep underground serves to shield it from much of the steady bombardment of particles that are present at the Earth's surface.


LZ's cryostat will be surrounded by a tank filled with a liquid known as a scintillator that will also be outfitted with an array of photomultiplier tubes and is designed to help weed out false signals from unwanted particle "noise." And the cryostat and scintillator tank will be embedded within a large water tank that provides a further buffer layer from unwanted particle signals.


While LUX's main detector was small enough to fit in the SURF elevator, LZ's cryostat narrowly fit in the elevator shaft.


It was first moved outside of a clean room at the surface level, picked up with a forklift, and carried into position below the elevator cage. It was then attached to the underside of the cage with slings and straps, where it was slowly moved down to the level of the Davis Cavern, its final resting place.


Once detached from the elevator cage, it was moved using air skates on a temporarily assembled surface—akin to how an air hockey puck moves across the table's surface. Because of the cryostat's size, crews had to first temporarily remove underground duct work to allow the move.


Murdock "Gil" Gilchriese, LZ project director and a physicist at Lawrence Berkeley National Laboratory (Berkeley Lab), said, "Next, the cryostat will be wrapped with multiple layers of insulation, and a few other exterior components will be installed." Berkeley Lab is the lead institution for the LZ project.


"Then it will get lowered into the outer cryostat vessel," he added. "It will take months to hook up and check out all of the cables and make everything vacuum-tight." Most of the LZ work is now concentrated underground, he said, with multiple work shifts scheduled to complete LZ assembly and installation.


There are plans to begin testing the process of liquefying xenon gas for LZ in November using a mock cryostat, and to fill the actual cryostat with xenon in spring 2020. Project completion could come as soon as July 2020, Gilchriese said.











Citation:
Dark matter experiment's central component takes a deep dive—nearly a mile underground (2019, October 31)
retrieved 31 October 2019
from https://phys.org/news/2019-10-dark-central-component-deep-divenearly.html











#Physics | https://sciencespies.com/physics/dark-matter-experiments-central-component-takes-a-deep-dive-nearly-a-mile-underground/

House lawmakers, with legislation in tow, push for public C-band auction

WASHINGTON — More than half a dozen House lawmakers said Oct. 29 that the FCC, not the satellite industry, should conduct an upcoming auction to transfer C-band spectrum to the 5G wireless industry.


Several of those members, speaking at an Oct. 29 hearing by the House Energy and Commerce communications and technology subcommittee, questioned the legality of an industry-run auction, saying the Federal Communications Commission is the only body authorized to conduct U.S. spectrum auctions. 


“This would be an unprecedented departure from the way Congress has instructed the FCC to reallocate spectrum in the past,” House Energy and Commerce Committee Chairman Frank Pallone (D-NJ) said. “Under the Communications Act, we required the FCC to run auctions that provide revenue to the treasury, which is critical to ensuring the American people benefit from these auctions.”


The FCC is expected to decide on how to reallocate some or all of the 500 megahertz of C-band satellite downlink spectrum for cellular 5G services by the end of this year. FCC Chairman Ajit Pai said during an Oct. 17 Senate hearing that the commission was reviewing its auction authority under Section 309 of the Communications Act, the 1934 law that established the FCC to oversee telephone, telegraph and radio communications. 


Four House members — Reps Mike Doyle (D-Pa.), Doris Matsui (D-Calif.), Bill Johnson (R-Ohio), and Greg Gianforte (R-Mont.) — introduced a bill Oct. 24 to mandate the commission conduct the C-band auction itself. 


The Clearing Broad Airwaves for New Deployment, or C-BAND Act, requires a public auction of 200 to 300 megahertz of the spectrum. 


“The [FCC] chairman does not have the authority to conduct a private auction of the C-band, and must use the auction authority provided by Congress through an FCC-led public auction,” Matsui said. “Abandoning this proven model could lead to protracted litigation, causing unnecessary delays in making this 5G spectrum available, and shortchange the American taxpayer.”


Luxembourg-based Intelsat and SES, and Telesat Canada, working together as the C-Band Alliance, had gained favor at the FCC for coming forward with their own plan to relinquish spectrum they in the past sought to preserve. 


During the Oct. 29 hearing, Reps Bob Latta (R-Ohio), Tom O’Halleran (D-Ariz) and Tony Cárdenas (D-Calif.), also said they prefer a public auction. Many suggested proceeds from selling the spectrum — which could range from a few billion dollars to $60 billion — should go to government programs to bring broadband to rural and underserved areas. 


Pressure now exists in both chambers of Congress for a public auction, after Sen. John Kennedy (R-La.), chairman of the Senate Appropriations financial services and general government subcommittee, pushed for a public auction during an Oct. 17 hearing.


Industry reactions 


C-Band Alliance officials were not among the five witnesses at the Oct. 29 hearing, but representatives with competing proposals were. 


Ross Lieberman, senior vice president of ACA Connects, a group of 700 small and medium-sized telecom providers, reiterated the group’s proposal for a public auction of 370 megahertz of C-band spectrum. ACA Connects, Charter Communications and the Competitive Carriers Association say the FCC should use some proceeds from the auction to build fiber as a replacement for satellite C-band infrastructure — a task they estimate would cost $6 billion to $7 billion and take five years to complete. 


The C-Band Alliance plan would leave ACA Connects members with too little spectrum to ensure stable continuity of service, Lieberman said. 


“Without a fiber alternative, our members will be stuck with higher prices to use a less reliable C-band that is more prone to interference and unable to meet future demands,” he said. 


Lieberman also criticized the C-Band Alliance’s estimate that it can clear 300 megahertz of spectrum within 36 months of an FCC order, arguing that cable operators would have to do most of the work upgrading and reconfiguring their systems to use less spectrum. 


Jim Frownfelter, CEO of satellite operator ABS, rebutted Lieberman’s criticism, but didn’t advocate for the C-Band Alliance plan. 


Frownfelter, who was president of Panamsat, and then at Intelsat after the latter bought Panamsat in 2005, said the satellite industry undertook a similar hardware overhaul in the early 2000s when the proliferation of high-definition channels required massive infrastructure upgrades. 


“That effort is extensively more complicated than what we are talking about here,” he said. 


Frownfelter advocated for a private spectrum auction, but using a plan put forward by Bermuda-based ABS, Spanish operator Hispasat and Brazilian operator Embratel Star One. 


Frownfelter said the C-Band Alliance plan excludes the three regional operators because they didn’t generate any C-band revenue in 2017, despite having at least partial coverage of the United States. 


ABS received its FCC license to provide C-band services in the United States in 2017, meaning the company, which has global ambitions, didn’t have time to monetize its U.S. C-band coverage, he said. 


ABS, Hispasat and Star One each spent close to $250 million building and launching satellites designed, at least in part, to serve the U.S. market, Frownfelter said. Any spectrum auction, public or private, that squanders those investments would be unfair, he argued. 


All three operators “have invested a fortune in reliance on FCC rules and have done everything right,” Frownfelter said. “We hope that the FCC and Congress will do right by us too.”


Frownfelter said the regional operators’ proposal ensures all FCC-licensed satellite operators are compensated for any spectrum loss, ensures a multi-billion-dollar contribution to the U.S. treasury, and provides financial incentives to earth-station operators that could result in a faster clearing of spectrum than the C-Band Alliance plan. 


The evening before the hearing, the C-Band Alliance increased the amount of spectrum it said it could part with from 200 megahertz to 300 megahertz. The alliance also filed a letter with the FCC — co-signed by U.S. cellular network giants AT&T and Verizon, and smaller companies Pine Belt Wireless, Bluegrass Cellular and U.S. Cellular — highlighting “areas of consensus” on C-band. 


The C-Band Alliance and other signatories listed eight such areas of consensus covering auction details that they said should help guide any auction type, “regardless of the ultimate outcome of this proceeding.”









#Space | https://sciencespies.com/space/house-lawmakers-with-legislation-in-tow-push-for-public-c-band-auction/

Novel NRL instrument enhances ability to measure nuclear materials



Evan Groopman, a research physicist, prepares a microscopic uranium particle sample to be measured in the NAUTILUS instrument in the Accelerator Mass Spectrometry Laboratory at the U.S. Naval Research Laboratory in Washington, D.C., Sept. 5. Credit: U.S. Navy photo by Nicholas E. M. Pasquini

Researchers with the U.S. Naval Research Laboratory (NRL) designed and built an instrument called NAUTILUS to provide new measurement capabilities unlike those available at other laboratories to measure nuclear, cosmo/geo-chemical, and electronic materials.

At the end of last year, NRL participated in an international round-robin exercise, called the Nuclear Signatures Inter-laboratory Measurement Evaluation Program (NUSIMEP-9), sponsored by the European Commission (Nuclear Safety and Security Division) to measure microscopic particulate samples with "unknown" uranium isotope ratios.


"NRL recently received the final report from the international round-robin exercise and found that the Laboratory performed quite well, correctly identifying all of the "unknown" isotopic compositions," said David Willingham, a research chemist and head of the Accelerator Mass Spectrometry Section. "In this case, NRL used a globally unique mass spectrometer called NAUTILUS to perform these measurements, as part of the Accelerator Mass Spectrometry Section's participation in the NUSIMEP-9 sample analysis exercise."


The NUSIMEP-9 test samples were prepared to mimic environmental sampling/nuclear Safeguards missions, such as those performed by the International Atomic Energy Agency (IAEA). The exercise was conducted for the IAEA's Network of Analytical Laboratories (NWAL), of which NRL is not a member; however, NRL does collaborate with the laboratories to develop better uranium-bearing particle analyses.




Evan Groopman, a research physicist, inserts Nuclear Signatures Inter-laboratory Measurement Evaluation Program (NUSIMEP-9) uranium particles on a carbon planchette into the NAUTILUS to analyze the uranium isotopic composition in the Accelerator Mass Spectrometry Laboratory at the U.S. Naval Research Laboratory in Washington, D.C., Sept. 5. Credit: U.S. Navy photo by Nicholas E. M. Pasquini


The IAEA is responsible for deterring the proliferation of nuclear weapons by detecting early the misuse of nuclear material or technology, and by providing credible assurances that states are honoring their safeguards obligations. The analysis of nuclear material samples and environmental samples taken by IAEA inspectors is an essential component of this undertaking.


Twenty-two other laboratories were involved, many of which perform these types of measurements as their core mission (e.g., the IAEA), using instruments dedicated solely to such analyses.


"The NAUTILUS is much more flexible than this single type of measurement—we use it to analyze a wide variety of material compositions, including nuclear, electronic, and extraterrestrial materials," said Evan Groopman, a research physicist. "We are happy with the results of this exercise because it demonstrates that our up-and-coming group can both build a novel instrument for the Navy and apply it to a wide variety of problems, performing as well or better than laboratories that exclusively perform a single type of analysis using commercial instrumentation."




David Willingham and Evan Groopman’s research was featured on the front cover of the Nov. 21, 2018 issue of Analyst for their paper entitled “Direct, uncorrected, molecule-free analysis of 236U from uranium-bearing particles with NAUTILUS: a new kind of mass spectrometer.” Credit: Analyst


A key element of the safeguards system is the physical inspection of nuclear facilities by IAEA inspectors. States declare in considerable technical detail the types and quantities of nuclear material they possess. Among other verification measures, IAEA inspectors may take nuclear material samples from various points of the nuclear fuel cycle and collect environmental samples by swiping surfaces at various locations during the conduct of a verification activity.



These samples, which may be in solid, liquid, or gaseous form, are then subject to sophisticated analysis by IAEA scientists. The scientists focus on the isotopic make-up of uranium and plutonium contained in the samples, unaware of the country from which they were obtained. The analytical results provide a powerful tool for supporting conclusions as to the correctness and completeness of states' nuclear material declarations and help to inform the IAEA's evaluation of whether a state is complying with its safeguards obligations.


In carrying out this work, the IAEA laboratories coordinate and cooperate with a wider Network of Analytical Laboratories (NWAL), comprising an additional 18 laboratories located in nine different IAEA Member States. The Environment Sample Laboratory in Seibersdorf, Austria receives and screens all swipe samples but then shares the analytical workload with its NWAL partners.











Citation:
Novel NRL instrument enhances ability to measure nuclear materials (2019, October 31)
retrieved 31 October 2019
from https://phys.org/news/2019-10-nrl-instrument-ability-nuclear-materials.html











#Physics | https://sciencespies.com/physics/novel-nrl-instrument-enhances-ability-to-measure-nuclear-materials/

Researchers double sorghum grain yield to improve food supply



The left image shows the grains of a normal sorghum plant. The right image depicts how the amount of grains doubled in the genetic variant. Credit: Ware lab/CSHL, 2019

Plant scientists at Cold Spring Harbor Laboratory (CSHL) and USDA's Agricultural Research Service (ARS), in their search for solutions to global food production challenges, have doubled the amount of grains that a sorghum plant can yield.

Sorghum, one of the world's most important sources of food, animal feed, and biofuel, is considered a model crop for research because it has a high tolerance to drought, heat, and high-salt conditions. Increasing the grain yield has become even more important to plant breeders, farmers, and researchers as they try to address and overcome food security issues related to climate change, growing populations, and land and water shortages.


Led by Doreen Ware, CSHL Adjunct Professor and research scientist at the U.S. Department of Agriculture, and USDA colleague Zhanguo Xin, Ph.D, the research team identified novel genetic variations that occurred in sorghum's MSD2 gene, increasing the grain yield 200 percent. MSD2 is part of a gene line that boosts flower fertility by lowering the amount of jasmonic acid, a hormone that controls the development of seeds and flowers.


"When this hormone is decreased, you have a release of development that does not normally occur," said Nicholas Gladman, a postdoctoral fellow in Ware's lab and first author on the study, recently published in The International Journal of Molecular Sciences. "That allows for the full formation of the female sex organs in these flowers, which then allows for increased fertility"


MSD2 is regulated by MSD1, a gene discovered by Ware's team last year. Manipulating either gene increases seed and flower production.


"Major cereal crops are very close to each other evolutionarily. A lot of the that they share have similar functions," said Yinping Jiao, a postdoctoral associate in the Ware Lab and an author on the study. "This gene that plays an important role controlling the sorghum yield may also help us improve the yield of other crops like maize or rice."


Ware's lab uses this type of genetic research to understand how plants have changed over time.


"These genetic analyses actually give us the molecular mechanisms that provide more opportunities to engineer crops in the future," she said.







More information:
Gladman et. al, "Fertility of Pedicellate Spikelets in Sorghum is Controlled by a Jasmonic Acid Regulatory Module" appeared in The International Journal of Molecular Sciences on October 8, 2019.








Citation:
Researchers double sorghum grain yield to improve food supply (2019, October 30)
retrieved 30 October 2019
from https://phys.org/news/2019-10-sorghum-grain-yield-food.html











#Biology | https://sciencespies.com/biology/researchers-double-sorghum-grain-yield-to-improve-food-supply/

Wednesday, October 30, 2019

GAO report critical of Air Force space command and control program

GAO said the Air Force Space C2 program requires more oversight from the Defense Department.


WASHINGTON — The Government Accountability Office in a new report gave low marks to the Air Force’s Space Command and Control program, a long troubled effort to provide commanders with better tools to monitor what is happening in space.


The Space Command and Control program, or Space C2, is managed by the Air Force Space and Missile Systems Center. In a report released Oct. 30, GAO said the program suffers from disjointed management and requires more oversight from the Defense Department.


Space C2 is the name the Air Force gives to a collection of systems that are being developed to help predict attacks and avoid collisions in space. According to GAO, the program is facing “challenges and unknowns, from management issues to technical complexity.”


GAO’s review was requested by the House Armed Services Committee in the National Defense Authorization Act for Fiscal Year 2018. The committee has been critical of the program and recommended a $21.5 million cut to the Air Force’s $72.8 million request for fiscal year 2020.


The Air Force over the past three decades started three space command and control projects — and all ended significantly over budget and behind schedule, with key capabilities undelivered, GAO said. The Air Force’s latest effort uses agile software development methods to incrementally deliver updates.


GAO is not critical of the agile software approach but calls for greater oversight and metrics to track improvements. “Until the program develops a comprehensive acquisition strategy to more formally plan the program, it is too early to determine whether these efforts will help to ensure long-term program success,” says the report.


Kevin Fahey, assistant secretary of defense for acquisition, said in response to the report that DoD agreed with GAO’s critique. Fahey wrote in his response that the undersecretary of defense for acquisition and sustainment in a May 2019 memo directed the Air Force to provide an acquisition strategy for Space C2 by November 2019.









#Space | https://sciencespies.com/space/gao-report-critical-of-air-force-space-command-and-control-program/

MIT Engineers Develop Battery That Can Remove Carbon Dioxide From Air

More

A pair of MIT engineers just created a new way to efficiently remove carbon dioxide from the air. The system can be developed commercially at low cost and has a myriad of potential applications.


While there are alternative ways to remove carbon dioxide from gas streams, they typically work well only on carbon dioxide concentrations significantly higher than atmospheric levels, and they require lengthy chemical processing to ultimately separate out the captured carbon dioxide.


This approach uses a unique battery that can absorb carbon dioxide while it is charging up and release pure carbon dioxide when the battery is discharged.


The paper outlining the recent research was published in Energy & Environmental Science and spearheaded by two engineers from the MIT Chemical Engineering department.


The battery consists of a stack of thin electrode sheets coated with the compound polyanthraquinone. These electrodes have a natural affinity for carbon dioxide while the battery is charging and absorb the gas even at the roughly 400 ppm concentration found in the atmosphere.
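
To get a feel for how dilute 400 ppm is, the back-of-envelope calculation below (my own arithmetic, not from the paper; it simply assumes ideal-gas behavior at roughly room temperature and pressure) estimates how much air must be processed to capture a kilogram of CO2.

```python
# Rough estimate of CO2 content in ambient air at 400 ppm.
# Assumes ideal-gas behavior at ~25 C and 1 atm (molar volume ~24.45 L/mol);
# these conditions are an assumption, not a figure from the article.

MOLAR_VOLUME_L = 24.45      # liters of gas per mole at ~25 C, 1 atm
CO2_MOLAR_MASS = 44.01      # grams per mole
CO2_FRACTION = 400e-6       # 400 ppm by volume

mol_air_per_m3 = 1000.0 / MOLAR_VOLUME_L                 # ~40.9 mol of air per cubic meter
g_co2_per_m3 = mol_air_per_m3 * CO2_FRACTION * CO2_MOLAR_MASS

air_needed_m3 = 1000.0 / g_co2_per_m3                     # cubic meters of air per kg of CO2

print(f"CO2 in one cubic meter of air: {g_co2_per_m3:.2f} g")
print(f"Air needed to capture 1 kg of CO2: {air_needed_m3:,.0f} m^3")
```

Roughly 1,400 cubic meters of air carry just one kilogram of CO2, which is why an electrode material that binds the gas efficiently even at such dilution is the central claim here.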


When the battery is discharged, the electrodes lose their affinity for carbon dioxide and release a stream of pure carbon dioxide gas. The entire reaction can take place at room temperature and pressure.


There are a whole host of potential applications to this newly developed method. For instance, soft-drink bottling plants use carbon dioxide to carbonate their beverages, sometimes burning fossil fuels to get carbon dioxide. This method could essentially pull CO2 from the atmosphere and provide a pure source for beverage carbonation.


The batteries could also be used in pairs on power plant exhausts, with one battery charging while the other discharges, allowing for continuous capture of carbon dioxide.
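
As a purely conceptual illustration of that alternating operation, the toy Python sketch below runs two units out of phase so that one is always absorbing; the cycle time and per-cycle capacity are invented illustrative numbers, not values from the MIT work.

```python
# Conceptual toy model of two electro-swing units run out of phase, so that
# one is always absorbing CO2 while the other regenerates. Cycle time and
# per-cycle capacity are invented illustrative numbers, not from the study.

HALF_CYCLE_MIN = 30      # assumed minutes per charge (or discharge) step
CAPACITY_G = 500.0       # assumed grams of CO2 captured per charge step


def schedule(hours: float):
    """Yield (time_hr, unit_a_mode, unit_b_mode) with the two units out of phase."""
    steps = int(hours * 60 / HALF_CYCLE_MIN)
    for i in range(steps):
        charging = "charging (absorbing CO2)"
        discharging = "discharging (releasing CO2)"
        if i % 2 == 0:
            yield i * HALF_CYCLE_MIN / 60, charging, discharging
        else:
            yield i * HALF_CYCLE_MIN / 60, discharging, charging


captured_g = 0.0
for t, unit_a, unit_b in schedule(hours=4):
    # Exactly one unit is charging at any moment, so capture never pauses.
    captured_g += CAPACITY_G
    print(f"t = {t:4.1f} h | unit A: {unit_a:28} | unit B: {unit_b}")

print(f"CO2 captured over 4 hours: {captured_g / 1000:.1f} kg (with these assumed numbers)")
```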


That gas can then be compressed and injected underground, a process called carbon capture and storage. A growing method for removing carbon dioxide from the atmosphere is to pressurize it and inject it into depleted geologic formations, commonly old oil and gas fields whose porous rock can hold significant volumes of the gas. Alternatively, the gas could be used for industrial applications requiring carbon dioxide.


The researchers estimate that the technology could be produced at commercial scale in rolls, much like industrial paper, and that the end cost would be significantly lower than that of competing processes for removing carbon dioxide from the air.


"In my laboratories, we have been striving to develop new technologies to tackle a range of environmental issues that avoid the need for thermal energy sources, changes in system pressure, or addition of chemicals to complete the separation and release cycles," says T. Alan Hatton from the research team.






#News | https://sciencespies.com/news/mit-engineers-develop-battery-that-can-remove-carbon-dioxide-from-air/

Hold On to Your Lederhosen: Oktoberfest Produces a Whole Lot of Methane Gas


Ah, Oktoberfest. The annual festival draws some six million revelers to Munich, where the music is thumping, the sausage is sizzling and the beer (so, so much beer) is flowing. But all of these good times might not be so great for the environment. As Kai Kupferschmidt reports for Science, a new study has found that Oktoberfest releases considerable amounts of methane gas into the atmosphere.


While the celebration was taking place in 2018, a team of scientists scurried around the perimeter of the festival sampling the air. (They weren’t allowed to enter the festival area due to safety concerns, and one can only imagine the FOMO.) Taking into account wind speed and direction, they estimated that 1,500 kilograms (3,306 pounds) of methane were emitted during the 16-day party.


In a preprint paper, which is under review at the journal Atmospheric Chemistry and Physics, the researchers note that they were not aware of any other studies dealing with methane emissions from festivals. So they decided to compare emissions from Oktoberfest to those wafting out of Boston, which is, the study authors write, known to be “a very leaky city.” On average, Oktoberfest released 6.7 micrograms of methane per square meter per second, 10 times the average regional emission flux in Boston.
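
As a rough consistency check on those two figures (my own back-of-envelope arithmetic, not a calculation from the paper), the snippet below converts the 1,500 kilograms emitted over 16 days into an emission rate and asks what effective source area would reconcile it with the quoted flux; the study's actual footprint and averaging periods may well differ.

```python
# Back-of-envelope check that the two quoted Oktoberfest numbers are mutually
# consistent. Uses only figures from the article; the implied area is an
# inference, not a value reported by the study.

total_ch4_kg = 1500            # estimated emissions over the whole festival
festival_days = 16

seconds = festival_days * 24 * 3600
rate_ug_per_s = total_ch4_kg * 1e9 / seconds        # 1 kg = 1e9 micrograms

flux_ug_per_m2_s = 6.7                               # quoted average emission flux
implied_area_m2 = rate_ug_per_s / flux_ug_per_m2_s

print(f"Average emission rate: {rate_ug_per_s / 1e6:.2f} g of methane per second")
print(f"Implied source area:   {implied_area_m2 / 1e4:.0f} hectares "
      f"({implied_area_m2 / 1e6:.2f} km^2)")
```

That works out to an effective source area of roughly 16 hectares, the same order of magnitude as the festival grounds, so the headline total and the per-square-meter flux hang together.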


“Although it is difficult to compare the small and densely populated Oktoberfest premises with the entire city area of Boston,” the researchers acknowledge, “the comparison shows that the emission flux of Oktoberfest is significant.”


Methane is a greenhouse gas, the second most significant one after carbon dioxide. It doesn’t live for very long in the atmosphere, but it is highly effective at trapping radiation. “Per unit of mass, the impact of methane on climate change over 20 years is 84 times greater than [carbon dioxide]; over a 100-year period it is 28 times greater,” the Climate and Clean Air Coalition warns.
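
Applying those warming factors to the festival's estimated methane release is a one-line calculation; the sketch below (my arithmetic, using only the numbers quoted above) expresses the 1,500 kilograms of methane as a carbon dioxide equivalent over both horizons.

```python
# Convert the estimated Oktoberfest methane release into CO2-equivalent
# using the global warming potentials quoted above (84x over 20 years,
# 28x over 100 years, per unit of mass).

ch4_kg = 1500
gwp_20yr = 84
gwp_100yr = 28

co2e_20yr_t = ch4_kg * gwp_20yr / 1000       # tonnes of CO2-equivalent
co2e_100yr_t = ch4_kg * gwp_100yr / 1000

print(f"20-year horizon:  {co2e_20yr_t:.0f} tonnes CO2-equivalent")
print(f"100-year horizon: {co2e_100yr_t:.0f} tonnes CO2-equivalent")
```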


The amount of methane in the atmosphere has been on the rise since 2007, following a period of stability that began in the 1990s, reports Fred Pearce of Yale Environment 360. Pearce adds that researchers suspect the recent bump is being caused by “the activities of microbes in wetlands, rice paddies, and the guts of ruminants,” which are mammals like cattle, sheep and goats that have a unique digestive system. Oil and gas drilling, along with hydraulic fracturing (or “fracking”), also play a major role in leaking methane gas into the environment.


Previous research has looked at the ways in which large festivals contribute to the emissions of other air pollutants, like nitrogen oxide and polycyclic aromatic hydrocarbons, but the connection between festivals and methane emissions had not previously been studied, according to the authors of the new report.


The major culprit was likely incomplete combustion in natural gas-powered cooking and heating appliances. (And in case you were wondering, the digestive byproducts of too much beer and greasy food—burps and flatulence, in other words—were probably not responsible for a significant portion of Oktoberfest’s methane output.)


Granted, there are more serious environmental concerns associated with big festivals, like people travelling by plane to get to them. But festivals take place around the world, and they have been an overlooked source of significant methane emissions, Jia Chen, lead author of the study, notes in an interview with the Guardian’s Ian Sample. This doesn’t mean that Oktoberfest and other celebrations should be canceled—just that festival organizers should implement measures, like improving gas appliances, to curb methane emissions.


“Small steps,” Chen tells Sample, “can bring us closer to achieving the world climate goals.”







#News | https://sciencespies.com/news/hold-on-to-your-lederhosen-oktoberfest-produces-a-whole-lot-of-methane-gas/

Carbon bomb: Study says climate impact from loss of intact tropical forests grossly underreported



Carbon bomb: Study says climate impact from loss of intact tropical forests grossly underreported

Road for oil palm plantations in West Kalimantan, Indonesia. Credit: Rainforest Action Network
More

A new study in the journal Science Advances says that carbon impacts from the loss of intact tropical forests have been grossly underreported.

The study calculates new figures for intact tropical forest lost between 2000 and 2013 that show a staggering 626 percent increase in the long-term net carbon impacts through 2050. The revised total equals two years' worth of all global land-use change emissions.


The authors of the study, from WCS, the University of Queensland, the University of Oxford, the Zoological Society of London, the World Resources Institute, the University of Maryland, and the University of Northern British Columbia, found that direct clearance of intact forests accounted for just 3.2 percent of gross carbon emissions from all deforestation across the pan-tropics. However, when they applied full carbon accounting, which considers forgone carbon removals (the carbon sequestration that would have occurred each year into the future had cleared or degraded forest remained intact after 2000), selective logging, edge effects and declines of carbon-dense tree species due to overhunting of seed-dispersing animals, they found that the figure skyrocketed by a factor of more than six; a 626 percent increase is equivalent to multiplying the original estimate by roughly 7.3.


Said the study's lead author Sean Maxwell of WCS and the University of Queensland: "Our results revealed that continued destruction of intact tropical forests is a ticking time bomb for carbon emissions. There is an urgent need to safeguard these landscapes because they play an indispensable role in stabilizing the climate."


According to 2013 estimates, 549 million acres of intact tropical forests remain. Only 20 percent of tropical forests can be considered "intact," but those areas store some 40 percent of the above-ground carbon found in all tropical forests.


The authors say that intact forest retention rarely attracts funding from schemes designed to avoid land-use and land cover change emissions in developing nations.


Notably, the Reducing Emissions from Deforestation and Forest Degradation (REDD+) approach enables developing countries to receive financial incentives for enhancing carbon stocks, or for avoiding the loss of carbon that would otherwise be emitted through land-use and land cover change. Among other activities, REDD+ covers support for conservation of forests not under immediate threat, and it was formally adopted by parties to the United Nations Framework Convention on Climate Change in 2008 at the 14th Conference of the Parties in Poland. Since then, however, financial support and implementation have predominantly focused on areas with high historical rates of deforestation (i.e. 'deforestation frontiers'). This focus is widely believed to deliver more immediate and more clearly demonstrable emission reductions than conserving intact forest areas. The latter tend to be treated as negligible sources of emissions as a result of the short timescales and conservative assumptions under which REDD+ operates—assumptions which the present study suggests are causing key opportunities to be missed.



Said WCS's Tom Evans, a co-author of the study: "The relative value of retaining intact tropical forest areas increases if one takes a longer-term view and considers the likely state of the world's forests by mid-century—a milestone date in the Paris Agreement. Agricultural expansion, logging, infrastructure and fires reduced the global extent of intact forests by 7.2 percent between 2000 and 2013 alone, yet the eventual carbon emissions locked in by these losses have not been comprehensively estimated."


The authors go on to say that a comparable analysis is needed for intact forests outside the tropics, such as the boreal forests of Canada and Russia, given that approximately half to two-thirds of the carbon removals performed by Earth's intact ecosystems occur outside the tropics. Without this global clean-up service, CO2 from human activities would accumulate in the atmosphere markedly faster than it does at present.


Said co-author James Watson of WCS and the University of Queensland: "Clearly, the climate mitigation potential of retaining intact forests is significant, but without proactive conservation action by national governments, supported by the global community, this potential will continue to dwindle."


At least 35 percent of the intact forests studied are home to, and protected by, Indigenous Peoples. Intact forests also provide exceptional levels of many other environmental services—for example, they protect watersheds much better than degraded forests, return moisture to the air that later falls as rain in distant regions, and help to keep vast numbers of species safe from extinction. When compared to forests that have been degraded by large-scale human activities, intact forests are more resistant to shocks such as fire and drought and usually less accessible to logging and agricultural conversion, making them one of our best conservation bets in the face of a rapidly changing climate.




Explore further



Amazon deforestation has a significant impact on the local climate in Brazil



More information:
"Degradation and forgone removals increase the carbon impact of intact forest loss by 626%" Science Advances (2019). DOI: 10.1126/sciadv.aax2546 , https://advances.sciencemag.org/content/5/10/eaax2546









Citation:
Carbon bomb: Study says climate impact from loss of intact tropical forests grossly underreported (2019, October 30)
retrieved 30 October 2019
from https://phys.org/news/2019-10-carbon-climate-impact-loss-intact.html











#Environment | https://sciencespies.com/environment/carbon-bomb-study-says-climate-impact-from-loss-of-intact-tropical-forests-grossly-underreported/