The affected mangroves will be compensated at a ratio of 1:5 by depositing money with the mangroves cell, which will carry out the compensatory afforestation.
from "mangroves" - Google News https://ift.tt/2KNLcEH
via IFTTT
The affected mangroves will be compensated at the rate of 1:5 by depositing money to the mangroves cell, which will do the compensatory afforestation.
The Indian government is all set to procure E5 Series Shinkansen high-speed bullet trains from Japan for the Mumbai-Ahmedabad High Speed Rail corridor.
Hours after the National High Speed Rail Corp Ltd (NHSRCL), implementing the 508-km Mumbai-Ahmedabad bullet train project, said on Saturday that it has ...
After the redesign, only 3 ha of mangroves will be affected, compared with the earlier 12 ha.
The bullet train station in Thane will be reworked to save mangroves, according to the National High Speed Rail Corporation Limited (NHSRCL).
Under the new design, an estimated 32,044 mangroves may be affected, rather than 53,000.
Under the new design, an estimated 32,044 mangroves may be affected, an official said. By modifying the design of the Thane station, we have reduced the destruction ...
The National High Speed Rail Corporation Limited (NHSRCL) on Saturday said that the required wildlife, Coastal Regulation Zone (CRZ) and forest clearance has ...
Thousands of mangroves will be affected by the Mumbai-Ahmedabad bullet train project, the Maharashtra government has said. The 508-km line that would cost ...
Maharashtra Transport Minister Diwakar Raote had said on Monday in a reply to a question in the state Legislative Council that as many as 54000 mangroves ...
The destroyed mangroves will be compensated 1:5 by the mangroves cell through compensatory afforestation. The NHSRCL will deposit money into the ...
SINGAPORE - Bird enthusiasts and nature lovers will soon have more to enjoy in Pulau Ubin, where floating wetlands will be vastly expanded to provide more ...
NHSRCL, the implementing agency of the Railways' bullet train project, said Saturday that they have reworked the design of the station in Thane, Maharashtra, ...
The Centre has released guidelines for the preparation of a Coastal Zone Management Plan (CZMP) for states on the seaboard, including Maharashtra.
CEBU CITY — The Department of Environment and Natural Resources (DENR) in Central Visayas is investigating a local developer for cutting mangroves and ...
In yet another revision since 2018, the number of mangrove trees that will be lost due to the Mumbai-Ahmedabad high speed rail (bullet train) project has been ...
Navi Mumbai will lose another six acres of mangroves, and this time it is for developing infrastructure for the Navi Mumbai airport . Earlier, the project proponent, ...
Mumbai: Maharashtra Housing and Area Development Authority (Mhada) will refund the earnest money deposit (EMD) with interest to the mangroves affected ...
SARASOTA, Fla. — Hidden away off the shores of Sarasota lie tunnels filled with Florida wildlife. Here's what you need to know about the Lido Key ...
Mangrove Island has begun to erode fairly significantly over the past four or five decades.
Scottish scientists have launched an ambitious new mangrove forest conservation project in east Africa with backing from Hollywood actor Leonardo DiCaprio.
Embrace Qatar's heritage by traveling to the north and enjoying an eco-tourist kayaking adventure at Al-Thahkira's mangroves.
Maharashtra Transport Minister Diwakar Raote said today that as many as 54000 mangroves spread over 13.36 hectares will be affected because of the ...
The oceans are losing oxygen. Numerous studies based on direct measurements in recent years have shown this. Since water can dissolve less gas as temperatures rise, these results were not surprising. In addition to global warming, factors such as eutrophication of the coastal seas also contribute to the ongoing deoxygenation. Will the oceans become completely oxygen-depleted at some point in the future if global warming continues? Such anoxic phases have actually occurred several times in the Earth's history, combined with major mass extinction events. They were also accompanied by high carbon dioxide concentrations in the atmosphere and high global temperatures.
Today, scientists of the GEOMAR Helmholtz Centre for Ocean Research Kiel published model simulations in the international journal Nature Communications on the development of the oxygen content of the oceans up to the year 8000. In their scenario, they assume that a large part of the fossil resources will be burnt, that emissions will continue to rise until the end of the century and then decrease to zero by the year 2300. In the model, the planet heats up by a further 6 degrees, and temperatures remain at this high level until the end of the simulation.
The surprising result concerns the oxygen content of the ocean: After a further decrease over several hundred years, the oxygen inventory of the ocean rises again and even reaches a higher level than before industrialization in just under 4000 years. At first glance, it seems paradoxical that despite the expected further expansion of the already existing oxygen minimum zones in the world's oceans, the model yields an unexpected increase in oxygen as global temperatures rise.
It is known from investigations of the Kiel Collaborative Research Centre 754 that such oxygen-poor areas are death zones for larger organisms such as fish or cephalopods. However, certain bacteria that breathe nitrate instead of oxygen thrive there very well. "They draw their energy from a chemical process we call denitrification. It is an important component of the nitrogen cycle, which results in less oxygen being consumed during the respiration of organic material than was produced during photosynthesis," explains Professor Andreas Oschlies.
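As a point of reference, the textbook stoichiometry of denitrification (with organic matter idealized as CH2O; this is standard chemistry, not a reaction taken from the study itself) shows why nitrate respiration leaves dissolved oxygen untouched:

\[ 5\,\mathrm{CH_2O} + 4\,\mathrm{NO_3^-} + 4\,\mathrm{H^+} \;\rightarrow\; 5\,\mathrm{CO_2} + 2\,\mathrm{N_2} + 7\,\mathrm{H_2O} \]

whereas aerobic respiration, \(\mathrm{CH_2O} + \mathrm{O_2} \rightarrow \mathrm{CO_2} + \mathrm{H_2O}\), consumes one molecule of oxygen for every atom of carbon respired.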
In the new model simulation, the researchers have for the first time consistently coupled the oxygen cycle with the nitrogen cycle in such long-term global simulations. The researchers found that due to the extended oxygen minimum zones, more and more organic material is no longer respired with oxygen but with nitrate through denitrification. After several thousand years, the associated oxygen savings exceed the oxygen loss of the oceans caused by warming. "However, we cannot speak of a recovery, since the extensive oxygen minimum zones near the sea surface would stay. A large part of the additional oxygen goes into the deep ocean," says Angela Landolfi, co-author of the study.
However, there is a new problem: the anoxic phases that have occurred in Earth's history during warm climatic conditions are even more difficult to explain with the new findings. There are obviously factors and feedback processes in the complex interactions of biological, physical and chemical processes in the ocean that are not yet fully understood. "This is why the study is also important for the present. It points to knowledge gaps, such as the interaction of denitrification and nitrogen fixation, that can also be relevant for ongoing ocean changes," says Andreas Oschlies, summarising the significance of the study.
Story Source:
Materials provided by Helmholtz Centre for Ocean Research Kiel (GEOMAR).
In a 600-ft.-long saltwater wave tank on the coast of New Jersey, a team of New Jersey Institute of Technology (NJIT) researchers is conducting the largest-ever simulation of the Deepwater Horizon spill to determine more precisely where hundreds of thousands of gallons of oil dispersed following the drilling rig's explosion in the Gulf of Mexico in 2010.
Led by Michel Boufadel, director of NJIT's Center for Natural Resources (CNR), the initial phase of the experiment involved releasing several thousand gallons of oil from a one-inch pipe dragged along the bottom of the tank in order to reproduce ocean current conditions.
"The facility at Ohmsett allows us to simulate as closely as possible the conditions at sea, and to thus observe how droplets of oil formed and the direction and distance they traveled," Boufadel said.
Later this summer, his team will conduct the second phase of the experiment, when they will apply dispersants to the oil as it shoots into the tank to observe the effects on droplet formation and trajectory.
His team's research, conducted at the U.S. Department of the Interior's Ohmsett facility at Naval Weapons Station Earle in Leonardo, N.J., was detailed in a recent article, "The perplexing physics of oil dispersants," in the Proceedings of the National Academy of Sciences (PNAS).
"These experiments are the largest ever conducted by a university in terms of the volume of oil released and the scale," he noted. "The data we obtained, which has not been published yet, is being used by other researchers to calibrate their models."
The team expects to come away from these experiments with insights they can apply to a variety of ocean-based oil releases.
"Rather than limiting ourselves to a forensic investigation of the Deepwater Horizon release, we are using that spill to explore spill scenarios more generally," Boufadel said. "Our goal is not to prepare for the previous spill, but to broaden the horizons to explore various scenarios."
More than nine years after the Deepwater Horizon drilling rig exploded, sending up to 900,000 tons of oil and natural gas into the Gulf of Mexico, lingering questions remain about the safety and effectiveness of a key element of the emergency response: injecting chemicals a mile below the ocean surface to break up oil spewing from the ruptured sub-sea wellhead and prevent it from reaching environmentally sensitive regions.
To date, spill cleanups have focused primarily on removing or dispersing oil on the ocean surface and shoreline, habitats deemed more important ecologically. Knowledge of the deep ocean is in general far murkier, and at the time of the accident, BP's drilling operation was the deepest in the world.
Two years ago, Boufadel and collaborators from the Woods Hole Oceanographic Institution, NJIT, Texas A&M University and the Swiss Federal Institute of Aquatic Science and Technology pooled their scientific and technical expertise to provide some of the first answers to these controversial policy questions.
The team began by developing physical models and computer simulations to determine the course the oil and gas took following the eruption, including the fraction of larger, more buoyant droplets that floated to the surface and the amount of smaller droplets entrapped deep below it due to sea stratification and currents. Boufadel and Lin Zhao, a postdoctoral researcher in the CNR, developed a model that predicted the size of droplets and gas bubbles emanating from the wellhead during the sub-surface blowout; they then factored water pressure, temperature and oil properties into the model, and employed it to analyze the effects of the injected dispersants on this stream.
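To give a flavour of the kind of relationship such a model captures, the sketch below uses a generic Weber-number scaling for the median droplet size produced by a turbulent oil jet. It is an illustration only, not the NJIT team's model: the coefficient and all input values are hypothetical placeholders.

```python
# Hedged illustration only: a generic Weber-number scaling for the median droplet
# size of oil jetted from an orifice, NOT the model described in the article.
# The coefficient a_coeff and all input values below are hypothetical placeholders.

def median_droplet_size(diameter_m, velocity_ms, oil_density, interfacial_tension, a_coeff=15.0):
    """Estimate the median droplet diameter d50 from a turbulent oil jet.

    Uses the empirical scaling d50/D = a_coeff * We**(-3/5), where
    We = rho * U^2 * D / sigma is the jet Weber number.
    """
    weber = oil_density * velocity_ms**2 * diameter_m / interfacial_tension
    return a_coeff * weber**(-0.6) * diameter_m

# Example with placeholder values: a 1-inch (0.0254 m) release pipe, a 2 m/s jet,
# a light crude, and an untreated oil-water interfacial tension.
d50 = median_droplet_size(diameter_m=0.0254, velocity_ms=2.0,
                          oil_density=860.0, interfacial_tension=0.02)
print(f"median droplet diameter ~ {d50 * 1000:.2f} mm")
# Dispersants lower the interfacial tension, which raises We and shrinks d50 --
# the effect the second experimental phase is designed to measure.
```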
"Among other tests of our model, we studied the hydrodynamics of various plumes of oil jetting into different wave tanks," Zhao noted. Researchers at Texas A&M in turn created a model to study the movement of pollutants away from the wellhead.
The researchers determined that the use of dispersants had a substantial impact on air quality in the region of the spill by reducing the amount of toxic compounds such as benzene that reached the surface of the ocean, thus protecting emergency workers on the scene from the full brunt of the pollution. Their study was published in PNAS.
"Government and industry responders were faced with an oil spill of unprecedented size and sea depth, pitting them in a high-stakes battle against big unknowns," Christopher Reddy, a senior scientist at Woods Hole Oceanographic Institution, and Samuel Arey, a senior researcher at the Swiss Federal Institute of Aquatic Science and Technology, wrote in Oceanus magazine.
"Environmental risks posed by deep-sea petroleum releases are difficult to predict and assess due to the lack of prior investigations," Boufadel noted. "There is also a larger debate about the impact of chemical dispersants. There is a school of thought that says all of the oil should be removed mechanically."
Boufadel added that the water-soluble and volatile compounds that did not reach the surface were entrapped in a water mass that formed a stable intrusion at 900 to 1,300 meters below the surface.
"These predictions depend on local weather conditions that can vary from day to day. However, we predict that clean-up delays would have been much more frequent if subsurface dispersant injection had not been applied," Reddy and Arey said, adding, "But this is not the final say on the usage of dispersants."
The current experiment is an attempt to provide more definitive answers.
A Kenyan court has halted construction of the country's first coal-fired power station on environmental grounds in a blow for the $2bn project's Chinese backers ...
Take a kayak journey through hidden tunnels filled with Florida wildlife.
Last fortnight, mid-day's ground report on five Uran villages fearing flooding is followed by news of 54000 mangroves threatened by the bullet train; Mumbai ...
Best visited from March to October, now is the perfect time to pack your bags and discover this true Indonesian highlight.
Congress leader Milind M. Deora warned that the government's plans to chop off mangroves for the Bullet Train project could be disastrous for Mumbai's survival ...
Anantara Eastern Mangroves has partnered with the Department of Culture and Tourism Abu Dhabi (DCT) and Experience Hub to promote the UAE capital over ...
Major nullahs in Vasai-Virar region have been taken up for cleaning for the first time in a decade, to enable flow of rain water into creeks and avoi.
Rajahmundry: An effort to get Godavari mangroves a world heritage site status has begun with the state government issuing an order constituting an expert ...
The 35th annual Creole Classic Fishing Tournament held over the weekend at Bridge Side Marina in Grand Isle raised over $20000 for local causes that ...
Just like the temperatures of late, area fishing remains hot. Terry Serigny with Terry's Live Bait in Leeville said anglers have been hooking on to plenty of nice ...
NAIROBI (Reuters) - A Kenyan environmental tribunal delayed a license for a planned coal power plant in the coastal Lamu region on Wednesday, saying the ...
New Delhi: Maharashtra will soon lose over 13.36 hectares of mangroves. According to a Hindustan Times report, a minimum of 54,000 mangrove trees will be ...
Mumbai– Former Union minister and Mumbai Congress President Milind M. Deora on Wednesday warned that the government's plans to chop off mangroves for ...
The Maharashtra Transport Minister has revealed that 54,000 mangroves will be razed to make way for the construction.
Motorists heading to Gujarat via the Mumbai-Ahmedabad national highway can expect a smooth ride in the near future as work on the four-lane Varsova c.
A man disappeared after rowing his boat to a mangrove area along the coast of Phang Nga's Takua Thung district yesterday. Surat Sumalee, head ...
Sixty-one per cent of all mangrove trees in the country are found in Lamu.
Longtime residents of Grand Cayman will already know how drastically the island has changed in recent decades, as construction and development have ...
Govt. plans to plant five trees for each one cut; 1379 hectares need to be acquired.
Coral reefs face many challenges to their survival, including the global acidification of seawater as a result of rising carbon dioxide levels in the atmosphere. A new study led by scientists at UC Santa Cruz shows that at least three Caribbean coral species can survive and grow under conditions of ocean acidification more severe than those expected to occur during this century, although the density of their skeletons was lower than normal.
The study took advantage of the unusual seawater chemistry found naturally at sites along the Caribbean coastline of Mexico's Yucatan Peninsula, where water discharging from submarine springs has lower pH than the surrounding seawater, with reduced availability of the carbonate ions corals need to build their calcium carbonate skeletons.
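For context, carbonate chemistry textbooks summarize how easily organisms can precipitate calcium carbonate with the aragonite saturation state (a standard relation, not an analysis from this paper):

\[ \Omega_{\mathrm{arag}} = \frac{[\mathrm{Ca^{2+}}]\,[\mathrm{CO_3^{2-}}]}{K^{*}_{\mathrm{sp}}} \]

Precipitation is thermodynamically favored when \(\Omega > 1\); as lower pH reduces the carbonate ion concentration, \(\Omega\) falls and building a skeleton becomes energetically more expensive.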
In a two-year field experiment, the international team of researchers transplanted genetically identical fragments of three species of corals to a site affected by the springs and to a nearby control site not influenced by the springs, and then monitored the survival, growth rates, and other physiological traits of the transplants. They reported their findings in a paper published June 26 in Proceedings of the Royal Society B.
"The good news is the corals can survive and deposit calcium carbonate, but the density of their skeletons is reduced, which means the framework of the reef would be less robust and might be more susceptible to storm damage and bioerosion," said Adina Paytan, a research professor at UCSC's Institute of Marine Sciences and corresponding author of the paper.
Of the three species tested, the one that performed best in the low-pH conditions was Siderastrea siderea, commonly known as massive starlet coral, a slow-growing species that forms large dome-shaped structures. Another slow-growing dome-shaped species, Porites astreoides (mustard hill coral), did almost as well, although its survival rate was 20 percent lower. Both of these species outperformed the fast-growing branching coral Porites porites (finger coral).
Coauthor Donald Potts, professor of ecology and evolutionary biology at UC Santa Cruz, said the transplanted species are all widespread throughout the Caribbean. "The slow-growing, dome-shaped corals tend to be more tolerant of extreme conditions, and they are important in building up the permanent structure of the reef," he said. "We found that they have the potential for persistence in acidified conditions."
Corals will have to cope with more than ocean acidification, however. The increasing carbon dioxide level in the atmosphere is also driving climate change, resulting in warmer ocean temperatures and rising sea levels. Unusually warm temperatures can disrupt the symbiosis between coral polyps and the algae that live in them, leading to coral bleaching. And rapidly rising sea levels could leave slow-growing corals at depths where they would die from insufficient sunlight.
Nevertheless, Potts noted that several species of Caribbean corals have long fossil records showing that they have persisted through major changes in Earth's history. "These are species with a history of survival and tolerance," he said.
He added that both S. siderea and P. astreoides had higher chlorophyll concentrations at the low-pH site, indicating that their algal symbionts were responding positively and potentially increasing the energy resources available to the corals for resisting stress.
Both of the slow-growing species that did well under acidified conditions have internal fertilization and brood their larvae, so that their offspring have the potential to settle immediately in the same area, Potts said. "This means there is potential for local genetic adaptation over successive generations to changing environmental conditions," he said.
The authors also noted that the differences among coral species in survival and calcification under acidified conditions could be useful information for reef restoration efforts and perhaps even for efforts to genetically modify corals to give them greater stress tolerance.
Paytan said she remains "cautiously optimistic," despite the many threats facing coral reefs worldwide.
"These corals are more robust than we thought," she said. "They have the potential to persist with ocean acidification, but it costs them energy to cope with it, so we have to do all we can to reduce other stressors, such as nutrient pollution and sedimentation."
Paytan and Potts said the collaboration with Mexican researchers was essential to the success of the project, enabling frequent monitoring of the transplanted corals throughout the two-year experiment.
Researchers have discovered 56 previously uncharted subglacial lakes beneath the Greenland Ice Sheet, bringing the total known number of lakes to 60.
Although these lakes are typically smaller than similar lakes in Antarctica, their discovery demonstrates that lakes beneath the Greenland Ice Sheet are much more common than previously thought.
The Greenland Ice Sheet covers an area approximately seven times the size of the UK, is in places more than three kilometres thick and currently plays an important role in rising global sea levels.
Subglacial lakes are bodies of water that form beneath ice masses. Meltwater is derived from the pressure of the thick overlying ice, heat generated by the flow of the ice, geothermal heat retained in the Earth, or water on the surface of the ice that drains to the bed. This water can become trapped in depressions or due to variations in ice thickness.
Knowledge of these new lakes helps form a much fuller picture of where water occurs and how it drains under the ice sheet, which influences how the ice sheet will likely respond dynamically to rising temperatures.
Published in Nature Communications this week, their paper, "Distribution and dynamics of Greenland subglacial lakes," provides the first ice-sheet wide inventory of subglacial lakes beneath the Greenland Ice Sheet.
By analysing more than 500,000 km of airborne radio echo sounding data, which provide images of the bed of the Greenland Ice Sheet, researchers from the Universities of Lancaster, Sheffield and Stanford identified 54 subglacial lakes, as well as a further two using ice-surface elevation changes.
Lead author Jade Bowling of the Lancaster Environment Centre, Lancaster University, said:
"Researchers have a good understanding of Antarctic subglacial lakes, which can fill and drain and cause overlying ice to flow quicker. However, until now little was known about subglacial lake distribution and behaviour beneath the Greenland Ice Sheet.
"This study has for the first time allowed us to start to build up a picture of where lakes form under the Greenland Ice Sheet. This is important for determining their influence on the wider subglacial hydrological system and ice-flow dynamics, and improving our understanding of the ice sheet's basal thermal state."
The newly discovered lakes range from 0.2 to 5.9 km in length. The majority were found beneath relatively slow-moving ice, away from the largely frozen bed of the ice sheet interior, and appear to be relatively stable.
However, in the future as the climate warms, surface meltwater will form lakes and streams at higher elevations on the ice sheet surface, and the drainage of this water to the bed could cause these subglacial lakes to drain and therefore become active. Closer to the margin where water already regularly gets to the bed, the researchers saw some evidence for lake activity, with two new subglacial lakes observed to drain and then refill.
Dr Stephen J. Livingstone, Senior Lecturer in Physical Geography, University of Sheffield, said:
"The lakes we have identified tend to cluster in eastern Greenland where the bed is rough and can therefore readily trap and store meltwater and in northern Greenland, where we suggest the lakes indicate a patchwork of frozen and thawed bed conditions.
"These lakes could provide important targets for direct exploration to look for evidence of extreme life and to sample the sediments deposited in the lake that preserve a record of environmental change."
Story Source:
Materials provided by Lancaster University.
New research led by climate scientists from the University of Bristol suggests that the representation of clouds in climate models is as, or more, important than the amount of greenhouse gas emissions when it comes to projecting future Greenland ice sheet melt.
Recent research shows that the whole of the Greenland ice sheet could be gone within the next thousand years, raising global sea level by more than seven metres.
However, most of the predictions about the future of the Greenland ice sheet focus on the impact of different greenhouse gas emission scenarios on its evolution and sea level commitment.
New research published today in the journal Nature Climate Change shows that in a warming world, cloud microphysics play as important a role as greenhouse gases and, for high emission scenarios, dominate the uncertainties in projecting the future melting of the ice sheet.
The difference in potential melt caused by clouds stems mainly from their ability to control the longwave radiation at the surface of the ice sheet.
They act like a blanket. The highest melt simulation has the thickest blanket (thickest clouds) with strongest warming at the surface which leads to two times more melt.
Conversely, the lower end melt simulation has the thinnest blanket (thinnest clouds) which in turn leads to less longwave warming at the surface and less melt over Greenland.
The uncertainty in Greenland Ice Sheet melt due to clouds could, by the end of the 21st century, amount to 40,000 gigatons of extra ice melt. This is equivalent to 1,500 years of domestic water supply for the USA and 11 cm of global sea level rise.
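The 11 cm figure is consistent with the common rule of thumb that roughly 362 gigatons of meltwater raise global mean sea level by about 1 mm (ocean area of roughly 3.62 x 10^8 km^2). The quick check below uses that approximation, not anything published with the study.

```python
# Back-of-envelope check of the article's numbers, using the common approximation
# that ~362 Gt of meltwater corresponds to ~1 mm of global mean sea level rise.
# This conversion is a standard rule of thumb, not a figure from the paper itself.

GT_PER_MM_SLR = 362.0      # gigatons of water per millimetre of sea level rise
extra_melt_gt = 40_000.0   # cloud-related spread in cumulative melt quoted above

slr_mm = extra_melt_gt / GT_PER_MM_SLR
print(f"~{slr_mm:.0f} mm, i.e. about {slr_mm / 10:.0f} cm of sea level rise")
# -> ~110 mm, about 11 cm, matching the figure quoted in the article
```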
PhD student, Stefan Hofer, from the University of Bristol's School of Geographical Sciences and member of the Black and Bloom and Global Mass projects, is the lead author of the new study.
He said: "Until now we thought that differences in modelled projections of the future evolution of the Greenland Ice Sheet were mainly determined by the amount of our future greenhouse gas emissions.
"However, our study clearly shows that the uncertainties in our predictions of Greenland melt are equally dependent on how we represent clouds in those models.
"Until the end of the 21st century, clouds can increase or decrease the sea level rise coming from Greenland Ice Sheet by 11 cm."
The main message of the paper is that clouds are the principal source of uncertainties in modelling future Greenland melt and consequent sea level contribution.
Ten percent of the global population live in coastal areas threatened by global sea level rise. Constraining the cloud-related uncertainties in sea level rise predictions is therefore essential for more accurate mitigation planning.
Stefan Hofer added: "Observations of cloud properties in the Arctic are expensive and can be challenging.
"There are only a handful of long-term observations of cloud properties in the Arctic which makes it very challenging to constrain cloud properties in our climate models.
"The logical next step would be to increase the amount of long-term observations of cloud properties in the Arctic, which then can be used to improve our climate models and predictions of future sea level rise."
Story Source:
Materials provided by University of Bristol.
For the past 20 years, officials from the U.S. Navy and leaders in the shipbuilding industry have convened on MIT’s campus each spring for the MIT Ship Design and Technology Symposium. The daylong event is a platform to update industry and military leaders on the latest groundbreaking research in naval construction and engineering being conducted at MIT.
The main event of the symposium was the design project presentations given by Course 2N (Naval Construction and Engineering) graduate students. These projects serve as a capstone of their three-year curriculum.
This year, recent graduate Andrew Freeman MEng '19, SM '19, who was advised by Dick K. P. Yue, the Philip J. Solondz Professor of Engineering, and William Taft MEng '19, SM '19, who works with James Kirtley, professor of electrical engineering and computer science, presented their current research. Rear Admiral Ronald A. Boxall, director of surface warfare at the U.S. Navy, served as keynote speaker at the event, which took place in May.
“The Ship Design and Technology Symposium gives students in the 2N program the opportunity to present ship and submarine design and conversions, as well as thesis research, to the leaders of the U.S. Navy and design teams from industry,” explains Joe Harbour, professor of the practice of naval construction at MIT. “Through the formal presentation and poster sessions, the naval and industrial leaders can better understand opportunities to improve designs and design processes.”
Since 1901, the Course 2N program has been educating active-duty officers in the Navy and U.S. Coast Guard, in addition to foreign naval officers. This year, eight groups of 2N students presented design or conversion project briefs to an audience of experts in the Samberg Conference Center.
The following three projects exemplify the ways in which these students are adapting existing naval designs and creating novel designs that can help increase the capabilities and efficiency of naval vessels.
The next generation of hospital ships
The Navy has a fleet of hospital ships ready for any major combat situations that might arise. These floating hospitals allow doctors to care for large numbers of casualties, perform operations, stabilize patients, and help transfer patients to other medical facilities.
Lately, these ships have been instrumental in response efforts during major disasters — such as the recent hurricanes in the Caribbean. The ships also provide an opportunity for doctors to train local medical professionals in developing countries.
The Navy's current fleet of hospital ships is aging. Designed in the 1980s, these ships require an update to complement the way naval operations are conducted in modern times. As such, the U.S. Navy is looking to launch the next fleet of hospital ships in 2035.
A team of Course 2N students including Aaron Sponseller, Travis Rapp, and Robert Carelli was tasked with assessing current hospital ship designs and proposing a design for the next generation of hospital ships.
“We looked at several different hull form sizes that could achieve the goals of our sponsors, and assigned scores to rank their attributes and determine which one could best achieve their intended mission,” explains Carelli.
In addition to visiting the USNS Mercy, one of the Navy's current hospital ships, the team toured nearby Tufts Medical Center to get a sense of what a state-of-the-art medical facility looked like. One thing that immediately struck the team was how different the electrical needs of a modern-day medical facility are from the needs nearly 40 years ago, when the medical ships were first being designed.
“Part of the problem with the current ships is they scaled their electrical capacity with older equipment from the 1980s in mind,” adds Rapp. This capacity doesn’t account for the increased electrical burden of digital CT scans, high-tech medical devices, and communication suites.
The current ships have a separate propulsion plant and electrical generation plant. The team found that combining the two would increase the ship’s electrical capacity, especially while "on station" — a term used when a ship maintains its position in the water.
“These ships spend a lot of time on station while doctors operate on patients,” explains Carelli. “By using the same system for propelling and electrical generation, you have a lot more capacity for these medical operations when it’s on station and for speed when the ship is moving.”
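To make the on-station argument concrete, here is a toy power budget for an integrated plant. Every number below is a hypothetical placeholder, not a figure from the students' design.

```python
# Toy illustration of the integrated-plant argument, with purely hypothetical
# numbers (not design values from the student project): when generation and
# propulsion share a common electrical bus, power not needed by the propellers
# can be redirected to hospital loads while the ship holds station.

TOTAL_GENERATION_MW = 30.0  # hypothetical installed generating capacity

def power_available_for_hospital(propulsion_demand_mw, hotel_load_mw=5.0):
    """Electrical margin left for medical spaces after propulsion and basic hotel loads."""
    return TOTAL_GENERATION_MW - propulsion_demand_mw - hotel_load_mw

print(power_available_for_hospital(propulsion_demand_mw=20.0))  # transiting: 5.0 MW margin
print(power_available_for_hospital(propulsion_demand_mw=1.0))   # on station: 24.0 MW margin
```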
The team also recommended that the ship be downsized and tailored to treat intensive care cases rather than having such large stable patient areas. “We trimmed the fat, so to speak, and are moving the ship toward what really delivers value — intensive care capability for combat operations,” says Rapp.
The team hopes their project will inform the decisions the Navy makes when they do replace large hospital ships in 2035. “The Navy goes through multiple iterations of defining how they want their next ship to be designed and we are one small step in that process,” adds Sponseller.
Autonomous fishing vessels
Over the past few decades, advances in artificial intelligence and sensory hardware have led to increasingly sophisticated unmanned vehicles in the water. Sleek autonomous underwater vehicles operate below the water’s surface. Rather than work on these complex and often expensive machines, Course 2N students Jason Barker, David Baxter, and Brian Stanfield assessed the possibility of using something far more commonplace for their design project: fishing vessels.
“We were charged with looking at the possibility of going into a port, acquiring a low-end vessel like a fishing boat, and making that boat an autonomous machine for various missions,” explains Barker.
With such a broad scope, Barker and his teammates set some parameters to guide their research. They homed in on one fishing boat in particular: a 44-foot, four-drum seiner.
The next step was determining how such a vessel could be outfitted with sensors to carry out a range of missions including measuring marine life, monitoring marine traffic in a given area, carrying out intelligence, surveillance and reconnaissance (ISR) missions, and, perhaps most importantly, conducting search and rescue operations.
The team estimated that the cost of transforming an everyday fishing boat into an autonomous vehicle would be roughly $2 million — substantially lower than building a new autonomous vehicle. The relatively low cost could make this an appealing exercise in areas where piracy is a potential concern. “Because the price of entry is so low, it’s not as risky as using a capital asset in these areas,” Barker explains.
The low price could also lead to a number of such autonomous vehicles in a given area. “You could put out a lot of these vessels,” adds Barker. “With the advances of swarm technologies you could create a network or grid of autonomous boats.”
Increasing endurance and efficiency in Freedom-class ships
For Course 2N student Charles Hasenbank, working on a conversion project for the engineering plant of Freedom-class ships was a natural fit. As a lieutenant in the U.S. Navy, Hasenbank served on the USS Freedom.
Freedom-class ships can reach upwards of 40 knots, 10 knots faster than most combat ships. “To get those extra knots requires a substantial amount of power,” explains Hasenbank. This power is generated by two diesel engines and two gas turbines that are also used to power large aircraft like the Dreamliner.
For their new frigate program, the Navy is looking to achieve a maximum speed of 30 knots, making the extra power provided by these engines unnecessary. The endurance range of these new frigates, however, would be higher than what the current Freedom-class ships allow. As such, Hasenbank and his fellow students Tikhon Ruggles and Cody White were tasked with exploring alternate forms of propulsion.
The team had five driving criteria in determining how to best convert the ships’ power system — minimize weight changes, increase efficiency, maintain or decrease acquisition costs, increase simplicity, and improve fleet commonality.
“The current design is a very capable platform, but the efficiencies aren’t there because speed was a driving factor,” explains Hasenbank.
When redesigning the engineering plant, the team landed on the use of four propellers, which would maintain the amount of draft currently experienced by these ships. To accommodate this change, the structure of the stern would need to be altered.
By removing a step currently in the stern design, the team made an unexpected discovery. Above 12 knots, their stern design would decrease hull resistance. “Something we didn’t initially expect was we improved efficiency and gained endurance through decreasing the hull resistance,” adds Hasenbank. “That was a nice surprise along the way.”
The team’s new design would be able to meet the 30 knot speed requirement of the new frigate program and it would add anywhere between 500 and 1,000 nautical miles of endurance to the ship.
Along with the other design projects presented at the MIT Ship Design and Technology Symposium, the work conducted by Hasenbank and his team could inform important decisions the U.S. Navy has to make in the coming years as it looks to update and modernize its fleet.
Increased solar radiation penetrating through the damaged ozone layer is interacting with the changing climate, and the consequences are rippling through the Earth's natural systems, affecting everything from weather to the health and abundance of sea mammals like seals and penguins. These findings were detailed in a review article published today in Nature Sustainability by members of the United Nations Environment Programme's Environmental Effects Assessment Panel, which informs parties to the Montreal Protocol.
"What we're seeing is that ozone changes have shifted temperature and precipitation patterns in the southern hemisphere, and that's altering where the algae in the ocean are, which is altering where the fish are, and where the walruses and seals are, so we're seeing many changes in the food web," said Kevin Rose, a researcher at Rensselaer Polytechnic Institute who serves on the panel and is a co-author of the review article.
The 1987 Montreal Protocol on Substances that Deplete the Ozone Layer -- the first multilateral environmental agreement to be ratified by all member nations of the United Nations -- was designed to protect Earth's main filter for solar ultraviolet radiation by phasing out production of harmful humanmade substances, such as the chlorofluorocarbons class of refrigerants. The treaty has largely been considered a success, with global mean total ozone projected to recover to pre-1980 levels by the middle of the 21st century. Earlier this year, however, researchers reported detecting new emissions of ozone depleting substances emanating from East Asia, which could threaten ozone recovery.
While ozone depletion has long been known to increase harmful UV radiation at the Earth's surface, its effect on climate has only recently become evident. The report points to the Southern Hemisphere, where a hole in the ozone layer above Antarctica has pushed the Antarctic Oscillation -- the north-south movement of a wind belt that circles the Southern Hemisphere -- further south than it has been in roughly a thousand years. The movement of the Antarctic Oscillation is in turn directly contributing to climate change in the Southern Hemisphere.
As climate zones have shifted southward, rainfall patterns, sea-surface temperatures, and ocean currents across large areas of the southern hemisphere have also shifted, impacting terrestrial and aquatic ecosystems. The effects can be seen in Australia, New Zealand, Antarctica, South America, Africa, and the Southern Ocean.
In the oceans, for example, some areas have become cooler and more productive, while other areas have become warmer and less productive.
Warmer oceans are linked to declines in Tasmanian kelp beds and Brazilian coral reefs, and the ecosystems that rely on them. Cooler waters have benefitted some populations of penguins, seabirds, and seals, who profit from greater populations of krill and fish. One study reported that female albatrosses may have become a kilogram heavier in certain areas because of the more productive cooler waters linked to ozone depletion.
Rose also pointed to subtler feedback loops between climate and UV radiation described in the report. For example, higher concentrations of carbon dioxide have led to more acidic oceans, which reduces the thickness of calcified shells, rendering shellfish more vulnerable to UV radiation. Even humans, he said, are likely to wear lighter clothes in a warmer atmosphere, making themselves more susceptible to damaging UV rays.
The report found that climate change may also be affecting the ozone layer and how quickly the ozone layer is recovering.
"Greenhouse gas emissions trap more heat in the lower atmosphere which leads to a cooling of the upper atmosphere. Those colder temperatures in the upper atmosphere are slowing the recovery of the ozone layer," Rose said.
As one of three scientific panels to support the Montreal Protocol, the Environmental Effects Assessment Panel focused in particular on the effects of UV radiation, climate change, and ozone depletion. Thirty-nine researchers contributed to the article, which is titled "Ozone depletion, ultraviolet radiation, climate change and prospects for a sustainable future." Rose, an aquatic ecologist, serves on the aquatic ecosystems working group, which is one of seven working groups that are part of the panel.
"This international collaboration focusing on a pressing problem of global significance exemplifies the research vision of The New Polytechnic at Rensselaer," said Curt Breneman, dean of the Rensselaer School of Science."
Swimming in the ocean alters the skin microbiome and may increase the likelihood of infection, according to research presented at ASM Microbe 2019, the annual meeting of the American Society for Microbiology.
"Our data demonstrate for the first time that ocean water exposure can alter the diversity and composition of the human skin microbiome," said Marisa Chattman Nielsen, MS, a PhD student at the University of California, Irvine, the lead author on the study. While swimming normal resident bacteria were washed off while ocean bacteria were deposited onto the skin."
The researchers detected ocean bacteria on all participants after air drying and at six and 24 hours post-swim, but some participants had acquired more ocean bacteria and/or had them persist for longer.
The research was motivated by previous studies which have shown associations between ocean swimming and infections, and by the high prevalence of poor water quality at many beaches, due to wastewater and storm water runoff. Recent research has demonstrated that changes in the microbiome can leave the host susceptible to infection, and influence disease states. Exposure to these waters can cause gastrointestinal and respiratory illness, ear infections, and skin infections.
The investigators recruited nine volunteers at a beach who met their criteria: no sunscreen use, infrequent exposure to the ocean, no bathing within the previous 12 hours, and no antibiotics during the previous six months. The researchers swabbed the participants on the back of the calf before they entered the water, and again after subjects had air dried completely following a ten-minute swim and at six and 24 hours post swim.
Before swimming, all individuals had different communities from one another, but after swimming, they all had similar communities on their skin, which were completely different from the "before swim" communities. At six hours post swim, the microbiomes had begun to revert to their pre-swim composition, and at 24 hours, they were far along in that process.
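Statements like "similar communities" versus "completely different communities" are usually backed by a pairwise dissimilarity metric. The article does not say which one the team used; the sketch below uses Bray-Curtis dissimilarity with made-up abundance tables purely to show how such a comparison works.

```python
# Minimal sketch of quantifying microbiome similarity with Bray-Curtis
# dissimilarity (0 = identical communities, 1 = no shared taxa). The metric and
# the toy abundance tables are illustrative assumptions; the article does not
# state which analysis the researchers actually performed.

def bray_curtis(sample_a: dict, sample_b: dict) -> float:
    taxa = set(sample_a) | set(sample_b)
    shared = sum(min(sample_a.get(t, 0), sample_b.get(t, 0)) for t in taxa)
    total = sum(sample_a.values()) + sum(sample_b.values())
    return 1.0 - 2.0 * shared / total

before = {"Staphylococcus": 40, "Corynebacterium": 35, "Propionibacterium": 25}
after_swim = {"Vibrio": 30, "Pseudoalteromonas": 40, "Staphylococcus": 10, "Corynebacterium": 20}

print(f"before vs. after swim: {bray_curtis(before, after_swim):.2f}")  # 0.70, i.e. very dissimilar
```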
"One very interesting finding was that Vibrio species -- only identified to the genus level -- were detected on every participant after swimming in the ocean, and air drying," said. Nielsen. (The Vibrio genus includes the bacterium that causes cholera.) At six hours post swim, they were still present on most of the volunteers, but by 24 hours, they were present only on one individual.
"While many Vibrio are not pathogenic, the fact that we recovered them on the skin after swimming demonstrates that pathogenic Vibrio species could potentially persist on the skin after swimming," said Nielsen. The fraction of Vibrio species detected on human skin was more than 10 times greater than the fraction in the ocean water sample, suggesting a specific affinity for attachment to human skin.
Skin is the body's first line of defense, both physically and immunologically, during exposure to contaminated water. "Recent studies have shown that human skin microbiome plays an important role in immune system function, localized and systemic diseases, and infection," said Nielsen. "A healthy microbiome protects the host from colonization and infection by opportunistic and pathogenic microbes."
Story Source:
Materials provided by American Society for Microbiology.
In a new survey of the sub-seafloor off the U.S. Northeast coast, scientists have made a surprising discovery: a gigantic aquifer of relatively fresh water trapped in porous sediments lying below the salty ocean. It appears to be the largest such formation yet found in the world. The aquifer stretches along the shore at least from Massachusetts to New Jersey, extending more or less continuously out about 50 miles to the edge of the continental shelf. If found on the surface, it would create a lake covering some 15,000 square miles. The study suggests that such aquifers probably lie off many other coasts worldwide, and could provide desperately needed water for arid areas that are now in danger of running out.
The researchers employed innovative measurements of electromagnetic waves to map the water, which remained invisible to other technologies. "We knew there was fresh water down there in isolated places, but we did not know the extent or geometry," said lead author Chloe Gustafson, a Ph.D. candidate at Columbia University's Lamont-Doherty Earth Observatory. "It could turn out to be an important resource in other parts of the world." The study appears this week in the journal Scientific Reports.
The first hints of the aquifer came in the 1970s, when companies drilled off the coastline for oil, but sometimes instead hit fresh water. Drill holes are just pinpricks in the seafloor, and scientists debated whether the water deposits were just isolated pockets or something bigger. Starting about 20 years ago, study coauthor Kerry Key, now a Lamont-Doherty geophysicist, helped oil companies develop techniques to use electromagnetic imaging of the sub-seafloor to look for oil. More recently, Key decided to see if some form of the technology could also be used to find fresh-water deposits. In 2015, he and Rob L. Evans of Woods Hole Oceanographic Institution spent 10 days on the Lamont-Doherty research vessel Marcus G. Langseth making measurements off southern New Jersey and the Massachusetts island of Martha's Vineyard, where scattered drill holes had hit fresh-water-rich sediments.
They dropped receivers to the seafloor to measure electromagnetic fields below, and the degree to which natural disruptions such as solar winds and lightning strikes resonated through them. An apparatus towed behind the ship also emitted artificial electromagnetic pulses and recorded the same type of reactions from the subseafloor. Both methods work in a simple way: salt water is a better conductor of electromagnetic waves than fresh water, so the freshwater stood out as a band of low conductance. Analyses indicated that the deposits are not scattered; they are more or less continuous, starting at the shoreline and extending far out within the shallow continental shelf -- in some cases, as far as 75 miles. For the most part, they begin at around 600 feet below the ocean floor, and bottom out at about 1,200 feet.
The consistency of the data from both study areas allowed the researchers to infer with a high degree of confidence that fresh water sediments continuously span not just New Jersey and much of Massachusetts, but the intervening coasts of Rhode Island, Connecticut and New York. They estimate that the region holds at least 670 cubic miles of fresh water. If future research shows the aquifer extends further north and south, it would rival the great Ogallala Aquifer, which supplies vital groundwater to eight Great Plains states, from South Dakota to Texas.
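The 670-cubic-mile figure is consistent with simple geometry: a footprint of roughly 15,000 square miles, a water-bearing sediment column a few hundred feet thick (the deposits run from about 600 to 1,200 feet below the seafloor), and a typical sediment porosity. The rough check below assumes the porosity value, which is not a number from the paper.

```python
# Rough plausibility check of the ~670 cubic-mile estimate. The 15,000 mi^2 area
# and the ~600 ft thickness come from the article; the 40% porosity is an assumed,
# typical value for shelf sediments, not a figure from the study.

FEET_PER_MILE = 5280.0

area_sq_mi = 15_000.0
thickness_mi = 600.0 / FEET_PER_MILE  # ~0.11 mi of water-bearing sediment
porosity = 0.40                       # assumed pore fraction actually holding water

water_volume_cu_mi = area_sq_mi * thickness_mi * porosity
print(f"~{water_volume_cu_mi:.0f} cubic miles of pore water")  # ~682, close to the ~670 quoted
```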
The water probably got under the seabed in one of two different ways, say the researchers. Some 15,000 to 20,000 years ago, toward the end of the last glacial age, much of the world's water was locked up in mile-deep ice; in North America, it extended through what is now northern New Jersey, Long Island and the New England coast. Sea levels were much lower, exposing much of what is now the underwater U.S. continental shelf. When the ice melted, sediments formed huge river deltas on top of the shelf, and fresh water got trapped there in scattered pockets. Later, sea levels rose. Up to now, the trapping of such "fossil" water has been the common explanation for any fresh water found under the ocean.
But the researchers say the new findings indicate that the aquifer is also being fed by modern subterranean runoff from the land. As water from rainfall and water bodies percolates through onshore sediments, it is likely pumped seaward by the rising and falling pressure of tides, said Key. He likened this to a person pressing up and down on a sponge to suck in water from the sponge's sides. Also, the aquifer is generally freshest near the shore, and saltier the farther out you go, suggesting that it mixes gradually with ocean water over time. Terrestrial fresh water usually contains less than 1 part per thousand salt, and this is about the value found undersea near land. By the time the aquifer reaches its outer edges, it rises to 15 parts per thousand. (Typical seawater is 35 parts per thousand.)
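Those salinity figures correspond to a simple two-endmember mixing fraction (standard oceanographic bookkeeping, not a calculation reported in the study):

\[ f_{\mathrm{sea}} = \frac{S - S_{\mathrm{fresh}}}{S_{\mathrm{sea}} - S_{\mathrm{fresh}}} \]

Taking \(S_{\mathrm{fresh}} \approx 1\) and \(S_{\mathrm{sea}} = 35\) parts per thousand, water at 15 parts per thousand near the aquifer's outer edge is roughly 40 percent seawater, while the near-shore values imply almost pure terrestrial fresh water.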
If water from the outer parts of the aquifer were to be withdrawn, it would have to be desalinated for most uses, but the cost would be much less than processing seawater, said Key. "We probably don't need to do that in this region, but if we can show there are large aquifers in other regions, that might potentially represent a resource" in places like southern California, Australia, the Mideast or Saharan Africa, he said. His group hopes to expand its surveys.
While fisheries are traditionally managed at the national level, the study reveals the degree to which each country's fishing economy relies on the health of its neighbors' spawning grounds, highlighting the need for greater international cooperation.
Led by researchers at the University of California, Berkeley, the London School of Economics, and the University of Delaware, the study used a particle tracking computer simulation to map the flow of fish larvae across national boundaries. It is the first to estimate the extent of larval transport globally, putting fishery management in a new perspective by identifying hotspots of regional interdependence where cooperative management is needed most.
"Now we have a map of how the world's fisheries are interconnected, and where international cooperation is needed most urgently to conserve a natural resource that hundreds of millions of people rely on," said co-author Kimberly Oremus, assistant professor at the University of Delaware's School of Marine Science and Policy.
The vast majority of the world's wild-caught marine fish, an estimated 90%, are caught within 200 miles of shore, within national jurisdictions. Yet even these fish can be carried far from their spawning grounds by currents in their larval stage, before they're able to swim. This means that while countries have set national maritime boundaries, the ocean is made up of highly interconnected networks where most countries depend on their neighbors to properly manage their own fisheries. Understanding the nature of this network is an important step toward more effective fishery management, and is essential for countries whose economies and food security are reliant on fish born elsewhere.
The authors brought together their expertise in oceanography, fish biology, and economics to make progress on this complex problem.
"Data from a wide range of scientific fields needed to come together to make this study possible," said lead author Nandini Ramesh, a post-doctoral researcher in the Department of Earth and Planetary Science at the University of California, Berkeley. "We needed to look at patterns of fish spawning, the life cycles of different species, ocean currents, and how these vary with the seasons in order to begin to understand this system." The study combined data from satellites, ocean moorings, ecological field observations, and marine catch records, to build a computer model of how eggs and larvae of over 700 species of fish all over the world are transported by ocean currents.
The research shows that ocean regions are connected to each other in what's known as a "small world network," the same phenomenon that allows strangers to be linked by six degrees of separation. That adds a potential new risk: threats in one part of the world could result in a cascade of stresses, affecting one region after another.
"We are all dependent on the oceans," said co-author James Rising, assistant professorial research fellow at the Grantham Research Institute in the London School of Economics. "When fisheries are mismanaged or breeding grounds are not protected, it could affect food security half a world away."
A surprising finding of the study was how interconnected national fisheries are, across the globe. "This is something of a double-edged sword," explained lead author Ramesh, "On one hand, it implies that mismanagement of a fishery can have negative effects that easily propagate to other countries; on the other hand, it implies that multiple countries can benefit by targeting conservation and/or management efforts in just a few regions."
"By modeling dispersal by species, we could connect this ecosystem service to the value of catch, marine fishing jobs, food security and gross domestic product," Oremus added. "This allowed us to talk about how vulnerable a nation is to the management of fisheries in neighboring countries."
They found that the tropics are especially vulnerable to this larval movement -- particularly when it comes to food security and jobs.
"Our hope is that this study will be a stepping stone for policy makers to study their own regions more closely to determine their interdependencies," said Ramesh. "This is an important first step. This is not something people have examined before at this scale."
Just beyond where conventional scuba divers can go is an area of the ocean that still is largely unexplored. In waters this deep -- about 100 to at least 500 feet below the surface -- little to no light breaks through.
Researchers must rely on submersible watercraft or sophisticated diving equipment to be able to study ocean life at these depths, known as the mesophotic zone. These deep areas span the world's oceans and are home to extensive coral reef communities, though little is known about them because it is so hard to get there.
A collaborative research team from the University of Washington, College of Charleston, University of California Berkeley, University of Hawaii and other institutions has explored the largest known coral reef in the mesophotic zone, located in the Hawaiian Archipelago, through a series of submersible dives. There, they documented life along the coral reef, finding a surprising amount of coral living in areas where light levels are less than 1% of the light available at the surface.
Their findings were published April 8 in the journal Limnology and Oceanography.
"Because mesophotic corals live close to the limits of what is possible, understanding their physiology will give us clues of the extraordinary strategies corals use to adapt to low-light environments," said lead author Jacqueline Padilla-Gamiño, an assistant professor in the UW School of Aquatic and Fishery Sciences.
Knowing how these deep coral reefs function is important because they appear to be hotspots for biodiversity, and home to many species found only in those locations, Padilla-Gamiño explained. Additionally, close to half of all corals in the ocean have died in the past 30 years, mostly due to warm water temperatures that stress their bodies, causing them to bleach and eventually die. This has been documented mostly in shallower reefs where more research has occurred. Scientists say that more information about deeper reefs in the mesophotic zone is critical for preserving that habitat.
"Mesophotic reefs in Hawaii are stunning in their sheer size and abundance," said co-author Heather Spalding at College of Charleston. "Although mesophotic environments are not easily seen, they are still potentially impacted by underwater development, such as cabling and anchoring, and need to be protected for future generations. We are on the tip of the iceberg in terms of understanding what makes these astounding reefs tick."
Padilla-Gamiño was on board during two of the team's eight submersible dives off the coast of Maui, which took place from 2000 to 2011. Each dive was a harrowing adventure: researchers spent up to eight hours in cramped quarters inside the submersible, which was tossed from the back of a larger boat and then disconnected once it reached the water.
Once in the mesophotic zone, they collected specimens using a robot arm, and captured video footage and photos of life that has rarely been seen by humans.
"It's a really unbelievable place," Padilla-Gamiño said. "What is surprising is that, in theory, these corals should not be there because there's so little light. Now we're finally understanding how they function to be able to live there."
By collecting coral samples and analyzing their physiology, the researchers found that different corals in the mesophotic zone use different strategies to deal with low amounts of light. For example, some species of corals change the amount of pigments at deeper depths, while other species change the type and size of symbionts, which are microscopic seaweeds living inside the tissue of corals, Padilla-Gamiño explained. These changes allow corals to acquire and maximize the light available to perform photosynthesis and obtain energy.
Additionally, the corals at deeper depths are likely eating other organisms like zooplankton to increase their energy intake and survive under very low light levels. They probably do this by filter feeding, Padilla-Gamiño said, but more research is needed to know for sure.
The researchers hope to collect more live coral samples from the mesophotic zone to be able to study in the lab how the symbionts, and the corals they live inside, function.
"The more we can study this, the more information we can have about how life works. This is a remarkable system with enormous potential for discovery," Padilla-Gamiño said. "Our studies provide the foundation to explore physiological ?exibility, identify novel mechanisms to acquire light and challenge current paradigms on the limitations of photosynthetic organisms like corals living in deeper water."
Other co-authors are Celia Smith at University of Hawaii at Mānoa; Melissa Roth at UC Berkeley; Lisa Rodrigues at Villanova University; Christina Bradley at Salisbury University; and Robert Bidigare and Ruth Gates at Hawaii Institute of Marine Biology.
The study was funded by the National Oceanic and Atmospheric Administration and the National Science Foundation.
A newly comprehensive study shows that melting of Himalayan glaciers caused by rising temperatures has accelerated dramatically since the start of the 21st century. The analysis, spanning 40 years of satellite observations across India, China, Nepal and Bhutan, indicates that glaciers have been losing the equivalent of more than a vertical foot and a half of ice each year since 2000 -- double the amount of melting that took place from 1975 to 2000. The study is the latest and perhaps most convincing indication that climate change is eating the Himalayas' glaciers, potentially threatening water supplies for hundreds of millions of people downstream across much of Asia.
"This is the clearest picture yet of how fast Himalayan glaciers are melting over this time interval, and why," said lead author Joshua Maurer, a Ph.D. candidate at Columbia University's Lamont-Doherty Earth Observatory. While not specifically calculated in the study, the glaciers may have lost as much as a quarter of their enormous mass over the last four decades, said Maurer. The study appears this week in the journal Science Advances.
Currently harboring some 600 billion tons of ice, the Himalayas are sometimes called the Earth's "Third Pole." Many other recent studies have suggested that the glaciers are wasting, including one this year projecting that up to two-thirds of the current ice cover could be gone by 2100. But up to now, observations have been somewhat fragmented, zeroing in on shorter time periods, or only individual glaciers or certain regions. These studies have produced sometimes contradictory results, both regarding the degree of ice loss and the causes. The new study synthesizes data from across the region, stretching from early satellite observations to the present. The synthesis indicates that the melting is consistent in time and space, and that rising temperatures are to blame. Temperatures vary from place to place, but from 2000 to 2016 they have averaged 1 degree Centigrade (1.8 degrees Fahrenheit) higher than those from 1975 to 2000.
Maurer and his colleagues analyzed repeat satellite images of some 650 glaciers spanning 2,000 kilometers from west to east. Many of the 20th-century observations came from recently declassified photographic images taken by U.S. spy satellites. The researchers created an automated system to turn these into 3D models that could show the changing elevations of glaciers over time. They then compared these images with post-2000 optical data from more sophisticated satellites, which more directly convey elevation changes.
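At its core, that comparison is a differencing exercise: subtract the older elevation model from the newer one and convert the change into a thinning rate. The snippet below is a minimal sketch of that step, assuming two already co-registered elevation grids and a conventional ice density of 900 kilograms per cubic meter; it is not the authors' processing pipeline.

    import numpy as np

    ICE_DENSITY = 900.0   # kg per cubic meter, a common assumption for glacier ice

    def elevation_change_rate(dem_early, dem_late, years, pixel_area_m2):
        """Return the mean thinning rate (m/yr) and mass change (metric tons/yr)."""
        dhdt = (dem_late - dem_early) / years              # elevation change rate per pixel
        mean_rate = np.nanmean(dhdt)
        volume_rate = np.nansum(dhdt) * pixel_area_m2      # cubic meters of ice per year
        mass_rate_tons = volume_rate * ICE_DENSITY / 1000.0
        return mean_rate, mass_rate_tons

    # Hypothetical 3x3 elevation grids (meters), 30 m pixels, 25 years apart.
    dem_1975 = np.array([[5000., 5002., 5004.],
                         [5001., 5003., 5005.],
                         [5002., 5004., 5006.]])
    dem_2000 = dem_1975 - 6.25                             # a uniform 6.25 m of surface lowering
    rate, mass = elevation_change_rate(dem_1975, dem_2000, years=25.0, pixel_area_m2=30.0 * 30.0)
    print(f"mean dh/dt = {rate:.2f} m/yr, mass change = {mass:,.0f} t/yr")   # -0.25 m/yr, as in 1975-2000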
They found that from 1975 to 2000, glaciers across the region lost an average of about 0.25 meters (10 inches) of ice each year in the face of slight warming. Following a more pronounced warming trend starting in the 1990s, starting in 2000 the loss accelerated to about half a meter (20 inches) annually. Recent yearly losses have averaged about 8 billion tons of water, or the equivalent of 3.2 million Olympic-size swimming pools, says Maurer. Most individual glaciers are not wasting uniformly over their entire surfaces, he noted; melting has been concentrated mainly at lower elevations, where some ice surfaces are losing as much as 5 meters (16 feet) a year.
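The swimming-pool comparison is easy to check: a standard Olympic pool measures roughly 50 by 25 by 2 meters, so it holds about 2,500 cubic meters, or 2,500 metric tons, of water.

    # Back-of-envelope check of the swimming-pool comparison.
    annual_loss_tons = 8e9            # ~8 billion tons of water per year, from the study
    pool_volume_m3 = 50 * 25 * 2      # a standard Olympic pool, ~2,500 cubic meters
    pool_mass_tons = pool_volume_m3   # 1 cubic meter of water is ~1 metric ton
    print(f"{annual_loss_tons / pool_mass_tons:,.0f} Olympic-size pools per year")  # -> 3,200,000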
Some researchers have argued that factors other than temperature are affecting the glaciers. These include changes in precipitation, which seems to be declining in some areas (which would tend to reduce the ice), but increasing in others (which would tend to build it). Another factor: Asian nations are burning ever-greater loads of fossil fuels and biomass, sending soot into the sky. Much of it eventually lands on snowy glacier surfaces, where it absorbs solar energy and hastens melting. Maurer agrees that both soot and precipitation are factors, but due to the region's huge size and extreme topography, the effects are highly variable from place to place. Overall, he says, temperature is the overarching force. To confirm this, he and his colleagues compiled temperature data during the study period from ground stations and then calculated the amount of melting that observed temperature increases would be expected to produce. They then compared those figures with what actually happened. They matched. "It looks just like what we would expect if warming were the dominant driver of ice loss," he said.
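The logic of that consistency check can be illustrated with a highly simplified temperature-index (degree-day) estimate; the melt factor, melt-season length and warming used below are assumptions chosen only to show the shape of the calculation, not values from the study.

    # Simplified degree-day illustration of the "does warming explain the loss?" check.
    DEGREE_DAY_FACTOR = 0.004   # meters of ice melt per positive degree-day (assumed)

    def expected_extra_melt(warming_c, melt_season_days):
        """Extra annual melt (m of ice) expected from uniform warming over the melt season."""
        return DEGREE_DAY_FACTOR * warming_c * melt_season_days

    observed_extra_thinning = 0.50 - 0.25      # m/yr: post-2000 rate minus 1975-2000 rate
    predicted = expected_extra_melt(warming_c=1.0, melt_season_days=60)   # assumed 60-day melt season
    print(f"predicted extra melt ~{predicted:.2f} m/yr, observed ~{observed_extra_thinning:.2f} m/yr")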
Ice loss in the Himalayas resembles the far more closely studied European Alps, where temperatures started going up somewhat earlier, in the 1980s. Glaciers there started wasting shortly after that increase, and rapid loss of ice has continued since then. The Himalayas are generally not melting as fast as the Alps, but the general progression is similar, say the researchers. The study does not include the huge adjoining ranges of high-mountain Asia such as the Pamir, Hindu Kush or Tian Shan, but other studies suggest similar melting is underway there as well.
Some 800 million people depend in part on seasonal runoff from Himalayan glaciers for irrigation, hydropower and drinking water. The accelerated melting appears so far to be swelling runoff during warm seasons, but scientists project that this will taper off within decades as the glaciers lose mass. This, they say, will eventually lead to water shortages. A separate study published this May estimates that yearly runoff is now about 1.6 times greater than if the glaciers were replenished at the same rate they were melting. As a result, in many high-mountain drainages, meltwater lakes are building rapidly behind natural dams of rocky debris; these are threatening downstream communities with potentially destructive and deadly outburst floods. Even on Mount Everest, long-lost corpses of climbers who failed to return are emerging from melting ice and snow along trails.
The study shows that "even glaciers in the highest mountains of the world are responding to global air temperature increases driven by the combustion of fossil fuels," said Joseph Shea, a glacial geographer at the University of Northern British Columbia who was not involved in the study. "In the long term, this will lead to changes in the timing and magnitude of streamflow in a heavily populated region."
"It shows how endangered [the Himalayas] are if climate change continues at the same pace in the coming decades," said Etienne Berthier, a glaciologist at France's Laboratory for Studies in Geophysics and Spatial Oceanography, who also was not involved in the study.
The study was coauthored by Joerg Schaefer and Alison Corley of Lamont-Doherty Earth Observatory, and Summer Rupper of the University of Utah.
New research shows that an ice-free Greenland may be in our future. If worldwide greenhouse gas emissions remain on their current trajectory, Greenland may be ice-free by the year 3000. Even by the end of the century, the island could lose 4.5% of its ice, contributing up to 13 inches of sea level rise.
"How Greenland will look in the future -- in a couple of hundred years or in 1,000 years -- whether there will be Greenland, or at least a Greenland similar to today, it's up to us," said Andy Aschwanden, a research associate professor at the University of Alaska Fairbanks Geophysical Institute.
Aschwanden is lead author on a new study published in the June issue of Science Advances. UAF Geophysical Institute researchers Mark Fahnestock, Martin Truffer, Regine Hock and Constantine Khrulev are co-authors, as is Doug Brinkerhoff, a former UAF graduate student.
This research uses new data on the landscape under the ice today to make breakthroughs in modeling the future. The findings show a wide range of scenarios for ice loss and sea level rise based on different projections for greenhouse gas concentrations and atmospheric conditions. Currently, the planet is moving toward the high estimates of greenhouse gas concentrations.
Greenland's ice sheet is huge, spanning over 660,000 square miles. It is almost the size of Alaska and 80% as big as the U.S. east of the Mississippi River. Today, the ice sheet covers 81% of Greenland and contains 8% of Earth's fresh water.
If greenhouse gas concentrations remain on the current path, the melting ice from Greenland alone could contribute as much as 24 feet to global sea level rise by the year 3000, which would put much of San Francisco, Los Angeles, New Orleans and other cities under water.
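That 24-foot figure is consistent with a back-of-envelope calculation using round numbers that are not taken from the study: roughly 2.9 million cubic kilometers of ice in Greenland, spread as meltwater over roughly 361 million square kilometers of ocean.

    # Rough check of the "24 feet" figure (round numbers, not from the study).
    ICE_VOLUME_KM3 = 2.9e6     # approximate volume of the Greenland ice sheet
    OCEAN_AREA_KM2 = 3.61e8    # approximate area of the global ocean
    ICE_TO_WATER = 0.9         # ice is roughly 10% less dense than water

    rise_m = ICE_VOLUME_KM3 * ICE_TO_WATER / OCEAN_AREA_KM2 * 1000.0
    print(f"~{rise_m:.1f} m, or ~{rise_m * 3.28:.0f} feet, of sea level rise")   # ~7.2 m, ~24 feet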
However, if greenhouse gas emissions are cut significantly, that picture changes. Instead, by 3000 Greenland may lose 8% to 25% of its ice and contribute up to approximately 6.5 feet of sea level rise. Between 1991 and 2015, Greenland's ice sheet added about 0.02 inches (about 0.5 millimeters) per year to sea level, but that rate could increase rapidly.
Projections for both the end of the century and 2200 tell a similar story: There are a wide range of possibilities, including saving the ice sheet, but it all depends on greenhouse gas emissions.
The researchers ran 500 simulations for each of the three climate scenarios using the Parallel Ice Sheet Model, developed at the Geophysical Institute, to create a picture of how Greenland's ice would respond to different climate scenarios. The model included parameters on ocean and atmospheric conditions as well as ice geometry, flow and thickness.
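The ensemble approach itself is straightforward to sketch: for each climate scenario, draw many combinations of uncertain parameters, run the model once per combination, and summarize the spread of outcomes. The toy example below shows only that bookkeeping; the parameter names, ranges, scenario labels and the stand-in response function are illustrative and do not represent the Parallel Ice Sheet Model's actual interface.

    import numpy as np

    # Toy ensemble bookkeeping: 500 members per scenario, two uncertain parameters.
    rng = np.random.default_rng(seed=0)
    N_MEMBERS = 500
    SCENARIOS = {"low": 1.0, "medium": 2.0, "high": 4.0}   # assumed warming (deg C) per scenario

    def toy_ice_loss(warming_c, calving_sensitivity, basal_friction):
        """Stand-in for an ice-sheet run: percent of ice lost under the given parameters."""
        return warming_c * calving_sensitivity / basal_friction

    for name, warming in SCENARIOS.items():
        calving = rng.uniform(0.5, 2.0, N_MEMBERS)    # uncertain parameter 1 (assumed range)
        friction = rng.uniform(0.8, 1.6, N_MEMBERS)   # uncertain parameter 2 (assumed range)
        losses = toy_ice_loss(warming, calving, friction)
        lo, hi = np.percentile(losses, [5, 95])
        print(f"{name}: 5th-95th percentile ice loss = {lo:.1f}% to {hi:.1f}%")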
Simulating ice sheet behavior is difficult because ice loss is led by the retreat of outlet glaciers. These glaciers, at the margins of ice sheets, drain the ice from the interior like rivers, often in troughs hidden under the ice itself.
This study's model is the first to include these outlet glaciers. It found that their discharge could contribute as much as 45% of the total mass of ice lost in Greenland by 2200.
Outlet glaciers are in contact with water, and water makes ice melt faster than contact with air, like thawing a chicken in the sink. The more ice touches water, the faster it melts. This creates a feedback loop that dramatically affects the ice sheet.
However, to simulate how the ice flows, the scientists need to know how thick the ice is.
The team used data from a NASA airborne science campaign called Operation IceBridge. Operation IceBridge uses aircraft equipped with a full suite of scientific instruments, including three types of radar that can measure the ice surface, map the individual layers within the ice, and penetrate to the bedrock to collect data about the land beneath the ice. On average, Greenland's ice sheet is 1.6 miles thick, but there is a lot of variation depending on where you measure.
"Ice is in very remote locations," said Fahnestock. "You can go there and make localized measurements. But the view from space and the view from airborne campaigns, like IceBridge, has just fundamentally transformed our ability to make a model to mimic those changes."
Because previous research results lacked these details, scientists could not simulate present-day conditions as accurately, which makes it more difficult to predict what will happen in the future.
"If it's raining in D.C. today, your best guess is that it's raining tomorrow, too," Aschwanden said. "If you don't know what the weather is today, it's all guessing."
However, that doesn't mean researchers know exactly what will happen.
"What we know from the last two decades of just watching Greenland is not because we were geniuses and figured it out, but because we just saw it happen," Fahnestock said. As for what we will see in the future, "it depends on what we are going to do next."
Naples may be the nation's best beach town to live in, but here are 10 of the shoreline spots that make all of Southwest Florida a great coastal region.
Odisha may be fortunate in having a pretty long coastline, stretching from Icchapuram in the south to Digha in West Bengal in the east, but it remains, like other coastal ...
Take a boat tour or other water excursion in Biscayne National Park to see Stiltsville, snorkel a shipwreck, paddle board in remote and beautiful Jones Lagoon ...
XTU architects have published their competition entry for the Founder's Memorial in Singapore's Bay East Garden. Inspired by the mangroves and banyans of ...
DIY houseplant kits on trend - from Horticulture Week.
ILOILO CITY – More than a week before the new local officials assume their post, the city government's incoming mayor has already relayed his intent to ...
Yida project includes plans for a manufacturing hub, but opponents warn construction is decimating coastal vegetation.
The census ward that includes the slum-dominated areas of Mankhurd and Govandi has significantly more green cover than upscale Pali Hill and Khar. But ...
Kenya has pledged to restore 5.1 million hectares of degraded land.
A pair of Miami architects who infuriated neighbors and drew the scrutiny of county environmental regulators when they chopped down mangroves at their.
THE Department of Environment and Natural Resources (DENR) has ordered a land developer to show the legal basis for why he destroyed mangroves in Cansaga ...