Date | Link | Title | Summary | Body | Category | Year
---|---|---|---|---|---|---
October 12, 2017 | https://www.sciencedaily.com/releases/2017/10/171012200202.htm | Combination of El Niño and 2016 Ecuador earthquake likely worsened Zika outbreak | A Zika virus outbreak in coastal Ecuador in 2016 was likely worsened by a strong El Niño and a magnitude 7.8 earthquake that struck the region in April, according to a new study. | A new research commentary suggests the earthquake left more people exposed to disease-carrying mosquitos, and climate variability associated with the 2014-2016 El Niño event created more favorable mosquito breeding grounds. Warmer temperatures and increased rainfall, combined with destruction of the region's infrastructure and a population influx into large cities, likely caused the number of Zika cases to increase 12-fold in just three months, according to the study's authors. The research has been accepted for publication. Zika was first observed in Africa in the 1950s and recently spread to South America and Southeast Asia. The disease is transmitted by mosquitoes and usually causes a mild illness with symptoms such as headaches, rash and eye infections. Zika virus infection in pregnant mothers can result in a variety of birth defects. As of September 2017, approximately 6,811 suspected and confirmed cases of Zika have occurred in Ecuador, according to a World Health Organization report. El Niño is the warm phase of a regular climate pattern that occurs in the Pacific Ocean. It brings warmer air temperatures and higher rainfall levels to the west coast of South America. Previous research established a link between the 2014-2016 El Niño and the spread of Zika in South America, but the new study goes further and examines the interaction between these two events and the 2016 earthquake. The new commentary suggests changes in the climate can amplify the worst effects of natural disasters and disease outbreaks in socially vulnerable regions. Areas that are already stressed by short-term climate changes like El Niño can be sent over the edge by a catastrophe and may struggle to recuperate afterwards, said Cecilia Sorensen, a Living Closer Foundation fellow in climate and health policy at the University of Colorado School of Medicine in Aurora, Colorado, and lead author of the new study. The authors studied the effects of short-term changes in Ecuador's climate, not long-term global warming patterns. But extreme El Niño events such as the one observed in 2016 are projected to increase in frequency due to human-caused climate change. Sorensen's team suspects that the combination of increased extreme events and long-term warming could lead to conditions that favor the spread of mosquito-borne diseases. The findings are important because of their applicability to recent events, like the recent earthquakes in Mexico and hurricanes in the Caribbean and the U.S., according to Ángel G. Muñoz, a research associate at the National Oceanic and Atmospheric Administration's Geophysical Fluid Dynamics Laboratory. "The main message of the authors is related to the important question of how a combination of natural hazards can increase the vulnerability of the population, making people's exposure higher and lowering their adaptive capacity during and after the occurrence of such hazards," he said. A magnitude 7.8 earthquake struck the province of Manabi in coastal Ecuador on April 16, 2016.
The quake affected approximately 720,000 people, destroyed much of the region's sanitation and healthcare infrastructure, and resulted in a massive influx of displaced residents into urban areas. Sorensen and the study co-authors worked with the non-governmental organization Walking Palms Global Initiative to operate a mobile health clinic after the earthquake. They saw many women and children coming in with symptoms typical of mosquito-borne illnesses like dengue fever and Zika. In July of 2016, UNICEF reported the number of Zika cases in Ecuador spiked from 92 cases before the earthquake to 1,106 cases just three months after the event. Eighty percent of these new cases occurred in Manabi. The research team set out to study how damage from the earthquake and short-term changes in weather associated with El Niño could have potentially exposed more people to mosquitoes and exacerbated the outbreak. "We saw so many people affected by the earthquake that were sleeping outside without any shelter from mosquitoes, so we were worrying that the region's changing climate could facilitate the spread of diseases," Sorensen said. "Natural disasters can create a niche for emerging diseases to come out and affect more people." Sorensen's team reviewed the existing research on the link between short-term changes in climate and disease transmission. They then applied those findings to explain the role of the earthquake and El Niño in the Zika outbreak. The researchers suggest El Niño created ideal conditions for Zika-carrying mosquitos to breed and make more copies of the Zika virus. The warmer air temperatures and increased rainfall brought by El Niño have previously been associated with a higher likelihood of dengue outbreaks. Warmer temperatures can accelerate viral replication in mosquitoes and influence mosquitos' development and breeding habits. Additionally, the El Niño event brought warmer sea-surface temperatures, which have been shown to correlate with outbreaks of mosquito-transmitted diseases. Estimates from remote sensing data in coastal Ecuador show that sea-surface temperatures were higher than average from 2014-2016. The team also believes an increase in water scarcity after the earthquake indirectly benefited mosquito development. The quake damaged municipal water systems, forcing people to store water in open containers outside their homes. These containers served as additional habitats for mosquito larvae to grow in. The new findings could be used by governments to identify and protect vulnerable communities before natural disasters happen, Sorensen said. "One idea is to develop disease models that can use existing climate models to predict where these vectors will show up due to climate variability," she said. "Applying these new models to areas that have pre-existing social vulnerabilities could identify susceptible regions, allowing us to direct healthcare resources there ahead of time." | Earthquakes | 2017 |
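A quick arithmetic check, using only the UNICEF case counts quoted above, reproduces the article's "12-fold" figure; the sketch below is plain Python with no data beyond the numbers already cited.

```python
# UNICEF counts cited in the article: 92 Zika cases before the earthquake,
# 1,106 cases three months after, with 80 percent of new cases in Manabi.
before, after = 92, 1106
new_cases = after - before
print(f"increase: {after / before:.1f}-fold")                  # ~12-fold
print(f"new cases in Manabi: about {0.80 * new_cases:.0f} of {new_cases}")
```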
October 12, 2017 | https://www.sciencedaily.com/releases/2017/10/171012122835.htm | Climate change may accelerate infectious disease outbreaks, say researchers | Aside from inflicting devastating natural disasters on often vulnerable communities, climate change can also spur outbreaks of infectious diseases like Zika, malaria and dengue fever, according to a new study by researchers at the University of Colorado Anschutz Medical Campus. | "Climate change presents complex and wide-reaching threats to human health," said Cecilia Sorensen, MD, lead author of the study and the Living Closer Foundation Fellow in Climate and Health Policy at CU Anschutz. "It can amplify and unmask ecological and socio-political weaknesses and increase the risk of adverse health outcomes in socially vulnerable regions." When natural disasters strike such places, she said, the climatic conditions may make the public health crisis significantly worse. The researchers said these vulnerabilities can happen anywhere. After Hurricane Katrina hit New Orleans, cases of West Nile disease doubled the next year. Climate change in Africa appears to be increasing cases of malaria. And the recent destruction in Houston, Florida and Puerto Rico due to hurricanes may usher in more infectious diseases in the years ahead. The study focused specifically on a magnitude 7.8 earthquake that struck coastal Ecuador in April 2016, coinciding with an exceptionally strong El Niño event. El Niños are associated with heavy rainfall and warmer air temperatures. They are also linked to outbreaks of dengue fever. Sorensen, a clinical instructor in emergency medicine at CU Anschutz, was in Ecuador with her co-authors working with the Walking Palms Global Initiative. They were operating a mobile health clinic after the disaster. "We were seeing all of these viral symptoms in the wake of the quake," she said. "We noticed a huge spike in Zika cases where the earthquake occurred. Prior to this, there were only a handful of Zika cases in the whole country." In fact, the researchers found the number of Zika cases had increased 12-fold in the quake zone. Zika virus is transmitted by mosquitos. Symptoms are usually mild but the infection can cause major abnormalities and even death in a developing fetus. Warmer temperatures and increased rainfall from the El Niño, along with a devastated infrastructure and an influx of people into larger cities, likely caused the spike in Zika cases, Sorensen said. "We saw so many people affected by the earthquake that were sleeping outside without any shelter from mosquitoes, so we were worrying that the region's changing climate could facilitate the spread of diseases," she said. "Natural disasters can create a niche for emerging diseases to come out and affect more people." Sorensen's team reviewed the existing research on the link between short-term climate changes and disease transmission. They applied those findings to explain the role of the earthquake and El Niño in the Zika outbreak. The researchers suggest El Niño created ideal conditions for Zika-carrying mosquitos to breed and make more copies of the Zika virus. The warmer temperatures and increased rainfall from El Niño have previously been associated with a higher likelihood of dengue outbreaks.
Warmer temperatures can also accelerate viral replication in mosquitoes and influence mosquitos' development and breeding habits. At the same time, the El Niño event brought warmer sea-surface temperatures, which have been shown to correlate with outbreaks of mosquito-transmitted diseases. Estimates from remote sensing data in coastal Ecuador show that sea-surface temperatures were higher than average from 2014-2016. The team also believes an increase in water scarcity after the earthquake indirectly benefited mosquito development. The quake damaged municipal water systems, forcing people to store water in open containers outside their homes. These served as additional habitats for mosquito larvae. The new findings could be used by governments to identify and protect vulnerable communities before natural disasters happen, Sorensen said. "One idea is to develop disease models that can use existing climate models to predict where these vectors will show up due to climate variability," she said. "Applying these new models to areas that have pre-existing social vulnerabilities could identify susceptible regions, allowing us to direct healthcare resources there ahead of time." | Earthquakes | 2017 |
October 5, 2017 | https://www.sciencedaily.com/releases/2017/10/171005190243.htm | Old Faithful's geological heart revealed | Old Faithful is Yellowstone National Park's most famous landmark. Millions of visitors come to the park every year to see the geyser erupt every 44-125 minutes. But despite Old Faithful's fame, relatively little was known about the geologic anatomy of the structure and the fluid pathways that fuel the geyser below the surface. Until now. | University of Utah scientists have mapped the near-surface geology around Old Faithful, revealing the reservoir of heated water that feeds the geyser's surface vent and how the ground shaking behaves in between eruptions. The map was made possible by a dense network of portable seismographs and by new seismic analysis techniques. The results are published in Geophysical Research Letters. Doctoral student Sin-Mei Wu is the first author. For Robert Smith, a long-time Yellowstone researcher and distinguished research professor of geology and geophysics, the study is the culmination of more than a decade of planning and comes as he celebrates his 60th year working in America's first national park. "Here's the iconic geyser of Yellowstone," Smith says. "It's known around the world, but the complete geologic plumbing of Yellowstone's Upper Geyser Basin has not been mapped, nor have we studied how the timing of eruptions is related to precursor ground tremors before eruptions." Old Faithful is an iconic example of a hydrothermal feature, and particularly of the features in Yellowstone National Park, which is underlain by two active magma reservoirs at depths of 5 to 40 km that provide heat to the overlying near-surface groundwater. In some places within Yellowstone, the hot water manifests itself in pools and springs. In others, it takes the form of explosive geysers. Dozens of structures surround Old Faithful, including hotels, a gift shop and a visitor's center. Some of these buildings, the Park Service has found, are built over thermal features that result in excessive heat beneath the built environment. As part of their plan to manage the Old Faithful area, the Park Service asked University of Utah scientists to conduct a geologic survey of the area around the geyser. For years, study co-authors Jamie Farrell and Fan-Chi Lin, along with Smith, have worked to characterize the magma reservoirs deep beneath Yellowstone. Although geologists can use seismic data from large earthquakes to see features deep in the earth, the shallow subsurface geology of the park has remained a mystery because mapping it out would require capturing everyday miniature ground movement and seismic energy on a much smaller scale. "We try to use continuous ground shaking produced by humans, cars, wind, water and Yellowstone's hydrothermal boilings and convert it into our signal," Lin says. "We can extract a useful signal from the ambient background ground vibration." To date, the University of Utah has placed 30 permanent seismometers around the park to record ground shaking and monitor for earthquakes and volcanic events. The cost of these seismometers, however, can easily exceed $10,000. Small seismometers, developed by Fairfield Nodal for the oil and gas industry, reduce the cost to less than $2,000 per unit. They're small white canisters about six inches high and are totally autonomous and self-contained.
"You just take it out and stick it in the ground," Smith says.In 2015, with the new instruments, the Utah team deployed 133 seismometers in the Old Faithful and Geyser Hill areas for a two-week campaign.The sensors picked up bursts of intense seismic tremors around Old Faithful, about 60 minutes long, separated by about 30 minutes of quiet. When Farrell presents these patterns, he often asks audiences at what point they think the eruption of Old Faithful takes place. Surprisingly, it's not at the peak of shaking. It's at the end, just before everything goes quiet again.After an eruption, the geyser's reservoir fills again with hot water, Farrell explains. "As that cavity fills up, you have a lot of hot pressurized bubbles," he says. "When they come up, they cool off really rapidly and they collapse and implode." The energy released by those implosions causes the tremors leading up to an eruption.Typically, researchers create a seismic signal by swinging a hammer onto a metal plate on the ground. Lin and Wu developed the computational tools that would help find useful signals among the seismic noise without disturbing the sensitive environment in the Upper Geyser Basin. Wu says she was able to use the hydrothermal features themselves as a seismic source, to study how seismic energy propagates by correlating signals recorded at the sensor close to a persistent source to other sensors. "It's amazing that you can use the hydrothermal source to observe the structure here," she says.When analyzing data from the seismic sensors, the researchers noticed that tremor signals from Old Faithful were not reaching the western boardwalk. Seismic waves extracted from another hydrothermal feature in the north slowed down and scattered significantly in nearly the same area suggesting somewhere west of Old Faithful was an underground feature that affects the seismic waves in an anomalous way. With a dense network of seismometers, the team could determine the shape, size, and location of the feature, which they believe is Old Faithful's hydrothermal reservoir.Wu estimates that the reservoir, a network of cracks and fractures through which water flows, has a diameter of around 200 meters, a little larger than the University of Utah's Rice-Eccles Stadium, and can hold approximately 300,000 cubic meters of water, or more than 79 million gallons. By comparison, each eruption of Old Faithful releases around 30 mThe team is far from done answering questions about Yellowstone. They returned for another seismic survey in November 2016 and are planning their 2017 deployment, to begin after the park roads close for the winter. Wu is looking at how air temperature might change the subsurface structure and affect the propagation of seismic waves. Farrell is using the team's seismic data to predict how earthquake waves might reverberate through the region. Smith is looking forward to conducting similar analysis in Norris Geyser Basin, the hottest geothermal area of the park. Lin says that the University of Utah's research program in Yellowstone owes much to Smith's decades-long relationship with the park, enabling new discoveries. "You need new techniques," Lin says, "but also those long-term relationships." | Earthquakes | 2,017 |
October 5, 2017 | https://www.sciencedaily.com/releases/2017/10/171005151119.htm | Do earthquakes have a 'tell'? | Researchers have long had good reason to believe that earthquakes are inherently unpredictable. But a new finding from Northwestern University might be a seismic shift for that old way of thinking. | An interdisciplinary team recently discovered that "slow earthquakes," which release energy over a period of hours to months, could potentially lead to nearby "regular earthquakes." The finding could help seismologists better forecast some strong earthquakes set to occur within a certain window of time, enabling warnings and other preparations that may save lives. "While the build-up of stress in Earth's crust is largely predictable, stress release via regular earthquakes is more chaotic in nature, which makes it challenging to predict when they might occur," said Kevin Chao, a data science scholar in the Northwestern Institute on Complex Systems. "But in recent years, more and more research has found that large earthquakes in subduction zones are often preceded by foreshocks and slow earthquakes." Supported by the National Science Foundation, the research was recently published. Chao and his colleagues began their work several years ago by turning to a region within Taiwan, home to approximately 100 seismic stations that have continuously recorded ground motion for years. It was there the team noticed deep tremors, a type of slow earthquake that typically recurs in days- or weeks-long cycles. "Deep tremor is very sensitive to small stress changes," Chao said. "So, we decided to use them as stress meters to monitor local variations in stress build-up and release before and after large earthquakes." To detect and monitor this deep tremor activity, Chao's team developed a sophisticated set of algorithms and applied it to data from 10 seismic stations in Taiwan. They discovered that deep tremor started to change its behavior about two months before the occurrence of a 6.4-magnitude earthquake in March 2010 in southern Taiwan. The tremor's duration, for example, increased by two-fold before this event and continued to increase afterwards. Although deep tremor was first reported in 2002, scientists have not found many cases in which behavior changed before large earthquakes. "After the 6.4-magnitude earthquake occurred, we noticed a potential to study deep tremor near the event," Chao said. "We identified the increase in tremor duration three weeks before the earthquake, but we initially could not draw conclusions because tremor rates increase all the time and for different reasons." But three years after the 6.4-magnitude event, Chao and his colleagues noticed that their observations of tremor activity coincided with a nearby GPS recording, which indicated a flip in the direction of ground motion near tremor sources. By combining data from earth observatories, such as GPS and seismic stations, with statistics and a series of algorithms, the team showed that changes in deep tremor patterns could signal an impending earthquake nearby. To further test the finding, Chao examined four additional earthquakes and discovered that similar precursory patterns did exist. He and co-author Suzan van der Lee hope that this work will inspire more data-driven research in the seismology field. "Much more data analysis of these tiny but fascinating tremor signals is necessary," he said, "before mid- to short-term earthquake forecasting becomes reliable." | Earthquakes | 2017 |
October 4, 2017 | https://www.sciencedaily.com/releases/2017/10/171004120510.htm | Assessing regional earthquake risk and hazards in the age of exascale | With emerging exascale supercomputers, researchers will soon be able to accurately simulate the ground motions of regional earthquakes quickly and in unprecedented detail, as well as predict how these movements will impact energy infrastructure -- from the electric grid to local power plants -- and scientific research facilities. | Currently, an interdisciplinary team of researchers from the Department of Energy's (DOE's) Lawrence Berkeley (Berkeley Lab) and Lawrence Livermore (LLNL) national laboratories, as well as the University of California at Davis, is building the first-ever end-to-end simulation code to precisely capture the geology and physics of regional earthquakes, and how the shaking impacts buildings. This work is part of the DOE's Exascale Computing Project (ECP), which aims to maximize the benefits of exascale -- future supercomputers that will be 50 times faster than our nation's most powerful system today -- for U.S. economic competitiveness, national security and scientific discovery. "Due to computing limitations, current geophysics simulations at the regional level typically resolve ground motions at 1-2 hertz (vibrations per second). Ultimately, we'd like to have motion estimates on the order of 5-10 hertz to accurately capture the dynamic response for a wide range of infrastructure," says David McCallen, who leads an ECP-supported effort called High Performance, Multidisciplinary Simulations for Regional Scale Seismic Hazard and Risk Assessments. He's also a guest scientist in Berkeley Lab's Earth and Environmental Sciences Area. One of the most important variables that affect earthquake damage to buildings is seismic wave frequency, or the rate at which an earthquake wave repeats each second. Buildings and structures respond differently to certain frequencies. Large structures like skyscrapers, bridges, and highway overpasses are sensitive to low frequency shaking, whereas smaller structures like homes are more likely to be damaged by high frequency shaking, which ranges from 2 to 10 hertz and above. McCallen notes that simulations of high frequency earthquakes are more computationally demanding and will require exascale computers. In preparation for exascale, McCallen is working with Hans Johansen, a researcher in Berkeley Lab's Computational Research Division (CRD), and others to update the existing SW4 code -- which simulates seismic wave propagation -- to take advantage of the latest supercomputers, like the National Energy Research Scientific Computing Center's (NERSC's) Cori system. This manycore system contains 68 processor cores per chip, nearly 10,000 nodes and new types of memory. NERSC is a DOE Office of Science national user facility operated by Berkeley Lab. The SW4 code was developed by a team of researchers at LLNL, led by Anders Petersson, who is also involved in the exascale effort. With recent updates to SW4, the collaboration successfully simulated a magnitude 6.5 earthquake on California's Hayward fault at 3 hertz on NERSC's Cori supercomputer in about 12 hours with 2,048 Knights Landing nodes. This first-of-a-kind simulation also captured the impact of this ground movement on buildings across a 100 km by 100 km area around the rupture, with the model extending 30 km underground.
With future exascale systems, the researchers hope to run the same model at 5-10 hertz resolution in approximately five hours or less. "Ultimately, we'd like to get to a much larger domain, higher frequency resolution and speed up our simulation time," says McCallen. "We know that the manner in which a fault ruptures is an important factor in determining how buildings react to the shaking, and because we don't know how the Hayward fault will rupture or the precise geology of the Bay Area, we need to run many simulations to explore different scenarios. Speeding up our simulations on exascale systems will allow us to do that." This work was published in a recent issue of Computing in Science & Engineering, a magazine of the Institute of Electrical and Electronics Engineers (IEEE) Computer Society. Historically, researchers have taken an empirical approach to estimating ground motions and how the shaking stresses structures. So to predict how an earthquake would affect infrastructure in the San Francisco region, researchers might look at a past event that was about the same size -- it might even have happened somewhere else -- and use those observations to predict ground motion in San Francisco. Then they'd select some parameters from those simulations based on empirical analysis and surmise how various buildings may be affected. "It is no surprise that there are certain instances where this method doesn't work so well," says McCallen. "Every single site is different -- the geologic makeup may vary, faults may be oriented differently and so on. So our approach is to apply geophysical research to supercomputer simulations and accurately model the underlying physics of these processes." To achieve this, the tool under development by the project team employs a discretization technique that divides Earth into billions of zones. Each zone is characterized with a set of geologic properties. Then, simulations calculate the surface motion for each zone. With an accurate understanding of surface motion in a given zone, researchers also get more precise estimates for how a building will be affected by shaking. The team's most recent simulations at NERSC divided a 100 km x 100 km x 30 km region into 60 billion zones. By simulating 30 km beneath the rupture site, the team can capture how surface-layer geology affects ground movements and buildings. Eventually, the researchers would like to get their models tuned up to do hazard assessments. As Pacific Gas & Electric (PG&E) begins to implement a very dense array of accelerometers into their SmartMeters -- a system of sensors that collects electric and natural gas use data from homes and businesses to help the customer understand and reduce their energy use -- McCallen is working with the utility company on potentially using that data to get a more accurate understanding of how the ground is actually moving in different geologic regions. "The San Francisco Bay is an extremely hazardous area from a seismic standpoint and the Hayward fault is probably one of the most potentially risky faults in the country," says McCallen. "We chose to model this area because there is a lot of information about the geology here, so our models are reasonably well-constrained by real data. And, if we can accurately measure the risk and hazards in the Bay Area, it'll have a big impact." He notes that the current seismic hazard assessment for Northern California identifies the Hayward Fault as the most likely to rupture with a magnitude 6.7 or greater event before 2044.
Simulations of ground motions from large -- magnitude 7.0 or more -- earthquakes require domains on the order of 100-500 km and resolution on the order of about one to five meters, which translates into hundreds of billions of grid points. As the researchers aim to model even higher frequency motions between 5 and 10 hertz, they will need denser computational grids and finer time steps, which will drive up computational demands. The only way to ultimately achieve these simulations is to exploit exascale computing, McCallen says. | Earthquakes | 2017 |
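A back-of-envelope calculation, using only the domain and zone counts quoted in the article, gives a feel for the resolution involved; the factor-of-16 scaling note at the end is a standard heuristic for explicit wave solvers, not a figure from the article.

```python
# Numbers from the article: 100 km x 100 km x 30 km domain, 60 billion zones.
domain_m3 = 100e3 * 100e3 * 30e3        # domain volume in cubic meters
zones = 60e9
cell_edge = (domain_m3 / zones) ** (1 / 3)
print(f"average zone edge length ≈ {cell_edge:.0f} m")   # ~17 m

# Standard heuristic (an assumption, not from the article): halving the
# grid spacing to double the resolved frequency multiplies grid points
# by 8 and roughly doubles the time steps -- about 16x more work.
print(f"cost multiplier per doubling of frequency ≈ {8 * 2}x")
```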
October 3, 2017 | https://www.sciencedaily.com/releases/2017/10/171003094005.htm | Earth's tectonic plates are weaker than once thought | No one can travel inside Earth to study what happens there. So scientists must do their best to replicate real-world conditions inside the lab. | "We are interested in large-scale geophysical processes, like how plate tectonics initiates and how plates move underneath one another in subduction zones," said David Goldsby, an associate professor at the University of Pennsylvania. "To do that, we need to understand the mechanical behavior of olivine, which is the most common mineral in the upper mantle of Earth." Goldsby, teaming with Christopher A. Thom, a doctoral student at Penn, as well as researchers from Stanford University, the University of Oxford and the University of Delaware, has now resolved a long-standing question in this area of research. While previous laboratory experiments resulted in widely disparate estimates of the strength of olivine in Earth's lithospheric mantle, the relatively cold and therefore strong part of Earth's uppermost mantle, the new work, published in the journal Science Advances, resolves the discrepancy. Because olivine in Earth's mantle has a larger grain size than most olivine samples tested in labs, the results suggest that the mantle, which comprises up to 95 percent of the planet's tectonic plates, is in fact weaker than once believed. This more realistic picture of the interior may help researchers understand how tectonic plates form, how they deform when loaded with the weight of, for example, a volcanic island such as Hawaii, or even how earthquakes begin and propagate. For more than 40 years, researchers have attempted to predict the strength of olivine in Earth's lithospheric mantle from the results of laboratory experiments. But tests in a lab are many layers removed from the conditions inside Earth, where pressures are higher and deformation rates are much slower than in the lab. A further complication is that, at the relatively low temperatures of Earth's lithosphere, the strength of olivine is so high that it is difficult to measure its plastic strength without fracturing the sample. The results of existing experiments have varied widely, and they don't align with predictions of olivine strength from geophysical models and observations. In an attempt to resolve these discrepancies, the researchers employed a technique known as nanoindentation, which is used to measure the hardness of materials. Put simply, the researchers measure the hardness of a material, which is related to its strength, by applying a known load to a diamond indenter tip in contact with a mineral and then measuring how much the mineral deforms. While previous studies have employed various high-pressure deformation apparatuses to hold samples together and prevent them from fracturing, a complicated set-up that makes measurements of strength challenging, nanoindentation does not require such a complex apparatus. "With nanoindentation," Goldsby said, "the sample in effect becomes its own pressure vessel.
The hydrostatic pressure beneath the indenter tip keeps the sample confined when you press the tip into the sample's surface, allowing the sample to deform plastically without fracture, even at room temperature." Performing 800 nanoindentation experiments in which they varied the size of the indentation by varying the load applied to the diamond tip pressed into the sample, the research team found that the smaller the size of the indent, the harder, and thus stronger, olivine became. "This indentation size effect had been seen in many other materials, but we think this is the first time it's been shown in a geological material," Goldsby said. Looking back at previously collected strength data for olivine, the researchers determined that the discrepancies in those data could be explained by invoking a related size effect, whereby the strength of olivine increases with decreasing grain size of the tested samples. When these previous strength data were plotted against the grain size in each study, all the data fit on a smooth trend that predicts lower-than-thought strengths in Earth's lithospheric mantle. In a related paper by Thom, Goldsby and colleagues, published recently in the journal Geophysical Research Letters, the researchers examined patterns of roughness in faults that have become exposed at Earth's surface due to uplifted plates and erosion. "Different faults have a similar roughness, and there's an idea published recently that says you might get those patterns because the strength of the materials on the fault surface increases with the decreasing scale of roughness," Thom said. "Those patterns and the frictional behavior they cause might be able to tell us something about how earthquakes nucleate and how they propagate." In future work, the Penn researchers and their team would like to study size-strength effects in other minerals and also to focus on the effect of increasing temperature on size effects in olivine. | Earthquakes | 2017 |
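One common way to describe an indentation size effect like the one reported here is the Nix-Gao relation, in which apparent hardness rises as indent depth shrinks. The sketch below is illustrative only — the model choice and all the numbers are assumptions, not values taken from the paper.

```python
import numpy as np

def nix_gao_hardness(h, H0, h_star):
    """Nix-Gao model of the indentation size effect:
    H(h) = H0 * sqrt(1 + h*/h), where h is indentation depth, H0 the
    bulk (large-indent) hardness, and h* a material length scale.
    Smaller indents -> higher apparent hardness."""
    return H0 * np.sqrt(1.0 + h_star / h)

# Illustrative numbers only (not from the study): bulk hardness of
# 10 GPa and a characteristic depth of 0.5 micrometers.
depths_um = np.array([0.1, 0.3, 1.0, 3.0, 10.0])
for h, H in zip(depths_um, nix_gao_hardness(depths_um, 10.0, 0.5)):
    print(f"depth {h:5.1f} um -> apparent hardness {H:5.2f} GPa")
```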
September 29, 2017 | https://www.sciencedaily.com/releases/2017/09/170929113010.htm | Erosion from ancient tsunami in Northern California | When you're investigating complex questions, you've often got to dig deep to find answers. A group of UC Santa Barbara geologists and their colleagues studying tsunamis did exactly that. | The team used ground-penetrating radar (GPR) to search for physical evidence of a large tsunami that pounded the Northern California coast near Crescent City some 900 years ago. They discovered that the giant wave removed three to five times more sand than any historical El Niño storm across the Pacific Coast of the United States. The researchers also estimated how far inland the coast eroded. Their findings appear in a newly published study. "We found a very distinct signature in the GPR data that indicated a tsunami and confirmed it with independent records detailing a tsunami in the area 900 years ago," explained lead author Alexander Simms, an associate professor in UCSB's Department of Earth Science and the campus's Earth Research Institute. "By using GPR, we were able to see a much broader view of the damage caused by that tsunami and measure the amount of sand removed from the beach." According to Simms, the magnitude and geography of that epic wave were similar to the one that occurred in Japan in 2011. Geologic records show that these large tsunamis hit the northwestern United States (Northern California through Washington state) every 300 to 500 years. The last one occurred in January 1700, which means another tsunami could happen any time in the next 200 to 300 years. "People have tried to figure out how far inland these waves hit, but our analysis provides concrete evidence of just how far inland the coast was eroded," Simms said. "Any structures would not only have been inundated, they would have been eroded away by the tsunami wave." When a tsunami recedes from land, it removes sand and reshapes the coastline. In the case of the event 900 years ago, the beach was eroded more than 6 feet down and more than 360 feet inland. "That's a big wedge of sand that moved from the beach," Simms explained. "But because there is so much sand in the system along the coast right after a tsunami, the beach heals pretty quickly on geological timescales. Some of the sand returns from being taken out to sea by the tsunami, but some comes from river catchments that deliver additional sand to the beach as a result of the concomitant earthquake." While the erosional scar can heal rather quickly, Simms noted, initially the coast is reshaped due to newly formed channels, cuts and scarps. Once the beach fills in, he added, the coastline straightens and returns to what it looked like prior to the tsunami. The paper demonstrates this process after the Dec. 26, 2004, Sumatra tsunami, with satellite imagery taken before the event, one month after and four years later. "The important thing to remember is that these tsunamis can erode the beach up to 360 feet inland," Simms said. "That means you have to be far inland to be safe when one of them occurs." | Earthquakes | 2017 |
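To put the quoted erosion figures in perspective, a rough estimate — assuming, purely for illustration, that the eroded beach profile was a simple triangular wedge — converts the depth and inland extent above into a sand volume per length of coastline. The geometry is an assumption, not the study's model.

```python
# Illustration only: treat the eroded profile as a triangular wedge
# 6 ft (~1.8 m) deep at the shoreline, tapering to zero ~360 ft (~110 m)
# inland. Both figures come from the article; the wedge shape does not.
FT_TO_M = 0.3048
depth_m = 6 * FT_TO_M
inland_m = 360 * FT_TO_M
per_meter = 0.5 * depth_m * inland_m          # wedge cross-section, m^2
print(f"~{per_meter:.0f} m^3 of sand lost per meter of coastline")
print(f"~{per_meter * 1000:,.0f} m^3 per kilometer of coast")
```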
September 28, 2017 | https://www.sciencedaily.com/releases/2017/09/170928142009.htm | Database of earthquakes triggered by human activity is growing, with some surprises | The Human-Induced Earthquake Database (HiQuake), the world's most complete database of earthquake sequences proposed to have been triggered by human activity, now includes approximately 730 entries, according to a report published October 4 in the "Data Mine" column of the journal Seismological Research Letters. | Mining projects (37%) and water impounded behind dams (23%) are the most commonly reported causes of induced earthquakes, but unconventional oil and gas extraction projects using hydraulic fracturing are now a frequent addition to the database, said Miles Wilson, a geophysicist at Durham University working on the HiQuake research effort. "Any successful hydraulic fracturing operation induces microseismicity because the rock is fractured. The number of hydraulically fractured boreholes has increased in recent years, so there is obviously going to be a trend between the number of successfully hydraulically fractured boreholes and the amount of associated microseismicity," Wilson said. "The more important trend is that between hydraulically fractured boreholes and unusually large earthquakes, most likely related to the reactivation of pre-existing geological faults." Other human activities related to unconventional extraction contribute to induced earthquakes as well, Wilson noted. "The most obvious induced seismicity trend in HiQuake is the recent increase in the number of waste-fluid disposal projects reported to have induced earthquakes. This increase is consistent with increased waste-fluid disposal activities in the USA." HiQuake is freely available online. To build the database, Wilson and his colleagues analyzed peer-reviewed literature, academic presentations, media articles, and industry and government reports for projects where scientific evidence suggests that the human activity was the cause of an earthquake sequence. Each entry in the database corresponds to a project, or phase of a project. The projects extend back almost 150 years, with most maximum observed magnitude earthquakes falling between magnitude 3 and 4. The largest proposed induced earthquake in the database was the 2008 magnitude 7.9 Wenchuan earthquake that occurred in China in response to the impoundment of the Zipingpu Reservoir only a few kilometers away from the mainshock epicenter. The HiQuake researchers were initially surprised to find that such large magnitude earthquakes were proposed as induced, Wilson said, "but most of the stress released in these cases is of natural tectonic origin. The anthropogenic activity is just the final straw that releases this built-up stress." At first, Wilson and colleagues were also surprised by the variety of proposed causes for these quakes, including nuclear explosions and the building of heavy skyscrapers. "With hindsight we probably shouldn't be surprised by any anthropogenic cause.
All anthropogenic projects influence forces acting in Earth's crust, for example by adding or removing mass, so we shouldn't be surprised that Earth responds to these changes and that in some cases earthquakes are the response." Human activities that act on the crust are likely to multiply in the future, Wilson noted, as projects to tap into geothermal sources of energy and to store carbon dioxide emissions become more widespread. "Additionally, mines may become larger, deeper, and more extensive, surface water reservoir impoundments more common, and buildings on larger scales could be built to meet a growing world population and resource demand," he said. "Perhaps one day a balance will need to be struck between earthquake hazard and resource demand." | Earthquakes | 2017 |
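Because each HiQuake entry records a proposed cause and a maximum observed magnitude, summary statistics like the 37% and 23% shares quoted above are simple aggregations. The sketch below runs on a made-up miniature table; the column names are illustrative assumptions, not the real database schema.

```python
import pandas as pd

# Made-up miniature table in the spirit of HiQuake; the real database
# has its own schema, so these column names are illustrative only.
records = pd.DataFrame({
    "cause": ["mining", "reservoir impoundment", "waste-fluid disposal",
              "hydraulic fracturing", "mining", "reservoir impoundment"],
    "max_magnitude": [4.2, 7.9, 5.7, 4.6, 3.1, 6.3],
})

# Share of entries per proposed cause (cf. the 37% / 23% figures above).
print(records["cause"].value_counts(normalize=True).mul(100).round(1))

# Largest proposed induced event per cause (cf. the Wenchuan M7.9 entry).
print(records.groupby("cause")["max_magnitude"].max())
```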
September 28, 2017 | https://www.sciencedaily.com/releases/2017/09/170928094201.htm | Large earthquakes along Olympic Mountain faults, Washington State, USA | A comprehensive study of faults along the north side of the Olympic Mountains of Washington State emphasizes the substantial seismic hazard to the northern Puget Lowland region. The study examined the Lake Creek-Boundary Creek and Sadie Creek faults along the north flank of the Olympic Mountains, and concludes that there were three to five large, surface-rupturing earthquakes along the faults within the last 13,000 years. | The study was published September 27 in the Bulletin of the Seismological Society of America (BSSA). While the presence of large earthquakes in the region is not surprising, given the ongoing tectonic deformation in the region, said Alan Nelson and Steve Personius of the U.S. Geological Survey, the Lake Creek-Boundary Creek fault, and other young, active faults like it, pose a significant earthquake hazard for the northern Puget Lowland region. The Puget Lowland includes Seattle and extends through western Washington from Bellingham in the north to Olympia and Tacoma in the south. The threat of a magnitude 8 to 9 megathrust earthquake and tsunami in the Pacific Northwest at the Cascadia subduction zone offshore, where the Juan de Fuca tectonic plate is pushed underneath the North American plate, often steals the seismic hazard spotlight in the region. But much shallower, upper-plate earthquakes also can produce strong ground shaking and damage. At least nine active upper-plate faults, like the Lake Creek-Boundary Creek fault, have been documented in the Puget Lowland, said Nelson. "If you consider the hazard from these upper-plate faults, whose earthquake epicenters are only 10 or 15 kilometers deep, future upper-plate earthquakes will be much closer to large population centers in the Puget Lowland region," Nelson said, "than will larger earthquakes on the plate boundary of the Cascadia subduction zone." Even if the time intervals between earthquakes on each upper-plate fault are thousands of years, Nelson noted, "when you put all those faults together, the chances of a damaging earthquake on one of those many faults are higher than they are for a megathrust earthquake, at least on average, over the last few thousands of years." The researchers studied airborne lidar imagery collected in 2002 and 2015 to identify fault scarps along the heavily forested areas of the Lake Creek-Boundary Creek and Sadie Creek faults. Mapping and dating of the stratigraphy in five trenches across scarps of the eastern section of the fault shows that there have been at least three earthquakes over the past 8,000 years, and the lidar mapping also shows evidence for multiple earthquakes on the western section of the fault. To gain a better understanding of the age, number, and magnitude of earthquakes on the faults, Elizabeth Schermer at Western Washington University and her colleagues plan additional trenching of fault scarps and coring of swampy areas along some scarps later this year. The new BSSA study suggests that the Olympic Mountains have been moving westward, relative to the Coast Range and Puget Lowland, since the late Pleistocene, said Schermer. Under this scenario, although an accretionary wedge in Earth's crust underneath the Olympic Mountains is being pushed eastward, the mountains also form a triangular block between the Lake Creek-Boundary Creek fault on the north and faults on their southeast flank.
Because the Olympic block is being squeezed between the northward moving Oregon Coast Range and Vancouver Island, the block moves or "escapes" westward. The two types of movements working together produce the uplift that created the mountain range, she noted. This westward "escape" model for the Olympics "predicts different rates of slip and therefore different sizes and frequencies of earthquakes on the other faults that interact with each other in the region," said Schermer. Data on the history of the Lake Creek-Boundary Creek fault and others in the region can help seismologists test their models of how the dynamics of the Cascadia subduction zone and the northward movement of the Oregon Coast Range may affect upper-plate earthquakes in the Puget Lowland. "We're looking at one small part of a complex jigsaw puzzle with the Lake Creek-Boundary Creek fault, but we need to figure out slip rates and earthquake histories on a lot of different pieces to really be able to put it all together," Schermer said. | Earthquakes | 2017 |
September 13, 2017 | https://www.sciencedaily.com/releases/2017/09/170913192943.htm | Measuring a crucial mineral in the mantle | University of Delaware professor Jessica Warren and colleagues from Stanford University, Oxford University and the University of Pennsylvania reported new data showing that material size effects matter in plate tectonics. | Plate tectonics, the way the Earth's plates move apart and come back together, has been used since the 1960s to explain the location of volcanoes and earthquakes. The study was published Wednesday, Sept. 13, in the American Association for the Advancement of Science journal Science Advances. "Measuring the strength of olivine is critical to understanding how strong tectonic plates are, which, in turn, matters to how plates break and create subduction zones like those along the Cascadia plate, which runs down the west coast of Canada to the west coast of the United States," said Warren, a geologist in the College of Earth, Ocean, and Environment. It's also important for understanding how plates move around over million-year time scales. The paper demonstrated that olivine's strength is size-sensitive and that olivine is stronger the smaller the volume that is measured, something that has been known in materials science for many metals and ceramics, but has not been studied in a geological material before. Warren explained that the problem with studying rocks on the earth's surface is that they are no longer subjected to the high pressures found inside the earth that cause materials to flow (like ice in a glacier). Recreating these elevated pressures in the laboratory is difficult, making it hard for scientists to study material strength in the lab. The researchers used a technique called instrumented nanoindentation to measure olivine's strength. The technique allowed them to recreate pressure conditions similar to those inside the earth by pressing a diamond tip that was carefully machined to a specific geometry into the olivine crystal to measure the material's response. The diamond tips ranged in size from 5 to 20 microns (1 micron = 0.000001 meter). The researchers performed hundreds of indentation tests on tiny olivine crystals less than a centimeter square and found that the olivine crystal became weaker as the size of the diamond tip increased. To validate this size effect, the researchers reviewed the available literature data on the strength of olivine to determine the sizes and areas that had been tested in previous experiments dating to the late 1970s. The size effect showed up in the old data, too. "The reason 40 years' worth of data don't agree from one experiment to the next is because scientists were measuring different sizes or areas of olivine," Warren said. "But if you plot the same information as a function of the sample size, the datasets in fact agree, and display the same general trend -- the larger the indentation in the material tested, the weaker the olivine becomes." Now that Warren and her colleagues understand this size effect, they are turning their attention to how temperature affects the strength of olivine and, more broadly, to where tectonic plates might break and give rise to potential subduction zones. Temperatures inside the earth are much hotter than on the surface and can range from 1,470 to 2,200 degrees Fahrenheit (800 to 1,200 degrees Celsius). The team also will consider what role water plays in the structure of olivine minerals and rocks in the earth.
According to Warren, current estimates suggest the earth contains the equivalent of 50 percent to four times the amount of water found in the global ocean. "When geologists look at how faults buckle and deform, it is at a very small length scale where size effects really matter, just like our olivine tests in the laboratory," Warren said. "But this size effect disappears when you get to a large enough length scale on tectonic plates, so we need to consider other things like when temperature and water begin to play a role." | Earthquakes | 2017 |
September 11, 2017 | https://www.sciencedaily.com/releases/2017/09/170911122735.htm | Earthquake triggers 'slow motion' quakes in New Zealand | Slow slip events, a type of slow motion earthquake that occurs over days to weeks, are thought to be capable of triggering larger, potentially damaging earthquakes. In a new study led by The University of Texas at Austin, scientists have documented the first clear-cut instance of the reverse -- a massive earthquake immediately triggering a series of large slow slip events. | Some of the slow slip events occurred as far away as 300 miles from the earthquake's epicenter. The study documents new linkages between the two types of seismic activity. "This is probably the clearest example worldwide of long-distance, large-scale slow slip triggering," said lead author Laura Wallace, a research scientist at the University of Texas Institute for Geophysics (UTIG). She also holds a joint position at GNS Science, a New Zealand research organization that studies natural hazards and resources. Co-authors include other GNS scientists, as well as scientists from Georgia Tech and the University of Missouri. UTIG is a research unit of the UT Jackson School of Geosciences. In November 2016, the second largest quake ever recorded in New Zealand -- the magnitude 7.8 Kaikōura quake -- hit the country's South Island. A GPS network operated by GeoNet, a partnership between GNS Science and the New Zealand Earthquake Commission, detected slow slip events hundreds of miles away beneath the North Island. The events occurred along the shallow part of the Hikurangi subduction zone that runs along and across New Zealand. A subduction zone is an area where a tectonic plate dives or "subducts" beneath an adjacent tectonic plate. This type of fault is responsible for causing some of the world's most powerful earthquakes, which occur when areas of built-up stress between the plates rupture. Slow slip events are similar to earthquakes, as they involve more rapid than normal movement between two pieces of Earth's crust along a fault. However, unlike earthquakes (where the movement occurs in seconds), movement in these slow slip events or "silent earthquakes" can take weeks to months to occur. The GPS network detected the slow slip events occurring on the Hikurangi subduction zone plate boundary in the weeks and months following the Kaikōura earthquake. The slow slip occurred at less than 9 miles deep below the surface (or seabed) and spanned an area of more than 6,000 square miles offshore from the Hawke's Bay and Gisborne regions, comparable with the area occupied by the state of New Jersey. There was also a deeper slow slip event triggered on the subduction zone at 15-24 miles beneath the Kapiti Coast region, just west of New Zealand's capital city Wellington. This deeper slow slip event near Wellington is still ongoing today. "The slow slip event following the Kaikōura earthquake is the largest and most widespread episode of slow slip observed in New Zealand since these observations started in 2002," Wallace said. The triggering effect was probably accentuated by an offshore "sedimentary wedge" -- a mass of sedimentary rock piled up at the edge of the subduction zone boundary offshore from the North Island's east coast.
This layer of more compliant rock is particularly susceptible to trapping seismic energy, which promotes slip between the plates at the base of the sedimentary wedge where the slow slip events occur. "Our study also suggests that the northward traveling rupture during the Kaikōura quake directed strong pulses of seismic energy towards the North Island, which also influenced the long-distance triggering of the slow slip events beneath the North Island," said Yoshihiro Kaneko, a seismologist at GNS Science. Slow slip events in the past have been associated with triggering earthquakes, including the magnitude 9.0 Tohoku earthquake that struck Japan in 2011. The researchers have also found that the slow slip events triggered by the Kaikōura quake were the catalyst for other quakes offshore from the North Island's east coast, including a magnitude 6.0 just offshore from the town of Porangahau on Nov. 22, 2016. Although scientists are still in the early stages of trying to understand the relationships between slow slip events and earthquakes, Wallace said that the study results highlight additional linkages between these processes. | Earthquakes | 2017 |
August 30, 2017 | https://www.sciencedaily.com/releases/2017/08/170830122545.htm | Machine-learning earthquake prediction in lab shows promise | By listening to the acoustic signal emitted by a laboratory-created earthquake, a machine learning algorithm can predict the time remaining before the fault fails. | "At any given instant, the noise coming from the lab fault zone provides quantitative information on when the fault will slip," said Paul Johnson, a Los Alamos National Laboratory fellow and lead investigator on the newly published research. "The novelty of our work is the use of machine learning to discover and understand new physics of failure, through examination of the recorded auditory signal from the experimental setup. I think the future of earthquake physics will rely heavily on machine learning to process massive amounts of raw seismic data. Our work represents an important step in this direction," he said. Not only does the work have potential significance to earthquake forecasting, Johnson said, but the approach is far-reaching, applicable to potentially all failure scenarios including nondestructive testing of industrial materials, brittle failure of all kinds, avalanches and other events. Machine learning is an artificial intelligence approach that allows a computer to learn from new data, updating its own results to reflect the implications of new information. The machine learning technique used in this project also identifies new signals, previously thought to be low-amplitude noise, that provide forecasting information throughout the earthquake cycle. "These signals resemble Earth tremor that occurs in association with slow earthquakes on tectonic faults in the lower crust," Johnson said. "There is reason to expect such signals from Earth faults in the seismogenic zone for slowly slipping faults." Machine learning algorithms can predict failure times of laboratory quakes with remarkable accuracy. The acoustic emission (AE) signal, which characterizes the instantaneous physical state of the system, reliably predicts failure far into the future. This is a surprise, Johnson pointed out, as all prior work had assumed that only the catalog of large events is relevant, and that small fluctuations in the AE signal could be neglected. To study the phenomena, the team analyzed data from a laboratory fault system that contains fault gouge, the ground-up material created by the stone blocks sliding past one another. An accelerometer recorded the acoustic emission emanating from the shearing layers. Following a frictional failure in the labquake, the shearing block moves or displaces, while the gouge material simultaneously dilates and strengthens, as shown by measurably increasing shear stress and friction. "As the material approaches failure, it begins to show the characteristics of a critical stress regime, including many small shear failures that emit impulsive acoustic emissions," Johnson explained.
And importantly, the signal (due to the gouge grinding and creaking that ultimately leads to the impulsive precursors) allows prediction in the laboratory, and, we hope, will lead to advances in prediction in the Earth, Johnson said. | Earthquakes | 2,017
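A minimal sketch of this kind of analysis, under stated assumptions: a synthetic stand-in for the acoustic-emission record is summarized by moving-window statistics (variance and kurtosis), and a random-forest regressor learns to map those statistics to the time remaining before the next failure. The signal model, window length, and feature choices here are illustrative assumptions, not details taken from the published study.

```python
# Hedged sketch: predict time-to-failure from statistics of a continuous
# signal. Everything below (signal, windows, features) is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "labquake" cycles: noise whose amplitude grows toward failure.
n_cycles, cycle_len = 50, 2000
signal, ttf = [], []
for _ in range(n_cycles):
    i = np.arange(cycle_len)
    amp = 1.0 + 5.0 * (i / cycle_len) ** 4   # emissions intensify near failure
    signal.append(rng.normal(0.0, amp))
    ttf.append(cycle_len - i)                # countdown to the next slip
signal = np.concatenate(signal)
ttf = np.concatenate(ttf).astype(float)

# Moving-window variance and kurtosis of the raw signal are the features.
win = 200
n_win = len(signal) // win
X, y = np.zeros((n_win, 2)), np.zeros(n_win)
for k in range(n_win):
    seg = signal[k * win:(k + 1) * win]
    var = seg.var()
    kurt = ((seg - seg.mean()) ** 4).mean() / var ** 2
    X[k], y[k] = (var, kurt), ttf[(k + 1) * win - 1]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out windows:", round(model.score(X_te, y_te), 2))
```

The toy example makes the same point the article highlights: continuous, low-amplitude signal statistics, not just a catalog of large events, can carry predictive information.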
August 25, 2017 | https://www.sciencedaily.com/releases/2017/08/170825104002.htm | The losses that come after the earthquake: Devastating and costly | Earthquakes: Nature's most unpredictable and one of her most devastating natural disasters. When high-intensity earthquakes strike, they can cause thousands of deaths and billions of dollars in damaged property. For decades, experts have studied major earthquakes; most have focused on fatalities and destruction in terms of the primary effects, the shaking unleashed. | A new study takes a different approach to generate a more complete picture. The study is titled "Losses Associated with Secondary Effects in Earthquakes." Since 1900, 2.3 million people have died in 2,233 earthquakes, yet it is important to understand that 93 percent of the fatalities that occurred as a result of violent earthquakes happened in only 1 percent of those earthquakes. In other words, the worst devastation tends to happen in only a very few quakes and generally as a result of dire secondary effects. Indeed, fully 40 percent of economic losses and deaths result from secondary effects rather than the shaking itself. Several key earthquakes have changed our knowledge of secondary effects and serve as models to understand and heed in planning communities, homes and buildings, highways, and infrastructure such as nuclear power plants. In 2004, the Indian Ocean earthquake unleashed tsunamis that killed a total of 227,300 people in Indonesia, Sri Lanka, India, and Thailand, plus more than $10 billion in damages. In 2011, the Tohoku earthquake created a series of huge tsunami waves, which damaged coastal communities, killing more than 17,900 people, forcing more than 50,000 households to relocate, and causing the failure of the Fukushima nuclear power plant -- a nuclear disaster second only to Chernobyl in Ukraine in 1986, and one that spread radiation across the Pacific Ocean. Studying the Indian Ocean and Tohoku earthquakes gives us information to create maximum tsunami height models for these high-risk areas to better predict how populations, property, and gross domestic product might be impacted in the future by similar events. The 1995 Kobe earthquake, also in Japan, and the 2011 Christchurch earthquake in New Zealand provide insight into the devastation that liquefaction can cause. Liquefaction occurs when sandy soils that are partially or fully saturated are turned from solids to liquid by the stress exerted upon the material by the earthquake. The result: soils that suddenly lose their strength and integrity and flow as landslides; liquefaction is especially destructive to buildings, highways, and mountain communities, such as Christchurch, New Zealand. In the past, fire has been the greatest contributor to damage following earthquakes. The 1906 San Francisco fire created an inferno of property damage. Five-sixths of the total damage was due to fire, worth tens of billions of dollars in today's market. Many of San Francisco's Victorian-era mansions, shops and businesses, and infrastructure -- indeed, whole neighborhoods -- burned to the ground in the city by the bay. In 1923, again in Japan, a fire that erupted following the Great Kanto earthquake killed more than 92,000 people and was responsible for two-thirds of the total damage, amounting in today's market to hundreds of billions of dollars. High-intensity earthquakes can also cause severe flooding.
While most dams and reservoirs have been designed to withstand earthquake forces, the simple lateral motion of an earthquake can cause natural and human-made structures to fail and unload large volumes of water. Landslides can also block rivers and create 'quake lakes,' which can then flood settlements downstream, as happened in 2008 following the Sichuan earthquake in China. The authors say that of the 6,800-plus dams and reservoirs worldwide, 623 are expected to face a significant shaking hazard within a return period of 475 years, and that of these, 333 are more than 45 years old and should be reassessed. The authors further detail their process for disaggregating fatalities and damages resulting from secondary effects, as compared to the actual shaking caused by the earthquake, by presenting two case studies: the 2011 Tohoku earthquake and associated tsunamis, and the 1960 Chile earthquake and tsunami sequence and landslides. As experts collect more data on secondary effects and resulting losses from high-intensity earthquakes, three benefits emerge. First, better models can be developed to understand the inherent risks and projected losses of building and living in certain areas. Second, scientists can reassess historic events, many of which were insufficiently recorded at the time. Third, in this paper the authors demonstrate that to truly learn from these violent events, data must be shared internationally and new technologies employed to process large volumes of information -- otherwise, these tragedies appear as isolated, random events, rather than as natural disasters to which we can and must adapt. | Earthquakes | 2,017
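For context on the dam figures: a 475-year return period is the conventional seismic design level corresponding to roughly a 10 percent chance of exceedance over a 50-year exposure window. A minimal check of that equivalence, assuming independent years:

```python
# Probability of at least one exceedance in t years for a hazard with
# return period T, assuming independent years (a standard approximation).
T = 475.0   # return period in years
t = 50.0    # exposure window in years

p = 1.0 - (1.0 - 1.0 / T) ** t
print(f"P(at least one exceedance in {t:.0f} yr) = {p:.1%}")  # about 10%
```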
August 23, 2017 | https://www.sciencedaily.com/releases/2017/08/170823131226.htm | More than expected hidden beneath Andean Plateau | Seismologists investigating how Earth forms new continental crust have compiled more than 20 years of seismic data from a wide swath of South America's Andean Plateau and determined that processes there have produced far more continental rock than previously believed. | "When crust from an oceanic tectonic plate plunges beneath a continental tectonic plate, as it does beneath the Andean Plateau, it brings water with it and partially melts the mantle, the layer below Earth's crust," said Rice University's Jonathan Delph, co-author of the new study, which was published online this week. Delph, a Wiess Postdoctoral Research Associate in Rice's Department of Earth, Environmental and Planetary Science, said the findings suggest that mountain-forming regions like the Andean Plateau, which geologists refer to as "orogenic plateaus," could produce much larger volumes of continental rock in less time than previously believed. Study lead author Kevin Ward, a postdoctoral researcher at the University of Utah, said, "When we compared the amount of trapped plutonic rock beneath the plateau with the amount of erupted volcanic rock at the surface, we found the ratio was almost 30:1. That means 30 times more melt gets stuck in the crust than is erupted, which is about six times higher than what's generally believed to be the average. That's a tremendous amount of new material that has been added to the crust over a relatively short time period." The Andean Plateau covers much of Bolivia and parts of Peru, Chile and Argentina. Its average height is more than 12,000 feet, and though it is smaller than Asia's Tibetan Plateau, different geologic processes created the Andean Plateau. The mountain-building forces at work in the Andean Plateau are believed to be similar to those that worked along the western coast of the U.S. some 50 million years ago, and Delph said it's possible that similar forces were at work along the coastlines of continents throughout Earth's history. Most of the rocks that form Earth's crust initially came from partial melts of the mantle. If the melt erupts quickly, it forms basalt, which makes up the crust beneath the oceans on Earth; but there are still questions about how continental crust, which is more buoyant than oceanic crust, is formed. Delph said he and Ward began their research in 2016 as they were completing their Ph.D.s at the University of Arizona. The pair spent several months combining public datasets from seismic experiments by several U.S. and German institutions. Seismic energy travels through different types of rock at different speeds, and by combining datasets that covered a 500-mile-wide swath of the Andean Plateau, Ward and Delph were able to resolve large plutonic volumes that had previously been seen only in pieces. Over the past 11 million years, volcanoes have erupted thousands of cubic miles' worth of material over much of the Andean Plateau. Ward and Delph calculated their plutonic-to-volcanic ratio by comparing the volume of regions where seismic waves travel extremely slowly beneath volcanically active regions, indicating some melt is present, with the volume of rock deposited on the surface by volcanoes. "Orogenic oceanic-continental subduction zones have been common as long as modern plate tectonics have been active," Delph said.
"Our findings suggest that processes similar to those we observe in the Andes, along with the formation of supercontinents, could have been a significant contributor to the episodic formation of buoyant continental crust." | Earthquakes | 2,017 |
August 22, 2017 | https://www.sciencedaily.com/releases/2017/08/170822103204.htm | Earth history: How continents were recycled | Researchers from Germany and Switzerland have used computer simulations to analyse how plate tectonics have evolved on Earth over the last three billion years. They show that tectonic processes have changed over the course of time, and demonstrate how those changes contributed to the formation and destruction of continents. The model reconstructs how present-day continents, oceans and the atmosphere may have evolved. | Priyadarshi Chowdhury and Prof Dr Sumit Chakraborty from Ruhr-Universität Bochum, together with Prof Dr Taras Gerya from the Swiss Federal Institute of Technology in Zürich (ETH), present their work in a new journal article. The Earth formed approximately four and a half billion years ago. There was a phase -- perhaps even several -- when it was mainly composed of molten rock. As it cooled, solid rock and the Earth's crust formed. Generally speaking, there are two types of crust on Earth: a lighter continental crust that is rich in silicon and constitutes the dry land above sea level, and a denser oceanic crust where water gathers in the form of large oceans. "These properties render the Earth habitable," says Sumit Chakraborty. "We haven't found anything comparable anywhere else in the universe." Even though the young Earth did have continents and oceans, there were initially perhaps no plates and, consequently, no plate tectonics. The question of when they emerged is much disputed. The Earth's crust slowly assumed its present dynamic form: in some places the plates go into the mantle; in other places new plates form from the hot material that rises from the interior of the Earth. Also, the question of when plate tectonics first emerged is not the only one that remains unanswered; it is also unclear whether that process has always been the same and whether continents last forever or are recycled. These are the questions that the German-Swiss research team investigated. Their new thermomechanical computer model supports the growing notion that perhaps plate tectonics was already operating approximately three billion years ago. More uniquely, the study demonstrates how the Earth's earliest continental crust -- richer in iron and magnesium -- was destroyed some two or three billion years ago and how the present continental crust -- richer in silicon -- formed from it. On the young Earth, continents were recycled all the time. Continental recycling still takes place today when two continents collide, but it progresses more slowly and in a different manner than it used to. "Over time, the continental crust became prone to preservation during continent-continent collision," says Priyadarshi Chowdhury. On the old, still hot Earth, thin layers peeled off from the Earth's crust, whereas on the present-day Earth, chunks of the continental crust break off in the collision zones, i.e., in places where one plate moves under another. The researchers assume that the destruction of the early iron- and magnesium-rich continental crust was crucial for the formation of the silicon-rich continents and that it was the reason why these continents could rise above sea level to a larger extent. "These changes to the continental character might have contributed to the Great Oxygenation Event on Earth -- and, consequently, to the origin of life as we know it," suspects Chowdhury. | Earthquakes | 2,017
August 16, 2017 | https://www.sciencedaily.com/releases/2017/08/170816084935.htm | How friction evolves during an earthquake | By simulating earthquakes in a lab, engineers at Caltech have documented the evolution of friction during an earthquake -- measuring what could once only be inferred, and shedding light on one of the biggest unknowns in earthquake modeling. | Before an earthquake, static friction helps hold the two sides of a fault immobile and pressed against each other. During the passage of an earthquake rupture, that friction becomes dynamic as the two sides of the fault grind past one another. Dynamic friction evolves throughout an earthquake, affecting how much and how fast the ground will shake and thus, most importantly, the destructiveness of the earthquake. "Friction plays a key role in how ruptures unzip faults in the Earth's crust," says Vito Rubino, research scientist at Caltech's Division of Engineering and Applied Science (EAS). "Assumptions about dynamic friction affect a wide range of earthquake science predictions, including how fast ruptures will occur, the nature of ground shaking, and residual stress levels on faults. Yet the precise nature of dynamic friction remains one of the biggest unknowns in earthquake science." Previously, it had commonly been believed that the evolution of dynamic friction was mainly governed by how far the fault slipped at each point as a rupture went by -- that is, by the relative distance one side of a fault slides past the other during dynamic sliding. Analyzing earthquakes that were simulated in a lab, the team instead found that sliding history is important, but the key long-term factor is actually the slip velocity -- not just how far the fault slips, but how fast. Rubino is the lead author of a paper on the team's findings. The team conducted the research at a Caltech facility, directed by Ares Rosakis, that has been unofficially dubbed the "seismological wind tunnel." At the facility, researchers use advanced high-speed optical diagnostics and other techniques to study how earthquake ruptures occur. "Our unique facility allows us to study dynamic friction laws by following individual, fast-moving shear ruptures and recording friction along their sliding faces in real time," Rosakis says. "This allows us for the first time to study friction point-wise and without having to assume that sliding occurs uniformly, as is done in classical friction studies," Rosakis adds. To simulate an earthquake in the lab, the researchers first cut in half a transparent block of a type of plastic known as homalite, which has similar mechanical properties to rock. They then put the two pieces together under pressure, simulating the static friction that builds up along a fault line. Next, they placed a small nickel-chromium wire fuse at the location where they wanted the epicenter of the quake to be.
Triggering the fuse produced a local pressure release, which reduced friction at that location and allowed a very fast rupture to propagate up the miniature fault. In this study, the team recorded these simulated earthquakes using a new diagnostic method that combines high-speed photography (at 2 million frames per second) with a technique called digital image correlation, in which individual frames are compared and contrasted with one another and changes between those images -- indicating motion -- are tracked with sub-pixel accuracy. "Some numerical models of earthquake rupture, including the ones developed in my group at Caltech, have used friction laws with slip-velocity dependence, based on a collection of rock mechanics experiments and theories. It is gratifying to see those formulations validated by the spontaneous mini-earthquake ruptures in our study," says Nadia Lapusta, professor of mechanical engineering and geophysics at Caltech. In future work, the team plans to use its observations to improve the existing mathematical models about the nature of dynamic friction and to help create new ones that better represent the experimental observations; such new models would improve computer earthquake simulations. | Earthquakes | 2,017
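Digital image correlation is named in the article but not spelled out. The sketch below shows the core operation -- estimating a sub-pixel shift between two frames -- using phase cross-correlation from scikit-image as one standard off-the-shelf tool. It is a stand-in for, not a reproduction of, the authors' diagnostic pipeline; the synthetic speckle frames and shift values are invented.

```python
# Hedged digital-image-correlation sketch: recover the sub-pixel offset
# between two frames via phase cross-correlation (scikit-image).
import numpy as np
from scipy.ndimage import shift as subpixel_shift
from skimage.registration import phase_cross_correlation

rng = np.random.default_rng(1)
frame0 = rng.random((256, 256))               # stand-in speckle pattern
frame1 = subpixel_shift(frame0, (0.6, -1.3))  # frame after a small motion

# upsample_factor sets the sub-pixel resolution of the estimate.
shift_est, error, _ = phase_cross_correlation(frame0, frame1,
                                              upsample_factor=100)
# Recovers the (0.6, -1.3) pixel offset up to the library's sign convention.
print("estimated shift (rows, cols):", shift_est)
```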
August 14, 2017 | https://www.sciencedaily.com/releases/2017/08/170814104409.htm | New plate adds plot twist to ancient tectonic tale | A microplate discovered off the west coast of Ecuador adds another piece to Earth's tectonic puzzle, according to Rice University scientists. | Researchers led by Rice geophysicist Richard Gordon discovered the microplate, which they have named "Malpelo," while analyzing the junction of three other plates in the eastern Pacific Ocean. The Malpelo Plate, named for an island and an underwater ridge it contains, is the 57th plate to be discovered and the first in nearly a decade, they said. They are sure there are more to be found. The research, by Gordon, lead author Tuo Zhang, and co-authors Jay Mishra and Chengzu Wang, all of Rice, was recently published. How do geologists discover a plate? In this case, they carefully studied the movements of other plates and their evolving relationships to one another as the plates move at a rate of millimeters to centimeters per year. The Pacific lithospheric plate that roughly defines the volcanic Ring of Fire is one of about 10 major rigid tectonic plates that float and move atop Earth's mantle, which behaves like a fluid over geologic time. Interactions at the edges of the moving plates account for most earthquakes experienced on the planet. There are many small plates that fill the gaps between the big ones, and the Pacific Plate meets two of those smaller plates, the Cocos and Nazca, west of the Galapagos Islands. One way to judge how plates move is to study plate-motion circuits, which quantify how the rotation speed of each object in a group (its angular velocity) affects all the others. Rates of seafloor spreading determined from marine magnetic anomalies, combined with the angles at which the plates slide by each other over time, tell scientists how fast the plates are turning. "When you add up the angular velocities of these three plates, they ought to sum to zero," Gordon said. "In this case, the velocity doesn't sum to zero at all. It sums to 15 millimeters a year, which is huge." That made the Pacific-Cocos-Nazca circuit a misfit, which meant at least one other plate in the vicinity had to make up the difference. Misfits are a cause for concern -- and a clue. Knowing the numbers were amiss, the researchers drew upon a Columbia University database of extensive multibeam sonar soundings west of Ecuador and Colombia to identify a previously unknown plate boundary between the Galapagos Islands and the coast. Previous researchers had assumed most of the region east of the known Panama transform fault was part of the Nazca Plate, but the Rice researchers determined it moves independently. "If this is moving in a different direction, then this is not the Nazca plate," Gordon said. "We realized this is a different plate and it's moving relative to the Nazca." Evidence for the Malpelo Plate came with the researchers' identification of a diffuse plate boundary that runs from the Panama transform fault eastward to where it intersects a deep oceanic trench just offshore of Ecuador and Colombia. "A diffuse boundary is best described as a series of many small, hard-to-spot faults rather than a ridge or transform fault that sharply defines the boundary of two plates," Gordon said.
"Because earthquakes along diffuse boundaries tend to be small and less frequent than along transform faults, there was little information in the seismic record to indicate this one's presence.""With the Malpelo accounted for, the new circuit still doesn't close to zero and the shrinking Pacific Plate isn't enough to account for the difference either," Zhang said. "The nonclosure around this triple junction goes down -- not to zero, but only to 10 or 11 millimeters a year."Since we're trying to understand global deformation, we need to understand where the rest of that velocity is going," he said. "So we think there's another plate we're missing."Plate 58, where are you?Gordon is the W.M. Keck Professor of Geophysics. Zhang and Wang are Rice graduate students and Mishra is a Rice alumnus.The National Science Foundation supported the research. | Earthquakes | 2,017 |
August 2, 2017 | https://www.sciencedaily.com/releases/2017/08/170802152517.htm | Similar characteristics found in human-induced and natural earthquakes | Whether an earthquake occurs naturally or as a result of unconventional oil and gas recovery, the destructive power is the same, according to a new study. | The finding contradicts previous observations suggesting that induced earthquakes exhibit weaker shaking than natural ones. The work could help scientists make predictions about future earthquakes and mitigate their potential damage. "People have been debating the strength of induced earthquakes for decades -- our study resolves this question," said co-author William Ellsworth, a professor in the Geophysics Department at Stanford's School of Earth, Energy & Environmental Sciences and co-director of the Stanford Center for Induced and Triggered Seismicity (SCITS). "Now we can begin to reduce our uncertainty about how hard induced earthquakes shake the ground, and that should lead to more accurate estimates of the risks these earthquakes pose to society going forward." Earthquakes in the central U.S. have increased over the past 10 years due to the expansion of unconventional oil and gas operations that discard wastewater by injecting it into the ground. About 3 million people in Oklahoma and southern Kansas live with an increased risk of experiencing induced earthquakes. "The stress that is released by the earthquakes is there already -- by injecting water, you're just speeding up the process," said co-author Gregory Beroza, the Wayne Loel Professor in geophysics at Stanford Earth and co-director of SCITS. "This research sort of simplifies things, and shows that we can use our understanding of all earthquakes for more effective mitigation." Oklahoma experienced its largest seismic event in 2016, when three large earthquakes measuring greater than magnitude 5.0 caused significant damage to the area. Since the beginning of 2017, the number of earthquakes magnitude 3.0 and greater has fallen, according to the Oklahoma Geological Survey. That drop is partly due to new regulations to limit wastewater injection that came out of research into induced earthquakes. To test the destructive power of an earthquake, researchers measured the force driving tectonic plates to slip, known as the stress drop -- the difference between a fault's stress before and after an earthquake. The team analyzed seismic data from 39 human-made and natural earthquakes ranging from magnitude 3.3 to 5.8 in the central U.S. and eastern North America. After accounting for factors such as the type of fault slip and earthquake depth, results show the stress drops of induced and natural earthquakes in the central U.S. share the same characteristics. A second finding of the research shows that most earthquakes in the eastern U.S. and Canada exhibit stronger shaking potential because they occur on what are known as reverse faults. These types of earthquakes are typically associated with mountain building and tend to exhibit more shaking than those that occur in the central U.S. and California.
Although the risk for naturally occurring earthquakes is low, the large populations and fragile infrastructure in this region make it vulnerable when earthquakes do occur. The team also analyzed how deep the earthquakes occur underground and concluded that as quakes occur deeper, the rocks become stronger, and the stress drop, or the force behind the earthquakes, becomes more powerful. "Both of these conclusions give us new predictive tools to be able to forecast what the ground motions might be in future earthquakes," Ellsworth said. "The depth of the quake is also going to be important, and that needs to be considered as people begin to revise these ground-motion models that describe how strong the shaking will be." The scientists said that the types of rocks being exploited by unconventional oil and gas recovery in the U.S. and Canada can be found all over the world, making the results of this study widely applicable. "As we can learn better practices, we can help ensure that the hazards induced earthquakes pose can be reduced in other parts of the world as well," Ellsworth said. | Earthquakes | 2,017
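The article does not give the estimation procedure, but a common textbook route to stress drop combines the seismic moment with a source radius inferred from the spectrum's corner frequency, via the Brune source model and the Eshelby circular-crack relation. The sketch below follows that standard route with illustrative parameter values; it is not necessarily the method used in this particular study.

```python
# Hedged stress-drop sketch using standard source-model relations.
import math

def brune_radius(fc_hz, beta_m_s=3500.0):
    """Source radius (m) from corner frequency, Brune-model constant 2.34."""
    return 2.34 * beta_m_s / (2.0 * math.pi * fc_hz)

def stress_drop(m0_nm, radius_m):
    """Eshelby circular crack: delta_sigma = 7*M0 / (16*r^3), in Pa."""
    return 7.0 * m0_nm / (16.0 * radius_m ** 3)

def moment_from_mw(mw):
    """Seismic moment (N*m) from moment magnitude."""
    return 10.0 ** (1.5 * mw + 9.05)

# Illustrative event: magnitude 4.0 with a 2 Hz corner frequency.
m0 = moment_from_mw(4.0)
r = brune_radius(2.0)
print("stress drop: %.1f MPa" % (stress_drop(m0, r) / 1e6))  # a few MPa
```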
August 1, 2017 | https://www.sciencedaily.com/releases/2017/08/170801101730.htm | High tsunami danger in Alaska, perhaps elsewhere | Scientists probing under the seafloor off Alaska have mapped a geologic structure that they say signals potential for a major tsunami in an area that normally would be considered benign. They say the feature closely resembles one that produced the 2011 Tohoku tsunami off Japan, killing some 20,000 people and melting down three nuclear reactors. Such structures may lurk unrecognized in other areas of the world, say the scientists. The findings appear in a newly published study. | The discovery "suggests this part of Alaska is particularly prone to tsunami generation," said seismologist Anne Bécel of Columbia University's Lamont-Doherty Earth Observatory, who led the study. "The possibility that such features are widespread is of global significance." In addition to Alaska, she said, waves could hit more southerly North American coasts, Hawaii and other parts of the Pacific. Tsunamis can occur as giant plates of ocean crust dive under adjoining continental crust, a process called subduction. Some plates get stuck for decades or centuries and tension builds, until they suddenly slip by each other. This produces a big earthquake, and the ocean floor may jump up or down like a released spring. That motion transfers to the overlying water, creating a surface wave. The 2011 Japan tsunami was a surprise, because it came partly on a "creeping" segment of seafloor, where the plates move steadily, releasing tension in frequent small quakes that should prevent a big one from building. But researchers are now recognizing it may not always work that way. Off Japan, part of the leading edge of the overriding continental plate had become somewhat detached from the main mass. When a relatively modest quake dislodged this detached wedge, it jumped, unleashing a wave that topped 130 feet in places. The telltale sign of danger, in retrospect: a fault in the seafloor that demarcated the detached section's boundary landward of the "trench," the zone where the two plates initially meet. The fault had been known to exist, but no one had understood what it meant. The researchers in the new study have now mapped a similar system in the Shumagin Gap, a creeping subduction zone near the end of the Alaska Peninsula some 600 miles from Anchorage. The segment is part of a subduction arc spanning the peninsula and the Aleutian Islands. Sailing on a specially equipped research vessel, the scientists used relatively new technology to penetrate deep into the seafloor with powerful sound pulses. By reading the echoes, they created CAT-scan-like maps of both the surface and what is underneath. The newly mapped fault lies between the trench and the coast, stretching perhaps 90 miles underwater more or less parallel to land. On the seafloor, it is marked by scarps about 15 feet high, indicating that the floor has dropped on one side and risen on the other. The fault extends down more than 20 miles, all the way to where the two plates are moving against each other. The team also analyzed small earthquakes in the region, and found a cluster of seismicity where the newly identified fault meets the plate boundary. This, they say, indicates that the fault may be active. Earthquake patterns also suggest that frictional properties on the seaward side of the fault differ from those on the landward side.
These differences may have created the fault, slowly tearing the region off the main mass; or the fault may be the remains of a past sudden movement. Either way, it signals danger, said coauthor Donna Shillington, a Lamont-Doherty seismologist. "With that big fault there, that outer part of the plate could move independently and make a tsunami a lot more effective," said Shillington. "You get a lot more vertical motion if the part that moves is close to the seafloor surface." A rough analogy: imagine snapping off a small piece of a dinner plate, laying the two pieces together on a table and pounding the table from below; the smaller piece will probably jump higher than if the plate were whole, because there is less holding it down. Other parts of the Aleutian subduction zone are already known to be dangerous. A 1946 quake and tsunami originating further west killed more than 160 people, most in Hawaii. In 1964, an offshore quake killed around 140 people with landslides and tsunamis, mainly in Alaska; 19 people died in Oregon and California, and waves were detected as far off as Papua New Guinea and even Antarctica. In July 2017, an offshore quake near the western tip of the Aleutians triggered a Pacific-wide tsunami warning, but luckily it produced just a six-inch local wave. As for the Shumagin Gap, in 1788, Russian colonists then living on nearby Unga Island recorded a great quake and tsunami that wiped out coastal structures and killed many native Aleut people. The researchers say it may have originated at the Shumagin Gap, but there is no way to be sure. Rob Witter, a geologist with the U.S. Geological Survey (USGS), has scoured area coastlines for evidence of such a tsunami, but so far evidence has eluded him, he said. The potential danger "remains a puzzle here," he said. "We know so little about the hazards of subduction zones. Every little bit of new information we can glean about how they work is valuable, including the findings in this new paper." The authors say that apart from Japan, such a fault structure has been well documented only off Russia's Kuril Islands, west of the Aleutians. But, Shillington said, "We don't have images from many places. If we were to look around the world, we would probably see a lot more." John Miller, a retired USGS scientist who has studied the Aleutians, said that his own work suggests other segments of the arc have threatening features that resemble both those in the Shumagin and off Japan. "The dangers of areas like these are just now being widely recognized," he said. Lamont seismologists have been studying earthquakes in the Aleutians since the 1960s, but early studies were conducted mainly on land. In the 1980s, the USGS collected the same type of data used in the new study, but seismic equipment that can now produce far more detailed images deep under the seafloor made this latest discovery possible, said Bécel. She and others conducted the imaging survey aboard the Marcus G. Langseth, the United States' flagship vessel for acoustic research. Owned by the U.S. National Science Foundation, it is operated by Lamont-Doherty on behalf of the nation's universities and other research institutions. | Earthquakes | 2,017
July 26, 2017 | https://www.sciencedaily.com/releases/2017/07/170726141642.htm | Group relocation preserves social connections among elderly Japanese tsunami survivors | Relocating in groups, rather than individually, increased informal socializing and social participation among older survivors of the 2011 Great East Japan Earthquake and Tsunami, a new study shows. The finding suggests local authorities should consider moving residents (especially older adults, who are disproportionately impacted by disasters) out of disaster areas and into temporary or permanent housing as a community. | Hiroyuki Hikichi and colleagues monitored adults over the age of 65 beginning roughly seven months before the disaster, originally to conduct an analysis of healthy aging -- inadvertently providing baseline information for their "natural experiment" prior to the onset of the disaster. This unintentional design, in effect, eliminated the problem of "recall bias" (where accuracy of memory may influence the outcome of most studies conducted in post-disaster settings). Following the earthquake, whose epicenter lay about 80 kilometers east of the study area in the city of Iwanuma, and the ensuing tsunami, some survivors were relocated in groups to temporary public trailer housing, while others moved on their own. The researchers surveyed survivors about their individual social behaviors (such as informal socializing with neighbors) as well as perceptions about their communities. Nearly three years after the disaster, respondents who relocated with the whole community showed increases in informal socializing and social participation, including the frequency of meeting with friends and participating in sports and hobby clubs. On the other hand, individual relocation was associated with decreased social cohesion and less informal socializing and social participation. Taken together, the results offer insight into how disasters impact socialization, and may contribute to improved disaster resilience. | Earthquakes | 2,017
July 24, 2017 | https://www.sciencedaily.com/releases/2017/07/170724113557.htm | Strength of tectonic plates may explain shape of the Tibetan Plateau | Geoscientists have long puzzled over the mechanism that created the Tibetan Plateau, but a new study finds that the landform's history may be controlled primarily by the strength of the tectonic plates whose collision prompted its uplift. Given that the region is one of the most seismically active areas in the world, understanding the plateau's geologic history could give scientists insight into modern-day earthquake activity. | The new findings were recently published. Even from space, the Tibetan Plateau appears huge. The massive highland, formed by the convergence of two continental plates, India and Asia, dwarfs other mountain ranges in height and breadth. Most other mountain ranges appear like narrow scars of raised flesh, while the Tibetan Plateau looks like a broad, asymmetrical scab surrounded by craggy peaks. "The asymmetric shape and complex subsurface structure of the Tibetan Plateau make its formation one of the most significant outstanding questions in the study of plate tectonics today," said University of Illinois geology professor and study co-author Lijun Liu. In the classic model of Tibetan Plateau formation, a fast-moving Indian continental plate collided head-on with the relatively stationary Asian plate about 50 million years ago. The convergence is likely to have caused the Earth's crust to bunch up into the massive pile known as the Himalaya Mountains and Tibetan Plateau seen today, but this does not explain why the plateau is asymmetrical, Liu said. "The Tibetan Plateau is not uniformly wide," said Lin Chen, the lead author from the Chinese Academy of Sciences. "The western side is very narrow and the eastern side is very broad -- something that many past models have failed to explain." Many of those past models have focused on the surface geology of the actual plateau region, Liu said, but the real story might be found further down, where the Asian and Indian plates meet. "There is a huge change in topography on the plateau, or the Asian plate, while the landform and moving speed of the Indian plate along the collision zone are essentially the same from west to east," Liu said. "Why does the Asian plate vary so much?" To address this question, Liu and his co-authors looked at what happens when tectonic plates made from rocks of different strengths collide. A series of 3-D computational continental collision models was used to test this idea. "We looked at two scenarios -- a weak Asian plate and a strong Asian plate," said Liu. "We kept the incoming Indian plate strong in both models." When the researchers let the models run, they found that a strong Asian plate scenario resulted in a narrow plateau. The weak Asian plate model produced a broad plateau, like what is seen today. "We then ran a third scenario, which is a composite of the strong and weak Asian plate models," said Liu. "An Asian plate with a strong western side and weak eastern side results in an orientation very similar to what we see today." This model, besides predicting the surface topography, also helps explain some of the complex subsurface structure seen using seismic observation techniques. "It is exciting to see that such a simple model leads to something close to what we observe today," Liu said. "The location of modern earthquake activity and land movement corresponds to what we predict with the model, as well." | Earthquakes | 2,017
July 20, 2017 | https://www.sciencedaily.com/releases/2017/07/170720142223.htm | Crustal limestone platforms feed carbon to many of Earth's arc volcanoes | A new analysis suggests that much of the carbon released from volcanic arcs, chains of volcanoes that arise along the plate boundaries of subduction zones, comes from remobilizing limestone reservoirs in the Earth's crust. Previous research suggested the carbon was sourced from the mantle as a result of the subduction process. | The discovery ultimately impacts the amount of organic carbon scientists believe was buried in the past. Carbon cycling between surface reservoirs and the mantle over geologic history is important because the imbalance greatly influences the amount of total carbon at Earth's surface. However, the source of carbon from volcanic arc outgassing remained uncertain. Emily Mason and colleagues compiled a global data set of carbon and helium isotopes to determine the origin of the carbon. The data reveal that many volcanic arcs mobilize carbon from large, crustal carbonate platforms -- particularly in Italy, the Central American Volcanic Arc, Indonesia, and Papua New Guinea. In contrast, arcs located in the northern Pacific, such as Japan and Kuril-Kamchatka, release carbon dioxide with an isotope signature indicative of a mantle source. The recognition of a large amount of crustal carbon in the overall carbon isotope signature requires, from a mass-balance consideration, downward revision of how much organic carbon was buried in the past. | Earthquakes | 2,017
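One simple way to see how isotopes can apportion carbon between sources is a two-endmember mixing calculation on delta-13C. The endmember values below (about -5 per mil for mantle carbon and about 0 per mil for marine limestone) are commonly quoted approximations, and the sample value is invented; none of these numbers come from the study itself, which also folds in helium isotopes.

```python
# Hedged two-endmember mixing sketch for arc carbon sources.
D13C_MANTLE = -5.0     # per mil, a typical mantle value (assumed)
D13C_LIMESTONE = 0.0   # per mil, a typical marine carbonate value (assumed)

def limestone_fraction(d13c_sample):
    """Fraction of carbon from the limestone endmember (linear mixing)."""
    f = (d13c_sample - D13C_MANTLE) / (D13C_LIMESTONE - D13C_MANTLE)
    return min(max(f, 0.0), 1.0)

# A hypothetical arc gas with delta-13C of -2 per mil:
print("limestone-derived fraction: %.0f%%" % (100 * limestone_fraction(-2.0)))
```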
July 19, 2017 | https://www.sciencedaily.com/releases/2017/07/170719084814.htm | Sea cave preserves 5,000-year snapshot of tsunamis | An international team of scientists digging in a sea cave in Indonesia has discovered the world's most pristine record of tsunamis, a 5,000-year-old sedimentary snapshot that reveals for the first time how little is known about when earthquakes trigger massive waves. | "The devastating 2004 Indian Ocean tsunami caught millions of coastal residents and the scientific community off-guard," says co-author Benjamin Horton, a professor in the Department of Marine and Coastal Sciences at Rutgers University-New Brunswick. "Our geological record from a cave illustrates that we still cannot predict when the next earthquake will happen." "Tsunamis are not evenly spaced through time," says Charles Rubin, the study's lead author and a professor at the Earth Observatory of Singapore, part of Nanyang Technological University. "Our findings present a worrying picture of highly erratic tsunami recurrence. There can be long periods between tsunamis, but you can also get major tsunamis that are separated by just a few decades." The discovery, reported in a newly published study, was made in a sea cave on the west coast of Sumatra in Indonesia, just south of the city of Banda Aceh, which was devastated by the tsunami of December 2004. The stratigraphic record reveals successive layers of sand, bat droppings and other debris laid down by tsunamis between 7,900 and 2,900 years ago. The stratigraphy since 2,900 years ago was washed away by the 2004 tsunami. The L-shaped cave had a rim of rocks at the entrance that trapped successive layers of sand inside. The researchers dug six trenches and analyzed the alternating layers of sand and debris using radiocarbon dating. The researchers define "pristine" as stratigraphic layers that are distinct and easy to read. "You have a layer of sand and a layer of organic material that includes bat droppings, so simply it is a layer of sand and a layer of bat crap, and so on, going back for 5,000 years," Horton says. The record indicates that 11 tsunamis were generated during that period by earthquakes along the Sunda Megathrust, the 3,300-mile-long fault running from Myanmar to Sumatra in the Indian Ocean. The researchers found there were two tsunami-free millennia during the 5,000 years, and one century in which four tsunamis struck the coast. In general, the scientists report, smaller tsunamis occur relatively close together, followed by long dormant periods, followed by great quakes and tsunamis, such as the one that struck in 2004. Rubin, Horton and their colleagues were studying the seismic history of the Sunda Megathrust, which was responsible for the 2004 earthquake that triggered the disastrous tsunami. They were looking for places to take core samples that would give them a good stratigraphy. This involves looking for what Horton calls "depositional places" -- coastal plains, coastal lake bottoms, any place to plunge a hollow metal cylinder six or seven meters down and produce a readable sample. But for various reasons, there was no site along the southwest coast of Sumatra that would do the job.
But Patrick Daly, an archaeologist at EOS who had been working on a dig in the coastal cave, told Rubin and Horton about it and suggested it might be the place they were looking for. Looking for tsunami records in a sea cave was not something that would have occurred to Horton, and he says Daly's professional generosity -- archaeologists are careful about who gets near their digs -- and his own and Rubin's openness to insights from other disciplines made the research possible. Horton says this paper may be the most important of his career for another reason. "A lot of (the research) I've done is incremental," he says. "I have a hypothesis, and I do deductive science to test the hypothesis. But this is really original, and original stuff doesn't happen all that often." | Earthquakes | 2,017
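For readers curious how the radiocarbon dates behind such a record are computed: a conventional radiocarbon age follows directly from the measured fraction of modern carbon and the Libby mean life. The sketch below shows only that step; real ages are then calibrated to calendar years, which is omitted here, and the sample fraction is invented.

```python
# Conventional radiocarbon age from a measured fraction of modern carbon.
import math

LIBBY_MEAN_LIFE = 8033.0  # years (Libby half-life of 5568 yr / ln 2)

def radiocarbon_age(f14c):
    """Conventional 14C age in years before present (BP)."""
    return -LIBBY_MEAN_LIFE * math.log(f14c)

# A sample retaining ~54% of its original 14C dates to roughly 5,000 BP,
# about the span of the cave record.
print("age: %.0f yr BP" % radiocarbon_age(0.54))
```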
July 12, 2017 | https://www.sciencedaily.com/releases/2017/07/170712200226.htm | Slow earthquakes occur continuously in the Alaska-Aleutian subduction zone | Seismologists at the University of California, Riverside studying earthquakes in the seismically and volcanically active Alaska-Aleutian subduction zone have found that "slow earthquakes" are occurring continuously, and could encourage damaging earthquakes. | Slow earthquakes are quiet, can be as large as magnitude 7, and last days to years. Taking place mainly at the boundary between tectonic plates, they happen so slowly that people don't feel them. A large slow earthquake is typically associated with abundant seismic tremor -- a continuous weak seismic chatter -- and low frequency (small and repeating) earthquakes. "In the Alaska-Aleutian subduction zone, we found seismic tremor, and visually identified three low frequency earthquakes," said Abhijit Ghosh, an assistant professor of Earth sciences, who led the recently published research. The Alaska-Aleutian subduction zone, which stretches from the Gulf of Alaska to the Kamchatka Peninsula in the Russian Far East, is one of the most active plate boundaries in the world. It is 3,800 km long and forms the plate boundary between the Pacific and North American plates. In the last 80 years, four massive earthquakes (greater than magnitude 8) have occurred here. Ghosh explained that tectonic tremor -- which causes a weak vibration of the ground -- and low frequency earthquakes are poorly studied in the Alaska-Aleutian subduction zone due to limited data availability, difficult logistics, and rugged terrain. But using two months of high-quality continuous seismic data recorded from early July to September 2012 at 11 stations on Akutan Island, Ghosh and his graduate student, Bo Li, detected near-continuous tremor activity, with an average of 1.3 hours of tectonic tremor per day, using a "beam back projection" method -- an innovative array-based method Ghosh developed to automatically detect and locate seismic tremor. Using the seismic arrays, the method continuously scans the subsurface for any seismic activity. Just like a radar antenna, it determines from which direction the seismic signal originates and uses that information to locate it. Practically, it can track slow earthquakes minute by minute. Ghosh and Li found that tremor sources were clustered in two patches with a nearly 25 km gap between them, possibly indicating that the frictional properties determining earthquake activity change laterally along this area. Ghosh explained that this gap impacts the region's overall stress pattern and can affect earthquake activity nearby. "In addition, slow earthquakes seem to have 'sweet spots' along the subduction fault that produce the majority of the tremor activity," he said. "We found that the western patch has a larger depth range and shows higher tremor source propagation velocities. More frequent tremor events and low frequency earthquakes in the western patch may be a result of higher fluid activity in the region and indicate a higher seismic slip rate than in the eastern region." Ghosh, Li, and their collaborators at multiple institutions in the United States have taken the next step by installing three additional seismic arrays on a nearby island to simultaneously image the subduction fault and volcanic system. "This ambitious experiment will provide new insights into the seismic activity and subduction processes in this region," Ghosh said. | Earthquakes | 2,017
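The published beam back-projection method is only named here, but the family it belongs to -- array beamforming -- can be illustrated compactly. The sketch below delay-and-sum stacks synthetic records from a small hypothetical array over a grid of horizontal slownesses and keeps the direction that maximizes beam power; the geometry, signal, and noise levels are all invented, and this is not the published algorithm itself.

```python
# Hedged delay-and-sum beamforming sketch for array-based tremor location.
import numpy as np

rng = np.random.default_rng(2)
fs = 50.0                                  # samples per second
t = np.arange(0.0, 60.0, 1.0 / fs)         # one minute of records

# Hypothetical small-aperture array: station offsets in km (east, north).
stations = np.array([[0.0, 0.0], [3.0, 1.0], [-2.0, 4.0],
                     [1.0, -3.0], [-4.0, -1.0]])

# Synthetic tremor: a smoothed noise wavetrain crossing the array as a
# plane wave with true horizontal slowness s_true (seconds per km).
s_true = np.array([0.16, -0.10])
source = np.convolve(rng.normal(size=t.size), np.ones(25) / 25.0, mode="same")
data = np.array([np.interp(t - stations[k] @ s_true, t, source)
                 + 0.3 * rng.normal(size=t.size) for k in range(len(stations))])

# Grid-search slowness: undo each station's predicted delay, stack, and
# keep the beam with maximum power.
best_s, best_power = None, -np.inf
for sx in np.linspace(-0.4, 0.4, 41):
    for sy in np.linspace(-0.4, 0.4, 41):
        s = np.array([sx, sy])
        beam = np.mean([np.interp(t + stations[k] @ s, t, data[k])
                        for k in range(len(stations))], axis=0)
        power = np.sum(beam ** 2)
        if power > best_power:
            best_s, best_power = (sx, sy), power
print("recovered slowness (s/km):", best_s, " true:", tuple(s_true))
```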
July 12, 2017 | https://www.sciencedaily.com/releases/2017/07/170712145604.htm | Foreshock activities leading up to Pawnee earthquake | A University of Oklahoma geophysics professor, Xiaowei Chen, details the foreshock activities leading up to the Pawnee earthquake, and highlights the complicated relationship between seismicity and wastewater injection rates, in a research study published this week. | "In this study, we sought to better understand the nucleation processes of large earthquakes in Oklahoma, with the focus on the triggering process of the Pawnee earthquake. We began with an overview of occurrence patterns of earthquakes in Oklahoma, and their relationship with injection zones. Then, we focused on Pawnee County with a detailed analysis of the relationship between injection and precursory activities, as well as stress interactions between magnitude 3-plus foreshocks and the mainshock," Chen said. Chen, a professor in the OU School of Geology and Geophysics, led the study and collaborated with Nori Nakata, OU geophysics professor; Colin Pennington and Jackson Haffener, OU graduate students; Jefferson Chang and Jacob Walter, Oklahoma Geological Survey researchers; as well as collaborators Zhongwen Zhan, Caltech; and Sidao Ni and Xiaohui He, China. The study suggests that the Pawnee earthquake was a result of a complicated interplay among wastewater injection, faults and prior earthquakes in the region. Within the broader Pawnee area, increased seismic activity started in 2014, but not until May 2016 did researchers detect microearthquakes in the immediate vicinity of the Pawnee magnitude 5.8 epicenter. The foreshocks from May to September 2016 occurred in two major episodes, and the seismicity rate correlates with wastewater injection rates from nearby wells. The pattern of foreshocks also reveals possible aseismic (or slow) slip near where the magnitude 5.8 earthquake occurred, which appears to drive foreshocks to "migrate" along the Sooner Lake Fault. Additionally, the three largest foreshocks were optimally oriented so that their slip may have promoted failure along the Sooner Lake Fault. | Earthquakes | 2,017
July 6, 2017 | https://www.sciencedaily.com/releases/2017/07/170706121154.htm | How strike-slip faults form: The origin of earthquakes | Structural geologist Michele Cooke calls it the "million-dollar question" that underlies all work in her laboratory at the University of Massachusetts Amherst: what goes on deep in Earth as strike-slip faults form in the crust? This is the fault type that occurs when two tectonic plates slide past one another, generating the waves of energy we sometimes feel as earthquakes. | Geologists have been uncertain about the factors that govern how new faults grow, says Cooke. In recent years she and colleagues have offered the first systematic explorations of such fault evolution. In their new paper, she and her team of students provide experimental results to illustrate the process, with videos, and report on how they re-enact such events in wet clay in the lab. Details appear in a newly published paper. Cooke says, "When I give talks to other geologists, I put up a picture of a fault and ask, wouldn't you love to be able to see exactly how that formed? Well, in my lab that's what we do. We set up the conditions for faulting on a small scale and watch them unfold. People have done this before, but we've developed methods so we can see faults grow in very, very fine detail, at a finer resolution than anyone has documented before." The UMass Amherst researchers take a mechanical efficiency approach to understanding fault development. It states that faults in the crust reorganize in accord with "work optimization" principles, or what Cooke refers to as the "Lazy Earth" hypothesis. It focuses on fault systems' effectiveness at transforming input energy into movement along the faults. Like lightning striking the closest object, when forming a fault Earth takes the easiest path. For this National Science Foundation-supported work, the researchers load a tray with kaolin, also known as china clay, prepared so its viscosity and length scale to those of Earth's crust. All the experiments involve two slabs of wet clay moving in opposite directions under one of three base boundary conditions, that is, different ways of "loading" the fault. One scenario begins with a pre-existing fault, another with localized displacement beneath the clay, and a third with displacement across a wider zone of shear beneath the clay. Data from the two-hour experiments record strain localization and fault evolution that represent millions of years at the scale of tens of kilometers during strike-slip fault maturation. Cooke says, "We have captured very different conditions for fault formation in our experiments that represent a range of conditions that might drive faulting in the crust." She adds, "We found that faults do evolve to increase kinematic efficiency under different conditions, and we learned some surprising things along the way. One of them is that faults shut off along the way. We suspected this, but our experiment is the first to document it in detail. Another especially surprising finding is that fault irregularities, which are inefficient, persist rather than the system forming a straight, efficient fault." The authors, who include graduate students Alex Hatem and Kevin Toeneboehn, identify four stages in fault evolution: pre-faulting, localization, linkage and slip.
The process starts simply, advances to a peak of complexity, and then complexity suddenly drops off and the fault simplifies again, lengthening into a "through-going" or continuous single surface crack. In videos by Hatem, shear strain is clearly seen to distort the crust along the area where two base plates meet. In the next stage, numerous echelon faults develop. These are step-like fractures parallel to each other that get pulled lengthwise as strain increases until they suddenly link. In the last stage, these join to form a final single fault. Cooke says, "We were very excited to see that portions of the faults shut off as the system reorganized, and also that the irregularities persisted along the faults." An interesting, though not surprising, finding is that for the most part all faults went through a similar process. Cooke says, "We tested the various extremes but came out of this with a common kind of evolution that's true for all. If there's not already a fault, then you see echelon faults, small faults parallel to each other but at an angle to the shear. Probably the most insightful bit is the details of fault evolution within those extremes. What you're left with at the end is a long fault with abandoned segments on either side, which is something we see in the field all the time. It's a nice confirmation that our lab experiments replicate what is going on within Earth." Another insight, the researchers say, results from measuring the kinematic or geometric efficiency: the percent of applied displacement expressed as slip on the faults. "An inefficient fault will have less slip and more deformation around the zones," Cooke explains. "We can see it happening in the experiments and it supports the idea that faults evolve to become efficient and Earth optimizes work. This is the Lazy Earth; the efficiency is increasing even though the fault is becoming more complex." Finally, the geologist adds, "We saw that when the faults eventually link up, they don't necessarily make a perfectly straight fault. That tells me that irregularities can persist along mature faults because of the material. It's an insight into how you get persistent irregularities that we see in the real Earth's crust. Structural geologists are surprised by irregularities, because if faults evolve to minimize work then all faults should be straight. But we have evidence now to show these irregularities persist. We have irregular faults that are active for millions of years." | Earthquakes | 2,017
July 5, 2017 | https://www.sciencedaily.com/releases/2017/07/170705133017.htm | Engineers find way to evaluate green roofs | Green infrastructure is an attractive concept, but there is concern surrounding its effectiveness. Researchers at the University of Illinois at Urbana-Champaign are using a mathematical technique traditionally used in earthquake engineering to determine how well green infrastructure works and to communicate with urban planners, policymakers and developers. | Green roofs are flat, vegetated surfaces on the tops of buildings that are designed to capture and retain rainwater and filter any that is released back into the environment. "The retention helps ease the strain that large amounts of rain put on municipal sewer systems, and filtration helps remove any possible contaminants found in the stormwater," said Reshmina William, a civil and environmental engineering graduate student who conducted the study with civil and environmental engineering professor Ashlynn Stillwell. A good-for-the-environment solution to mitigating stormwater runoff may seem like a no-brainer, but a common concern regarding green roofs is the variability of their performance. One challenge is figuring out how well the buildings that hold them up will respond to the increased and highly variable weight between wet and dry conditions. Another challenge is determining how well they retain and process water given storms of different intensity, duration and frequency, William said. While studying reliability analysis in one of her courses, William came up with the idea of using a seemingly unrelated mathematical concept called fragility curves to confront this problem. "Earthquake engineering has a similar problem because it is tough to predict what an earthquake is going to do to a building," William said. "Green infrastructure has a lot more variability, but that is what makes fragility curves ideal for capturing and defining the sort of dynamics involved." William and Stillwell chose to study green roofs over other forms of green infrastructure for a very simple reason: There was one on campus fitted with the instrumentation needed to measure soil moisture, rainfall amount, temperature, humidity and many other variables that are plugged into their fragility curve model. "This is a unique situation because most green roofs don't have monitoring equipment, so it is difficult for scientists to study what is going on," Stillwell said. "We are very fortunate in that respect." William said the primary goal of this research is to facilitate communication between scientists, policymakers, developers and the general public about the financial risk and environmental benefit of taking on such an expense. "One of the biggest barriers to the acceptance of green infrastructure is the perception of financial risk," William said. "People want to know if the benefit of a green roof is going to justify the cost, but that risk is mitigated by knowing when an installation will be most effective, and that is where our model comes in." The results of their model and risk analysis appear in a newly published study. | Earthquakes | 2,017
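Fragility curves in earthquake engineering are commonly modeled as lognormal cumulative distribution functions of an intensity measure. The sketch below applies that standard form to a green roof, treating storm depth as the intensity; the median depth and dispersion are invented for illustration and are not values from the study.

```python
# Hedged fragility-curve sketch: lognormal CDF of an intensity measure.
import math

def fragility(intensity, median, beta):
    """P(performance threshold exceeded | intensity), lognormal form."""
    z = math.log(intensity / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

MEDIAN_DEPTH_MM = 30.0  # storm depth with a 50% exceedance chance (assumed)
BETA = 0.5              # dispersion of the curve (assumed)

for depth in (10, 20, 30, 50, 80):  # millimeters of rainfall
    p = fragility(depth, MEDIAN_DEPTH_MM, BETA)
    print(f"{depth:>3} mm storm -> P(capacity exceeded) = {p:.2f}")
```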
July 5, 2017 | https://www.sciencedaily.com/releases/2017/07/170705132932.htm | Forgotten archives reveal street-level impact of 1918 Puerto Rico earthquake and tsunami | Repair petitions filed in the wake of the 1918 Puerto Rico earthquake and tsunami, stored and forgotten in the San Juan archives for nearly 100 years, are giving scientists a house-by-house look at the damage wrought by the magnitude 7.3 event. | In the journal The researchers combed through handwritten and often heartbreaking petitions for funds to repair homes battered or washed away by the tsunami, or damaged by earthquake ground shaking. Together, the data provide a "pretty accurate picture to find out where the damage was, and how far the tsunami made it inland," said LaForge. At the south end of town, in particular, the tsunami's three- to four-meter-high mark could be determined from repair petitions from houses closely clustered together -- where some homes reported wave damage and some were untouched by the waves. The address-level findings are consistent with a 1919 reconnaissance of the earthquake damage and more modern calculations of tsunami wave heights, the researchers say. But the new study provides more detailed "ground truth" of what happened during the 1918 quake, said LaForge, and could be useful in predicting which parts of Aguadilla would be most likely to suffer damage during the next major earthquake. In the United States, the Caribbean and Latin America, LaForge said, "finding and interpreting written historical earthquake damage accounts is difficult and time consuming, but we have learned that researching these old earthquakes has become more important over time." The October 11, 1918 Puerto Rico earthquake and tsunami is the most recent damaging seismic event to affect the island. More than 100 people died, and the island sustained $4 million (1918 dollars) in damage, especially in the towns of Aguadilla, Mayagüez, Aguada and Añasco. As part of the relief efforts after the earthquake, residents whose homes were damaged or destroyed submitted petitions for repair funds to a Special Earthquake Commission established after the event. Inspectors came out to review the damage claimed in each petition, and funds were awarded based on their recommendations. McCann, a former professor at the University of Puerto Rico, stumbled across boxes of these petitions, unsealed for nearly 100 years, in the General Archive in San Juan, Puerto Rico. He later mentioned them to LaForge, who had worked with the U.S. Bureau of Reclamation and Puerto Rico Electric Power Authority on seismic hazard studies of dams on the island. The two received a grant from the National Earthquake Hazards Reduction Program (NEHRP) to digitize and study more than 6000 pages of the petitions and other records and photographs related to the earthquake. Although 275 petitions were known to be received from Aguadilla, only 88 (32%) were discovered in the San Juan archives. Most of these appear to be petitions to repair damage rather than replacement of entire homes. "The layout of the town is pretty much the same as it was in 1918," LaForge explained. "And we had these detailed damage descriptions by neighborhood and street and address in some cases. We thought if we can match up these addresses with modern-day addresses to know where they were, we could get a pretty good picture of where the damage was and how severe it was." The petitions marked other losses as well.
"Reading through the actual reports was very poignant at times," LaForge said. "Some of these people lost family members, or knew people who drowned. You get a real idea of what people went through."LaForge hopes that other researchers -- students at the University of Puerto Rico, perhaps -- will use the digitized petition data to learn more about the earthquake and tsunami impact in other towns such as Mayagüez. "The dataset in general is a real gold mine."The Seismological Society of America, which publishes | Earthquakes | 2,017 |
July 5, 2017 | https://www.sciencedaily.com/releases/2017/07/170705095421.htm | Can satellites be used as an early warning system for landslides? | Researchers are using satellite data to accurately map the movement of the earth before a landslide in a bid to develop a life-saving early warning system. | The team from Newcastle University (UK), Chengdu University of Technology, Tongji University, China Academy of Space Technology and Wuhan University (China) have been tracking the devastating events of last week when a massive landslide struck Xinmo Village, Maoxian County, Sichuan Province in China. Triggered by heavy rain, the Maoxian landslide swept away homes in Xinmo village, blocking a 2km section of river and burying 1,600 meters of road. The collapsed rubble was estimated to be about eight million cubic meters. Three days later, a second landslide hit Xinmo Village and almost at the same time, a third landslide occurred in Shidaguan Town, 20km away from Xinmo Village. Using ESA's Sentinel-1 satellite radar mission -- which comprises a constellation of two polar-orbiting satellites, operating day and night in all-weather conditions -- the research team were able to capture before and after images of the landslides. This provides vital information about the extent of the disaster which can be used to assess the damage and future risk in the area. Professor Zhenhong Li, Professor of Imaging Geodesy at Newcastle University, explains: "It is still hard, if not impossible, to detect a landslide using traditional techniques, especially in mountain areas. Using the satellite radar data, we were able to efficiently detect and map the active landslide over a wide region, identifying the source of the landslide and also its boundaries. "Going forward, we can use this information to set up real-time monitoring systems -- such as GPS, Beidou and Galileo -- for those sites and whenever we detect abnormal behaviour, the system can send out an early warning message. "In fact, while we were monitoring the Maoxian landslides we managed to identify over 10 other active landslides in the same region and forwarded this information to the relevant agencies." Sichuan province is prone to earthquakes, including the devastating Great Wenchuan Earthquake of 2008 when a 7.9 magnitude quake hit the area, killing over 70,000 people. Professor Li says their data suggests the Maoxian (Shidaguan) landslide had been sliding for at least six months before it failed. "When you consider this sort of timescale it suggests that a landslide Early Warning System is not only possible but would also be extremely effective," says Professor Li. "If we can detect movement at a very early stage then in many cases it is likely we would have time to put systems in place to save lives." Professor Li and the team have been working on active faults and landslides in Southwest China for over ten years and have identified several active landslides in the area south of Maoxian County, but this is the first time they have studied the Maoxian region. Ultimately, the team hope to use the technology to detect and map active landslides in the whole region of SW China, and then build a landslide database. The research findings were presented at the Dragon-4 symposium in Copenhagen on 27 June 2017. | Earthquakes | 2,017
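For readers curious how the "before and after" radar images above translate into ground motion, the core relation is simple: the interferometric phase difference between two Sentinel-1 acquisitions scales with displacement along the radar line of sight. Below is a hedged sketch, our illustration rather than the Newcastle team's processing chain; real workflows add coregistration, phase unwrapping and atmospheric corrections.

```python
import numpy as np

WAVELENGTH_C_BAND = 0.0555  # Sentinel-1 C-band radar wavelength, metres

def los_displacement_m(unwrapped_phase_rad):
    """Convert unwrapped interferometric phase (radians) to metres of
    line-of-sight displacement between the two acquisitions.
    Sign convention: motion toward the satellite gives negative phase."""
    return -(WAVELENGTH_C_BAND / (4.0 * np.pi)) * unwrapped_phase_rad

# A slope creeping ~5 mm toward the satellite between acquisitions
# shows up as roughly -1.1 rad of phase change (invented values):
phase = np.array([0.0, -0.4, -0.8, -1.1])
print(los_displacement_m(phase) * 1000, "mm")
```

Millimetre-scale sensitivity over wide areas is what makes the months of pre-failure creep at Shidaguan detectable from orbit at all.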
July 3, 2017 | https://www.sciencedaily.com/releases/2017/07/170703085442.htm | Guidelines to reduce the risk of minor earthquakes during hydraulic fracturing | Keele University researchers are advising on new safety guidelines for hydraulic fracturing to help prevent minor earthquakes. | Researcher Developer Dr Rachel Westwood, Research Fellow Mr Sam Toon, and Emeritus Professor Peter Styles from Keele's School of Geography, Geology and Environment -- together with Professor Nigel Cassidy who is now at Birmingham University -- have published their study advising on hydraulic fracturing safety guidelines for legislative bodies, including governments, environmental agencies, health and safety executives and local planning authorities. The study, Horizontal respect distance for hydraulic fracturing in the vicinity of existing faults in deep geological reservoirs: A review and modelling study, focused on the safety of hydraulic fracturing -- where millions of gallons of water, sand and chemicals are pumped under high pressure deep underground to deliberately break apart rock and release the gas trapped inside. In 2014, hydraulic fracturing for shale gas, taking place near Blackpool, United Kingdom, caused two minor earthquakes. This was due to the fluid injected into the rock flowing into a pre-existing natural crack in the rock, known as a fault, triggering a minor earthquake. Dr Westwood said: "The aim of the research was to investigate the minimum horizontal distance that hydraulic fracturing should occur from pre-existing faults in order to reduce the risk of an earthquake similar to the one that occurred near Blackpool. "The review looked at seismicity caused by hydraulic fracturing and we created a model that looked at whether it could trigger movement of a pre-existing fault." This study is the first to advise legislative bodies as there are currently no guidelines set for the horizontal distance required between the fluid injection points and pre-existing cracks in the rock. This research was carried out as part of the ReFINE (Researching Fracking in Europe) consortium, and funded by Ineos, Shell, Chevron, Total, GDF Suez, Centrica and Natural Environment Research Council (UK). Mr Toon said: "We used data from the Blackpool earthquake and our computerized model which shows the layers within the rocks and the horizontal well. When water is injected into the gas reservoir it creates new fractures that intersect with the natural fractures in the rock and creates a network of open fractures which makes it easier to get the gas out of the rock. "The study found that the distance required from the injection point depends on the intensity of the natural fracture network, how many and how close together the fractures are, and also what stresses are required to activate a fault. We found, depending on the stress trigger threshold and the fracture intensity, the distance is up to 433 metres." The researchers have also published a further study, funded by the Horizon 2020 project SHEER (SHale gas Exploration and Exploitation induced Risks) that analyses the effect of pumping parameters on hydraulic fracture networks and the local stresses they cause during shale gas extraction. The research found that, for safe fracking, there needs to be a compromise between flow distance and fracture area, which can be controlled using pumping time and flow rate. | Earthquakes | 2,017
June 28, 2017 | https://www.sciencedaily.com/releases/2017/06/170628144920.htm | 'Bulges' in volcanoes could be used to predict eruptions | A team of researchers from the University of Cambridge have developed a new way of measuring the pressure inside volcanoes, and found that it can be a reliable indicator of future eruptions. | Using a technique called 'seismic noise interferometry' combined with geophysical measurements, the researchers measured the energy moving through a volcano. They found that there is a good correlation between the speed at which the energy travelled and the amount of bulging and shrinking observed in the rock. The technique could be used to predict more accurately when a volcano will erupt. Their results are reported in the journal Data was collected by the US Geological Survey across Kīlauea in Hawaii, a very active volcano with a lake of bubbling lava just beneath its summit. During a four-year period, the researchers used sensors to measure relative changes in the velocity of seismic waves moving through the volcano over time. They then compared their results with a second set of data which measured tiny changes in the angle of the volcano over the same time period. As Kīlauea is such an active volcano, it is constantly bulging and shrinking as pressure in the magma chamber beneath the summit increases and decreases. Kīlauea's current eruption started in 1983, and it spews and sputters lava almost constantly. Earlier this year, a large part of the volcano fell away and it opened up a huge 'waterfall' of lava into the ocean below. Due to this high volume of activity, Kīlauea is also one of the most-studied volcanoes on Earth. The Cambridge researchers used seismic noise to detect what was controlling Kīlauea's movement. Seismic noise is a persistent low-level vibration in the Earth, caused by everything from earthquakes to waves in the ocean, and can often be read on a single sensor as random noise. But by pairing sensors together, the researchers were able to observe energy passing between the two, therefore allowing them to isolate the seismic noise that was coming from the volcano. "We were interested in how the energy travelling between the sensors changes, whether it's getting faster or slower," said Clare Donaldson, a PhD student in Cambridge's Department of Earth Sciences, and the paper's first author. "We want to know whether the seismic velocity changes reflect increasing pressure in the volcano, as volcanoes bulge out before an eruption. This is crucial for eruption forecasting." One to two kilometres below Kīlauea's lava lake, there is a reservoir of magma. As the amount of magma changes in this underground reservoir, the whole summit of the volcano bulges and shrinks. At the same time, the seismic velocity changes. As the magma chamber fills up, it causes an increase in pressure, which leads to cracks closing in the surrounding rock and producing faster seismic waves -- and vice versa. "This is the first time that we've been able to compare seismic noise with deformation over such a long period, and the strong correlation between the two shows that this could be a new way of predicting volcanic eruptions," said Donaldson. Volcano seismology has traditionally measured small earthquakes at volcanoes. When magma moves underground, it often sets off tiny earthquakes, as it cracks its way through solid rock. Detecting these earthquakes is therefore very useful for eruption prediction.
But sometimes magma can flow silently, through pre-existing pathways, and no earthquakes may occur. This new technique will still detect the changes caused by the magma flow. Seismic noise occurs continuously, and is sensitive to changes that would otherwise have been missed. The researchers anticipate that this new research will allow the method to be used at the hundreds of active volcanoes around the world. | Earthquakes | 2,017
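The velocity measurement at the heart of seismic noise interferometry can be illustrated with the widely used "stretching" method: a small relative velocity change dv/v appears as a uniform stretch or compression of the noise cross-correlation between a sensor pair. A toy sketch on synthetic data follows; it is our illustration of the general technique, not the Kīlauea study's code.

```python
import numpy as np

t = np.linspace(0.0, 10.0, 2001)                        # lag time, seconds
ref = np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.3 * t)    # reference correlation
true_dvv = 0.002                                        # +0.2% velocity change
cur = np.interp(t * (1 + true_dvv), t, ref)             # current, compressed trace

def stretch_dvv(ref_cc, cur_cc, t, trials=np.linspace(-0.01, 0.01, 401)):
    """Grid-search the stretch factor that best maps the current
    correlation back onto the reference; that factor estimates dv/v."""
    cc = [np.corrcoef(ref_cc, np.interp(t / (1 + e), t, cur_cc))[0, 1]
          for e in trials]
    return trials[int(np.argmax(cc))]

print(f"recovered dv/v = {stretch_dvv(ref, cur, t):+.4f}")   # -> +0.0020
```

It is this dv/v time series, tracked over four years, that the researchers compared against the tilt record of the volcano's bulging and shrinking.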
June 27, 2017 | https://www.sciencedaily.com/releases/2017/06/170627134442.htm | Distant earthquakes can cause underwater landslides | New research finds large earthquakes can trigger underwater landslides thousands of miles away, weeks or months after the quake occurs. | Researchers analyzing data from ocean bottom seismometers off the Washington-Oregon coast tied a series of underwater landslides on the Cascadia Subduction Zone, 80 to 161 kilometers (50 to 100 miles) off the Pacific Northwest coast, to a 2012 magnitude-8.6 earthquake in the Indian Ocean -- more than 13,500 kilometers (8,390 miles) away. These underwater landslides occurred intermittently for nearly four months after the April earthquake. Previous research has shown earthquakes can trigger additional earthquakes on other faults across the globe, but the new study shows earthquakes can also initiate submarine landslides far away from the quake. "The basic assumption ... is that these marine landslides are generated by the local earthquakes," said Paul Johnson, an oceanographer at the University of Washington in Seattle and lead author of the new study published in the The new findings could complicate sediment records used to estimate earthquake risk. If underwater landslides could be triggered by earthquakes far away, not just ones close by, scientists may have to consider whether a local or a distant earthquake generated the deposits before using them to date local events and estimate earthquake risk, according to the study's authors. The submarine landslides observed in the study are smaller and more localized than widespread landslides generated by a great earthquake directly on the Cascadia margin itself, but these underwater landslides generated by distant earthquakes may still be capable of generating local tsunamis and damaging underwater communications cables, according to the study authors. The discovery that the Cascadia landslides were caused by a distant earthquake was an accident, Johnson said. Scientists had placed ocean bottom seismometers off the Washington-Oregon coast to detect tiny earthquakes, and also to measure ocean temperature and pressure at the same locations. When Johnson found out about the seismometers at a scientific meeting, he decided to analyze the data the instruments had collected to see if he could detect evidence of thermal processes affecting seafloor temperatures, such as methane hydrate formation. Johnson and his team combined the seafloor temperature data with pressure and seismometer data and video stills of sediment-covered instruments from 2011-2015. Small variations in temperature occurred for several months, followed by large spikes in temperature over a period of two to 10 days. They concluded these changes in temperature could only be signs of multiple underwater landslides that shed sediments into the water. These landslides caused warm, shallow water to become denser and flow downhill along the Cascadia margin following the 8.6-magnitude Indian Ocean earthquake on April 11, 2012, causing the temperature spikes. The Cascadia margin runs for more than 1,100 kilometers (684 miles) off the Pacific Northwest coastline from north to south, encompassing the area above the underlying subduction zone, where one tectonic plate slides beneath another. Steep underwater slopes hundreds of feet high line the margin. Sediment accumulates on top of these steep slopes.
When the seismic waves from the Indian Ocean earthquake reached these steep underwater slopes, they jostled the thick sediments piled on top of the slopes. This shaking caused areas of sediment to break off and slide down the slope, creating a cascade of landslides all along the slope. The sediment did not fall all at once, so the landslides occurred for up to four months after the earthquake, according to the authors. The steeper-than-average slopes off the Washington-Oregon coast, such as those of Quinault Canyon, which descends 1,420 meters (4,660 feet) at up to 40-degree angles, make the area particularly susceptible to submarine landslides. The thick sediment deposits also amplify seismic waves from distant earthquakes. Small sediment particles move like ripples suspended in fluid, amplifying the waves. "So these things are all primed, ready to collapse, if there is an earthquake somewhere," Johnson said. The new finding could have implications for tsunamis in the region and may complicate estimations of earthquake risk, according to the study's authors. Subduction zones like the Cascadia margin are at risk for tsunamis. As one tectonic plate slides under the other, they become locked together, storing energy. When the plates finally slip, they release that energy and cause an earthquake. Not only does this sudden motion give any water above the fault a huge shove upward, it also lowers the coastal land next to it as the overlying plate flattens out, making the shoreline more vulnerable to the waves of displaced water. Submarine landslides increase this risk. They also push ocean water out of the way when they occur, which could spark a tsunami on the local coast, Johnson said. Scientists also use underwater sediment records to estimate earthquake risk. By drilling sediment cores offshore and calculating the age between landslide deposits, scientists can create a timeline of past earthquakes used to predict how often an earthquake might occur in the region in the future and how intense it could be. An earthquake off the Pacific Northwest would create submarine landslides all along the coast from British Columbia to California. But the new study found that a distant earthquake might only result in landslides up to 20 or 30 kilometers (12 to 19 miles) wide. That means when scientists take sediment cores to determine how frequently local earthquakes occur, they may not be able to tell if the sediment layers arrived on the seafloor as a result of a distant or local earthquake. Johnson says more core sampling over a wider range of the margin would be needed to determine a more accurate reading of the geologic record and to update estimates of earthquake risk. | Earthquakes | 2,017
June 26, 2017 | https://www.sciencedaily.com/releases/2017/06/170626124610.htm | Evidence for past large earthquakes in the Eastern Tennessee seismic zone | The Eastern Tennessee Seismic Zone (ETSZ), a zone of small earthquakes stretching from northeastern Alabama to southwestern Virginia, may have generated earthquakes of magnitude 6 or greater within the last 25,000 years, according to a study published June 27 in the | The ETSZ is the second-most active natural seismic zone in the central and eastern United States, behind the New Madrid Seismic Zone in the Mississippi River region that produced the 1811-1812 magnitude 7+ earthquakes. In historic times, the ETSZ has not produced earthquakes larger than magnitude 4.8. The ETSZ region is home to several nuclear power plants and hydroelectric dams related to the Tennessee Valley Authority, along with major population centers such as Knoxville and Chattanooga, making it important to determine whether the region is capable of a large damaging earthquake. Randel Cox of the University of Memphis and colleagues searched for signs of ancient earthquakes below the muddy waterline of Douglas Lake, a 1943 Tennessee Valley Authority lake created by impounding the French Broad River. The level of the lake is drawn down in winter to accommodate snowmelt, which exposes river sediments and the signs of past seismic activity. At two sites along the lake, the researchers uncovered clay-filled fractures, signs of soil liquefaction and polished rock shear fractures called slickenlines that point to at least three past earthquakes in the area. At one site, a thrust fault with one meter of displacement suggests that one of the earthquakes could have been magnitude 6 or even larger. A combination of features convinced Cox and colleagues that they were looking at a record of past earthquakes rather than signs of an ancient landslide. The researchers are collecting data now indicating that these features cross valley floors, which "strongly corroborate the results of this paper, that these features are related to earthquakes," said Cox. Using a technique called optically stimulated luminescence to assign dates to the minerals contained in sediments surrounding these seismic features, Cox and colleagues narrowed the possible ages of these earthquakes to between 25,000 and 15,000 years ago. This would place them in the late Pleistocene, during the last North American ice age. "I think we've got a pretty good case that this is related to active faulting, and that it does demonstrate that at least in periods of time in the past there have been strong earthquakes in the ETSZ," said Cox. Cox said it might be possible that these large earthquakes were only active during the late Pleistocene, when seismic stresses in the crust changed with the advance and retreat of massive ice sheets. "But we don't have enough data right now to say whether or not this is some kind of ephemeral or maybe periodic activity," he noted. Post-ice age sediments, which might tell us more about the current potential for large earthquakes in the region, are mostly underwater as a result of Tennessee Valley Authority projects, Cox added. Cox and colleagues also note that their study adds to a body of research suggesting that the Appalachian Mountains are undergoing a new period of uplift. "The ETSZ is right along the Smoky Mountains, which are a subrange of the Appalachians," Cox said. "We may have found a fault that is accommodating the uplift of the Smokies." | Earthquakes | 2,017
June 26, 2017 | https://www.sciencedaily.com/releases/2017/06/170626124338.htm | Hydraulic fracturing rarely linked to felt seismic tremors | New research suggests hydraulic fracturing and saltwater disposal have limited impact on seismic events. | For the past two years, UAlberta geophysicist Mirko Van der Baan and his team have been poring over 30 to 50 years of earthquake rates from six of the top hydrocarbon-producing states in the United States and the top three provinces by output in Canada: North Dakota, Ohio, Oklahoma, Pennsylvania, Texas, West Virginia, Alberta, British Columbia, and Saskatchewan. With only one exception, the scientists found no province- or state-wide correlation between increased hydrocarbon production and seismicity. They also discovered that human-induced seismicity is less likely in areas that have fewer natural earthquakes. The anomaly was in Oklahoma, where seismicity rates have changed dramatically in the last five years, with strong correlation to saltwater disposal related to increased hydrocarbon production. "It's not as simple as saying 'we do a hydraulic fracturing treatment, and therefore we are going to cause felt seismicity.' It's actually the opposite. Most of it is perfectly safe," said Van der Baan, who is also director of the Microseismicity Industry Consortium. The findings, as well as continued monitoring, will help point industry experts toward developing mitigation strategies for the oft-maligned practice. "What we need to know first is where seismicity is changing as it relates to hydraulic fracturing or saltwater disposal. The next question is why is it changing in some areas and not others," continued Van der Baan. "If we can understand why seismicity changes, then we can start thinking about mitigation strategies." Though Van der Baan noted that hydraulic fracturing has been in practice since the 1950s, it has come under increased scrutiny in the last handful of years due to both increased production and the use of increased treatment volumes. He said an important next step will be continued monitoring. "Hydraulic fracturing is not going away. The important thing is that we need to find the balance between the economic impact and environmental sustainability of any industry," he said. Van der Baan will be sharing the studies' findings extensively with industry and university students this fall when he travels to 25 different cities in North America to meet with as many different professional societies as this year's Society for Exploration Geophysicists honorary lecturer. "Human-induced seismicity and large-scale hydrocarbon production in the USA and Canada" appeared in the scientific journal | Earthquakes | 2,017
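The state- and province-level comparison described above boils down to correlating two time series per region. A schematic sketch with invented numbers follows (the paper's actual data span decades of records); for most regions the study examined, annual production and annual earthquake counts show no significant correlation, as in this toy case.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical annual series for one region: relative hydrocarbon
# production and counts of felt earthquakes (both invented).
production = np.array([1.00, 1.05, 1.20, 1.60, 2.10, 2.40])
quakes = np.array([3, 2, 4, 3, 2, 4])

r, p = pearsonr(production, quakes)
print(f"Pearson r = {r:.2f}, p = {p:.2f}")  # weak r, large p: no link
```

An Oklahoma-like anomaly would show up in the same test as a large r with a small p-value, which is what singles that state out in the study.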
June 19, 2017 | https://www.sciencedaily.com/releases/2017/06/170619092211.htm | Six key impact areas of shale oil and gas development in Texas | Development of shale oil and gas has fundamentally changed the energy sector. This development has resulted in billions of dollars for the state of Texas and thousands of jobs, but it's also had an impact on the state's communities and their land, air, water and infrastructure. | A comprehensive review of the impacts of oil and gas development in Texas by a cross-disciplinary task force of top researchers -- organized by The Academy of Medicine, Engineering and Science of Texas (TAMEST) -- finds a wide range of both benefits and consequences for the state's environment and communities. These impacts are detailed in a new report by the TAMEST Shale Task Force. TAMEST is Texas' premier scientific organization, bringing together the state's best and brightest scientists and researchers. TAMEST membership includes all Texas-based members of the National Academies of Sciences, Engineering, and Medicine and the state's Nobel Laureates. "In life, we learn by doing. This report shows what we've learned in Texas about the impacts from shale oil and gas development, and I hope others can benefit from our experience," said Christine Ehlig-Economides, task force chair. The TAMEST Shale Task Force report focuses on six areas of impacts: seismicity, land, water, air, transportation, and economic and social impacts. Key highlights from the report include: The majority of known faults present in Texas are stable and are not prone to generating earthquakes. To date, induced earthquakes in Texas have been associated with wastewater disposal wells, not with hydraulic fracturing. Shale oil and gas development activities in Texas have resulted in fragmentation of habitat on the landscape. However, there is a lack of information and scientific data on what the impacts of fragmentation have been and are on landscape -- vegetative resources, agriculture and wildlife. The production of shale oil and gas results in emissions of greenhouse gases, photochemical air pollutants and air toxics. Air emission sources from shale oil and gas development are diverse, have complex behavior and are distributed across a large number of individual sites. The most common pathways for contaminating drinking water sources and causing environmental damage are with surface spills and well casing leaks near the surface. The depth and separation between oil-bearing and drinking water-bearing zones make contamination of potential drinking water unlikely. Transportation is one of the most far-reaching and consistent impacts of shale oil and gas development. Texas accounts for about half of the drilling activity in the country at any given time, and all of that activity requires a very large number of heavy truckloads, which have far greater impact on roads than typical passenger vehicle traffic. For the most part, shale oil and gas development contributes positively to local, regional and state economies, with some unintended consequences, including impacts to local infrastructure such as roads and increased cost of living, and not everyone within a community benefits equally from such developments. Communities in shale regions: | Earthquakes | 2,017
June 15, 2017 | https://www.sciencedaily.com/releases/2017/06/170615142834.htm | Seasonal rain and snow trigger small earthquakes on California faults | California's winter rains and snow depress the Sierra Nevada and Coast Ranges, which then rebound during the summer, changing the stress on the state's earthquake faults and causing seasonal upticks in small quakes, according to a new study by University of California, Berkeley seismologists. | The weight of winter snow and stream water pushes down the Sierra Nevada mountains by about a centimeter, or three-eighths of an inch, while ground and stream water depress the Coast Ranges by about half that. This loading and the summer rebound -- the rise of the land after all the snow has melted and much of the water has flowed downhill -- makes the earth's crust flex, pushing and pulling on the state's faults, including its largest, the San Andreas. The researchers can measure these vertical motions using the regional global positioning system and thus calculate stresses in the earth due to the water loads. They found that on average, the state's faults experienced more small earthquakes when these seasonal stress changes were at their greatest. The central San Andreas Fault, for example, sees an increase in small quakes -- those greater than magnitude 2 -- in late summer and early fall as the water load diminishes in the mountains. Most people can't feel quakes below about magnitude 2.5. The faults along the eastern edge of the Sierra Nevada see an uptick in late spring and early summer due to this seasonal unloading, the researchers found. "It's not that all earthquakes happen in September. There is no earthquake season," said Roland Bürgmann, a UC Berkeley professor of earth and planetary science and the senior author of a paper appearing this week in the journal While the impact of this annual up and down movement of the mountains surrounding the Central Valley is small -- increasing the chance of earthquakes by a few percentage points at most -- the study gives seismologists information about how faults rupture and what kinds of stresses are important in triggering quakes. "This study supports the notion that the state's faults are critically stressed so that these small perturbations can affect the earthquake cycle and sometimes promote failure," said first author Christopher Johnson, a UC Berkeley graduate student. "It is advancing the clock on these different faults." Previous studies have shown that daily stresses caused by the ebb and flow of ocean tides don't seem to trigger small or large quakes in California. Rarely, though, extremely large earthquakes -- megaquakes -- can trigger large quakes thousands of miles away. An 8.6 magnitude quake that occurred in the Indian Ocean in 2012 triggered 16 large quakes greater than magnitude 5.5 around the world. The amount of stress generated by seasonal water loading in California is similar to the stresses induced by the seismic waves from distant megaquakes, Johnson said. "We don't see an increase in large-magnitude earthquakes from these low-amplitude stresses caused by seasonal water storage," Johnson said. "What these results are showing, however, is that we do see a correlation with small earthquakes from low-amplitude stresses." Johnson and Bürgmann, members of the Berkeley Seismological Laboratory, looked at 3,600 earthquakes over a nine-year period, 2006-2015, and correlated their occurrence with the calculated peak stress on the fault where they occurred.
The stress was calculated from the amount the mountains deformed, as measured by a GPS system, using models of rock mechanics that predict stress changes on faults. "We are finding that on the central San Andreas, the late summer months are when we see most seismicity, and that correlates with the larger stress changes," Johnson said. "It is not during the rainy season; it is more of the unloading that is resulting in the larger stresses, for that one fault." Interestingly, only shear stress -- that caused by back and forth sliding motion -- triggered an excess of quakes, not changes in compression that clamp or unclamp the fault. The researchers also looked at all historic large quakes greater than magnitude 5.5 since 1781, and saw somewhat more earthquakes when water unloading stresses are high than when the stresses are low. "We look at historic records for larger events, and we do see this seasonality, but we are not at the point that we can provide further evidence to hazard estimates that would say that during these periods of time we would expect more large earthquakes to occur," Johnson cautioned. The studies are designed to better understand "what makes earthquakes go," Bürgmann said. "Looking at the responses to these periodic stresses is like running a fault mechanics experiment at the scale of all of California." | Earthquakes | 2,017
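The stress calculation described above reduces to a small linear-algebra step: resolve the hydrologically induced stress perturbation onto the fault plane and split it into shear and normal parts, from which a Coulomb failure stress change can be formed. Below is a simplified two-dimensional sketch with made-up numbers; the study itself works with full three-dimensional fault geometries and GPS-derived loads.

```python
import numpy as np

def resolve_on_fault(stress, n):
    """stress: 2x2 tensor (kPa, tension positive); n: unit fault normal.
    Returns (shear, normal) tractions on the plane."""
    t = stress @ n                   # traction vector on the plane
    sigma_n = float(n @ t)           # normal component (unclamping > 0)
    tau = float(np.linalg.norm(t - sigma_n * n))   # shear component
    return tau, sigma_n

# Hypothetical seasonal perturbation of order 1 kPa, and a fault whose
# normal sits 30 degrees from the x-axis (both invented):
stress = np.array([[1.0, 0.3],
                   [0.3, -0.5]])
n = np.array([np.cos(np.radians(30)), np.sin(np.radians(30))])
tau, sigma_n = resolve_on_fault(stress, n)
print(f"shear = {tau:.2f} kPa, normal = {sigma_n:.2f} kPa, "
      f"dCFS = {tau + 0.4 * sigma_n:.2f} kPa")   # mu' = 0.4 assumed
```

The study's finding that only the shear term correlates with excess seismicity amounts to saying the tau component, not sigma_n, carries the seasonal signal.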
June 15, 2017 | https://www.sciencedaily.com/releases/2017/06/170615142730.htm | Japanese slow earthquakes could shed light on tsunami generation | Understanding slow-slip earthquakes in subduction zone areas may help researchers understand large earthquakes and the creation of tsunamis, according to an international team of researchers that used data from instruments placed on the seafloor and in boreholes east of the Japanese coast. | "This area is the shallowest part of the plate boundary system," said Demian Saffer, professor of geosciences, Penn State. "If this region near the ocean trench slips in an earthquake, it has the potential to generate a large tsunami." Two tectonic plates meet here, the Pacific Plate and the Eurasian Plate, in a subduction zone where the Pacific plate slides beneath the Eurasian plate. This type of earthquake zone forms the "ring of fire" that surrounds the Pacific Ocean, because once the end of the plate that is subducting -- sliding underneath -- reaches the proper depth, it triggers melting and forms volcanoes. Mt. St. Helens in the American Cascade Mountains is one of these volcanoes, as is Mt. Fuji, about 60 miles southwest of Tokyo. Subduction zones are often also associated with large earthquakes. The researchers focused their study on slow earthquakes, slip events that happen over days or weeks. Recent research by other groups has shown that these slow earthquakes are an important part of the overall patterns of fault slip and earthquake occurrence at the tectonic plate boundaries and can explain where some of the energy built up on a fault or in a subduction zone goes. "These valuable results are important for understanding the risk of a tsunami," says James Allan, program director in the National Science Foundation's Division of Ocean Sciences, which supports IODP. "Such tidal waves can affect the lives of hundreds of thousands of people and result in billions of dollars in damages, as happened in Southeast Asia in 2004. The research underscores the importance of scientific drillship-based studies, and of collecting oceanographic and geologic data over long periods of time." In 2009 and 2010, the IODP (Integrated Ocean Drilling Program, now the International Ocean Discovery Program) NanTroSEIZE project drilled two boreholes in the Nankai Trough offshore southwest of Honshu, Japan, and in 2010 researchers installed monitoring instruments in the holes that are part of a network including sensors on the seafloor. Saffer and Eiichiro Araki, senior research scientist, Japan Agency for Marine-Earth Science and Technology, co-lead authors, published their results today (June 16) in "Until we had these data, no one knew if zero percent or a hundred percent of the energy in the shallow subduction zone was dissipated by slow earthquakes," said Saffer. "We have found that somewhere around 50 percent of the energy is released in slow earthquakes. The other 50 percent could be taken up in permanent shortening of the upper plate or be stored for the next 100- or 150-year earthquake. We still don't know which is the case, but it makes a big difference for tsunami hazards. The slow slip could reduce tsunami risk by periodically relieving stress, but it is probably more complicated than just acting as a shock absorber." The researchers found a series of slow slip events on the plate interface seaward of recurring magnitude 8 earthquake areas east of Japan.
These slow earthquakes lasted days to weeks, some triggered by other unconnected earthquakes in the area and some happening spontaneously. According to the researchers, this family of slow earthquakes occurred every 12 to 18 months. "The area where these slow earthquakes take place is uncompacted, which is why it has been thought that these shallow areas near the trench act like a shock absorber, stopping deeper earthquakes from reaching the surface," said Saffer. "Instead we have discovered slow earthquakes of magnitude 5 or 6 in the region that last from days to weeks." These earthquakes typically go unnoticed because they are so slow and very far offshore. The researchers also note that because earthquakes that occur at a distance from this subduction zone, without any direct connection, can trigger the slow earthquakes, the area is much more sensitive than previously thought. The slow earthquakes are triggered by the shaking, not by any direct strain on the area. "The question now is whether it releases stress when these slow earthquakes occur," said Saffer. "Some caution is required in simply concluding that the slow events reduce hazard, because our results also show the outer part of the subduction area can store strain. Furthermore, are the slow earthquakes doing anything to load deeper parts of the area that do cause big earthquakes? We don't know." | Earthquakes | 2,017
June 7, 2017 | https://www.sciencedaily.com/releases/2017/06/170607123923.htm | Seismic CT scan points to rapid uplift of Southern Tibet | Using seismic data and supercomputers, Rice University geophysicists have conducted a massive seismic CT scan of the upper mantle beneath the Tibetan Plateau and concluded that the southern half of the "Roof of the World" formed in less than one-quarter of the time since the beginning of India-Eurasia continental collision. | The research, which appears online this week in the journal "The features that we see in our tomographic image are very different from what has been seen before using traditional seismic inversion techniques," said Min Chen, the Rice research scientist who headed the project. "Because we used full waveform inversion to assimilate a large seismic data set, we were able to see more clearly how the upper-mantle lithosphere beneath Southern Tibet differs from that of the surrounding region. Our seismic image suggests that the Tibetan lithosphere thickened and formed a denser root that broke away and sank deeper into the mantle. We conclude that most of the uplift across Southern Tibet likely occurred when this lithospheric root broke away." The research could help answer longstanding questions about Tibet's formation. Known as the "Roof of the World," the Tibetan Plateau stands more than three miles above sea level. The basic story behind its creation -- the tectonic collision between the Indian and Eurasian continents -- is well-known to schoolchildren the world over, but the specific details have remained elusive. For example, what causes the plateau to rise and how does its high elevation impact Earth's climate? "The leading theory holds that the plateau rose continuously once the India-Eurasia continental collision began, and that the plateau is maintained by the northward motion of the Indian plate, which forces the plateau to shorten horizontally and move upward simultaneously," said study co-author Fenglin Niu, a professor of Earth science at Rice. "Our findings support a different scenario, a more rapid and pulsed uplift of Southern Tibet." It took three years for Chen and colleagues to complete their tomographic model of the crust and upper-mantle structure beneath Tibet. The model is based on readings from thousands of seismic stations in China, Japan and other countries in East Asia. Seismometers record the arrival time and amplitude of seismic waves, pulses of energy that are released by earthquakes and that travel through Earth. The arrival time of a seismic wave at a particular seismometer depends upon what type of rock it has passed through. Working backward from instrument readings to calculate the factors that produced them is something scientists refer to as an inverse problem, and seismological inverse problems with full waveforms incorporating all kinds of usable seismic waves are some of the most complex inverse problems to solve. Chen and colleagues used a technique called full waveform inversion, "an iterative full waveform-matching technique that uses a complicated numerical code that requires parallel computing on supercomputers," she said.
"The seismic stations are like the ears of the animal, but the echo that they are hearing is a seismic wave that has either been transmitted through or bounced off of subsurface features inside Earth."The tomographic model includes features to a depth of about 500 miles below Tibet and the Himalaya Mountains. The model was computed on Rice's DAVinCI computing cluster and on supercomputers at the University of Texas that are part of the National Science Foundation's Extreme Science and Engineering Discovery Environment (XSEDE)."The mechanism that led to the rise of Southern Tibet is called lithospheric thickening and foundering," Chen said. "This happened because of convergence of two continental plates, which are each buoyant and not easy to subduct underneath the other plate. One of the plates, in this case on the Tibetan side, was more deformable than the other, and it began to deform around 45 million years ago when the collision began. The crust and the rigid lid of upper mantle -- the lithosphere -- deformed and thickened, and the denser lower part of this thickened lithosphere eventually foundered, or broke off from the rest of the lithosphere. Today, in our model, we can see a T-shaped section of this foundered lithosphere that extends from a depth of about 250 kilometers to at least 660 kilometers."Chen said that after the denser lithospheric root broke away, the remaining lithosphere under Southern Tibet experienced rapid uplift in response."The T-shaped piece of foundered lithosphere sank deeper into the mantle and also induced hot upwelling of the asthenosphere, which leads to surface magmatism in Southern Tibet," she said.Such magmatism is documented in the rock record of the region, beginning around 30 million years ago in an epoch known as the Oligocene."The spatial correlation between our tomographic model and Oligocene magmatism suggests that the Southern Tibetan uplift happened in a relatively short geological span that could have been as short as 5 million years," Chen said. | Earthquakes | 2,017 |
June 6, 2017 | https://www.sciencedaily.com/releases/2017/06/170606170332.htm | New evidence reveals source of 1586 Sanriku, Japan tsunami | A team of researchers, led by Dr. Rhett Butler, geophysicist at the University of Hawai'i at Mānoa (UHM), re-examined historical evidence around the Pacific and discovered the origin of the tsunami that hit Sanriku, Japan in 1586 -- a mega-earthquake from the Aleutian Islands that broadly impacted the north Pacific. Until now, this was considered an orphan tsunami, a historical tsunami without an obvious local earthquake source, likely originating far away. | Butler and scientists from the National Tropical Botanical Garden, UHM School of Ocean and Earth Science and Technology, and NOAA's Pacific Tsunami Warning Center analyzed material deposited into Makauwahi Cave, Kauai during a tsunami -- specifically, coral fragments that were previously dated to approximately the sixteenth century using carbon-14. Using specific isotopes of naturally-occurring thorium and uranium in the coral fragments, they determined a very precise age of the tsunami event that washed the coral ashore. Prior carbon-14 dates had an uncertainty of ±120 years, whereas the uranium-thorium date is more precise, 1572±21 years. This increased precision allowed better comparison with dated, known tsunamis and earthquakes throughout the Pacific. "Although we were aware of the 1586 Sanriku tsunami, the age of the Kauai deposit was too uncertain to establish a link," said Butler. "Also, the 1586 Sanriku event had been ascribed to an earthquake in Lima, Peru. After dating the corals, their more precise date matched with that of the Sanriku tsunami." Further, re-analysis of the Peruvian evidence showed that the 1586 Peruvian earthquake was not large enough to create a measurable tsunami hitting Japan. They found additional corroborative evidence around the Pacific which strengthened the case. Earthquakes from Cascadia, the Alaskan Kodiak region, and Kamchatka were incompatible with the Sanriku data in several ways. However, a mega-earthquake (magnitude greater than 9.25) in the Aleutians was consistent with evidence from Kauai and the northeast coast of Japan. "Hawaii is surrounded by the 'ring of fire' where mega-earthquakes generate great tsunamis impacting our island shores -- the 2011 Tohoku Japan is the most recent example," said Butler. "Even though there was no seismic instrumentation in the 16th century, we offer a preponderance of evidence for the occurrence of a magnitude 9 earthquake in the Aleutian Islands. Our knowledge of past events helps us to forecast tsunami effects and thereby enable us to assess this risk for Hawaii." Forecast models of a great Aleutian event inform the development of new maps of extreme tsunami inundation zones for the State of Hawai'i. By linking evidence on Kauai to other sites around the Pacific, we can better understand the Aleutian earthquake that generated the tsunami. Butler and colleagues at UHM are now working to determine how frequently great earthquakes along the Cascadia margin of the Pacific Northwest might occur. These events have the potential to devastate the coasts of Oregon and Washington, and send a dangerous tsunami to Hawai'i's shores. | Earthquakes | 2,017
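The dating step above can be made concrete. In the simplest closed-system case (no initial 230Th and 234U in secular equilibrium; real coral work applies a small 234U-excess correction and full error propagation), the measured (230Th/238U) activity ratio grows toward equilibrium at a known rate and yields an age directly. A schematic sketch, with an invented ratio chosen to give a mid-sixteenth-century age:

```python
import numpy as np

# 230Th decay constant (half-life ~75.7 kyr), per year
LAMBDA_230 = np.log(2) / 75_690.0

def uth_age_yr(th230_u238_activity):
    """Closed-system age from the (230Th/238U) activity ratio, assuming
    zero initial 230Th and secular-equilibrium 234U (simplified)."""
    return -np.log(1.0 - th230_u238_activity) / LAMBDA_230

print(f"{uth_age_yr(0.00394):.0f} years before measurement")  # ~430 yr
```

Because the ingrowth clock ticks steadily and the isotope ratios can be measured very precisely, the ±21-year uncertainty quoted in the study becomes achievable where radiocarbon gave ±120 years.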
June 5, 2017 | https://www.sciencedaily.com/releases/2017/06/170605155932.htm | Why rocks flow slowly in Earth's middle mantle | For decades, researchers have studied the interior of the Earth using seismic waves from earthquakes. Now a recent study, led by Arizona State University's School of Earth and Space Exploration Associate Professor Dan Shim, has re-created in the laboratory the conditions found deep in the Earth, and used this to discover an important property of the dominant mineral in Earth's mantle, a region lying far below our feet. | Shim and his research team combined X-ray techniques in the synchrotron radiation facility at the U.S. Department of Energy's National Labs and atomic resolution electron microscopy at ASU to determine what causes unusual flow patterns in rocks that lie 600 miles and more deep within the Earth. Their results have been published in the Planet Earth is built of layers. These include the crust at the surface, the mantle and the core. Heat from the core drives a slow churning motion of the mantle's solid silicate rocks, like slow-boiling fudge on a stove burner. This conveyor-belt motion causes the crust's tectonic plates at the surface to jostle against each other, a process that has continued for at least half of Earth's 4.5 billion-year history. Shim's team focused on a puzzling part of this cycle: Why does the churning pattern abruptly slow at depths of about 600 to 900 miles below the surface? "Recent geophysical studies have suggested that the pattern changes because the mantle rocks flow less easily at that depth," Shim said. "But why? Does the rock composition change there? Or do rocks suddenly become more viscous at that depth and pressure? No one knows." To investigate the question in the lab, Shim's team studied bridgmanite, an iron-containing mineral that previous work has shown is the dominant component in the mantle. "We discovered that changes occur in bridgmanite at the pressures expected for 1,000 to 1,500 km depths," Shim said. "These changes can cause an increase in bridgmanite's viscosity -- its resistance to flow." The team synthesized samples of bridgmanite in the laboratory and subjected them to the high-pressure conditions found at different depths in the mantle. The experiments showed the team that, above a depth of 1,000 kilometers and below a depth of 1,700 km, bridgmanite contains nearly equal amounts of oxidized and reduced forms of iron. But at pressures found between those two depths, bridgmanite undergoes chemical changes that end up significantly lowering the concentration of iron it contains. The process starts with driving oxidized iron out of the bridgmanite. The oxidized iron then consumes the small amounts of metallic iron that are scattered through the mantle like poppy seeds in a cake. This reaction removes the metallic iron and results in making more reduced iron in the critical layer. Where does the reduced iron go? The answer, said Shim's team, is that it goes into another mineral present in the mantle, ferropericlase, which is chemically prone to absorbing reduced iron. "Thus the bridgmanite in the deep layer ends up with less iron," explained Shim, noting that this is the key to why this layer behaves the way it does. "As it loses iron, bridgmanite becomes more viscous," Shim said. "This can explain the seismic observations of slowed mantle flow at that depth." | Earthquakes | 2,017
June 5, 2017 | https://www.sciencedaily.com/releases/2017/06/170605110059.htm | Earliest human-made climate change took place 11,500 years ago | The vast majority of climate scientists agree that climate-warming trends over the past century have been due to human activities. A new Tel Aviv University study has uncovered the earliest known geological indications of human-made climate change from 11,500 years ago. Within a core sample retrieved from the Dead Sea, researchers discovered basin-wide erosion rates dramatically incompatible with known tectonic and climatic regimes of the period recorded. | "Human impact on the natural environment is now endangering the entire planet," said Prof. Shmuel Marco, Head of TAU's School of Geosciences, who led the research team. "It is therefore crucial to understand these fundamental processes. Our discovery provides a quantitative assessment for the commencement of significant human impact on the Earth's geology and ecosystems." The results of the study were published in The research was conducted by TAU post-doctoral student Dr. Yin Lu in collaboration with Prof. Dani Nadel and Prof. Nicolas Waldman, both of the University of Haifa. It took place as part of the Dead Sea Deep Drilling project, which harnessed a 1,500-foot-deep drill core to delve into the Dead Sea basin. The core sample provided the team with a sediment record of the last 220,000 years. The newly discovered erosion occurred during the Neolithic Revolution, the wide-scale transition of human cultures from hunting and gathering to agriculture and settlement. The shift resulted in an exponentially larger human population on the planet. "Natural vegetation was replaced by crops, animals were domesticated, grazing reduced the natural plant cover, and deforestation provided more area for grazing," said Prof. Marco. "All these resulted in the intensified erosion of the surface and increased sedimentation, which we discovered in the Dead Sea core sample." The Dead Sea drainage basin serves as a natural laboratory for understanding how sedimentation rates in a deep basin are related to climate change, tectonics, and human-made impacts on the landscape. "We noted a sharp threefold increase in the fine sand that was carried into the Dead Sea by seasonal floods," said Prof. Marco. "This intensified erosion is incompatible with tectonic and climatic regimes during the Holocene, the geological epoch that began after the Pleistocene some 11,700 years ago." The researchers are currently in the process of recovering the record of earthquakes from the same drill core. "We have identified disturbances in the sediment layers that were triggered by the shaking of the lake bottom," Prof. Marco said. "It will provide us with a 220,000-year record -- the most extensive earthquake record in the world." | Earthquakes | 2,017
June 1, 2017 | https://www.sciencedaily.com/releases/2017/06/170601125115.htm | Extreme geothermal activity discovered beneath New Zealand’s Southern Alps | An international team, including University of Southampton scientists, has found unusually high temperatures, greater than 100°C, close to Earth's surface in New Zealand -- a phenomenon typically only seen in volcanic areas such as Iceland or Yellowstone, USA. | The researchers made the discovery while boring almost a kilometre into the Alpine Fault, the major tectonic boundary between the Australian and Pacific plates -- running the length of the country's South Island. The team was working to better understand what happens at a tectonic plate boundary in the build-up to a large earthquake. The Deep Fault Drilling Project (DFDP) borehole was drilled at Whataroa to the north of Franz Josef Glacier and discovered extremely hot, highly pressured groundwater flowing near to the fault line. Water at temperatures of more than 100°C is normally only found at depths of over three kilometres, but in this case was encountered at just over 600m depth. In an article published in the international journal Nature, computer models are used to show these high temperatures result from a combination of the uplift of hot rocks along the tectonic plate boundary and groundwater flow caused by high mountains close to the Alpine Fault. Professor Damon Teagle, who leads the Southampton group involved in the project, says: "The Alpine Fault extends over such a massive distance, it is visible from space. It is potentially New Zealand's greatest geohazard, failing in the form of large earthquakes about every 300 years. With the last event occurring in 1717 AD, there is a high probability of a major (magnitude 7 to 8) earthquake in the next 50 years -- making research into its behaviour all the more important." Thermal and hydrological computer modelling carried out before drilling by University of Southampton PhD student Jamie Coussens, supervisor Dr Nick Woodman and other colleagues helped predict the high temperatures and borehole fluid pressures to enable the safe drilling of this borehole. Their models, now calibrated against real sub-surface observations, explain how such high temperatures occur at shallow depths. "The Southern Alps receive a lot of rain and snow -- about ten times more than the UK. Much of this water flows into the ground, down beneath the high mountain ridges, before being heated and returning to the surface in valleys," says Jamie Coussens. "The rocks that this water flows through are being moved upwards at about 10 mm a year on the Alpine Fault. This slip is very fast in geological terms and has carried up hot rocks from 30 km depth, faster than they can cool." Although warm springs are common in the region, most of the hot groundwater flows up into, or near, the gravelly beds of large rivers and becomes diluted at the surface by cooler river waters. The result has implications for our understanding of the strength of the Alpine Fault and fault zones in general, as failure properties of fault rocks are influenced by temperature and geothermal fluids. Professor Teagle comments: "The temperature profile of the DFDP borehole is really exciting. These very high shallow temperatures prove early theoretical models of rapid tectonic uplift first suggested in the 1980s for the Southern Alps by profound thinkers such as Peter Koons and Rick Allis.
I was inspired by these theories as an undergraduate at the University of Otago in southern New Zealand -- it is wonderful to see these early conceptual predictions proven with borehole observations." | Earthquakes | 2,017 |
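The depth contrast reported in the Alpine Fault article can be made concrete with a back-of-the-envelope gradient calculation. The sketch below assumes a surface temperature of about 10°C and linear temperature profiles -- both simplifying assumptions of ours, not figures from the article:

```python
# Back-of-the-envelope geothermal gradients implied by the DFDP borehole result.
# Assumptions (not from the article): mean surface temperature ~10 C and a
# linear temperature increase with depth.

T_SURFACE_C = 10.0  # assumed surface temperature

def gradient_c_per_km(temp_c, depth_km):
    """Linear geothermal gradient needed to reach temp_c at depth_km."""
    return (temp_c - T_SURFACE_C) / depth_km

# Typical setting described in the article: >100 C only below ~3 km.
typical = gradient_c_per_km(100.0, 3.0)
# Whataroa borehole: >100 C at just over 0.6 km.
whataroa = gradient_c_per_km(100.0, 0.62)

print(f"Typical setting:   ~{typical:.0f} C/km")
print(f"Whataroa borehole: ~{whataroa:.0f} C/km "
      f"(~{whataroa / typical:.0f}x the typical gradient)")
```

Under these assumptions the borehole implies a gradient roughly five times the ordinary continental value, which is what makes the observation so striking.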
May 25, 2017 | https://www.sciencedaily.com/releases/2017/05/170525141547.htm | Why the Sumatra earthquake was so severe | An international team of scientists has found evidence suggesting that the dehydration of minerals deep below the ocean floor influenced the severity of the Sumatra earthquake, which took place on December 26, 2004. | The earthquake, measuring magnitude 9.2, and the subsequent tsunami devastated coastal communities of the Indian Ocean, killing over 250,000 people. Research into the earthquake was conducted during a scientific ocean drilling expedition to the region in 2016, as part of the International Ocean Discovery Program (IODP), led by scientists from the University of Southampton and Colorado School of Mines. During the expedition on board the research vessel JOIDES Resolution, the researchers sampled, for the first time, sediments and rocks from the oceanic tectonic plate which feeds the Sumatra subduction zone. A subduction zone is an area where two of the Earth's tectonic plates converge, one sliding beneath the other, generating the largest earthquakes on Earth, many with destructive tsunamis. The findings of a study on sediment samples found far below the seabed are now detailed in a new paper led by Dr Andre Hüpers of the MARUM-Center for Marine Environmental Sciences at the University of Bremen. Expedition co-leader Professor Lisa McNeill, of the University of Southampton, says: "The 2004 Indian Ocean tsunami was triggered by an unusually strong earthquake with an extensive rupture area. We wanted to find out what caused such a large earthquake and tsunami and what this might mean for other regions with similar geological properties." The scientists concentrated their research on a process of dehydration of sedimentary minerals deep below the ground, which usually occurs within the subduction zone. It is believed this dehydration process, which is influenced by the temperature and composition of the sediments, normally controls the location and extent of slip between the plates, and therefore the severity of an earthquake. In Sumatra, the team used the latest advances in ocean drilling to extract samples from 1.5 km below the seabed. They then took measurements of sediment composition and chemical, thermal, and physical properties, and ran simulations to calculate how the sediments and rock would behave once they had travelled 250 km to the east towards the subduction zone and been buried significantly deeper, reaching higher temperatures. The researchers found that the sediments on the ocean floor, eroded from the Himalayan mountain range and Tibetan Plateau and transported thousands of kilometres by rivers on land and in the ocean, are thick enough to reach high temperatures and to drive the dehydration process to completion before the sediments reach the subduction zone. This creates unusually strong material, allowing earthquake slip at the subduction fault surface to extend to shallower depths and over a larger fault area -- causing the exceptionally strong earthquake seen in 2004. Dr Andre Hüpers of the University of Bremen says: "Our findings explain the extent of the large rupture area, which was a feature of the 2004 earthquake, and suggest that other subduction zones with thick and hotter sediment and rocks could also experience this phenomenon." "This will be particularly important for subduction zones with limited or no historic subduction earthquakes, where the hazard potential is not well known. Subduction zone earthquakes typically have a return time of a few hundred to a thousand years. Therefore our knowledge of previous earthquakes in some subduction zones can be very limited." Similar subduction zones exist in the Caribbean (Lesser Antilles), off Iran and Pakistan (Makran), and off western USA and Canada (Cascadia). The team will continue research on the samples and data obtained from the Sumatra drilling expedition over the next few years, including laboratory experiments and further numerical simulations, and they will use their results to assess the potential future hazards both in Sumatra and at these comparable subduction zones. | Earthquakes | 2,017
May 25, 2017 | https://www.sciencedaily.com/releases/2017/05/170525141544.htm | US nuclear regulators greatly underestimate potential for nuclear disaster | The U.S. Nuclear Regulatory Commission (NRC) relied on faulty analysis to justify its refusal to adopt a critical measure for protecting Americans from the occurrence of a catastrophic nuclear-waste fire at any one of dozens of reactor sites around the country, according to an article in the May 26 issue of | Published by researchers from Princeton University and the Union of Concerned Scientists, the article argues that NRC inaction leaves the public at high risk from fires in spent-nuclear-fuel cooling pools at reactor sites. The pools -- water-filled basins that store and cool used radioactive fuel rods -- are so densely packed with nuclear waste that a fire could release enough radioactive material to contaminate an area twice the size of New Jersey. On average, radioactivity from such an accident could force approximately 8 million people to relocate and result in $2 trillion in damages. These catastrophic consequences, which could be triggered by a large earthquake or a terrorist attack, could be largely avoided by regulatory measures that the NRC refuses to implement. Using a biased regulatory analysis, the agency excluded the possibility of an act of terrorism as well as the potential for damage from a fire beyond 50 miles of a plant. Failing to account for these and other factors led the NRC to significantly underestimate the destruction such a disaster could cause. "The NRC has been pressured by the nuclear industry, directly and through Congress, to low-ball the potential consequences of a fire because of concerns that increased costs could result in shutting down more nuclear power plants," said paper co-author Frank von Hippel, a senior research physicist at Princeton's Program on Science and Global Security (SGS), based at the Woodrow Wilson School of Public and International Affairs. "Unfortunately, if there is no public outcry about this dangerous situation, the NRC will continue to bend to the industry's wishes." Von Hippel's co-authors are Michael Schoeppner, a former postdoctoral researcher at Princeton's SGS, and Edwin Lyman, a senior scientist at the Union of Concerned Scientists. Spent-fuel pools were brought into the spotlight following the March 2011 nuclear disaster in Fukushima, Japan. A 9.0-magnitude earthquake caused a tsunami that struck the Fukushima Daiichi nuclear power plant, disabling the electrical systems necessary for cooling the reactor cores. This led to core meltdowns at three of the six reactors at the facility, hydrogen explosions, and a release of radioactive material. "The Fukushima accident could have been a hundred times worse had there been a loss of the water covering the spent fuel in pools associated with each reactor," von Hippel said. "That almost happened at Fukushima in Unit 4." In the aftermath of the Fukushima disaster, the NRC considered proposals for new safety requirements at U.S. plants. One was a measure prohibiting plant owners from densely packing spent-fuel pools, requiring them instead to expedite the transfer of all spent fuel that has cooled in pools for at least five years to dry storage casks, which are inherently safer. Densely packed pools are highly vulnerable to catching fire and releasing huge amounts of radioactive material into the atmosphere. The NRC analysis found that a fire in a spent-fuel pool at an average nuclear reactor site would cause $125 billion in damages, while expedited transfer of spent fuel to dry casks could reduce radioactive releases from pool fires by 99 percent. However, the agency decided the possibility of such a fire is so unlikely that it could not justify requiring plant owners to pay the estimated cost of $50 million per pool. The NRC cost-benefit analysis assumed there would be no consequences from radioactive contamination beyond 50 miles from a fire. It also assumed that all contaminated areas could be effectively cleaned up within a year. Both of these assumptions are inconsistent with experience after the Chernobyl and Fukushima accidents. In two previous articles, von Hippel and Schoeppner released figures that correct for these and other errors and omissions. They found that millions of residents in surrounding communities would have to relocate for years, resulting in total damages of $2 trillion -- nearly 20 times the NRC's result. Considering that the nuclear industry is only legally liable for $13.6 billion, thanks to the Price-Anderson Act of 1957, U.S. taxpayers would have to cover the remaining costs. The authors point out that if the NRC does not take action to reduce this danger, Congress has the authority to fix the problem. Moreover, the authors suggest that states that provide subsidies to uneconomical nuclear reactors within their borders could also play a constructive role by making those subsidies available only to plants that agree to carry out expedited transfer of spent fuel. "In far too many instances, the NRC has used flawed analysis to justify inaction, leaving millions of Americans at risk of a radiological release that could contaminate their homes and destroy their livelihoods," said Lyman. "It is time for the NRC to employ sound science and common-sense policy judgments in its decision-making process." | Earthquakes | 2,017
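The dispute above is, at bottom, expected-value arithmetic: mitigation is justified when the expected damages it avoids exceed its cost. The sketch below uses the dollar figures quoted in the article, but the annual fire probability and plant lifetime are invented for illustration -- they are not NRC or study figures:

```python
# Sketch of the cost-benefit arithmetic at issue. Dollar figures are quoted in
# the article; P_FIRE_PER_YEAR and HORIZON_YEARS are ILLUSTRATIVE assumptions.

MITIGATION_COST = 50e6    # $50 million per pool (quoted estimate)
DAMAGE_NRC = 125e9        # NRC's damage estimate for a pool fire
DAMAGE_STUDY = 2e12       # study's corrected damage estimate
RELEASE_REDUCTION = 0.99  # expedited dry-cask transfer cuts releases ~99%
P_FIRE_PER_YEAR = 1e-5    # assumed annual fire probability per pool
HORIZON_YEARS = 60        # assumed remaining operating lifetime

def expected_avoided_loss(damage):
    """Expected damages avoided over the horizon if releases drop ~99%."""
    return P_FIRE_PER_YEAR * HORIZON_YEARS * damage * RELEASE_REDUCTION

for label, damage in [("NRC estimate", DAMAGE_NRC), ("study estimate", DAMAGE_STUDY)]:
    benefit = expected_avoided_loss(damage)
    verdict = "justifies" if benefit > MITIGATION_COST else "does not justify"
    print(f"{label}: expected avoided loss ${benefit:,.0f} "
          f"{verdict} the ${MITIGATION_COST:,.0f} mitigation cost")
```

The point of the sketch is sensitivity, not the specific verdicts: the conclusion swings on the assumed damage figure and fire probability, which is exactly where the article says the NRC's analysis was biased.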
May 24, 2017 | https://www.sciencedaily.com/releases/2017/05/170524152628.htm | The birth and death of a tectonic plate | Several hundred miles off the Pacific Northwest coast, a small tectonic plate called the Juan de Fuca is slowly sliding under the North American continent. This subduction has created a collision zone with the potential to generate huge earthquakes and accompanying tsunamis, which happen when faulted rock abruptly shoves the ocean out of its way. | In fact, this region represents the single greatest geophysical hazard to the continental United States; quakes centered here could register as hundreds of times more damaging than even a big temblor on the San Andreas Fault. Not surprisingly, scientists are interested in understanding as much as they can about the Juan de Fuca Plate. This microplate is "born" just 300 miles off the coast, at a long range of underwater volcanoes that produce new crust from melt generated deep below. Part of the global mid-ocean ridge system that encircles the planet, these regions generate 70 percent of Earth's tectonic plates. However, because the chains of volcanoes lie more than a mile beneath the sea surface, scientists know surprisingly little about them. UC Santa Barbara geophysicist Zachary Eilon and his co-author Geoff Abers at Cornell University have conducted new research -- using a novel measurement technique -- that has revealed a strong signal of seismic attenuation, or energy loss, at the mid-ocean ridge where the Juan de Fuca Plate is created. The researchers' attenuation data imply that molten rock here is found even deeper within Earth than scientists had previously thought. This in turn helps scientists understand the processes by which Earth's tectonic plates are built, as well as the deep plumbing of volcanic systems. "We've never had the ability to measure attenuation this way at a mid-ocean ridge before, and the magnitude of the signal tells us that it can't be explained by shallow structure," said Eilon, an assistant professor in UCSB's Department of Earth Science. "Whatever is down there causing all this seismic energy to be lost extends really deep, at least 200 kilometers beneath the surface. That's unexpected, because we think of the processes that give rise to this -- particularly the effect of melting beneath the surface -- as being shallow, confined to 60 km or less." According to Eilon's calculations, the narrow strip underneath the mid-ocean ridge, where hot rock wells up to generate the Juan de Fuca Plate, has very high attenuation. In fact, its levels are as high as scientists have seen anywhere on the planet. His findings also suggest that the plate is cooling faster than expected, which affects the friction at the collision zone and the resulting size of any potential megaquake. Seismic waves begin at an earthquake and radiate away from it. As they disperse, they lose energy. Some of that loss is simply due to spreading out, but another parameter also affects energy loss. Called the quality factor, it essentially describes how squishy Earth is, Eilon said. He used the analogy of a bell to explain how the quality factor works. "If I were to give you a well-made bell and you were to strike it once, it would ring for a long time," he explained. "That's because very little of the energy is actually being lost with each oscillation as the bell rings. That's very low attenuation, very high quality. But if I give you a poorly made bell and you strike it once, the oscillations will die out very quickly. That's high attenuation, low quality." Eilon looked at the way different frequencies of seismic waves attenuated at different rates. "We looked not only at how much energy is lost but also at the different amounts by which various frequencies are delayed," he explained. "This new, more robust way of measuring attenuation is a breakthrough that can be applied in other systems around the world." "Attenuation is a very hard thing to measure, which is why a lot of people ignore it," Eilon added. "But it gives us a huge amount of new information about Earth's interior that we wouldn't have otherwise." Next year, Eilon will be part of an international effort to instrument large unexplored swaths of the Pacific with ocean bottom seismometers. Once that data has been collected, he will apply the techniques he developed on the Juan de Fuca in the hope of learning more about what lies beneath the seafloor in the old oceans, where mysterious undulations in Earth's gravity field have been measured. "These new ocean bottom data, which are really coming out of technological advances in the instrumentation community, will give us new abilities to see through the ocean floor," Eilon said. "This is huge because 70 percent of Earth's surface is covered by water and we've largely been blind to it -- until now." "The Pacific Northwest project was an incredibly ambitious community experiment," he said. "Just imagine the sort of things we'll find out once we start to put these instruments in other places." | Earthquakes | 2,017
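Eilon's bell analogy maps onto the standard seismological definition of the quality factor: an oscillation's amplitude decays as A(t) = A0 · exp(-π f t / Q). This is the textbook relation, not a formula quoted from the paper itself; the frequency below is an illustrative choice:

```python
# The bell analogy in standard notation: for an oscillator with quality factor
# Q, amplitude decays as A(t) = A0 * exp(-pi * f * t / Q). Textbook relation,
# not taken from Eilon's paper; FREQ is an illustrative value.
import math

def amplitude(t_seconds, freq_hz, q, a0=1.0):
    """Remaining amplitude after t seconds of oscillation at freq_hz."""
    return a0 * math.exp(-math.pi * freq_hz * t_seconds / q)

FREQ = 1.0  # Hz, a typical body-wave frequency (illustrative)
for q, label in [(1000, "well-made bell / low attenuation"),
                 (50, "poorly made bell / high attenuation")]:
    print(f"Q = {q:4d} ({label}):")
    for t in (10, 60):
        print(f"  after {t:2d} s: {amplitude(t, FREQ, q):.3f} of initial amplitude")
```

With Q = 1000 the "bell" keeps over 80 percent of its amplitude after a minute; with Q = 50 it is down to a few percent -- the ringing-versus-thud contrast Eilon describes.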
May 24, 2017 | https://www.sciencedaily.com/releases/2017/05/170524085528.htm | Atlas of the Human Planet 2017: How exposed are we to natural hazards? | One out of three people in the world is exposed to earthquakes, a number which has almost doubled in the past 40 years. Around 1 billion people in 155 countries are exposed to floods, and 414 million live near one of the 220 most dangerous volcanoes. The 2017 edition of the Atlas of the Human Planet, by the European Commission's Joint Research Centre, looks at the exposure of people and built-up areas to the six major natural hazards, and its evolution over the last 40 years. The atlas will be presented during the 2017 Global Platform for Disaster Risk Reduction meeting in Cancun, Mexico. | The atlas covers six major natural hazards: earthquakes, volcanoes, tsunamis, tropical cyclone winds, tropical cyclone storm surge, and floods. Global exposure to these hazards doubled between 1975 and 2015, mostly due to urbanisation, population growth and socioeconomic development. Some of the hazards pose a threat to a particularly large number of people in different regions of the world. Of all the hazards, the largest number of people are exposed to earthquakes. The number of people living in seismic areas increased by 93% in 40 years (from 1.4 billion in 1975 to 2.7 billion in 2015). In 2015, more than 400 million people lived near one of the 220 most dangerous volcanoes, exposed to the consequences of possible eruptions. Tsunamis affect coastal areas in many regions, with the dangerous areas most concentrated in Asia. The largest built-up surface exposed to tsunamis is in Japan by far, followed by China and the United States of America. Japan's population is four times more exposed than that of China, the second most affected country. More than 170 million people in Europe are potentially exposed to earthquakes, almost a quarter of the total population. In Italy, Romania, or Greece, the share of exposed over total population reaches over 80%. Flooding is the most common of the hazards studied. Within Europe, Germany has the highest number of people exposed to floods, about 8 million (10% of the national population), followed by France with 5.7 million (9%). Eleven million Europeans live within 100 km of an active volcano, whose eruptions could affect not only housing and settlements but also everyday activities, including transportation. The potentially exposed built-up surface increased by 86% from 1975. Exposure to floods, the most frequent natural disaster, is highest in Asia (76.9% of the global exposed population) and in Africa (12.2%). The world population potentially exposed to floods was around 1 billion in 155 countries in 2015; 11% of the built-up area on Earth is potentially exposed to this hazard, too. Tropical cyclone winds pose a threat to 1.6 billion people in 89 countries, up from 1 billion in 1975. In 2015, 640 million people were exposed to extremely strong cyclone winds, with the largest built-up surface exposed to strong cyclone winds found in China and Japan. Furthermore, 50 million Chinese are exposed to storm surge as a consequence of tropical cyclones, up by almost 20 million in the last 40 years. Global analysis of exposure and its development over the last 40 years helps us better understand what affects disaster risk over time and what drives it. It is also useful in identifying effective policy actions for more resilient communities. The exposure data and the findings of the Atlas support the implementation of the post-2015 international frameworks: the UN Framework Convention on Climate Change, the Sendai Framework for Disaster Risk Reduction 2015-2030, the Sustainable Development Goals (SDGs), and the New Urban Agenda (Habitat III). The Global Human Settlement Layer (GHSL, see below) baseline data provide insights into developments over the last 40 years and into the impact that policies have on them. Researchers and policy makers can also use the data to aggregate exposure information at all geographical scales, from the city level to the regional, continental and global levels. The Atlas of the Human Planet 2017 builds on its first edition, published in 2016, in which JRC scientists combined earth observation with spatial modelling techniques to create the Global Human Settlement Layer (GHSL). The GHSL is the first global, fine-scale, multi-temporal and open dataset on the physical characteristics and the dynamics of human settlements, covering 40 years of satellite observations. The GHSL dataset has now been combined with the best available global hazard maps to measure the potential exposure to natural hazards over time. At the UN Global Platform for Disaster Risk Reduction, the JRC will also present the report 'Science for Disaster Risk Management 2017: knowing more and losing less', a flagship product of the European Commission's Disaster Risk Management Knowledge Centre (DRMKC), compiling the state of the art in disaster risk management. | Earthquakes | 2,017
May 23, 2017 | https://www.sciencedaily.com/releases/2017/05/170523081933.htm | Supercomputing helps researchers understand Earth's interior | Contrary to posters you may have seen hanging on the walls in science buildings and classrooms, Lijun Liu, professor of geology at Illinois, knows that Earth's interior is not like an onion. | While most textbooks depict the outer surface of the Earth as the crust, the next inner level as the mantle, and the innermost layer as the core, Liu said the reality isn't as clear-cut. "It's not just in layers, because the Earth's interior is not stationary," Liu said. In fact, underneath our feet there's tectonic activity that many scientists have been aware of, but Liu and his team have created a computer model to help better explain it -- a model so effective that researchers believe it has the potential to predict where earthquakes and volcanoes will occur. Using this model, Liu, along with doctoral student Jiashun Hu and Manuele Faccenda from the University of Padua in Italy, recently published a research paper. "It's well-known that there are plate tectonics driving the Earth's evolution, but exactly how this process works is not entirely clear," he said. Liu and Hu looked specifically at the continent of South America to determine which tectonic factors contribute to the deformation, or the evolution, of the mantle. To answer this question, the team created a data-centric model using the Blue Waters supercomputer at the National Center for Supercomputing Applications at Illinois. The sophisticated four-dimensional, data-oriented geodynamic models are among the first of their kind. "We are actually the first ones to use data assimilation models in studying mantle deformation, in an approach similar to weather forecasting," Liu said. "We are trying to produce a system model that simultaneously satisfies all the observations we have. We can then obtain a better understanding about dynamic processes of the Earth's evolution." While there are many debates regarding how the Earth's internal evolution is driven, the model created by the team seemed to find an answer that better fits available observations and underlying physics. The team found that the subducting slab -- a portion of the oceanic plate that slides beneath a continental plate -- is the dominant driving force behind the deformation of the mantle. Essentially, the active subduction of the slab determines most other processes that happen as part of a chain reaction. "The result is game-changing. The driving force of mantle flow is actually simpler than people thought," Liu said. "It is the most direct consequence of plate tectonics. When the slab subducts, it naturally controls everything surrounding it. In a way this is elegant, because it's simple." By understanding this mechanism of Earth evolution, the team can make better predictions regarding the movement of the mantle and the lithosphere, or crust. The team then evaluated the model's predictions using other data. Hu, the lead author on the paper, said that by comparing the predictions to tectonic activities such as the formation of mountains and volcanoes, a clear consistency emerged. "We think our story is correct," Hu said. Consequently, the model also provides interesting insight into the evolution of continents as far back as the Jurassic, when dinosaurs roamed the Earth on Pangaea, the only continent at the time. This is still the team's ongoing research. In a separate paper that uses the same simulation, Liu and Hu reported a link between seismicity and the geometry of the subducting slab. "We found that whenever you see a lack of earthquakes in a region, it corresponds to a hole in the slab," Liu said. "Because of the missing slab in the hole, there's no way to generate earthquakes, so we might be able to know where more earthquakes will take place." The model also explained why certain volcanoes might exist further inland and have different compositions, despite the common thought that volcanoes should exist solely along the coast as a result of water coming off the down-going slab. As the model helps explain, a volcano can form inland if the slab subducts at a shallower angle, and a hole in the shallow slab allows a special type of magma to form by melting of the crust. "Ultimately this model will provide a promising way of solving the question of how and why continents move the way they do," Liu said. "The answer should depend on what the mantle is doing. This is a way to much better understand Earth evolution." The team is currently expanding the model to analyze the entire globe. "We are looking forward to more exciting results," Liu said. | Earthquakes | 2,017
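Liu's "weather forecasting" comparison refers to data assimilation: repeatedly nudging a forward model toward observations as it runs. The article does not describe the team's actual scheme, so the following is only a generic toy illustration of the idea, with all numbers invented:

```python
# Generic toy of sequential data assimilation ("nudging"): a forward model is
# corrected toward observations at each step. This illustrates the idea the
# article compares to weather forecasting; it is NOT the team's geodynamic
# scheme, and every number here is invented for illustration.

model_state = 0.0   # e.g., some scalar property of the modelled mantle
TRUE_STATE = 10.0   # the "real Earth" value that observations sample
GAIN = 0.3          # how strongly observations pull on the model

def forward_model(state):
    """Stand-in dynamics: the state drifts slightly on its own."""
    return state * 1.01

def observe():
    """Stand-in observation of the true state (taken as noise-free here)."""
    return TRUE_STATE

for step in range(20):
    model_state = forward_model(model_state)  # advance the model
    innovation = observe() - model_state      # observation minus model
    model_state += GAIN * innovation          # nudge toward the data

print(f"assimilated state after 20 steps: {model_state:.2f} (truth: {TRUE_STATE})")
```

The payoff of assimilation, as the quote suggests, is a single model trajectory that stays consistent with all the observations fed into it rather than drifting freely.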
May 16, 2017 | https://www.sciencedaily.com/releases/2017/05/170516143417.htm | Aftermath of supereruption shows Toba magma system's great size | The rare but spectacular eruptions of supervolcanoes can cause massive destruction and affect climate patterns on a global scale for decades -- and a new study has found that these sites also may experience ongoing, albeit smaller, eruptions for tens of thousands of years after. | In fact, Oregon State University researchers were able to link recent eruptions at Mt. Sinabung in northern Sumatra to the last eruption on Earth of a supervolcano 74,000 years ago at the Toba Caldera some 25 miles away. "The recovery from a supervolcanic eruption is a long process, as the volcano and the magmatic system try to re-establish equilibrium -- like a body of water that has been disrupted by a rock being dropped into it," said Adonara Mucek, an Oregon State doctoral candidate and lead author on the study. "At Toba, it appears that the eruptions continued for at least 15,000 to 20,000 years after the supereruption and the structural adjustment continued at least until a few centuries ago -- and probably is continuing today. It is the magmatic equivalent to aftershocks following an earthquake." This is the first time that scientists have been able to pinpoint what happens following the eruption of a supervolcano. To qualify as a supervolcano, the eruption must reach at least magnitude 8 on the Volcanic Explosivity Index, which means the measured deposits for that eruption are greater than 1,000 cubic kilometers, or 240 cubic miles. When Toba erupted, it emitted a volume of magma 28,000 times greater than that of the 1980 eruption of Mount St. Helens in Washington state. It was so massive, it is thought to have created a volcanic winter on Earth lasting years, and possibly to have triggered a bottleneck in human evolution. Other well-known supervolcano sites include Yellowstone Park in the United States, the Taupo Caldera in New Zealand, and Campi Flegrei in Italy. "Supervolcanoes have lifetimes of millions of years during which there can be several supereruptions," said Shanaka "Shan" de Silva, an Oregon State University volcanologist and co-author on the study. "Between those eruptions, they don't die. Scientists have long suspected that eruptions continue after the initial eruption, but this is the first time we've been able to put accurate ages with those eruptions." Previous argon dating studies had provided rough ages of eruptions at Toba, but those eruption dates had too great a range of error, the researchers say. In their study, the OSU researchers and their colleagues from Australia, Germany, the United States and Indonesia were able to decipher the most recent volcanic history of Toba by measuring the amount of helium remaining in zircon crystals in erupted pumice and lava. The helium remaining in the crystals is a remnant of the decay of uranium, which has a well-understood radioactive decay path and half-life. "Toba is at least 1.3 million years old, its supereruption took place about 74,000 years ago, and it had at least six definitive eruptions after that -- and probably several more," Mucek said. "The last eruption we have detected occurred about 56,000 years ago, but there are other eruptions that remain to be studied." The researchers also managed to estimate the history of structural adjustment at Toba using carbon-14 dating of lake sediments that have been uplifted up to 600 meters above the lake in which they formed. These data show that structural adjustment continued from at least 30,000 years ago until 2,000 years ago -- and may be continuing today. The study also found that the magma in Toba's system has an identical chemical fingerprint and zircon crystallization history to Mt. Sinabung, which is currently erupting and is distinct from other volcanoes in Sumatra. This suggests that the Toba system may be larger and more widespread than previously thought, de Silva noted. "Our data suggest that the recent and ongoing eruptions of Mt. Sinabung are part of the Toba system's recovery process from the supereruption," he said. The discovery of the connection does not suggest that the Toba Caldera is in danger of erupting on a catastrophic scale any time soon, the researchers emphasized. "This is probably 'business as usual' for a recovering supervolcano," de Silva said. It does emphasize the importance of having more sophisticated and frequent monitoring of the site to measure the uplift of the ground and image the magma system, the researchers note. "The hazards from a supervolcano don't stop after the initial eruption," de Silva said. "They change to more local and regional hazards from eruptions, earthquakes, landslides and tsunamis that may continue regularly for several tens of thousands of years. Toba remains alive and active today." As large as the Toba eruption was, the reservoir of magma below the caldera is much, much greater, the researchers say. Studies at other calderas around Earth, such as Yellowstone, have estimated that there is between 10 and 50 times as much magma as is erupted during a supereruption. | Earthquakes | 2,017
May 16, 2017 | https://www.sciencedaily.com/releases/2017/05/170516104721.htm | From where will the next big earthquake hit the city of Istanbul? | The city of Istanbul is the focus of great concern for earthquake researchers. This metropolis of 15 million people is situated very close to the so-called North Anatolian Fault Zone, which runs just outside the city gates below the Marmara Sea. Here, energy constantly builds up underground as the tectonic plates interlock and plate movement comes to a halt, until a great tremor releases this energy. Scientists therefore expect an earthquake with a magnitude of 7 or greater in this region in the coming years. | The extent of the seismic threat to Istanbul depends on how strongly the plates are entangled and on the exact nucleation point of the earthquake. A team led by Marco Bohnhoff from the GFZ German Research Centre for Geosciences now presents a study indicating that the next major earthquake is more likely to originate in Istanbul's eastern Marmara Sea. "This is both good news and bad news for the city with over 15 million inhabitants. The good news: the rupture propagation will then run eastwards, i.e. away from the city," explains the researcher. "The bad news is that there will only be a very short early warning phase of a few seconds." Early warning times are extremely important in order to switch traffic lights to red, to block tunnels and bridges, or to shut down critical infrastructure. The estimates presented by Marco Bohnhoff and his team are based on the analysis of numerous small quakes along the Marmara fault. The results show that the degree of locking in the western part of the fracture zone is lower and that the two tectonic plates there are creeping past one another at a very slow rate. During this process, small tremors of the same signature, so-called "repeaters," occur at distinct recurrence times. Further east, close to Istanbul, however, repeaters have not been observed, and the tectonic plates appear to be completely locked there. This leads to a build-up of tectonic energy and increases the probability of a large earthquake. Such observations were made possible by a new high-precision seismicity catalog for the region. For this purpose, the researchers thoroughly evaluated the earthquake activity by combining the two major Turkish earthquake measurement networks with data from the GFZ Plate Boundary Observatory, within the framework of a German-Turkish cooperation project. "In this way we have found recurring earthquakes below the western Marmara Sea," says Bohnhoff. "From this we deduce that below the western Marmara Sea the two tectonic plates are, for the most part (25-75%), moving slowly past each other, thus accumulating less energy than if they were completely locked." And what would happen if the feared strong earthquake did occur below the western Marmara Sea? "In such a case there would likewise be good news and bad news," says Bohnhoff. The good news would be a somewhat longer early warning period; the bad news, that the rupture would then propagate in the direction of Istanbul, resulting in more severe ground shaking than if the origin were further east. However, the current data suggest the opposite: an earthquake with an epicenter at the gates of the city, which would allow people only very little time to find protection, but which would trigger less powerful ground movements. | Earthquakes | 2,017
May 16, 2017 | https://www.sciencedaily.com/releases/2017/05/170516091139.htm | Porewater salinity: Key to reconstructing 250,000 years of Lake Van’s history | The sediments of Lake Van in Eastern Anatolia (Turkey) are a valuable climate archive. Now, using the salinity measured in sediment porewater, scientists have reconstructed the huge lake-level fluctuations that occurred over the past 250,000 years. This approach -- based on simple physical concepts -- is likely to be more widely applied in the future. | In 2010, an international research team collected sediment cores from the bottom of Lake Van in Eastern Anatolia. Since then, scientists at Eawag and the Universities of Bern, Bonn and Istanbul have used core samples to reconstruct 600,000 years of climatic and environmental history. The sediments of Lake Van -- a waterbody seven times the size of Lake Constance, with no outflows, remaining ice-free even during ice ages -- provide a record not only of seasonal changes but also of volcanic eruptions, earthquakes, and extended glacial and interglacial (warm) periods, as well as other environmental data. Lake Van is the world's largest soda lake. With a salinity (salt concentration) of approximately 23 grams per litre and a pH of 10, its water can be (and is, by local communities) used directly for washing. Further light has now been shed on the lake's history by a group of scientists led by Eawag and the University of Bern. Based on variations in the salinity of porewater extracted from sediment cores, major changes in lake level over the past 250,000 years have been reconstructed -- ranging from 200 metres below to 105 metres above the current lake level. In the study, published in Nature's Scientific Reports, the authors explain the essentially simple principle underlying their analysis: as the amount of salt dissolved in the lake, in absolute terms, remains constant, the salinity of the lake water is inversely proportional to the volume of water in the lake basin. Because the morphometry of the basin is known, once the water volume has been calculated, the lake level can also be determined. Two major lake-level increases (248,000 and 135,000 years ago) and one major decrease (30,000 years ago) were thus identified. Earlier lake-level changes could not be determined, as the measurements in older sediments reflect the long-term steady-state lake salinity. The lake levels estimated using porewater salinity are in agreement with other indications, such as lake terraces still visible high above the current lake surface, or erosive channels now submerged. Previously it was not possible for these structures to be precisely dated, as the lake lies within a highly active tectonic zone -- at the junction of the Eurasian, Afro-Arabian and Persian plates. When the lake was at its highest level, there must have been an outflow into the Tigris, in the southwestern part of the basin. As salt was thus exported, salinity must have declined sharply -- a finding supported by other evidence, e.g. the occurrence of unbroken shells of freshwater mussels in lake terraces. As the water balance of Lake Van is mainly controlled by river discharge and evaporation, the lake-level reconstruction also provides insights into past precipitation regimes in the lake catchment. Not far from the lake is Mount Ararat, where Noah's Ark is supposed to have come to rest after the biblical flood. First author Yama Tomonaga comments: "We cannot, of course, identify individual wet years or decades, but changes that lasted for ten thousand years are evident." Unfortunately, Tomonaga adds, no conclusions can be drawn from the study as to whether the lake level is likely to rise or fall in the future. The authorities in Van would welcome a projection, as the lake level has risen by around two metres since 1960. However, Tomonaga emphasizes the fascinating prospect that the straightforward methodology and the model developed for this study could also be applied to other landlocked salt lakes, such as the Caspian Sea. Indeed, if the ocean is regarded as the Earth's largest closed-basin system, the model offers a completely new approach for estimating -- at least in an order-of-magnitude approximation -- the oceanic salinity balance and the average global sedimentation rate. | Earthquakes | 2,017
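The mass-balance principle in the Lake Van study lends itself to a direct computation: with total dissolved salt fixed, a past salinity gives a past volume, and the basin's morphometry converts volume to level. The sketch below follows that logic but uses an invented, idealized volume and morphometry table -- the real study used Lake Van's measured basin shape:

```python
# Salinity -> volume -> lake level, following the paper's mass-balance logic:
# total salt M is constant, so V_past = M / S_past. Present volume and the
# level-vs-volume table below are INVENTED for illustration; the study used
# Lake Van's real measured morphometry.

S_TODAY = 23.0                 # g/L, present salinity (quoted in the article)
V_TODAY = 607.0                # km^3, present volume (illustrative placeholder)
TOTAL_SALT = S_TODAY * V_TODAY # constant through time (g * km^3 / L units)

# Idealized morphometry: (lake level in m relative to today, volume in km^3).
morphometry = [(-200.0, 150.0), (-100.0, 350.0), (0.0, 607.0), (105.0, 1000.0)]

def level_from_salinity(s_past):
    """Interpolate lake level from the volume implied by a past salinity."""
    v = TOTAL_SALT / s_past
    for (l0, v0), (l1, v1) in zip(morphometry, morphometry[1:]):
        if v0 <= v <= v1:
            return l0 + (l1 - l0) * (v - v0) / (v1 - v0)
    raise ValueError("volume outside tabulated morphometry")

for s in (90.0, 23.0, 14.0):  # saltier porewater => smaller, lower lake
    print(f"porewater salinity {s:5.1f} g/L -> level {level_from_salinity(s):+7.1f} m vs today")
```

Run with these made-up numbers, high salinities land near the -200 m end and low salinities near +105 m, bracketing the range the study actually reconstructed.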
May 12, 2017 | https://www.sciencedaily.com/releases/2017/05/170512093940.htm | What goes down, must come up: Stirring things up in the Earth's mantle | New insights into the convection patterns of the Earth's mantle and its chemical makeup have been revealed by a researcher from the University of Leicester. | The new findings suggest that the mantle does not flow ubiquitously, as has previously been thought -- and that it is instead divided into two very large domains that convect only within themselves, with little evidence of them mixing together. The research, led by Dr Tiffany Barry from the University of Leicester's Department of Geology and published in the journal Scientific Reports, suggests that upper mantle material flows to lower parts of the mantle when it reaches a subduction zone, where one tectonic plate descends beneath another. This descending slab of material acts as a sort of curtain, preventing upper mantle material from mixing all the way around the globe and keeping the two domains separate. Dr Barry explained: "One of the ways our planet is unique is in the amazing way it has mobile plates at the surface that move and jostle about over time. This movement of the plates results in the process we call plate tectonics, and no other planet we know shows evidence of this process. Why or how plate tectonics started on this planet is not understood, but it has been utterly essential in the production of the crust and oceans that we recognise as Earth today. What is also not well constrained is what effect plate tectonics has on the internal workings of the Earth." "We have found that when mantle material reaches the bottom of the mantle, at the outer core, it does not spread out and go anywhere around the core, but instead returns to the same hemisphere of the globe from where it came. We have modelled this dominantly up-down motion of convection and found that it can persist for 100s of millions of years." "On the basis of past plate motions and geochemical evidence, we speculate that this process of mantle convection could have been a dominant process since at least 550 million years ago, and potentially since the start of plate tectonics." The researchers combined spherical numerical computer models (3D finite element modelling) with the best available reconstructions of how Earth's plates have moved over the past 200 million years to track mathematical particles placed at different depths of the modelled mantle. With these models they examined where the mantle freely moves to during the history of plates moving around at the surface. Having tracked where particles flow in the models, the team then examined chemical isotope evidence from past ocean basins, which are a good analogy for the composition of the upper mantle in the past. With these data they were able to test whether former ocean basins, which are no longer present, had the same or a different composition to subsequent basins that formed geographically in the same region of the globe. Dr Barry added: "I'm incredibly excited by this work; it has been a research question I've been pondering for nearly two decades. It feels like a real privilege to have been able to piece together a robust and convincing model that can explain the feature of the chemical differences in ocean floor crust." "This new research overturns our understanding of how the inside of the earth convects and stirs, and how it is divided up, and for the first time explains observations that were first noted in the late 1980s." | Earthquakes | 2,017
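"Tracking mathematical particles" is a standard technique: passive tracers are advected through the model's velocity field and their paths recorded. The article does not give the team's implementation, so the sketch below is a generic 2-D toy with a made-up convection-cell velocity field, not the study's 3-D finite-element flow:

```python
# Generic passive-tracer advection, the technique behind "tracking mathematical
# particles" in a mantle-flow model. The velocity field is a made-up 2-D
# convection cell on the unit square, not the study's actual flow.
import math

def velocity(x, z):
    """Idealized, divergence-free convection cell (x: lateral, z: depth)."""
    vx = math.sin(math.pi * x) * math.cos(math.pi * z)
    vz = -math.cos(math.pi * x) * math.sin(math.pi * z)
    return vx, vz

def advect(x, z, dt=1e-3, steps=5000):
    """Forward-Euler tracer path; returns positions sampled along the way."""
    path = [(x, z)]
    for _ in range(steps):
        vx, vz = velocity(x, z)
        x, z = x + dt * vx, z + dt * vz
        path.append((x, z))
    return path

# Seed tracers at different depths, as the study did in its modelled mantle.
for z0 in (0.2, 0.5, 0.8):
    x_end, z_end = advect(0.25, z0)[-1]
    print(f"tracer from depth {z0:.1f}: ends at x={x_end:.2f}, z={z_end:.2f}")
```

In this toy, every tracer circulates within its own cell and never crosses into a neighbouring one -- a miniature version of the "domains that convect only within themselves" result, with the descending limb of the cell playing the role of the slab curtain.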
May 2, 2017 | https://www.sciencedaily.com/releases/2017/05/170502142403.htm | Wastewater injection rates may have been key to Oklahoma's largest earthquake | Changes to the rate of wastewater injection in disposal wells may have contributed to conditions that led to last year's Pawnee earthquake in Oklahoma, according to a new report published May 3 as part of a focus section in | The magnitude 5.8 Pawnee quake, felt widely across Oklahoma, is the largest earthquake recorded in the state since the 1950s. Most Oklahoma earthquakes since 2009 are thought to have been triggered by wastewater produced by oil and gas drilling that has been injected back into the ground. The Pawnee earthquake occurred in a region with active wastewater disposal wells, and is potentially the largest such induced earthquake to have occurred in Oklahoma so far, write University of Oklahoma seismologists Xiaowei Chen and Norimitsu Nakata in their preface to the section. When Andrew Barbour of the U.S. Geological Survey and colleagues examined new injection data from nearby disposal wells in Osage County, they found a significant increase in injection rates in the years leading up to the Pawnee mainshock. Some wells injected wastewater at a constant rate, while others injected the water at a variable rate; the overall injected volume was roughly the same between these two types of wells. Barbour and colleagues' models of injection indicate, however, that it may have been the variable-rate wells that were most important for the Pawnee event. Their findings suggest that "long-term injection may have been responsible for a gradual loading of the fault to the point where it primed the fault for failure triggered by the short-term high-rate injection..." the authors write. They note that in the absence of these variable-rate injections, however, the fault may still have failed at a much later time. | Earthquakes | 2,017
May 2, 2017 | https://www.sciencedaily.com/releases/2017/05/170502121249.htm | Geologists use radioactive clock to document longest earthquake record | Using radioactive elements trapped in crystallized, cream-colored "veins" in New Mexican rock, geologists have peered back in time more than 400,000 years to illuminate a record of earthquakes along the Loma Blanca fault in the Rio Grande rift. | It is the longest record of earthquakes ever documented on a fault, showing 13 distinct seismic events -- nine of which occurred at regular intervals averaging 40,000 to 50,000 years, and four that clustered together just 5,000 to 11,000 years apart. The work was described in a study published last week. "We weren't expecting any of this," Goodwin says. "It's been quite the odyssey for us." The researchers initially set out to better understand the background, or default, earthquake activity along the 14-mile-long Loma Blanca fault, south of Albuquerque. An intraplate fault like this generally produces earthquakes much less frequently than those at the boundaries of tectonic plates, like California's San Andreas Fault, and tends to be less well understood by geologists. However, some intraplate faults have experienced increased seismicity in recent years, likely due to deep injection wells used for wastewater disposal. In places like Texas, Oklahoma and Ohio, Williams explains, earthquake activity has been linked to high-pressure wastewater injected far beneath Earth's surface. In part to better understand these human-driven events, the researchers wanted to get a handle on what mechanical factors control natural earthquakes and how often a given fault slips to cause one. "We can't predict an exact date for when they will occur, and it's unlikely that we ever will," Goodwin explains, "but we want to understand what is driving them so we can better prepare." To look for answers, Williams began to examine "veins" made of the mineral calcite that streak segments of rock along the fault. Calcite precipitates out of pressurized fluids that travel through rock near faults during some earthquakes and gets deposited in layers, like the rings of a tree. During subsequent earthquakes, the calcite fractures and heals, leaving a distinct signature like old broken bones. Williams looked at the radioactive elements uranium and thorium trapped in these calcite crystals, using them as a kind of clock based on the rate at which uranium decays into thorium. He and the rest of the research team, which includes Warren Sharp from the Berkeley Geochronology Center and Peter Mozley of the New Mexico Institute of Mining and Technology, could measure the age of each "generation" of calcite found in the veins and determine when earthquakes occurred relative to one another. The magnitude of these earthquakes at Loma Blanca has been estimated to be between 6.2 and 6.9, by analogy with more recent events. The team showed that earthquakes on the fault were controlled by two different processes. Earthquakes that occurred at regular intervals were the result of accumulated stress that eventually caused the fault to fail every 40,000 years or so, but were not driven by fluids. However, the unusual cluster of earthquakes that occurred roughly 430,000 years ago was the result of an increase in fluid pressure deep beneath the surface. Increases in fluid pressure in Earth can decrease the friction between the two sides of a fault, leading to easier sliding -- like the pressurized air on an air hockey table. The team also showed that calcite associated with two seismic events in the earthquake cluster indicates very rapid carbon dioxide degassing, which can occur when fluid under high pressure is released -- like opening the top of a shaken bottle of soda. "When pore pressure increases far enough over the background level, the fault fails and cracks form, releasing fluid from the basin," Goodwin explains. Williams says that injected wastewater is likely to increase pressure at a faster rate than most faults have experienced in the geologic past. The findings also help contribute to a longstanding question in geology regarding the mechanics of earthquakes in intraplate faults and whether they occur periodically or randomly in time. Today, Williams is working to improve the methods the team used. "We want to understand how the calcite-dating method can be used to contribute to documenting the seismic record of other faults," he says. | Earthquakes | 2,017
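One standard way to make "regular intervals" versus "clusters" quantitative is the coefficient of variation (CV) of inter-event times: CV well below 1 indicates quasi-periodic behavior, CV near or above 1 indicates clustered or random behavior. The interval lists below are illustrative values spanning the ranges quoted above, not the study's actual dated intervals:

```python
# Coefficient of variation (CV) of inter-event times: a common way to separate
# quasi-periodic recurrence (CV << 1) from clustered behavior (CV >= 1). The
# interval lists are ILLUSTRATIVE, matching the ranges quoted in the article,
# not the actual dated intervals from the study.
from statistics import mean, stdev

def cv(intervals):
    """Coefficient of variation: standard deviation over mean."""
    return stdev(intervals) / mean(intervals)

regular_kyr = [40, 45, 50, 42, 48, 44, 46, 41]  # ~40-50 kyr spacing
clustered_kyr = [5, 11, 7, 9, 380]              # tight cluster + long gap

for name, ivals in [("regular sequence", regular_kyr),
                    ("clustered sequence", clustered_kyr)]:
    print(f"{name}: mean {mean(ivals):.0f} kyr, CV = {cv(ivals):.2f}")
```

The regular sequence comes out with a CV near 0.1, the clustered one near 2 -- the same periodic-versus-clustered distinction the study attributes to stress-driven versus fluid-driven failure.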
May 1, 2017 | https://www.sciencedaily.com/releases/2017/05/170501131647.htm | Earthquakes can make thrust faults open violently and snap shut | It is a common trope in disaster movies: an earthquake strikes, causing the ground to rip open and swallow people and cars whole. The gaping earth might make for cinematic drama, but earthquake scientists have long held that it does not happen. | Except, it can, according to new experimental research from Caltech. The work appears in the journal Nature. Thrust faults have been the site of some of the world's largest quakes, such as the 2011 Tohoku earthquake off the coast of Japan, which damaged the Fukushima nuclear power plant. They occur in weak areas of the earth's crust where one slab of rock compresses against another, sliding up and over it during an earthquake. A team of engineers and scientists from Caltech and École normale supérieure (ENS) in Paris has discovered that fast ruptures propagating up toward the earth's surface along a thrust fault can cause one side of the fault to twist away from the other, opening up a gap of up to a few meters that then snaps shut. Thrust fault earthquakes generally occur when two slabs of rock press against one another, and pressure overcomes the friction holding them in place. It has long been assumed that, at shallow depths, the plates would just slide against one another for a short distance, without opening. However, researchers investigating the Tohoku earthquake found that not only did the fault slip at shallow depths, it did so by up to 50 meters in some places. That huge motion, which occurred just offshore, triggered a tsunami that caused damage to facilities along the coast of Japan, including at the Fukushima Daiichi Nuclear Power Plant. In the Nature paper, the team hypothesizes that the Tohoku earthquake rupture propagated up the fault and -- once it neared the surface -- caused one slab of rock to twist away from the other, opening a gap and momentarily removing any friction between the two walls. This allowed the fault to slip 50 meters. That opening of the fault was supposed to be impossible. "This is actually built into most computer models of earthquakes right now. The models have been programmed in a way that dictates that the walls of the fault cannot separate from one another," says Ares Rosakis, Theodore von Kármán Professor of Aeronautics and Mechanical Engineering at Caltech and one of the senior authors of the Nature paper. "The findings demonstrate the value of experimentation and observation. Computer models can only be as realistic as their built-in assumptions allow them to be." The international team discovered the twisting phenomenon by simulating an earthquake in a Caltech facility that has been unofficially dubbed the "Seismological Wind Tunnel." The facility started as a collaboration between Rosakis, an engineer studying how materials fail, and Hiroo Kanamori, a seismologist exploring the physics of earthquakes and a coauthor of the Nature study. "The Caltech research environment helped us a great deal to have close collaboration across different scientific disciplines," Kanamori said. "We seismologists have benefited a great deal from collaboration with Professor Rosakis's group, because it is often very difficult to perform experiments to test our ideas in seismology." At the facility, researchers use advanced high-speed optical diagnostics to study how earthquake ruptures occur. To simulate a thrust fault earthquake in the lab, the researchers first cut in half a transparent block of plastic that has mechanical properties similar to those of rock. They then put the broken pieces back together under pressure, simulating the tectonic load of a fault line. Next, they place a small nickel-chromium wire fuse at the location where they want the epicenter of the quake to be. When they set off the fuse, the friction at the fuse's location is reduced, allowing a very fast rupture to propagate up the miniature fault. The material is photoelastic, meaning that it visually shows -- through light interference in the clear material -- the propagation of stress waves. The simulated quake is recorded using high-speed cameras, and the resulting motion is captured by laser velocimeters (particle speed sensors). "This is a great example of collaboration between seismologists, tectonicists and engineers. And not to put too fine a point on it, US/French collaboration," says Harsha Bhat, coauthor of the paper and a research scientist at ENS. Bhat was previously a postdoctoral researcher at Caltech. The team was surprised to see that, as the rupture hit the surface, the fault twisted open and then snapped shut. Subsequent computer simulations -- with models that were modified to remove the artificial rules against the fault opening -- confirmed what the team observed experimentally: one slab can twist violently away from the other. This can happen both on land and on underwater thrust faults, meaning that this mechanism has the potential to change our understanding of how tsunamis are generated. | Earthquakes | 2,017
April 27, 2017 | https://www.sciencedaily.com/releases/2017/04/170427112226.htm | Winemakers lose billions of dollars every year due to natural disasters | Every year, the worldwide wine industry suffers losses of more than ten billion US dollars from damaged assets, production losses, and lost profits due to extreme weather events and natural disasters. A multidisciplinary European-Australian team of researchers led by Dr. James Daniell of Karlsruhe Institute of Technology (KIT) examines the extent to which regions are affected by these risks and how climate change influences the wine industry. At the 2017 Annual Conference of the European Geosciences Union (EGU) in Vienna, Daniell presented a global risk index for wine regions. | The wine regions of Mendoza and San Juan in Argentina are exposed to the highest risks due to extreme weather and natural hazards worldwide. Kakheti and Racha in Georgia come in at number 2, followed by Southern Cahul in Moldova (number 3), Northwest Slovenia (number 4), and Yaruqui in Ecuador and Nagano in Japan (number 5). These are the first results of a current worldwide study and the first release of the global risk index for wine regions, presented by the head of the study, Dr. James Daniell, of the Geophysical Institute (GPI) and the Center for Disaster Management and Risk Reduction Technology (CEDIM) of KIT, at the 2017 Annual Conference of the European Geosciences Union (EGU) in Vienna, in the session "Natural hazard event analyses for risk reduction and adaptation." The EGU honored Daniell by granting him the "Early Career Scientist Award in Natural Hazards" for 2017. The study is carried out, and the index developed, in cooperation with seismologists, meteorologists, and representatives of other disciplines from KIT, Australian National University, University of Adelaide, Griffith University, University of New South Wales, and University College London, as well as Risklayer GmbH, a company located in Karlsruhe. The "WineRisk" website summarizes the results of the study and presents solutions for wine regions. The study covers more than 7,500 wine regions in 131 countries. There is no wine region in the world that is not exposed to extreme weather or natural disasters. Events such as frost, hail, floods, heat, drought, forest fires, and bushfires, as well as earthquakes, cause the worldwide wine industry to lose more than 10 billion US$ every year, according to conservative estimates. These losses result from damaged assets, losses of production, and lost profit. "Cold waves and frost have a large impact," James Daniell says. In the last few days, widespread frost occurred across Europe, with Slovakia, Bosnia, Serbia, Hungary, Austria, and the Czech Republic suffering the worst impacts. Hailstorms are one of the largest yearly natural threats to European winemakers. Traditional wine countries like France and Italy have seen huge losses in the past five years due to hail and frost, with many losses recorded in the regions of Burgundy and Piedmont. The hail losses from 2012 to 2016 in some vineyards totaled 50 to 90 percent of the value of the crop and caused long-term damage to many old vines. It is not just Europe that is affected by hail. All over the world, winegrowing regions are affected by at least one hail event per year, which can cause damage to a single vintage or to multiple vintages, depending on the growth phase of the vines. According to James Daniell, hail nets can save the crop in most cases, given a large hail event. "Cost-benefit analyses generally show that the premium wines should be the ones covered by hail nets, with insurance or other cheaper methods used for other wines." Earthquakes have the ability to knock out the infrastructure of entire wine regions for a number of years. In past years, earthquakes struck Chile, New Zealand, and the USA, among other smaller events causing damage around the world. Over 125 million liters of wine were lost in Chile in 2010, mainly due to the failure of steel tanks. "Earthquake-resistant design could have saved many millions of liters," Daniell says. Earthquakes also cause large losses to buildings, tanks, barrels, equipment, and chemicals. Even small earthquakes cause not only financial loss but also historical loss, by destroying tasting rooms and rare wine collections. A few dollars invested in stabilization mechanisms, such as quake wax, zip ties or bolts, can often save millions of dollars in losses. In addition, natural disasters are associated with losses of jobs and tourism. Global climate change will have both positive and negative effects on the wine industry, according to the study. The researchers expect a general shift of wine-growing regions towards higher latitudes -- northward in the Northern Hemisphere and southward in the Southern -- while some wine regions closer to the equator may be lost. Many wines may indeed improve. "The English, Canadian, and Northern China wine regions will likely increase production markedly and continue to improve their market share and quality of production," predicts Dr. Daniell. The scientists expect that many wineries will master climate changes by changing grape varieties or harvest times. In addition, they will profit from new grape strains, innovative technologies to optimize production and reduce damage due to biological pathogens and insects, and new methods to overcome extreme weather events. The study also covers problems such as bushfires causing smoke taint in grapes. However, smaller-scale studies are required before the results can be included globally in the index. In addition, the effects of floods on vines are being explored. Nevertheless, a major volcanic eruption would likely cause the largest global impact on the wine industry, examples being the Laki eruption of 1783/84 or the Tambora eruption in 1815, which caused the famous "year without a summer" in 1816. Atmospheric changes, lack of sunlight, and global transport problems could cause major issues for the wine industry, although wider food-security problems would likely be more important. Despite all these hazards, the wine industry continues to grow and diversify. "Through detailed natural hazard analysis, research can help winemakers and governments alike to prepare adequately for the natural hazards that they face and to reduce losses," Dr. James Daniell says. The Australian-born geophysicist also developed the CATDAT database covering socioeconomic data on natural disasters. Last year, he published CATDAT statistics, according to which 8 million people have died and over 7 trillion US$ of losses have been caused by natural disasters since 1900. | Earthquakes | 2,017
April 26, 2017 | https://www.sciencedaily.com/releases/2017/04/170426164703.htm | Tsunami formation: Study challenges long-held theory | A new NASA study is challenging a long-held theory that tsunamis form and acquire their energy mostly from vertical movement of the seafloor. | It is undisputed that most tsunamis result from a massive shifting of the seafloor -- usually from the subduction, or sliding, of one tectonic plate under another during an earthquake. Experiments conducted in wave tanks in the 1970s demonstrated that vertical uplift of the tank bottom could generate tsunami-like waves. In the following decade, Japanese scientists simulated horizontal seafloor displacements in a wave tank and observed that the resulting energy was negligible. This led to the current widely held view that vertical movement of the seafloor is the primary factor in tsunami generation. In 2007, Tony Song, an oceanographer at NASA's Jet Propulsion Laboratory in Pasadena, California, cast doubt on that theory after analyzing the powerful 2004 Sumatra earthquake in the Indian Ocean. Seismograph and GPS data showed that the vertical uplift of the seafloor did not produce enough energy to create a tsunami that powerful. But formulations by Song and his colleagues showed that once energy from the horizontal movement of the seafloor was factored in, all of the tsunami's energy was accounted for. Those results matched tsunami data collected from a trio of satellites -- the NASA/Centre National d'Etudes Spatiales (CNES) Jason, the U.S. Navy's Geosat Follow-on, and the European Space Agency's Environmental Satellite. Further research by Song on the 2004 Sumatra earthquake, using satellite data from the NASA/German Aerospace Center Gravity Recovery and Climate Experiment (GRACE) mission, also backed up his claim that the amount of energy created by the vertical uplift of the seafloor alone was insufficient for a tsunami of that size. "I had all this evidence that contradicted the conventional theory, but I needed more proof," Song said. His search for more proof rested on physics -- namely, the fact that horizontal seafloor movement creates kinetic energy, which is proportional to the depth of the ocean and the speed of the seafloor's movement. After critically evaluating the wave tank experiments of the 1980s, Song found that the tanks used did not accurately represent either of these two variables. They were too shallow to reproduce the actual ratio between ocean depth and seafloor movement that exists in a tsunami, and the wall in the tank that simulated the horizontal seafloor movement moved too slowly to replicate the actual speed at which a tectonic plate moves during an earthquake. "I began to consider that those two misrepresentations were responsible for the long-accepted but misleading conclusion that horizontal movement produces only a small amount of kinetic energy," Song said. To put his theory to the test, Song and researchers from Oregon State University in Corvallis simulated the 2004 Sumatra and 2011 Tohoku earthquakes at the university's Wave Research Laboratory, using both direct measurements and satellite observations as references. Like the experiments of the 1980s, they mimicked horizontal land displacement in two different tanks by moving a vertical wall in the tank against water, but they used a piston-powered wave maker capable of generating faster speeds. 
They also better reproduced the ratio of water depth to the amount of horizontal displacement found in actual tsunamis. The new experiments illustrated that horizontal seafloor displacement contributed more than half the energy that generated the 2004 and 2011 tsunamis. "From this study, we've demonstrated that we need to look at not only the vertical but also the horizontal movement of the seafloor to derive the total energy transferred to the ocean and predict a tsunami," said Solomon Yim, a professor of civil and construction engineering at Oregon State University and a co-author on the study. The finding further validates an approach developed by Song and his colleagues that uses GPS technology to detect a tsunami's size and strength for early warnings. The JPL-managed Global Differential Global Positioning System (GDGPS) is a very accurate real-time GPS processing system that can measure seafloor movement during an earthquake. As the land shifts, ground receiver stations nearer to the epicenter also shift. The stations can detect their movement every second through real-time communication with a constellation of satellites to estimate the amount and direction of horizontal and vertical land displacement that took place in the ocean. The researchers developed computer models that combine those data with ocean-floor topography and other information to calculate the size and direction of a tsunami. "By identifying the important role of the horizontal motion of the seafloor, our GPS approach directly estimates the energy transferred by an earthquake to the ocean," Song said. "Our goal is to detect a tsunami's size before it even forms, for early warnings." The study is published in the Journal of Geophysical Research: Oceans. | Earthquakes | 2,017
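The scaling Song describes -- kinetic energy growing with water depth and seafloor speed -- can be illustrated with a back-of-the-envelope estimate. The sketch below assumes, simplistically, that the whole water column moves with the seafloor; the depth, speed, and rupture area are illustrative values, not numbers from the study.

```python
# Kinetic energy per unit area of a water column of depth h moving at
# speed v is (1/2)*rho*h*v^2; integrate over the rupture area.
# All inputs below are illustrative assumptions.

RHO_SEAWATER = 1025.0  # kg/m^3

def horizontal_kinetic_energy(depth_m, speed_m_s, area_m2):
    return 0.5 * RHO_SEAWATER * depth_m * speed_m_s**2 * area_m2

# A deep-ocean patch: 4 km of water over a 100 km x 50 km rupture,
# with the seafloor lurching sideways at 1 m/s.
energy = horizontal_kinetic_energy(depth_m=4000.0, speed_m_s=1.0, area_m2=100e3 * 50e3)
print(f"Kinetic energy ~ {energy:.2e} J")  # ~1e16 J for these inputs
```

The linear dependence on depth is why shallow wave tanks with slow-moving walls understated the contribution of horizontal motion.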
April 26, 2017 | https://www.sciencedaily.com/releases/2017/04/170426131003.htm | Hard rocks from Himalaya raise flood risk for millions | Scientists have shown how earthquakes and storms in the Himalaya can increase the impact of deadly floods in one of Earth's most densely populated areas. | Large volumes of hard rock dumped into rivers by landslides can increase flood risk up to hundreds of kilometres downstream, potentially affecting millions of people, researchers say. The findings could help researchers improve flood risk maps for the Ganga Plain, a low-lying region covering parts of India, Nepal and Pakistan. They could also provide fresh insight into the long-term impacts of earthquakes and storms in the region. Until now, little was known about how landslides in the Himalaya could affect flood risk downstream on the Ganga Plain. For the first time, scientists at the University of Edinburgh have traced the path of rocks washed down from the Himalayan mountains onto the Plain. They found that large landslides in the southern, lower-elevation ranges of the Himalaya are more likely to increase flood risk than those in the high mountains further north. Rocks in the south are extremely hard and travel only a short distance -- less than 20 km -- to reach the Plain. This means much of this rock -- such as quartzite -- reaches the Plain as gravel or pebbles, which can build up in rivers, altering the natural path of the water, the team says. Rocks from more northerly regions of the Himalaya tend to be softer, and the team found they often travel at least 100 km to reach the Plain. These types of rock -- including limestone and gneiss -- are gradually broken down into sand which, unlike gravel and pebbles, is dispersed widely as it travels downstream. Understanding whether landslides will produce vast quantities of gravel or sand is crucial for predicting how rivers on the Ganga Plain will be affected, researchers say. The study is published in the journal Nature. Elizabeth Dingle, a PhD student in the University of Edinburgh's School of GeoSciences, who led the study, said: "Our findings help to explain how events in the Himalaya can have drastic effects on rivers downstream and on the people who live there. Knowing where landslides take place in the mountains could help us better predict whether or not large deposits of gravel will reach the Ganga Plain and increase flood risk." | Earthquakes | 2,017
April 26, 2017 | https://www.sciencedaily.com/releases/2017/04/170426092332.htm | New model could help predict major earthquakes | A Nagoya University-led team reveals the mechanisms behind different earthquakes at a plate boundary on the west coast of South America, shedding light on historical seismic events and potentially aiding prediction of the future risk from these natural disasters. | When tectonic plates that have been sliding past each other get stuck, a huge amount of energy builds up and is eventually released in the form of an earthquake. Although much is known about the mechanisms behind this process, more needs to be understood about what happens at particular plate boundaries to determine the risk of earthquakes and tsunamis at specific sites and potentially to predict when these events might occur. In a breakthrough in this field, researchers at Nagoya University and their colleagues in South America have studied several earthquakes that occurred at the Ecuador-Colombia subduction zone over the last hundred years, revealing the relationships between different earthquakes and the size and location of the ruptures at plate boundaries that caused them. The findings were published in Geophysical Research Letters. The team used a combination of data sources and models to study large earthquakes that struck the west coast of South America in 1906, 1942, 1958, 1979, and 2016. These included information on tsunami waveforms recorded at sites across the Pacific, data on seismic waves obtained by monitoring stations in Ecuador and Colombia, and previous work on the intensity of coupling, or locking together, of adjacent plates and the distance that they slipped past each other to cause each earthquake. "The Ecuador-Colombia subduction zone, where the Nazca plate passes underneath the South American plate, is particularly interesting because of the frequency of large earthquakes there," says study author Hiroyuki Kumagai of the Graduate School of Environmental Studies, Nagoya University. "It's also a good site to investigate whether the ruptures at plate boundaries causing huge earthquakes are linked to subsequent large earthquakes years or decades later." By carefully modeling the fault area where these earthquakes arose, in combination with the other data, the team showed that the strongest of the earthquakes, that of 1906, involved a rupture at a different site than the other earthquakes. They also used data on the known speed at which the plates are moving past each other and the simulated "slip" of a plate associated with the 2016 earthquake to show that the 1942 and 2016 earthquakes were triggered by ruptures at the same site. "Now that we can precisely link previous earthquakes to ruptures at specific sites along plate boundaries, we can gauge the risks associated with the build-up of pressure at these sites and the likely frequency of earthquakes there," lead author Masahiro Yoshimoto says. "Our data also reveal for the first time differences in rupture mechanisms between oceanic trenches and deeper coastal regions in this subduction zone." The findings provide a foundation for risk prediction tools to assess the likelihood of earthquakes and tsunamis striking this region and their potential periodicity and intensity. | Earthquakes | 2,017
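The slip-budget bookkeeping behind such comparisons can be sketched in a few lines: multiply the plate convergence rate by the degree of coupling and the time since the last rupture to estimate the slip deficit available for the next earthquake. The convergence rate, coupling value, and interval below are rough illustrative assumptions, not the paper's numbers.

```python
# Slip deficit stored on a locked patch between earthquakes.
# All inputs are rough illustrative assumptions.

def accumulated_slip_deficit(convergence_mm_per_yr, coupling, years):
    return convergence_mm_per_yr * 1e-3 * coupling * years  # meters

# Nazca-South America convergence is on the order of 55 mm/yr; suppose
# the patch is 80% coupled over the 74 years between 1942 and 2016.
deficit = accumulated_slip_deficit(convergence_mm_per_yr=55.0, coupling=0.8, years=74)
print(f"Stored slip deficit: {deficit:.1f} m")  # ~3.3 m available for coseismic slip
```

Comparing such a stored deficit with the slip actually released in an earthquake is one way repeated ruptures of the same patch, like 1942 and 2016, can be recognized.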
April 17, 2017 | https://www.sciencedaily.com/releases/2017/04/170417144928.htm | Lessons from Parkfield help predict continued fault movements after earthquakes | A new study shows that the San Andreas Fault continued to slip gradually for six to twelve years after the 2004 magnitude 6.0 Parkfield, California earthquake, raising the issue of continued damage to structures built across fault zones after damaging earthquakes. This long period of "afterslip" compares to just a year of afterslip for a similar magnitude quake in Napa, California in 2014, demonstrating large variation in fault behavior after earthquakes. | The findings were reported April 18 in the Bulletin of the Seismological Society of America (BSSA). Afterslip refers to the aseismic movement or "creeping" that takes place along a fault, including the trace of its surface rupture, after an earthquake. Some faults have long been known to creep aseismically between earthquakes, such as the part of the San Andreas Fault involved in the Parkfield quake, and these faults are thought to be more prone to afterslip. The fault involved in the Napa earthquake, although not known to be creeping between quakes, exhibited less afterslip. James Lienkaemper, an emeritus research geophysicist with the U.S. Geological Survey, said the urban Hayward Fault near San Francisco also experiences interseismic creep, and is located in a similar geological setting to the Parkfield fault. He and his colleague Forrest McFarland of San Francisco State University suggest in the BSSA paper that there could be significant afterslip for possibly up to a decade along the Hayward Fault after an expected magnitude 6.8 earthquake there. Prolonged afterslip could delay post-earthquake recovery by continuing to cause damage to critical infrastructure built across the fault, such as rapid transit and utilities, said Lienkaemper, who noted that better forecasts of afterslip "can be used to plan temporary and final repairs to rapid transit tracks, water, gas and data lines." "Other major faults of the San Francisco Bay Area, including Rodgers Creek, Northern Calaveras and Concord-Green Valley, also expect large earthquakes, and should expect significant afterslip, especially in locations where interseismic creep rates are high, and where faults cross deep sedimentary basins," Lienkaemper added. Lienkaemper and McFarland found that only about 74 percent of the predicted amount of afterslip for the Parkfield earthquake -- a maximum expected value of about 35 centimeters -- was complete a year after the earthquake. Their analysis of slip at the six-year mark indicated that afterslip would be completed everywhere on the ruptured fault between six and twelve years after the 2004 earthquake. The ends of the Parkfield fault were the most likely areas to experience prolonged slipping after the earthquake, Lienkaemper noted, suggesting that slip in these parts of the fault gradually increased as stress transferred from movement of the central section of the rupture. | Earthquakes | 2,017
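Afterslip is often described with a logarithmic decay of the form D(t) proportional to ln(1 + t/tau), as expected for rate-strengthening fault friction. The sketch below tunes the parameters so that roughly 74 percent of the slip budget is spent after one year and the rest by about eight years, loosely mimicking the Parkfield numbers; it illustrates the functional form only and is not the authors' actual model.

```python
import math

# Logarithmic afterslip, normalized so the budget is spent at T_DONE years.
# TAU_YR and T_DONE are tuned to mimic the reported Parkfield figures;
# they are illustrative choices, not fitted values from the paper.
TAU_YR = 0.0025   # decay timescale (~1 day)
T_DONE = 8.0      # treat afterslip as complete at 8 years (mid 6-12 yr range)

def fraction_complete(t_yr):
    return math.log(1 + t_yr / TAU_YR) / math.log(1 + T_DONE / TAU_YR)

for t in (0.25, 1, 2, 4, 6, 8):
    print(f"t = {t:>4} yr: {100 * fraction_complete(t):5.1f}% of the ~35 cm afterslip")
# t = 1 yr prints ~74%, matching the observation that a quarter of the
# afterslip remained incomplete a year after the mainshock.
```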
April 13, 2017 | https://www.sciencedaily.com/releases/2017/04/170413120048.htm | Forecasting large earthquakes along the Wasatch Front, Utah | There is a 43% probability that the Wasatch Front region in Utah will experience at least one magnitude 6.75 or greater earthquake, and a 57% probability of at least one earthquake of magnitude 6.0 or greater, in the next 50 years, say researchers speaking at the 2017 Seismological Society of America's (SSA) Annual Meeting. | In their report released in 2016, the Working Group on Utah Earthquake Probabilities, established by the Utah Geological Survey and the U.S. Geological Survey, presented their first forecast for large earthquakes along faults in the Wasatch region, running roughly from Nephi, Utah north to the Utah-Idaho border. (A map of the region is available from the USGS.) The Working Group's project is the first comprehensive study of large earthquake risk in the U.S. West outside of California. At the SSA Annual Meeting, Ivan Wong of Lettis Consultants International and colleagues will discuss the detailed forecast from the 2016 report, including the finding that at least 22 large earthquakes have ruptured parts of the Wasatch fault zone between Nephi and Brigham City, Utah in the past 6,000 years. The data also suggest that some segments of the fault may be more likely to rupture than others, based on the average time between earthquakes. For instance, the segment of the fault around Brigham City ruptures on average every 1,100 years, but has not experienced an earthquake in 2,500 years. | Earthquakes | 2,017
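For a sense of scale, a time-independent Poisson model turns a mean recurrence interval into a 50-year probability, as sketched below. The Working Group's published forecast rests on more elaborate, partly time-dependent models, so treat this only as an illustration of the arithmetic.

```python
import math

def poisson_prob(mean_recurrence_yr, window_yr=50.0):
    """P(at least one event in the window) for a Poisson process."""
    return 1.0 - math.exp(-window_yr / mean_recurrence_yr)

# Brigham City segment: one large quake per ~1,100 years on average.
print(f"50-yr probability: {100 * poisson_prob(1100):.1f}%")  # ~4.4%

# A Poisson model is memoryless, so it cannot express that the segment has
# gone 2,500 years without rupturing -- the reason time-dependent renewal
# models are used when the elapsed time matters.
```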
April 12, 2017 | https://www.sciencedaily.com/releases/2017/04/170412115758.htm | Developing a microinsurance plan for California earthquakes | Nine out of ten Californians are uninsured against earthquake risk, which could slow economic recovery in neighborhoods and cities around the state after a damaging quake. On-demand or use-based small insurance policies -- sometimes called microinsurance -- could help fill that financial gap, according to a presentation at the 2017 Seismological Society of America's (SSA) Annual Meeting. | Unlike traditional insurance policies, which tend to cover only structural damage and involve extensive documentation, a microinsurance policy would automatically pay a fixed sum to all of the insured who fall into a "payout zone" affected by the earthquake. At the SSA meeting, Kate Stillwell of Jumpstart Insurance Solutions Inc. will present a case study showing how data on seismic shaking intensity are being used to develop a microinsurance policy for California. Stillwell is a structural engineer who has worked with several organizations over the course of her career to determine standards for keeping buildings safe in the event of an earthquake. Her experience in this area, she said, led her to believe that "there are so many other pieces of the puzzle that we're going to need to recover and to be resilient in the event of an earthquake, and one of the big missing pieces is having enough money in the system." Payouts on a microinsurance policy could be used for anything from property damage, to lost wages from time off work, to vet bills for an injured pet, Stillwell noted. Stillwell will discuss a model for creating a payout zone defined by a combination of shaking intensities mapped over the area affected by an earthquake and census blocks, the smallest geographical unit used by the U.S. Census Bureau. A different type of microinsurance policy against earthquake risk has already been adopted in Turkey, said Stillwell, who noted that microinsurance has also been used to protect against crop damage in several parts of the world. The concept could be expanded to many kinds of "high-consequence, low-probability" events, she said, including hurricanes, volcanic eruptions, tsunamis and landslides. | Earthquakes | 2,017
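A parametric payout rule of the kind Stillwell describes is simple to express: every policyholder whose census block exceeds a shaking-intensity trigger receives the same fixed sum, with no claims adjustment. The block IDs, intensities, trigger, and payout amount below are all invented to illustrate the mechanics.

```python
# Schematic parametric payout: a fixed sum for every policy in a census
# block whose modeled shaking exceeds the trigger. All values invented.

TRIGGER_MMI = 7.0      # Modified Mercalli intensity trigger (assumed)
FIXED_PAYOUT = 10_000  # US$ per policy inside the payout zone (assumed)

# Census block -> modeled shaking from a post-event intensity map.
block_intensity = {"060855046011": 8.1, "060855046012": 6.4, "060855047001": 7.3}

policies = [("policy-001", "060855046011"),
            ("policy-002", "060855046012"),
            ("policy-003", "060855047001")]

for policy_id, block in policies:
    payout = FIXED_PAYOUT if block_intensity[block] >= TRIGGER_MMI else 0
    print(f"{policy_id}: block {block}, payout ${payout:,}")
```

Because eligibility is decided by the zone rather than by per-property damage assessment, payouts can in principle be issued within days of the event.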
April 12, 2017 | https://www.sciencedaily.com/releases/2017/04/170412111208.htm | Anticipating hazards from fracking-induced earthquakes in Canada and US | As hydraulic fracturing operations expand in Canada and in some parts of the United States, researchers at the 2017 Seismological Society of America's (SSA) Annual Meeting are taking a closer look at ways to minimize hazards from the earthquakes triggered by those operations. | Hydraulic fracturing, or fracking, is a method of hydrocarbon recovery that uses high-pressure injections of fluid to break apart rock and release trapped oil and natural gas. At the SSA Annual Meeting, experts will speak about the growing recognition that hydraulic fracturing can produce earthquakes of magnitude 3 and larger, acknowledging that this type of seismic activity is difficult to predict and may be difficult to stop once it begins. Most induced earthquakes in Canada have been linked to hydraulic fracturing, in contrast to induced earthquakes studied in the central and eastern United States. In the U.S., these earthquakes have been linked primarily to massive amounts of wastewater injected back into the ground after oil and gas recovery. However, some presentations at the SSA meeting will take a closer look at the possibilities for fracking earthquakes in the United States. Michael Brudzinski of Miami University and his colleagues will discuss their work to identify swarms of small-magnitude earthquakes in Ohio that appear to be correlated in time and space with hydraulic fracturing or wastewater disposal. Their work suggests that there are roughly three times more earthquake sequences of magnitude 2 or larger induced by hydraulic fracturing than by wastewater disposal in the area -- even though there are about 10 times more hydraulic fracturing wells than wastewater disposal wells. Their technique, they say, provides evidence of induced seismicity from hydraulic fracturing in Oklahoma, Arkansas, Pennsylvania, West Virginia and Texas as well. Zhenming Wang and colleagues are preparing for the onset of oil and gas exploration in the Rome Trough of eastern Kentucky, conducting a study of the natural background seismicity in the area to be able to better identify induced earthquakes if they occur. In their SSA presentation, they will also discuss how an area like eastern Kentucky might assess and prepare for ground shaking hazards from induced earthquakes, since the ruptures may occur on unmapped or "quiet" faults. In western Alberta and eastern British Columbia in Canada, a significant increase in the rate of felt earthquakes from hydraulic fracturing has researchers looking at ways to mitigate potential damage to infrastructure in the region. In her SSA presentation, Gail Atkinson of Western University will discuss the factors that affect the likelihood of damaging ground motion from fracking-induced earthquakes. Based on these factors, Atkinson proposes targeted "exclusion zones" with a radius of about five kilometers around critical infrastructure such as major dams. This would be combined with real-time monitoring to track the rate of seismic events of magnitude 2 or greater within 25 kilometers, with fracking operations adjusted to reduce this rate to less hazardous levels if needed. | Earthquakes | 2,017
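Pairing exclusion zones with real-time rate monitoring, as Atkinson proposes, amounts to a "traffic light" rule of the kind sketched below. The event counts and thresholds here are invented; operational traffic-light protocols set these case by case for each jurisdiction.

```python
# Toy traffic-light check: count recent M>=2 events within the monitoring
# radius and step operations down as the rate climbs. Thresholds invented.

def traffic_light(recent_mags, m_min=2.0, amber_count=5, red_count=15):
    n = sum(1 for m in recent_mags if m >= m_min)
    if n >= red_count:
        return "RED: suspend operations"
    if n >= amber_count:
        return "AMBER: reduce injection rates, increase monitoring"
    return "GREEN: continue with monitoring"

print(traffic_light([2.1, 2.4, 3.0, 2.2]))            # GREEN: 4 qualifying events
print(traffic_light([2.0, 2.1, 2.3, 2.6, 2.9, 3.3]))  # AMBER: 6 qualifying events
```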
April 12, 2017 | https://www.sciencedaily.com/releases/2017/04/170412111125.htm | Seismologists offer detailed look at New Zealand's Kaikoura earthquake | The magnitude 7.8 Kaikoura earthquake that struck the South Island of New Zealand last November was the largest on-land recorded earthquake in the country's history. In a special session at the 2017 Seismological Society of America's (SSA) Annual Meeting, researchers will gather to describe their findings on the quake and its implications for further seismic activity in the region. | Shaking from the earthquake lasted about two minutes and 20 seconds, and was felt in the New Zealand capital of Wellington, about 260 kilometers away from the quake's epicenter. Deformation of the sea floor off the coast of Kaikoura caused a tsunami that rose to about seven meters at its peak. The earthquake triggered about 80,000 landslides over an area of about 9,000 square kilometers, according to research by the U.S. Geological Survey, which hopes to use data collected on the Kaikoura quake to improve models of earthquake-generated landslides and ground failure. The Kaikoura quake is the fourth earthquake of magnitude 7 or larger to hit the region in the past seven years, which has seismologists suspecting that the earthquakes are connected in some way, write session chairs Bill Fry and Matt Gerstenberger of GNS Science. One of the most unusual aspects of the Kaikoura earthquake was the widespread slow slip events (SSEs) triggered by the main quake, say Gerstenberger and colleagues. Slow slip events are millimeter-scale movements that occur over hours or months in subduction zones, where one tectonic plate slides beneath another. SSEs have been detected in the New Zealand subduction zone before, but the Kaikoura earthquake appears to have sped up the rate of slip in some areas, produced simultaneous patches of slip in other areas, and triggered slip in regions where no slow slip events have previously been detected -- in this case, at a place where the far southern portion of a slice of oceanic crust called the Hikurangi Plateau is sliding underneath the Indo-Australian plate underlying New Zealand's North Island. Seismologists are carefully monitoring these slow slip events, as they may increase the likelihood of other large earthquakes on the southern portion of the North Island in the near future. Ground faulting during the earthquake was very complex, rupturing nine to 12 different faults with multiple orientations and overlaps and resulting in a combined rupture length of about 180 kilometers (nearly 112 miles), according to work by GNS Science's Nicola Litchfield and colleagues. About half of the faults had previously been identified as active in the 2010 New Zealand National Seismic Hazard Model, which had predicted a slightly smaller earthquake of magnitude 7.6 if a multi-fault rupture were to occur, say Mark Stirling of the University of Otago and colleagues. Evidence from past seismic activity in the area suggests that earthquakes occur on these faults at intervals ranging from 300 to 400 years to many thousands of years, indicating that the Kaikoura quake was a relatively rare event, Litchfield and colleagues note. The unusual faulting of the earthquake also has researchers studying whether the quake was primarily a megathrust event, meaning the kind of earthquake caused by one tectonic plate being forced under another at a subduction zone. 
However, the multiple surface-rupturing fault segments might suggest that the earthquake was mostly restricted to strike-slip motion on the upper tectonic plate. In their SSA presentation, Penn State University's Kevin Furlong and colleagues will demonstrate how the surface faulting could nevertheless be compatible with a megathrust event. | Earthquakes | 2,017
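One reason multi-fault ruptures complicate hazard models is that seismic moments add while magnitudes do not. The sketch below combines hypothetical per-segment magnitudes using the standard moment-magnitude relation Mw = (2/3)(log10 M0 - 9.1), with M0 in newton-meters; the segment values are invented for illustration, not Kaikoura's.

```python
import math

# Seismic moments of rupturing segments add; convert back to a single Mw.
# Segment magnitudes below are invented for illustration.

def moment_from_mw(mw):
    return 10 ** (1.5 * mw + 9.1)   # N*m

def mw_from_moment(m0):
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

segments = [7.2, 7.0, 6.8, 6.9, 7.1]  # hypothetical per-segment magnitudes
total_moment = sum(moment_from_mw(m) for m in segments)
print(f"Combined Mw = {mw_from_moment(total_moment):.1f}")  # ~7.5
```

Five segments in the low-magnitude-7 range combine to only about 7.5 here, which illustrates why a forecast of 7.6 for a multi-fault rupture can sit close to an observed 7.8 despite very different faulting details.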
April 11, 2017 | https://www.sciencedaily.com/releases/2017/04/170411130744.htm | Bombay beach event demonstrates difficulties in earthquake swarm forecasting | In September 2016, about 100 small earthquakes between magnitude 2 and 4.3 took place in Bombay Beach, rattling the region in Southern California and raising questions about whether the swarm's location near the southern end of the San Andreas Fault would trigger a larger earthquake. | In a presentation at the 2017 Seismological Society of America's (SSA) Annual Meeting, U.S. Geological Survey seismologist Andrea Llenos will discuss lessons learned from the 2016 Bombay Beach swarm, in particular the challenges in modeling swarms and communicating their risk to the public. Earthquake swarms are triggered by short-term processes such as fluid flow in rock layers or aseismic fault creep. Unlike earthquake aftershocks, which decrease their rate over time in predictable ways, it can be difficult to forecast how long a swarm may last once it begins, Llenos says. "For example, there are swarms that only last for a couple of days, and there are swarms that go on for months. Even in just the Bombay Beach area, the 2001 swarm lasted a day or two, the 2009 swarm lasted more like a week, and the 2016 swarm lasted several days. And the swarm-driving process might itself vary over time as well." In her SSA presentation, Llenos will discuss how seismologists are exploring different models to determine how earthquake swarms should be viewed in terms of raising the normal or background rate of seismic activity in an area, and how this can affect the probabilities and magnitudes of larger earthquakes on faults in the region. Better models will help seismologists as they discuss the risks of these swarms with the public, Llenos notes. "One of the issues we ran into [with the 2016 swarm] was how to convey to the public in our online statements what the probabilities were likely to do over the next week. For a typical mainshock, this is relatively straightforward. Since the number of aftershocks decreases over time, the likelihood of a larger event will also decrease over time," says Llenos. "But for swarms, because we didn't know how long the higher background rate would last, we needed to think a bit more about how to convey that the probabilities over the next week may change depending on if the swarm activity increases or decreases." | Earthquakes | 2,017
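The effect Llenos describes -- a temporarily raised background rate feeding through to the odds of a larger shock -- can be sketched with a Gutenberg-Richter scaling and a Poisson window, as below. The b-value of 1 is a common generic choice and the daily rates are invented; none of these are the USGS's operational numbers.

```python
import math

B_VALUE = 1.0  # generic Gutenberg-Richter b-value (assumed)

def rate_at_or_above(rate_m2_per_day, target_m):
    """Scale the daily rate of M>=2 events down to larger magnitudes."""
    return rate_m2_per_day * 10 ** (-B_VALUE * (target_m - 2.0))

def prob_in_window(daily_rate, days=7):
    return 1.0 - math.exp(-daily_rate * days)

for label, rate_m2 in [("normal background", 0.05), ("during swarm", 10.0)]:
    p = prob_in_window(rate_at_or_above(rate_m2, target_m=5.0))
    print(f"{label}: P(M>=5 within a week) ~ {100 * p:.2f}%")
# The elevated rate raises the weekly probability by orders of magnitude --
# but only for as long as the swarm lasts, which is the hard part to forecast.
```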
April 11, 2017 | https://www.sciencedaily.com/releases/2017/04/170411130741.htm | Performance of earthquake early warning systems | The future of earthquake early warning systems may be contained in smartphones -- and in vehicles, "smart" appliances, and the increasing number of everyday objects embedded with sensors and communication chips that connect them with a global network. | At a presentation at the 2017 Seismological Society of America's (SSA) Annual Meeting, Benjamin Brooks of the U.S. Geological Survey and colleagues will share data from a recent project in Chile that provided early detection, estimates and locations for earthquakes using a network of sensor boxes equipped with smartphones and consumer-quality GPS chips. Data collected by the sensor boxes are transmitted through an Android app developed by the researchers and analyzed to produce earthquake source models, which in turn can be used to create ground shaking forecasts and local tsunami warnings. The sensor stations have successfully detected three magnitude 5 or larger earthquakes since December 2016, with no false alarms. Although the smartphone-based sensors in the study are distributed in a fixed network, Brooks and colleagues say, it may be possible to someday harness individual smartphones and "smart" appliances into a crowd-sourced network for earthquake early warning. On the U.S. West Coast, seismologists at the University of Washington are expanding and testing the capabilities of earthquake early warning systems already under development, such as the G-FAST system in the Pacific Northwest and ShakeAlert in California. Brendan Crowell and colleagues will discuss the performance of G-FAST as tested against 1,300 simulated megathrust earthquakes of magnitudes between 7.5 and 9.5 in the Cascadia region. Renate Hartog will present data indicating that the algorithms behind ShakeAlert can be configured to work for the Pacific Northwest as well as California, suggesting that a West Coast-wide earthquake early warning system could be closer to reality. In other presentations at the SSA Annual Meeting, researchers will also discuss how earthquake early warning systems are developing ways to improve real-time ground motion alerts. Many early warning systems perform best when asked to pinpoint the magnitude and location of earthquakes, but ground motion warnings are also key to predicting and preventing infrastructure damage and destruction. | Earthquakes | 2,017
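For a minimal flavor of how a fixed network of consumer-grade GPS sensors might declare an event, consider requiring several stations to exceed a displacement threshold at once, so that a single noisy receiver cannot trigger an alert. The noise floor, trigger factor, and station count below are invented assumptions, not the Chile project's actual algorithm.

```python
# Toy multi-station displacement trigger. All thresholds are invented.

NOISE_FLOOR_M = 0.02   # assumed consumer-GPS horizontal scatter
TRIGGER_FACTOR = 5.0   # displacement must exceed 5x the noise floor
MIN_STATIONS = 3       # coherence requirement to suppress false alarms

def event_triggered(displacements_m):
    exceed = [d for d in displacements_m if d > TRIGGER_FACTOR * NOISE_FLOOR_M]
    return len(exceed) >= MIN_STATIONS

print(event_triggered([0.01, 0.15, 0.22, 0.18, 0.03]))  # True: three stations agree
print(event_triggered([0.01, 0.25, 0.02, 0.04, 0.01]))  # False: lone outlier ignored
```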
April 11, 2017 | https://www.sciencedaily.com/releases/2017/04/170411130728.htm | Seismic listening system offers new look at Old Faithful geyser | After deploying hundreds of seismometers around the Old Faithful Geyser in 2015 and 2016, scientists have a clearer picture of how the geyser erupts and what may lie beneath the popular tourist attraction in Yellowstone National Park. | At the 2017 Seismological Society of America's (SSA) Annual Meeting, Jamie Farrell of the University of Utah will describe how this seismic ear to the ground is helping the national park plan for its future infrastructure needs around the geyser. The data also offer a unique glimpse at the active hydrothermal system below the geyser, and how its activity could be used to monitor the eruption of less predictable geysers. Farrell and his colleagues created a more complete map of the active hydrothermal features under Old Faithful and in the area called Geyser Hill. The seismic noise captured by their instruments works similarly to sonar, explained Farrell, allowing the researchers to reconstruct "what these geothermal structures look like under the ground, what's happening underground when they're getting ready for an eruption, and how these things are related to each other and how they react to outside stresses, even earthquakes from around the world." The researchers were able to fill in more details about the underground tremor signal connected to Old Faithful, which starts about 45 minutes before eruption, builds up to a peak about 25 minutes before eruption, and slowly dies down until the eruption occurs. "We found that the amplitudes of those tremor signals are much higher than the actual eruption of Old Faithful itself," said Farrell. For the first time, the scientists also saw this tremor signal disappear in an area west and northwest of the geyser. Seismic signals don't travel well through ground saturated with hydrothermal fluids, Farrell explained, "so what we think that's telling us is that we have this shallow, very saturated body of ground there, and it's probably the reservoir that Old Faithful is pulling water from when it erupts." Studying the tremor pattern of Old Faithful could help narrow down its exact eruption times; right now, its eruptions can be predicted within a window of ten minutes or so. Farrell said seismologists want to study other geysers to determine whether there might be a regular signal pattern before eruption. This could help pinpoint eruption times for unpredictable geysers like Steamboat Geyser in Yellowstone's Norris Geyser Basin, which is the tallest active geyser in the world. Farrell and his colleagues have only a few weeks each November to make their observations, between the time Yellowstone closes to the public and the arrival of winter weather. When the park is open, he said, "cultural noise" like the trudge of feet on the park's boardwalks drowns out natural seismic signals. | Earthquakes | 2,017
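Tracking a tremor signal that builds, peaks about 25 minutes before eruption, and then decays is, at its core, an amplitude-envelope problem. The sketch below computes a sliding RMS envelope of a synthetic trace and finds its peak; it illustrates only the generic signal-processing idea and is not the team's analysis code.

```python
import math

# Sliding RMS envelope of a (synthetic) seismic trace.
def rms_envelope(trace, window):
    return [math.sqrt(sum(x * x for x in trace[i:i + window]) / window)
            for i in range(len(trace) - window + 1)]

# Synthetic tremor: an oscillation whose amplitude ramps up and dies away,
# loosely mimicking the build-peak-decay pattern described above.
trace = [math.sin(0.8 * t) * (1 + 4 * math.exp(-((t - 300) ** 2) / 8000))
         for t in range(600)]

env = rms_envelope(trace, window=50)
peak = max(range(len(env)), key=env.__getitem__)
print(f"Tremor envelope peaks at sample {peak}")  # near the amplitude maximum
```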
April 11, 2017 | https://www.sciencedaily.com/releases/2017/04/170411130725.htm | Could a Colorado earthquake have been triggered by dinosaur extinction impact? | Researchers have found signs of fault displacement at well-known rock outcrops in Colorado that mark the end-Cretaceous asteroid impact that may have hastened the extinction of the dinosaurs. They will present their results in a poster at the 2017 Seismological Society of America's (SSA) Annual Meeting. | Norm Sleep of Stanford University and colleagues suggest that the impact, which occurred near the Yucatán Peninsula of Mexico, could have generated massive seismic waves that triggered earthquakes as far away as Colorado, in the center of a tectonic plate where no previous fault had existed. Sleep and his colleagues found evidence for the fault in two areas of Colorado's Trinidad Lake State Park, where a layer of iridium generated by the asteroid impact clearly marks the boundary between Cretaceous and Tertiary-age rocks, deposited at the time of the dinosaurs' extinction about 65 million years ago. At the Long's Canyon and Madrid Canyon roadcuts, "there is a fault that slipped about a meter at the time of the impact," Sleep said. "It offset the material below the impact layer but not above, but it's not something that would be obvious to the casual observer." The researchers suggest that the Colorado earthquake may have been as large as magnitude 6. Very strong seismic waves from the impact -- much larger than would be generated by a regular earthquake, Sleep said -- would be necessary to trigger an earthquake in this location, in the middle of a tectonic plate with no previous faults. The end-Cretaceous asteroid strike, however, could have generated ground velocities of a meter or two per second, Sleep said. "The ground would be moving up and down and sideways like a ship in a strong storm." At the time of the earthquake, the area in Colorado was a swampy, delta-like environment, crossed by large braided streams that ran from the young Rocky Mountains. Sleep and his colleagues saw signs that the earthquake had diverted a small stream in the area. This summer, the researchers will be looking near the Raton Basin in New Mexico for further signs of intraplate quakes that may have been triggered by the asteroid strike. | Earthquakes | 2,017
April 18, 2017 | https://www.sciencedaily.com/releases/2017/04/170418115234.htm | Communicating tsunami evacuations effectively | The extent of damage caused by the 2011 earthquake and tsunami in Japan demonstrated that the outcomes of disaster mitigation research were not fully applied in reality. Computer simulations are normally used to develop effective evacuation strategies. However, a gap remains between computer-generated results and what is actually feasible for people, since simulations are not usually tested in real life. | Led by Professor Michinori Hatayama, researchers at Kyoto University combined computer simulations with fieldwork done in partnership with the residents of Mangyo, Kuroshio in Kochi prefecture in southern Japan. It is thought that the Kuroshio area will be the most affected in the event of a long-anticipated earthquake and tsunami originating in the Nankai Trough, a depression at the bottom of the ocean that runs about 900 km along the southern coast of Japan's mainland. The team's aim was to facilitate effective communication between disaster mitigation professionals and society at large so that research outcomes are better utilized. They also wanted to encourage the locals to develop practical evacuation plans to help them feel less pessimistic about their survival odds. Some residents had previously expressed hopelessness to the media should such a disaster happen. Hatayama and his colleagues interviewed the residents to find out how they would react in the case of a tsunami warning and what issues make them feel pessimistic. They then put the responses into a computer simulation system that includes geographical information and tsunami hazard data. The output, which was shared with the locals in workshops, showed an animation of how they would react to a tsunami, allowing them to identify what challenges they faced in order to evacuate successfully. This was followed by discussions about what the locals perceived to be issues, such as an elevated shelter being too far away, and possible alternative plans, such as using a much nearer evacuation tower. The researchers conducted evacuation drills so that the residents could experience the alternative plans. They found that this method made evacuation drills more efficient. Unlike in previously practiced drills, the researchers placed problem awareness ahead of practice, which helped shorten the time needed to determine possible alternative solutions. The field activities also allowed some residents to realize evacuation was possible, something they were unsure of before. They regarded the alternative plans as feasible evacuation options. However, some locals tended not to accept the alternative plans, as they might involve some risks. For example, they preferred to evacuate by car rather than on foot, despite the fact that roads are often blocked by debris or congested in a disaster. New research should look into real-life examples in order to find solutions to these problems, the researchers conclude. | Earthquakes | 2,017
April 10, 2017 | https://www.sciencedaily.com/releases/2017/04/170410161932.htm | What happens to the boats? The 1755 Lisbon earthquake and Portuguese tsunami literacy | In their paper published this week, Vasconcelos and colleagues evaluate Portuguese citizens' scientific literacy regarding tsunamis. | Vasconcelos and colleagues use the 1 Nov. 1755 Lisbon, Portugal, earthquake as the basis for their study. The earthquake, with a Richter magnitude estimated between 8.6 and 9, is one of the largest in recorded history. Not only that, the city was then hit by a tsunami and engulfed in fire. The earthquake and its aftermath remain part of the generational memory of area residents, but have been forgotten or not heard of by younger generations. The main aim of this study was to evaluate Portuguese citizens' scientific literacy regarding tsunamis and to analyze their knowledge related to the 1755 earthquake. Vasconcelos and colleagues conducted 206 structured interviews, asking members of the general public to collaborate. At the beginning of the interviews, people were shown a drawing representing a tsunami epicenter and three boats in different locations. They were asked to decide which boat they thought would suffer the most. Only 42.7% of the interviewees gave the correct answer, that boat three (the one closest to shore) would experience the most damage. A nearly equal percentage of respondents (43.4%) chose boat 1 (farthest from shore), and 13.6% of respondents chose boat 2 (in the middle). In analyzing the results of that and other questions, the authors found a widespread lack of knowledge regarding tsunamis, including the 1755 event. However, the majority of interviewees recognized the need to know more about these issues. The authors conclude that this evidence indicates the importance of including historical, social, and scientific issues in geosciences programs, giving more relevance to teaching seismic risks, their prevention, and possible responses. They write, "If we want citizens to be active and to play a responsible role in the development of their own society, these historical socio-scientific issues must be clearly and decisively addressed in the classroom." | Earthquakes | 2,017
April 10, 2017 | https://www.sciencedaily.com/releases/2017/04/170410085154.htm | The Forecaster's Dilemma: Evaluating forecasts of extreme events | Accurate predictions of extreme events do not necessarily indicate the scientific superiority of the forecaster or the forecast method from which they originate. The way forecast evaluation is conducted in the media can thus pose a dilemma. | When it comes to extreme events, public discussion of forecasts often focuses on predictive performance. After the international financial crisis of 2007, for example, the public paid a great deal of attention to economists who had correctly predicted the crisis, attributing their success to superior predictive ability. However, restricting forecast evaluation to subsets of extreme observations has unexpected and undesired effects, and is bound to discredit even the most expert forecasts. In a recent article, statisticians Dr. Sebastian Lerch and Prof. Tilmann Gneiting (both affiliated with HITS and the Karlsruhe Institute of Technology), together with their coauthors from Norway and Italy, analyzed and explained this phenomenon and suggested potential remedies. The research team used theoretical arguments, simulation experiments and a real-data study on economic variables. The article has just been published in the peer-reviewed journal Statistical Science. It is based on research funded by the Volkswagen Foundation. Forecast evaluation is often only conducted in the public arena if an extreme event has been observed; in particular, if forecasters have failed to predict an event with high economic or societal impact. An example of what this can mean for forecasters is the devastating L'Aquila earthquake in 2009 that caused 309 deaths. In the aftermath, six Italian seismologists were put on trial for not predicting the earthquake. They were found guilty of involuntary manslaughter and sentenced to six years in prison, until the Supreme Court in Rome acquitted them of the charges. But how can scientists and outsiders, such as the media, evaluate the accuracy of forecasts predicting extreme events? At first sight, the practice of selecting extreme observations while discarding non-extreme ones and proceeding on the basis of standard evaluation tools seems quite logical. Intuitively, accurate predictions on the subset of extreme observations may suggest superior predictive abilities. But limiting the analyzed data to selected subsets can be problematic. "In a nutshell, if forecast evaluation is conditional on observing a catastrophic event, predicting a disaster every time becomes a worthwhile strategy," says Sebastian Lerch, member of the "Computational Statistics" group at HITS. Given that media attention tends to focus on extreme events, expert forecasts are bound to fail in the public eye, and it becomes tempting to base decision making on misguided inferential procedures. "We refer to this critical issue as the 'forecaster's dilemma,'" adds Tilmann Gneiting. This predicament can be avoided if forecasts take the form of probability distributions, for which standard evaluation methods can be generalized to allow for specifically emphasizing extreme events. 
The paper uses economic forecasts of GDP growth and inflation rates in the United States between 1985 and 2011 to illustrate the forecaster's dilemma and how these tools can be used to address it. The results of the study are especially relevant for scientists seeking to evaluate the forecasts of their own methods and models, and for external third parties who need to choose between competing forecast providers, for example to produce hazard warnings or make financial decisions. Although the research paper focused on an economic data set, the conclusions are important for many other applications: the forecast evaluation tools are currently being tested for use by national and international weather services. | Earthquakes | 2,017
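Lerch and Gneiting's "in a nutshell" point is easy to reproduce numerically. The simulation below is a sketch, not the paper's experiment: a well-calibrated forecaster is compared with an alarmist who always predicts a crisis. Under the Brier score (lower is better), the calibrated forecaster wins on all outcomes, yet the alarmist looks perfect once evaluation is restricted to the occasions when a crisis actually occurred.

```python
import random

random.seed(1)
P_CRISIS = 0.05  # assumed true crisis frequency for this toy example
outcomes = [1 if random.random() < P_CRISIS else 0 for _ in range(100_000)]

def brier(forecast_prob, subset):
    """Mean squared error of a constant probability forecast (lower is better)."""
    return sum((forecast_prob - y) ** 2 for y in subset) / len(subset)

calibrated, alarmist = P_CRISIS, 1.0
crises_only = [y for y in outcomes if y == 1]

print(f"All outcomes: calibrated {brier(calibrated, outcomes):.3f}, "
      f"alarmist {brier(alarmist, outcomes):.3f}")
print(f"Crises only : calibrated {brier(calibrated, crises_only):.3f}, "
      f"alarmist {brier(alarmist, crises_only):.3f}")
# Conditioning evaluation on the extreme outcomes makes "always cry wolf"
# the winning strategy -- the forecaster's dilemma in miniature.
```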
April 7, 2017 | https://www.sciencedaily.com/releases/2017/04/170407143316.htm | Volcanic arcs form by deep melting of rock mixtures | Beneath the ocean, massive tectonic plates collide and grind against one another, which drives one below the other. This powerful collision, called subduction, is responsible for forming volcanic arcs that are home to some of Earth's most dramatic geological events, such as explosive volcanic eruptions and mega earthquakes. | A new study published in the journal Science Advances suggests otherwise. Researchers led by the Woods Hole Oceanographic Institution (WHOI) have discovered a previously unknown process involving the melting of intensely mixed metamorphic rocks -- known as mélange rocks -- that form through high stress during subduction at the slab-mantle boundary. Until now, it was long thought that lava formation began with a combination of fluids from a subducted tectonic plate, or slab, and melted sediments that would then percolate into the mantle. Once in the mantle, they would mix and trigger more melting, and eventually erupt at the surface. "Our study clearly shows that the prevailing fluid/sediment melt model cannot be correct," says Sune Nielsen, a WHOI geologist and lead author of the paper. "This is significant because nearly all interpretations of geochemical and geophysical data on subduction zones for the past two decades are based on that model." Instead, what Nielsen and his colleagues found was that mélange is actually already present at the top of the slab before mixing with the mantle takes place. "This study shows -- for the first time -- that mélange melting is the main driver of how the slab and mantle interact," says Nielsen. This is an important distinction because scientists use isotope and trace-element measurements to determine the compositions of arc lavas and better understand this critical region of subduction zones. When and where the mixing, melting, and redistribution of trace elements occur generates vastly different isotopic signatures. The study builds on a previous paper by Nielsen's colleague and co-author Horst Marschall of Goethe University in Frankfurt, Germany. Based on field observations of mélange outcrops, Marschall noted that blobs of low-density mélange material, called diapirs, might rise slowly from the surface of the subducting slab and carry the well-mixed materials into the mantle beneath arc volcanoes. "The mélange-diapir model was inspired by computer models and by detailed field work in various parts of the world where rocks that come from the deep slab-mantle interface have been brought to the surface by tectonic forces," Marschall says. "We have been discussing the model for at least five years now, but many scientists thought the mélange rocks played no role in the generation of magmas. They dismissed the model as 'geo-fantasy.'" In their new work, Nielsen and Marschall compared mixing ratios from both models with chemical and isotopic data from published studies of eight globally representative volcanic arcs: Marianas, Tonga, Lesser Antilles, Aleutians, Ryukyu, Scotia, Kurile, and Sunda. "Our broad-scale analysis shows that the mélange mixing model fits the literature data almost perfectly in every arc worldwide, while the prevailing sediment melt/fluid mixing lines plot far from the actual data," Nielsen says. Understanding the processes that occur at subduction zones is important for many reasons. 
Often referred to as the planet's engine, subduction zones are the main areas where water and carbon dioxide contained within old seafloor are recycled back into the deep Earth, playing critical roles in the control of long-term climate and the evolution of the planet's heat budget. These complex processes occur on scales of tens to thousands of kilometers over months to hundreds of millions of years, but they can generate catastrophic earthquakes and deadly tsunamis in a matter of seconds. "A large fraction of Earth's volcanic and earthquake hazards are associated with subduction zones, and some of those zones are located near where hundreds of millions of people live, such as in Indonesia," Nielsen says. "Understanding the reasons for why and where earthquakes occur depends on knowing or understanding what type of material is actually present down there and what processes take place." The research team says the study's findings call for a reevaluation of previously published data and a revision of concepts relating to subduction zone processes. Because mélange rocks have largely been ignored, almost nothing is known about their physical properties or the range of temperatures and pressures at which they melt. Future studies to quantify these parameters stand to provide even greater insight into the role of mélange in subduction zones and the control it exerts over earthquake generation and subduction zone volcanism. | Earthquakes | 2,017
March 28, 2017 | https://www.sciencedaily.com/releases/2017/03/170328135508.htm | A seismic mapping milestone | Because of Earth's layered composition, scientists have often compared the basic arrangement of its interior to that of an onion. There's the familiar thin crust of continents and ocean floors; the thick mantle of hot, semisolid rock; the molten metal outer core; and the solid iron inner core. | But Earth, unlike an onion, doesn't let scientists peel back its layers for closer study, forcing them to make educated guesses about our planet's inner life based on surface-level observations. Clever imaging techniques devised by computational scientists, however, offer the promise of illuminating Earth's subterranean secrets. Using advanced modeling and simulation, seismic data generated by earthquakes, and one of the world's fastest supercomputers, a team led by Jeroen Tromp of Princeton University is creating a detailed 3-D picture of Earth's interior. Currently, the team is focused on imaging the entire globe from the surface to the core-mantle boundary, a depth of 1,800 miles. These high-fidelity simulations add context to ongoing debates related to Earth's geologic history and dynamics, bringing prominent features like tectonic plates, magma plumes, and hotspots into view. In 2016, the team released its first-generation global model. Created using data from 253 earthquakes recorded by seismometers scattered around the world, the team's model is notable for its global scope and high scalability. "This is the first global seismic model where no approximations -- other than the chosen numerical method -- were used to simulate how seismic waves travel through Earth and how they sense heterogeneities," said Ebru Bozdag, a co-principal investigator of the project and an assistant professor of geophysics at the University of Nice Sophia Antipolis. "That's a milestone for the seismology community. For the first time, we showed people the value and feasibility of running these kinds of tools for global seismic imaging." The project's genesis can be traced to a seismic imaging theory first proposed in the 1980s. To fill in gaps within seismic data maps, the theory posited a method called adjoint tomography, an iterative full-waveform inversion technique. This technique leverages more information than competing methods, using forward waves that travel from the quake's origin to the seismic receiver and adjoint waves, which are mathematically derived waves that travel from the receiver back to the quake. The problem with testing this theory? "You need really big computers to do this," Bozdag said, "because both forward and adjoint wave simulations are performed in 3-D numerically." In 2012, just such a machine arrived in the form of the Titan supercomputer, a 27-petaflop Cray XK7 managed by the US Department of Energy's (DOE's) Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at DOE's Oak Ridge National Laboratory. After trying out its method on smaller machines, Tromp's team gained access to Titan in 2013 through the Innovative and Novel Computational Impact on Theory and Experiment, or INCITE, program. Working with OLCF staff, the team continues to push the limits of computational seismology to greater depths. When an earthquake strikes, the release of energy creates seismic waves that often wreak havoc for life at the surface. 
Those same waves, however, present an opportunity for scientists to peer into the subsurface by measuring vibrations passing through Earth. As seismic waves travel, seismometers can detect variations in their speed. These changes provide clues about the composition, density, and temperature of the medium the wave is passing through. For example, waves move slower when passing through hot magma, such as mantle plumes and hotspots, than they do when passing through colder subduction zones, locations where one tectonic plate slides beneath another. Each seismogram represents a narrow slice of the planet's interior. By stitching many seismograms together, researchers can produce a 3-D global image, capturing everything from magma plumes feeding the Ring of Fire, to Yellowstone's hotspots, to subducted plates under New Zealand. This process, called seismic tomography, works in a manner similar to imaging techniques employed in medicine, where 2-D x-ray images taken from many perspectives are combined to create 3-D images of areas inside the body. In the past, seismic tomography techniques have been limited in the amount of seismic data they can use. Traditional methods forced researchers to make approximations in their wave simulations and restrict observational data to major seismic phases only. The adjoint tomography based on 3-D numerical simulations employed by Tromp's team isn't constrained in this way. "We can use the entire data -- anything and everything," Bozdag said. Running its GPU version of the SPECFEM3D_GLOBE code, Tromp's team used Titan to apply full-waveform inversion at a global scale. The team then compared these "synthetic seismograms" with observed seismic data supplied by the Incorporated Research Institutions for Seismology (IRIS), calculating the difference and feeding that information back into the model for further optimization. Each repetition of this process improves global models. "This is what we call the adjoint tomography workflow, and at a global scale it requires a supercomputer like Titan to be executed in a reasonable timeframe," Bozdag said. "For our first-generation model, we completed 15 iterations, which is actually a small number for these kinds of problems. Despite the small number of iterations, our enhanced global model shows the power of our approach. This is just the beginning, however." For its initial global model, Tromp's team selected earthquake events that registered between 5.8 and 7 on the Richter scale -- a standard for measuring earthquake magnitude. That range can be extended slightly to include more than 6,000 earthquakes in the IRIS database -- about 20 times the amount of data used in the original model. Getting the most out of all the available data requires a robust automated workflow capable of accelerating the team's iterative process. Collaborating with OLCF staff, Tromp's team has made progress toward this goal. For the team's first-generation model, Bozdag carried out each step of the workflow manually, taking about a month to complete one model update. Team members Matthieu Lefebvre, Wenjie Lei, and Youyi Ruan of Princeton University and the OLCF's Judy Hill developed new automated workflow processes that hold the promise of reducing that cycle to a matter of days. "Automation will really make it more efficient, and it will also reduce human error, which is pretty easy to introduce," Bozdag said. Additional support from OLCF staff has contributed to the efficient use and accessibility of project data. 
Early in the project's life, Tromp's team worked with the OLCF's Norbert Podhorszki to improve data movement and flexibility. The end result, called the Adaptable Seismic Data Format (ASDF), leverages the Adaptable I/O System (ADIOS) parallel library and gives Tromp's team a superior file format to record, reproduce, and analyze data on large-scale parallel computing resources. In addition, the OLCF's David Pugmire helped the team implement in situ visualization tools. These tools enabled team members to check their work more easily from local workstations by allowing visualizations to be produced in conjunction with simulation on Titan, eliminating the need for costly file transfers. "Sometimes the devil is in the details, so you really need to be careful and know what you're looking at," Bozdag said. "David's visualization tools help us to investigate our models and see what is there and what is not." With visualization, the magnitude of the team's project comes to light. The billion-year cycle of molten rock rising from the core-mantle boundary and falling from the crust -- not unlike the motion of globules in a lava lamp -- takes form, as do other geologic features of interest. At this stage, the resolution of the team's global model is becoming advanced enough to inform continental studies, particularly in regions with dense data coverage. Making it useful at the regional level or smaller -- such as for the mantle activity beneath Southern California or the earthquake-prone crust of Istanbul -- will require additional work. "Most global models in seismology agree at large scales but differ from each other significantly at the smaller scales," Bozdag said. "That's why it's crucial to have a more accurate image of Earth's interior. Creating high-resolution images of the mantle will allow us to contribute to these discussions." To improve accuracy and resolution further, Tromp's team is experimenting with model parameters under its most recent INCITE allocation. For example, the team's second-generation model will introduce anisotropic inversions, calculations that better capture the differing orientations and movement of rock in the mantle. This new information should give scientists a clearer picture of mantle flow, composition, and crust-mantle interactions. Additionally, team members Dimitri Komatitsch of Aix-Marseille University in France and Daniel Peter of King Abdullah University in Saudi Arabia are leading efforts to update SPECFEM3D_GLOBE to incorporate capabilities such as the simulation of higher-frequency seismic waves. The frequency of a seismic wave, measured in hertz, is equivalent to the number of waves passing through a fixed point in one second. For instance, the highest frequency currently used in the team's simulations is about 0.05 hertz (1 wave per 20 seconds), but Bozdag said the team would also like to incorporate seismic waves of up to 1 hertz (1 wave per second). This would allow the team to model finer details in Earth's mantle and even begin mapping Earth's core. To make this leap, Tromp's team is preparing for Summit, the OLCF's next-generation supercomputer. Set to arrive in 2018, Summit will provide at least five times the computing power of Titan. As part of the OLCF's Center for Accelerated Application Readiness, Tromp's team is working with OLCF staff to take advantage of Summit's computing power upon arrival. "With Summit, we will be able to image the entire globe from crust all the way down to Earth's center, including the core," Bozdag said. 
"Our methods are expensive -- we need a supercomputer to carry them out -- but our results show that these expenses are justified, even necessary." | Earthquakes | 2,017 |
March 28, 2017 | https://www.sciencedaily.com/releases/2017/03/170328082923.htm | Using a method from Wall Street to track slow slipping of Earth's crust | Stock traders have long used specialized trackers to decide when to buy or sell a stock, or when the market is beginning to make a sudden swing. A new University of Washington study finds that the same technique can be used to detect gradual movement of tectonic plates, what are called "slow slip" earthquakes. These movements do not unleash damaging amounts of seismic energy, but scientists are just beginning to understand how they may be linked to the Big One. | A new technique can quickly pinpoint slow slips from a single Global Positioning System station. It borrows the financial industry's relative strength index, a measure of how quickly a stock's price is changing, to detect slow slips within a string of GPS observations.
The paper was published in December.
"I've always had an interest in finance, and if you go to any stock ticker website there's all these different indicators," said lead author Brendan Crowell, a UW research scientist in Earth and space sciences. "This particular index stood out in its ease of use, but also that it needed no information -- like stock volume, volatility or other terms -- besides the single line of data that it analyzes for unusual behavior."
The study tests the method on more than 200 GPS stations that recorded slow slips between 2005 and 2016 along the Cascadia fault zone, which runs from northern California up to northern Vancouver Island.
"Looking at the Cascadia Subduction Zone -- which is the most-studied slow slip area in the world -- was a good way to validate the methodology," Crowell said.
The results show that this simple technique's estimates for the size, duration and travel distance of major slow slip events match the results of more exhaustive analyses of observations along the fault.
Discovered in the early 2000s, slow slips are a type of silent earthquake in which two plates slip harmlessly past one another over weeks or months. In Cascadia the slipping runs backward from the typical motion along the fault. A slow slip slightly increases the chance of a larger earthquake. It also may be providing clues, which scientists don't yet know how to decipher, to what is happening in the physics at the plate boundary.
Regular earthquake monitoring relies on seismometers to track the shaking of the ground. That doesn't work for slow slips, which do not release enough energy to send seismic waves through Earth's crust to reach seismometers.
Instead, detection of slow slips relies on GPS data.
"If you don't have much seismic energy, you need to measure what's happening with something else. GPS is directly measuring the displacement of the Earth," Crowell said.
At GPS stations, the same type of sensors used in smartphones are secured to steel pipes that are cemented at least 35 meters (115 feet, or about 10 stories) into solid rock. By minimizing the noise, these stations can detect millimeter-scale changes in position at the surface, which can be used to infer movement deep underground.
Using these data to detect slow slips currently means comparing different GPS stations with complex data processing.
But thanks to the efforts of stock traders who want to know quickly whether to buy or sell, the new paper shows that the relative strength index can detect a slow slip from a single one of the 213 GPS stations along the Cascadia Subduction Zone.
The initial success suggests the method could have other geological applications.
"I want to be able to use this for things beyond slow slip," Crowell said. "We might use the method to look at the seismic effects of groundwater extraction, volcanic inflation and all kinds of other things that we may not be detecting in the GPS data."
The technique could be applied in places that are not as well studied as the Pacific Northwest, where geologic activity is already being closely monitored.
"This works for stations all over the world -- on islands, or areas that are pretty sparsely populated and don't have a lot of GPS stations," Crowell said.
In related research, Crowell has used an Amazon Catalyst grant to integrate GPS, or geodetic, data into the ShakeAlert earthquake alert system. For really big earthquakes, detecting the large, slow shaking seismically is not as accurate for pinpointing the source and size of the quake. It's more accurate to use GPS to detect how much the ground has actually moved. Tracking ground motion also improves tsunami warnings. Crowell has used the grant to integrate the GPS data into the network's real-time alerts, which are now in limited beta testing. | Earthquakes | 2,017
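To make the borrowed indicator concrete, here is a hedged sketch of the classic relative strength index applied to a synthetic single-station GPS displacement series. The 14-sample window and the synthetic "slow slip" reversal are illustrative choices, not the settings used in the UW study.
```python
# Relative strength index (RSI) over a GPS displacement series: steady
# plate motion keeps RSI mid-range; a reversed slow-slip episode drives
# it to an extreme, flagging the event from a single station.
import numpy as np

def rsi(x, window=14):
    """Classic RSI: 100 - 100/(1 + avg_gain/avg_loss) over a sliding window."""
    diffs = np.diff(x)
    gains = np.where(diffs > 0, diffs, 0.0)
    losses = np.where(diffs < 0, -diffs, 0.0)
    out = np.full(x.size, np.nan)
    for i in range(window, x.size):
        avg_gain = gains[i - window:i].mean()
        avg_loss = losses[i - window:i].mean()
        out[i] = 100.0 if avg_loss == 0 else 100.0 - 100.0 / (1.0 + avg_gain / avg_loss)
    return out

days = np.arange(200)
disp = 0.05 * days + 0.5 * np.random.default_rng(1).normal(size=200)  # mm, steady motion
disp[120:150] -= np.linspace(0.0, 8.0, 30)   # slip runs "backward," as in Cascadia
print("minimum RSI during the event:", np.nanmin(rsi(disp)[120:160]))
```
The appeal, as Crowell notes, is that nothing beyond the single displacement series is needed.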
March 24, 2017 | https://www.sciencedaily.com/releases/2017/03/170324105414.htm | Steep rise of the Bernese Alps | The striking north face of the Bernese Alps is the result of a steep rise of rocks from the depths following a collision of two tectonic plates. This steep rise gives new insight into the final stage of mountain building and provides important knowledge with regard to active natural hazards and geothermal energy. The results from researchers at the University of Bern and ETH Zürich are now being published. | Mountains often emerge when two tectonic plates converge, where, according to standard models, the denser oceanic plate subducts beneath the lighter continental plate into Earth's mantle. But what happens if two continental plates of the same density collide, as was the case in the area of the Central Alps during the collision between Africa and Europe?
Geologists and geophysicists at the University of Bern and ETH Zürich examined this question. They constructed the 3D geometry of deformation structures through several years of surface analysis in the Bernese Alps. With the help of seismic tomography, similar to ultrasound examinations on people, they also gained additional insight into the deep structure of Earth's crust and beyond, down to depths of 400 km in Earth's mantle.
A reconstruction based on this data indicated that the European crust's light, crystalline rocks cannot be subducted to great depths but are detached from Earth's mantle in the lower crust and are consequently forced back up to Earth's surface by buoyancy forces. Steep fault zones are formed here, which push through Earth's crust and facilitate the steep rise of rocks from the depths. There are textbook examples of these kinds of fault zones in the Hasli valley, where they appear as scars in the form of morphological incisions impressively cutting through the glacially polished granite landscape.
The detachment of Earth's crust and mantle takes place at a depth of 25-30 kilometres. This process is triggered by the slow sinking and receding of the European plate in the upper mantle towards the north. In specialist terminology, this process is called slab rollback. The high temperatures at these depths make the lower crust's rocks viscous, so they can subsequently be forced up by buoyant uplift forces.
Together with surface erosion, it is this steep rise of the rocks from lower to mid-crustal levels which is responsible for the Bernese Alps' steep north front today (Titlis -- Jungfrau region -- Blüemlisalp range). The uplift data, in the range of one millimetre per year, and today's earthquake activity indicate that the process of uplift from the depths is still in progress. However, erosion on Earth's surface causes continuous ablation, which is why the Alps do not carry on growing upwards endlessly.
The analysis of the steep fault zones is not just of scientific interest, though. The partly still seismically active faults are responsible for the rocks weathering more intensively at the surface, and therefore for landslides and debris flows occurring, for example in the Hasli valley in the extremely steep areas of the Spreitlaui or Rotlaui. The serious debris flows in the Guttannen area are based, among other things, on this structural preconditioning of the host rocks.
The leakage of warm hydrothermal water, which is important to explore for geothermal energy and the 2050 energy policy, can be traced directly back to the brittle fracturing of the upper crust and the seeping in of cold surface waters. The water is heated up in the depths and arrives at the surface again through the steep fault zones -- for example, in the Grimsel region. In this sense, the new findings lead to a deeper understanding of surface processes, which influence our infrastructure, for example the transit axes (rail, roads) through the Alps. | Earthquakes | 2,017
March 22, 2017 | https://www.sciencedaily.com/releases/2017/03/170322100803.htm | Sinking of Seal Beach wetlands tied to ancient quakes | A California State University, Fullerton faculty-student study shows evidence of abrupt sinking of the wetlands near Seal Beach, Calif., caused by ancient earthquakes that shook the area at least three times in the past 2,000 years -- and it could happen again, the researchers say. | The paleoseismology study reveals that the wetlands at the Seal Beach National Wildlife Refuge, a nearly 500-acre area located within the Naval Weapons Station Seal Beach and next to the communities of Seal Beach and Huntington Harbor, are susceptible to rapid lowering in elevation during large -- over 7.0 magnitude -- earthquakes.
"Imagine a large earthquake -- and it can happen again -- causing the Seal Beach wetlands to sink abruptly by up to three feet. This would be significant, especially since the area already is at sea level," said Matthew E. Kirby, CSUF professor of geological sciences.
Kirby and colleague Brady P. Rhodes, CSUF professor emeritus of geological sciences, and CSUF alumnus Robert J. Leeper, whose master's thesis is based on the research findings, led the study. The researchers mentored numerous CSUF geology students and collaborated with geologists and earthquake experts, including those from the U.S. Geological Survey (USGS).
The researchers' study, "Evidence for Coseismic Subsidence Events in a Southern California Coastal Saltmarsh," was recently published. Leeper, now a doctoral student in the Earth sciences program at the University of California, Riverside, is the lead author of the paper, and CSUF co-authors are Kirby; Rhodes; Joe Carlin, assistant professor of geological sciences; and 2016 geology graduate Angela Aranda, who for her master's thesis analyzed sediment cores from the wetlands. Other collaborators and co-authors are Katherine Scharer and Scott Starratt of the USGS; Eileen Hemphill-Haley, consulting micropaleontologist; and Simona Avnaim-Katav and Glen MacDonald from UCLA.
Located off Pacific Coast Highway between Belmont Shores and Sunset Beach, the Seal Beach wetlands likely formed due to complex, lateral movement of the Newport-Inglewood fault, said Leeper. The wetlands straddle a segment of the fault system, which extends from Beverly Hills in the north to the San Diego region in the south.
The study identifies three previously undocumented earthquakes in the area over the past 2,000 years, noted Leeper, who earned his bachelor's degree in 2013 and master's degree in 2016, both in geology, at Cal State Fullerton. The last big quake to cause the land to abruptly drop occurred approximately 500 years ago, he pointed out.
"These research findings have important implications in terms of seismic hazard and risk assessment in coastal Southern California and are relevant to municipal, industrial and military infrastructure in the region," added Leeper, a former USGS geologist whose work focused on natural hazards. He recently left the scientific agency to concentrate on his doctoral studies.
This new study stems from National Science Foundation-funded research on past occurrences of tsunamis along Southern California's coastal wetlands that Kirby and Rhodes began in 2012.
As an undergraduate, Leeper joined their study, which turned up no evidence of previous tsunamis in Orange County or the region.
Soil samples analyzed in Kirby's lab from mud cores collected from the Seal Beach wetlands, combined with the study of microscopic fossils to identify the past environment, pointed the researchers in a new direction. The analyses revealed buried wetland surface layers, signaling evidence of sinking in the area from past massive earthquakes.
"Since that epiphany in 2013, our research evolved and has involved many other collaborators, each providing a skill or expertise that helped to develop our conclusions," Kirby said.
CSUF's Carlin, his students and Leeper are continuing to study the Seal Beach wetlands to further investigate potential seismic hazards, as well as the poorly understood Newport-Inglewood fault system.
"We're looking to identify other past earthquake events in the sediment record from other cores from the wetlands," Carlin said. "The goal is to get a better understanding of how often earthquakes may have occurred in the past, the hazards associated with this fault -- and the probability of the next earthquake." | Earthquakes | 2,017
March 15, 2017 | https://www.sciencedaily.com/releases/2017/03/170315134608.htm | Dissection of the 2015 Bonin deep earthquake | Researchers at Tohoku University's Department of Geophysics have been studying the deep earthquake which occurred on May 30, 2015, to the west of Japan's Bonin Islands. | The earthquake, which registered at about 670 km depth with a moment magnitude (Mw) of 7.9, was the deepest global seismic event on record with M ≥ 7.8. It was also an isolated event, located over 100 km deeper than the mainstream seismic zones recorded so far. The event has attracted great interest among researchers because high pressure and high temperature at such great depth make it unusual for earthquakes to occur there.
In the Izu-Bonin region, the Pacific plate is subducting northwestward beneath the Philippine Sea plate. Subduction is a process where one of Earth's tectonic plates sinks under another. To date, several studies have investigated the source location of the Bonin deep earthquake relative to the subducting Pacific plate. But there have been conflicting results because the mantle structure in and around the source zone is still unclear.
The Tohoku University team, led by Professor Dapeng Zhao, applied a method of seismic tomography to over five million P-wave arrival times recorded by worldwide seismic stations to determine a high-resolution mantle tomography beneath the Izu-Bonin region. The stations included those from the dense seismic networks in Japan and East China.
Seismic tomography*2 is an effective tool for investigating the three-dimensional (3-D) structure of Earth's interior, in particular, for clarifying the morphology and structure of subducting slabs*1. Using that method, the team received clear images of the subducting Pacific slab as a high-velocity zone, and showed that the Bonin deep event occurred within the Pacific slab, which is penetrating the lower mantle. Moreover, its hypocenter is located just beside the eastern slab boundary to the ambient mantle within the mantle transition zone*3.
They also found that the Pacific slab is split at about 28° north latitude, i.e., slightly north of the 2015 deep event hypocenter. In the north, the slab is flat in the mantle transition zone, whereas in the south, the slab is nearly vertical and directly penetrating the lower mantle.
These results suggest that this deep earthquake was caused by the joint effects of several factors. These include the Pacific slab's fast deep subduction, slab tearing, slab thermal variation, stress changes and phase transformations in the slab, as well as complex interactions between the slab and the ambient mantle. This work sheds new light on the deep slab structure and subduction dynamics.
*1 Slab: the subducting oceanic plate.
*2 Seismic tomography: a method to image the three-dimensional structure of Earth's interior by inverting abundant seismic wave data generated by many earthquakes and recorded at many seismic stations.
*3 Mantle transition zone: a part of Earth's mantle between depths of approximately 410 and 670 km, separating the upper mantle from the lower mantle. | Earthquakes | 2,017
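As a back-of-envelope illustration of why millions of P-wave arrival times constrain slab structure: a wave crossing a few hundred kilometres of anomalously fast (cold slab) or slow (hot mantle) material arrives measurably early or late. All numbers below are illustrative assumptions, not values from the study.
```python
# Travel-time residuals for a fixed path through material whose P-wave
# velocity is perturbed by a few percent, as in a cold slab (+) or a hot
# anomaly (-). Tomography inverts many such residuals for 3-D structure.
path_km = 300.0
v_ref = 10.0                                  # rough lower-mantle P velocity, km/s
t_ref = path_km / v_ref
for dv_pct in (-2.0, 0.0, +2.0):              # slow anomaly, reference, fast slab
    t = path_km / (v_ref * (1.0 + dv_pct / 100.0))
    print(f"dv = {dv_pct:+.1f}%: travel time {t:.3f} s, residual {t - t_ref:+.3f} s")
```
Residuals of a few tenths of a second, accumulated over millions of ray paths, are what let the inversion pin down where the slab is fast and where it is torn.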
March 13, 2017 | https://www.sciencedaily.com/releases/2017/03/170313085603.htm | CRUST adds new layer of defense against earthquakes, tsunamis | The first computer model to simulate the whole chain of events triggered by offshore mega subduction earthquakes could reduce losses to life and property caused by disasters like the huge earthquake and tsunami that struck Japan six years ago. | This pioneering new model has been developed by the CRUST (Cascading Risk and Uncertainty Assessment of Earthquake Shaking and Tsunami) project with funding from the Engineering and Physical Sciences Research Council (EPSRC). The University of Bristol, in collaboration with UCL, has led the work at the head of a multi-national consortium.
Designed to be used in any part of the world potentially vulnerable to offshore subduction earthquakes (where one tectonic plate is forced beneath another), such as Japan, New Zealand, the Pacific Northwest (US and Canada), Mexico, Chile and Indonesia, the model integrates every aspect and consequence of an undersea earthquake -- including tsunamis, aftershocks and landslides -- into a single disaster simulation tool.
By generating more comprehensive, more accurate maps of all potential hazards and a better understanding of how these are connected with each other, it can be used to strengthen emergency planning, improve evacuation strategies, enable engineers to calculate buildings' resilience more realistically and help the insurance industry produce more reliable financial risk analyses, for example.
In the past, risks posed by earthquakes and by the different threats associated with them have been modelled separately, based on different methods, data and assumptions varying from one part of the world to another. This lack of integration and lack of a standard approach has limited models' real-world value as well as the benefits of information sharing between countries.
Dr Katsu Goda, Senior Lecturer in Civil Engineering in the University of Bristol's Department of Civil Engineering, who has led the CRUST team, says: "For the first time ever, we've brought genuine joined-up thinking to the whole issue of offshore giant subduction earthquakes and their links to tsunamis, aftershocks and landslides, taking account of how all of these are linked and how one type of event leads, or 'cascades', into another."
With its ability to produce a more reliable and realistic picture of the entire sequence of events and to generate multi-hazard maps, the model will enable governments, emergency services, the financial industry and others to explore alternative disaster scenarios in detail. In the coming months, the CRUST team will focus on refining the model's capabilities as a truly predictive tool.
Dr Goda says: "The magnitude 9 Tohoku earthquake and resulting tsunami waves that hit the east coast of Japan on 11 March 2011 caused around 19,000 deaths plus economic damage estimated at US$300 billion. We hope our simulation tool will secure wide rollout around the world and will be used to inform decision-making and boost resilience to these frequently devastating events." | Earthquakes | 2,017
March 7, 2017 | https://www.sciencedaily.com/releases/2017/03/170307130842.htm | Fault system off San Diego, Orange, Los Angeles counties could produce magnitude 7.3 earthquake | A fault system that runs from San Diego to Los Angeles is capable of producing up to magnitude 7.3 earthquakes if the offshore segments rupture, and a 7.4 if the southern onshore segment also ruptures, according to an analysis led by Scripps Institution of Oceanography at the University of California San Diego. | The Newport-Inglewood and Rose Canyon faults had been considered separate systems, but the study shows that they are actually one continuous fault system running from San Diego Bay to Seal Beach in Orange County, then on land through the Los Angeles basin.
"This system is mostly offshore but never more than four miles from the San Diego, Orange County, and Los Angeles County coast," said study lead author Valerie Sahakian, who performed the work during her doctorate at Scripps and is now a postdoctoral fellow with the U.S. Geological Survey. "Even if you have a high 5- or low 6-magnitude earthquake, it can still have a major impact on those regions which are some of the most densely populated in California."
The study, "Seismic constraints on the architecture of the Newport-Inglewood/Rose Canyon fault: Implications for the length and magnitude of future earthquake ruptures," appears in a journal of the American Geophysical Union.
The researchers processed data from previous seismic surveys and supplemented it with high-resolution bathymetric data gathered offshore by Scripps researchers between 2006 and 2009, and seismic surveys conducted aboard former Scripps research vessels New Horizon and Melville in 2013. The disparate data have different resolution scales and depths of penetration, providing a "nested survey" of the region. This nested approach allowed the scientists to define the fault architecture at an unprecedented scale and thus to create magnitude estimates with more certainty.
They identified four segments of the strike-slip fault that are broken up by what geoscientists call stepovers, points where the fault is horizontally offset. Scientists generally consider stepovers wider than three kilometers more likely to inhibit ruptures along entire faults and instead contain them to individual segments -- creating smaller earthquakes. Because the stepovers in the Newport-Inglewood/Rose Canyon (NIRC) fault are two kilometers wide or less, the Scripps-led team considers a rupture of all the offshore segments possible, said study co-author Scripps geologist and geophysicist Neal Driscoll.
The team used two estimation methods to derive the maximum potential of a rupture of the entire fault, including both onshore and offshore portions. Both methods yielded estimates between magnitude 6.7 and magnitude 7.3 to 7.4 (an illustrative rupture-length scaling is sketched after this article).
The fault system most famously hosted a 6.4-magnitude quake in Long Beach, Calif. that killed 115 people in 1933. Researchers have found evidence of earlier earthquakes of indeterminate size on onshore portions of the fault, finding that at the northern end of the fault system, there have been between three and five ruptures in the last 11,000 years.
At the southern end, there is evidence of a quake that took place roughly 400 years ago and little significant activity for 5,000 years before that.
Driscoll has recently collected long sediment cores along the offshore portion of the fault to date previous ruptures along the offshore segments, but the work was not part of this study.
In addition to Sahakian and Driscoll, study authors include Jayne Bormann, Graham Kent, and Steve Wesnousky of the Nevada Seismological Laboratory at the University of Nevada, Reno, and Alistair Harding of Scripps. Southern California Edison funded the research at the direction of the California Energy Commission and the California Public Utilities Commission.
"Further study is warranted to improve the current understanding of hazard and potential ground shaking posed to urban coastal areas from Tijuana to Los Angeles from the NIRC fault," the study concludes. | Earthquakes | 2,017
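The article does not spell out the two estimation methods, but a common way to turn rupture length into magnitude is an empirical scaling such as the Wells and Coppersmith (1994) strike-slip regression, M = 5.16 + 1.12 * log10(L). The sketch below uses that relation with hypothetical segment lengths, not the paper's measured values, and lands in the same 6.7 to 7.4 range.
```python
# Empirical magnitude-from-rupture-length scaling (Wells & Coppersmith
# 1994, strike-slip): M = 5.16 + 1.12 * log10(L_km). Segment lengths
# here are hypothetical placeholders for the four NIRC segments.
import math

segments_km = [25.0, 30.0, 20.0, 35.0]

def magnitude(rupture_length_km):
    return 5.16 + 1.12 * math.log10(rupture_length_km)

print(f"single segment rupture: M ~ {magnitude(segments_km[0]):.1f}")   # ~6.7
print(f"full fault rupture:     M ~ {magnitude(sum(segments_km)):.1f}") # ~7.4
```
This is why narrow stepovers matter: if ruptures can jump them, the effective rupture length, and hence the maximum magnitude, grows.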
March 2, 2017 | https://www.sciencedaily.com/releases/2017/03/170302091051.htm | Going deep to learn the secrets of Japan's earthquakes | The 11 March 2011 Tohoku-Oki earthquake was the largest and most destructive in the history of Japan. Japanese researchers -- and their Norwegian partners -- are hard at work trying to understand just what made it so devastating. | The massive earthquake that rocked Japan on 11 March 2011 killed more than 20,000 people, making it one of the most deadly natural disasters in the country's history. Virtually all of the victims drowned in a tsunami that in places was more than 30 metres high.
The tsunami also crippled the Fukushima Daiichi nuclear power plant, causing meltdowns in three of the plant's six reactors and releasing record amounts of radiation to the ocean. The reactors were so unstable at one point that the former Prime Minister, Naoto Kan, later admitted he considered evacuating 50 million people from the greater Tokyo region. Eventually, 160,000 people had to leave their homes because of radiation.
This national disaster, Japan's largest-ever earthquake, was a call to action for Japanese earth scientists. Their mission: to understand exactly what happened to make this quake so destructive. For this, they turned to JAMSTEC, the Japan Agency for Marine-Earth Science and Technology, to probe the secrets in the 7000-metre-deep Japan Trench, the epicentre of the temblor.
In the five years since the disaster, researchers have found intriguing clues as to what made the quake so dangerous. Norwegian petroleum expertise from working on the Norwegian Continental Shelf is now helping to uncover new details as scientists continue to try to understand what factors contribute to making an earthquake in this region really big. In doing so, they hope to be able to better predict the magnitude and location of future quakes and tsunamis.
Japan sits in what may be one of the most dangerous places possible when it comes to earthquakes. The northern part of the country lies on a piece of the North American plate, whereas the southern part of the country sits on the Eurasian plate. In the north, the Pacific plate is sliding underneath the North American plate, while to the south, the Eurasian plate is riding over the Philippine Sea plate. When one plate moves in relation to another, the movement can trigger an earthquake and tsunami.
The complex jumble of tectonic plates explains why roughly 1,500 earthquakes rattle the country every year, and why it is home to 40 active volcanoes -- 10 per cent of the world's total.
Given that Japan experiences so many earthquakes, the quake that shook the country on the afternoon of 11 March wasn't completely unexpected. In fact, researchers had predicted that the region would see an earthquake of 7.5 magnitude or more over the next 30 years.
Earthquakes are routine enough in Japan that the country has strict building codes to prevent damage. Most large buildings wriggle and sway with the shaking of the earth -- one man in Tokyo told the BBC that the movements in his workplace skyscraper during the 2011 quake were so strong he felt seasick -- and even the Fukushima Daiichi nuclear plant was protected by a 10-metre-high seawall.
Yet some combination of factors made the Tohoku-Oki earthquake bigger, and its tsunami more deadly, than scientists expected. But what?
"This is what we want to understand -- and to mitigate," said Shin'ichi Kuramoto, Director General for the Center for Deep Earth Exploration at JAMSTEC.
"Why do these big earthquakes occur?"JAMSTEC researchers mobilized almost immediately after the disaster, and sent their 106-metre-long research vessel RV Kairei to the quake's epicentre just a few days after it occurred.For a little over two weeks, the ship cruised over the Japan Trench off the coast of Honshu. The purpose was to create a bathymetric picture of the sea bottom and to collect reflection seismic data, which allows researchers to peer into the sediments and rocks underneath the seafloor.A subsequent cruise by JAMSTEC's RV Kaiyo 7-8 months after the earthquake collected additional high-resolution reflection seismic images in the area. Fortunately, the researchers also had data from a similar study that had been done in 1999 in the same region.The data showed them that the landward seafloor in the trench area slipped as much as 50 metres horizontally, said Yasuyuki Nakamura, Deputy Group leader in JAMSTEC's Center for Earthquake and Tsunami Structural Seismology Group."This was a big slip in the trench axis area," he said. "For comparison, the 1995 Kobe earthquake, which killed more than 6000 people and was a magnitude 7.3, had an average slip of 2 metres."Another magnitude 8 earthquake in 1946 in the Nankai area in southern Japan that destroyed 36,000 homes had a maximum slip of 10 metres, Nakamura said."So you can see that 50 metres is a very huge slip," he said. That in itself partly explains why the tsunami wave was so big, he said.When Martin Landrø, a geophysicist at the Norwegian University of Science and Technology (NTNU), read about the Japanese earthquake and learned that his Japanese counterparts had collected seismic data from both before and after the quake, he thought he might be able to offer some help.For more than 20 years, Landrø has worked with interpreting and visualizing seismic data. Oil companies and geophysicists routinely use this approach to collect information about the geology under the seafloor. Landrø has studied everything from putting seismic data to work to discover new undersea oil reservoirs to visualizing what happens to COIt works like this: a ship sails along a straight line for 100 kilometres or more, and uses airguns to send an acoustic signal every 50 metres while the ship sails along. The ship also tows a long cable behind it to record the acoustic signals that are reflected back by the sediments and bedrock under the sea floor. Simply stated, harder materials reflect signals back more quickly than softer materials.Geologists can create a two-dimensional image, a cross section of the geology under the sea floor, by towing one long cable behind a ship. A three-dimensional image can be created by towing a number of cables with sensors on them and essentially combining a series of two-dimensional images into a three-dimensional one.A very special type of seismic data, however, is called 4D, where the fourth dimension is time. Here, geophysicists can combine 2D images from different time periods, or 3D images from different time periods to see how an area has changed over time. It can be highly complex, especially if different systems have been used to collect the seismic data from the two different time periods. But 4D seismic analysis is Landrø's special expertise.Landrø contacted Shuichi Kodaira, director of JAMSTEC's Center for Earthquake and Tsunami, and said that he wanted to see if some of the techniques that had been used for petroleum-related purposes could be used to understand stress changes related to earthquakes. 
Kodaira agreed.
Then it was just a matter of getting the data and "reprocessing it," Landrø said, to make the two different time periods as comparable as possible.
"We could then estimate movements and changes caused by the earthquake at the seabed and below the seabed," Landrø said.
After nearly a year of working remotely together on the data, Landrø and his Norwegian colleagues flew to Japan in November 2016 to meet their Japanese counterparts for the first time. They're now in the process of jointly writing a scientific paper for publication, so he is reluctant to describe their new findings in detail before they are published.
"The ultimate goal here is to understand what happened during the earthquake in as detailed a way as possible. The big picture is more or less the same," Landrø said. "It is more like we are looking at minor details that might be important, using a technique that has been used in the oil industry for many years. Maybe we will see some details that haven't been seen before."
Landrø is also interested in a system that JAMSTEC has installed in the ocean off the southern part of the country, called the Dense Oceanfloor Network system for Earthquakes and Tsunamis, more commonly known as DONET.
The DONET system (of which there are now two) is a series of linked pressure sensors installed on the ocean floor in the Nankai Trough, an area that has been hit by repeated dangerous earthquakes, JAMSTEC's Nakamura said.
The Nankai Trough is located where the Philippine Sea plate is sliding under the Eurasian plate at a rate of about 4 cm per year. In general, there have been large earthquakes along the trough every 100 to 150 years.
DONET 1 also includes a series of seismometers, tilt meters and strain indicators that were installed in a pit 980 metres below a known earthquake centre in the Nankai Trough. The sensors from the pit and from the seafloor above are all linked in a network of cables that sends real-time observations to monitoring stations and to local governments and businesses.
Essentially, if there is movement big enough to cause an earthquake and tsunami, the sensors will report it. JAMSTEC researchers have conducted studies that show that the DONET network could detect a coming tsunami as much as 10 to 15 minutes earlier than land-based detection stations along the coast. Those extra minutes could mean saving thousands of lives.
"One of the main purposes here is to provide a tsunami early warning system," Nakamura said. "We've been collaborating with local governments to establish this."
The DONET approach, or some variation of it, might also be useful in the future as Norway and other countries explore using oil reservoirs to store CO2. One of the biggest concerns about storing CO2 in subsea reservoirs is monitoring the storage area to make certain the CO2 stays where it is injected.
Landrø also says he thinks that techniques from 4D seismic imaging could be used with the data collected by all the DONET sensors to obtain a better understanding of how the area is changing over time.
DONET "is passive data, listening to the rock," Landrø said. "But here you could also use some of the same techniques as for 4D analysis to learn more." | Earthquakes | 2,017
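A minimal sketch of the 4D idea: after the two survey vintages are reprocessed to be comparable, differences between them, such as a small time shift of a reflector, reveal movement at depth. The synthetic traces and the 20 ms shift below are invented for illustration; this is not the team's processing code.
```python
# Measure the time shift of a reflector between a baseline survey and a
# later monitor survey by cross-correlation -- the basic 4D comparison.
import numpy as np

t = np.linspace(0.0, 2.0, 1000)                     # 2 s of data
baseline = np.exp(-((t - 1.00) / 0.03) ** 2)        # reflection at 1.00 s (earlier survey)
monitor  = np.exp(-((t - 1.02) / 0.03) ** 2)        # same reflector 20 ms later (post-quake)

xcorr = np.correlate(monitor, baseline, mode="full")
lag = xcorr.argmax() - (len(t) - 1)                 # positive lag = monitor arrives later
print(f"estimated shift: {lag * (t[1] - t[0]) * 1e3:.1f} ms")
```
In practice the hard part is exactly what Landrø describes: reprocessing two vintages acquired with different equipment so that any residual difference reflects the Earth, not the surveys.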
March 1, 2017 | https://www.sciencedaily.com/releases/2017/03/170301130521.htm | 2017 forecast: Significant chance of earthquake damage in the Central and Eastern US | A one-year seismic hazard model for 2017, from the U.S. Geological Survey, forecasts lower damaging ground shaking levels in the central and eastern U.S. compared to the previous forecast, in areas where there have been numerous earthquakes induced by wastewater disposal from industrial activities. | Despite the recent drop in earthquake rates, Oklahoma and southern Kansas still face a significant risk of induced earthquake damage in 2017, according to the USGS report published March 1 in the journal Seismological Research Letters.
For more than 3 million people in Oklahoma and southern Kansas, the chance of damage in the next year from induced earthquakes is similar to that of natural earthquakes in high-hazard areas of California, the report concludes. Ground shaking caused by a quake is considered damaging if it is strong enough to crack plaster and weak masonry. (A simple rate-to-probability sketch follows this article.)
The 2017 forecast is the follow-up to a similar report in 2016, which was the first to consider seismic hazards from both induced and natural earthquakes in the central and eastern U.S. over a one-year timeframe. USGS' Mark Petersen and colleagues found that there were lower rates of earthquakes in 2016 compared to 2015 in five key areas: Oklahoma-Kansas, the Raton Basin along the Colorado-New Mexico border, north Texas, north Arkansas, and the New Madrid seismic zone (which ruptures in natural earthquakes) extending from Illinois to Mississippi.
The decreased rate may be due to a decrease in wastewater injection from oil and gas production, the USGS researchers noted. When wastewater produced by oil and gas production is returned to the ground, it creates changes in pressure along faults, unclamping them and allowing them to slip. Seismologists think that the volume of wastewater, along with the rate at which it is injected back into the ground, are important factors in whether an earthquake is triggered by the activity.
Wastewater injection in this region may have decreased in 2016 due to new regulations for its disposal, or slowed due to lower oil prices and less overall production.
"We understand, for example, that there were industry regulations introduced in Oklahoma [in 2016] by the Oklahoma Corporation Commission that reduced the amount of injection in some areas by up to 40 percent," Petersen explained.
Last year's forecast performed well in many respects, said Petersen. In Oklahoma, for instance, all 21 of the magnitude 4 or larger earthquakes in the catalog occurred within the area designated as the highest hazard area of the 2016 forecast. Oklahoma experienced three M 5+ earthquakes in 2016, including the magnitude 5.8 earthquake near Pawnee that was the largest earthquake ever recorded in the state.
The model also correctly forecasted that there would be damaging shaking in the Raton Basin, where two magnitude 4 or larger quakes took place in 2016.
In north Texas and north Arkansas, however, where the researchers expected damaging induced earthquakes, there were no earthquakes in 2016 larger than magnitude 2.7.
Petersen said the USGS team will be working with researchers in north Texas to find out whether there were any changes in injection practices in the area in the past year.
Petersen said the 2016 hazard forecast has been used by risk modelers, the U.S. Army Corps of Engineers, state geological surveys, and city and county emergency managers, among others.
The one-year forecasts may become less relevant if induced earthquake rates continue to fall as the result of wastewater injection regulations, Petersen said. "But as long as the earthquake rates in the central and eastern U.S. remain elevated and cause a significant chance for damaging ground shaking, it will be important to make these types of forecasts. Continuing collaborations between regulators, industry, and scientists will be important in reducing hazard, improving forecasts, and enhancing preparedness." | Earthquakes | 2,017
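One way to read "chance of damage in the next year": if damaging shaking at a site occurs at some annual rate, a Poisson occurrence model (a standard hazard assumption, not necessarily the exact USGS procedure) converts that rate into a one-year probability. The rates below are made up for illustration.
```python
# Convert an annual rate of damaging shaking into a one-year probability
# under a Poisson model: P(at least one event) = 1 - exp(-rate * 1 yr).
import math

def one_year_probability(annual_rate):
    return 1.0 - math.exp(-annual_rate)

for site, rate in [("illustrative induced-seismicity zone", 0.12),
                   ("illustrative California high-hazard site", 0.10)]:
    print(f"{site}: {100.0 * one_year_probability(rate):.1f}% chance in one year")
```
Under this framing, "similar to California" simply means the annual rates of damaging shaking, and hence the one-year probabilities, are comparable.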
February 27, 2017 | https://www.sciencedaily.com/releases/2017/02/170227120347.htm | Earth probably began with a solid shell | Today's Earth is a dynamic planet with an outer layer composed of giant plates that grind together, sliding past or dipping beneath one another, giving rise to earthquakes and volcanoes. Others separate at undersea mountain ridges, where molten rock spreads out from the centers of major ocean basins. | But new research suggests that this was not always the case. Instead, shortly after Earth formed and began to cool, the planet's first outer layer was a single, solid but deformable shell. Later, this shell began to fold and crack more widely, giving rise to modern plate tectonics.
The research is described in a paper published February 27, 2017, in the journal Nature.
"Models for how the first continental crust formed generally fall into two groups: those that invoke modern-style plate tectonics and those that do not," said Michael Brown, a professor of geology at the University of Maryland and a co-author of the study. "Our research supports the latter -- a 'stagnant lid' forming the planet's outer shell early in Earth's history."
To reach these conclusions, Brown and his colleagues from Curtin University and the Geological Survey of Western Australia studied rocks collected from the East Pilbara Terrane, a large area of ancient granitic crust located in the state of Western Australia. Rocks here are among the oldest known, ranging from 3.5 to about 2.5 billion years of age. (Earth is roughly 4.5 billion years old.) The researchers specifically selected granites with a chemical composition usually associated with volcanic arcs -- a telltale sign of plate tectonic activity.
Brown and his colleagues also looked at basalt rocks from the associated Coucal formation. Basalt is the rock produced when volcanoes erupt, but it also forms the ocean floor, as molten basalt erupts at spreading ridges in the center of ocean basins. In modern-day plate tectonics, when ocean floor basalt reaches the continents, it dips -- or subducts -- beneath the Earth's surface, where it generates fluids that allow the overlying mantle to melt and eventually create large masses of granite beneath the surface.
Previous research suggested that the Coucal basalts could be the source rocks for the granites in the Pilbara Terrane, because of the similarities in their chemical composition. Brown and his collaborators set out to verify this, but also to test another long-held assumption: could the Coucal basalts have melted to form granite in some way other than subduction of the basalt beneath Earth's surface? If so, perhaps plate tectonics was not yet happening when the Pilbara granites formed.
To address this question, the researchers performed thermodynamic calculations to determine the phase equilibria of average Coucal basalt. Phase equilibria are precise descriptions of how a substance behaves under various temperature and pressure conditions, including the temperature at which melting begins, the amount of melt produced and its chemical composition.
For example, one of the simplest phase equilibria diagrams describes the behavior of water: at low temperatures and/or high pressures, water forms solid ice, while at high temperatures and/or low pressures, water forms gaseous steam.
Phase equilibria get a bit more involved with rocks, which have complex chemical compositions that can take on very different mineral combinations and physical characteristics based on temperature and pressure.
"If you take a rock off the shelf and melt it, you can get a phase diagram. But you're stuck with a fixed chemical composition," Brown said. "With thermodynamic modeling, you can change the composition, pressure and temperature independently. It's much more flexible and helps us to answer some questions we can't address with experiments on rocks."
Using the Coucal basalts and Pilbara granites as a starting point, Brown and his colleagues constructed a series of modeling experiments to reflect what might have transpired in an ancient Earth without plate tectonics. Their results suggest that, indeed, the Pilbara granites could have formed from the Coucal basalts.
More to the point, this transformation could have occurred in a pressure and temperature scenario consistent with a "stagnant lid," or a single shell covering the entire planet.
Plate tectonics substantially affects the temperature and pressure of rocks within Earth's interior. When a slab of rock subducts under the Earth's surface, the rock starts off relatively cool and takes time to gain heat. By the time it reaches a higher temperature, the rock has also reached a significant depth, which corresponds to high pressure -- in the same way a diver experiences higher pressure at greater water depth.
In contrast, a "stagnant lid" regime would be very hot at relatively shallow depths and low pressures. Geologists refer to this as a "high thermal gradient."
"Our results suggest the Pilbara granites were produced by melting of the Coucal basalts or similar materials in a high thermal gradient environment," Brown said. "Additionally, the composition of the Coucal basalts indicates that they, too, came from an earlier generation of source rocks. We conclude that a multi-stage process produced Earth's first continents in a 'stagnant lid' scenario before plate tectonics began." | Earthquakes | 2,017
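Picking up the article's own water example, the sketch below maps a (temperature, pressure) pair to a phase. Real phase boundaries are curved and follow relations like Clausius-Clapeyron; the flat 0 C cutoff and the linear boiling-point slope here are deliberate simplifications for illustration only.
```python
# Toy phase-equilibria lookup for water: low T -> ice, high T and low P
# -> steam, in between -> liquid. The linear boiling-point correction is
# an illustrative stand-in for a real, curved phase boundary.
def water_phase(temp_c, pressure_atm=1.0):
    boiling_c = 100.0 + 25.0 * (pressure_atm - 1.0)  # crude, illustrative slope
    if temp_c <= 0.0:
        return "ice"
    return "liquid" if temp_c < boiling_c else "steam"

for T, P in [(-5.0, 1.0), (20.0, 1.0), (120.0, 1.0), (120.0, 2.0)]:
    print(f"T = {T:6.1f} C, P = {P:3.1f} atm -> {water_phase(T, P)}")
```
The real calculations do the analogous thing for basalt, with many minerals and a variable bulk composition, which is why thermodynamic modeling beats melting rocks one at a time.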
February 27, 2017 | https://www.sciencedaily.com/releases/2017/02/170227120333.htm | Study opens new questions on how the atmosphere and oceans formed | A new study led by The Australian National University (ANU) has found seawater cycles throughout Earth's interior down to 2,900 km, much deeper than previously thought, reopening questions about how the atmosphere and oceans formed. | A popular theory is that the atmosphere and oceans formed by releasing water and gases from Earth's mantle through volcanic activity during the planet's first 100 million years.
But lead researcher Dr Mark Kendrick from ANU said the new study provided evidence to question this theory.
"Our findings make alternative theories for the origin of the atmosphere and oceans equally plausible, such as icy comets or meteorites bringing water to Earth," said Dr Kendrick from the ANU Research School of Earth Sciences.
Seawater is introduced into Earth's interior when two tectonic plates converge and one plate is pushed underneath the other into the mantle.
The study has overturned the notion that seawater only makes it about 100 km into the mantle before it is returned to Earth's surface through volcanic arcs, such as those forming the Pacific Ring of Fire that runs through the western Americas, Japan and Tonga.
The team analysed samples of volcanic glass from the Atlantic, Pacific and Indian oceans that contained traces of seawater that had been deeply cycled throughout Earth's interior.
"The combination of water and halogens found in the volcanic glasses enables us to preclude local seawater contamination and conclusively prove the water in the samples was derived from the mantle," Dr Kendrick said. | Earthquakes | 2,017
February 27, 2017 | https://www.sciencedaily.com/releases/2017/02/170227082217.htm | Better communication key to cutting earthquake death toll, experts say | Communicating earthquake risk has long been a major challenge for scientists. Yet the right messages at the right time can and will save lives, say U.S. communication scholars in a recently published article. | A major problem is that scientists are unable to predict when, where, and with what strength the next earthquake will strike. Instead, they use 'probabilistic forecasting' based on seismic clustering. Earthquake experts have long grappled with the problem of how to convey these complex probabilities to lay persons.
The tragic 2009 L'Aquila earthquake in Italy highlighted the difficult task facing scientists when communicating risk and uncertainty. Poor risk communication about the tremors that preceded the deadly quake led to widespread misunderstanding and confusion among the general public. The consequences were devastating. This crystallized the need for operational earthquake forecasting (OEF) scientists to change what and how they communicate with one another and the public.
In this study, U.S. researchers, led by Deanna Sellnow, a communication professor from the University of Central Florida, examined the impact of the L'Aquila earthquake on the international scientific earthquake community of practice (CoP). Key tasks included a review of the failed communication surrounding the crisis and a detailed analysis of an OEF Decision Making workshop held in June 2014.
The findings showed a significant shift in the earthquake scientists' approach to communication. They transformed their goal from being focused solely on probabilistic modelling to actively forming strong partnerships with a diverse range of experts, including risk communication experts.
By involving a range of interdisciplinary partners, the OEF CoP developed a clear, evidence-based, practical approach to improve risk communication and protect public safety during earthquakes and other natural disasters. Key recommendations include:
2) Developing simple and precise public warning messages that are less likely to be misunderstood, and ensuring message alerts are timely and delivered through multiple communication sources and channels.
3) Minimizing the potential negative impact of inaccurate and misleading messages by issuing corrections or clarifications promptly.
Sellnow writes, "This research confirms the importance of translating science into accurate and comprehensible messages delivered to non-scientific publics. The expanded community of practice that emerged as a result of the [L'Aquila] risk communication failure, which now includes communication social science experts, can serve as a model for other scientific communities that also may need to translate their knowledge effectively to disparate non-scientific publics." | Earthquakes | 2,017
February 22, 2017 | https://www.sciencedaily.com/releases/2017/02/170222113802.htm | Insight into a physical phenomenon that leads to earthquakes | Scientists have gotten better at predicting where earthquakes will occur, but they're still in the dark about when they will strike and how devastating they will be. | In the search for clues that will help them better understand earthquakes, scientists at the University of Pennsylvania are studying a phenomenon called ageing. In ageing, the longer that materials are in contact with each other, the more force is required to move them. This resistance is called static friction. The longer something, such as a fault, is sitting still, the more static friction builds up and the stronger the fault gets.
Even when the fault remains still, tectonic motion is still occurring; stress builds up in the fault as the plates shift until finally they shift so much that they exceed the static friction force and begin to slide. Because the fault grew stronger with time, the stress can build up to large levels, and a huge amount of energy is then released in the form of a powerful quake.
"This ageing mechanism is critical in underlying the unstable behavior of faults that lead to earthquakes," said Robert Carpick, the John Henry Towne Professor and chair of the Department of Mechanical Engineering and Applied Mechanics in Penn's School of Engineering and Applied Science. "If you didn't have ageing, then the fault would move very easily and so you'd get much smaller earthquakes happening more frequently, or maybe even just smooth motion. Ageing leads to the occurrence of infrequent, large earthquakes that can be devastating."
Scientists have been studying the movement of faults and ageing in geological materials at the macroscale for decades, producing phenomenological theories and models to describe their experimental results. But there's a problem when it comes to these models.
"The models are not fundamental, not physically based, which means we cannot derive those models from basic physics," said Kaiwen Tian, a graduate student in Penn's School of Arts & Sciences.
But a Penn-based project seeks to understand the friction of rocks from a more physical point of view at the nanoscale. That effort is described in the team's most recent paper. The research was led by Tian and Carpick. David Goldsby, an associate professor in the Department of Earth and Environmental Science at Penn; Izabela Szlufarska, a professor of materials science and engineering at the University of Wisconsin-Madison; UW alumnus Yun Liu; and Nitya Gosvami, now an assistant professor in the Department of Applied Mechanics at IIT Delhi, also contributed to the study.
Previous work from the group found that static friction is logarithmic with time. That means that if materials are in contact for 10 times longer, then the friction force required to move them doubles. While scientists had seen this behavior of rocks and geological materials at the macroscopic scale, these researchers observed it at the nanoscale.
In this new study, the researchers varied the amount of normal force on the materials to find out how load affects the ageing behavior.
"That's a very important question because load may have two effects," Tian said. "If you increase load, you will increase contact area. It may also affect the local pressure."
To study this, the researchers used an atomic force microscope to investigate bonding strength where two surfaces meet. They used silicon oxide because it is a primary component of many rock materials.
Using the small nanoscale tip of the AFM ensures that the interface is composed of a single contact point, making it easier to estimate the stresses and contact area.
They brought a nanoscale tip made from silicon oxide into contact with a silicon oxide sample and held it there. After enough time passed, they slid the tip and measured the force required to initiate sliding. Carpick said this is analogous to putting a block on the floor, letting it sit for a while, and then pushing it and measuring how much force it takes for the block to start moving.
They observed what happened when they pushed harder in the normal direction, increasing the load. They found that when they doubled the normal force, the friction force required also doubled.
Explaining it required looking very carefully at the mechanism leading to this increase in friction force.
"The key," Carpick said, "is we showed in our results how the dependence of the friction force on the holding time and the dependence of the friction force on the load combine. This was consistent with a model that assumes that the friction force is going up because we're getting chemical bonds forming at the interface, so the number of those bonds increases with time. And, when we push harder, what we're doing is increasing the area of contact between the tip and the sample, causing friction to go up with normal force."
Prior to this research, it had been suggested that pushing harder might also cause those bonds to form more easily. The researchers found that this wasn't the case: to a good approximation, increasing the normal force simply increases the amount of contact and the number of sites where atoms can react.
Currently, the group is looking at what happens when the tip sits on the sample for very short amounts of time. Previously they had been looking at hold times from one-tenth of a second to as much as 100 seconds. But now they're looking at timescales even shorter than one-tenth of a second.
By looking at very short timescales, they can gain insights into the details of the energetics of the chemical bonds to see if some bonds can form easily and if others take longer to form. Studying bonds that form easily is important because those are the first bonds to form and might provide insight into what happens at the very beginning of the contact.
In addition to providing a better understanding of earthquakes, this work could lead to more efficient nano-devices. Because many micro- and nano-devices are made from silicon, understanding friction is key to getting those devices to function more smoothly.
But, most important, the researchers hope that somewhere down the line, a better understanding of ageing will enable them to predict when earthquakes will occur.
"Earthquake locations can be predicted fairly well," Carpick said, "but when an earthquake is going to happen is very difficult to predict, and this is largely because there's a lack of physical understanding of the frictional mechanisms behind the earthquakes. We have a long way to go to connect this work to earthquakes. However, this work gives us more fundamental insights into the mechanism behind this ageing and, in the long term, we think these kinds of insights could help us predict earthquakes and other frictional phenomena better."
This research was supported by a grant from the Earth Sciences Division of the National Science Foundation. | Earthquakes | 2,017
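The two observations, tenfold longer contact doubling the static friction and doubled load doubling it, are captured by a common log-linear ageing law of the form F(t, N) = mu0 * N * (1 + beta * log10(t/t0)). With beta = 1, friction doubles between the reference hold t0 and 10*t0, and F scales linearly with load N. The parameter values below are illustrative, not fitted to the Penn data.
```python
# Log-linear static-friction ageing: F(t, N) = mu0 * N * (1 + beta*log10(t/t0)).
# Note the doubling holds relative to the reference hold time t0; units are
# arbitrary but consistent (e.g., load and force in nanonewtons).
import math

def static_friction(hold_time_s, load, mu0=0.3, beta=1.0, t0=0.1):
    return mu0 * load * (1.0 + beta * math.log10(hold_time_s / t0))

print(static_friction(0.1, 10.0))   # reference hold, load 10   -> 3.0
print(static_friction(1.0, 10.0))   # 10x longer hold           -> 6.0 (doubled)
print(static_friction(0.1, 20.0))   # 2x load at reference hold -> 6.0 (doubled)
```
The linear-in-N behavior is what the team's bond-formation picture predicts: more load means more contact area, hence more sites where interfacial bonds can form.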
February 14, 2017 | https://www.sciencedaily.com/releases/2017/02/170214104252.htm | Ventura fault could cause stronger shaking, new research finds | A new study by a team of researchers, including one from the University of California, Riverside, found that the fault under Ventura, Calif., would likely cause stronger shaking during an earthquake and more damage than previously suspected. | The Ventura-Pitas Point fault in southern California has been the focus of a lot of recent attention because it is thought to be capable of magnitude 8 earthquakes. It underlies the city of Ventura and runs offshore, and thus may be capable of generating tsunamis.
Since it was identified as an active and potentially dangerous fault in the late 1980s, there has been a controversy about its location and geometry underground, with two competing models.
Originally, researchers assumed the fault was planar and steeply dipping, like a sheet of plywood positioned against a house, to a depth of about 13 miles. But a more recent study, published in 2014, suggested the fault had a "ramp-flat geometry," with a flat section between two tilting sections, similar to a portion of a staircase.
In a recently published paper, the researchers tested the two geometries against each other using computer models. In these computer models, the crust -- the outermost layer of rock -- in the Ventura-Santa Barbara region is represented as a three-dimensional volume, with the surfaces of the region's faults as weaknesses within it. That volume is then "squeezed" at the rate and direction that the region is being squeezed by plate tectonics. In comparisons of the expected movement in the models with GPS data, the fault with the staircase-like structure was favored.
That means more of the fault, which runs westward 60 miles from the city of Ventura, through the Santa Barbara Channel, and beneath the cities of Santa Barbara and Goleta, is closer to the surface. That would likely cause stronger shaking during an earthquake and more damage.
"Our models confirm that the Ventura-Pitas Point fault is a major fault that lies flat under much of the coast between Ventura and Santa Barbara," said Gareth Funning, an associate professor of geophysics at UC Riverside, one of the authors of the study. "This means that a potential source of large earthquakes is just a few miles beneath the ground in those cities. We would expect very strong shaking if one occurred."
Future research will address the consequences of there being a fault ramp under Ventura. Researchers now can run more accurate simulations based on the ramp model to predict where the shaking will be strongest, and whether they would expect a tsunami. | Earthquakes | 2,017
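The model test described above amounts to a simple selection rule: drive each candidate fault geometry with the same tectonic squeeze, predict surface velocities, and favor the geometry that better matches GPS. The sketch below shows that logic with fabricated numbers; none of the velocities are from the study.
```python
# Compare two candidate fault-geometry models by RMS misfit between
# their predicted surface velocities and GPS observations (all values
# fabricated for illustration).
import numpy as np

gps_observed = np.array([2.1, 3.4, 4.0, 3.2, 2.5])          # mm/yr at 5 stations
predicted = {
    "planar, steeply dipping": np.array([1.2, 2.0, 2.6, 2.1, 1.5]),
    "ramp-flat (staircase)":   np.array([2.0, 3.3, 4.2, 3.0, 2.4]),
}
for name, p in predicted.items():
    rms = float(np.sqrt(np.mean((p - gps_observed) ** 2)))
    print(f"{name}: RMS misfit = {rms:.2f} mm/yr")
# The lower-misfit geometry (here, the staircase model) is favored.
```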
February 14, 2017 | https://www.sciencedaily.com/releases/2017/02/170214130441.htm | Seismicity in British Columbia and hidden continent called Zealandia | The science and information magazine of The Geological Society of America, GSA Today, presents two new articles. | Seismicity in the Pacific Northwest is well documented and includes recent seismic activity on fault systems within the Juan de Fuca Strait. However, the seismic potential of crustal faults within the forearc of the northern Cascadia subduction zone in British Columbia has remained elusive. This article by Kristin Morell, Christine Regalla, Lucinda J. Leonard, and Vic Levson presents evidence for earthquake surface ruptures along the Leech River fault, a prominent crustal fault near Victoria, British Columbia. The authors use LiDAR and field data to identify linear scarps, sags, and swales that cut across both bedrock and Quaternary deposits along the Leech River fault. Displacement data indicate that the Leech River fault has experienced at least two surface-rupturing earthquakes since deglaciation following the last glacial maximum ca. 15 ka. The history of multiple Quaternary ruptures along the Leech River fault zone suggests that it is capable of producing earthquakes of MW >6. This active fault zone lies within tens of kilometers of downtown Victoria, British Columbia, Canada, and is in close proximity to three local water dams. Thus, the authors' identification of a significant shallow seismic source has considerable implications for the seismic risk exposure of this populated region. Most of us view the continents and oceans as discrete entities of land and water across Earth's surface. However, even a cursory look at our world establishes the problem. Are North America and South America truly separate continents with their connection through the Isthmus of Panama? Where and why does one distinguish Europe, Africa, and Asia considering the Bosphorus and Sinai Peninsula? One might suggest a geological reason: Continents are large, identifiable areas underlain by continental crust. The article by Nick Mortimer, Hamish J. Campbell, Andy J. Tulloch, Peter R. King, Vaughan M. Stagpoole, Ray A. Wood, Mark S. Rattenbury, Rupert Sutherland, Chris J. Adams, Julien Collot, and Maria Seton follows this idea, but then throws a fascinating twist on the subject: Zealandia. One only needs to look at a bathymetric map, where ocean water is removed, to appreciate the issue. Several islands, notably New Zealand and New Caledonia, are connected by submerged continental crust across a large area of Earth's surface. This mostly underwater continent is geologically separate and distinct from Australia and Antarctica, and as highlighted by Mortimer and colleagues, should be treated as such. Basically, from a well-reasoned geoscience perspective, Earth has well-established continents, but also an extra one, mostly underwater. | Earthquakes | 2,017
February 13, 2017 | https://www.sciencedaily.com/releases/2017/02/170213090756.htm | Scientists uncover huge 1.8 million square kilometers reservoir of melting carbon under Western United States | New research published in Earth and Planetary Science Letters describes how scientists have used the world's largest array of seismic sensors to map a deep-Earth area of melting carbon covering 1.8 million square kilometres. Situated under the Western US, 350km beneath Earth's surface, the discovered melting region challenges accepted understanding of how much carbon Earth contains -- much more than previously understood. | The study, conducted by geologists at Royal Holloway, University of London's Department of Earth Sciences, used a huge network of 583 seismic sensors that measure Earth's vibrations to create a picture of the area's deep subsurface. Known as the upper mantle, this section of Earth's interior is recognised by its high temperatures, where solid carbonates melt, creating very particular seismic patterns. "It would be impossible for us to drill far enough down to physically 'see' Earth's mantle, so using this massive group of sensors we have to paint a picture of it using mathematical equations to interpret what is beneath us," said Dr Sash Hier-Majumder of Royal Holloway. He continued, "Under the western US is a huge underground partially-molten reservoir of liquid carbonate. It is a result of one of the tectonic plates of the Pacific Ocean forced underneath the western USA, undergoing partial melting thanks to gases like CO2." As a result of this study, scientists now understand that the amount of CO2 held in Earth's upper mantle is far greater than previously thought. "We might not think of the deep structure of Earth as linked to climate change above us, but this discovery not only has implications for subterranean mapping but also for our future atmosphere," concluded Dr Hier-Majumder. "For example, releasing only 1% of this CO2..." | Earthquakes | 2,017
February 10, 2017 | https://www.sciencedaily.com/releases/2017/02/170210130921.htm | Cold plates and hot melts: New data on history of Pacific Ring of Fire | About 2000 kilometers east of the Philippine Islands lies one of the most famous topographical peculiarities of the oceans: the Mariana trench. Reaching depths of up to 11,000 meters below sea level, it holds the record as the deepest point of the world's ocean. This 4000-kilometer-long trench extends from the Mariana Islands in the south through the Izu-Bonin Islands to Japan in the north. Here, the Pacific Plate is subducted beneath the Philippine Sea Plate, resulting in intense volcanic activity and a high number of earthquakes. The entire area is part of the "Pacific Ring of Fire." | But when and how exactly did the subduction of the Pacific Plate begin? This is a controversial topic among scientists. An international team led by the GEOMAR Helmholtz Center for Ocean Research Kiel, the Japan Agency for Marine Earth Science and Technology (JAMSTEC) and the Australian National University investigated this early phase of subduction along the Izu-Bonin-Mariana trench, with the findings published in the March edition of a scientific journal. The study is based on a drill core that was obtained by the International Ocean Discovery Program (IODP) in 2014 with the US research drilling vessel JOIDES RESOLUTION some 600 kilometers west of the current Izu-Bonin Trench. "For the first time, we were able to obtain samples of rocks that originate from the first stages of subduction," says Dr. Philipp Brandl from GEOMAR, first author of the study. "It is known that the active subduction zone has been moving eastwards throughout its history and has left important geological traces on the seabed during its migration. We have now drilled where the process has begun." The team of the JOIDES RESOLUTION was able to drill more than 1600 meters deep into the seabed, starting at a water depth of around 4700 meters below sea level. "This is already at the limit of the technically feasible," emphasizes Dr. Brandl. Based on analysis of this drill core, the researchers were able to trace the history of the subduction zone layer by layer, down to the approximately 50-million-year-old rocks at the bottom of the core, which are typical for the birth of a subduction zone. "There has not been such a complete overview yet," says Dr. Brandl. Brandl and his colleagues were now able to acquire and analyze microscopic inclusions of cooled magma from the rocks. The data obtained provide the scientists with insights into the history of volcanic activity at the Pacific Ring of Fire 30-40 million years ago. The researchers found evidence that volcanism was only beginning to gain momentum. The volcanic activity intensified with the rollback of the subduction zone towards the east, and huge explosive stratovolcanoes formed, similar to those present nowadays, for example, along the western rim of the Pacific Ring of Fire. However, further drilling is necessary to test the validity of these observations. "The more drill cores we can gain from such old strata, the better we learn to understand our own planet," Dr. Brandl says. The question of how subduction zones develop is not only interesting to understand the history of the earth. Subduction zones are the drivers for the chemical exchange between the earth's surface and the earth's interior. "The dynamics of a subduction zone can thus also influence the speed of global elemental cycles," summarizes Dr. Brandl. | Earthquakes | 2,017
February 6, 2017 | https://www.sciencedaily.com/releases/2017/02/170206084238.htm | Aftershock Nepal: Changing perceptions through student journalism | When the earthquake hit Nepal in 2015, the newspapers were full of stories of the tragic event, the devastation left following the natural disaster, and the heroic clean-up effort on the ground. Less focus, as is often the case with crisis news, was given to the lives of the people who were affected by the quake, which is where the research project Aftershock Nepal came in. | Aftershock Nepal took students from Bournemouth University and sent them to Kathmandu to report on the aftermath of the disaster and, more importantly, to challenge traditional crisis journalism by capturing the voices of Nepalese people who were dealing with life, loss, and repair after the earthquake. The project was led by BU's Dr Chindu Sreedharan, himself a former journalist and now a Senior Lecturer in Journalism and Communication, in collaboration with BU colleagues Dr Einar Thorsen and Robert Munday. Dr Sreedharan says, "The whole idea of Aftershock Nepal was to chronicle what was happening out there, and we found that there was a real need for that. The media attention of a disaster such as this can come and go so quickly, and we had the time to fill the gap and address the issue. We also gave our students the opportunity to respond to live crisis reporting, to see how they would put into practice what we had been teaching them." The project was run in partnership with Symbiosis International University and Amity University in India and Kathmandu University and Tribhuvan University in Nepal. Students from all five universities spent time working in a Kathmandu-based news bureau set up specifically for the project, gathering stories and publishing them on the Aftershock Nepal website, Facebook page, Instagram page and Twitter account. The students travelled through Nepal, gathering stories from far-flung places which are rarely accessible to journalists reporting during a moment of crisis. The team were encouraged to use a breadth of reporting techniques, utilising multimedia skills to present the stories in various forms, such as longer and shorter form writing, as well as video, audio and photography. Aftershock Nepal reporters also engaged with virtual reality, capturing some of the first 360-degree footage to come out of Nepal. "The kind of stories that they wanted to tell were those of ordinary people," says Dr Sreedharan. "This was about chronicling their life after the quake, what they went through, recording their day-to-day life experiences -- while the rest of the world moved on, they were still rebuilding. It was humanitarian journalism. Some of the stories reported through Aftershock Nepal go beyond the normal style of media reporting; we gathered a range of varied voices from across Nepal for a number of months, up until the first anniversary of the quake. Crises are not simple things, they are complex, and so are the lives of those involved. We wanted to reflect that in our reporting and help in some way through it." The inspiration for this project was informed by a longstanding body of BU research into crisis news, and the resulting Aftershock Nepal project is an excellent example of professional practice and research blending together. The findings from this project were presented through conferences but, as Dr Sreedharan explains, this way of doing research is also unconventional. "It is slightly different to other studies.
Our practice is our research, and we want to understand the impact of what we did on the stakeholders involved," he said. "We did not publish a report and wait for the effects to filter through; we made a difference to individuals through the practice of research, and then through the conferences we kickstarted the process of writing about the project and analysing the impact our work had on the lives of those who were involved. This is the research that sits alongside the project. The conferences brought together journalists, NGOs and stakeholders -- we presented our work and the issues we faced during the project. We also heard from those involved as a part of the conferences, which helped us to analyse the work we did." Dr Sreedharan and Dr Thorsen are now working to understand how this project has made a difference or produced changes in the process of crisis communication and reporting. Dr Sreedharan explains, "We are trying to understand what it means for journalism, and what it takes to cover a crisis over a longer length of time, the process of it and how to set it up. But we are also documenting what it meant to the students who took part in it: what did they get out of it, what did they learn, what challenges did they face? We are also interested in the impact of this kind of journalism on readers and on other stakeholders, such as NGOs and journalists. Our research is designed to develop a framework to understand their issues too, and the challenges they face in crisis reporting. We now have firsthand knowledge that we can use as research to make a real difference to this industry. Where else do you get an opportunity to do something of this scale? When do students get the chance to put their education into practice in a real crisis zone, to practice what they are learning and contribute to advancing scholarly knowledge at the same time? This really has been a life-changing project for so many people." The project will now have a further output: a report on Aftershock Nepal looking at the process of the project, what happened, and its impact. The project was the first to be funded by Global BU and followed a similar initiative, called Project India, where BU students covered the Indian elections in 2014. | Earthquakes | 2,017
January 31, 2017 | https://www.sciencedaily.com/releases/2017/01/170131124143.htm | Non-reporting 'Did You Feel It?' areas can be used to improve earthquake intensity maps | The remarkable reach of the U.S. Geological Survey's "Did You Feel It?" website can be used to improve maps of earthquake intensity -- if non-reporting areas are included in the mapping analysis, according to a new study published online February 1 in the journal Seismological Research Letters. | Since its launch in the late 1990s, the DYFI website has collected shaking reports from people across the world. Earthquake intensity is a measure of ground shaking, usually ranging from "not felt" to "extreme." Intensity differs from earthquake magnitude or size, which measures the energy released by a quake. In the past, DYFI earthquake intensity maps have ignored ZIP codes with no DYFI reports from the public. But in the SRL study, USGS researchers John Boatwright and Eleyne Phillips suggest that these "non-reporting" ZIP codes represent real data -- that is, they indicate that no earthquake was felt in the area. Use of the DYFI website is so widespread -- "the envy of every scientific website in the world," said Boatwright -- that it is more reasonable to assume that "no report" means "no shaking," rather than a lack of participation by the public in that particular ZIP code. "Feeling an earthquake is a powerful inducement for submitting a DYFI felt report, but there is no similar stimulus for submitting a 'not felt' report. In fact, 'not felt' reports make up less than 1% of DYFI reports," Boatwright added. In their study, Boatwright and Phillips included information from non-reporting areas to develop new intensity mapping for two California earthquakes: a January 2011 magnitude 4.5 urban earthquake that occurred near San Juan Bautista, and the February 2012 magnitude 5.6 Weitchpec earthquake that occurred in a more rural area in Humboldt County. The approach allowed the researchers to better delineate the "felt area" for these earthquakes, particularly where the intensities were lowest. For the two quakes that they analyzed, they found that there were no non-reporting ZIP codes located near the earthquake epicenter, overlapping reporting and non-reporting ZIP codes at intermediate distances of 80 to 180 kilometers away from the centers, and few reporting ZIP codes located at even further distances. After including the non-reporting ZIP codes in their analysis, Boatwright says, the researchers' DYFI-based maps of seismic intensity "resemble very well the older, historical maps that seismologists like [Charles] Richter and others published for these areas." | Earthquakes | 2,017
January 30, 2017 | https://www.sciencedaily.com/releases/2017/01/170130100131.htm | Prediction of large earthquake probability improved | As part of the "Research in Collaborative Mathematics" project run by the Obra Social "la Caixa," researchers of the Mathematics Research Centre (CRM) and the UAB have developed a mathematical law to explain the size distribution of earthquakes, even in the cases of large-scale earthquakes such as those which occurred in Sumatra (2004) and in Japan (2011). | The probability of an earthquake occurring decreases exponentially as its magnitude increases. Fortunately, mild earthquakes are more probable than devastatingly large ones. This relation between probability and earthquake magnitude follows a mathematical curve called the Gutenberg-Richter law, and helps seismologists predict the probabilities of an earthquake of a specific magnitude occurring in some part of the planet. The law, however, lacks the tools needed to describe extreme situations. For example, although the probability of a magnitude 12 earthquake should be zero -- since technically this would imply the earth breaking in half -- the mathematics of the Gutenberg-Richter law do not rule out even a magnitude 14 earthquake as impossible. "The limitations of the law are determined by the fact that the Earth is finite, and the law describes ideal systems, in a planet with an infinite surface," explains Isabel Serra, first author of the article, researcher at CRM and affiliate lecturer of the UAB Department of Mathematics. To overcome these shortcomings, the researchers studied a small modification of the Gutenberg-Richter law: a term which modifies the curve precisely in the region where the probabilities are smallest. "This modification has important practical effects when estimating the risks or evaluating possible economic losses. Preparing for a catastrophe where the losses could be, in the worst of the cases, very high in value, is not the same as not being able to calculate an estimated maximum value," clarifies co-author Álvaro Corral, researcher at the Mathematics Research Centre and the UAB Department of Mathematics. Obtaining the mathematical curve which best fits the registered earthquake data is not an easy task when dealing with large tremors. From 1950 to 2003 there were only seven earthquakes measuring higher than 8.5 on the Richter scale, and since 2004 there have only been six. Although we are now in a more active period following the Sumatra earthquake, there are still very few cases, which makes the record statistically poor. Thus, the mathematical treatment of the problem becomes much more complex than when there is an abundance of data. For Corral, "this is where the role of mathematics is fundamental to complement the research of seismologists and guarantee the accuracy of the studies." According to the researcher, the approach currently used to analyse seismic risk is not fully correct and, in fact, there are many risk maps which are downright incorrect, "which is what happened with the Tohoku earthquake of 2011, where the risk in the area had been underestimated."
"Our approach has corrected some things, but we are still far from being able to give correct results in specific regions," Corral continues.The mathematical expression of the law at the seismic moment, proposed by Serra and Corral, meets all the conditions needed to determine both the probability of smaller earthquakes and of large ones, by adjusting itself to the most recent and extreme cases of Tohoku, in Japan (2011) and Sumatra, in Indonesia (2004); as well as to determine negligible probabilities for earthquakes of disproportionate magnitudes.The derived Gutenberg-Richter law has also been used to begin to explore its applications in the financial world. Isabel Serra worked in this field before beginning to study earthquakes mathematically. "The risk assessment of a firm's economic losses is a subject insurance companies take very seriously, and the behaviour is similar: the probability of suffering losses decreases in accordance with the increase in volume of losses, according to a law that is similar to that of Gutenberg-Richter, but there are limit values which these laws do not take into consideration, since no matter how big the amount, the probability of losses of that amount never results in zero" Serra explains. "That makes the 'expected value of losses' enormous. To solve this, changes would have to be made to the law similar to those we introduced to the law on earthquakes." | Earthquakes | 2,017 |
January 25, 2017 | https://www.sciencedaily.com/releases/2017/01/170125093743.htm | Novel mechanism to stop tsunamis in their tracks proposed | Devastating tsunamis could be halted before hitting Earth's shoreline by firing deep-ocean sound waves at the oncoming mass of water, new research has proposed. | Dr Usama Kadri, from Cardiff University's School of Mathematics, believes that lives could ultimately be saved by using acoustic-gravity waves (AGWs) against tsunamis that are triggered by earthquakes, landslides and other violent geological events. AGWs are naturally occurring sound waves that move through the deep ocean at the speed of sound and can travel thousands of metres below the surface. AGWs can measure tens or even hundreds of kilometres in length, and it is thought that certain lifeforms, such as plankton, that are unable to swim against a current rely on the waves to aid their movement, enhancing their ability to find food. In a recently published paper, Dr Kadri proposes that firing AGWs at an oncoming tsunami would redistribute the wave's energy over a much larger area. By the time the tsunami reaches the shoreline, he writes, the reduced height of the tsunami would minimise the damage caused to both civilians and the environment. Dr Kadri also believes that this process of firing AGWs at a tsunami could be repeated continuously until the tsunami is completely dispersed. "Within the last two decades, tsunamis have been responsible for the loss of almost half a million lives, widespread long-lasting destruction, profound environmental effects and global financial crisis," Dr Kadri said. "Up until now, little attention has been paid to trying to mitigate tsunamis, and the potential of acoustic-gravity waves remains largely unexplored." The devastating tsunami that was generated in the Indian Ocean in 2004, after a magnitude 9 earthquake, has been recorded as one of the deadliest natural disasters in recent history after it caused over 230,000 deaths in 14 countries. The energy released on Earth's surface by the earthquake and subsequent tsunami was estimated to be the equivalent of over 1,500 times that of the Hiroshima atomic bomb. In order to use AGWs in tsunami mitigation, engineers will first need to devise highly accurate AGW frequency transmitters or modulators, which Dr Kadri concedes would be challenging. It may also be possible to utilise the AGWs that are naturally generated in the ocean when a violent geological event, such as an earthquake, occurs -- essentially using nature's own processes against itself. Indeed, Dr Kadri has already shown that naturally occurring AGWs could be utilised in an early tsunami detection system by placing detection systems in the deep ocean. Dr Kadri continued: "In practice, generating the appropriate acoustic-gravity waves introduces serious challenges due to the high energy required for an effective interaction with a tsunami. However, this study has provided proof-of-concept that devastating tsunamis could be mitigated by using acoustic-gravity waves to redistribute the huge amounts of energy stored within the wave, potentially saving lives and billions of pounds worth of damage." | Earthquakes | 2,017
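Part of what makes AGWs useful, both for mitigation and for the early-detection idea mentioned above, is that they travel far faster than the tsunami itself. A back-of-envelope sketch, assuming a sound speed in seawater of roughly 1,500 m/s and the standard shallow-water tsunami speed c = sqrt(g*h); the 500 km source distance is an arbitrary illustrative value:

```python
import math

SOUND_SPEED_SEAWATER = 1500.0  # m/s, rough literature value
G = 9.81                       # m/s^2

def tsunami_speed(depth_m):
    """Shallow-water wave speed: c = sqrt(g * h)."""
    return math.sqrt(G * depth_m)

depth = 4000.0      # m, typical deep-ocean depth
distance = 500e3    # m, assumed distance from the tsunami source

c_tsunami = tsunami_speed(depth)
print(f"tsunami: ~{c_tsunami:.0f} m/s, arrives in {distance / c_tsunami / 60:.0f} min")
print(f"AGW:     ~{SOUND_SPEED_SEAWATER:.0f} m/s, arrives in "
      f"{distance / SOUND_SPEED_SEAWATER / 60:.0f} min")
```

Over 500 km of 4,000-m-deep ocean, the acoustic signal arrives in roughly 6 minutes versus about 40 for the tsunami, which is the window an AGW-based warning or mitigation system would have to work with.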
January 24, 2017 | https://www.sciencedaily.com/releases/2017/01/170124111527.htm | Southern Italy: Earthquake hazard due to active plate boundary | Since the early civilizations, the lives of people in Europe, in the Middle East, and in North Africa have been closely linked to the Mediterranean. Natural catastrophes such as volcanic eruptions, earthquakes and tsunamis have repeatedly shattered cultures and states in this area. The reason for this constant threat is that in the Mediterranean the Eurasian plate and the African plate interact. "Unfortunately, the tectonic situation is very complicated, since there are many different fault zones in this area. This makes an exact hazard analysis for certain areas very difficult," explains Prof. Dr. Heidrun Kopp, a geophysicist at GEOMAR Helmholtz Centre for Ocean Research Kiel. | Together with colleagues from France, Italy and Spain, as well as from the Universities of Kiel and Bremen, the scientists have now published the results of extensive investigations of the seafloor off the coast of Sicily and Calabria in the current edition of an international scientific journal. The results are based on six ship expeditions since 2010, including three with the German research vessel METEOR. During these expeditions the respective teams mapped the seafloor using state-of-the-art technologies. In addition, the scientists used seismic methods to investigate the structure of the ocean floor down to a depth of 30 kilometres. "We already knew that sedimentary layers in this region are typical for a situation when one plate slides underneath the other. However, it has been controversial whether these structures are old or whether the so-called subduction process is still active," explains Heidrun Kopp. The new investigations now show that the plates are still moving -- "slowly, but in a way that they can build up stresses in the interior of Earth," Professor Kopp adds. The region investigated in this study is of great interest because in the past it has repeatedly been hit by devastating earthquakes and tsunamis. For example, an earthquake in the Messina Strait in 1908 and a subsequent tsunami claimed some 72,000 lives. "Of course, with the new findings, we cannot predict if and when a severe earthquake will occur. But the more we know about the seafloor and its structure in detail, the better we can estimate where the probability of natural hazards is particularly high. Then actions for hazard mitigation and building regulations can reduce the risks," says Prof. Dr. Kopp. | Earthquakes | 2,017
January 18, 2017 | https://www.sciencedaily.com/releases/2017/01/170118125739.htm | Heat from Earth's core could be underlying force in plate tectonics | For decades, scientists have theorized that the movement of Earth's tectonic plates is driven largely by negative buoyancy created as they cool. New research, however, shows plate dynamics are driven significantly by the additional force of heat drawn from the Earth's core. | The new findings also challenge the theory that underwater mountain ranges known as mid-ocean ridges are passive boundaries between moving plates. The findings show the East Pacific Rise, the Earth's dominant mid-ocean ridge, is dynamic as heat is transferred. David B. Rowley, professor of geophysical sciences at the University of Chicago, and fellow researchers came to the conclusions by combining observations of the East Pacific Rise with insights from modeling of the mantle flow there. The findings were published Dec. 23 in Science Advances. "We see strong support for significant deep mantle contributions of heat-to-plate dynamics in the Pacific hemisphere," said Rowley, lead author of the paper. "Heat from the base of the mantle contributes significantly to the strength of the flow of heat in the mantle and to the resultant plate tectonics." The researchers estimate that up to approximately 50 percent of plate dynamics are driven by heat from the Earth's core, with as much as 20 terawatts of heat flowing between the core and the mantle. Unlike most other mid-ocean ridges, the East Pacific Rise as a whole has not moved east-west for 50 to 80 million years, even as parts of it have been spreading asymmetrically. These dynamics cannot be explained solely by subduction -- a process whereby one plate moves under another or sinks. Researchers in the new findings attribute the phenomena to buoyancy created by heat arising from deep in the Earth's interior. "The East Pacific Rise is stable because the flow arising from the deep mantle has captured it," Rowley said. "This stability is directly linked to and controlled by mantle upwelling," or the release of heat from Earth's core through the mantle to the surface. The Mid-Atlantic Ridge, particularly in the South Atlantic, also may have direct coupling with deep mantle flow, he added. "The consequences of this research are very important for all scientists working on the dynamics of the Earth, including plate tectonics, seismic activity and volcanism," said Jean Braun of the German Research Centre for Geosciences, who was not involved in the research. Convection, or the flow of mantle material transporting heat, drives plate tectonics. As envisioned in the current research, heating at the base of the mantle reduces the density of the material, giving it buoyancy and causing it to rise through the mantle and couple with the overlying plates adjacent to the East Pacific Rise. This deep mantle-derived buoyancy, together with the negative buoyancy created by plate cooling at the surface, explains the observations along the East Pacific Rise and surrounding Pacific subduction zones. A debate about the origin of the driving forces of plate tectonics dates back to the early 1970s. Scientists have asked: Does the buoyancy that drives plates primarily derive from plate cooling at the surface, analogous with the cooling and overturning of lakes in the winter?
Or, is there also a source of positive buoyancy arising from heat at the base of the mantle associated with heat extracted from the core and, if so, how much does it contribute to plate motions? The latter theory is analogous to cooking oatmeal: Heat at the bottom causes the oatmeal to rise, and heat loss along the top surface cools the oatmeal, causing it to sink. Until now, most assessments have favored the first scenario, with little or no contribution from buoyancy arising from heat at the base. The new findings suggest that the second scenario is required to account for the observations, and that there is an approximately equal contribution from both sources of the buoyancy driving the plates, at least in the Pacific basin. "Based on our models of mantle convection, the mantle may be removing as much as half of Earth's total convective heat budget from the core," Rowley said. Much work has been performed over the past four decades to represent mantle convection by computer simulation. Now the models will have to be revised to account for mantle upwelling, according to the researchers. "The implication of our work is that textbooks will need to be rewritten," Rowley said. The research could have broader implications for understanding the formation of the Earth, Braun said. "It has important consequences for the thermal budget of the Earth and the so-called 'secular cooling' of the core. If heat coming from the core is more important than we thought, this implies that the total heat originally stored in the core is much larger than we thought. "Also, the magnetic field of the Earth is generated by flow in the liquid core, so the findings of Rowley and co-authors are likely to have implications for our understanding of the existence, character and amplitude of the Earth's magnetic field and its evolution through geological time," Braun added. | Earthquakes | 2,017
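To put the 20-terawatt figure in context, Earth's total surface heat flow is commonly estimated at around 44-47 TW in the general literature; that total is an outside assumption, not a number from this study. A quick sanity check of the "as much as half" claim:

```python
CORE_MANTLE_HEAT_TW = 20.0   # upper estimate quoted in the study
TOTAL_HEAT_FLOW_TW = 46.0    # assumed global surface heat flow (literature value)

fraction = CORE_MANTLE_HEAT_TW / TOTAL_HEAT_FLOW_TW
print(f"Core-mantle boundary contribution: ~{fraction:.0%} of total heat flow")
# -> roughly 43%, consistent with "as much as half" of the convective budget
```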
January 11, 2017 | https://www.sciencedaily.com/releases/2017/01/170111151428.htm | Release of water shakes Pacific plate at depth | Tonga is a seismologists' paradise, and not just because of the white-sand beaches. The subduction zone off the east coast of the archipelago racks up more intermediate-depth and deep earthquakes than any other subduction zone, where one plate of Earth's lithosphere dives under another, on the planet. | "Tonga is such an extreme place, and that makes it very revealing," said S. Shawn Wei, a seismologist who earned his doctorate at Washington University in St. Louis and now is a postdoctoral fellow at the Scripps Institution of Oceanography in San Diego. That swarm of earthquakes is catnip for seismologists because they still don't understand what causes earthquakes to pop off at such great depths. Below about 40 miles, the enormous heat and pressure in Earth's interior should keep rock soft and pliable, more inclined to ooze than to snap. So triggering an earthquake at depth should be like getting molasses to shatter. In a study published Jan. 11, the researchers report their analysis of data from several seismic surveys using both ocean bottom seismometers and island-based seismic stations. They were surprised to find a zone of intense earthquake activity in the downgoing slab, which they call a seismic belt. The pattern of the activity along the slab provided strong evidence that the earthquakes are sparked by the release of water at depth. "It looks like the seismic belt is produced by the sudden flushing of water when the slab warms up enough that the hydrated minerals can decompose and give off their water," said Doug Wiens, the Robert S. Brookings Distinguished Professor of earth and planetary sciences in Arts & Sciences at Washington University. "The pressure of the fluid causes earthquakes in the same way that wastewater injected into deep wells causes them in Oklahoma," Wiens said. "Although the details are very different when it's many miles down, it's the same physical process." The Tonga Trench holds a place of honor in the annals of seismology because this is where American scientists, invited to investigate the grumbling earth by the King of Tonga, got their first clear glimpse of a subduction zone in action. The classic paper that scientists Bryan Isacks, Jack Oliver and Lynn Sykes published in 1968 led to the acceptance of the then speculative theory of plate tectonics. In 1985, the Japanese seismologist Hitoshi Kawakatsu discovered something else interesting in Tonga: the descending slab has a double seismic zone. "There are two zones of earthquakes in the slab," Wiens said. "One is in the top part of the slab and the other is toward the middle of the slab." Wiens, who has been studying the Tonga subduction zone since the early 1990s, says it is a great natural laboratory because its characteristics are so extreme. The ocean floor taking the dive there is older and colder than most other subducting slabs. It is also moving very fast. "In the northern part of the Tonga Trench, the slab is moving 9 inches a year," said Wiens. "The San Andreas Fault, by comparison, moves 2 inches a year." And the subducting slab has another useful quirk. It isn't descending into the trench at uniform speed but instead going down much faster at the northern end of the trench than at the southern end. This means that the slab warms up at different rates along its length. "It's like pushing a cold bar of chocolate into a bubbling pan of pudding," said Wiens.
"If you push slowly, the chocolate has a chance to warm up and melt, but if you push fast, the chocolate stays cold longer."This is a perfect setup for studying temperature-dependent phenomenon.When Wei analyzed the data from Tonga, he saw the double seismic zone the Japanese scientist had discovered. "We're pretty much to follow up on that 1985 paper," he said."Where the double seismic zone started to break down in Tonga, however, we saw this really active area of earthquakes that we named the seismic belt," Wiens said. "That was a surprise; we weren't expecting it."Why the sudden burst of earthquakes as the slab descended? The telling clue was that the burst angled upward from north to south along the slab. The faster the slab was moving, the deeper the earthquakes, and the slower the slab, the shallower the earthquakes.The angled seismic belt told the scientists that the mechanism triggering earthquakes was temperature sensitive. "We think the earthquakes occur when the mantle in the downgoing slab gets hot enough to release its water," Wiens said."People have proposed this mechanism before, but this is the smoking gun, " Wiens continued. "The seismicity is changing depth in a way that's correlated with the subduction rate and the slab temperature. "But where does the water come from, and why is it released suddenly?The interior of the Pacific plate is exposed to seawater as the plate is pulled under the Tonga Plate and faults open on its upper surface, Wei said. Seawater reacts with the rock to form hydrous minerals (minerals that include water in their crystal structure) in the serpentine family. The most abundant of these serpentine minerals is a green stone called antigorite.But as the slab descends and the temperature and pressure increases, these hydrous minerals become unstable and break down through dehydration reactions, Wei said.This sudden release of large amounts of water is what triggers the earthquakes."The temperature we predict in the earthquake locations strongly suggests that minerals dehydrate very deep in the Tonga subduction zone, said Peter van Keken, a staff scientist at the Carnegie Institution for Science and a co-author on the paper.The "phase diagrams" for antigorite dehydration reactions overlap neatly with the pressure and temperature of the slab at the seismic belt.But the phase diagrams aren't that reliable at these extreme temperatures and depths. So Wei, for one, would like to see more laboratory data on the behavior of antigorite and other hydrous minerals at high temperature and pressure to nail down the mechanism.For him, the most exciting part of the research is the evidence of water 180 miles beneath the surface. " We currently don't know how much water gets to the deep Earth or how deep the water can finally reach," Wei said. "In other words, we don't know how much water is stored in the mantle, which is a key factor for Earth's water budget."The water down there may be as important to us as the water up here. It is beginning to look like water is the lubricant that oils the machine that recycles Earth's crust."The Tonga dataset is such a great treasure chest that we'll be exploiting for many years to come," said Wei. "Tonga has many more stories to tell us about Earth's interior." | Earthquakes | 2,017 |
January 9, 2017 | https://www.sciencedaily.com/releases/2017/01/170109113806.htm | High rates of PTSD and other mental health problems after great east Japan earthquake | The devastating 2011 earthquake, tsunami, and resulting nuclear disaster in Japan had a high mental health impact -- with some effects persisting several years later, according to a comprehensive research review. | Although symptoms of posttraumatic stress disorder (PTSD) related to the Great East Japan Earthquake seem to have improved over time, there is evidence of persistent problems with depression, reports the study by Dr. Shuntaro Ando of Tokyo Metropolitan Institute of Medical Science and colleagues. Their findings highlight specific areas and groups of disaster victims who may have a special need for long-term mental health support. On March 11, 2011, a magnitude 9.0 earthquake occurred off the Pacific Coast of northeastern Japan. A resulting tsunami damaged the Fukushima-Daiichi Nuclear Power Plant, leading to a major nuclear disaster in addition to other local destruction. Four years after this unprecedented "triple disaster," more than 80,000 people were still living in temporary housing. To assess the mental health impact of the Great East Japan Earthquake, Dr. Ando and colleagues identified and analyzed 42 research papers reporting on the type, severity, and prevalence of mental health problems in areas affected by the disaster. The analysis included information on trends in mental health problems over time and risk factors for developing such problems. In all studies that examined posttraumatic symptoms, the prevalence of PTSD was ten percent or higher. Depression and child behavior problems were also reported frequently -- although estimates varied widely due to the use of differing measures and cutoff points. In studies investigating trends in mental health problems over time, posttraumatic stress symptoms tended to improve, or in any case not to get worse. In contrast, depression symptoms tended to persist during follow-up. Risk factors for mental health problems included the resettlement of daily lives, pre-existing illness, and small social network size. The reported prevalence of post-traumatic stress reactions was higher in Fukushima prefecture, where the damaged nuclear power station was located. Suicides increased initially, followed by a decrease in the two years after the earthquake. However, the suicide rate remained higher than the pre-disaster level in Fukushima, in contrast to neighboring prefectures. Natural disasters are known to increase the risk of mental health problems, with the potential for long-term effects. Because of its magnitude and unique characteristics, the Great East Japan Earthquake might have an even greater mental health impact than previous disasters. The results add to previous studies showing a high prevalence of PTSD and other mental health problems after this unique disaster. "The prevalence and severity of mental health problems seemed to be higher in Fukushima than in other prefectures, and some specific risk factors were reported for the region," Dr. Ando and colleagues conclude. The results suggest the need for long-term mental health support in Fukushima -- perhaps especially targeting evacuees who are still living in temporary housing. | Earthquakes | 2,017
January 9, 2017 | https://www.sciencedaily.com/releases/2017/01/170109092551.htm | Backpackers demonstrated resounding leadership in aftermath of Nepal earthquake | Prof. Haya Itzhaky was enjoying a routine day in Nepal on April 25, 2015. She had been in the region for about three months studying the behavior of post-Israel Defense Forces (IDF) Israeli backpackers when tragedy struck: a powerful earthquake measuring 7.8 on the Richter Scale took the lives of more than 9,000 people and injured tens of thousands more. | An expert in community practice, as well as topics ranging from trauma to domestic violence, and Chair of the PhD Program at Bar-Ilan University's Louis and Gabi Weisfeld School of Social Work, Itzhaky quickly initiated a study she hadn't planned on conducting. The study, recently published, focused on how Israeli tourists cope in the immediate aftermath of a devastating earthquake. Prof. Itzhaky had been traveling with a number of Israeli backpackers when the earthquake struck. Other backpackers were caught by the quake in different locations -- some in the Everest, Annapurna, Poonhill and Langtang regions, where the destruction of villages and roads was extensive. Some had to be rescued by helicopter. In Katmandu backpackers were only moderately exposed to the devastation. Itzhaky conducted individual, in-depth interviews with 21 Israeli men and women, ranging in age from 21-26. All of the participants were interviewed between one week and one month following the initial earthquake, often shortly following one of its many aftershocks. All of the interviewees discussed where they were when the earthquake struck, how they felt, and how they responded. The interview data were later analyzed by Itzhaky and her colleagues, Karni Kissil, a US-based couple and family therapist, and Shlomit Weiss-Dagan, of the Louis and Gabi Weisfeld School of Social Work. Four dominant themes emerged from participants' descriptions of their experiences of the earthquake: emotional turmoil, quick recovery, springing into action, and connection to the army. Participants said that they had never been so terrified in their lives, that they could barely sleep for many nights following the quake, and that they thought they were going to die. Some described a sense of helplessness as they attempted to cope with the magnitude of the event. They noted, however, that they quickly recovered from the initial turmoil. They regained their emotional balance by being with other people in the same situation, talking themselves into having a fighting spirit, knowing that their homes weren't destroyed and they had a place to go back to, and speaking with their loved ones to let them know that they were safe. Following the initial emotional reactions and quick recovery, the Israeli backpackers swiftly looked for ways to improve their situation and help others. "They understood that they must survive, and they turned their fears into survival and took command of the situation," says Itzhaky. A community of action was created very quickly and roles were divided among members of the group. Some Israelis approached the Nepalese and built a camp together for survivors. They helped bury the dead according to local rituals; a medic helped the injured; others set up a makeshift "situation room," organized a search team to locate missing Israeli backpackers, and assisted travelers of other nationalities after they were rescued by Israeli helicopters.
A group visited embassies from all over the world in the capital, Katmandu, to report on those they had found and where to look for others. Participants described how the experience of the earthquake brought back memories from their time in the IDF. To some of them, revisiting their experiences in the army provided concrete steps they could take to solve problems in their current situation. For others, remembering what they went through in the army gave them a sense of mastery and competence, realizing that they had managed to cope with difficult situations in the past. Some, however, found that comparing their current situation to their experiences in the army was unhelpful, because they perceived the current situation as much worse and very different from the army, and therefore they couldn't rely on their previous experience to regain a sense of control. "I found leaders in every one of the Israelis that I met," says Itzhaky. "There were no ego trips among them. They all felt a sense of belonging to the community, a sense of togetherness. The group cohesion was a very protective factor which helped them cope. And knowing that Israel would send humanitarian assistance in their time of need comforted them and allowed them to feel that they wouldn't be alone. There was a feeling of pride in being Israeli and part of a group, and a feeling of generous spirit, kindness, generosity toward one another, and unity for the purpose of survival. In Israeli culture, people unite in difficult times, and this was very evident in Nepal." Itzhaky and team are currently following up with the group in an effort to examine their emotional state a year-and-a-half after the quake. | Earthquakes | 2,017
December 21, 2016 | https://www.sciencedaily.com/releases/2016/12/161221125443.htm | Report calls for improved methods to assess earthquake-caused soil liquefaction | Several strong earthquakes around the world have resulted in a phenomenon called soil liquefaction, the seismic generation of excess porewater pressures and softening of granular soils, often to the point that they may not be able to support the foundations of buildings and other infrastructure. The November 2016 earthquake in New Zealand, for example, resulted in liquefaction that caused serious damage to the Port of Wellington, which contributes approximately $1.75 billion to the country's annual GDP. An estimated 40 percent of the U.S. is subject to ground motions severe enough to cause liquefaction and associated damage to infrastructure. | Effectively engineering infrastructure to protect life and to mitigate the economic, environmental, and social impacts of liquefaction requires the ability to accurately assess the likelihood of liquefaction and its consequences. A new report by the National Academies of Sciences, Engineering, and Medicine evaluates existing field, laboratory, physical model, and analytical methods for assessing liquefaction and its consequences, and recommends how to account for and reduce the uncertainties associated with the use of these methods. When liquefaction occurs, wet granular materials such as sands and some silts and gravels can behave in a manner similar to a liquid. The most commonly used approaches to estimate the likelihood of liquefaction are empirical case-history-based methods initially developed more than 45 years ago. Since then, variations to these methods have been suggested, informed not only by case-history data but also by laboratory and physical model tests and numerical analyses. Many of the variations are in use, but there is no consensus regarding their accuracy. As a result, infrastructure design often incurs additional costs to provide the desired confidence that the effects of liquefaction are properly mitigated. The report evaluates existing methods for assessing the potential consequences of liquefaction, which are not as mature as those for assessing the likelihood of liquefaction occurring. Improved understanding of the consequences of liquefaction will become more important as earthquake engineering moves more toward performance-based design. "The engineering community wrestles with the differences among the various approaches used to predict what triggers liquefaction and to forecast its consequences," said Edward Kavazanjian, Ira A. Fulton Professor of Geotechnical Engineering and Regents' Professor at Arizona State University and chair of the committee that conducted the study and wrote the report. "It's important for the geotechnical earthquake engineering community to consider new, more robust methods to assess the potential impacts of liquefaction." The committee called for greater use of principles of geology, seismology, and soil mechanics to improve the geotechnical understanding of case histories, project sites, and the likelihood and consequences of liquefaction.
The committee also emphasized the need for explicit consideration of the uncertainties associated with the data used in assessments, as well as the uncertainties in the assessment procedures themselves. The report recommends establishing standardized and publicly accessible databases of liquefaction case histories that could be used to develop and validate methods for assessing liquefaction and its consequences. Further, the committee suggested establishing observatories for gathering data before, during, and after an earthquake at sites with a high likelihood of liquefaction. This would allow better understanding of the processes of liquefaction and the characteristics and behavior of the soils that liquefied. Data from these sites could be used to develop and validate assessment procedures. | Earthquakes | 2,016
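The "empirical case-history-based methods initially developed more than 45 years ago" are, in essence, the simplified procedure of Seed and Idriss (1971), whose central quantity is the cyclic stress ratio (CSR) imposed by shaking, compared against the soil's cyclic resistance ratio (CRR). A minimal sketch of the CSR calculation follows; the stress-reduction coefficient r_d uses one common published approximation, and the site values are purely illustrative:

```python
def cyclic_stress_ratio(a_max_g, sigma_v_kpa, sigma_v_eff_kpa, depth_m):
    """Simplified-procedure CSR (after Seed & Idriss, 1971):
        CSR = 0.65 * (a_max / g) * (sigma_v / sigma_v') * r_d
    r_d is the stress-reduction coefficient; the piecewise-linear form
    below is one common approximation for shallow depths."""
    if depth_m <= 9.15:
        r_d = 1.0 - 0.00765 * depth_m
    else:
        r_d = 1.174 - 0.0267 * depth_m
    return 0.65 * a_max_g * (sigma_v_kpa / sigma_v_eff_kpa) * r_d

# Illustrative site: saturated sand at 6 m depth, peak ground acceleration 0.3 g
csr = cyclic_stress_ratio(a_max_g=0.3, sigma_v_kpa=110.0,
                          sigma_v_eff_kpa=55.0, depth_m=6.0)
print(f"CSR ~ {csr:.2f}")
# Liquefaction is judged likely when the factor of safety CRR / CSR < 1
```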
December 19, 2016 | https://www.sciencedaily.com/releases/2016/12/161219134436.htm | 'Tiny earthquakes' help scientists predict mountain rock falls | The risk of mountain rock falls in regions with sub-zero temperatures, such as the Swiss Alps and parts of Canada, could be better predicted by using technology which measures 'tiny earthquakes' -- according to a group of international experts. | In a new study led by the University of Sussex, geoscientists from the British Geological Survey and the Technical University of Munich reveal that using a micro-seismic technique, which detects the tiny earthquakes that cause cracks in the rock, alongside modern electrical imaging technology, which measures rock mass, would provide scientists with much earlier warnings of potential rock falls. Traditionally, scientists use a manual method to monitor rock freezing and thawing, which involves drilling holes into rocks and is affected by frost weathering. During the new study the scientists replicated the conditions of a freezing environment in the Permafrost Laboratory at the University of Sussex and monitored the freeze-thaw of six hard and soft limestone blocks during an experiment that simulated 27 years of natural freezing and thawing. By using the micro-seismic technique together with capacitive resistivity imaging, which measures freezing and thawing in limestone without having to drill into the rock, the study team recorded a staggering 1,000 micro-cracking events. With previous studies showing that higher temperatures, caused by global warming, have led to more unstable mountain rocks, the scientists behind the new study believe that using the two monitoring techniques together could prove vital for the thousands of skiers and mountain climbers who undertake trips every year. Professor Julian Murton, from the University of Sussex, who led the study, said: "As our climate warms, mountain rock walls are becoming more unstable -- so working out how to predict rock falls could prove crucial in areas where people go climbing and skiing. "Understanding the impact of freezing and thawing on bedrock is vital if we are to assess the stability of mountain rock walls. By using these two techniques together we have not only identified a practical method which allows us to monitor many more cracking events -- but also one which can be used for many years to come." Dr Oliver Kuras, from the British Geological Survey, who led the development of the geo-electrical imaging technology, said: "It is traditionally difficult to reliably 'see inside rock walls' using conventional electrical imaging methods, particularly when repeating surveys over time. "With our new capacitive resistivity imaging technology, we have extended the advantages of state-of-the-art geo-electrical monitoring to hard rock environments, which should benefit geohazards research in the future." Professor Michael Krautblatter, from the Technical University of Munich, added: "With this study we could virtually visualise and listen to the cracking of rocks, and we can now better understand how rock slopes become unstable and produce hazardous rock falls." | Earthquakes | 2,016
December 19, 2016 | https://www.sciencedaily.com/releases/2016/12/161219115220.htm | 180-million-year-old rocks lend insight into Earth's most powerful earthquakes | The raggedness of the ocean floors could be the key to triggering some of Earth's most powerful earthquakes, scientists from Cardiff University have discovered. | In a new study, the researchers show for the first time, by studying exposed rocks from a 180-million-year-old extinct fault zone in New Zealand, that the extremely thick oceanic and continental tectonic plates can slide against each other without causing much bother, but that when irregularities on the sea floor are introduced, they can cause a sudden slip of the tectonic plate and trigger a giant earthquake. The researchers believe that this information, along with detailed subsurface maps of the ocean floor, could help to develop accurate models to forecast where large earthquakes are likely to occur along subduction zones, and therefore help to prepare for disasters. For generations scientists have known that the largest earthquakes, known as megathrust earthquakes, are triggered at subduction zones where a single tectonic plate is pulled underneath another one. It is also in these regions that volcanoes form, as is most common in the so-called 'Ring of Fire' in the Pacific Ocean -- the most seismically active region in the world. The most recent megathrust earthquake occurred in Tohoku, Japan in 2011. The magnitude 9 earthquake triggered a 40-metre-high tsunami and claimed over 15,000 lives, with economic costs estimated at US$235 billion. However, there are many regions across the world, including in the 'Ring of Fire', where scientists would expect megathrust earthquakes to occur, but they don't. The new research appears to have solved this conundrum, and therefore proposes an explanation as to what triggers giant earthquakes. The team arrived at their conclusions by examining rocks that, through erosion and tectonic uplift, have been carried to Earth's surface from depths of 15-20km in an extinct fault zone in New Zealand that was once active around 180 million years ago. The team found that the rocks in the fault zone can be tens to hundreds of metres thick and can act as a sponge to soak up the pressure that builds as two tectonic plates slip past each other. This means that movement between two plates can commonly occur with no consequences, and that it takes a sudden change in the conditions, such as a lump or mound on the sea floor, to trigger an earthquake. "By exhuming rocks from this depth, we've been able to gain an unprecedented insight into what a fault zone actually looks like," said Dr Ake Fagereng, lead author of the study from Cardiff University's School of Earth and Ocean Sciences. "With an active fault in the ocean, we can only drill to a depth of 6km, so our approach has given us some really valuable information." "We've shown that the fault zone along plate boundaries may be thicker than we originally thought, which can accommodate the stress caused by the creeping plates. However, when you have an irregularity on the sea floor, such as large bumps or mounds, this can cause the plate boundaries to slip tens of metres and trigger a giant earthquake." | Earthquakes | 2,016
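The link between "tens of metres" of sudden slip and a giant earthquake can be sketched with the standard seismic-moment relations, M0 = mu * A * D and Mw = (2/3) * log10(M0) - 6.07. The shear modulus and rupture dimensions below are illustrative assumptions, not values from the study:

```python
import math

MU = 3e10  # Pa, a typical crustal shear modulus (assumed)

def moment_magnitude(area_m2, slip_m, mu=MU):
    """Moment magnitude from seismic moment M0 = mu * A * D (M0 in N*m)."""
    m0 = mu * area_m2 * slip_m
    return (2.0 / 3.0) * math.log10(m0) - 6.07

# Illustrative megathrust rupture: a 300 km x 100 km patch slipping 20 m
print(f"Mw ~ {moment_magnitude(300e3 * 100e3, 20.0):.1f}")  # ~8.8
```

Even with these rough numbers, slip of a few tens of metres over a plausible megathrust rupture area lands in the magnitude 8-9 range, consistent with events like Tohoku.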