Date | Link | Title | Summary | Body | Category | Year |
---|---|---|---|---|---|---|
July 29, 2019 | https://www.sciencedaily.com/releases/2019/07/190729151851.htm | Predicting earthquake hazards from wastewater injection after fracking | A byproduct of oil and gas production is a large quantity of toxic wastewater called brine. Well-drillers dispose of brine by injecting it into deep rock formations, where its injection can cause earthquakes. Most quakes are relatively small, but some of them have been large and damaging. | Yet predicting the amount of seismic activity from wastewater injection is difficult because it involves numerous variables. These include the quantity of brine injected, how easily brine can move through the rock, the presence of existing geological faults, and the regional stresses on those faults. Now a team of Arizona State University-led geoscientists, working under a Department of Energy grant, has developed a method to predict seismic activity from wastewater disposal. The team's study area is in Oklahoma, a state where extensive fracking and wastewater injection have been carried out and where several damaging induced earthquakes have occurred. The team's paper reporting their findings has now been published. "Overall, earthquake hazards increase with background seismic activity, and that results from changes in the crustal stress," says Guang Zhai, a postdoctoral research scientist in ASU's School of Earth and Space Exploration and a visiting assistant researcher at the University of California, Berkeley. "Our focus has been to model the physics of such changes that result from wastewater injection." Zhai is lead author for the paper, and the other scientists are Manoochehr Shirzaei, associate professor in the School, plus Michael Manga, of UC Berkeley, and Xiaowei Chen, of the University of Oklahoma. "Seismic activity soared in one area for several years after wastewater injection was greatly reduced," says Shirzaei. "That told us that existing prediction methods were inadequate." To address the problem, his team went back to basics, looking at how varying amounts of injected brine perturbed the crustal stresses and how these lead to earthquakes on a given fault. "Fluids such as brine (and natural groundwater) can both be stored in and move through rocks that are porous," says Zhai. The key was building a physics-based model that combined the rock's ability to transport injected brine and the rock's elastic response to fluid pressure. Explains Shirzaei, "Our model includes the records collected for the past 23 years of brine injected at more than 700 Oklahoma wells into the Arbuckle formation." He adds that to make the scenario realistic, the model also includes the mechanical properties of the rocks in Oklahoma. The result was that the model successfully predicted changes in the crustal stress that come from brine injection. For the final step, Shirzaei says, "We used a well-established physical model of how earthquakes begin so we could relate stress perturbations to the number and size of earthquakes." The team found that the physics-based framework does a good job of reproducing the distribution of actual earthquakes by frequency, magnitude, and time. "An interesting finding," says Zhai, "was that a tiny change in the rocks' elastic response to changes in fluid pressure can amplify the number of earthquakes by several times. It's a very sensitive factor." While wastewater injection can cause earthquakes, all major oil and gas production creates a large amount of wastewater that needs to be disposed of, and injection is the method the industry uses. "So to make this safer in the future," says Shirzaei, "our approach offers a way to forecast injection-caused earthquakes. This provides the industry with a tool for managing the injection of brine after fracking operations." Knowing the volume of brine to be injected and the location of the disposal well, authorities can estimate the probability that an earthquake of given size will result. Such probabilities can be used for short-term earthquake hazard assessment. Alternatively, the team says, given the probability that an earthquake of certain size will happen, oil and gas operators can manage the injected brine volume to keep the probability of large earthquakes below a chosen value. The end result, says Zhai, "is that this process will allow a safer practice, benefiting both the general public and the energy industry." | Earthquakes | 2019 |
July 29, 2019 | https://www.sciencedaily.com/releases/2019/07/190729123855.htm | Earthquakes: Numerical model pinpoints source of precursor to seismic signals | Numerical simulations have pinpointed the source of acoustic signals emitted by stressed faults in laboratory earthquake machines. The work further unpacks the physics driving geologic faults, knowledge that could one day enable accurately predicting earthquakes. | "Previous machine-learning studies found that the acoustic signals detected from an earthquake fault can be used to predict when the next earthquake will occur," said Ke Gao, a computational geophysicist in the Geophysics group at Los Alamos National Laboratory. "This new modeling work shows us that the collapse of stress chains inside the earthquake gouge emits that signal in the lab, pointing to mechanisms that may also be important in Earth." Gao is lead author of the paper, "From Stress Chains to Acoustic Emission," published today. Stress chains are bridges composed of grains that transmit stresses from one side of a fault block to the other. Gao works on a Los Alamos team that has identified the predictive acoustic signal in data from both laboratory quakes and megathrust regions in North America, South America and New Zealand. The signal accurately indicates the state of stress in the fault, no matter when the signal is read. "Using the numerical model that we developed at Los Alamos, we examine and connect the dynamics in a granular system of fault gouge to signals detected on passive remote monitors," Gao said. Fault gouge is the ground-up, gravelly rock material created by the stresses and movements of a fault. To investigate the cause of acoustic signals, the team conducted a series of numerical simulations on supercomputers using the Los Alamos-developed code HOSS (Hybrid Optimization Software Suite). This novel numerical tool is a hybrid methodology -- the combined finite-discrete element method. It merges techniques developed under discrete element methods, to describe grain-to-grain interactions, and under finite element methods, to describe stresses as a function of deformation within the grains and wave propagation away from the granular system. The simulations accurately mimic the dynamics of earthquake fault evolution, such as how the materials inside the gouge grind and collide with each other, and how the stress chains form and evolve over time via interactions between adjacent gouge materials. Los Alamos has funded a multi-million-dollar, multi-year program consisting of experiments, numerical modeling, and machine-learning efforts to develop and test a highly novel approach to probe the earthquake cycle and, in particular, to detect and locate stressed faults that are approaching failure. | Earthquakes | 2019 |
July 23, 2019 | https://www.sciencedaily.com/releases/2019/07/190723121916.htm | Many Dallas-Fort Worth area faults have the potential to host earthquakes, new study finds | A study led by The University of Texas at Austin has found that the majority of faults underlying the Fort Worth Basin are as sensitive to changes in stress that could cause them to slip as those that have generated earthquakes in recent years. | Researchers with UT's Bureau of Economic Geology, Stanford University and Southern Methodist University have created a comprehensive map of more than 250 faults totaling more than 1,800 miles in combined length, some of which extend under highly populated areas in the Dallas-Fort Worth region. The study was published July 23. "That means the whole system of faults is sensitive," said lead author Peter Hennings, a bureau research scientist and the principal investigator at the Center for Integrated Seismicity Research (CISR). The Fort Worth Basin saw a major increase in seismic activity from 2008 to 2015 as oil and gas operations increased, but a significant reduction in earthquakes the last four years as injection has slowed. Hennings said that people should be aware that the nature of the fault system means that many areas are susceptible to potentially hosting earthquakes, and that an upturn in oil and gas production that results in increased deep wastewater disposal could also bring an upturn in quakes if not properly managed. SMU professor and study co-author Heather DeShon pointed out that the strongest earthquakes in the area have been magnitude 4, which is much less powerful than the major earthquakes that hit California earlier this month, but said that the region needs to prepare for the hazard. "This study provides key information to allow the public, cities, state, the federal government and industry to understand potential hazards and to design effective public policies, regulations and mitigation strategies," she said. SMU compiled the earthquake history of the region -- a step that identified the need for an updated fault map in the first place. Stanford provided a map of the tectonic stress in the region and the analysis methods used to determine fault sensitivity. UT developed the new fault map and led the overall effort. The research also benefited from data provided by petroleum industry partners. The team also created maps that show the sensitivity of the faults. One shows that the faults in their natural geologic condition are relatively stable and not expected to slip if left undisturbed. Another map shows the impact of general pressure to the subsurface caused by wastewater disposal and reveals that it significantly increases the likelihood for faults to slip. "Industrial activities can increase the probability of triggering earthquakes before they would happen naturally, but there are steps we can take to reduce that probability," said co-author Jens-Erik Lund Snee, a doctoral student at Stanford University. Stanford professor and co-author Mark Zoback added that although this research focused on the Dallas-Fort Worth region, it creates a roadmap for those interested in investigating earthquake potential in other places. "The methodology could be used in other areas where induced seismicity is of concern but development activities are just beginning, making it possible to identify potentially problematic faults before induced seismicity occurs," Zoback said. "So it creates a framework for making decisions in the future." CISR and the TexNet Seismological Network, managed by the bureau at the UT Jackson School of Geosciences, played a key role in the research. TexNet is a collection of seismometers across Texas that was authorized and funded by the Texas Legislature and Gov. Greg Abbott. It has been tracking seismic activity across the state since January 2017. The study found that the Fort Worth Basin is full of faults, many of which are small in size yet susceptible to slipping. Most are less than 6 miles long -- a fact that underscores the importance of studying the area in detail using high-resolution data. "Most of the faults that have slipped are too small to have been previously recognized; they're very difficult to find," Hennings said. "We certainly haven't identified all of the faults in the region, but this new work is a big improvement compared to what was previously available." The new maps are important tools in understanding the seismic hazard underlying the Dallas-Fort Worth area, the most populated area in Texas. However, Hennings said that they provide just a general overview. He said that the team is working on a more in-depth analysis of individual faults that will give a more nuanced view of the unique factors influencing seismic hazard. "In 2020, we will be publishing an updated version of the fault map as well as a comprehensive model that indicates the degree of pressurization that each of the earthquake faults has experienced from wastewater disposal," he said. "Combined with this current work, the future research will give industry and our regulators powerful tools to use in managing the hazard and reducing risk." | Earthquakes | 2019 |
July 16, 2019 | https://www.sciencedaily.com/releases/2019/07/190716113030.htm | Stronger earthquakes can be induced by wastewater injected deep underground | Virginia Tech scientists have found that in regions where oilfield wastewater disposal is widespread -- and where injected water has a higher density than deep naturally occurring fluids -- earthquakes are getting deeper at the same rate as the wastewater sinks. | Perhaps more critically, the research team of geoscientists found that the percentage of high-magnitude earthquakes increases with depth, and may create -- although fewer in number -- greater magnitude earthquakes years after injection rates decline or stop altogether. The study, led by Ryan M. Pollyea in the Virginia Tech College of Science's Department of Geosciences, was published July 16. The problem: The wastewater sinks and increases fluid pressure deep underground when it has a higher density than fluids already there naturally. Pressure changes so deep -- at depths up to 5 miles or greater -- can cause more high-magnitude earthquakes even though the overall number of earthquakes is decreasing. "Earthquakes are now common in the central United States where the number of magnitude-3 or greater earthquakes increased from about 19 per year before 2008 to more than 400 per year since," said Pollyea, an assistant professor of geosciences and director of the Computational Geofluids Laboratory at Virginia Tech. (Pollyea adds that the overall earthquake rate per year has been declining since 2016.) "In many cases, these earthquakes occur when oilfield wastewater is disposed of by pumping it into deep geologic formations," Pollyea added. "As wastewater is injected deep underground, fluid pressure builds up and migrates away from injection wells. This destabilizes faults and causes 'injection-induced' earthquakes, such as the damaging 5.8-magnitude earthquake that struck Pawnee, Oklahoma, in 2016." Pollyea authored the study with Martin Chapman, a research associate professor of geosciences and director of the Virginia Tech Seismological Observatory, and Richard S. Jayne and Hao Wu, both graduate students at Virginia Tech. The study used computational modeling and earthquake data analysis from across a broad region of northern Oklahoma and southern Kansas -- roughly 30,000 square miles. "This was a surprising result," Chapman said. "It suggests that sinking wastewater increases fluid pressure at greater depths and may cause larger earthquakes." By analyzing earthquake data, the researchers found that the number of earthquakes greater than magnitude 4 increased more than 150 percent from 2017 to 2018, while the number of earthquakes with magnitude 2.5 or greater decreased 35 percent during the same period. More bluntly, the overall number of earthquakes is starting to decrease, but the percentage of higher-magnitude earthquakes is increasing. "Our models show that high-density wastewater may continue sinking and increasing fluid pressure at depths of 5 or more miles for 10 or more years after injections stop," Pollyea said. "There is a larger proportion of high-magnitude earthquakes at depths greater than 5 miles in north-central Oklahoma and southern Kansas, but there are fewer total earthquakes at these depths. This implies that the rate of high-magnitude earthquakes is decreasing more slowly than the overall earthquake rate." The study also found that fluid pressure caused by sinking wastewater remains in the environment much longer than previously considered. "Our models show that high-density wastewater continues sinking and increasing fluid pressure for 10 to 15 years after injections stop, and this may prolong the earthquake hazard in regions like Oklahoma and Kansas," Pollyea said. It's important to note that Pollyea and his colleagues are not saying that all oilfield wastewater disposal operations cause earthquakes, nor are they predicting a large and potentially damaging earthquake in the Midwest region. Nor does the study indicate that density-driven pressure build-up occurs everywhere that oilfield wastewater operations occur. Researchers have known since the 1960s that pumping fluids deep underground can trigger earthquakes, Pollyea said, but this study is the first to show that the density of the wastewater itself plays a role in earthquake occurrence. The heavier the fluid, the greater the effect of displacement of natural fluids and the greater the fluid pressure change. To wit: Take a cup of salty ocean water heavy with dissolved particulates and dump it into a glass of regular tap water. Before the two eventually mix, the heavier ocean water will sink to the bottom, displacing the "lighter" tap water upward. "Past pore-pressure models have assumed the density of injected fluids is the same as that in host rocks, but the real world begs to differ, and injected fluids are often heavier," said Shemin Ge, a professor and chair of the Department of Geological Sciences at the University of Colorado, who was not involved with the study. "This study looks into the effect of heavier fluids on pore pressure and consequently on inducing earthquakes. Heavier injected fluids have the tendency to migrate downward, which, interestingly, tracks the occurrence of earthquakes." The new study offers scientists, regulators, and policymakers a new framework for managing oilfield wastewater disposal and the associated earthquake hazard, according to the research team. In places such as Oklahoma, where earthquake mitigation measures are ongoing, results from this study may be particularly important because the combination of persistent and deepening fluid pressure suggests that the rate of high-magnitude earthquakes is likely decreasing slower than the overall earthquake rate. Funding for this study came from the United States Geological Survey Earthquake Hazards Program. | Earthquakes | 2019 |
July 8, 2019 | https://www.sciencedaily.com/releases/2019/07/190708131157.htm | Istanbul: Seafloor study proves earthquake risk for the first time | Collapsed houses, destroyed port facilities and thousands of victims -- on 22 May 1766 an earthquake with a magnitude of approximately 7.5 and a subsequent water surge triggered a catastrophe in Istanbul. The origin of the quake was located along the North Anatolian fault in the Sea of Marmara. It was the last major earthquake to hit the metropolis on the Bosporus. | Researchers at the GEOMAR Helmholtz Centre for Ocean Research Kiel (Germany), together with colleagues from France and Turkey, have now been able to demonstrate for the first time with direct measurements on the seafloor that considerable tectonic strain has built up again on the North Anatolian fault below the Sea of Marmara. "It would be sufficient to trigger another earthquake with a magnitude between 7.1 and 7.4," says geophysicist Dr. Dietrich Lange of GEOMAR. He is the lead author of the study, published today in an international journal. The North Anatolian fault zone marks the boundary between the Eurasian and Anatolian plates. "Strong earthquakes occur when the fault zone becomes locked. Then tectonic strain accumulates, and the seismic energy is released in an earthquake," explains Dr. Lange. The last time this happened was in 1999 at a section of the North Anatolian fault near Izmit, about 90 kilometers east of Istanbul. Tectonic strain build-up along fault zones on land has been regularly monitored for years using GPS or land surveying methods. This is not possible for seabed fault zones due to the low penetration depth of GPS satellite signals under water. However, the section of the North Anatolian fault that poses the considerable threat to the Istanbul metropolitan region is located underwater in the Marmara Sea. Up to now, it has only been possible to extrapolate, for example using land observations, whether the plate boundaries there are moving or locked. However, those methods could not distinguish between a creeping movement and the complete locking of the tectonic plates. The new GeoSEA system developed at GEOMAR, which measures acoustic distances on the seabed, now enables scientists for the first time to directly measure crustal deformation with millimetre precision. Over a period of two and a half years, a total of ten measuring instruments were installed at a water depth of 800 metres on both sides of the fault. During this time, they carried out more than 650,000 distance measurements. "In order to get measurements accurate to within a few millimetres over several hundred metres, very precise knowledge of the speed of sound underwater is required. Therefore, pressure and temperature fluctuations of the water must also be measured very precisely over the entire period," explains Prof. Dr. Heidrun Kopp, GeoSEA project manager and co-author of the current study. "Our measurements show that the fault zone in the Marmara Sea is locked and therefore tectonic strain is building up. This is the first direct proof of the strain build-up on the seabed south of Istanbul," emphasizes Dr. Lange. "If the accumulated strain is released during an earthquake, the fault zone would move by more than four metres. This corresponds to an earthquake with a magnitude between 7.1 and 7.4," adds Professor Kopp. Such an event would very probably have similarly far-reaching consequences for nearby Istanbul as the 1999 earthquake had for Izmit, which caused over 17,000 casualties. | Earthquakes | 2019 |
June 18, 2019 | https://www.sciencedaily.com/releases/2019/06/190618174356.htm | Appearance of deep-sea fish does not signal upcoming earthquake in Japan | The unusual appearance of deep-sea fish like the oarfish or slender ribbonfish in Japanese shallow waters does not mean that an earthquake is about to occur, according to a new statistical analysis. | The study was published in the Bulletin of the Seismological Society of America (BSSA). When the researchers examined the relationship between deep-sea fish appearances and earthquakes in Japan, they found only one event that could have been plausibly correlated, out of 336 fish sightings and 221 earthquakes. "As a result, one can hardly confirm the association between the two phenomena," the authors write in the BSSA paper. The study included data from November 1928 to March 2011, looking at records of deep-sea fish appearances 10 and 30 days ahead of earthquakes that occurred 50 and 100 kilometers away from the fish sighting. They confined their search to earthquakes of magnitude 6.0 or larger, since these are the earthquakes that have been linked to "precursor phenomena like unusual animal behavior in previous reports," said Orihara. There were no recorded deep-sea fish appearances before an earthquake of magnitude 7.0, and no earthquakes with a magnitude greater than 6.0 occurred within 10 days of a deep-sea fish appearance. Orihara became interested in the deep-sea fish stories after the 2011 magnitude 9.0 Tohoku earthquake in Japan. If the stories were true, he said, deep-sea fish appearances could be used in disaster mitigation efforts. "From this motivation, we started compiling the event catalog for statistical study," said Orihara. "There were some previous papers to survey deep-sea fish appearances. However, their reports were insufficient for a statistical study. To collect a lot of events, we focused on local newspapers that have often reported the events." The researchers scoured a digitized database of newspaper articles to find mentions of the unusual appearance of deep-sea fish that folklore says will appear before an earthquake, including oarfish, several kinds of ribbonfish, dealfish and the unicorn crestfish. Orihara said that he and his colleagues expect that their results "will have an influence on people who believe the folklore," and they hope that their findings will be shared throughout Japan. | Earthquakes | 2019 |
June 14, 2019 | https://www.sciencedaily.com/releases/2019/06/190614125848.htm | Satellite observations improve earthquake monitoring, response | Researchers at the University of Iowa and the U.S. Geological Survey have found that data gathered from orbiting satellites can provide more accurate information on the impact of large earthquakes, which, in turn, can help provide more effective emergency response. | The satellite imagery provides detailed information about where the earthquakes occurred, how big the surface deformation was, and where the earthquakes occurred relative to population centers, typically within two to three days of the earthquake. This information was then incorporated into a set of operational response guides managed by the USGS National Earthquake Information Center (NEIC) that is distributed to decision makers, search and rescue operations, and other groups. The paper was published online June 6. "This, in turn, led to more accurate estimates of the numbers of fatalities and economic losses that are critical to more accurately determine in the days and weeks following devastating earthquakes," says Bill Barnhart, assistant professor in the Department of Earth and Environmental Sciences at the UI and a lead author on the study. Mainstays in determining an earthquake's impact are ground-based seismometers that measure seismic activity around the world. But these instruments are not located everywhere, which can lead to incomplete information about the effects of some earthquakes in the critical time immediately after they occur. Moreover, some quakes are more complex and can't be measured precisely by seismometers alone. Increasingly, earthquake specialists are turning to geodetic methods -- the math-based study of changes in the Earth's shape -- that use satellites and other instruments to complement data gathered by seismometers. "While this is not yet a fully operational system, we are working with the USGS to make operational earthquake response with satellite imagery a systematic component of the NEIC's global earthquake monitoring and response efforts," Barnhart says. One example is the work done by Emma Mankin, a UI senior and geoscience major who will graduate in December. Mankin processed radar imagery, or interferograms, from a magnitude 6.9 quake that struck Indonesia in August 2018. She then used this imagery to produce a model of the earthquake and where it was located. The USGS used this model directly to update its predictions of ground shaking and earthquake impact that were incorporated into its disaster-response systems. "Emma's rapid work on the Indonesia earthquake directly contributed to the operational analysis of a global earthquake," Barnhart says. "Her contributions improved earthquake impact estimates for that event and helped to further demonstrate that these satellite approaches can provide actionable information that benefits society." Contributing authors include Gavin Hayes and David Wald from the USGS National Earthquake Information Center. The USGS funded the research. | Earthquakes | 2019 |
June 12, 2019 | https://www.sciencedaily.com/releases/2019/06/190612141418.htm | Future tsunamis possible in the Red Sea's Gulf of Elat-Aqaba | Researchers who took a closer look at a 1995 tsunami in the Gulf of Elat-Aqaba, at the northeastern tip of the Red Sea, say that the gulf's surrounding countries should prepare for future tsunami hazards in this economically developing and vital region. | A team of scientists led by Amos Salamon of the Geological Survey of Israel simulated tsunamis using the GeoClaw modeling program, paired with models of the magnitude 7.2 Nuweiba earthquake, which led to the 1995 tsunami in the gulf. They present their conclusions in a newly published paper. Four countries -- Egypt, Saudi Arabia, Jordan and Israel -- border the Gulf of Elat-Aqaba. Nuweiba is an Egyptian coastal town on the gulf. The tsunami reached its greatest wave height -- three to four meters at the Nuweiba harbor -- and brought the most serious damage to a platform there, with some minor damage also occurring to local nomad dwellings along the coast and to the Aqaba Port. Since 1995, the Gulf of Elat-Aqaba has grown in economic importance for all four countries, supporting shipping ports, tourism and potentially large regional water and electrical projects such as Jordan's proposed Red Sea-Dead Sea Water Conveyance (sometimes called the "Peace Conduit") pipeline. The tsunami was a surprise at the time, happening in a closed gulf far away from the open ocean, but has been mostly forgotten, said Salamon. "People remember the earthquake and the shaking rather than the tsunami. It went 'below the radar,' and was left hidden in reports and papers." The 1995 earthquake took place along the Dead Sea Transform, the fault system that runs from southeastern Turkey to the southern tip of the Sinai Peninsula. "We have already learned that earthquakes along the fault system have the potential to generate tsunamis in the Eastern Mediterranean through seismogenic submarine landslides, but now we realize that it is also capable of generating tsunamis in the gulf by co-seismic deformation as well as by submarine landslides," Salamon said. The motion along the fault that ruptured in the 1995 Nuweiba earthquake was predominantly strike-slip, where rocks slide past each other in a horizontal fashion along a vertical fracture. The most damaging tsunamis around the globe have been associated with thrust faults, where one slab of crust rides over the top of another slab. But researchers have noted several tsunamis related to strike-slip faults in recent years, including the magnitude 7.5 Palu earthquake in Indonesia last year, even though these ruptures do not tend to produce as much vertical upheaval as a thrust fault. "In general the concept is that strike-slip earthquakes do not generate significant tsunamis, and we thought it is of great importance to show that we do have this," Salamon said. "There was relatively small and limited damage here from this tsunami, but we should not ignore that this unique seismotectonic setting includes active normal faults along the gulf's shoulders as well as steep submarine slopes along which we have identified fresh scars that may have evidence of past tsunamis." One of the most important pieces of information from the 1995 event, which allowed the researchers to perform their simulations, was a 1995 mareogram, or tidal gauge reading, that Salamon's student Eran Frucht discovered. These data gave them a way to judge the accuracy of their simulations. The researchers concluded, for instance, that the peak three- to four-meter wave height in Nuweiba Harbor may have been a local phenomenon affected by the shape of the harbor or by a nearby underwater landslide. Salamon and his colleagues are now conducting a larger-scale earthquake and tsunami hazard evaluation for the entire Gulf of Elat-Aqaba region. The evaluation is particularly important because there are few historical records in the area on past tsunami size and frequency to guide future estimates. The worst-case scenario, said Salamon, would be an earthquake on a fault that crosses from the land into the gulf, which could produce severe ground shaking in the surrounding cities, submarine landslides and subsidence of the coastline that amplifies inundation from a tsunami. Salamon said that the hazard evaluation would give scientists in all four countries surrounding the gulf a better understanding of whether to expect this worst-case scenario. "We thought that if we do this kind of research, it would be relevant for our neighbors as well." | Earthquakes | 2019 |
June 12, 2019 | https://www.sciencedaily.com/releases/2019/06/190612084351.htm | Models suggest faults are linked through California's Imperial Valley | New mechanical modeling of a network of active strike-slip faults in California's Imperial Valley suggests the faults are continuously linked, from the southern San Andreas Fault through the Imperial Fault to the Cerro Prieto fault further to the south of the valley. | Although more studies are needed to understand the slip rates and exact relationships between these faults, linkage could increase the likelihood that they might rupture together and cause larger than expected shaking in Los Angeles, Imperial, Riverside and San Bernardino counties in California, Jacob Dorsett and colleagues suggest in their analysis, published in the Bulletin of the Seismological Society of America (BSSA). While its faults may not be as well-known as the San Andreas and San Jacinto faults to the north, the region has hosted several damaging earthquakes, including a magnitude 6.9 rupture along the Imperial Fault in 1940 and a magnitude 6.6 earthquake on the same fault in 1979, along with magnitude 6.2 and 6.6 earthquakes on the Elmore Ranch and Superstition Hills faults, respectively, in 1989. "We know that faults here have hosted earthquakes in the past, but understanding the three-dimensional fault shapes in the area, and specifically how the faults connect to and interact with one another, is necessary to better understand earthquake behavior. This project, to determine fault linkage and how it relates to fault slip rates here, is a necessary first step toward the future goal of mitigating risk," said co-author Elizabeth Madden at Universidade de Brasília, who first advanced the idea of the BSSA study. Dorsett, who worked on the study as an undergraduate at Appalachian State University and is now pursuing a Master's degree at Indiana University, said the new analysis could prove useful to seismologists working on large-scale computational models to better forecast California earthquakes. "Our work suggests that the San Andreas effectively does not end at the Salton Sea, but is most likely physically connected to the Imperial fault, and then to the Cerro Prieto fault in Mexico," he said. "All other factors equal, if the San Andreas is connected to these other structures, then it makes the chances of a longer rupture -- and a larger magnitude -- more likely." The system of faults in the Imperial Valley accommodates most of the motion between the Pacific and North American tectonic plates in southern California. The complex spatial arrangement of the faults, however, makes it difficult to know whether this motion is mostly transferred via slip along major faults or through significant off-fault deformation of the crust, Dorsett noted. To better understand slip rates and linkage among these faults, the researchers created a suite of 3D models that simulate long-term Pacific-North American plate motion, basing the initial fault geometry on the SCEC Community Fault Model 5.0, a comprehensive database of known faults in southern California. They developed four geologically plausible fault configurations with different degrees of connectivity between the Imperial fault and the Coachella (southern) part of the San Andreas Fault to the north and the Cerro Prieto fault in the south. They then compared slip rates along those faults, as suggested by the mechanical models, with average fault slip rates calculated by the Uniform California Earthquake Rupture Forecast version 3, or UCERF3, the most recent earthquake rupture forecast for California. Their model results suggest the Coachella-San Andreas Fault, the Brawley Seismic Zone, the Imperial Fault and the Cerro Prieto fault are mechanically linked and form a continuous fault structure. The analysis also suggests that the Dixieland fault in the west part of the valley, in particular, may accommodate a more significant part of the plate motion than previously suspected, Dorsett noted, while the models also indicate that there may be a significant amount of seismic strain that occurs off of the faults. "Like others before us, we definitely suspected that there could be faults linking the San Andreas to the Imperial and Cerro Prieto faults, and it's great that our study suggests that these linking faults are needed to correctly capture the plate motions," said Dorsett. | Earthquakes | 2019 |
June 11, 2019 | https://www.sciencedaily.com/releases/2019/06/190611133943.htm | Catalog of north Texas earthquakes confirms continuing effects of wastewater disposal | A comprehensive catalog of earthquake sequences in Texas's Fort Worth Basin, from 2008 to 2018, provides a closer look at how wastewater disposal from oil and gas exploration has changed the seismic landscape in the basin. | In their report, published in the Bulletin of the Seismological Society of America (BSSA), the researchers conclude that seismicity has tapered off along with injection volumes since their 2014 peak. However, their analysis also notes that new faults have become active during this period, and that seismicity continues at a greater distance from injection wells over time, suggesting that "far-field" changes in seismic stress will be important for understanding the basin's future earthquake hazard potential. "One thing we have come to appreciate is how broadly injection in the basin has modified stress within the entire basin," said Heather DeShon, a seismologist at Southern Methodist University (SMU). The first thing researchers noted with wastewater injection into the basin "was the reactivation of individual faults," she added, "and what we're now starting to see is essentially the leftover energy on all sorts of little faults being released by the cumulative volume that's been put into the basin." The earthquake catalog published in BSSA reports all seismicity recorded by networks operated by SMU between 2008 and 2018. Some seismic sequences in the catalog -- such as the 2008 Dallas Fort Worth Airport earthquakes -- are well-known and well-studied, while others, such as the 2018 west Cleburne sequence, are reported in the paper for the first time. DeShon said publishing the complete catalog was important in part to help people recognize that "there are earthquakes throughout the basin, not just on these three or four sequences that have garnered a lot of press attention." The researchers found that overall seismicity in the Fort Worth Basin has been strongly correlated in time and space with wastewater injection activities, with most seismicity occurring within 15 kilometers of disposal wells. Wastewater disposal volume began to decrease from its peak in 2014, mostly as a result of lower oil and gas prices, and the study shows "tapering off of seismicity along the faults that were near high-injection wells," said Quinones. There are exceptions to this pattern, including the 2015 Irving-Dallas and 2017 Lake Lewisville sequences that have no wells within 15 kilometers. Induced earthquakes occur when wastewater injected back into the ground increases the pore pressure within the rocks and affects stress along faults in surrounding rock layers. In the Fort Worth Basin, these stress changes may propagate far -- more than 10 kilometers -- from the injection wells, the researchers suggest. "Injection rates peaked in 2014, but we still don't understand how spatially extensive the modification of pore pressure is at depth, so we still don't understand how the hazard is going to reduce with time," said DeShon. There are still far fewer induced earthquakes in the Fort Worth Basin compared to regions such as Oklahoma, which also has experienced a dramatic increase in seismicity in the past decade as the result of wastewater disposal from oil and gas production. The volumes of injected wastewater are much higher in Oklahoma, and the faults there tend to be much closer together, DeShon said. By contrast, Quinones said, faults in the Fort Worth Basin are more widely spaced, and there are few instances of earthquakes jumping between faults. However, the dense population of the Dallas-Fort Worth metropolitan area makes it critical to continue monitoring the region's induced earthquake risk, comparing seismic data with more information on wastewater injection. For the moment, DeShon said, researchers only have access to monthly cumulative volume and average pressure at injection wellheads, in a report that is updated once a year. "It would be best if injection data were provided in a more timely fashion in Texas, and if more detailed daily information on injection rates and volumes and some measurements of downhole pressure were provided," she said. | Earthquakes | 2019 |
June 7, 2019 | https://www.sciencedaily.com/releases/2019/06/190607193701.htm | Dashing the dream of ideal 'invisibility' cloaks for stress waves | Whether Harry Potter's invisibility cloak, which perfectly steers light waves around objects to make them invisible, will ever become reality remains to be seen, but perfecting a more crucial cloak is impossible, a new study says. It would have perfectly steered stress waves in the ground, like those emanating from a blast, around objects like buildings to make them "untouchable." | Despite casting deep doubt on dozens of theoretical papers on "elastodynamic" cloaking, the new study's authors from the Georgia Institute of Technology don't think civil engineers should completely give up on it, just on the idea of an ideal cloak. Limited cloaking could still add a degree of protection to structures, particularly against some stress waves common in earthquakes. "With cloaking, there is this expectation that if you get any kind of stress wave from any kind of direction, a cloak should be able to hide the object from it. We now see that it is not possible," said principal investigator Arash Yavari, a professor in Georgia Tech's School of Civil and Environmental Engineering and in the George W. Woodruff School of Mechanical Engineering. "But for a large class of disturbances, namely the in-plane disturbances, you could probably design a good cloak." In an earthquake, in-plane disturbances are seismic waves that track along flat and broad -- or planar -- paths through the surface of the Earth. Yavari and coauthor Ashkan Golgoon, a graduate research assistant studying with Yavari, published the study. The dream of cloaking to steer stress waves past a structure like it isn't even there has a lot in common with the dream of an invisibility cloak, which would bend light -- electromagnetic waves -- around an object then point it out the other side. The light waves hitting the viewer's eye would reveal what is behind the object but not the object itself. In elastodynamic cloaking, the waves are not electromagnetic but mechanical, moving through the ground. Hypothetically, cloaking the object would completely isolate it from the waves. In a scenario to protect, say, a nuclear reactor from any stress waves traveling through the ground, whether from a natural or human-made calamity, ideally, civil engineers might lower the base of the reactor into a hole below the surface of the ground. They would build a protective cylinder or a half-spherical underground bowl around it with special materials to steer the stress waves around the circle. There are dreams, then there are the study's findings. "We proved that the shape of the cloak does not matter, whether spherical or cylindrical; you can't completely cloak," Yavari said. A lot of theory and math from electromagnetic (light) cloaking has been transferred onto elastodynamic cloaking research, and some of the former appears to have thrown a wrench into the latter. "Many times, analogies from other fields are useful, but elasticity adds multiple physical factors that you don't have in electromagnetism," Yavari said. "For example, the balance of angular momentum is being violated in much of the research literature." Angular momentum is a property of mass in rotational motion, and it is resistant to changes. Many people have experienced angular momentum by tilting a spinning gyroscope and watching it stubbornly move down an unexpected path. Although it's a wave, light is photons, which have no mass. Stress waves, on the other hand, travel through matter -- specifically, solid matter as opposed to liquid or gas -- and that adds pivotal real-world dynamics to the equation. Those dynamics also affect that hole that hides the object. Without it, the stress waves travel pretty uniformly through a medium, but with it, stresses concentrate around the hole and mess up the neat geometry of the wave patterns. What to do? Cloak anyway. If the ideal solution does not exist, make an imperfect one. "The math says that cloaking is not possible in the strict sense. When you understand that, you don't waste time," Yavari said. "You formulate problems that optimize with what you do know around targeted stresses or loads you want to protect against." Engineers could protect against important earthquake stresses if they use materials that have been specifically pre-stressed and have certain elastic properties and distributions of densities that are detailed in the study. A real-life cloak can fall short of an ideal and still be great. "If instead of 100 percent of the wave energy I only feel 10 or 20 percent, it's a huge deal, because engineering is not a pursuit of absolute ideals," Yavari said. Even the ancient Romans, notoriously math-phobic, appear to have inadvertently built seismic cloaks in their design of amphitheaters, according to a previously published report. The new study also examined a popular idea in civil engineering that building with a family of materials that have a microstructure making them "Cosserat solids" might allow for perfect cloaking. The authors concluded that this also can't work. The study did not consider so-called metamaterials, which have received attention for rerouting, in particular, light waves. | Earthquakes | 2019 |
June 7, 2019 | https://www.sciencedaily.com/releases/2019/06/190607140446.htm | Disturbed sleep linked to mental health problems in natural disaster survivors | Preliminary results from a new study suggest that sleep disturbances are associated with mental health problems among survivors of a natural disaster even two years after the event. | The researchers surveyed survivors two years after the 2010 earthquake in Haiti. Results show that 94% reported experiencing subsequent insomnia symptoms after the disaster. Two years later, 42% showed clinically significant levels of post-traumatic stress disorder (PTSD), and nearly 22% had symptoms of depression. There were significant positive correlations between sleep disturbances and peritraumatic distress (i.e., emotional reaction during and immediately after the event), PTSD, and symptoms of depression. Resilience did not appear to be a buffer against sleep disturbance. "This is one of the first epidemiological studies to investigate the prevalence of sleep disturbances among survivors of the 2010 Haiti earthquake," said lead author and principal investigator Judite Blanc, Ph.D., an NIH T32 postdoctoral fellow in the Center for Healthful Behavior Change within the Division of Health and Behavior in the Department of Population Health at NYU School of Medicine. "Our study underscores the strong association between common trauma-related disorders and comorbid sleep conditions among a group of survivors." The study involved 165 participants with a mean age of about 31 years; 52% were men. Participants were living in Port-au-Prince, Haiti, one of the areas affected by the 2010 earthquake. According to the authors, it was the most devastating earthquake in the country's history, killing more than 200,000 people and displacing more than 1 million residents. Measures included demographic factors, the Peritraumatic Distress Inventory, the PTSD Checklist Specific, the Beck Depression Inventory, and the Connor-Davidson Resilience Scale. Spearman correlations and multilinear regressions were used to explore associations among resilience, PTSD, depression, and sleep disturbances. "Findings from our study highlight the need to assess and treat sleep issues among disaster survivors, as they are highly prevalent after a natural disaster and are related to mental health conditions," said Blanc. "Our work supports the importance of sleep in disaster preparedness programs globally." Blanc added that sleep is often neglected in the aftermath of traumatic events, but in these situations, sleep should be considered an important target of mental and physical health interventions. "Our results make the case that sleep health should be a major component of all public and global health programs and specifically in humanitarian crises," said Blanc. The research abstract was published recently in an online journal supplement. | Earthquakes | 2019 |
June 7, 2019 | https://www.sciencedaily.com/releases/2019/06/190607091035.htm | How tides can trigger earthquakes | The tides are turning in a quest to solve an earthquake mystery. | Years ago, scientists realized that earthquakes along mid-ocean ridges -- those underwater mountain ranges at the edges of the tectonic plates -- are linked with the tides. But nobody could figure out why there's an uptick in tremors during low tides. "Everyone was sort of stumped, because according to conventional theory, those earthquakes should occur at high tides," explained Christopher Scholz, a seismologist at Columbia University's Lamont-Doherty Earth Observatory. In a newly published study, he and his colleagues offer an explanation. "It's the magma chamber breathing, expanding and contracting due to the tides, that's making the faults move," said Scholz, who co-led the study along with Lamont-Doherty graduate student Yen Joe Tan. The low tide correlation is surprising because of the way the mid-ocean fault moves. Scholz described the fault as a tilted plane that separates two blocks of earth. During movement, the upper block slides down with respect to the lower one. So, scientists expected that at high tides, when there is more water sitting on top of the fault, it would push the upper block down and cause the earthquakes. But that's not what happens. Instead, the fault slips down during low tide, when forces are actually pulling upwards -- "which is the opposite of what you'd expect," said Scholz. To get to the bottom of the mystery, he, Tan, and Fabien Albino from the University of Bristol studied the Axial Volcano along the Juan de Fuca Ridge in the Pacific Ocean. Because the volcano erupts every ten years or so, scientists have set up dense networks of ocean bottom instruments to monitor it. The team used the data from those instruments to model and explore different ways the low tides could be causing the tremors. In the end, it came down to a component that no one else had considered before: the volcano's magma chamber, a soft, pressurized pocket below the surface. The team realized that when the tide is low, there is less water sitting on top of the chamber, so it expands. As it puffs up, it strains the rocks around it, forcing the lower block to slide up the fault and causing earthquakes in the process. Furthermore, said Scholz, the tidal earthquakes in this region are "so sensitive that we can see details in the response that nobody could ever see before." When the team charted the earthquake rate versus the stress on the fault, they realized that even the tiniest stress could trigger an earthquake. The tidal data helped to calibrate this effect, but the triggering stress could be caused by anything -- such as the seismic waves from another earthquake, or fracking wastewater pumped into the ground. "People in the hydrofracking business want to know, is there some safe pressure you can pump and make sure you don't produce any earthquakes?" said Scholz. "And the answer that we find is that there isn't any -- it can happen at any level of stress." Of course, a small stress over a small area isn't going to cause a devastating earthquake, and the exact amount of stress needed varies from place to place. "Our point is there's no intrinsic stress that has to be exceeded to cause an earthquake," says Scholz. "There isn't any rule of thumb." | Earthquakes | 2019 |
June 5, 2019 | https://www.sciencedaily.com/releases/2019/06/190605133511.htm | Glacial sediments greased the gears of plate tectonics | Earth's outer layer is composed of giant plates that grind together, sliding past or dipping beneath one another, giving rise to earthquakes and volcanoes. These plates also separate at undersea mountain ridges, where molten rock spreads from the centers of ocean basins. | But this was not always the case. Early in Earth's history, the planet was covered by a single shell dotted with volcanoes -- much like the surface of Venus today. As Earth cooled, this shell began to fold and crack, eventually creating Earth's system of plate tectonics. According to new research, the transition to plate tectonics started with the help of lubricating sediments, scraped by glaciers from the slopes of Earth's first continents. As these sediments collected along the world's young coastlines, they helped to accelerate the motion of newly formed subduction faults, where a thinner oceanic plate dips beneath a thicker continental plate. The new study was published June 6, 2019. The findings suggest that sediment lubrication controls the rate at which Earth's crust grinds and churns. Study authors Sobolev and Brown found that two major periods of worldwide glaciation, which resulted in massive deposits of glacier-scrubbed sediment, each likely caused a subsequent boost in the global rate of plate tectonics. The most recent such episode followed the "snowball Earth" that ended sometime around 635 million years ago, resulting in Earth's modern plate tectonic system. "Earth hasn't always had plate tectonics and it hasn't always progressed at the same pace," Brown said. "It's gone through at least two periods of acceleration. There's evidence to suggest that tectonics also slowed to a relative crawl for nearly a billion years. In each case, we found a connection with the relative abundance -- or scarcity -- of glacial sediments." Just as a machine needs grease to keep its parts moving freely, plate tectonics operates more efficiently with lubrication. While it may be hard to confuse the gritty consistency of clay, silt, sand and gravel with a slippery grease, the effect is largely the same at the continental scale, in the ocean trenches where tectonic plates meet. "The same dynamic exists when drilling Earth's crust. You have to use mud -- a very fine clay mixed with water or oil -- because water or oil alone won't work as well," Brown said. "The mud particles help reduce friction on the drill bit. Our results suggest that tectonic plates also need this type of lubrication to keep moving." Previous research on the western coast of South America was the first to identify a relationship between sediment lubrication and friction along a subduction fault. Off the coast of northern Chile, a relative lack of sediment in the fault trench creates high friction as the oceanic Nazca plate dips beneath the continental South America plate. This friction helped to push the highest peaks of the central Andes Mountains skyward as the continental plate squashed and deformed. In contrast, further south there is a higher sediment load in the trench, resulting in less friction. This caused less deformation of the continental plate and, consequently, created smaller mountain peaks. But these findings were limited to one geographic area. For their study, Sobolev and Brown used a geodynamic model of plate tectonics to simulate the effect of sediment lubrication on the rate of subduction. To verify their hypothesis, they checked for correlations between known periods of widespread glaciation and previously published data that indicate the presence of continental sediment in the oceans and trenches. For this step, Sobolev and Brown relied on two primary lines of evidence: the chemical signature of the influence of continental sediments on the chemistry of the oceans, and indicators of sediment contamination in subduction-related volcanoes, much like those that make up today's "ring of fire" around the Pacific Ocean. According to Sobolev and Brown's analysis, plate tectonics likely emerged on Earth between 3 and 2.5 billion years ago, around the time when Earth's first continents began to form. This time frame also coincides with the planet's first continental glaciation. A major boost in plate tectonics then occurred between 2.2 and 1.8 billion years ago, following another global ice age that scrubbed massive amounts of sediments into the fault trenches at the edges of the continents. The next billion years, from 1.75 billion to 750 million years ago, saw a global reduction in the rate of plate tectonics. This stage of Earth's history was so sedate, comparatively speaking, that it earned the nickname "the boring billion" among geologists. Later, following the global "snowball Earth" glaciation that ended roughly 635 million years ago, the largest surface erosion event in Earth's history may have scrubbed more than a vertical mile of thickness from the surface of the continents. According to Sobolev and Brown, when these sediments reached the oceans, they kick-started the modern phase of active plate tectonics. | Earthquakes | 2019 |
June 4, 2019 | https://www.sciencedaily.com/releases/2019/06/190604101644.htm | Supercomputing dynamic earthquake rupture models | Some of the world's most powerful earthquakes involve multiple faults, and scientists are using supercomputers to better predict their behavior. Multi-fault earthquakes can span fault systems of tens to hundreds of kilometers, with ruptures propagating from one segment to the other. During the last decade, scientists have observed several cases of this complicated type of earthquake. Major examples include the magnitude (abbreviated M) 7.1 2010 Darfield earthquake in New Zealand; the M7.2 El Mayor -- Cucapah earthquake in Mexico immediately south of the US-Mexico border; the 2012 magnitude 8.6 Indian Ocean Earthquake; and perhaps the most complex of all, the M7.8 2015 Kaikoura earthquake in New Zealand. | "The main findings of our work concern the dynamic interactions of a postulated network of faults in the Brawley seismic zone in Southern California," said Christodoulos Kyriakopoulos, a Research Geophysicist at the University of California, Riverside. He's the lead author of a study published in April of 2019.
A dynamic rupture model is a model that allows scientists to study the fundamental physical processes that take place during an earthquake. With this type of model, supercomputers can simulate the interactions between different earthquake faults. For example, the models allow study of how seismic waves travel from one fault to another and influence its stability. In general, Kyriakopoulos said that these types of models are very useful to investigate big earthquakes of the past, and perhaps more importantly, possible earthquake scenarios of the future.
The numerical model Kyriakopoulos developed consists of two main components. First is a finite element mesh that implements the complex network of faults in the Brawley seismic zone. "We can think of that as a discretized domain, or a discretized numerical world that becomes the base for our simulations. The second component is a finite element dynamic rupture code, known as FaultMod (Barall et al., 2009), that allows us to simulate the evolution of earthquake ruptures, seismic waves, and ground motion with time," Kyriakopoulos said. "What we do is create earthquakes in the computer. We can study their properties by varying the parameters of the simulated earthquakes. Basically, we generate a virtual world where we create different types of earthquakes. That helps us understand how earthquakes in the real world are happening."
"The model helps us understand how faults interact during earthquake rupture," he continued. "Assume an earthquake starts at point A and travels towards point B. At point B, the earthquake fault bifurcates, or splits in two parts. How easy would it be for the rupture, for example, to travel on both segments of the bifurcation, versus taking just one branch or the other? Dynamic rupture models help us to answer such questions using basic physical laws and realistic assumptions."
Modeling realistic earthquakes on a computer isn't easy. Kyriakopoulos and his collaborators faced three main challenges. "The first challenge was the implementation of these faults in the finite element domain, in the numerical model. In particular, this system of faults consists of an interconnected network of larger and smaller segments that intersect each other at different angles.
It's a very complicated problem," Kyriakopoulos said.
The second challenge was to run dozens of large computational simulations. "We had to investigate as much as possible a very large part of parameter space. The simulations included the prototyping and the preliminary runs for the models. The Stampede supercomputer at TACC was our strong partner in this first and fundamental stage in our work, because it gave me the possibility to run all these initial models that helped me set my path for the next simulations." The third challenge was to use optimal tools to properly visualize the 3-D simulation results, which in their raw form consist simply of huge arrays of numbers. Kyriakopoulos did that by generating photorealistic rupture simulations using the freely available software ParaView.
To overcome these challenges, Kyriakopoulos and colleagues used the resources of XSEDE, the NSF-funded Extreme Science and Engineering Discovery Environment. They used the Stampede supercomputer at the Texas Advanced Computing Center (TACC) and Comet at the San Diego Supercomputer Center (SDSC). Kyriakopoulos' related research includes XSEDE allocations on TACC's Stampede2 system.
"Approximately one-third of the simulations for this work were done on Stampede, specifically, the early stages of the work," Kyriakopoulos said. "I would have to point out that this work was developed over the last three years, so it's a long project. I would like to emphasize, also, how the first simulations, again, the prototyping of the models, are very important for a group of scientists that have to methodically plan their time and effort. Having available time on Stampede was a game-changer for me and my colleagues, because it allowed me to set the right conditions for the entire set of simulations. To that, I would like to add that Stampede and in general XSEDE is a very friendly environment and the right partner to have for large-scale computations and advanced scientific experiments."
Their team also briefly used SDSC's Comet in this research, mostly for test runs and prototyping. "My overall experience, and mostly based on other projects, with SDSC is very positive. I'm very satisfied from the interaction with the support team that was always very fast in responding to my emails and requests for help. This is very important for an ongoing investigation, especially in the first stages where you are making sure that your models work properly. The efficiency of the SDSC support team kept my optimism very high and helped me think positively for the future of my project."
XSEDE had a big impact on this earthquake research. "The XSEDE support helped me optimize my computational work and organize better the scheduling of my computer runs. Another important aspect is the resolution of problems related to the job scripting and selecting the appropriate resources (e.g., amount of RAM and number of nodes). Based on my overall experience with XSEDE I would say that I saved 10-20% of personal time because of the way XSEDE is organized," Kyriakopoulos said.
"My participation in XSEDE gave a significant boost in my modeling activities and allowed me to explore better the parameter space of my problem.
I definitely feel part of a big community that uses supercomputers and has a common goal, to push forward science and produce innovation," Kyriakopoulos said.
Looking at the bigger scientific context, Kyriakopoulos said that their research has contributed towards a better understanding of multi-fault ruptures, which could lead to better assessments of the earthquake hazard. "In other words, if we know how faults interact during earthquake ruptures, we can be better prepared for future large earthquakes -- in particular, how several fault segments could interact during an earthquake to enhance or interrupt major ruptures," Kyriakopoulos said.
Some of the results from this research point to the possibility of a multi-fault earthquake in Southern California, which could have dire consequences. "Under the current parametrization and the current model assumptions, we found that a rupture on the Southern San Andreas fault could propagate south of Bombay Beach, which is considered to be the southern end of the southern San Andreas fault. In this case, if a rupture actually propagates south of Bombay Beach, it could conceivably sever Interstate 8, which is considered to be a lifeline between eastern and western California in the case of a large event," Kyriakopoulos said.
"Second, we found that a medium-sized earthquake nucleating on one of these cross faults could actually trigger a major event on the San Andreas fault. But this is only a very small part in this paper. And it's actually the topic of our ongoing and future work," he added.
"This research has provided us with a new understanding of a complex set of faults in Southern California that have the potential to impact the lives of millions of people in the United States and Mexico. Ambitious computational approaches, such as those undertaken by this research team in collaboration with XSEDE, make more realistic physics-based earthquake models possible," said National Science Foundation Earth Sciences Program Director Eva Zanzerkia.
Said Kyriakopoulos: "Our planet is a complex physical system. Without the support from supercomputer facilities, we would not be able to numerically represent this complexity and specifically in my field analyze in depth the geophysical processes behind earthquakes."
Stampede, Stampede2, Comet, and the Extended Collaborative Support Services program are allocated resources of the Extreme Science and Engineering Discovery Environment (XSEDE) funded by the National Science Foundation (NSF). | Earthquakes | 2,019
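To give a concrete feel for the physics a dynamic rupture code integrates, the sketch below runs a zero-dimensional spring-slider with linear slip-weakening friction -- the textbook analog of what a code like FaultMod solves over a full 3-D finite element mesh. It is an illustration, not the study's model, and all parameter values are assumptions.

```python
# Zero-dimensional spring-slider with linear slip-weakening friction: a
# minimal analog of the physics a dynamic rupture code solves in 3-D.
# All parameter values are illustrative assumptions.
k = 5.0e6      # elastic unloading stiffness (Pa per meter of slip)
m = 1.0e4      # effective mass per unit fault area (kg/m^2)
tau_s = 3.0e6  # static (peak) strength (Pa)
tau_d = 1.0e6  # dynamic (residual) strength (Pa)
d_c = 0.1      # slip-weakening distance (m)

def strength(slip):
    """Linear slip-weakening friction law."""
    return tau_d if slip >= d_c else tau_s - (tau_s - tau_d) * slip / d_c

stress0 = tau_s + 1.0e3   # load the fault just past its peak strength
dt, slip, vel, t = 1.0e-4, 0.0, 0.0, 0.0
for _ in range(1_000_000):
    shear = stress0 - k * slip             # stress falls as slip accrues
    vel += (shear - strength(slip)) / m * dt
    if vel <= 0.0:                         # rupture arrests when friction wins
        break
    slip += vel * dt
    t += dt
print(f"event duration ~{t:.2f} s, total slip ~{slip:.2f} m")
```

Because the strength drops faster with slip than the elastic loading unloads, the slider accelerates spontaneously and then arrests, which is the same instability that lets a rupture jump from one fault segment to another in the full 3-D models.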
May 29, 2019 | https://www.sciencedaily.com/releases/2019/05/190529145107.htm | Scientists find telling early moment that indicates a coming megaquake | Scientists combing through databases of earthquakes since the early 1990s have discovered a possible defining moment 10-15 seconds into an event that could signal a magnitude 7 or larger megaquake. | Likewise, that moment -- gleaned from GPS data on the peak rate of acceleration of ground displacement -- can indicate a smaller event. GPS picks up an initial signal of movement along a fault similar to a seismometer detecting the smallest first moments of an earthquake.
Such GPS-based information potentially could enhance the value of earthquake early warning systems, such as the West Coast's ShakeAlert, said Diego Melgar, a professor in the Department of Earth Sciences at the University of Oregon.
The physics-heavy analyses of two databases maintained by co-author Gavin P. Hayes of the U.S. Geological Survey's National Earthquake Information Center in Colorado detected a point in time where a newly initiated earthquake transitions into a slip pulse where mechanical properties point to magnitude.
Melgar and Hayes also were able to identify similar trends in European and Chinese databases. Their study was detailed in a paper published online May 29.
"To me, the surprise was that the pattern was so consistent," Melgar said. "These databases are made different ways, so it was really nice to see similar patterns across them."
Overall, the databases contain data from more than 3,000 earthquakes. Consistent indicators of displacement acceleration that surface between 10-20 seconds into events were seen for 12 major earthquakes occurring in 2003-2016.
GPS monitors exist along many land-based faults, including at ground locations near the 620-mile-long Cascadia subduction zone off the U.S. Pacific Northwest coast, but their use is not yet common in real-time hazard monitoring. GPS data shows initial movement in centimeters, Melgar said.
"We can do a lot with GPS stations on land along the coasts of Oregon and Washington, but it comes with a delay," Melgar said. "As an earthquake starts to move, it would take some time for information about the motion of the fault to reach coastal stations. That delay would impact when a warning could be issued. People on the coast would get no warning because they are in a blind zone."
This delay, he added, would only be ameliorated by sensors on the seafloor to record this early acceleration behavior.
Having these capabilities on the seafloor and monitoring data in real time, he said, could strengthen the accuracy of early warning systems. In 2016, Melgar, as a research scientist at Berkeley Seismological Laboratory in Berkeley, California, led a study published in Geophysical Research Letters that found real-time GPS data could provide an additional 20 minutes of warning of a possible tsunami.
Japan already is laying fiber optic cable off its shores to boost its early warning capabilities, but such work is expensive and would be more so for installing the technology on the seafloor above the Cascadia fault zone, Melgar noted.
Melgar and Hayes came across the slip-pulse timing while scouring USGS databases for components that they could code into simulations to forecast what a magnitude 9 rupture of the Cascadia subduction zone would look like.
The subduction zone, which hasn't had a massive lengthwise earthquake since 1700, is where the Juan de Fuca ocean plate dips under the North American continental plate.
The fault stretches from just offshore of northern Vancouver Island to Cape Mendocino in northern California. | Earthquakes | 2,019
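As a rough illustration of the measurement described above, the sketch below estimates the peak rate of acceleration of ground displacement from a 1-Hz GPS displacement record in the seconds after rupture onset. It is not the authors' published algorithm; the synthetic records and the 15-second window are assumptions.

```python
# Sketch: peak acceleration of ground displacement from a GPS record.
# Synthetic inputs and the 15-s window are assumptions, not the study's method.
import numpy as np

def peak_displacement_acceleration(disp, dt=1.0, window_s=15.0):
    """Peak |d2(displacement)/dt2| within `window_s` seconds of onset.

    disp -- displacement samples (m) starting at rupture onset
    dt   -- sample interval (s); 1 s is typical for real-time GPS
    """
    n = min(len(disp), int(window_s / dt))
    accel = np.gradient(np.gradient(disp[:n], dt), dt)  # second derivative
    return float(np.max(np.abs(accel)))

t = np.arange(0.0, 30.0, 1.0)
fast = 0.002 * t ** 2   # displacement accelerating upward -- larger event
slow = 0.005 * t        # steady, linear growth -- smaller event
print(peak_displacement_acceleration(fast))  # ~0.004 m/s^2
print(peak_displacement_acceleration(slow))  # ~0.0 m/s^2
```

The idea is simply that a record whose displacement is still accelerating early on points toward a rupture that is growing into a large event, while a flat or linear ramp points toward a smaller one.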
May 28, 2019 | https://www.sciencedaily.com/releases/2019/05/190528095258.htm | Lessons from Pohang: Solving geothermal energy's earthquake problem | On a November afternoon in 2017, a magnitude 5.5 earthquake shook Pohang, South Korea, injuring dozens and forcing more than 1,700 of the city's residents into emergency housing. Research now shows that development of a geothermal energy project shoulders the blame. | "There is no doubt," said Stanford geophysicist William Ellsworth. "Usually we don't say that in science, but in this case, the evidence is overwhelming." Ellsworth is among a group of scientists, including Kang-Kun Lee of Seoul National University, who published a perspective piece on the quake May 24.
The Pohang earthquake stands out as by far the largest ever linked directly to development of what's known as an enhanced geothermal system, which typically involves forcing open new underground pathways for Earth's heat to reach the surface and generate power. And it comes at a time when the technology could provide a stable, ever-present complement to more finicky wind and solar power as a growing number of nations and U.S. states push to develop low-carbon energy sources. By some estimates, it could amount to as much as 10 percent of current U.S. electric capacity. Understanding what went wrong in Pohang could allow other regions to more safely develop this promising energy source.
Conventional geothermal resources have been generating power for decades in places where heat and water from deep underground can burble up through naturally permeable rock. In Pohang, as in other enhanced geothermal projects, injections cracked open impermeable rocks to create conduits for heat from the Earth that would otherwise remain inaccessible for making electricity.
"We have understood for half a century that this process of pumping up the Earth with high pressure can cause earthquakes," said Ellsworth, who co-directs the Stanford Center for Induced and Triggered Seismicity and is a professor in the School of Earth, Energy & Environmental Sciences (Stanford Earth).
Here, Ellsworth explains what failed in Pohang and how their analysis could help lower risks for not only the next generation of geothermal plants, but also fracking projects that rely on similar technology. He also discusses why, despite these risks, he still believes enhanced geothermal can play a role in providing renewable energy.
WILLIAM ELLSWORTH: The goal of an enhanced geothermal system is to create a network of fractures in hot rock that is otherwise too impermeable for water to flow through. If you can create that network of fractures, then you can use two wells to create a heat exchanger. You pump cold water down one, the Earth warms it up, and you extract hot water at the other end.
Operators drilling a geothermal well line it with a steel tube using the same process and technology used to construct an oil well. A section of bare rock is left open at the bottom of the well. They pump water into the well at high pressure, forcing open existing fractures or creating new ones.
Sometimes these tiny fractures make tiny little earthquakes. The problem is when the earthquakes get too big.
ELLSWORTH: When they began injecting fluids at high pressure, one well produced a network of fractures as planned. But water injected in the other well began to activate a previously unknown fault that crossed right through the well.
Pressure migrating into the fault zone reduced the forces that would normally make it difficult for the fault to move.
Small earthquakes lingered for weeks after the operators turned the pumps off or backed off the pressure. And the earthquakes kept getting bigger as time went by.
That should have been recognized as a sign that it wouldn't take a very big kick to trigger a strong earthquake. This was a particularly dangerous place. Pressure from the fluid injections ended up providing the kick.
ELLSWORTH: Civil authorities worldwide generally don't want drilling and injection to cause earthquakes big enough to disturb people. In practice, authorities and drillers tend to focus more on preventing small earthquakes that can be felt rather than on avoiding the much less likely event of an earthquake strong enough to do serious harm.
With this in mind, many projects are managed by using a so-called traffic light system. As long as the earthquakes are small, then you have a green light and you go ahead. If earthquakes begin to get larger, then you adjust operations. And if they get too big then you stop, at least temporarily. That's the red light.
Many geothermal, oil and gas projects have also been guided by a hypothesis that as long as you don't put more than a certain volume of fluid into a well, you won't get earthquakes beyond a certain size. There may be some truth to that in some places, but the experience in Pohang tells us it's not the whole story.
ELLSWORTH: The potential for a runaway or triggered earthquake always has to be considered. And it's important to consider it through the lens of evolving risk rather than hazard. Hazard is a potential source of harm or danger. Risk is the possibility of loss caused by harm or danger. Think of it this way: An earthquake as large as Pohang poses the same hazard whether it strikes in a densely populated city or an uninhabited desert. But the risk is very much higher in the city.
The probability of a serious event may be small, but it needs to be acknowledged and factored into decisions. Maybe you would decide that this is not such a good idea at all.
For example, if there's a possibility of a magnitude 5.0 earthquake before the project starts, then you can estimate the damages and injuries that might be expected. If we can assign a probability to earthquakes of different magnitudes, then civil authorities can decide whether or not they want to accept the risk and under what terms.
As the project proceeds, those conversations need to continue. If a fault ends up being activated and the chance of a damaging earthquake increases, civil authorities and project managers might say, "we're done."
ELLSWORTH: Natural geothermal systems are an important source of clean energy. But they are rare and pretty much tapped out. If we can figure out how to safely develop power plants based on enhanced geothermal systems technology, it's going to have huge benefits for all of us as a low-carbon option for electricity and space heating.
Additional Stanford co-authors include postdoctoral research fellow Cornelius Langenbruch. Other co-authors are affiliated with ETH Zurich in Switzerland, Victoria University of Wellington in New Zealand, University of Colorado, Boulder, the China Earthquake Administration, and Seoul National University, Chonnam National University and Chungnam National University in the Republic of Korea.
The work was supported by the Korea Institute of Energy Technology. | Earthquakes | 2,019
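The traffic light logic Ellsworth describes reduces to a few lines of code. The magnitude thresholds and responses below are illustrative assumptions, not the values used at Pohang or prescribed by any regulator.

```python
# Minimal sketch of a magnitude-based traffic-light protocol.
# Thresholds and actions are assumed, illustrative values only.

AMBER_M = 2.0   # assumed: adjust operations above this magnitude
RED_M = 3.0     # assumed: stop injection above this magnitude

def traffic_light(recent_magnitudes):
    """Return the operational state implied by recent induced events."""
    m_max = max(recent_magnitudes, default=-9.0)
    if m_max >= RED_M:
        return "red: stop injection, bleed off pressure, reassess hazard"
    if m_max >= AMBER_M:
        return "amber: reduce injection rate/pressure, increase monitoring"
    return "green: continue as planned"

print(traffic_light([0.8, 1.2, 1.9]))   # green
print(traffic_light([1.1, 2.4]))        # amber
print(traffic_light([2.2, 3.1]))        # red
```

A real system would also weigh event rates, locations and trends over time -- the Pohang lesson above is precisely that a fixed magnitude ladder alone can miss an evolving risk.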
May 23, 2019 | https://www.sciencedaily.com/releases/2019/05/190523161147.htm | Aftershocks of 1959 earthquake rocked Yellowstone in 2017-18 | On Aug. 17, 1959, back when Dwight D. Eisenhower was president, the U.S. had yet to send a human to space and the nation's flag sported 49 stars, Yellowstone National Park shook violently for about 30 seconds. The shock was strong enough to drop the ground a full 20 feet in some places. It toppled the dining room fireplace in the Old Faithful Inn. Groundwater swelled up and down in wells as far away as Hawaii. Twenty-eight people died. It went down in Yellowstone history as the Hebgen Lake earthquake, with a magnitude of 7.2. | And in 2017, nearly 60 years and 11 presidents later, the Hebgen Lake quake shook Yellowstone again. A swarm of more than 3,000 small earthquakes in the Maple Creek area (in Yellowstone National Park but outside of the Yellowstone volcano caldera) between June 2017 and March 2018 are, at least in part, aftershocks of the 1959 quake. That's according to a new study by University of Utah seismologists Guanning Pang and Keith Koper.
"These kinds of earthquakes in Yellowstone are very common," says Koper, director of the University of Utah Seismograph Stations. "These swarms happen very frequently. This one was a little bit longer and had more events than normal."
"We don't think it will increase the risk of an eruption," Pang adds.
Taken together, the more than 3,000 small quakes of the Maple Creek swarm can be divided into two clusters. The northern cluster consists of Hebgen Lake aftershocks. The quakes fell along the same fault line, and were oriented the same way, as the Hebgen Lake event. Also, the team didn't see signs that the northern cluster was caused by movement of magma and other fluids beneath the ground.
Koper and Pang say it's not unheard of for aftershocks of a large earthquake to continue decades after the initial event. Pang, for example, has also studied aftershocks as recent as 2017 from the 1983 Borah Peak earthquake in central Idaho.
"There are formulas to predict how many aftershocks you should see," Koper says. "For Hebgen Lake, there looked like a deficit in the number of aftershocks. Now that we've had these, it has evened things out back up to the original expectations."
The southern cluster of the Maple Creek swarm seems to have a different origin. Although the northern cluster was lined up with the Hebgen Lake fault, the southern cluster's lineup was rotated about 30 degrees and the quakes were about 0.6 miles (1 kilometer) shallower than the northern cluster.
So, the researchers concluded, although the shaking in the northern cluster influenced the southern cluster, the primary cause of the southern shaking was likely subsurface movement of magma.
"We do consider it to be one swarm all together," Koper says. "Because they were so close, there was some feedback and influence between the two sections."
Koper says that the results highlight how earthquakes are different than other natural hazards. Floods, hurricanes or wildfires are over when they're over. "Earthquakes don't happen as a single discrete event in time," he says. The specter of aftershocks can continue for months, years or even, as Maple Creek shows, decades.
The study was funded by the United States Geological Survey, the Brinson Foundation and the Carrico Funds. | Earthquakes | 2,019
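The formulas Koper alludes to include the modified Omori (Omori-Utsu) law, n(t) = K / (c + t)^p, which gives the aftershock rate at time t after a mainshock. The sketch below integrates it to count the events expected in a window decades later; the parameter values are generic assumptions, not a fit to the Hebgen Lake data.

```python
# Expected aftershock counts from the modified Omori (Omori-Utsu) law.
# K, c and p below are generic assumed values, not fitted to Hebgen Lake.
from math import log

def omori_count(t1, t2, K=100.0, c=0.05, p=1.1):
    """Expected aftershock count between times t1 and t2 (in years)."""
    if p == 1.0:
        return K * (log(c + t2) - log(c + t1))
    return K * ((c + t2) ** (1.0 - p) - (c + t1) ** (1.0 - p)) / (1.0 - p)

print(f"year 1:  ~{omori_count(0.0, 1.0):.0f} events")
print(f"year 59: ~{omori_count(58.0, 59.0):.1f} events")
```

With these assumed parameters the law still yields on the order of one aftershock per year six decades on, which is why a nearly 60-year-old mainshock can plausibly leave a detectable seismic signature.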
May 16, 2019 | https://www.sciencedaily.com/releases/2019/05/190516114625.htm | Ritter Island gives new insights into the dynamics of volcanic landslides | On the morning of the 13th of March 1888, the inhabitants of the Finschhafen trading post on the east coast of New Guinea were awakened by a dull rumbling sound. An eyewitness later reported that the water in the port had receded at the same time. A short time later, several two- to three-metre high waves hit the coast. It was a tsunami on that fateful morning that devastated the surrounding coasts. Several thousand people probably died in New Guinea and the Bismarck Archipelago. | The cause of the tsunami was quickly discovered: the largest part of the volcanic island of Ritter Island, 150 kilometres from Finschhafen, had slipped into the sea in a single catastrophic collapse. However, some questions about the exact course of the landslide remained unanswered.
In a newly published study, a team around lead author Dr. Jens Karstens of GEOMAR Helmholtz Centre for Ocean Research Kiel now provides answers to some of these questions. The study is based on the expedition SO252 of the German research vessel SONNE to Ritter Island in Autumn 2016. With seismic methods, the international team led by Prof. Dr. Christian Berndt (GEOMAR) precisely measured the traces of the disaster of 1888. They found evidence that the flank of the island had moved sporadically over a long period of time before 1888. This is indicated by corresponding deformation of the subsurface at a smaller volcanic cone off the coast of Ritter Island.
It is unknown whether slow landslides at volcanic flanks are precursors of a catastrophic collapse, or even whether they might reduce the risk of such a collapse because they relieve tension from the volcanic system. "At Ritter Island, we now have evidence that sporadic, small landslides have preceded a much larger one," explains Dr. Karstens.
Both types of landslides were observed last year on active volcanoes. Last year's eruption of Kilauea on Hawaii was accompanied by a landslide of the volcano flank, which caused a moderate earthquake. The eastern flank of Mount Etna in Sicily is also moving slowly towards the sea, as shown in a study published in Autumn 2018. In December 2018, an eruption of the volcano Anak Krakatau caused a landslide that triggered a tsunami in the Sunda Strait (Indonesia) and killed more than 400 people. The events at Anak Krakatau are comparable to those that took place on the 13th of March 1888 at the Ritter Island Volcano. This demonstrates the relevance of the findings at Ritter Island for hazard assessments on volcanic islands all over the world.
"The better we know the dynamics of such events, the better we can assess the hazard for a given region. Ritter Island is a very good case study because the volcano resembles many other volcanic islands and because the eruption and the tsunami are well documented thanks to eyewitness accounts. Together with our modern research methods, we can get a more complete picture of the processes of 1888," summarizes Dr. Karstens. | Earthquakes | 2,019
May 2, 2019 | https://www.sciencedaily.com/releases/2019/05/190502143353.htm | Fracking: Earthquakes are triggered well beyond fluid injection zones | Using data from field experiments and modeling of ground faults, researchers at Tufts University have discovered that the practice of subsurface fluid injection used in 'fracking' and wastewater disposal for oil and gas exploration could cause significant, rapidly spreading earthquake activity beyond the fluid diffusion zone. Deep fluid injections -- greater than one kilometer deep -- are known to be associated with enhanced seismic activity -- often thought to be limited to the areas of fluid diffusion. Yet the new study shows that earthquakes can be triggered well beyond those areas. | The results account for the observation that the frequency of human-made earthquakes in some regions of the country surpasses that of natural earthquake hotspots.
The study also represents a proof of concept in developing and testing more accurate models of fault behavior using actual experiments in the field. Much of our current understanding about the physics of geological faults is derived from laboratory experiments conducted at sample length scales of a meter or less. However, earthquakes and fault rupture occur over vastly larger scales. Observations of fault rupture at these larger scales are currently made remotely and provide poor estimates of the physical parameters of fault behavior that would be used to develop a model of human-made effects. More recently, the earthquake science community has put resources behind field-scale injection experiments to bridge the scale gap and understand fault behavior in its natural habitat.
The researchers used data from these experimental field injections, previously conducted in France and led by a team of researchers based at the University of Aix-Marseille and the University of Nice Sophia-Antipolis. The experiments measured fault pressurization and displacement, slippage and other parameters that are fed into the fault-slip model used in the current study. The Tufts researchers' analysis provides the most robust inference to date that fluid-activated slippage in faults can quickly outpace the spread of fluid underground.
"One important constraint in developing reliable numerical models of seismic hazard is the lack of observations of fault behavior in its natural habitat," said Pathikrit Bhattacharya, a former post-doc in the department of civil and environmental engineering at Tufts University's School of Engineering and lead author of the study. "These results demonstrate that, when available, such observations can provide remarkable insight into the mechanical behavior of faults and force us to rethink their hazard potential." Bhattacharya is now assistant professor in the School of Earth, Ocean and Climate Sciences at the Indian Institute of Technology in Bhubaneswar, India.
The hazard posed by fluid-induced earthquakes is a matter of increasing public concern in the US. The human-made earthquake effect is considered responsible for making Oklahoma -- a very active region of oil and gas exploration -- the most productive seismic region in the country, including California. "It's remarkable that today we have regions of human-made earthquake activity that surpass the level of activity in natural hot spots like southern California," said Robert C. Viesca, associate professor of civil and environmental engineering at Tufts University's School of Engineering, co-author of the study and Bhattacharya's post-doc supervisor.
"Our results provide validation for the suspected consequences of injecting fluid deep into the subsurface, and an important tool in assessing the migration and risk of induced earthquakes in future oil and gas exploration."Most earthquakes induced by fracking are too small -- 3.0 on the Richter scale -- to be a safety or damage concern. However, the practice of deep injection of the waste products from these explorations can affect deeper and larger faults that are under stress and susceptible to fluid induced slippage. Injection of wastewater into deep boreholes (greater than one kilometer) can cause earthquakes that are large enough to be felt and may cause damage.According to the U.S. Geological Survey, the largest earthquake induced by fluid injection and documented in the scientific literature was a magnitude 5.8 earthquake in September 2016 in central Oklahoma. Four other earthquakes greater than 5.0 have occurred in Oklahoma as a result of fluid injection, and earthquakes of magnitude between 4.5 and 5.0 have been induced by fluid injection in Arkansas, Colorado, Kansas and Texas. | Earthquakes | 2,019 |
May 1, 2019 | https://www.sciencedaily.com/releases/2019/05/190501141108.htm | Improved risk management for geothermal systems | Enhanced Geothermal Systems (EGS) are considered a promising source of energy that is clean, provides a sustainable baseload for heat and electricity and is an emerging key technology in the long-term transition to a fossil fuel-free future. However, developing a geothermal reservoir requires the forceful creation of fluid pathways in the deep underground by injecting large amounts of water under high pressure. Induced seismicity is an inevitable and well-known, yet poorly understood by-product of this technology and has caused serious public concern and scepticism leading to the shutdown of several EGS projects in the past. Managing the induced seismicity risk is therefore crucial for the development and further exploitation of EGS technology towards market-ready power and heat supply in urban environments. | In a new study, a team led by researchers from the GFZ German Research Centre for Geosciences reports how the seismic risk was kept under control during the stimulation of a deep geothermal reservoir beneath the Helsinki, Finland, metropolitan area.
In the project, a traffic-light-style system involving near-realtime seismic monitoring allowed active feedback and guidelines to the stimulation engineers on how to adjust pumping rates and pressure at the injection. Professor Georg Dresen, head of the Geomechanics group at GFZ, states: "This feedback in near real-time was the key to success and allowed to deepen the understanding of the reservoir seismic response and the hydraulic energy release at depth, while ensuring promptness in the technical response to increased seismic activity." This allowed immediate adjustment of the reservoir treatment through mitigating injection-rate and duration of resting periods that were applied in the course of the months-long experiment and ensured the successful control of the maximum observed magnitude of the induced seismic events.
"While the quantitative results successfully applied here to avoid larger seismic events are not directly transferable to other tectonic settings, the methodology and concept we developed in our study can be useful to other EGS projects to limit the seismic risk and derive ad-hoc stimulation strategies," says Grzegorz Kwiatek. The St1 Deep Heat Oy energy project is now approved for further advancement and, after completion of a second well, will move on to implementation of a fully functional geothermal plant for local heat provision. | Earthquakes | 2,019
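One widely cited planning tool for injections of this kind is the McGarr (2014) bound, which caps the maximum seismic moment at roughly the shear modulus times the net injected volume. Whether and how the Helsinki project used such a bound is not stated above; the sketch below is a generic illustration with an assumed shear modulus.

```python
# McGarr-style upper bound on induced-earthquake size: M0_max ~ G * dV.
# The shear modulus is an assumed typical value for crystalline rock.
from math import log10

G = 3.0e10  # shear modulus (Pa), assumed

def max_magnitude(injected_m3):
    """Moment magnitude implied by M0 = G * dV (an upper-bound estimate)."""
    m0 = G * injected_m3                       # seismic moment (N m)
    return (2.0 / 3.0) * (log10(m0) - 9.1)     # standard Mw conversion

for v in (1.0e3, 1.0e4, 1.0e5):                # cumulative injected volume (m^3)
    print(f"{v:8.0f} m^3 -> Mw_max ~ {max_magnitude(v):.1f}")
```

As the Pohang experience described earlier in this document shows, such volume-based bounds are not guaranteed in every tectonic setting, which is why they are typically paired with real-time traffic-light monitoring rather than used alone.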
April 26, 2019 | https://www.sciencedaily.com/releases/2019/04/190426130052.htm | Quick reconnaissance after 2018 Anchorage quake reveals signs of ground failure | A day after the 30 November 2018 magnitude 7 earthquake in Anchorage, Alaska, U.S. Geological Survey scientists Robert Witter and Adrian Bender had taken to the skies. The researchers were surveying the region from a helicopter, looking for signs of ground failure from landslides to liquefaction. | As Witter will discuss at the 2019 SSA Annual Meeting, the researchers saw some of the same signs of ground failure -- slumped road ramps, settled railroad grades and damaged bridges -- that were already making national news. These ground failures occurred mostly in artificial fill, where unconsolidated soils and rock placed under buildings and other infrastructure were shaken by strong ground motion. But Witter and his colleagues were looking for other signs that natural materials had failed as well.
"We were more interested in what this earthquake might have done to steep slopes, the landslides they might have triggered, and to landslides that we knew about already," Witter explained.
In particular, they wanted to know if shaking from the Anchorage earthquake had reactivated landslides from the state's most devastating earthquake, the 1964 magnitude 9.2 Great Alaska earthquake, as well as the 1954 magnitude 6.4 Kenai Peninsula earthquake.
For the most part, these landslides stayed intact during the 2018 quake, probably because the duration of shaking during this earthquake was much shorter, the researchers concluded. Shaking went on for 20 to 40 seconds during the Anchorage earthquake, while shaking continued for four to five minutes during the 1964 earthquake.
There were some signs that the past landslides had been impacted by the 2018 earthquake, however. Witter and his colleagues checked on the 1964 Turnagain Heights landslide in Anchorage's Earthquake Park and the Government Hill landslide, which destroyed an empty elementary school in 1964.
"What we found at both those places is along the head scarps, or the tops of the landslides, and along block boundaries that moved during these 1964 landslides new cracks had formed about a centimeter wide that extended tens of meters in length," said Witter. "That suggested to us that these landforms were responding to the [2018] shaking, but that the shaking didn't last long enough to reactivate them."
The scientists did see long-runout landslides in Potter Hill in south Anchorage, near the railroad tracks. The Potter Hill landslides are "significant because they're occurring in an area that is geologically susceptible to landslides, and where they have occurred in the past," Witter said. Earthquake-triggered landslides destroyed the tracks in 1954 and 1964, but the 2018 landslides did not damage the rails.
Using light detection and ranging (lidar) data acquired before and after the 2018 earthquake, researchers were able to calculate the volume of the new landslides.
In the tidal flats and deltaic environments along Cook Inlet and similar areas, Witter and his colleagues also documented evidence of liquefaction, including "sand volcanoes," where unconsolidated sand and soils saturated with groundwater lose their strength after earthquake shaking, behaving more like a liquid than a solid.
These signs of liquefaction were swept away in the next high tide, and other evidence of ground failure was covered by snow a few days later, impressing upon the researchers the importance of immediate reconnaissance after an earthquake.
"We need to document the effects of earthquakes right after they happen, because the evidence sometimes can be quickly erased," said Witter.
The data collected by Witter and his colleagues will help the USGS further develop and test its near real-time ground failure products, which are predictive maps of where landslides and liquefaction may occur after an earthquake in a particular region. | Earthquakes | 2,019
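The lidar comparison described above boils down to differencing two co-registered digital elevation models (DEMs) and summing the elevation loss over the slide area. The sketch below shows the arithmetic on a tiny synthetic grid; real DEMs, cell sizes and noise thresholds would of course differ.

```python
# Landslide volume from pre- and post-event DEM differencing.
# The arrays and cell size are tiny synthetic stand-ins for real lidar DEMs.
import numpy as np

def landslide_volume(dem_before, dem_after, cell_size_m, threshold_m=0.5):
    """Volume of material evacuated (m^3), counting only cells that
    dropped by more than `threshold_m` (to ignore lidar noise)."""
    dz = dem_after - dem_before                   # elevation change (m)
    loss = np.where(dz < -threshold_m, -dz, 0.0)  # keep significant lowering
    return float(loss.sum()) * cell_size_m ** 2

before = np.array([[105.0, 106.0], [104.0, 103.0]])
after = np.array([[101.5, 103.0], [104.1, 102.9]])   # upper cells failed
print(landslide_volume(before, after, cell_size_m=1.0))  # 6.5 m^3
```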
April 26, 2019 | https://www.sciencedaily.com/releases/2019/04/190426110603.htm | What does the future of Kilauea hold? | Ever since Hawaii's Kilauea stopped erupting in August 2018, ceasing activity for the first time in 35 years, scientists have been wondering about the volcano's future. Its similarities to the Hawaiian seamount Lo`ihi might provide some answers, according to Jacqueline Caplan-Auerbach at Western Washington University. | In her presentation at the 2019 SSA Annual Meeting, Caplan-Auerbach, a volcano seismologist, said Lo`ihi's 1996 eruption has some remarkable parallels to 2018 activity at Kilauea. Lo`ihi is a submarine volcano located about 22 miles off the southeast coast of the island of Hawaii, with its summit about 3000 feet below sea level.
Caplan-Auerbach has studied Lo`ihi since she was a graduate student in 1996, with more recent work at Kilauea, using data from seismic instruments placed on the submarine flanks of both volcanoes.
After the sudden cessation of activity at Kilauea last summer, "it was very apparent to me that there were some very striking similarities between this eruption and what we saw at Lo`ihi in 1996," she says.
Like the 2018 Kilauea eruptive sequence, the 1996 Lo`ihi eruption began with a dramatic increase in seismic activity that started in the volcano's rift zone and transitioned to its summit. Then in both cases, "there was a long sequence of very large earthquakes for a volcano of that size," says Caplan-Auerbach. Lo`ihi experienced more than 100 magnitude 4 or larger earthquakes, while there were more than 50 magnitude 5 or larger earthquakes at Kilauea.
In both cases, the swarms of earthquakes at the summits of each volcano led to a significant collapse, creating Pele's Pit on Lo`ihi and enlarging the Halema`uma`u crater at Kilauea.
It's rare to see the kind of caldera collapse that happened at Kilauea in action, says Caplan-Auerbach, although scientists have watched it occur at Fernandina volcano in the Galápagos Islands and Bárðarbunga volcano in Iceland. "One of the things I would like to know more about is whether this type of activity, this draining of the summit reservoir and this sort of collapse of a pit ... indicates a volcano has kind of done its time," she says.
After its 1996 eruption, Lo`ihi became quiet, with little to no seismicity recorded during two instrument deployments in 1997-1998 and 2010-2011. "It was a level of quiescence that we had never seen there before," says Caplan-Auerbach. The seamount remained mostly quiet for almost twenty years, gradually increasing seismicity before beginning new earthquake swarms in 2015.
This might indicate that Lo`ihi is replenishing its magma reservoir. If Lo`ihi's and Kilauea's similarities are a guide to Kilauea's future, Kilauea might be quiet for a decade before becoming active again, Caplan-Auerbach suggests.
"I think the good news is that volcanoes tend to talk to us before they do anything truly dramatic," she says, "so I think we will know when it restores its magma system." | Earthquakes | 2,019
April 26, 2019 | https://www.sciencedaily.com/releases/2019/04/190426110601.htm | Studies link earthquakes to fracking in the Central and Eastern US | Small earthquakes in Ohio, Pennsylvania, West Virginia, Oklahoma and Texas can be linked to hydraulic fracturing wells in those regions, according to researchers speaking at the SSA 2019 Annual Meeting. | While relatively rare compared to earthquakes caused by wastewater disposal in oil and gas fields in the central United States, Michael Brudzinski of Miami University in Ohio and his colleagues have identified more than 600 small earthquakes (between magnitude 2.0 and 3.8) in these states.
Brudzinski said these earthquakes may be "underappreciated" compared to seismicity related to wastewater disposal since they appear to happen less frequently. He and his colleagues are studying the trends related to the likelihood of induced seismicity from hydraulic fracturing or fracking, which could help industry and state regulators better manage drilling practices.
Unconventional U.S. oil production, which extracts oil from shales and tight rocks using a variety of drilling techniques, has been linked to an increase in human-induced earthquakes across the mid-continent of the United States for nearly a decade. Researchers studying the increase in places such as Oklahoma think that the main driver of this increase in seismicity is the injection of wastewater produced by extraction back into rock layers, which increases pore pressure within rocks and can affect stress along faults in layers selected for disposal.
Hydraulic fracturing uses pressurized liquid to break apart or create cracks within a rock formation through which petroleum and natural gas can flow and be more easily extracted.
In the eastern half of Ohio and other parts of the Appalachian Basin, where there has been a dramatic rise in natural gas production over the past two decades, fracking wells are more prevalent than wastewater disposal wells, in part because the geological layers that contain oil and gas are not as wet as in places like Oklahoma, reducing the need for wastewater disposal.
The numerous fracking wells in eastern Ohio prompted Brudzinski and his colleagues to take a closer look at whether small earthquakes in the region could be connected to fracking operations. "The wells are more widely spaced when they're active, and there isn't as much wastewater disposal going on," Brudzinski explained, "so you can see a bit more specifically and directly when wastewater disposal is generating seismicity and when hydraulic fracturing is generating seismicity in the Appalachian Basin."
The scientists used a technique called multi-station template matching, which scans through hundreds of seismic signals to find those that match the "fingerprint" of known earthquakes. The technique allowed them to detect small earthquakes that might have otherwise been overlooked, and to compare the more complete earthquake catalog in a region to information on the timing and location of regional fracking well operations.
Seismologists identify earthquakes as being caused by hydraulic fracture wells when they are tightly linked in time and space to fracking operations. Fracking-related seismicity also tends to look different from seismicity caused by wastewater disposal, Brudzinski said.
"The [fracking] seismic signature when you look at it in a sort of timeline shows these bursts of seismicity, hundreds or sometimes thousands of events over a couple of days or weeks, and then it's quiet again.
You don't tend to see that pattern with wastewater disposal," he explained.
Brudzinski and his colleagues are now using their dataset from Oklahoma to look at how a variety of variables might affect the likelihood of fracking-induced earthquakes, from the volume and viscosity of the injected liquid to the depth of the rock layers targeted by fracking.
"The one that has stuck out to us the most is that the depth of the well is more tied to likelihood of seismicity than we expected," Brudzinski said.
It isn't just that deeper wells lie closer to basement rock and mature faults that are likely to slip, he said, although that might still play a role in these earthquakes. Instead, overpressuring appears to have a stronger correlation with fracking-induced seismicity. Overpressuring occurs when there is high fluid pressure within rocks buried deep in a basin by many overlying rock layers. "It's one of the strongest trends we saw," said Brudzinski.
The researchers have discussed some of their findings with colleagues in Canada and China, where induced seismicity from fracking operations is being studied closely. "We are doing that kind of international comparison to get a better handle on the salient features and trends that aren't just tied to a specific location," said Brudzinski. | Earthquakes | 2,019
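The core of the template matching technique mentioned above can be sketched briefly: slide a known event waveform along continuous data and flag times where the normalized cross-correlation exceeds a threshold. Production multi-station systems stack correlations across many stations and channels (often built on tools such as ObsPy); the single-trace demo, threshold and synthetic data below are assumptions.

```python
# Single-trace sketch of waveform template matching via normalized
# cross-correlation. Threshold and synthetic data are assumed.
import numpy as np

def match_template(data, template, threshold=0.7):
    """Return sample indices (and correlation values) of detections."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        w = data[i:i + n]
        cc[i] = np.sum(t * (w - w.mean())) / (w.std() + 1e-12)
    hits = np.where(cc >= threshold)[0]
    return hits, cc[hits]

rng = np.random.default_rng(0)
template = np.sin(np.linspace(0.0, 12.0 * np.pi, 120)) * np.hanning(120)
data = 0.3 * rng.standard_normal(2000)
data[600:720] += template              # a buried repeat of the template
hits, vals = match_template(data, template)
print(hits, np.round(vals, 2))         # detection near sample 600
```

Because the correlation is normalized, a repeat of a known event can be pulled out of noise well below the amplitude a conventional detector would need, which is how the researchers built a more complete small-earthquake catalog.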
April 26, 2019 | https://www.sciencedaily.com/releases/2019/04/190426075429.htm | Snowmelt causes seismic swarm near California's Long Valley Caldera | A spring surge of meltwater, seeping through vertically tilted layers of rock, caused a seismic swarm near California's Long Valley Caldera in 2017, according to research presented at the 2019 SSA Annual Meeting. | The unusual event prompted U.S. Geological Survey researcher Emily Montgomery-Brown and her colleagues to look back through 33 years of seismic and water records for the region. They found that rates of shallow seismicity were about 37 times higher during very wet periods versus dry periods.
Although scientists have linked earthquakes to heavy rainfall or heavy runoff before this, the evidence connecting the two has been relatively weak or ambiguous, says Montgomery-Brown. In the Long Valley Caldera case, she says, "we're seeing phenomenal correlation between the seismicity and the stream discharge, and we are seeing about 37 times the number of earthquakes during the wet season as during the dry season."
As the meltwater recharged the groundwater in the drought-stricken area, it changed the pore pressure in the rocks lying one to three kilometers below the ground surface, triggering the small earthquakes of the 2017 swarm.
The shallow nature of the earthquakes, along with their unusual propagation, helped Montgomery-Brown and her colleagues determine that they were caused by seeping water and not volcanic processes related to Long Valley Caldera.
In locating the earthquakes, Montgomery-Brown's USGS colleague Dave Shelly found that the quakes "were actually propagating deeper, down from the surface," she says. In other swarms around volcanic areas, such as Yellowstone National Park, the earthquakes tend to start in a deeper seismic zone, about six to eight kilometers deep, and often move upward toward the surface.
Detailed geologic maps of the swarm area, to the south of Long Valley Caldera, show steeply dipping, nearly vertical rock layers that act like a fast conduit for meltwater. The runoff may not be reactivating a particular fault, Montgomery-Brown says, but instead may be infiltrating these rock layers and triggering small earthquakes there.
The researchers didn't see the same strong correlation between meltwater runoff and seismic rates in other areas around Long Valley Caldera. "It's only in these areas where we see the highly dipping strata," she says.
As far as she and her colleagues can tell, these shallow earthquakes stay shallow, and do not propagate deep enough to trigger activity on deeper faults in the area.
Montgomery-Brown has been studying seismic signals and monitoring ground deformation at Long Valley Caldera to keep track of volcanic activity and the movement of magma and gas beneath the caldera. Deformation caused by heavy snowfall (and then snowmelt) in the mountains creates seasonal signals in her data. "Usually I'm trying to get rid of that signal so I can see the actual volcanic deformation that is happening."
When the 2017 earthquake swarm -- one of the biggest in a long time in the area -- occurred on the edge of the caldera, she and her colleagues had been discussing whether it was caused by volcanic activity.
"People were starting to feel the earthquakes, and we were wondering if we needed to issue some kind of statement regarding how the volcano was behaving."But as she was giving a new manager a tour of a router closet at their offices, she ended up talking with Mammoth Community Water District managers who share the office space. As they talked about the spring runoff flooding, "it just kind of clicked in our understanding" how the runoff and the swarm might be linked, she says. | Earthquakes | 2,019 |
April 24, 2019 | https://www.sciencedaily.com/releases/2019/04/190424153546.htm | Minerals in mountain rivers tell the story of landslide activity upstream | Researchers from the University of Helsinki and the University of Tübingen have come up with a new way of analysing sand in mountain rivers to determine the activity of landslides upstream, which has important implications for understanding natural hazards in mountainous regions. | Landslides occur in hilly and mountainous landscapes, often triggered by extreme rainfall events or ground shaking resulting from earthquakes. For example, a magnitude 7.8 earthquake in Nepal in April 2015 and its aftershocks are estimated to have triggered more than 25,000 landslides. For people living in these regions, landslides are a major natural hazard, so knowledge of the history of landslide activity in these areas is critical to understanding and mitigating their risk.
The 2015 Nepal earthquake and the landslides it triggered were dramatic examples of natural hazards associated with a single event, but knowledge of the longer-term behaviour of landslide activity in a region is much more difficult to measure. The authors developed a new technique that enables them to understand how often landslides occur in a region and how long the sediment produced from landslides remains within a river system before being transported downstream.
"Our approach is based quite simply on taking a handful of sand from a river and measuring the chemistry of the sediments," says Todd Ehlers, co-author of the study and professor in the Department of Geosciences at the University of Tübingen, Germany. "When combined with computer models we can determine how much landslide activity exists upstream of the location where the sediment was collected, and how long landslide-produced sediment was in the river before being flushed out."
Previous studies have been limited in their ability to determine how often landslides occur and how significant these events are at eroding topography compared to other processes such as river or glacier erosion. "What is surprising in this study is that we figured out a way to address both limitations that previous studies have struggled with," Ehlers explains.
The results of the study have implications for understanding how active and important landslides are in a region, and also how long these catastrophic events swamp the rivers with sediment.
"Sediment in these steep landscapes is transported downstream surprisingly quickly," says David Whipp, study lead author and associate professor in the Institute of Seismology at the University of Helsinki. He continues: "While sediment in many river systems may be stored for tens of thousands of years, our results suggest most of the sediment in the steep Himalayan mountains remains in the river system for no more than ten years."
This surprising finding speaks to the immense power of water flowing in Himalayan mountain rivers during the annual monsoon season, which helps transport massive volumes of sediment downstream. | Earthquakes | 2,019
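The chemistry-based inference can be illustrated with a standard two-endmember mixing calculation: if landslide-derived and ordinary hillslope sediment carry distinct concentrations of some tracer, the tracer level in river sand yields the landslide fraction. The tracer and concentrations below are assumptions, not the study's data, and the actual analysis also involved computer models of sediment transport.

```python
# Two-endmember mixing: fraction of river sand sourced from landslides.
# Tracer identity and all concentrations are assumed, illustrative values.

def landslide_fraction(c_sample, c_background, c_landslide):
    """Fraction of sediment sourced from landslides (linear mixing)."""
    f = (c_sample - c_background) / (c_landslide - c_background)
    return min(max(f, 0.0), 1.0)   # clamp measurement noise to [0, 1]

# e.g., tracer at 12 ppm in river sand, 5 ppm in background sediment,
# 30 ppm in landslide deposits:
print(f"landslide-derived fraction ~ {landslide_fraction(12, 5, 30):.2f}")  # 0.28
```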
April 24, 2019 | https://www.sciencedaily.com/releases/2019/04/190424153512.htm | Major deep carbon sink linked to microbes found near volcano chains | Up to about 19 percent more carbon dioxide than previously believed is removed naturally and stored underground between coastal trenches and inland chains of volcanoes, keeping the greenhouse gas from entering the atmosphere, according to a new study. | Surprisingly, subsurface microbes play a role in storing vast amounts of carbon by incorporating it in their biomass and possibly by helping to form calcite, a mineral made of calcium carbonate, Rutgers and other scientists found. Greater knowledge of the long-term impact of volcanoes on carbon dioxide and how it may be buffered by chemical and biological processes is critical for evaluating natural and human impacts on the climate. Carbon dioxide is the major greenhouse gas linked to global warming.
"Our study revealed a new way that tiny microorganisms can have an outsized impact on a large-scale geological process and the Earth's climate," said co-author Donato Giovannelli, a visiting scientist and former post-doc in the Department of Marine and Coastal Sciences at Rutgers University-New Brunswick. He is now at the University of Naples in Italy.
Giovannelli is a principal investigator for the interdisciplinary study, which involves 27 institutions in six nations. Professor Costantino Vetriani in the Department of Marine and Coastal Sciences and Department of Biochemistry and Microbiology in the School of Environmental and Biological Sciences is one of the Rutgers co-authors. The study covers how microbes alter the flow of volatile substances that include carbon, which can change from a solid or liquid to a vapor, in subduction zones. Such zones are where two tectonic plates collide, with the denser plate sinking and moving material from the surface into Earth's interior.
The subduction process creates deep-sea trenches and volcanic arcs, or chains of volcanoes, at the boundary of tectonic plates. Examples are in Japan and South and Central America. Arc volcanoes are hot spots for carbon dioxide emissions that re-enter the atmosphere from subducted material, which consists of marine sediment, oceanic crust and mantle rocks, Giovannelli said. The approximately 1,800-mile-thick mantle of semi-solid hot rock lies beneath the Earth's crust.
The Earth's core, mantle and crust account for 90 percent of carbon. The other 10 percent is in the ocean, biosphere and atmosphere. The subduction zone connects the Earth's surface with its interior, and knowing how carbon moves between them is important in understanding one of the key processes on Earth and regulating the climate over tens of millions of years.
The study focused on the Nicoya Peninsula area of Costa Rica. The scientists investigated the area between the trench and the volcanic arc -- the so-called forearc. The research reveals that volcanic forearcs are a previously unrecognized deep sink for carbon dioxide. | Earthquakes | 2,019
April 24, 2019 | https://www.sciencedaily.com/releases/2019/04/190424125113.htm | Salish seafloor mapping identifies earthquake and tsunami risks | The central Salish Sea of the Pacific Northwest is bounded by two active fault zones that could trigger rockfalls and slumps of sediment that might lead to tsunamis, according to a presentation at the 2019 SSA Annual Meeting. | These tsunamis might be directed toward the islands of San Juan Archipelago, Vancouver Island and low coastal areas of the United States including Bellingham, Washington.
Extensive seismic mapping of the seafloor by Canadian and U.S. scientists has revealed details of the extent and surrounding features of the Devils Mountain Fault Zone running south of the Archipelago, as well as the newly mapped Skipjack Island Fault Zone at its northern edge, said H. Gary Greene of Moss Landing Marine Laboratories. Both of the faults extend more than 55 kilometers (~34 miles) offshore, but might have the potential to rupture over 125 kilometers (~78 miles) if connected to onshore faults.
The faults are similar to the east-west trending faults under the cities of Seattle and Tacoma, lying in the brittle upper plate of the Cascadia Subduction Zone. Deformation of sediments along the Devils Mountain and Skipjack faults indicates that they were active at least 10,000 years ago, Greene said. Although there have not been any large recorded earthquakes along these faults, he said the similar Seattle and Tacoma fault zones have produced magnitude 6 to 7 earthquakes in the past.
The new seafloor mapping holds a few troubling signs for what might happen if an earthquake of that magnitude does occur along the Skipjack Island fault, in particular. For instance, Greene and his colleagues have identified an underwater rubble field from a past landslide along the steep northeastern face of Orcas Island near the Skipjack fault. A Skipjack earthquake could shake loose the massive rubble blocks here, he said, "and generate an impact tsunami from this material."
The researchers also saw evidence of previous ground failure -- slumps and slides of sediment -- along the southern edge of the Canadian Fraser River Delta, which lies just north of the Skipjack Island fault zone. If an earthquake led to a massive slide of river delta sediments, the resulting tsunami might affect both the islands of the San Juan Archipelago and the Washington state coast.
Greene also noted that the sediments lining Bellingham Bay have "just a tremendous amount of pockmarks, which indicate that methane is seeping out of the seafloor and has in the past." The gas might further destabilize sediment in the region.
Together, the faults and seafloor features suggest that seismologists should keep a close eye on the potential local tsunami risks in the central Salish Sea. "We have the two faults here, we know that they have moved fairly recently, and that they are in the upper plate of the Cascadia Subduction Zone, an unstable area that we know can fail," Greene said.
Although Greene, Vaughn Barrie of the Geological Survey of Canada, and other colleagues have identified some of the potential causes of tsunami between the Devils Mountain and Skipjack Fault Zones, the next step would be to model in detail how the tsunami might occur. "Modeling could help us establish the volume of the material that would fail, and that would give us a better idea of the potential magnitude of the tsunami," he said. | Earthquakes | 2,019
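A first-order ingredient of the tsunami modeling Greene mentions is the shallow-water wave speed, c = sqrt(g * h), which converts a depth profile along a path into an arrival time at the far shore. The depth and distance values below are assumed round numbers, not surveyed Salish Sea bathymetry.

```python
# Shallow-water tsunami travel time along a path of (length, depth) segments.
# Depths and distances are assumed round numbers, not real bathymetry.
from math import sqrt

G = 9.81  # gravitational acceleration, m/s^2

def travel_time_minutes(segments):
    """Sum travel time over (length_km, depth_m) path segments."""
    t = sum(1000.0 * L / sqrt(G * h) for L, h in segments)
    return t / 60.0

# e.g., 20 km across a 150-m-deep basin, then 5 km over a 50-m shelf:
path = [(20.0, 150.0), (5.0, 50.0)]
print(f"first-wave arrival ~ {travel_time_minutes(path):.0f} minutes")
```

Even this crude estimate makes the hazard point: in a basin only a few tens of kilometers across, the first wave arrives within minutes, leaving little time for warning.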
April 24, 2019 | https://www.sciencedaily.com/releases/2019/04/190424125111.htm | Reinforced concrete wall damage may be larger than expected in major Seattle earthquake | Using ground motions generated for a range of simulated magnitude 9 earthquakes in the Pacific Northwest, researchers are testing how well reinforced concrete walls might stand up under such seismic events. | The walls may not fare so well, especially within the city of Seattle, said University of Washington postdoctoral researcher Nasser A. Marafi, who studied the phenomenon for his Ph.D. dissertation.The ground motions produced by a long duration, large magnitude earthquake would be amplified in the deep sedimentary basin that underlies the city, and most buildings under 24 stories in the city are not designed to take into account the potential damage produced by such basin effects, Marafi reported at the 2019 SSA Annual Meeting."What we found is that the results are actually a lot more damaging than what we would expect," Marafi said.With a magnitude 9 earthquake, the maximum story drifts -- describing the displacement between consecutive floors on a building -- predicted for the reinforced concrete structures are on average 11% larger and are more variable than those used for earthquake building codes that do not account for basin effects.Marafi's analysis can't always predict whether a particular structure made with reinforced concrete will collapse during a magnitude 9 earthquake in Seattle, but the study suggests that structures designed to the current minimum seismic standards may have up to a 33% probability on average of collapse, depending on their design specifications.His project is part of a larger research effort by scientists at the University of Washington and the U.S. Geological Survey to learn more about what to expect from a magnitude 9 earthquake in the Pacific Northwest. Although there is a historic and prehistoric record of these massive and damaging earthquakes in the Cascadia Subduction Zone, there are no seismic recordings of large magnitude earthquakes in the region.To remedy this, the researchers have used computer simulations to generate a set of ground motions that might be expected under numerous magnitude 9 scenarios in the region. "Then my work takes the ground motions that those simulations predict and asks what this means for building response. How do buildings respond to this kind of shaking that we're predicting from this simulation?" said Marafi.Marafi used 30 of these ground motions to test against 32 computer generated models of reinforced concrete-core-wall structures, between four and 40 stories high. Before designing the concrete models, he met with engineers practicing in Seattle to make sure that his designs were representative of how buildings are currently constructed in the city.Seismic waves that pass through the deep and soft sediments that lie beneath Seattle slow down and pile up their energy, resulting in damaging large amplitude waves that may be trapped in the basin. Seattle buildings that are 24 stories or less do not have to be designed to withstand these basin effects, but Marafi said this is changing. 
The next version of the National Seismic Hazard Maps that inform building codes, for instance, will include basin effects for Seattle, and the city is likely to include some basin effects in its design codes for structures 24 stories or less by 2021, he said.The changes mean that existing buildings in the city may need to be retrofitted and that new buildings would be built with "more steel and more concrete, so that the structure is slightly bigger and ends up being a stronger, stiffer building," Marafi said. | Earthquakes | 2,019 |
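Marafi's key response measure, inter-story drift, is simple to compute once floor displacement histories are available. The sketch below only illustrates the definition; the ten-story profile, story height and shaking history are invented values, not data from the study.

```python
# Minimal sketch (not from the study) of the response measure described
# above: the inter-story drift ratio, i.e. the relative displacement of
# consecutive floors divided by the story height. All numbers are
# invented for illustration.
import numpy as np

def peak_interstory_drift(displacements, story_height_m):
    """displacements: (n_time, n_levels) lateral floor displacements in
    meters, ordered ground level first; returns the peak drift ratio."""
    rel = np.diff(displacements, axis=1)      # floor-to-floor offsets
    return np.abs(rel).max() / story_height_m

# Toy 10-story building: a crude first-mode response whose amplitude
# grows linearly with height, peaking at 0.5 m at the roof.
t = np.linspace(0.0, 60.0, 6000)                   # 60 s of shaking
mode = np.sin(2.0 * np.pi * 0.5 * t)[:, None]      # 0.5 Hz sway
profile = np.linspace(0.0, 0.5, 11)[None, :]       # ground..roof (m)
disp = mode * profile
print(f"peak drift ratio: {peak_interstory_drift(disp, 3.5):.2%}")
# -> about 1.43%; design checks compare such peaks against code limits.
```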
April 18, 2019 | https://www.sciencedaily.com/releases/2019/04/190418145229.htm | Data mining digs up hidden clues to major California earthquake triggers | A powerful computational study of southern California seismic records has revealed detailed information about a plethora of previously undetected small earthquakes, giving a more precise picture about stress in the earth's crust. A new publicly available catalog of these findings will help seismologists better understand the stresses triggering the larger earthquakes that occasionally rock the region. | "It's very difficult to unpack what triggers larger earthquakes because they are infrequent, but with this new information about a huge number of small earthquakes, we can see how stress evolves in fault systems," said Daniel Trugman, a post-doctoral fellow at Los Alamos National Laboratory and coauthor of the paper. Trugman and coauthors from the California Institute of Technology and Scripps Institution of Oceanography performed a massive data-mining operation on the Southern California Seismic Network archive, searching for real quakes buried in the noise. The team was able to detect, understand, and locate quakes more precisely, and they created the most comprehensive earthquake catalog to date. The work identified 1.81 million quakes -- 10 times more earthquakes than were previously identified using traditional seismology methods, occurring 10 times more frequently. The team developed a comprehensive, detailed earthquake library for the entire southern California region, called the Quake Template Matching (QTM) catalog. They are using it to create a more complete map of California earthquake faults and behavior. This catalog may help researchers detect and locate quakes more precisely. The team analyzed nearly two decades of data collected by the Southern California Seismic Network. The network, considered one of the world's best seismic systems, amasses a catalog of quakes from 550 seismic monitoring stations in the region. The SCSN catalog is based entirely on the traditional approach: manual observation and visual analysis. But Trugman says this traditional approach misses many weak signals that are indicators of small earthquakes. The team improved on this catalog with data mining. Using parallel computing, they crunched nearly 100 terabytes of data across 200 graphics processing units. Zooming in at high resolution for a 10-year period, they performed template matching using seismograms (waveforms or signals) of previously identified quakes. To create templates, they cut out pieces of waveforms from previously recorded earthquakes and matched those waveforms to patterns of signals recorded simultaneously from multiple seismic stations. Template matching has been done before, but never at this scale. "Now we can automate it and search exhaustively through the full waveform archive to find signals of very small earthquakes previously hidden in the noise," Trugman explained. Applying the templates turned up events: quake precursors, foreshocks and small quakes that had been missed with manual methods. Those events often provide key physical and geographic details to help predict big quakes. The team also identified initiation sequences that reveal how quakes are triggered. New details also revealed three-dimensional geometry and fault structures, which will support development of more realistic models. Recently, Trugman and Los Alamos colleagues have applied machine learning to study earthquakes created in laboratory quake machines. 
That work has uncovered important details about earthquake behavior that may be used to predict quakes. "In the laboratory, we see small events as precursors to big slip events, but we don't see this consistently in the real world. This big data template-matching analysis bridges the gap," he said. "And now we've discovered quakes previously discounted as noise and learned more about their behavior. If we can identify these sequences as foreshocks in real time, we can predict the big one." | Earthquakes | 2,019
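The template-matching technique described above is, at its core, a sliding normalized cross-correlation. The toy sketch below illustrates the idea on synthetic single-channel data; the QTM catalog itself used GPU-parallelized matching across many stations, and the waveforms, threshold and sizes here are invented.

```python
# Illustrative sketch of the template-matching idea described above:
# slide a recorded earthquake waveform along continuous data and flag
# samples where the normalized cross-correlation is high. The real QTM
# catalog was built on GPUs from ~100 TB of multi-station data; this is
# a toy single-channel version with synthetic signals.
import numpy as np

def match_template(data, template, threshold=0.8):
    """Indices where normalized cross-correlation exceeds threshold."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    hits = []
    for i in range(len(data) - n + 1):
        w = data[i:i + n]
        s = w.std()
        if s > 0 and np.sum(t * (w - w.mean())) / s >= threshold:
            hits.append(i)
    return hits

rng = np.random.default_rng(1)
template = np.sin(2 * np.pi * np.linspace(0, 3, 200)) * np.hanning(200)
data = 0.1 * rng.standard_normal(5000)          # background "noise"
for start in (1200, 3700):                      # two buried small quakes
    data[start:start + 200] += 0.5 * template
print(match_template(data, template))           # clusters near 1200, 3700
```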
April 5, 2019 | https://www.sciencedaily.com/releases/2019/04/190405101329.htm | Damaging Sichuan earthquakes linked to fracking operations | Two moderate-sized earthquakes that struck the southern Sichuan Province of China last December and January were probably caused by nearby fracking operations, according to a new study published in | The December 2018 magnitude 5.7 and the January 2019 magnitude 5.3 earthquakes in the South Sichuan Basin caused extensive damage to farmhouses and other structures in the area. The December earthquake was especially destructive, injuring 17 people and resulting in a direct economic loss of about 50 million Chinese Yuan Renminbi (roughly US$7.5 million). The Changning shale gas block in the South Sichuan Basin has been the site of fracking operations since 2010, with extensive horizontal fracking injection wells becoming more common since 2014. The earthquake rate in the Changning block rose dramatically at the same time that systematic fracking began. In the United States, wastewater disposal from oil and gas operations, where water produced during hydrocarbon extraction is injected back into rock layers, is thought to be the primary cause of induced earthquakes, especially in Oklahoma. However, there is growing evidence that hydraulic fracturing, or fracking, which uses injected water to break apart rock layers during hydrocarbon extraction, may have caused moderate-size earthquakes at some sites in Ohio, Oklahoma and western Canada. Both wastewater disposal and fracking have induced earthquakes in the south Sichuan basin, say Xinglin Lei of the Geological Survey of Japan and colleagues. In their new study in SRL, the researchers present "a full chain of evidence" to show that the December and January earthquakes were induced by fracking operations. They pinpointed the location of the earthquakes, finding that they were relatively shallow (between two and ten kilometers below the surface), as would be expected for induced earthquakes. The December and January quakes also coincided in time and space with injection at nearby fracking well pads. The researchers did not, however, have exact injection volumes for these well pads, which would be needed to better understand the relationship between injection activities and the evolution of seismicity. Lei and colleagues' modeling of seismic activity shows that most of the activity came from the initial mainshocks, with little aftershock activity, which is also consistent with the pattern seen for induced earthquakes. Finally, their calculations show that overpressure on the rock pores, produced by the fracking injections, was strong enough to activate preexisting faults in the region. These faults were mostly unmapped and not in a favorable orientation to slip under normal tectonic activity, the researchers note. "For most well pads, the associated seismicity fades out quickly after the hydraulic fracture ended or halted," said Lei, although he noted that their analysis did raise the possibility of seeing signs of fault reactivation from previous seismicity. "In my opinion, repeated moderate earthquakes can be caused as long as the injection is continuing, since a moderate earthquake releases very limited strain," he added. 
"The national regulations in China should be updated with the requirement for operators to take action if some signs of fault reactivation were observed."The researchers say more information is needed about faults and their stress patterns in areas of the Sichuan basin surrounding fracking well pads, to guide drilling in a way that would avoid moderate seismic activity. "Moderate earthquakes were observed in a limited number of sites," said Lei. "If these sites could be screened out, the risk of moderate earthquakes would be greatly reduced."Lei and colleagues would like to see researchers, regulators and oil and gas operators work together to better understand what causes injection-induced seismicity in the South Sichuan Basin, to allow effective and safe fracking operations. | Earthquakes | 2,019 |
April 3, 2019 | https://www.sciencedaily.com/releases/2019/04/190403155421.htm | Crowdsourcing speeds up earthquake monitoring | Data produced by Internet users can help to speed up the detection of earthquakes. Fast and accurate information is essential in the case of earthquakes: Epicentre location, depth and magnitude are minimum requirements to reliably estimate their possibly catastrophic consequences. An international team of scientists has presented a method to combine in real time data from seismic networks with information derived from users looking for earthquake information on specific websites, the smartphone LastQuake app and via Twitter. This method significantly reduces the time needed to detect and locate those earthquakes that are felt by the public. The team reported their results in the journal | Robert J. Steed, Amaya Fuenzalida and Remy Bossu of the European-Mediterranean Seismological Centre (EMSC) in France carried out the research with colleagues from France, Hungary and Germany. The EMSC is one of the top global earthquake information centers, distributing global seismic data for free to the public via its websites. Usually within 3 to 8 minutes after an earthquake, software developed at GFZ is able to compute the location and magnitude of the earthquake. This information is made available online and shared immediately with partner organizations. The new method can shorten the detection time to only 1 to 3 minutes for felt earthquakes. After feeling an earthquake, people tend to rapidly seek information from the Internet or tweet about their observations. The sudden increase in demand for earthquake information from websites like the EMSC's can be detected and an approximate determination made of its geographical origin. This crowdsourced data collected by the EMSC, in combination with seismic data provided by GFZ, accelerates the detection of felt earthquakes. The algorithm incorporates usage of the EMSC websites and the EMSC's smartphone app "LastQuake" as well as searching for the word "earthquake" in 59 different languages on Twitter. The team used the crowdsourcing approach to analyze more than 1500 earthquakes during the years 2016 and 2017. The time required to arrive at a reliable detection could be reduced by on average more than a minute compared to the analysis of only seismic data. | Earthquakes | 2,019
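The crowdsourced side of the method amounts to watching for sudden surges in earthquake-related activity. A minimal sketch of that idea follows; the window length, trigger factor and traffic numbers are all invented, and the EMSC's production algorithm is certainly more elaborate.

```python
# Hedged sketch of the crowdsourcing idea: flag a possible "felt event"
# when the rate of earthquake-related activity (app launches, website
# hits, tweets) jumps far above its recent baseline. Thresholds and
# window sizes here are invented for illustration.
from collections import deque

def make_spike_detector(baseline_len=60, factor=10.0, min_rate=5.0):
    history = deque(maxlen=baseline_len)
    def update(count_this_second):
        baseline = (sum(history) / len(history)) if history else 0.0
        history.append(count_this_second)
        return count_this_second >= max(min_rate, factor * baseline)
    return update

detect = make_spike_detector()
stream = [1, 0, 2, 1, 0, 1, 1, 0, 2, 1, 93, 240, 310]  # hits per second
for second, hits in enumerate(stream):
    if detect(hits):
        print(f"t={second}s: possible felt earthquake (rate={hits}/s)")
```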
April 3, 2019 | https://www.sciencedaily.com/releases/2019/04/190403122442.htm | California's current earthquake hiatus is an unlikely pause | There have been no major ground rupturing earthquakes along California's three highest slip rate faults in the past 100 years. A new study published in | U.S. Geological Survey researchers Glenn Biasi and Kate Scharer analyzed long paleoseismic records from the San Andreas, San Jacinto and Hayward Faults for the past 1000 years, to determine how likely it might be to have a 100-year gap in earthquakes across the three faults. They found that the gap was very unlikely -- along the lines of a 0.3% chance of occurring, given the seismic record of the past 1000 years.The results emphasize that the hiatus is exceptional, and that the gap isn't some sort of statistical fluke created by incomplete paleoseismic records, said Biasi.The analysis also indicates that the next 100 years of California earthquakes along these faults could be a busy one, he noted. "If our work is correct, the next century isn't going to be like the last one, but could be more like the century that ended in 1918."Between 1800 and 1918, there were eight large ground-rupturing earthquakes along the faults, including the well-known 1906 earthquake in San Francisco and the similar-sized 1857 rupture of the San Andreas in southern California, but nothing so large since."We know these big faults have to carry most of the [tectonic] motion in California, and sooner or later they have to slip," said Biasi. "The only questions are how they're going to let go and when."The three faults and their major branches analyzed by the researchers accommodate the majority of the slip between the Pacific and North American plate boundary. Paleoseismic records from the faults predict that there would be three to four large ground-rupturing earthquakes (magnitude 6.5 or larger) each century.Biasi and Scharer examined the best available paleoseismic records from sites along the three faults to determine whether the current gap could be explained by missing data, or incorrect radiocarbon dating of past earthquakes. From these data, they calculated the probability that there would be a 100-year gap in ground-rupturing earthquakes across all three faults."Our paper confirms that this hiatus is very improbable and it's our view that our efforts will be better spent considering explanations for this, rather than trying to bend the data to make the hiatus a 'statistically improbable but could happen' kind of thing," said Biasi."We're saying, no, it's not a data problem, it's not a data choice problem, it doesn't matter how you slice this," he added. "We just have not had earthquakes that past records predict that we should have had."He likened the hiatus to what a person might see if they pulled up a chair alongside a freeway to count passing cars. "You might say that a certain number of cars per hour is kind of representative, and then something happens and you go ten minutes of seeing no cars. If it's just ten minutes, you could say it was a statistical fluke."But if the freeway stays clear of traffic for a long time, "the other reason there might be no cars is that up around the bend, there's a wreck," said Biasi.The researchers would like more seismologists to focus on the reasons -- "the wreck around the bend" -- behind the current hiatus."We had the flurry of very large earthquakes from 1800 to 1918," Biasi said. 
"It's possible that among them they just wrung out -- in the sense of wringing out a dishrag -- a tremendous amount of energy out the system."There may be stronger long-range interactions between the faults than suspected, or there may be unknown features of the mantle and lower crust below the faults that affect the probability of ground-rupturing earthquakes, he noted. | Earthquakes | 2,019 |
March 29, 2019 | https://www.sciencedaily.com/releases/2019/03/190329144223.htm | 66-million-year-old deathbed linked to dinosaur-killing meteor | The beginning of the end started with violent shaking that raised giant waves in the waters of an inland sea in what is now North Dakota. | Then, tiny glass beads began to fall like birdshot from the heavens. The rain of glass was so heavy it may have set fire to much of the vegetation on land. In the water, fish struggled to breathe as the beads clogged their gills.The heaving sea turned into a 30-foot wall of water when it reached the mouth of a river, tossing hundreds, if not thousands, of fresh-water fish -- sturgeon and paddlefish -- onto a sand bar and temporarily reversing the flow of the river. Stranded by the receding water, the fish were pelted by glass beads up to 5 millimeters in diameter, some burying themselves inches deep in the mud. The torrent of rocks, like fine sand, and small glass beads continued for another 10 to 20 minutes before a second large wave inundated the shore and covered the fish with gravel, sand and fine sediment, sealing them from the world for 66 million years.This unique, fossilized graveyard -- fish stacked one atop another and mixed in with burned tree trunks, conifer branches, dead mammals, mosasaur bones, insects, the partial carcass of a "This is the first mass death assemblage of large organisms anyone has found associated with the K-T boundary," said DePalma, curator of paleontology at the Palm Beach Museum of Natural History in Florida and a doctoral student at the University of Kansas. "At no other K-T boundary section on Earth can you find such a collection consisting of a large number of species representing different ages of organisms and different stages of life, all of which died at the same time, on the same day."In a paper to appear next week in the journal "It's like a museum of the end of the Cretaceous in a layer a meter-and-a-half thick," said Mark Richards, a UC Berkeley professor emeritus of earth and planetary science who is now provost and professor of earth and space sciences at the University of Washington.Richards and Walter Alvarez, a UC Berkeley Professor of the Graduate School who 40 years ago first hypothesized that a comet or asteroid impact caused the mass extinction, were called in by DePalma and Dutch scientist Jan Smit to consult on the rain of glass beads and the tsunami-like waves that buried and preserved the fish. The beads, called tektites, formed in the atmosphere from rock melted by the impact.Richards and Alvarez determined that the fish could not have been stranded and then buried by a typical tsunami, a single wave that would have reached this previously unknown arm of the Western Interior Seaway no less than 10 to 12 hours after the impact 3,000 kilometers away, if it didn't peter out before then. Their reasoning: The tektites would have rained down within 45 minutes to an hour of the impact, unable to create mudholes if the seabed had not already been exposed.Instead, they argue, seismic waves likely arrived within 10 minutes of the impact from what would have been the equivalent of a magnitude 10 or 11 earthquake, creating a seiche (pronounced saysh), a standing wave, in the inland sea that is similar to water sloshing in a bathtub during an earthquake. Though large earthquakes often generate seiches in enclosed bodies of water, they're seldom noticed, Richards said. 
The 2011 Tohoku quake in Japan, a magnitude 9.0, created six-foot-high seiches 30 minutes later in a Norwegian fjord 8,000 kilometers away."The seismic waves start arising within nine to 10 minutes of the impact, so they had a chance to get the water sloshing before all the spherules (small spheres) had fallen out of the sky," Richards said. "These spherules coming in cratered the surface, making funnels -- you can see the deformed layers in what used to be soft mud -- and then rubble covered the spherules. No one has seen these funnels before."The tektites would have come in on a ballistic trajectory from space, reaching terminal velocities of between 100 and 200 miles per hour, according to Alvarez, who estimated their travel time decades ago."You can imagine standing there being pelted by these glass spherules. They could have killed you," Richards said. Many believe that the rain of debris was so intense that the energy ignited wildfires over the entire American continent, if not around the world."Tsunamis from the Chicxulub impact are certainly well-documented, but no one knew how far something like that would go into an inland sea," DePalma said. "When Mark came aboard, he discovered a remarkable artifact -- that the incoming seismic waves from the impact site would have arrived at just about the same time as the atmospheric travel time of the ejecta. That was our big breakthrough."At least two huge seiches inundated the land, perhaps 20 minutes apart, leaving six feet of deposits covering the fossils. Overlaying this is a layer of clay rich in iridium, a metal rare on Earth, but common in asteroids and comets. This layer is known as the K-T, or K-Pg boundary, marking the end of the Cretaceous Period and the beginning of the Tertiary Period, or Paleogene.In 1979, Alvarez and his father, Nobelist Luis Alvarez of UC Berkeley, were the first to recognize the significance of iridium that is found in 66 million-year-old rock layers around the world. They proposed that a comet or asteroid impact was responsible for both the iridium at the K-T boundary and the mass extinction.The impact would have melted the bedrock under the seafloor and pulverized the asteroid, sending dust and melted rock into the stratosphere, where winds would have carried them around the planet and blotted out the sun for months, if not years. Debris would have rained down from the sky: not only tektites, but also rock debris from the continental crust, including shocked quartz, whose crystal structure was deformed by the impact.The iridium-rich dust from the pulverized meteor would have been the last to fall out of the atmosphere after the impact, capping off the Cretaceous."When we proposed the impact hypothesis to explain the great extinction, it was based just on finding an anomalous concentration of iridium -- the fingerprint of an asteroid or comet," said Alvarez. "Since then, the evidence has gradually built up. But it never crossed my mind that we would find a deathbed like this."Key confirmation of the meteor hypothesis was the discovery of a buried impact crater, Chicxulub, in the Caribbean and off the coast of the Yucatan in Mexico, that was dated to exactly the age of the extinction. Shocked quartz and glass spherules were also found in K-Pg layers worldwide. 
The new discovery at Tanis is the first time the debris produced in the impact was found along with animals killed in the immediate aftermath of the impact."And now we have this magnificent and completely unexpected site that Robert DePalma is excavating in North Dakota, which is so rich in detailed information about what happened as a result of the impact," Alvarez said. "For me, it is very exciting and gratifying!"Jan Smit, a retired professor of sedimentary geology from Vrije Universiteit in Amsterdam in The Netherlands who is considered the world expert on tektites from the impact, joined DePalma to analyze and date the tektites from the Tanis site. Many were found in near perfect condition embedded in amber, which at the time was pliable pine pitch."I went to the site in 2015 and, in front of my eyes, he (DePalma) uncovered a charred log or tree trunk about four meters long which was covered in amber, which acted as sort of an aerogel and caught the tektites when they were coming down," Smit said. "It was a major discovery, because the resin, the amber, covered the tektites completely, and they are the most unaltered tektites I have seen so far, not 1 percent of alteration. We dated them, and they came out to be exactly from the K-T boundary."The tektites in the fishes' gills are also a first."Paddlefish swim through the water with their mouths open, gaping, and in this net, they catch tiny particles, food particles, in their gill rakers, and then they swallow, like a whale shark or a baleen whale," Smit said. "They also caught tektites. That by itself is an amazing fact. That means that the first direct victims of the impact are these accumulations of fishes."Smit also noted that the buried body of a "We have an amazing array of discoveries which will prove in the future to be even more valuable," Smit said. "We have fantastic deposits that need to be studied from all different viewpoints. And I think we can unravel the sequence of incoming ejecta from the Chicxulub impact in great detail, which we would never have been able to do with all the other deposits around the Gulf of Mexico.""So far, we have gone 40 years before something like this turned up that may very well be unique," Smit said. "So, we have to be very careful with that place, how we dig it up and learn from it. This is a great gift at the end of my career. Walter sees it as the same." | Earthquakes | 2,019 |
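The timing argument at the heart of the study can be checked with back-of-the-envelope numbers. The speeds below are generic textbook values, not the paper's computed ones.

```python
# Rough timing check behind the seiche argument above, using generic
# textbook wave speeds rather than the paper's computed values.
distance_km = 3000.0                  # Chicxulub impact site to Tanis

p_wave_kms = 8.0                      # mantle P waves
surface_wave_kms = 3.5                # crustal surface waves
tsunami_kms = 0.2                     # open-ocean tsunami (~720 km/h)

print(f"P waves:       ~{distance_km / p_wave_kms / 60:.0f} min")
print(f"surface waves: ~{distance_km / surface_wave_kms / 60:.0f} min")
print(f"tsunami:       ~{distance_km / tsunami_kms / 3600:.1f} h, straight-line lower bound")
# Seismic energy arrives in roughly 6-14 minutes, consistent with the
# article's "nine to 10 minutes" and overlapping the 45-60 minute fall
# of the glass spherules; a sea-borne tsunami, slowed further by the
# shallow seaway route, could not have stranded the fish before the
# glass rained down.
```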
March 28, 2019 | https://www.sciencedaily.com/releases/2019/03/190328080405.htm | Seismic safety upgrades may cost CA hospitals billions | California hospitals would need to make substantial investments -- between $34 billion and $143 billion statewide -- to meet 2030 state seismic safety standards, according to a new RAND Corporation report. | After the 1994 Northridge Earthquake, in which 11 hospitals were damaged, and eight were evacuated, the state adopted SB1953, which aims to improve hospital resilience to seismic events. The law requires hospitals to reduce their buildings' risk of collapse by 2020 and to remain operational after an earthquake by 2030.This study, which builds upon earlier RAND research on the law's cost to hospitals, focuses on the 2030 deadline. Researchers assessed the cost and affordability of compliance of the state's 418 general acute-care hospitals based on recent hospital data, depending on whether hospitals upgrade current buildings or opt for new construction.The state law puts the responsibility on hospitals to pay the entire cost for upgrades. The researchers note that this presents a challenge to the state's hospital industry, with 34 percent of all hospitals currently in some form of financial distress. That could rise to more than 50 percent as hospitals comply with the law. The most significant increases are likely to be among public healthcare district hospitals, independent private hospitals, critical access hospitals that serve rural areas and hospitals that serve a large share of Medi-Cal patients, according to the study."There is little question that it is in the public interest to have seismically resilient hospitals. But given that hospitals most at risk of collapse will be upgraded by 2020, there is an opportunity for analysis and discussion for how to most effectively and efficiently enhance resilience in health service delivery in the future," said Benjamin Preston, the report's lead author and a senior policy researcher and director of RAND's Community Health and Environmental Policy Program. "While acute-care hospitals still play a critical role, more and more individuals rely on outpatient services to support health needs. It is important that capital investments in health facilities are aligned with current models for health care delivery."The report, which was funded by the California Hospital Association (CHA), describes the potential cost implications of the law but does not make specific policy recommendations as to whether it should be implemented as is or altered. Instead, it presents possible policy alternatives that may more effectively balance the public benefits of seismic safety with the financial burden imposed by seismic requirements.As part of RAND's commitment to independent and objective analysis, a project advisory committee was established at the outset of the project. Members of that group consisted of stakeholders with diverse views -- private-sector structural engineers, former hospital executives as well as CHA-member hospitals. As with all RAND analyses, the report was subjected to RAND's rigorous research quality assurance process, which includes an independent peer review.Other authors of the report are Tom LaTourrette, James Broyles, R.J. Briggs, David Catt, Christopher Nelson, Jeanne Ringel and Daniel Waxman. RAND Social and Economic Well-Being is a division of RAND that seeks to actively improve the health and social and economic well-being of populations and communities throughout the world. 
| Earthquakes | 2,019 |
March 27, 2019 | https://www.sciencedaily.com/releases/2019/03/190327142053.htm | Signs of 1906 earthquake revealed in mapping of offshore northern San Andreas Fault | A new high-resolution map of a poorly known section of the northern San Andreas Fault reveals signs of the 1906 San Francisco earthquake, and may hold some clues as to how the fault could rupture in the future, according to a new study published in the | Samuel Johnson of the U.S. Geological Survey and Jeffrey Beeson of Fugro USA Marine Inc. compiled the map for the 35-kilometer-long section of the fault between Tomales Point and Fort Ross, California. They discovered two large zones, each covering about two square miles, of slope failure on the seafloor offshore of the Russian River, marked by lobes that appear to have formed when the intense shaking of the 1906 earthquake caused sand liquefaction.The mapping also demonstrates that there are two active strands of the fault within the northern part of Bodega Bay, each of which has moved tens of meters within the past 10,000 years.The findings "are not going to affect what we know about the recurrence interval or slip rate" on the Northern San Andreas Fault, "but it will affect what we know about how the northern San Andreas fault ruptures," Johnson said."Normally if you were studying a fault zone on land and found a prominent fault strand, you would probably assume that was the strand that has most recently ruptured," he explained. "Because we found two here, it's a cautionary tale for earthquake geologists to comprehensively map fault zones. You may only capture part of the earthquake history or slip rate along a fault if you only know about one strand in a multi-strand zone"The northern offshore areas of the fault have been intensively studied only within the past eight years, said Johnson. While much of the rest of the San Andreas fault has become a natural laboratory for studying earthquakes, "it's a major geoscience oversight that these northern areas have not been studied before," said Johnson. "We have been waiting for technology to produce the tools to look at these areas in high resolution."The researchers used data drawn from several techniques, including high-resolution seismic reflection profiles and multibeam bathymetry, both of which use multiple directed sound waves to image layers on or below the seafloor. Collection of some of the bathymetry data was funded by the California Ocean Protection Council as part of its work to designate and develop monitoring strategies for Marine Protected Areas, and by the National Oceanic and Atmospheric Administration (NOAA) to improve nautical charts.The liquefaction lobes, which are similar to the ground failure seen offshore of the Klamath River delta during the magnitude 7.2 Eureka earthquake in northern California in 1980, were one of the surprises uncovered during the mapping, said Johnson.The researchers were lucky to have caught a glimpse of the lobes before they disappeared, as some of the features are already being smoothed over by sediments deposited after 1906, Johnson said. "If you came back in another 50 to 100 years, you might not see these features because they would be all covered up. You can see their lifespan in the data and images we have now."Other insights from the map include a look at how movement along this portion of the fault has affected the onshore landscape, including the uplift of marine terraces and rapid formation of beaches and coastal sand dunes. 
For instance, the researchers noted that uplift west of the Northern San Andreas Fault has blocked the southward drift of sediment from the Russian River and Salmon Creek, leading to the swiftly growing South Salmon Creek Beach and its background of high coastal sand dunes. | Earthquakes | 2,019 |
March 27, 2019 | https://www.sciencedaily.com/releases/2019/03/190327112718.htm | Massive earthquakes provide new insight into deep Earth | In the waning months of 2018, two of the mightiest deep earthquakes ever recorded in human history rattled the Tonga-Fiji region of the South Pacific. | In the first-ever study of these deep earthquakes -- generally defined as any earthquake occurring 350 kilometers or more below the Earth's surface -- a Florida State University-led research team characterized these significant seismological events, revealing new and surprising information about our planet's mysterious, ever-changing interior. The team's findings have now been published. "We don't have these kind of large earthquakes too often," said study author Wenyuan Fan, an earthquake seismologist in FSU's Department of Earth, Ocean and Atmospheric Science. "These deep earthquakes, especially larger earthquakes, aren't really promoted by the ambient environment. So why is this happening? It's a compelling question to ask." While deep earthquakes are rarely felt on the Earth's surface, studying these titanic events can help researchers better understand the systems and structures of the inner Earth. But the precise mechanisms of deep earthquakes have long been a mystery to earthquake scientists. The extreme temperature and pressure conditions of the deep Earth aren't suitable for the kinds of mechanical processes typically responsible for earthquakes -- namely the movement and sudden slippage of large plates. Instead, the extraordinary pressure holds things firmly in place, and the soaring temperatures make rocky material behave like chocolate -- flowing viscously rather than breaking like the brittle, ice-like rock seen near the surface. "We did not expect to have deep earthquakes," Fan said. "It should not happen. But we do have observations of deep earthquakes. So why? How? What kind of physical processes operate under such conditions?" Using advanced waveform analyses, Fan and his team found that the first quake -- a behemoth clocking in at magnitude 8.2, making it the second-largest deep earthquake ever recorded -- was the product of two distinct physical processes. The earthquake, they found, began in one of the region's seismically important slabs, a portion of one tectonic plate subducted beneath another. Slab cores are cooler than their seething hot surroundings, and therefore more amenable to earthquake nucleation. Once the earthquake began forming in the slab core, it propagated out into its warmer and more ductile surroundings. This outward propagation moved the earthquake from one mechanical process to another. "This is interesting because before Tonga was thought to predominantly only have one type of mechanism, which is within the cold slab core," Fan said. "But we're actually seeing that multiple physical mechanisms are involved." The dual-mechanism propagation pattern present in the magnitude 8.2 earthquake wasn't wholly surprising to Fan and his team. The process was reminiscent of a similarly deep, 7.6 magnitude quake that shook the region in 1994. These recognizable patterns were a promising sign. "To see that something is predictable, like the repeated patterns observed in the magnitude 8.2 earthquake, is very satisfying," Fan said. "It brings up the hope that we do know something about this system." But the second earthquake, which occurred 18 days after the first, was more of a puzzle. The magnitude 7.9 convulsion struck in an area that previously experienced very little seismic activity. 
The distinct physical mechanisms present in the second quake shared more similarities with South American deep earthquakes than with the massive quakes that rock the South Pacific. And, puzzlingly for researchers, the magnitude 7.9 earthquake produced surprisingly few aftershocks relative to its considerable size.Somehow, Fan said, a large earthquake was triggered in a previously aseismic region that then immediately returned to normal.It's this triggering process that most interests Fan going forward. He said this earthquake "doublet" illustrates the dynamic and interrelated nature of deep-Earth processes and the urgent need to better understand how these complicated processes operate."It's important that we address the question of how large earthquakes trigger other large earthquakes that are not far away," he said. "This is a good demonstration that there seem to be physical processes involved that are still unknown. We've gradually learned to identify the pattern, but not to a degree where we know exactly how it works. I think this is important to any kind of hazard forecast. It's more than an intellectual interest. It's important for human society."This study was funded by the Postdoctoral Scholar Program at the Woods Hole Oceanographic Institution. | Earthquakes | 2,019 |
March 26, 2019 | https://www.sciencedaily.com/releases/2019/03/190326132728.htm | The solid Earth breathes | The solid Earth breathes as volcanoes "exhale" gases like carbon dioxide (CO2). | Most subduction zones in the world are complex: the amount of sediment and carbon (C) concentration frequently varies along their length, and at many of them, some of the sediment reaching the subduction zone is scraped off, so the C in it never gets returned into the Earth. Developing a way to figure out how C cycles at complex subduction margins is therefore critical to understanding our planet. To establish such a method, researchers Brian M. House and colleagues focused on the Sunda margin along Indonesia, a subduction zone where the amount of sediment changes dramatically, as does the proportion of organic and inorganic C, and very little of the sediment actually stays attached to the subducting plate. Erosion from the Himalayas and underwater sediment "avalanches" bring a tremendous amount of sediment that is rich in organic C to the northeast section of the margin, while the southwest portion is inundated by sediment rich in calcium carbonate (CaCO3). To account for this, the team made a 3D model of the sediments and their composition across thousands of square kilometers outboard of the margin, which allowed them to more accurately quantify C in sediments throughout the region. House says they "estimate that only about a tenth of the C reaching the margin makes it past the subduction zone while the rest is scraped off the plate into the enormous wedge of sediment offshore of Sumatra and Java." House and colleagues estimate that the C returning into the Earth is much less -- maybe only a fifth -- of what volcanoes expel each year, meaning that the margin represents a net source of C into the atmosphere and that C from something other than the subducting sediments is released. "The sediments subducted into the Earth also have a different C isotope composition than that of volcanic CO2." These are two possible CO2 sources. | Earthquakes | 2,019
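The passage's flux comparisons imply a simple mass balance, sketched below with purely illustrative relative units, since the article gives only the ratios.

```python
# Toy mass balance implied by the passage (relative units only; the
# absolute fluxes are not given here, and these numbers are illustrative).
arriving = 100.0                      # C reaching the margin each year
subducted = 0.10 * arriving           # ~a tenth makes it past the trench
scraped_off = arriving - subducted    # stored in the offshore wedge
emitted = 5.0 * subducted             # volcanoes expel ~5x the C returned
print(f"scraped into wedge: {scraped_off:.0f}, returned to depth: {subducted:.0f}")
print(f"volcanic output: {emitted:.0f} -> net source of {emitted - subducted:.0f} units/yr")
# The imbalance is why the authors conclude the CO2 must partly come
# from somewhere other than the subducting sediments.
```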
March 25, 2019 | https://www.sciencedaily.com/releases/2019/03/190325122019.htm | Earth's deep mantle flows dynamically | As ancient ocean floors plunge over 1,000 km into the Earth's deep interior, they cause hot rock in the lower mantle to flow much more dynamically than previously thought, finds a new UCL-led study. | The discovery answers long-standing questions on the nature and mechanisms of mantle flow in the inaccessible part of deep Earth. This is key to understanding how quickly Earth is cooling, and the dynamic evolution of our planet and others in the solar system. "We often picture the Earth's mantle as a liquid that flows but it isn't -- it's a solid that moves very slowly over time. Traditionally, it's been thought that the flow of rock in Earth's lower mantle is sluggish until you hit the planet's core, with most dynamic action happening in the upper mantle which only goes to a depth of 660 km. We've shown this isn't the case after all in large regions deep beneath the South Pacific Rim and South America," explained lead author Dr Ana Ferreira (UCL Earth Sciences and Universidade de Lisboa). "Here, the same mechanism we see causing movement and deformation in the hot, pressurised rock in the upper mantle is also occurring in the lower mantle. If this increased activity is happening uniformly over the globe, Earth could be cooling more rapidly than we previously thought," added Dr Manuele Faccenda of the Universita di Padova. The study was published today. The team found that the deformation and increased flow in the lower mantle is likely due to the movement of defects in the crystal lattice of rocks in the deep Earth, a deformation mechanism called "dislocation creep," whose presence in the deep mantle has been the subject of debate. The researchers used big data sets collected from seismic waves formed during earthquakes to probe what's happening deep in Earth's interior. The technique is well established and comparable to how radiation is used in CAT scans to see what's happening in the body. "In a CAT scan, narrow beams of X-rays pass through the body to detectors opposite the source, building an image. 
Seismic waves pass through the Earth in much the same way and are detected by seismic stations on the opposite side of the planet to the earthquake epicentre, allowing us to build a picture of the structure of Earth's interior," explained Dr Sung-Joon Chang of Kangwon National University. By combining 43 million seismic data measurements with dynamic computer simulations using the UK's supercomputing facilities HECToR and Archer and the Italian Galileo Computing Cluster at CINECA, the researchers generated images to map how the Earth's mantle flows at depths of ~1,200 km beneath our feet. They revealed increased mantle flow beneath the Western Pacific and South America, where ancient ocean floors have been plunging towards Earth's core over millions of years. This approach of combining seismic data with geodynamic computer modelling can now be used to build detailed maps of how the whole mantle flows globally, to see if dislocation creep is uniform at extreme depths. The researchers also want to model how material moves up from the Earth's core to the surface, which, together with this latest study, will help scientists better understand how our planet evolved into its present state. "How mantle flows on Earth might control why there is life on our planet but not on other planets, such as Venus, which has a similar size and location in the solar system to Earth, but likely has a very different style of mantle flow. We can understand a lot about other planets from revealing the secrets of our own," concluded Dr Ferreira. The study was funded by the Leverhulme Trust, NERC, the Korea Meteorological Administration Research and Development Program, the Progetto di Ateneo FACCPTRAT12 granted by the Università di Padova and by the ERC StG #758199 NEWTON. | Earthquakes | 2,019
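The CAT-scan analogy corresponds to a linear inverse problem: travel-time delays are path integrals of slowness anomalies, and inverting many crossing rays localizes those anomalies. The cartoon below shrinks this to a 2x2 grid; everything about it, including the ray geometry, is invented for illustration.

```python
# Toy version of the CAT-scan analogy: travel-time tomography as a
# linear inversion t = G m, where each row of G holds the distance a ray
# travels through each cell and m is the slowness anomaly per cell. The
# real study inverted ~43 million measurements; this is a 2x2-cell cartoon.
import numpy as np

# Cells: [top-left, top-right, bottom-left, bottom-right], 10 km edges.
G = np.array([
    [10.0, 10.0,  0.0,  0.0],    # ray across the top row
    [ 0.0,  0.0, 10.0, 10.0],    # ray across the bottom row
    [10.0,  0.0, 10.0,  0.0],    # ray down the left column
    [ 0.0, 10.0,  0.0, 10.0],    # ray down the right column
    [14.1,  0.0,  0.0, 14.1],    # diagonal ray (needed: crossing rays
])                               # alone leave a null space)
m_true = np.array([0.0, 0.002, 0.0, 0.0])   # slow anomaly, top-right (s/km)
t_obs = G @ m_true                          # observed travel-time delays (s)

m_est, *_ = np.linalg.lstsq(G, t_obs, rcond=None)
print(np.round(m_est, 4))                   # recovers [0, 0.002, 0, 0]
```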
March 22, 2019 | https://www.sciencedaily.com/releases/2019/03/190322122544.htm | Scientists argue for more comprehensive studies of Cascade volcanoes | The string of volcanoes in the Cascades Arc, ranging from California's Mt. Lassen in the south to Washington's Mt. Baker in the north, has been studied by geologists and volcanologists for over a century. Spurred on by spectacular events such as the eruption of Mount Lassen in 1915 and Mount St. Helens in 1980, scientists have studied most of the Cascade volcanoes in detail, seeking to work out where the magma that erupts comes from and what future eruptions might look like. | However, mysteries still remain about why nearby volcanoes often have radically different histories of eruption or erupt different types of magma. Now scientists would like to find out why -- both for the Cascades and for other volcanic ranges. In a perspective essay published today (March 22), they argue for taking that broader view. "The study of volcanoes is fascinating in detail, and it has largely been focused on research into individual volcanoes rather than the bigger picture," said Adam Kent, a volcano expert at Oregon State University and a co-author on the essay. "We now have the insight and data to go beyond looking at just Mount St. Helens and other well-known volcanoes. We can take a step back and ask why is St. Helens different from Mount Adams, why is that different from Mount Hood?" The study takes a novel approach to this topic. "One way to do this is to consider the heat it took to create each of the volcanoes in the Cascades Arc, for example, and also compare this to the local seismic wave speeds and heat flow within the crust," Kent said. "Linking these diverse data sources together this way gives us a better glimpse into the past, and also offers some guidance on what we might expect in the future." The need for studying volcanoes more thoroughly is simple, noted Christy Till of Arizona State University, lead author of the essay. Worldwide, almost a billion people live in areas at risk from volcanic eruptions, 90 percent of whom live in the so-called Pacific Ring of Fire. The subduction of the Juan de Fuca tectonic plate beneath the North American plate is the ultimate driver for the formation of the Cascade Range, as well as many of the earthquakes the Northwest has experienced. Subduction results in deep melting of the Earth's mantle, and the magma then heads upward, eventually reaching the surface to produce volcanoes. But there are differences among the volcanoes, the researchers note, including in the north and south of the Cascade Range. "The volcanoes in the north stand out because they stand alone," Kent said. "In the south, you also have recognizable peaks like the Three Sisters and Mount Jefferson, but you also have many thousands of smaller volcanoes like Lava Butte and those in the McKenzie Pass area in between. Our work suggests that, together with the larger volcanoes, these small centers require almost twice the amount of magma being input into the crust in the southern part of the Cascade Range." "If you live around a volcano, you have to be prepared for hazards and the hazards are different with each different type of volcano," Kent said. "The northern Cascades are likely to have eruptions in the future, but we know where they'll probably be -- at the larger stratovolcanoes like Mount Rainier, Mount Baker and Glacier Peak. 
In the south, the larger volcanoes might also have eruptions, but then we have these large fields of smaller, so-called 'monogenetic' volcanoes. For these it is harder to pinpoint where future eruptions will occur." The field of volcanology has progressed quite a bit, the researchers acknowledge, and the need now exists to integrate some of the methodology of individual detailed studies to give a more comprehensive look at the entire volcanic system. The past is the best informer of the future. "If you look at the geology of a volcano, you can tell what kind of eruption is most likely to happen," Kent said. "Mount Hood, for example, is known to have had quite small eruptions in the past, and the impact of these is mostly quite local. Crater Lake, on the other hand, spread ash across much of the contiguous United States." "What we would like to know is why one volcano turns out to be a Mount Hood while another develops into a Crater Lake, with a very different history of eruptions. This requires us to think about the data that we have in new ways." The 1980 eruption of Mt. St. Helens was a wake-up call to the threat of volcanoes in the continental United States, and though noteworthy, its eruption was relatively minor. The amount of magma involved in the eruption was estimated to be 1 cubic kilometer (enough to fill about 400,000 Olympic swimming pools), whereas the eruption of Mt. Mazama 6,000 years ago that created Crater Lake was 50 cubic kilometers, or 50 times as great. The researchers say the process of building and tearing down volcanoes continues today, though it is difficult to observe on a day-to-day basis. "If you could watch a time-lapse camera over millions of years, you would see volcanoes building up slowly, and then eroding fairly quickly," said Kent, who is in OSU's College of Earth, Ocean, and Atmospheric Sciences. "Sometimes, both are happening at once." Which of the Cascades is most likely to erupt? The smart money is on Mount St. Helens, because of its recent activity, but many of the volcanoes are still considered active. "I can tell you unequivocally that Mount Hood will erupt in the future," Kent said. "I just can't tell you when." For the record, Kent said the odds of Mt. Hood erupting in the next 30 to 50 years are less than 5 percent. | Earthquakes | 2,019
March 21, 2019 | https://www.sciencedaily.com/releases/2019/03/190321130954.htm | Hundreds of bubble streams link biology, seismology off Washington's coast | Off the coast of Washington, columns of bubbles rise from the seafloor, as if evidence of a sleeping dragon lying below. But these bubbles are methane that is squeezed out of sediment and rises up through the water. The locations where they emerge provide important clues to what will happen during a major offshore earthquake. | The study, from the University of Washington and Oregon State University, was recently published. The first large-scale analysis of these gas emissions along Washington's coast finds more than 1,700 bubble plumes, primarily clustered in a north-south band about 30 miles (50 kilometers) from the coast. Analysis of the underlying geology suggests why the bubbles emerge here: The gas and fluid rise through faults generated by the motion of geologic plates that produce major offshore earthquakes in the Pacific Northwest. "We found the first methane vents on the Washington margin in 2009, and we thought we were lucky to find them, but since then, the number has just grown exponentially," said lead author Paul Johnson, a UW professor of oceanography. "These vents are a little ephemeral," Johnson added. "Sometimes they turn off-and-on with the tides, and they can move around a little bit on the seafloor. But they tend to occur in clusters within a radius of about three football fields. Sometimes you'll go out there and you'll see one active vent and you'll go back to the same location and it's gone. They're not reliable, like the geysers at Yellowstone." The authors analyzed data from multiple research cruises over the past decade that used modern sonar technology to map the seafloor and also create sonar images of gas bubbles within the overlying water. Their new results show 1,778 methane bubble plumes issuing from the waters off Washington state, grouped into 491 clusters. "If you were able to walk on the seafloor from Vancouver Island to the Columbia River, you would never be out of sight of a bubble plume," Johnson said. The sediments off the Washington coast are formed as the Juan de Fuca oceanic plate plunges under the North American continental plate, scraping material off the ocean crust. These sediments are then heated, deformed and compressed against the rigid North American plate. The compression forces out both fluid and methane gas, which emerges as bubble streams from the seafloor. The bubble columns are located most frequently at the boundary between the flat continental shelf and the steeply sloped section where the seafloor drops to the abyssal depths of the open ocean. This abrupt change in slope is also a tectonic boundary between the oceanic and continental plates. "Although there are some methane plumes from all depths on the margin, the vast majority of the newly observed methane plume sites are located at the seaward side of the continental shelf, at about 160 meters water depth," Johnson said. A previous study from the UW had suggested that warming seawater might be releasing frozen methane in this region, but further analysis showed the methane bubbles off the Pacific Northwest coast arise from sites that have been present for hundreds of years, and are not related to global warming, Johnson said. Instead, these gas emissions are a long-lived natural feature, and their prevalence contributes to the continental shelf area being such productive fishing grounds. 
Methane from beneath the seafloor provides food for bacteria, which then produce large quantities of bacterial film. This biological material then feeds an entire ecological chain of life that enhances fish populations in those waters."If you look online at where the satellite transponders show where the fishing fleet is, you can see clusters of fishing boats around these methane plume hotspots," Johnson said.To understand why the methane bubbles occur here, the authors used archive geologic surveys conducted by the oil and gas companies in the 1970s and 1980s. The surveys, now publicly accessible, show fault zones in the sediment where the gas and fluid migrate upward until emerging from the seafloor."Seismic surveys over the areas with methane emission indicate that the continental shelf edge gets thrust westward during a large megathrust, or magnitude-9, earthquake," Johnson said. "Faults at this tectonic boundary provide the permeable pathways for methane gas and warm fluid to escape from deep within the sediments."The location of these faults could potentially provide new understanding of the earthquake hazard from the Cascadia Subduction Zone, which last ruptured more than 300 years ago. If the seafloor movement during a subduction-zone earthquake occurs closer to shore, and a major component of this motion occurs within the shallower water, this would generate a smaller tsunami than if the seafloor motion were entirely in deep water."If our hypothesis turns out to be true, then that has major implications for how this subduction zone works," Johnson said. | Earthquakes | 2,019 |
March 21, 2019 | https://www.sciencedaily.com/releases/2019/03/190321130317.htm | Geophysics: A surprising, cascading earthquake | The Kaikoura earthquake in New Zealand in 2016 caused widespread damage. Researchers at Ludwig-Maximilians-Universitaet (LMU) in Munich have now dissected its mechanisms, revealing surprising insights into earthquake physics with the aid of simulations carried out on the supercomputer SuperMUC. | The 2016 Kaikoura earthquake (magnitude 7.8) on the South Island of New Zealand is among the most intriguing and best-documented seismic events anywhere in the world -- and one of the most complex. The earthquake exhibited a number of unusual features, and the underlying geophysical processes have since been the subject of controversy. LMU geophysicists Thomas Ulrich and Dr. Alice-Agnes Gabriel, in cooperation with researchers based at the Université Côte d'Azur in Valbonne and at Hong Kong Polytechnic University, have now simulated the course of the earthquake with an unprecedented degree of realism. Their model, which was run on the Bavarian Academy of Science's supercomputer SuperMUC at the Leibniz Computing Center (LRZ) in Munich, elucidates the dynamic reasons for such an uncommon multi-segment earthquake. This is an important step towards improving the accuracy of earthquake hazard assessments in other parts of the world. Their findings have now been published online. According to the paper's authors, the Kaikoura earthquake is the most complicated ever recorded and raises a number of important questions. One of its most striking features was that it resulted in the successive rupture of more than 20 segments of a fault network. "Looking at the pattern of surface faults affected by the quake, one finds large gaps of more than 15 km in between them. Up to now, analyses of seismic hazard have been based on the assumption that faults that are more than 5 km apart will not be broken in a single event," says Gabriel. A second unusual observation was that, although the earthquake began on land, it also resulted in the largest tsunami recorded in the region since 1947. This indicates that the subsurface ruptures ultimately triggered local displacements of the seafloor. The insights provided by the simulations have now yielded a better understanding of the causes of the sequence of fault ruptures that characterized the earthquake. "This was made possible by the realistic nature of our model, which incorporates the essential geophysical characteristics of fault failure, and realistically reproduces how subsurface rocks fracture and generate seismic waves," says Gabriel. The model confirmed that the Kaikoura earthquake involved a complex cascade of fault ruptures, which propagated in a zig-zag fashion. Propagation velocities along the individual fault systems were not unusually slow, but the complex geometry of the fault network and delays at the transitions between fault segments led to a tortuous rupture path. While it may seem intuitive that large tectonic forces accumulated over decades would be required to steer an earthquake through such a complex fault network, the authors suggest that the required forcing was, on the contrary, quite weak. "The rupture of such a weakly loaded fault was boosted by very gradual slippage, or creep, below the faults, where the crust is more ductile, and by low levels of frictional resistance promoted by the presence of fluids," Gabriel explains. 
"In addition, high rupture velocities generally result in the rapid dissipation of frictional resistance."The researchers state that their model could contribute to improving estimates of earthquake hazard in certain areas. Current hazard assessments require careful mapping of the fault systems in the region concerned, and their susceptibility to rupture under seismic stress is then estimated. "Earthquake modeling is now becoming an important part of the rapid earthquake response toolset and for improving long-term building codes in earthquake prone areas by delivering physics-driven interpretations that can be integrated synergistically with established data-driven efforts," says the first author of the study, PhD student Thomas Ulrich. | Earthquakes | 2,019 |
March 20, 2019 | https://www.sciencedaily.com/releases/2019/03/190320102210.htm | How fluid viscosity affects earthquake intensity | Fault zones play a key role in shaping the deformation of the Earth's crust. All of these zones contain fluids, which heavily influence how earthquakes propagate. In an article published today, researchers show that the viscosity of those fluids has a direct effect on earthquake intensity. | The study formed part of wider research into geothermal energy projects which, like other underground activities, can trigger earthquakes -- a process known as induced seismicity, as opposed to natural seismicity, where earthquakes occur without human intervention. "Subsurface exploration projects such as geothermal power, injection wells and mining all involve injecting pressurized fluids into fractures in the rock," explains Cornelio. "Studies like this show how a better understanding of the properties and effects of fluids is vital to preventing or attenuating induced earthquakes. Companies should factor these properties into their thinking, rather than focusing solely on volume and pressure considerations." Cornelio ran 36 experiments, simulating earthquakes of varying intensity, propagating at different speeds, in granite or marble, with fluids of four different viscosities. Her findings demonstrated a clear correlation between fluid viscosity and earthquake intensity. "Imagine these fluids working like soap, reducing the friction between your hands when you wash them, or like the oil you spray on mechanical parts to get them moving again," explains Marie Violay, an assistant professor and the head of the LEMR. "Moreover, naturally occurring earthquakes produce heat when the two plates rub together. That heat melts the rock, creating a lubricating film that causes the fault to slip even further. Our study also gives us a clearer picture of how that natural process works." | Earthquakes | 2,019
March 19, 2019 | https://www.sciencedaily.com/releases/2019/03/190319135053.htm | Underwater surveys in Emerald Bay reveal the nature and activity of Lake Tahoe faults | Emerald Bay, California, a beautiful location on the southwestern shore of Lake Tahoe, is surrounded by rugged landscape, including rocky cliffs and remnants of mountain glaciers. Scenic as it may be, the area is also a complex structural puzzle. Understanding the history of fault movement in the Lake Tahoe basin is important to assessing earthquake hazards for regional policy planners. | The Lake Tahoe region is rife with active faults, many of which have created the dramatic and rugged landscapes. The Lake Tahoe region lies between the Sierra Nevada microplate to the west and the Basin and Range Province to the east. Northwestward movement of the Sierra Nevada microplate creates stresses that may produce both strike-slip (horizontal) and vertical movement on faults. For years, geologists have traversed the forbidding terrain around Emerald Bay, noting where faults cut the landscape, but a detailed picture of the faults was still missing. Two of these faults -- the Tahoe-Sierra frontal fault zone (TSFFZ) and the West Tahoe-Dollar Point fault zone (WTDPFZ) -- stretch along the western side of Lake Tahoe, but their continuity across landscapes and the nature of their movement have been debated for nearly two decades. Richard Schweickert, of the University of Nevada-Reno (UNR), part of a team of geologists and engineers from UNR, the U.S. Geological Survey, and Santa Clara University, said, "We found plenty of evidence for scarps (i.e., faults) that cut the glacial moraines all along the west side of Lake Tahoe, in particular around Emerald Bay." But it was what they couldn't hike across that most interested them. "We were desperate to see what's actually going on [at] the bottom [of Emerald Bay]." The research team decided to examine the faults both "by land and by sea." In a new paper, the team reports that the high-resolution bathymetry survey showed clear evidence of scarps cutting across submerged glacial deposits and lake sediments, along with landslide deposits that toppled into the bay after the glaciers melted. Based on the age of nearby moraines, Schweickert says the scarps in most cases are younger than about 20,000 years. The fault scarps in the Bay are very sharp, with steep faces sloping between 30 and 60 degrees -- surprisingly steep for scarps that could be thousands of years old. "You would only see that steep angle on land exposures for faults that had just moved within the last few hundred years," Schweickert says, but he notes the scarps were likely preserved by being underwater instead of being repeatedly exposed to running water on land. Schweickert says that the bathymetric data, paired with direct underwater observations from a remotely operated vehicle (ROV), show conclusively that the scarps are related to faults. "We've been able to produce the highest resolution maps with the greatest amount of detail of anywhere in the Lake Tahoe Basin," he added. After studying the ROV, bathymetry, and LiDAR data, the team noted that over the past 20,000 years, the TSFFZ and WTDPFZ were moving vertically, with no strike-slip motion. Schweickert says their discovery was a bit of a paradox compared to what might be expected in the Lake Tahoe basin. Ten to 15 years of satellite GPS measurements show a northwest movement for the Lake Tahoe region, relative to the interior of North America.
But the TSFFZ and WTDPFZ don't reflect that direction of movement -- at least not recently. "I think this shows us that the GPS data really doesn't tell us what the faults themselves are doing on a local scale," says Schweickert. "They do their own thing." He adds that scientists studying other faults in the Lake Tahoe area have reached similar conclusions. However, Schweickert notes that landforms around Emerald Bay, thought to be roughly 100,000 years old, look like they experienced right-lateral, strike-slip movement sometime in the past. "Faults can move in different directions over long periods of time," he says. "Just because we see some of them doing something right now doesn't mean that they didn't have a more complex history in the not too distant past." "There's more to this story that still needs to be known," says Schweickert, who notes that studies like theirs have value for policy and planning. | Earthquakes | 2,019
March 19, 2019 | https://www.sciencedaily.com/releases/2019/03/190319092019.htm | A new first: Scientists mimic nature's self-affinity using computer simulations | For the first time ever, researchers have simulated the process of surface roughness creation: a step forward in understanding the emergence of the fractal character of rough surfaces on many scales, ranging from atomic to geological -- and maybe a step forward in earthquake prediction. | The topography of the world looks the same whether you look at it from a hilltop viewpoint or under the lens of a microscope. A mountain range at geological scale is nearly identical to the roughness features of an otherwise solid metal object at nanometer scale. This similarity was first highlighted in 1983 by mathematician Benoit Mandelbrot. Since then it has been proven that all fractured surfaces share a similar characteristic, referred to as fractal self-affinity. This unique characteristic is often attributed to the physics of crack propagation through solid materials, which -- up until now -- has proved notoriously difficult to model. "This is the first time we have been able to reproduce self-affinity of sliding surfaces using computer simulations. Our simulations clearly illustrate how subsurface crack propagation and surface wear generate near-identical roughness across multiple scales, independent of the initial roughness state," says Ramin Aghababaei, Assistant Professor at the Department of Engineering, Aarhus University. The study shows that the development of the self-affine morphology is due to smoothing and re-roughening mechanisms via subsurface crack propagation and third-body configuration; the new findings have now been published. The numerical technique, which was the stepping stone for this work, was developed by Ramin Aghababaei in collaboration with scientists from École Polytechnique Fédérale de Lausanne (EPFL), Switzerland, and Cornell University, USA. "I started to work on this subject three years ago when I was at the EPFL and worked in the Computational Solid Mechanics Laboratory of Prof. Jean-Francois Molinari. The current study was made possible by the great effort of PhD student, Enrico Milanese, lead author of the paper, who used the developed technique and conducted large-scale simulations to track the roughness evolution of surfaces sliding against one another," Assistant Professor Aghababaei explains. The main goal of the current project was to model the process of roughness creation, but the implications of their findings are far-reaching. Controlling surface roughness is essential to the performance and durability of virtually all engineering applications. For instance, this is of great interest for manufacturing industries to predict and control the surface finish of machined components according to target tolerances. "In this study, we investigated the mechanism of roughness creation, and we propose it has to do with subsurface crack propagation and material removal at different scales. A crack never goes straight. It twists and curls like a lightning bolt that curves through the air. And when you zoom in, that same waviness self-replicates on smaller scales.
In other words, the crack propagation pathway dictates the self-affine nature of rough surfaces at all scales," Ramin Aghababaei says, coupling it to natural fault roughness at the geological scale. "Mankind still doesn't know why earthquakes happen, but we believe it's a matter of microscopic crack propagation at the subsurface of the earth, sending out seismic waves that make the ground shake. And if we can understand the crack-initiating mechanism by post-factum analysis of fault roughness, we hope we can predict something," he adds. Ramin Aghababaei, who leads the Surface Mechanics Group at the Department of Engineering, Aarhus University, is currently collaborating with geologists to use this knowledge to understand the microscopic origins of earthquakes. "I am really proud of this work and a great collaboration with the EPFL team on this project. I believe this new technique opens many new possibilities to explore the micromechanics of surface degradation and failure, to lay a theoretical foundation for developing the next generation of wear-resistant materials and coatings, hoping to reduce the material wastage due to wear," he says. | Earthquakes | 2,019
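The team's simulations are full fracture models, but the self-affine scaling they measure can be illustrated with a standard spectral-synthesis sketch (the Hurst exponent here is an assumed illustrative value, not one reported in the paper):

```python
# Sketch: generate a self-affine 1-D profile by giving random phases a
# power-law amplitude spectrum, then verify that height fluctuations grow
# as (window length)**H -- the hallmark of self-affinity described above.
import numpy as np

n, H = 4096, 0.8                      # samples; assumed Hurst exponent
rng = np.random.default_rng(4)
k = np.fft.rfftfreq(n)                # spatial frequencies
amp = np.zeros(k.size)
amp[1:] = k[1:] ** (-(0.5 + H))       # power-law spectrum => self-affine
profile = np.fft.irfft(amp * np.exp(1j * rng.uniform(0, 2 * np.pi, k.size)), n)

for win in (16, 64, 256):             # mean height range per window size
    segs = profile[: n - n % win].reshape(-1, win)
    print(win, f"{(segs.max(axis=1) - segs.min(axis=1)).mean():.4f}")
```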
March 14, 2019 | https://www.sciencedaily.com/releases/2019/03/190314151645.htm | Tectonics in the tropics trigger Earth's ice ages | Over the last 540 million years, the Earth has weathered three major ice ages -- periods during which global temperatures plummeted, producing extensive ice sheets and glaciers that have stretched beyond the polar caps. | Now scientists at MIT, the University of California at Santa Barbara, and the University of California at Berkeley have identified the likely trigger for these ice ages. In the newly published study, the scientists say that the heat and humidity of the tropics likely triggered a chemical reaction between the rocks and the atmosphere. Specifically, the rocks' calcium and magnesium reacted with atmospheric carbon dioxide, pulling the gas out of the atmosphere and permanently sequestering it in the form of carbonates such as limestone. Over time, the researchers say, this weathering process, occurring over millions of square kilometers, could pull enough carbon dioxide out of the atmosphere to cool temperatures globally and ultimately set off an ice age. "We think that arc-continent collisions at low latitudes are the trigger for global cooling," says Oliver Jagoutz, an associate professor in MIT's Department of Earth, Atmospheric, and Planetary Sciences. "This could occur over 1-5 million square kilometers, which sounds like a lot. But in reality, it's a very thin strip of Earth, sitting in the right location, that can change the global climate." Jagoutz' co-authors are Francis Macdonald and Lorraine Lisiecki of UC Santa Barbara, and Nicholas Swanson-Hysell and Yuem Park of UC Berkeley. When an oceanic plate pushes up against a continental plate, the collision typically creates a mountain range of newly exposed rock. The fault zone along which the oceanic and continental plates collide is called a "suture." Today, certain mountain ranges such as the Himalayas contain sutures that have migrated from their original collision points, as continents have shifted over millennia. In 2016, Jagoutz and his colleagues retraced the movements of two sutures that today make up the Himalayas. They found that both sutures stemmed from the same tectonic migration. Eighty million years ago, as the supercontinent known as Gondwana moved north, part of the landmass was crushed against Eurasia, exposing a long line of oceanic rock and creating the first suture; 50 million years ago, another collision between the supercontinents created a second suture. The team found that both collisions occurred in tropical zones near the equator, and both preceded global atmospheric cooling events by several million years -- which is nearly instantaneous on a geologic timescale. After looking into the rates at which exposed oceanic rock, also known as ophiolites, could react with carbon dioxide in the tropics, the researchers concluded that, given their location and magnitude, both sutures could have indeed sequestered enough carbon dioxide to cool the atmosphere and trigger both ice ages. Interestingly, they found that this process was likely responsible for ending both ice ages as well. Over millions of years, the oceanic rock that was available to react with the atmosphere eventually eroded away, replaced with new rock that took up far less carbon dioxide. "We showed that this process can start and end glaciation," Jagoutz says. "Then we wondered, how often does that work?
If our hypothesis is correct, we should find that for every time there's a cooling event, there are a lot of sutures in the tropics." The researchers looked to see whether ice ages even further back in Earth's history were associated with similar arc-continent collisions in the tropics. They performed an extensive literature search to compile the locations of all the major suture zones on Earth today, and then used a computer simulation of plate tectonics to reconstruct the movement of these suture zones, and the Earth's continental and oceanic plates, back through time. In this way, they were able to pinpoint approximately where and when each suture originally formed, and how long each suture stretched. They identified three periods over the last 540 million years in which major sutures, of about 10,000 kilometers in length, were formed in the tropics. Each of these periods coincided with one of the three major, well-known ice ages, in the Late Ordovician (455 to 440 million years ago), the Permo-Carboniferous (335 to 280 million years ago), and the Cenozoic (35 million years ago to present day). Importantly, they found there were no ice ages or glaciation events during periods when major suture zones formed outside of the tropics. "We found that every time there was a peak in the suture zone in the tropics, there was a glaciation event," Jagoutz says. "So every time you get, say, 10,000 kilometers of sutures in the tropics, you get an ice age." He notes that a major suture zone, spanning about 10,000 kilometers, is still active today in Indonesia, and is possibly responsible for the Earth's current glacial period and the appearance of extensive ice sheets at the poles. This tropical zone includes some of the largest ophiolite bodies in the world and is currently one of the most efficient regions on Earth for absorbing and sequestering carbon dioxide. As global temperatures are climbing as a result of human-derived carbon dioxide, some scientists have proposed grinding up vast quantities of ophiolites and spreading the minerals throughout the equatorial belt, in an effort to speed up this natural cooling process. But Jagoutz says the act of grinding up and transporting these materials could produce additional, unintended carbon emissions. And it's unclear whether such measures could make any significant impact within our lifetimes. "It's a challenge to make this process work on human timescales," Jagoutz says. "The Earth does this in a slow, geological process that has nothing to do with what we do to the Earth today. And it will neither harm us, nor save us." | Earthquakes | 2,019
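For reference, the carbonate-forming weathering the team invokes is usually summarized by the textbook silicate-weathering reactions below (an idealized sketch using calcium and magnesium silicates as stand-ins; the mineralogy of real ophiolites is more varied):

```latex
\mathrm{CaSiO_3 + CO_2 \longrightarrow CaCO_3 + SiO_2}
\qquad
\mathrm{MgSiO_3 + CO_2 \longrightarrow MgCO_3 + SiO_2}
```

Each mole of silicate consumed locks one mole of atmospheric CO2 into carbonate, which is why the areal extent of exposed rock matters so much to the study's argument.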
March 11, 2019 | https://www.sciencedaily.com/releases/2019/03/190311081933.htm | New way to sense earthquakes could help improve early warning systems | Every year earthquakes worldwide claim hundreds or even thousands of lives. Forewarning allows people to head for safety, and a matter of seconds could spell the difference between life and death. UTokyo researchers demonstrate a new earthquake detection method -- their technique exploits subtle telltale gravitational signals traveling ahead of the tremors. Future research could boost early warning systems. | The shock of the 2011 Tohoku earthquake in eastern Japan still resonates for many. It caused unimaginable devastation, but also generated vast amounts of seismic and other kinds of data. Years later researchers still mine this data to improve models and find novel ways to use it, which could help people in the future. A team of researchers from the University of Tokyo's Earthquake Research Institute (ERI) found something in this data which could help the field of research and might someday even save lives. It all started when ERI Associate Professor Shingo Watada read an interesting physics paper on an unrelated topic by J. Harms from Istituto Nazionale di Fisica Nucleare in Italy. The paper suggests gravimeters -- sensors which measure the strength of local gravity -- could theoretically detect earthquakes. "This got me thinking," said Watada. "If we have enough seismic and gravitational data from the time and place a big earthquake hit, we could learn to detect earthquakes with gravimeters as well as seismometers. This could be an important tool for future research of seismic phenomena." The idea works like this. Earthquakes occur when a point along the edge of a tectonic plate comprising the earth's surface makes a sudden movement. This generates seismic waves which radiate from that point at 6-8 kilometers per second. These waves transmit energy through the earth and rapidly alter the density of the subsurface material they pass through. Dense material imparts a slightly greater gravitational attraction than less dense material. As gravity propagates at light speed, sensitive gravimeters can pick up these changes in density ahead of the seismic waves' arrival. "This is the first time anyone has shown definitive earthquake signals with such a method. Others have investigated the idea, yet not found reliable signals," elaborated ERI postgraduate Masaya Kimura. "Our approach is unique as we examined a broader range of sensors active during the 2011 earthquake. And we used special processing methods to isolate quiet gravitational signals from the noisy data." Japan is famously very seismically active, so it's no surprise there are extensive networks of seismic instruments on land and at sea in the region. The researchers used a range of seismic data from these and also superconducting gravimeters (SGs) in Kamioka, Gifu Prefecture, and Matsushiro, Nagano Prefecture, in central Japan. The signal analysis they performed was extremely reliable, scoring what scientists term a 7-sigma accuracy, meaning there is only a one-in-a-trillion chance a result is incorrect. This fact greatly helps to prove the concept and will be useful in calibration of future instruments built specifically to help detect earthquakes.
Associate Professor Masaki Ando from the Department of Physics invented a novel kind of gravimeter -- the torsion bar antenna (TOBA) -- which aims to be the first of such instruments. "SGs and seismometers are not ideal as the sensors within them move together with the instrument, which almost cancels subtle signals from earthquakes," explained ERI Associate Professor Nobuki Kame. "This is known as Einstein's elevator, or the equivalence principle. However, the TOBA will overcome this problem. It senses changes in the gravity gradient despite motion. It was originally designed to detect gravitational waves from the big bang, like earthquakes in space, but our purpose is more down-to-earth." The team dreams of a network of TOBA distributed around seismically active regions, an early warning system that could alert people 10 seconds before the first ground shaking waves arrive from an epicenter 100 km away. Many earthquake deaths occur as people are caught off-guard inside buildings that collapse on them. Imagine the difference 10 seconds could make. This will take time, but the researchers continually refine models to improve the accuracy of the method for eventual use in the field. This research was supported by JSPS (KAKENHI JP15K13559, JP16K05532, JP18J21734), and MEXT via the Program for Leading Graduate Schools and the Earthquake and Volcano Hazards Observation and Research Program. | Earthquakes | 2,019
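The 10-second figure is easy to sanity-check with a back-of-the-envelope sketch (using the representative wave speeds quoted in the article and treating an ideal gravity-gradient signal as instantaneous; detection latency is ignored):

```python
# Lead time of a hypothetical gravity-based alert over the seismic waves
# themselves. Assumes the gravity signal arrives effectively instantly
# (it propagates at light speed) and ignores detection latency.
P_WAVE_SPEED_KM_S = 7.0   # representative P-wave speed (article: 6-8 km/s)
S_WAVE_SPEED_KM_S = 3.6   # slower shear waves carry the damaging shaking

def lead_time_s(distance_km: float, wave_speed_km_s: float) -> float:
    """Seconds between an instantaneous alert and seismic-wave arrival."""
    return distance_km / wave_speed_km_s

distance_km = 100.0  # the scenario quoted in the article
print(f"lead on first (P-wave) shaking:  {lead_time_s(distance_km, P_WAVE_SPEED_KM_S):.0f} s")
print(f"lead on strong (S-wave) shaking: {lead_time_s(distance_km, S_WAVE_SPEED_KM_S):.0f} s")
# ~14 s and ~28 s -- consistent with a usable ~10-second warning once
# real-world detection and processing delays are subtracted.
```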
March 1, 2019 | https://www.sciencedaily.com/releases/2019/03/190301123247.htm | Machine learning expands to help predict and characterize earthquakes | With a growing wealth of seismic data and computing power at their disposal, seismologists are increasingly turning to a discipline called machine learning to better understand and predict complicated patterns in earthquake activity. | In a focus section published in the journal, researchers describe these emerging applications. Machine learning refers to a set of algorithms and models that allow computers to identify and extract patterns of information from large data sets. Machine learning methods often discover these patterns from the data themselves, without reference to the real-world, physical mechanisms represented by the data. The methods have been used successfully on problems such as digital image and speech recognition, among other applications. More seismologists are using the methods, driven by "the increasing size of seismic data sets, improvements in computational power, new algorithms and architecture and the availability of easy-to-use open source machine learning frameworks," write focus section editors Karianne Bergen of Harvard University, Ting Cheng of Los Alamos National Laboratory, and Zefeng Li of Caltech. Several researchers are using a class of machine learning methods called deep neural networks, which can learn the complex relationships between massive amounts of input data and their predicted output. For instance, Farid Khosravikia and colleagues at the University of Texas, Austin show how one kind of deep neural network can be used to develop ground motion models for natural and induced earthquakes in Oklahoma, Kansas and Texas. The unusual nature of the growing number of earthquakes caused by petroleum wastewater disposal in the region makes it essential to predict ground motion for future earthquakes and to possibly mitigate their impact. Machine learning techniques could be used increasingly in the near future to preserve analog records of past earthquakes. As the media on which these data are recorded gradually degrade, seismologists are in a race against time to protect these valuable records. Machine learning methods that can identify and categorize images can be used to capture these data in a cost-effective manner, according to Kaiwen Wang of Stanford University and colleagues, who tested the possibilities on analog seismograph film from the U.S. Geological Survey's Rangely earthquake control experiment. Machine learning methods are also already in place in applications such as MyShake, which harvests and analyzes data from the crowdsourced global smartphone seismic network, according to Qingkai Kong of the University of California, Berkeley and colleagues. Other researchers are using machine learning algorithms to sift through seismic data to better identify earthquake aftershocks, volcanic seismic activity and to monitor the tectonic tremor that marks deformation at plate boundaries where megathrust earthquakes might occur. Some studies use machine learning techniques to locate earthquake origins and to distinguish small earthquakes from other seismic "noise" in the environment. | Earthquakes | 2,019
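None of the individual models from the focus section are reproduced in the article, but the supervised-learning workflow they share can be sketched in a few lines of Python (scikit-learn with synthetic stand-in features, purely for illustration):

```python
# Minimal sketch of the common workflow: describe each seismic record by a
# few features, then train a classifier to separate earthquakes from noise.
# The two features and all data here are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Hypothetical features: peak amplitude and dominant frequency per record.
quakes = np.column_stack([rng.normal(5, 1, n), rng.normal(2, 0.5, n)])
noise = np.column_stack([rng.normal(3, 1, n), rng.normal(8, 2, n)])
X = np.vstack([quakes, noise])
y = np.concatenate([np.ones(n), np.zeros(n)])  # 1 = earthquake, 0 = noise

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```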
February 28, 2019 | https://www.sciencedaily.com/releases/2019/02/190228113625.htm | Drilling results reveal global climate influence on basin waters in young rifts | Young rift basins, such as the Gulf of Corinth, are known to be sensitive recorders of past changes in climate and sea level and of the chemical and biological conditions of the waters they contain. Nonetheless, the changes observed in the Gulf of Corinth have now been published. | This is an important discovery for understanding the impact that global climate fluctuations have on the history of sedimentation, particularly for the earliest sediments deposited as new ocean basins form. The process of continental rifting is fundamental for the formation of new ocean basins, and these basins are the source of a large proportion of the Earth's hydrocarbon resources. Therefore their history of sedimentation informs how hydrocarbons are formed and where they may collect. The results come from a new scientific ocean drilling expedition conducted as part of the International Ocean Discovery Program (IODP) and are published this week. "This is the first long and high-resolution sediment record of the early rifting process ever obtained," Lisa McNeill comments. "At one site, an expanded 700 m section records the last 800,000 years of rift basin history." The cores have revealed how the conditions changed as global climate fluctuations and ice sheet growth reduced sea level and isolated the Corinth basin from the open ocean. The reduced salinity severely restricted the range of organisms that could inhabit the waters under such stressed conditions. These fluctuations occurred about once every 100,000 years as global climate changed. The most striking result is that the rate of sediment flux into the basin increases by a factor of 2-7 during the globally glaciated periods relative to the non-glaciated periods. This is recorded in the relative thicknesses of the sedimentary fill of the rift basin during these different periods. Donna Shillington explains, "We argue that this change is not caused by increased precipitation and erosion during glacial times, but rather by a reduction in vegetation cover and a change in the type of vegetation during glacial periods leading to enhanced erosion." This is an important result for understanding the Eastern Mediterranean climate at this time and for the impact of global climate fluctuations on the rate of filling of sedimentary basins. Such basins are the source of a large proportion of the Earth's hydrocarbon resources, and their history of sedimentation, particularly in the early phase of rifting, informs how hydrocarbon source and reservoir rocks develop. The science team also found that the sedimentation rates in the Holocene (the last 10,000 years) were much higher than in earlier non-glaciated time periods. This is probably due to the human impacts on the landscape of mainland Greece, deforesting the landscape and increasing slope erosion rates over a period of about 4000 years. 1. The Science Team of the International Ocean Discovery Program (IODP) Expedition 381 included 35 scientists of different geoscience disciplines from Australia, Brazil, China, France, Germany, Greece, India, Norway, Spain, the United Kingdom, and the United States, nine of whom sailed onboard the Fugro Synergy October to December of 2017 in the Gulf of Corinth, Greece, to collect the cores and data.
After the offshore phase, the whole Science Team met at the IODP Bremen Core Repository (BCR), located at MARUM -- Center for Marine Environmental Sciences at the University of Bremen, Germany, in February 2018 to split, analyze and sample the cores and analyze the data collected. The scientists are continuing to analyze the cores and samples over the next 2 to 4 years to extract more information from this unique new dataset. 2. The expedition is conducted by the European Consortium for Ocean Research Drilling (ECORD) as part of the International Ocean Discovery Program (IODP). The International Ocean Discovery Program (IODP) is an international marine research program supported by 23 countries, which explores Earth's history and structure recorded in seafloor sediments and rocks, and monitors sub-seafloor environments. Through multiple platforms -- a feature unique to IODP -- scientists sample the deep biosphere and sub-seafloor ocean, environmental change, processes and effects, and solid earth cycles and dynamics. | Earthquakes | 2,019
February 26, 2019 | https://www.sciencedaily.com/releases/2019/02/190226112308.htm | Coda waves reveal carbon dioxide storage plume | Pumping carbon dioxide into the ground to remove it from the atmosphere is one way to lower greenhouse gases, but keeping track of where that gas is has been a difficult chore. Now, a team of researchers from Penn State and Lawrence Berkeley National Laboratory are using previously ignored seismic waves to pinpoint and track the gas clouds. | "We usually don't look at coda waves; we usually throw them out," said Tieyuan Zhu, professor of geophysics, Penn State. "If we look at a carbon dioxide plume underground with P waves, we don't see any change in shape, but if we use the late-arriving waves, the coda waves, we see a change." P waves are the fastest seismic body waves that pass through the Earth after an earthquake or explosion. S waves are slower body waves. Coda waves come later and are disorganized, but they can reveal where gases are stored in the ground, because the combination of rock and gas alters the waves. When carbon dioxide is stored underground, it is pumped over a mile deep into geological spaces and is usually over 150 degrees Fahrenheit and at high pressure. Ideally, researchers would like the gas to stay at that depth forever. Current monitoring methods are difficult, expensive and can only be done periodically. Tracking the plumes with coda waves also has the benefit of better estimation of the total amount of gas in the reservoir, rather than just a local region. "Current technology is very expensive," said Zhu. "We are looking for an economical method to monitor the gas." Standard monitoring now is done at six-month or yearly intervals. The researchers hope that by using permanent seismic sources and coda wave analysis they can monitor much more frequently -- days or weeks. "Our future experiments would test real-time monitoring systems," said Zhu. "In that way, we could detect small changes in the gas plume." Chris Marone, professor of geosciences at Penn State, is also working on the project. Work at Lawrence Berkeley National Laboratory was carried out by staff scientists Jonathan Ajo-Franklin and Thomas Daley. | Earthquakes | 2,019
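The article does not spell out the group's processing chain, but a standard way to quantify changes in late-arriving energy -- a sketch of coda-wave monitoring in general, not the Penn State pipeline -- is to correlate the coda window of a repeat recording against a baseline:

```python
# Sketch of coda-wave monitoring on synthetic data: a small travel-time
# shift in the repeat survey (here standing in for a growing CO2 plume)
# lowers the correlation of the late coda window against the baseline.
import numpy as np

fs = 100.0                    # sampling rate, Hz
t = np.arange(0, 20, 1 / fs)  # 20 s record
rng = np.random.default_rng(1)

def synthetic_coda(delay: float) -> np.ndarray:
    """Decaying wavetrain standing in for scattered coda energy."""
    envelope = np.exp(-0.1 * t)
    return envelope * np.sin(2 * np.pi * 5 * (t - delay)) + 0.01 * rng.normal(size=t.size)

baseline = synthetic_coda(delay=0.0)
repeat = synthetic_coda(delay=0.02)  # 20 ms slowdown after injection

w = slice(int(10 * fs), int(20 * fs))  # correlate only the 10-20 s coda window
cc = np.corrcoef(baseline[w], repeat[w])[0, 1]
print(f"coda correlation coefficient: {cc:.3f}")  # values below 1 flag change
```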
February 15, 2019 | https://www.sciencedaily.com/releases/2019/02/190215092922.htm | Tide gauges capture tremor episodes in Cascadia Subduction Zone | Hourly water level records collected from tide gauges can be used to measure land uplift caused by episodic tremor and slip of slow earthquakes in the Cascadia Subduction Zone, according to a new report. | Global Positioning System (GPS) data are typically used to measure uplift from these events, but the new findings offer a way to study the phenomena using tide gauge records collected in the pre-GPS era, before 1995, the study's authors said. The Cascadia Subduction Zone marks where the North American continental plate collides with multiple oceanic plates. The collision can produce devastating megathrust earthquakes that result in massive loss of life and damage to infrastructure, with the last one occurring in the region in 1700. But the zone also hosts slow earthquakes, accompanied by episodic tremor and slip (ETS), that take months or years to release the energy built up by the colliding plates. The slow earthquakes are of interest to seismologists looking for clues about exactly where and how the plates may be colliding, said Sequoia Alba of the University of Oregon, lead author on the BSSA paper. In particular, the earthquakes might help researchers understand the boundaries of the "locked zone" at the interface of the plates, where a rupture is likely to occur in the event of a megathrust earthquake. "The part of the fault that is slipping during ETS can't be totally locked, because it is experiencing periodic slow earthquakes, so that area sort of defines the edge of what could theoretically slip, producing destructive seismic waves, during a megathrust earthquake," Alba explained. "How destructive [a megathrust earthquake] would be to the people and cities of the Pacific Northwest will depend largely on where on the interface the earthquake happens -- in this case, how far inland it happens -- because the intensity of shaking is dependent on distance," she added. "If the part of the fault that slips during an earthquake is beneath the ocean floor miles out to sea, for example, that will be less damaging to cities like Portland and Seattle than if the slip patch is directly beneath those cities." Other seismologists have suggested that ETS "may change in some observable way over the megathrust cycle that would help us predict how likely an earthquake is in a given period in the future," Alba noted. Alba and her colleagues turned to tide gauge data as a possible way to detect ETS patterns before the use of GPS in the Cascadia region. They looked for signs of ETS-related uplift reflected in the hourly water levels measured by four gauges along the Juan de Fuca Strait and Puget Sound, at Port Angeles, Port Townsend, Neah Bay and Seattle. Average relative sea level (compared to a fixed point on land) should appear to descend as the land itself deforms upward during a slow earthquake. The researchers then calculated the amount of uplift and uplift rates suggested by the gauge records between 1996 and 2011, comparing their results to uplift as measured by GPS over the same time period.
The gauge data were not sensitive enough to capture individual ETS events, Alba and colleagues concluded, but they could be used to detect periodic groups of ETS events. Both the GPS and tide gauge data suggest these events occurred every 14.6 months between 1996 and 2011, but Alba and colleagues could not find that same pattern in the tide gauge data from 1980 to 1995. "Our results are too preliminary to characterize how ETS changes, or if ETS was present during the pre-GPS era, but there does appear to have been a change," they write. The recurrence interval for ETS could have been different between 1980 and 1995, they suggest, or slip might have been taking place along a different part of the Cascadia interface during the earlier time period that would not have been captured in the tidal data. | Earthquakes | 2,019
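The published analysis is considerably more careful, but the core idea -- searching detrended tide-gauge residuals for a 14.6-month recurrence -- can be sketched with a simple periodogram (synthetic monthly residuals, not the BSSA data):

```python
# Sketch: find the dominant period in synthetic monthly tide-gauge
# residuals. ETS uplift appears as an apparent sea-level drop at the gauge.
import numpy as np

months = np.arange(16 * 12)              # 16 years of monthly values, 1996-2011
rng = np.random.default_rng(2)
ets_period = 14.6                        # months, per the GPS-derived recurrence
residual_mm = -3.0 * np.cos(2 * np.pi * months / ets_period) + rng.normal(0, 2, months.size)

spectrum = np.abs(np.fft.rfft(residual_mm - residual_mm.mean())) ** 2
freqs = np.fft.rfftfreq(months.size, d=1.0)   # cycles per month
peak = np.argmax(spectrum[1:]) + 1            # skip the zero frequency
print(f"dominant period: {1 / freqs[peak]:.1f} months")  # close to 14.6
```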
February 14, 2019 | https://www.sciencedaily.com/releases/2019/02/190214153125.htm | Massive Bolivian earthquake reveals mountains 660 kilometers below our feet | Most schoolchildren learn that the Earth has three (or four) layers: a crust, mantle and core, which is sometimes subdivided into an inner and outer core. That's not wrong, but it does leave out several other layers that scientists have identified within the Earth. | In a study published this week, Princeton geophysicists used waves from a massive earthquake in Bolivia to find mountains and other topography on a layer 660 kilometers below our feet. To peer deep into the Earth, scientists use the most powerful waves on the planet, which are generated by massive earthquakes. "You want a big, deep earthquake to get the whole planet to shake," said Irving, an assistant professor of geosciences. Big earthquakes are vastly more powerful than small ones -- energy increases 30-fold with every step up the Richter scale -- and deep earthquakes, "instead of frittering away their energy in the crust, can get the whole mantle going," Irving said. She gets her best data from earthquakes that are magnitude 7.0 or higher, she said, as the shockwaves they send out in all directions can travel through the core to the other side of the planet -- and back again. For this study, the key data came from waves picked up after a magnitude 8.2 earthquake -- the second-largest deep earthquake ever recorded -- that shook Bolivia in 1994. "Earthquakes this big don't come along very often," she said. "We're lucky now that we have so many more seismometers than we did even 20 years ago. Seismology is a different field than it was 20 years ago, between instruments and computational resources." Seismologists and data scientists use powerful computers, including Princeton's Tiger supercomputer cluster, to simulate the complicated behavior of scattering waves in the deep Earth. The technology depends on a fundamental property of waves: their ability to bend and bounce. Just as light waves can bounce (reflect) off a mirror or bend (refract) when passing through a prism, earthquake waves travel straight through homogenous rocks but reflect or refract when they encounter any boundary or roughness. "We know that almost all objects have surface roughness and therefore scatter light," said Wu, the lead author on the new paper, who just completed his geosciences Ph.D. and is now a postdoctoral researcher at the California Institute of Technology. "That's why we can see these objects -- the scattering waves carry the information about the surface's roughness. In this study, we investigated scattered seismic waves traveling inside the Earth to constrain the roughness of the Earth's 660-km boundary." The researchers were surprised by just how rough that boundary is -- rougher than the surface layer that we all live on. "In other words, stronger topography than the Rocky Mountains or the Appalachians is present at the 660-km boundary," said Wu. Their statistical model didn't allow for precise height determinations, but there's a chance that these mountains are bigger than anything on the surface of the Earth. The roughness wasn't equally distributed, either; just as the crust's surface has smooth ocean floors and massive mountains, the 660-km boundary has rough areas and smooth patches.
The researchers also examined a layer 410 kilometers (255 miles) down, at the top of the mid-mantle "transition zone," and they did not find similar roughness. "They find that Earth's deep layers are just as complicated as what we observe at the surface," said seismologist Christine Houser, an assistant professor at the Tokyo Institute of Technology who was not involved in this research. "To find 2-mile (1-3 km) elevation changes on a boundary that is over 400 miles (660 km) deep using waves that travel through the entire Earth and back is an inspiring feat. ... Their findings suggest that as earthquakes occur and seismic instruments become more sophisticated and expand into new areas, we will continue to detect new small-scale signals which reveal new properties of Earth's layers." The presence of roughness on the 660-km boundary has significant implications for understanding how our planet formed and continues to function. That layer divides the mantle, which makes up about 84 percent of the Earth's volume, into its upper and lower sections. For years, geoscientists have debated just how important that boundary is. In particular, they have investigated how heat travels through the mantle -- whether hot rocks are carried smoothly from the core-mantle boundary (almost 2,000 miles down) all the way up to the top of the mantle, or whether that transfer is interrupted at this layer. Some geochemical and mineralogical evidence suggests that the upper and lower mantle are chemically different, which supports the idea that the two sections don't mix thermally or physically. Other observations suggest no chemical difference between the upper and lower mantle, leading some to argue for what's called a "well-mixed mantle," with both the upper and lower mantle participating in the same heat-transfer cycle. "Our findings provide insight into this question," said Wu. Their data suggest that both groups might be partially right. The smoother areas of the 660-km boundary could result from more thorough vertical mixing, while the rougher, mountainous areas may have formed where the upper and lower mantle don't mix as well. In addition, the roughness the researchers found, which existed at large, moderate and small scales, could theoretically be caused by heat anomalies or chemical heterogeneities. But because of how heat is transported within the mantle, Wu explained, any small-scale thermal anomaly would be smoothed out within a million years. That leaves only chemical differences to explain the small-scale roughness they found. What could cause significant chemical differences? The introduction of rocks that used to belong to the crust, now resting quietly in the mantle. Scientists have long debated the fate of the slabs of sea floor that get pushed into the mantle at subduction zones, the collisions found all around the Pacific Ocean and elsewhere around the world. Wu and Irving suggest that remnants of these slabs may now be just above or just below the 660-km boundary. "It's easy to assume, given we can only detect seismic waves traveling through the Earth in its current state, that seismologists can't help understand how Earth's interior has changed over the past 4.5 billion years," said Irving.
"What's exciting about these results is that they give us new information to understand the fate of ancient tectonic plates which have descended into the mantle, and where ancient mantle material might still reside."She added: "Seismology is most exciting when it lets us better understand our planet's interior in both space and time." | Earthquakes | 2,019 |
February 14, 2019 | https://www.sciencedaily.com/releases/2019/02/190214084620.htm | Satellite images reveal interconnected plumbing system that caused Bali volcano to erupt | A team of scientists, led by the University of Bristol, has used satellite technology provided by the European Space Agency (ESA) to uncover why the Agung volcano in Bali erupted in November 2017 after 50 years of dormancy. | Their findings were published today. Two months prior to the eruption, there was a sudden increase in the number of small earthquakes occurring around the volcano, triggering the evacuation of 100,000 people. The previous eruption of Agung in 1963 killed nearly 2,000 people and was followed by a small eruption at its neighboring volcano, Batur. Because this past event was among the deadliest volcanic eruptions of the 20th Century, a great effort was deployed by the scientific community to monitor and understand the re-awakening of Agung. During this time, a team of scientists from the University of Bristol's School of Earth Sciences, led by Dr Juliet Biggs, used Sentinel-1 satellite imagery provided by the ESA to monitor the ground deformation at Agung. Dr Biggs said: "From remote sensing, we are able to map out any ground motion, which may be an indicator that fresh magma is moving beneath the volcano." In the new study, carried out in collaboration with the Center for Volcanology and Geological Hazard Mitigation in Indonesia (CVGHM), the team detected uplift of about 8-10 cm on the northern flank of the volcano during the period of intense earthquake activity. Dr Fabien Albino, also from Bristol's School of Earth Sciences, added: "Surprisingly, we noticed that both the earthquake activity and the ground deformation signal were located five kilometres away from the summit, which means that magma must be moving sideways as well as vertically upwards." "Our study provides the first geophysical evidence that Agung and Batur volcanoes may have a connected plumbing system." "This has important implications for eruption forecasting and could explain the occurrence of simultaneous eruptions such as in 1963." | Earthquakes | 2,019
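Sentinel-1 specifics aside, the basic conversion from interferometric phase to line-of-sight ground motion is a one-line formula; a sketch using the standard two-way path relation and Sentinel-1's C-band wavelength (the team's actual time-series processing is far more elaborate, and sign conventions vary between processors):

```python
# Sketch: convert unwrapped interferometric phase to line-of-sight (LOS)
# displacement. One fringe (2*pi of phase) corresponds to half a wavelength
# of two-way path change, ~2.8 cm for Sentinel-1's C-band radar.
import numpy as np

WAVELENGTH_M = 0.0555  # Sentinel-1 C-band wavelength in meters

def los_displacement_m(unwrapped_phase_rad: np.ndarray) -> np.ndarray:
    """d = -lambda * phi / (4 * pi); sign convention varies by processor."""
    return -WAVELENGTH_M * unwrapped_phase_rad / (4 * np.pi)

phase = np.array([2 * np.pi, 6 * np.pi])      # one fringe, three fringes
print(los_displacement_m(phase) * 100, "cm")  # ~2.8 cm and ~8.3 cm of motion
# Roughly three fringes thus accounts for the 8-10 cm uplift reported above.
```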
February 12, 2019 | https://www.sciencedaily.com/releases/2019/02/190212162218.htm | Indonesia's devastating 2018 earthquake was a rare supershear, UCLA study finds | The devastating 7.5 magnitude earthquake that struck the Indonesian island of Sulawesi last September was a rare "supershear" earthquake, according to a study led by UCLA researchers. | Only a dozen supershear quakes have been identified in the past two decades, according to Lingsen Meng, UCLA's Leon and Joanne V.C. Knopoff Professor of Physics and Geophysics and one of the report's senior authors. The new study was published Feb. 4. Meng and a team of scientists from UCLA, France's Geoazur Laboratory, the Jet Propulsion Laboratory at Caltech, and the Seismological Laboratory at Caltech analyzed the speed, timing and extent of the Palu earthquake. Using high-resolution observations of the seismic waves caused by the temblor, along with satellite radar and optical images, they found that the earthquake propagated unusually fast, which identified it as a supershear. Supershear earthquakes are characterized by the rupture in Earth's crust moving very fast along a fault, causing the up-and-down or side-to-side waves that shake the ground -- called seismic shear waves -- to intensify. Shear waves are created in standard earthquakes, too, but in supershear quakes, the rupture moving faster than the shear waves produces more energy in a shorter time, which is what makes supershears even more destructive. "That intense shaking was responsible for the widespread landslides and liquefaction [the softening of soil caused by the shaking, which often causes buildings to sink into the mud] that followed the Palu earthquake," Meng said. In fact, he said, the vibrations produced by the shaking of supershear earthquakes are analogous to the sound vibrations of the sonic boom produced by supersonic jets. UCLA graduate student Han Bao, the report's first author, gathered publicly available ground-motion recordings from a sensor network in Australia -- about 2,500 miles away from where the earthquake was centered -- and used a UCLA-developed source imaging technique that tracks the growth of large earthquakes to determine its rupture speed. The technique is similar to how a smartphone user's location can be determined by triangulating the times that phone signals arrive at cellphone antenna towers. "Our technique uses a similar idea," Meng said. "We measured the delays between different seismic sensors that record the seismic motions at set locations." The researchers could then use that to determine the location of the rupture at different times during the earthquake. They determined that the minute-long quake moved away from the epicenter at 4.1 kilometers per second (or about 2.6 miles per second), faster than the surrounding shear-wave speed of 3.6 kilometers per second (2.3 miles per second). By comparison, non-supershear earthquakes move at about 60 percent of that speed -- around 2.2 kilometers per second (1.3 miles per second), Meng said. Previous supershear earthquakes -- like the magnitude 7.8 Kunlun earthquake in Tibet in 2001 and the magnitude 7.9 Denali earthquake in Alaska in 2002 -- have occurred on faults that were remarkably straight, meaning that there were few obstacles to the quakes' paths. But the researchers found on satellite images of the Palu quake that the fault line had two large bends.
The temblor was so strong that the rupture was able to maintain a steady speed around these bends. That could be an important lesson for seismologists and other scientists who assess earthquake hazards. "If supershear earthquakes occur on nonplanar faults, as the Palu earthquake did, we have to consider the possibility of stronger shaking along California's San Andreas fault, which has many bends, kinks and branches," Meng said. Supershear earthquakes typically start at sub-shear speed and then speed up as they continue. But Meng said the Palu earthquake progressed at supershear speed almost from its inception, which would imply that there was high stress in the rocks surrounding the fault -- and therefore stronger shaking and more land movement in a compressed amount of time than would occur in standard earthquakes. "Geometrically irregular rock fragments along the fault plane usually act as barriers preventing earthquakes," Meng said. "However, if the pressure accumulates for a long time -- for decades or even hundreds of years -- an earthquake will eventually overcome the barriers and will go supershear right away." Among the paper's other authors are Tian Feng, a UCLA graduate student, and Hui Huang, a UCLA postdoctoral scholar. The UCLA researchers were supported by the National Science Foundation and the Leon and Joanne V.C. Knopoff Foundation. The other authors are Cunren Liang of the Seismological Laboratory at Caltech; Eric Fielding and Christopher Milliner of JPL at Caltech and Jean-Paul Ampuero of Geoazur. | Earthquakes | 2,019
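The UCLA source-imaging method itself is not detailed in the article, but the delay-triangulation idea Meng describes can be illustrated with a toy delay-and-sum grid search (synthetic 2-D geometry and a fixed wave speed; not the study's code):

```python
# Toy version of delay triangulation: grid-search the source position whose
# predicted travel times best explain the observed arrival-time delays.
import numpy as np

v = 3.6  # assumed wave speed in km/s (the local shear-wave speed above)
stations = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0], [50.0, 50.0]])
true_src = np.array([12.0, 31.0])                        # km, to be recovered
t_obs = np.linalg.norm(stations - true_src, axis=1) / v  # observed arrivals

best, best_misfit = None, np.inf
for x in np.arange(0.0, 50.5, 0.5):
    for y in np.arange(0.0, 50.5, 0.5):
        t_pred = np.linalg.norm(stations - np.array([x, y]), axis=1) / v
        # Demeaning removes the unknown origin time from the comparison.
        misfit = np.sum(((t_obs - t_obs.mean()) - (t_pred - t_pred.mean())) ** 2)
        if misfit < best_misfit:
            best, best_misfit = (x, y), misfit

print(f"recovered source: {best}, true source: {tuple(true_src)}")
# Repeating this for successive time windows traces the rupture front; the
# front's speed is what identified Palu as supershear (4.1 km/s > 3.6 km/s).
```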
February 8, 2019 | https://www.sciencedaily.com/releases/2019/02/190208115322.htm | Famous 'sandpile model' shown to move like a traveling sand dune | The so-called Abelian sandpile model has been studied by scientists for more than 30 years to better understand a physical phenomenon called self-organized criticality, which appears in a plethora of real-life situations such as the coordinated firing of brain cells, the spread of forest fires, the distribution of earthquake magnitudes and even in the coordinated behavior of ant colonies. Even though the sandpile model serves as the archetypical model to study self-organized criticality, questions about its characteristics are still open and remain an active field of research. | Moritz Lang and Mikhail Shkonikov from the Institute of Science and Technology Austria (IST Austria) have now discovered a new property of this mathematical model: by adding sand grains in a specific manner to the sandpile, they induce dynamics reminiscent of the emergence, movement, collision and disappearance of sand dunes in the Gobi or the Namib desert. Unlike real-world sand dunes, however, the dunes in their newly published work are fractal patterns. The rules of the "sandpile experiment" are fairly simple: The model essentially consists of a grid of quadratic fields, similar to a checkerboard, onto which sand grains are dropped randomly. Fields that end up with fewer than four grains of sand remain stable, but when more grains accumulate on a field, the field becomes unstable and "topples." In such a "toppling," four grains of sand are passed on to the four neighboring fields: one to the top, one to the bottom, one to the left, and one to the right. This might cause the neighboring fields to also become unstable and topple, which then in turn may cause the next neighbors to topple and so on -- an "avalanche" emerges. Similar to real-world avalanches in the Alps, these "sandpile avalanches" have no characteristic size, and it is extremely challenging to predict whether the next sand grain will cause a huge avalanche, or nothing at all. While, due to the simplicity of these rules, the sandpile model is regularly used as an easy example in elementary programming courses, it nevertheless displays various mathematical and physical phenomena still unexplained today -- despite more than 30 years of extensive research. Among the most fascinating of these phenomena is the appearance of fractal sandpile configurations. These fractal sandpiles are characterized by repetitive and self-similar patterns where the same shapes appear over and over again, but in smaller and smaller versions. The occurrence of these fractal patterns has so far evaded any mathematical explanation. While the researchers at IST Austria could not solve this mathematical riddle either, they rendered the phenomenon even more mysterious by showing that these fractal patterns can seemingly continuously transform into one another: They were able to produce movies in which the fractal patterns display dynamics that are, depending on the background of the observer, either reminiscent of the movement of real-world sand dunes or of the "psychedelic movies" characteristic of the 1970s. Not solving a mathematical question but only making it appear to be even more mysterious might at first sight not seem to be the ideal outcome.
However, the two scientists -- Moritz Lang, a postdoc in the research group of Professor Calin Guet, and Mikhail Shkonikov, a postdoc in the group of Professor Tamas Hausel -- believe that their "psychedelic movies" might be the key to a better understanding of the sandpile model, and maybe also of many other physical, biological or even economic problems. "You could say that we have found universal coordinates for the sandpile," says Mikhail Shkonikov. "Essentially, we can give every sand dune in the desert a very specific identifier." Moritz Lang, who is a theoretical biologist, adds: "The key to understanding any physical or biological phenomenon is to understand its consequences. The more consequences we know, the harder it becomes to develop a scientific hypothesis which is in agreement with all those consequences. In that sense, knowing all possible sand dunes and how they move represents a lot of constraints, and we hope that, in the end, this will remove sufficient hay from the stack such that we can find the needle." The two researchers see many applications of their theoretical work to real-world problems like the prediction of earthquake magnitudes, the functioning of the human brain, physics, or even economics: "In all these fields, we find haystacks which look similar, very similar. Maybe it turns out that all haystacks are the same, and that there is only one needle to find." Moritz Lang finished his PhD at ETH Zürich in spring 2015 with a thesis entitled "Modular identification and analysis of biomolecular networks." He joined IST Austria in August 2015. Mikhail Shkonikov obtained his PhD from the University of Geneva and joined IST Austria in 2017. | Earthquakes | 2,019
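As the article notes, the model is a staple of introductory programming courses; the toppling rule it describes fits comfortably in a few lines of Python (a minimal sketch of the classic model, not the authors' research code):

```python
# Minimal Abelian sandpile: cells holding four or more grains topple,
# sending one grain to each of the four neighbors; grains that fall off
# the edge of the grid are lost. Dropping many grains on the center cell
# and relaxing produces the fractal configurations discussed above.
import numpy as np

def stabilize(grid: np.ndarray) -> np.ndarray:
    """Topple in parallel until every cell holds fewer than 4 grains."""
    while (grid >= 4).any():
        topples = (grid >= 4).astype(int)
        grid -= 4 * topples
        grid[1:, :] += topples[:-1, :]   # one grain to the cell below
        grid[:-1, :] += topples[1:, :]   # one above
        grid[:, 1:] += topples[:, :-1]   # one to the right
        grid[:, :-1] += topples[:, 1:]   # one to the left
    return grid

grid = np.zeros((101, 101), dtype=int)
grid[50, 50] = 10_000          # drop 10,000 grains on the center field
stabilize(grid)
print(grid[45:56, 45:56])      # a slice of the resulting fractal pattern
```

Because the model is Abelian, the final configuration does not depend on the order in which unstable cells are toppled, which is what makes the simple parallel update above legitimate.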
February 5, 2019 | https://www.sciencedaily.com/releases/2019/02/190205151006.htm | Dark fiber lays groundwork for long-distance earthquake detection and groundwater mapping | In traditional seismology, researchers studying how the earth moves in the moments before, during, and after an earthquake rely on sensors that cost tens of thousands of dollars to make and install underground. And because of the expense and labor involved, only a few seismic sensors have been installed throughout remote areas of California, making it hard to understand the impacts of future earthquakes as well as small earthquakes occurring on unmapped faults. | Now researchers at the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) have figured out a way to overcome these hurdles by turning parts of a 13,000-mile-long testbed of "dark fiber," unused fiber-optic cable owned by the DOE Energy Sciences Network (ESnet), into a highly sensitive seismic activity sensor that could potentially augment the performance of earthquake early warning systems currently being developed in the western United States. The study detailing the work -- the first to employ a large regional network as an earthquake sensor -- was published this week in a Nature journal. According to Jonathan Ajo-Franklin, a staff scientist in Berkeley Lab's Earth and Environmental Sciences Area who led the study, there are approximately 10 million kilometers of fiber-optic cable around the world, and about 10 percent of that consists of dark fiber. The Ajo-Franklin group has been working toward this type of experiment for several years. In a 2017 study, they installed a fiber-optic cable in a shallow trench in Richmond, California, and demonstrated that a new sensing technology called distributed acoustic sensing (DAS) could be used for imaging of the shallow subsurface. DAS is a technology that measures seismic wavefields by shooting short laser pulses across the length of the fiber. In a follow-up study, they and a group of collaborators demonstrated for the first time that fiber-optic cables could be used as sensors for detecting earthquakes. The current study uses the same DAS technique, but instead of deploying their own fiber-optic cable, the researchers ran their experiments on a 20-mile segment of the 13,000-mile-long ESnet Dark Fiber Testbed that extends from West Sacramento to Woodland, California. "To further verify our results from the 2017 study, we knew we would need to run the DAS tests on an actual dark fiber network," said Ajo-Franklin, who also heads Berkeley Lab's Geophysics Department. "When Jonathan approached me about using our Dark Fiber Testbed, I didn't even know it was possible" to use a network as a sensor, said Inder Monga, Executive Director of ESnet and director of the Scientific Networking Division at Berkeley Lab. "No one had done this work before. But the possibilities were tremendous, so I said, 'Sure, let's do this!'" Chris Tracy from ESnet worked closely with the researchers to figure out the logistics of implementation.
Telecommunications company CenturyLink provided fiber installation information. Because the ESnet Testbed has regional coverage, the researchers were able to monitor seismic activity and environmental noise with finer detail than previous studies. "The coverage of the ESnet Dark Fiber Testbed provided us with subsurface images at a higher resolution and larger scale than would have been possible with a traditional sensor network," said co-author Verónica Rodríguez Tribaldos, a postdoctoral researcher in Ajo-Franklin's lab. "Conventional seismic networks often employ only a few dozen sensors spaced apart by several kilometers to cover an area this large, but with the ESnet Testbed and DAS, we have 10,000 sensors in a line with a two-meter spacing. This means that with just one fiber-optic cable you can gather very detailed information about soil structure over several months." After seven months of using DAS to record data through the ESnet Dark Fiber Testbed, the researchers proved that the benefits of using a commercial fiber are manifold. "Just by listening for 40 minutes, this technology has the potential to do about 10 different things at once. We were able to pick up very low frequency waves from distant earthquakes as well as the higher frequencies generated by nearby vehicles," said Ajo-Franklin. The technology allowed the researchers to tell the difference between a car or moving train versus an earthquake, and to detect both local and distant earthquakes, from Berkeley to Gilroy to Chiapas, Mexico. The technology can also be used to characterize soil quality, provide information on aquifers, and be integrated into geotechnical studies, he added. With such a detailed picture of the subsurface, the technology has potential for use in time-lapse studies of soil properties, said Rodríguez Tribaldos. For example, in environmental monitoring, this tool could be used to detect long-term groundwater changes, the melting of permafrost, or the hydrological changes involved in landslide hazards. The current study's findings also suggest that researchers may no longer have to choose between data quality and cost. "Cell phone sensors are inexpensive and tell us when a large earthquake happens nearby, but they will not be able to record the fine vibrations of the planet," said co-author Nate Lindsey, a UC Berkeley graduate student who led the field work and earthquake analysis for the 2017 study. "In this study, we showed that inexpensive fiber-optics pick up those small ground motions with surprising quality." With 300 terabytes of raw data collected for the study, the researchers have been challenged to find ways to effectively manage and process the "fire hose" of seismic information. Ajo-Franklin expressed hope to one day build a seismology data portal that couples ESnet as a sensor and data transfer mechanism, with analysis and long-term data storage managed by Berkeley Lab's supercomputing facility, NERSC (National Energy Research Scientific Computing Center). Monga added that even though the Dark Fiber Testbed will soon be lit for the next generation of ESnet, dubbed "ESnet 6," there may be sections that could be used for seismology. "Although it was completely unexpected that ESnet -- a transatlantic network dedicated for research -- could be used as a seismic sensor, it fits perfectly within our mission," he said. "At ESnet, we want to enable scientific discovery unconstrained by geography." | Earthquakes | 2,019
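Two quantitative points in the article above are easy to make concrete: a fiber interrogated every 2 m behaves like thousands of point sensors, and traffic noise can be separated from distant earthquakes by frequency. The sketch below is not the Berkeley Lab pipeline; it is a minimal illustration on synthetic data, with an assumed sampling rate and illustrative cutoff frequencies:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

# With 2-m channel spacing, a 20-km stretch of fiber yields 10,000 channels.
fs = 100.0                      # sampling rate in Hz (assumed for this demo)
t = np.arange(0, 600, 1 / fs)

# Synthetic stand-in for one DAS channel: a 0.05 Hz teleseismic arrival
# buried under ~12 Hz traffic noise plus random background noise.
channel = (np.sin(2 * np.pi * 0.05 * t)
           + 0.5 * np.sin(2 * np.pi * 12.0 * t)
           + 0.2 * np.random.randn(t.size))

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth bandpass between lo and hi (Hz)."""
    sos = butter(order, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

teleseismic = bandpass(channel, 0.01, 1.0, fs)   # distant earthquakes
traffic     = bandpass(channel, 5.0, 20.0, fs)   # cars, trains
```

Zero-phase filtering (sosfiltfilt) is used here so the two bands stay time-aligned, which matters when comparing arrival times across channels.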
February 4, 2019 | https://www.sciencedaily.com/releases/2019/02/190204085936.htm | MERMAIDs reveal secrets from below the ocean floor | Seismologists use waves generated by earthquakes to scan the interior of our planet, much like doctors image their patients using medical tomography. Earth imaging has helped us track down the deep origins of volcanic islands such as Hawaii, and identify the source zones of deep earthquakes. | "Imagine a radiologist forced to work with a CAT scanner that is missing two-thirds of its necessary sensors," said Frederik Simons, a professor of geosciences at Princeton. "Two-thirds is the fraction of the Earth that is covered by oceans and therefore lacking seismic recording stations. Such is the situation faced by seismologists attempting to sharpen their images of the inside of our planet." Some 15 years ago, when he was a postdoctoral researcher, Simons partnered with Guust Nolet, now the George J. Magee Professor of Geoscience and Geological Engineering, Emeritus, and they resolved to remediate this situation by building an undersea robot equipped with a hydrophone -- an underwater microphone that can pick up the sounds of distant earthquakes whose waves deliver acoustic energy into the oceans through the ocean floor. This week, Nolet, Simons and an international team of researchers published the first scientific results from the revolutionary seismic floats, dubbed MERMAIDs -- Mobile Earthquake Recording in Marine Areas by Independent Divers. The researchers, from institutions in the United States, France, Ecuador and China, found that the volcanoes on Galápagos are fed by a source 1,200 miles (1,900 km) deep, via a narrow conduit that is bringing hot rock to the surface. Such "mantle plumes" were first proposed in 1971 by one of the fathers of plate tectonics, Princeton geophysicist W. Jason Morgan, but they have resisted attempts at detailed seismic imaging because they are found in the oceans, rarely near any seismic stations. MERMAIDs drift passively, normally at a depth of 1,500 meters -- about a mile below the sea surface -- moving 2-3 miles per day. When one detects a possible incoming earthquake, it rises to the surface, usually within 95 minutes, to determine its position with GPS and transmit the seismic data. By letting their nine robots float freely for two years, the scientists created an artificial network of oceanic seismometers that could fill in one of the blank areas on the global geologic map, where otherwise no seismic information is available. The unexpectedly high temperature that their model shows in the Galápagos mantle plume "hints at the important role that plumes play in the mechanism that allows the Earth to keep itself warm," said Nolet. "Since the 19th century, when Lord Kelvin predicted that Earth should cool to be a dead planet within a hundred million years, geophysicists have struggled with the mystery that the Earth has kept a fairly constant temperature over more than 4.5 billion years," Nolet explained. "It could have done so only if some of the original heat from its accretion, and that created since by radioactive minerals, could stay locked inside the lower mantle. But most models of the Earth predict that the mantle should be convecting vigorously and releasing this heat much more quickly.
These results of the Galápagos experiment point to an alternative explanation: the lower mantle may well resist convection, and instead only bring heat to the surface in the form of mantle plumes such as the ones creating Galápagos and Hawaii." To further answer questions on the heat budget of the Earth and the role that mantle plumes play in it, Simons and Nolet have teamed up with seismologists from the Southern University of Science and Technology (SUSTech) in Shenzhen, China, and from the Japan Agency for Marine-Earth Science and Technology (JAMSTEC). Together, and with vessels provided by the French research fleet, they are in the process of launching some 50 MERMAIDs in the South Pacific to study the mantle plume region under the island of Tahiti. "Stay tuned! There are many more discoveries to come," said Professor Yongshun (John) Chen, a 1989 Princeton graduate alumnus who is head of the Department of Ocean Science and Engineering at SUSTech, which is leading the next phase of what they and their international team have called EarthScope-Oceans. | Earthquakes | 2,019
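The float behavior described in the article above (drift at depth, surface on detection, transmit) implies an onboard event trigger. The article does not say which algorithm the MERMAIDs use, so the following is a hedged sketch assuming a classic short-term/long-term average (STA/LTA) energy ratio, a common choice for autonomous detection; the window lengths, sampling rate, and threshold below are illustrative only:

```python
import numpy as np

def sta_lta(x, fs, sta_win=1.0, lta_win=30.0):
    """Ratio of short-term to long-term average signal energy; the
    ratio spikes when a transient (e.g., a P-wave arrival) appears."""
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    energy = x ** 2
    sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same")
    return sta / np.maximum(lta, 1e-12)

def should_surface(hydrophone_samples, fs=40.0, threshold=4.0):
    """Mimics the decision in the article: ascend for a GPS fix and data
    transmission only when the detector fires on a candidate earthquake."""
    return bool(np.any(sta_lta(hydrophone_samples, fs) > threshold))
```

A high threshold keeps false triggers rare, which matters here because every ascent costs battery and interrupts the float's drift at depth.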
January 30, 2019 | https://www.sciencedaily.com/releases/2019/01/190130133037.htm | Toppled train offers insight into ground motion, origin of 1906 earthquake | By mathematically modeling the movements of a locomotive that toppled from the tracks north of San Francisco during the city's infamous 1906 earthquake, researchers have calculated a lower limit on the earthquake ground motion at the spot of the tipped train. | Their report appears in the journal Seismological Research Letters. Seismologists rely on these ground motion simulations because the nearest ground motion records of the historical event came from a station located 35 kilometers (21.7 miles) away from the San Andreas Fault, which ruptured during the earthquake. The simulations help scientists determine what sort of ground motion might be likely during future large earthquakes in the region. Data gleaned from the unusual case of the toppled train, said Veeraraghavan, can be used to test the results of high-performance computing simulations of the 1906 quake and possible future quakes. "With the limited availability of ground motion records for large earthquakes, these eyewitness observations become critical data points that lend credibility to these large earthquake simulations," she said. The train was near the Point Reyes station north of San Francisco, stopped on a siding for refueling, when it took its historic tumble. An eyewitness to the event said the conductor had just climbed back into the locomotive: "when the train gave a great lurch to the east, followed by another to the west, which threw the whole train on its side. The astonished conductor dropped off as it went over, and at the sight of the falling chimneys and breaking windows of the station, he understood that it was the Temblor." Veeraraghavan and colleagues used the eyewitness account, a photo of the toppled train taken by the train's fireman and information about the size and design of the train (approximating it as a rectangular block) to guide their modeling of the ground motions that would have been necessary for the train to fall in the way that it did. From this analysis, the researchers calculated two measures of ground shaking -- peak ground acceleration (PGA) and peak ground velocity (PGV) -- to determine the minimum ground shaking necessary to overturn the train. Their results for these two measures are equivalent to a score of VIII (severe) on the Modified Mercalli Intensity scale. This isn't the first time that seismologists have attempted to extract ground motion data from the toppled train. Thomas Heaton of Caltech, one of the co-authors on the current paper, was part of a team that published a similar analysis in the Bulletin of the Seismological Society of America in 1999. The new study improves on these findings by considering potential vertical as well as horizontal ground motions gleaned from 140 worldwide earthquake records in its analysis. The researchers compared the ground motions calculated from their train model to three different simulations of the 1906 earthquake from a 2008 study, concluding that each of those simulated earthquakes could cause severe enough shaking to overturn the train. The findings suggest that large earthquake simulations for the region "are realistic," Veeraraghavan said.
"Such shaking could result in significant damage to a variety of nonductile structures in the Bay Area, including certain types of tall buildings."The eyewitness account of the train lurching east then west before toppling suggests that the hypocenter of the 1906 is likely to be south of Point Reyes, perhaps offshore of San Francisco and San Juan Bautista as others have calculated, the authors write. | Earthquakes | 2,019 |
January 29, 2019 | https://www.sciencedaily.com/releases/2019/01/190129093726.htm | Earthquake in super slo-mo | A big earthquake occurred south of Istanbul in the summer of 2016, but it was so slow that nobody noticed. The earthquake, which took place at mid-crustal depth, lasted more than fifty days. Only a novel processing technique applied to data from special borehole strainmeter instruments and developed by researchers from the GFZ German Research Centre for Geosciences, in collaboration with the Turkish Disaster and Emergency Management Presidency (AFAD) and the UNAVCO institute from the US, made it possible to identify the ultra-slow quake below the Sea of Marmara. The team led by Patricia Martínez-Garzón from GFZ's section "Geomechanics and Scientific Drilling" reports in the journal Earth and Planetary Science Letters. | The region south of Istanbul is part of the North Anatolian Fault, separating Eurasia from the Anatolian plate. This geological fault is a large tectonic plate boundary known to generate destructive earthquakes causing large numbers of casualties. The last such major earthquake occurred in 1999 near Izmit, causing almost 20,000 fatalities. A portion of the fault, running just south of the densely populated mega-city of Istanbul, is currently identified as a "seismic gap" and overdue to produce a large earthquake. While the tectonic loading due to plate motion is continuous, thereby accumulating elastic energy on faults day-by-day, the release of the stored energy can occur either seismically in the form of earthquakes, or aseismically during fault creep or slow deformation at depth. Understanding the interaction between both phenomena is critically important to define the seismic hazard and subsequent risk in urban areas. For the study, data from one of the borehole strainmeter stations located in the most seismically active portion of the area, on the Armutlu Peninsula, was processed using novel computing techniques. 'This allowed us to identify the slow slip signal that presumably occurred at mid-crustal depth level and that is of the same size as the largest such signal ever observed, which occurred along the San Andreas Fault in California', says Dr. Martínez-Garzón, lead-author of the study. During this aseismic slow deformation event, the shallower and typically fully locked part of the Earth's crust responded by producing the highest number of moderate earthquakes in years, indicating an interaction between near-surface and deep crustal deformation. Prof. Marco Bohnhoff, head of the GONAF observatory and a co-author of the study, states: 'How this interaction works remains to be understood in detail. In any case, our results will allow us to better understand and quantify the regional seismic risk, in particular for the 15-million population center of Istanbul in the light of the pending big one'. | Earthquakes | 2,019
January 23, 2019 | https://www.sciencedaily.com/releases/2019/01/190123131717.htm | Scientists reconstruct ancient lost plates under Andes mountains | The Andes Mountains are the longest continuous mountain range in the world, stretching about 7,000 kilometers, or 4,300 miles, along the western coast of South America. | The Andean margin, where two tectonic plates meet, has long been considered the textbook example of a steady, continuous subduction event, where one plate slipped under another, eventually forming the mountain range seen today. In a paper published in the journal Nature, University of Houston researchers show that the formation of the Andean mountain range was more complicated than previous models suggested. "The Andes Mountain formation has long been a paradigm of plate tectonics," said Jonny Wu, assistant professor of geology at UH and a co-author of the paper. When tectonic plates move under the Earth's crust and enter the mantle, they do not disappear. Rather, they sink toward the core, like leaves sinking to the bottom of a lake. As these plates sink, they retain some of their shape, offering glimpses of what the Earth's surface looked like millions of years ago. These plate remnants can be imaged, similar to the way CT scans allow doctors to see inside of a patient, using data gleaned from earthquake waves. "We have attempted to go back in time with more accuracy than anyone has ever done before. This has resulted in more detail than previously thought possible," Wu said. "We've managed to go back to the age of the dinosaurs." The paper describes the deepest and oldest plate remnants reconstructed to date, with plates dating back to the Cretaceous Period. "We found indications that when the slab reached the transition zone, it created signals on the surface," said Yi-Wei Chen, a PhD geology student in the UH College of Natural Sciences and Mathematics and first author on the paper. A transition zone is a discontinuous layer in the Earth's mantle, one which, when a sinking plate hits it, slows the plate's movement, causing a build-up above it. In addition to Wu and Chen, John Suppe, Distinguished Professor of Earth and Atmospheric Sciences at UH, is a co-author on the paper. The researchers also found evidence for the idea that, instead of a steady, continuous subduction, at times the Nazca plate was torn away from the Andean margin, which led to volcanic activity. To confirm this, they modeled volcanic activity along the Andean margin. "We were able to test this model by looking at the pattern of over 14,000 volcanic records along the Andes," Wu said. The work was conducted as part of the UH Center for Tectonics and Tomography, which is directed by Suppe. "The Center for Tectonics and Tomography brings together experts from different fields in order to relate tomography, which is the imaging of the Earth's interior from seismology, to the study of tectonics," Wu said. "For example, the same techniques we use to explore for these lost plates are adapted from petroleum exploration techniques." | Earthquakes | 2,019
January 22, 2019 | https://www.sciencedaily.com/releases/2019/01/190122115029.htm | 'Silent slip' along fault line serves as prelude to big earthquakes, research suggests | Big earthquakes appear to follow a brief episode of "shallow mantle creep" and "seismic swarms," suggests new research at Oregon State University that offers an explanation for the foreshocks observed prior to large temblors. | Published today in Nature Geoscience, the research involved the Blanco Transform Fault off the coast of Oregon; a transform fault is a plate boundary at which the motion is mainly horizontal. Under the sea, transform faults connect offset mid-ocean "spreading centers," places at seafloor ridges where new oceanic crust is formed through volcanic activity and gradually moves away from the ridge. "Slow slip directly triggers seismic slip -- we can see that," said co-corresponding author Vaclav Kuna, a graduate student in geology and geophysics in OSU's College of Earth, Ocean and Atmospheric Sciences. "The findings are very interesting and may have some broader implications for understanding how these kinds of faults and maybe other kinds of faults work." Researchers deployed 55 seismometers on the ocean bottom on and around the Blanco fault for a year. "It's a very seismically active fault that generates significant earthquakes at higher rates than the majority of faults on land, making it ideal for studying the process of earthquake generation," Kuna said. The seismometer deployment -- from September 2012 to October 2013 -- resulted in the detection of more than 1,600 earthquakes at the Blanco Ridge, a 130-kilometer segment of the Blanco fault that served as the study area. Two distinct asperities -- basically rough edges -- along the ridge rupture roughly every 14 years with quakes in the magnitude 6 range. "Our work was enabled by recent advances in long-term ocean-bottom seismometer deployments and is only the second major project targeting an oceanic transform fault," said co-corresponding author John Nabelek, professor of geology and geophysics at OSU. At its southernmost point, the Blanco Transform Fault is about 100 miles from Cape Blanco, Oregon's westernmost location, and the fault runs northwest to a point about 300 miles from Newport. The Cascadia Subduction Zone, a fault that extends from British Columbia to northern California, lies between the Blanco fault and the coastline. The fault was the site of a magnitude 9 earthquake in 1700 and is building up stress where the Juan de Fuca Plate is sliding underneath the North American Plate. Some scientists predict a 40 percent chance of another magnitude 9 or bigger quake occurring along the fault in the next 50 years. "The Blanco fault is only 400 kilometers offshore," Nabelek said. "A slip on Blanco could actually trigger a Cascadia Subduction slip; it would have to be a big one, but a big Blanco quake could trigger a subduction zone slip." The Earth is put together in layers beneath the crust, the outermost skin, which varies in thickness from about 40 miles (continental crust at mountain ranges) to about 2 miles (oceanic crust at mid-ocean ridges). The boundary between the crust and the next layer, the upper mantle, is known as the Moho. "We see slow, aseismic slips that occur at depth in the fault beneath the Moho and load the shallower part of the fault," Nabelek said. "We can see a relationship between mantle slip and crust slip. The slip at depth most likely triggers the big earthquakes.
The big ones are preceded by foreshocks associated with creep." Kuna explains that the layers have different levels of seismic "coupling," the ability of a fault to lock at asperities and accumulate stress. "The crust is fully coupled -- all slip is released in a seismic way," Kuna said. "Fault in the shallow mantle is partly coupled, partly not, and releases slip both seismically and aseismically. The deep mantle is fully creeping, uncoupled, with no earthquakes. But the fault is loaded by this creep from beneath -- it's all driven from beneath. Our results also show that an aseismic fault slip may trigger earthquakes directly, which may have implications for active faults on land." The National Science Foundation supported this research. | Earthquakes | 2,019
January 16, 2019 | https://www.sciencedaily.com/releases/2019/01/190116115518.htm | Nepal earthquake: Waiting for the complete rupture | In April 2015, Nepal -- and especially the region around the capital city, Kathmandu -- was struck by a powerful tremor. An earthquake with a magnitude of 7.8 destroyed entire villages, traffic routes and cultural monuments, with a death toll of some 9,000. | However, the country may still face the threat of much stronger earthquakes with a magnitude of 8 or more. This is the conclusion reached by a group of earth scientists from ETH Zurich based on a new model of the collision zone between the Indian and Eurasian Plates in the vicinity of the Himalayas. Using this model, the team of ETH researchers working with doctoral student Luca Dal Zilio, from the group led by Professor Taras Gerya at the Institute of Geophysics, has now performed the first high-resolution simulations of earthquake cycles in a cross-section of the rupture zone. "In the 2015 quake, there was only a partial rupture of the major Himalayan fault separating the two continental plates. The frontal, near-surface section of the rupture zone, where the Indian Plate subducts beneath the Eurasian Plate, did not slip and remains under stress," explains Dal Zilio, lead author of the study, which was recently published in the journal Nature Communications. Normally, a major earthquake releases almost all the stress that has built up in the vicinity of the focus as a result of displacement of the plates. "Our model shows that, although the Gorkha earthquake reduced the stress level in part of the rupture zone, tension actually increased in the frontal section close to the foot of the Himalayas. The apparent paradox is that 'medium-sized' earthquakes such as Gorkha can create the conditions for an even larger earthquake," says Dal Zilio. Tremors of the magnitude of the Gorkha earthquake release stress only in the deeper subsections of the fault system over lengths of 100 kilometres. In turn, new and even greater stress builds up in the near-surface sections of the rupture zone. According to the simulations performed by Dal Zilio and his colleagues, two or three further Gorkha quakes would be needed to build up sufficient stress for an earthquake with a magnitude of 8.1 or more. In a quake of this kind, the rupture zone breaks over the entire depth range, extending up to the Earth's surface and laterally -- along the Himalayan arc -- for hundreds of kilometres. This ultimately leads to a complete stress release in this segment of the fault system, which extends to some 2,000 kilometres in total. Historical data shows that mega events of this kind have also occurred in the past. For example, the Assam earthquake in 1950 had a magnitude of 8.6, with the rupture zone breaking over a length of several hundred kilometres and across the entire depth range. In 1505, a giant earthquake struck with sufficient power to produce an approximately 800-kilometre rupture on the major Himalayan fault. "The new model reveals that powerful earthquakes in the Himalayas have not just one form but at least two, and that their cycles partially overlap," says Edi Kissling, Professor of Seismology and Geodynamics. Super earthquakes might occur with a periodicity of 400 to 600 years, whereas "medium-sized" quakes such as Gorkha have a recurrence time of up to a few hundred years.
As the cycles overlap, the researchers expect powerful and dangerous earthquakes to occur at irregular intervals. However, they cannot predict when another extremely large quake will next take place. "No one can predict earthquakes, not even with the new model. However, we can improve our understanding of the seismic hazard in a specific area and take appropriate precautions," says Kissling. The two-dimensional and high-resolution model also includes some research findings that were published after the Gorkha earthquake. To generate the simulations, the researchers used the Euler mainframe computer at ETH Zurich. "A three-dimensional model would be more accurate and would also allow us to make statements about the western and eastern fringes of the Himalayas. However, modelling the entire 2,000 kilometres of the rupture zone would require enormous computational power," says Dal Zilio. | Earthquakes | 2,019
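The "two or three further Gorkha quakes" estimate in the article above can be sanity-checked with the standard Hanks-Kanamori moment-magnitude relation. This is a back-of-envelope check that loosely equates accumulated stress with released seismic moment, not the authors' numerical simulation:

```python
def seismic_moment(mw):
    """Seismic moment in newton-metres from moment magnitude
    (Hanks & Kanamori): M0 = 10**(1.5 * Mw + 9.1)."""
    return 10 ** (1.5 * mw + 9.1)

ratio = seismic_moment(8.1) / seismic_moment(7.8)
print(f"an M8.1 releases {ratio:.1f}x the moment of an M7.8 Gorkha-size quake")
# Prints ~2.8x: roughly two to three Gorkha-size accumulations of moment
# deficit would be needed to feed a single M>=8.1 rupture, consistent
# with the simulation result quoted in the article.
```

The exponent of 1.5 is why magnitude differences are deceptive: three-tenths of a magnitude unit corresponds to nearly a factor of three in released energy.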
January 10, 2019 | https://www.sciencedaily.com/releases/2019/01/190110101424.htm | New computer modeling approach could improve understanding of megathrust earthquakes | Years before the devastating Tohoku earthquake struck the coast of Japan in 2011, the Earth's crust near the site of the quake was starting to stir. Researchers at The University of Texas at Austin are using computer models to investigate if tiny tremors detected near this site could be connected to the disaster itself. | The research could help enhance scientists' understanding of forces driving megathrust earthquakes -- the world's most powerful type of earthquake -- and improve earthquake hazard assessment. The study was published on Dec. 15, 2018, in Earth and Planetary Science Letters. Lead author Thorsten Becker, a professor at the UT Jackson School of Geosciences and researcher at the University of Texas Institute for Geophysics, said that this was the first comprehensive study showing changes in barely perceptible tremor activity before the Tohoku megathrust earthquake. "The part of the crust that is close to the place that eventually ruptured changes stress state a couple of years before the event," said Becker. "By demonstrating this, our work complements studies of crustal deformation and our understanding of the forces driving earthquakes." The Institute for Geophysics is a research unit of the Jackson School of Geosciences. While the location of the tremors raises questions about their potential linkage to the quake, Becker said that it's unknown at the moment if the two events relate. However, the seismic signature of the tremors is helping refine a computer model that could help untangle the connection. This new modeling technique allows scientists to create a four-dimensional image of the Earth's crust and interactions between tectonic plates, showing how forces pushing at the fault change over time. Once the seismic data was input, the model matched observations of how the plate deformed in the years before and after the earthquake. This allowed the scientists to make inferences about the kind of forces taking place at the plate boundary, the point where one plate dives into the Earth's hot, viscous mantle. In this semi-molten layer, solid rocks ooze and behave in unexpected ways, so understanding the dynamics of the layer could help identify the connection between pressure along a fault before and after a major earthquake. The new research is significant because the model was originally developed using a different dataset: geodetic information about the shape of the Earth's surface. By gaining similar results using different data sets -- seismic waves and changes in the planet's shape -- scientists can be much more confident about the accuracy of earthquake models. Becker believes that with the right research and support, advanced computer models can be used to study the physics of earthquakes and perhaps contribute to improved forecasts. Currently, scientists can at best offer hazard maps showing known earthquake zones and a vague probability of an earthquake in the coming decades.
Knowing more about when and where such a quake might strike, even within a few years, would represent a significant improvement on current earthquake forecasting and perhaps allow authorities and industry adequate time to prepare for such an event. To this end, the authors hope their study will contribute to global efforts to improve earthquake hazard assessment, such as the Modeling Collaboratory for Subduction RCN, a new UT-led research collaboration network funded by the National Science Foundation (NSF). The study was supported by the NSF and the Japanese Ministry of Education, Culture, Sports, Science and Technology. | Earthquakes | 2,019
January 9, 2019 | https://www.sciencedaily.com/releases/2019/01/190109114806.htm | Following Nepal's devastating 2015 earthquake, crisis in childhood malnutrition averted | Despite widespread destruction, including severe agricultural-related losses caused by the 2015 earthquake in Nepal, child nutrition remained stable in the hardest hit areas, a new study finds. | A team of researchers from the Johns Hopkins Bloomberg School of Public Health and Tufts University found that indicators of childhood malnutrition improved or remained stable a year after the earthquake hit. The study was published in November in PLOS ONE. The 2015 earthquake in Nepal caused over $7.1 billion in losses to infrastructure and economic production and killed nearly 9,000 people and injured nearly 22,000. Extensive losses related to food production and delivery raised concerns that childhood undernutrition could worsen. To prevent this, substantial humanitarian relief efforts were spearheaded by the government of Nepal, the United Nations and NGOs. "Our findings raise the question of whether the improved nutrition and food security situation in these areas a year after the earthquake may have been due to the aid pouring into these areas, the ongoing nutrition and agriculture aid programs that were already operating in the area, or the resilience of these communities," says lead author Andrew Thorne-Lyman, ScD, associate scientist in the Bloomberg School's Department of International Health. "The most important thing is, according to multiple indicators, a nutrition crisis was averted." For their study, researchers analyzed data from the same hardest-hit districts in Nepal, both in 2014 and 2016. In both years, data were collected from over 900 households and approximately 2,000 women and children. The study found that child wasting -- a term that describes weight and body-mass loss -- declined significantly in the study areas, from 4.5 percent to 2.1 percent. Child wasting is an indicator of a child's thinness that is often used to evaluate a post-disaster nutrition situation. Children who show signs of wasting are also at an elevated risk of death. The researchers also found that child stunting -- being too short for their age -- declined from 23.1 to 21.6 percent in the study period, although the decline was not statistically significant. Stunting is associated with cognitive impairments such as delayed motor development, impaired brain function and poor school performance. Food insecurity in the study region decreased significantly from 17.6 percent to 12.4 percent between 2014 and 2016. Food insecurity is broadly defined as not having sufficient access to safe and nutritious foods needed for a healthy life. To assess this, researchers used the Household Food Insecurity Access Scale, which has been tested in multiple international studies, and asks a series of questions reflecting different dimensions of food security including access to food. Researchers also asked household members to recall any shocks experienced the previous year, including death of a family member, structural damage to their house and crop and livestock losses. These statistics were compared for 2016 against 2014 data from the same study. As expected, due to the quake, losses were much more severe in 2016 than in 2014. For example, while fewer than 2 percent of households reported damage to their house in 2014, nearly half did so in 2016.
Crop and animal losses increased to 20 percent and there were twice as many deaths reported in households in 2016 compared to 2014. "While the nutritional well-being of children appears to have been spared, it's clear from our data that households experienced significant damages and trauma due to the earthquake, including structural damages, crop and animal loss and loss of life. A year later, many were still rebuilding their lives," said Keith P. West Jr., DrPH, the George G. Graham Professor in International Health at the Bloomberg School and senior author of the study. Researchers were able to make before-and-after comparisons in this study because the data are part of a larger project called the Policy and Science for Health, Agriculture and Nutrition (PoSHAN) Community surveys. These surveys are nationally representative and assess household agricultural practices, socioeconomic status, food security, and diet, health, and nutritional status of preschool children and women of reproductive age. Authors of this study limited their analysis to data from PoSHAN districts that the government of Nepal categorized as "earthquake-affected" to focus on the effects of the earthquake. "Nutritional resilience in Nepal following the earthquake of 2015" was written by Andrew L. Thorne-Lyman, Angela K. C., Swetha Manohar, Binod Shrestha, Bareng A. S. Nonyane, Sumanta Neupane, Shiva Bhandari, Rolf D. Klemm, Patrick Webb, Keith P. West Jr. The study was funded by the United States Agency for International Development through the Feed the Future Innovation Lab for Nutrition [USAID grant number AID-OAA-L-1-00005]. | Earthquakes | 2,019
January 7, 2019 | https://www.sciencedaily.com/releases/2019/01/190107102823.htm | Deep low-frequency earthquakes indicate migration of magmatic fluids beneath Laacher See Volcano | Magma could rise from the upper mantle into the middle and upper crust beneath the Laacher See Volcano (Rhineland-Palatinate). This is the result of a study conducted by the Seismological Survey of Southwest Germany (Erdbebendienst Südwest), together with GFZ German Research Centre for Geosciences, Karlsruhe Institute of Technology (KIT) and the Seismological Survey of North Rhine-Westphalia. For the first time, the scientists present evidence of deep and low-frequency earthquakes caused by magma movements under the Laacher See Volcano. However, there are presently no signs of any upcoming volcanic activity in the near future. The researchers report their findings in the Geophysical Journal International. | "The detected earthquakes are generated at large depths and are characterized by unusually low frequencies. Their magnitudes are below the limit of human perception," explains Professor Joachim Ritter of the Geophysical Institute (GPI) at KIT. The scientists speak of "deep low-frequency" (DLF) earthquakes. They are generated at depths between ten and over forty kilometres, i.e. in the Earth's crust and upper mantle. Their dominant oscillation frequencies are between one and ten Hertz, which is significantly lower than those of tectonic earthquakes of comparable magnitude. "DLF earthquakes are regarded worldwide as an indication of the movement of magmatic fluids at great depths," explains Professor Torsten Dahm, head of the GFZ section Physics of Earthquakes and Volcanoes. "Such earthquakes can be observed regularly beneath active volcanoes, for example in Iceland, Japan or Kamchatka." The results of the study in the East Eifel region suggest that magmatic fluids from the upper mantle of the Earth could rise into the Earth's crust under the Laacher See Volcano. This can be interpreted as an indication for the existence and ongoing slow recharge of magma chambers in the crust beneath the volcano. In their study, the scientists of KIT, GFZ, Erdbebendienst Südwest -- the joint seismological services of Rhineland-Palatinate and Baden-Württemberg -- and the State Seismological Service of North Rhine-Westphalia determined that these earthquakes occur episodically in groups that are narrowly limited in time and space and line up along a line between 10 and 45 kilometres in depth. The scientists conclude that fluids and magmas, i.e. molten rock, could rise from the upper mantle into the middle and upper crust of the earth beneath the Laacher See Volcano.
In addition to the spatial separation, the temporal occurrence of the DLF earthquakes is also sharply limited: So far, the experts have observed eight episodes of DLF earthquakes lasting between 40 seconds and eight minutes. However, the researchers do not interpret the observed DLF earthquakes as an immediate precursor signal of any upcoming volcanic activity in the near future. "The rise of magma into the shallow crust is usually accompanied by swarms of high-frequency earthquakes. Such activity has not yet been observed in the Eastern Eifel," reports Joachim Ritter. "Further, there is no sign of deformation at the Earth's surface, which should be clearly detectable during massive magma ascents," adds Torsten Dahm. Dating of the magma produced during the last eruption 12,900 years ago shows that the filling and differentiation of the upper magma chamber under Laacher See Volcano could have taken about 30,000 years before the actual eruption took place. This means that the magmatic processes take an extremely long time before an eruption occurs. As the technical requirements for the detection and localization of DLF earthquakes in the East Eifel region have only reached a sufficient quality in the last few years, it is not possible to determine retrospectively since when DLF earthquakes have occurred in the region. It can be assumed that this was already the case before 2013. Following the first observation of deep earthquakes in 2013, KIT, GFZ and Erdbebendienst Südwest installed a seismological research network. The joint use of seismic registrations now allows detailed scientific analysis of microseismicity. In order to better investigate the interrelation of DLF earthquakes and possible magmatic activity beneath the East Eifel region, the researchers recommend intensifying geochemical monitoring to analyse emitted gases, as well as repeated geodetic measurements to detect possible deformations of the earth's surface. Specific geophysical investigations should also be conducted to map and characterize possible magma reservoirs under the Laacher See Volcano. Further, the scientists advise a reassessment of the volcanic hazard of the Eifel. | Earthquakes | 2,019
December 18, 2018 | https://www.sciencedaily.com/releases/2018/12/181218100413.htm | Oroville Dam earthquakes in February 2017 related to spillway discharge | A closer look at small earthquakes that took place at the Oroville Dam in California's Sierra Nevada foothills in February 2017 -- near the time when the dam's spillway failed -- suggests that the seismic activity was related to reservoir discharge that opened and closed fractures in the rock below the spillway. | It seems likely that fluid leaking through cracks in the main spillway altered the pressure on the underlying rock fractures, causing them to slowly open and then slam shut, over and over, according to the report in the Bulletin of the Seismological Society of America. Researchers at the U.S. Geological Survey were able to detect more than 19,000 very small seismic events that occurred over the past 25 years, apparently as the result of these fractures opening and closing underneath the spillway. The seismic events did not cause the failure of the dam's main spillway in mid-February 2017, said USGS seismologist Robert Skoumal, who led the research team. "These seismic events were not directly associated with the spillway failure. Although experts determined that water leaking through spillway cracks led to the failure, we think this leaking water also caused the seismic events," explained Skoumal. "These seismic events occurred for decades prior to failure, and continued to occur after the failure itself." Skoumal and his colleagues took a closer look at seismic signals located near the dam after two small earthquakes of magnitude 0.8 and 1.0 occurred on 14 February 2017. After unusually heavy rainfall in the area and record-breaking water levels in the Oroville reservoir, parts of the Oroville Dam main spillway crumbled between 7 and 12 February, prompting the evacuation of nearly 188,000 people in the Oroville, California area. Using a technique called template matching to search for events similar to the February 2017 magnitude 1.0 earthquake, the researchers detected more than 19,000 tiny seismic events at the dam that occurred between May 1993 and April 2018. The February 2017 earthquakes were intriguing to Skoumal and his colleagues in light of a 1975 magnitude 5.7 earthquake sequence that was previously linked to the filling of the Oroville reservoir. "We wanted to know if these recent earthquakes could also be related to rapid changes in reservoir level and determine if these events might be precursors to larger magnitude earthquakes," Skoumal said. The day after the magnitude 0.8 and 1.0 earthquakes, which were too small to be felt by humans, the researchers completed a preliminary analysis, "and very quickly we saw that there were tons of these kinds of earthquakes, located close to the spillway, and they did not look like natural events," Skoumal added. Earthquakes as small as the Oroville events are often not detected by routine seismic monitoring, so the researchers turned to a technique called template matching to search for other seismic signals from the reservoir area. "With this approach, we treat the waveform from a successfully identified earthquake like a 'fingerprint,'" explained Skoumal. "We take that 'fingerprint' and scan through decades of data looking for signals that look similar, just with smaller amplitude." The technique works well with earthquake swarms -- a lot of earthquakes that occur over a short time period.
Swarms can occur naturally, near volcanoes or within tectonic subduction zones, but they are also common in cases of seismicity induced by human activity, said Skoumal. "We were seeing thousands of small events, which abruptly turned on and turned off, so we knew there was some kind of trigger that was causing these events," he said. "But it was a puzzle at that point. If it was induced, we thought it would be related to the reservoir itself, because there are many examples of reservoirs inducing earthquakes." However, when the scientists compared the seismic data to the water levels in the reservoir, there was only a weak correlation between the tiny earthquakes and high water levels. Instead, data on Oroville spillway usage from the California Department of Water Resources turned out to fit the seismic data perfectly. The small earthquakes occurred during times of outflow from the reservoir over the spillway, and they were not related to the processes responsible for the 1975 earthquake sequence. Spillway discharge leaking through cracks in the spillway likely created rapid changes in fluid pressure along fractures in the weathered rock underlying the spillway. The researchers concluded this process could have caused the fractures to repeatedly open and close and generate the seismic swarms. "Will the seismic events return when they resume usage of the spillway, which has since been repaired, or are they gone for good? Depending on the amount of rainfall we get, we might be finding this out in the next year or two," he said. | Earthquakes | 2,018
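The "fingerprint" scanning Skoumal describes in the article above is, at its core, normalized cross-correlation of a template waveform against continuous data. The sketch below illustrates that core step only; it is not the USGS production workflow, which also stacks multiple stations and channels and uses FFT-based correlation for speed:

```python
import numpy as np

def match_template(trace, template, threshold=0.8):
    """Slide a template waveform along a continuous trace and return the
    sample offsets where the normalized cross-correlation exceeds the
    threshold: candidate repeats of the 'fingerprint' event, possibly
    with much smaller amplitude (correlation ignores overall scale)."""
    n = len(template)
    tpl = (template - template.mean()) / (template.std() * n)
    detections = []
    # O(N*n) loop for clarity; real systems correlate via FFT.
    for i in range(len(trace) - n + 1):
        win = trace[i:i + n]
        std = win.std()
        if std == 0:
            continue  # dead channel segment, nothing to correlate
        cc = np.sum(tpl * (win - win.mean()) / std)
        if cc > threshold:
            detections.append(i)
    return detections
```

Because the correlation coefficient is amplitude-independent, one well-recorded magnitude 1.0 template can flag near-identical events far below the detection floor of routine monitoring, which is how 19,000 tiny events were recovered from 25 years of data.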
December 18, 2018 | https://www.sciencedaily.com/releases/2018/12/181218093045.htm | Machine learning-detected signal predicts time to earthquake | Machine-learning research published in two related papers today in Nature Geoscience identifies a signal that predicts the time remaining before fault failure. | Los Alamos National Laboratory researchers applied machine learning to analyze Cascadia data and discovered the megathrust broadcasts a constant tremor, a fingerprint of the fault's displacement. More importantly, they found a direct parallel between the loudness of the fault's acoustic signal and its physical changes. Cascadia's groans, previously discounted as meaningless noise, foretold its fragility. "Cascadia's behavior was buried in the data. Until machine learning revealed precise patterns, we all discarded the continuous signal as noise, but it was full of rich information. We discovered a highly predictable sound pattern that indicates slippage and fault failure," said Los Alamos scientist Paul Johnson. "We also found a precise link between the fragility of the fault and the signal's strength, which can help us more accurately predict a megaquake." The new papers were authored by Johnson, Bertrand Rouet-Leduc and Claudia Hulbert from the Laboratory's Earth and Environmental Sciences Division, Christopher Ren from the Laboratory's Intelligence and Space Research Division and collaborators at Pennsylvania State University. Machine learning crunches massive seismic data sets to find distinct patterns, using self-adjusting algorithms to build decision trees that pose and retest a series of questions about the data. Last year, the team simulated an earthquake in a laboratory, using steel blocks interacting with rocks and pistons, and recorded sounds that they analyzed by machine learning. They discovered that the numerous seismic signals, previously discounted as meaningless noise, pinpointed when the simulated fault would slip, a major advance towards earthquake prediction. Faster, more powerful quakes had louder signals. The team decided to apply their new paradigm to the real world: Cascadia. Recent research reveals that Cascadia has been active, but noted activity has been seemingly random. This team analyzed 12 years of real data from seismic stations in the region and found similar signals and results: Cascadia's constant tremors quantify the displacement of the slowly slipping portion of the subduction zone. In the laboratory, the authors identified a similar signal that accurately predicted a broad range of fault failure. Careful monitoring in Cascadia may provide new information on the locked zone and support an early warning system. | Earthquakes | 2,018
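The general recipe described in the article above is to compute statistical features of windows of the continuous signal and regress them against the time remaining before failure with an ensemble of decision trees. The sketch below uses synthetic data and scikit-learn's random forest; it shows the shape of the approach, not the Los Alamos code, and the feature set and signal model are illustrative assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def features(window):
    """Statistical summaries of a window of continuous signal: the kind
    of 'noise' descriptors this line of work found to be predictive."""
    return [window.var(), np.abs(window).mean(),
            np.percentile(window, 95), window.max() - window.min()]

# Synthetic stand-in: tremor amplitude grows as failure approaches.
X, y = [], []
for time_to_failure in np.linspace(10.0, 0.1, 400):
    w = rng.normal(scale=1.0 / time_to_failure, size=1000)
    X.append(features(w))
    y.append(time_to_failure)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(np.array(X[:300]), y[:300])          # train on earlier windows
print(model.predict(np.array(X[300:305])))     # predicted time to failure
```

The key property, mirrored here, is that each prediction uses only the instantaneous statistics of the signal, with no knowledge of when the last failure occurred.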
December 11, 2018 | https://www.sciencedaily.com/releases/2018/12/181211133317.htm | Historic earthquakes test Indonesia's seismic hazard assessment | Using data gleaned from historical reports, researchers have now identified the sources of some of the most destructive Indonesian earthquakes in Java, Bali and Nusa Tenggara, using these data to independently test how well Indonesia's 2010 and 2017 seismic hazard assessments perform in predicting damaging ground motion. The study was published in the Bulletin of the Seismological Society of America. | Indonesia has made earthquake risk prediction a priority after the magnitude 9.1 Sumatra-Andaman megathrust earthquake and tsunami in 2004, but to date most of the research on regional earthquake hazard has concentrated on Sumatra, at the expense of studies further east in Java, said Jonathan Griffin of Geoscience Australia and colleagues. More than 57 percent (about 140 million people) of Indonesia's population lives in Java, "on a relatively small island roughly the same area as New York State, or the North Island of New Zealand, that faces many natural hazards," explained Griffin. "Getting the hazard levels right to underpin building codes is therefore critically important for a huge number of people, particularly combined with rapid economic growth and urbanization in Indonesia." Probabilistic seismic hazard assessment, or PSHA, is a method that calculates the likelihood that certain levels of earthquake-related ground shaking will exceed a specific intensity at a given location and time span. PSHA calculations are based on data from earthquakes detected by seismographs, however, so some of the largest and most damaging earthquakes in a region may not be included in the assessments if they occurred before instrumentation in a region. Griffin and colleagues analyzed historical catalogs and accounts of earthquakes in Java, Bali and Nusa Tenggara from 1681 to 1877, to determine the source and shaking intensity for some of the region's historically destructive quakes. The most significant tectonic feature of the Indonesian region is the collision and subduction of the Indian and Australian tectonic plates under the Sunda and Burma tectonic plates, generating megathrust earthquakes like the 2004 Sumatra quake. However, the researchers found little evidence for the occurrence of large earthquakes on the Java Megathrust fault during the historic time period they studied. Instead, they concluded that large intraslab earthquakes (earthquakes that occur within a subducting tectonic plate) were responsible for some of Java's most damaging historic quakes, including a magnitude 7.4 earthquake near Jakarta in 1699 and a magnitude 7.8 quake in Central Java in 1867. The researchers also noted a cluster of large earthquakes occurring on the Flores Thrust to the east of Java in 1815, 1818 and 1820, as well as earthquakes on shallow crustal faults on Java that had not been mapped previously. The Flores Thrust was responsible for two magnitude 6.9 earthquakes in Lombok in August 2018 that together killed more than 500 people. Intraslab earthquakes are well-known in the region, including recent events such as the magnitude 7.6 quake in West Sumatra and the magnitude 7.0 quake in West Java that together killed more than 1000 people in 2009, said Griffin.
"However we were surprised that we didn't find conclusive evidence for a large megathrust event during the time period we examined."Although it can be difficult to distinguish between megathrust and intraslab earthquakes using the data analyzed by the researchers, Griffin said that the data he and his team analyzed fit better with an intraslab model. "So while the intraslab models fit the data better for earthquakes in 1699 and 1867, we also rely on an absence of tsunami observations from coastal locations where ground shaking damage was reported to make the case that intraslab events were the more likely source," he added."The absence of strong historical evidence for a large megathrust earthquake south of Java over the past 350 years is a really interesting problem," said Griffin. Javanese and Dutch population centers "were historically on the north coast facing the calmer Java Sea, so we only have limited data from the less hospitable south coast. So it's quite likely that smaller megathrust earthquakes have occurred that aren't captured well in the historical records, but we'd be surprised if a really large earthquake went unnoticed."Previous research suggests that that the length of time between earthquakes on the Sumatran megathrust varies considerably, said Griffin. "So the lack of large megathrust events south of Java over the past few centuries could just imply that we have been in a period of relative inactivity, but not that large earthquakes occur less frequently here on average over the long-term." | Earthquakes | 2,018 |
December 11, 2018 | https://www.sciencedaily.com/releases/2018/12/181211122442.htm | Alaska earthquakes offer new insight into improving hazard assessment | The 2016 Iniskin earthquake (magnitude 7.1) that shook Anchorage, Alaska, was captured by the seismometers of the EarthScope Transportable Array. This data is helping Geoff Abers, a professor at Cornell University's Department of Earth and Atmospheric Sciences, and Michael Mann, a graduate student in his group, explore answers to fill crucial gaps in understanding intra-slab earthquakes. Their work may provide insight into the November 30, 2018 magnitude 7.0 earthquake near Anchorage. It could also help improve earthquake hazard assessments in the future. | Intra-slab earthquakes usually occur deep in the earth, within tectonic plates descending into the mantle at subduction zones. Because they are so deep, intra-slab earthquakes can be large magnitude and felt over a broad area; however, they usually don't exhibit strong seismic wave acceleration or ground motion since the fault causing the earthquake is deep. Iniskin was different. The Iniskin earthquake originated within the Pacific Plate, which is slowly being forced under the North American Plate. "The Iniskin earthquake was 125 km deep, but caused some very high ground motion that was felt and recorded in Anchorage, and in particular where there's a dense network of accelerometers. When it occurred in 2016 it was actually the strongest ground shaking in Anchorage since the great 1964 earthquake that destroyed half the town," said Abers in early November, before the damaging earthquake of November 30. The Iniskin earthquake was more than 270 km from Anchorage. The magnitude 7.0 earthquake that occurred on November 30 was also an intra-slab earthquake within the subducting Pacific Plate, but it was only 44 km deep and only a few miles from Anchorage. It produced extensive damage, and while initial data are still being processed, it serves as a reminder of the hazard posed by this kind of earthquake. The Iniskin earthquake provided a prime opportunity to study the mechanics of an intra-slab earthquake, and how local geology can dramatically change the earthquake's effects. The Iniskin earthquake rattled Anchorage shortly after seismometers from EarthScope's Transportable Array were installed in southcentral Alaska. The Transportable Array consists of hundreds of seismic stations deployed in a grid; it has leapfrogged its way every two years across swaths of the continent for more than a decade and is currently in Alaska. The grid, with a spacing of about 85 km, covers Alaska from the southeast panhandle to the North Slope. "We are only able to do this study because the Transportable Array installed high-quality, state-of-the-art instruments in many otherwise inaccessible places," Mann said. "We've known for some time that you occasionally do get big intra-slab earthquakes and there's been some concern they are underrepresented globally in hazard estimates in the places in the world where they occur. So this is an opportunity to delve a little bit deeper to try to understand what was going on here," Abers said. What caused such unexpected movement from the Iniskin earthquake?
Abers and Mann think there are two possible factors based on local geology: one is the temperature of the mantle the seismic waves travel through to reach the surface, and another is that the seismic waves can sometimes ricochet between the layers of a subducting tectonic plate. Anchorage sits near the edge of the North American Plate, where the continental plate pushes the Pacific Plate down into the mantle. The plate descending from the surface is cooler, and therefore more solid, than the surrounding mantle, so seismic waves travel faster. "At very low temperatures the earth is like a bell, it just rings and waves can propagate," Abers said. "We could only see that for the Iniskin earthquake because the Transportable Array actually deployed stations for the first time west of Anchorage and north of the Alaska Range." The Transportable Array allowed a comparison of seismic waves the same distance from the source of the earthquake, but in different directions. North of the Alaska Range, where the distance from the subduction zone means the crust lies above the mantle, the seismic waves have to travel through hot mantle to get to Anchorage. The heated rock is softer and "mushier," so seismic waves don't travel as quickly as through cooler crust. "Those signals are really small compared to the very large signals you'll see in Anchorage at comparable distances, by a factor of 20 to 50 at the frequencies we care about," said Abers. "These aren't subtle effects." The other possible reason why the Iniskin earthquake shook the ground so much has to do with the local structure of the crust. Abers and Mann found that at some frequencies, seismic waves seemed to be amplified. The crust is made up of multiple layers of different kinds of rock. If a weaker layer is sandwiched between stronger layers in the sinking crust of the subduction zone, seismic waves may travel up the subducting plate and be caught in the sandwiched layer, bouncing back and forth and amplifying the wave's energy. "We've known about this problem for a while, but it hasn't really made it into how hazards are assessed from these earthquakes very clearly, because we haven't worked out how to determine the parameters," said Abers. In places such as Anchorage, earthquake hazard assessment may need to include information about the deep earth, tens of miles down, and not just the near-surface geology. Puzzling out what happened during the Iniskin earthquake and possibly the November 30 one, and having good data coverage to compare the earthquakes from different locations, is a step forward for improving hazard assessment for intra-slab earthquakes in the future. | Earthquakes | 2,018
November 26, 2018 | https://www.sciencedaily.com/releases/2018/11/181126105503.htm | A large volcanic eruption shook Deception Island 3,980 years ago | A large volcanic eruption shook Deception Island, in Antarctica, 3,980 years ago, and not 8,300, as was previously thought, according to an international study. | Researchers from the Universities of Barcelona (UB), Salamanca (USAL), Cambridge and Leicester (UK), CREAF, the Centre for Research, Monitoring and Evaluation of the Sierra de Guadarrama National Park, and the Centre for Hydrographic Studies (CEDEX) took part in the study, whose first author was Dermot Antoniades of the University of Laval, Canada. According to the age published in this new study, a caldera-collapse volcanic eruption took place 3,980 years ago. During this violent eruptive event, the emptying of the magma chamber -- the zone of magma accumulation that fuelled the eruption -- caused a sudden pressure drop, which in turn caused the collapse of the upper part of the volcano. As a result, a depression between 8 and 10 kilometres in diameter was formed, which today gives Deception Island its particular horseshoe shape. The caldera collapse would have caused a seismic event of great magnitude whose trace was recorded in the bottom sediments of lakes on Livingston Island. The lacustrine sediment cores were recovered during the Antarctic campaigns of the HOLOANTAR project, between 2012 and 2014. This fieldwork was led and coordinated by Marc Oliva, then a researcher at the Instituto de Geografia e Ordenamento of the University of Lisboa and now a Ramon y Cajal researcher at the University of Barcelona (UB). Oliva is a coauthor of this study. "The initial objective of the study was purely climatic, since we wanted to reconstruct the climate fluctuations of this region for the last 11,700 years using different proxies found in the sediments of the Byers Peninsula lakes, about 40 kilometres north of Deception Island. However, the presence of a distinct sediment layer of the same age in all lakes, lying above a thick layer of tephra, surprised us," said Sergi Pla, researcher at CREAF and coauthor of the study. "Later geochemical and biological analyses indicated to us that these sediments had a terrestrial origin and were deposited abruptly on the lake bottoms. These results suggested the occurrence of a major earthquake that affected this whole area, and put us on the track that, perhaps, we were not facing a common earthquake but one generated by the collapse of the caldera of the Deception Island volcano. From there on, we pulled the thread," explained Santiago Giralt, researcher at ICTJA-CSIC and co-author of the study. The exact date of the eruption was obtained using different geochemical, petrological and paleolimnological techniques applied to sediment cores from four lakes on the Byers Peninsula of Livingston Island. These sedimentary records contained several direct and indirect pieces of evidence of the volcanic event that occurred in Deception Island. 
"The recovered sedimentary records showed a common pattern: first the volcanic ash from Deception Island eruption, overlaid by a sediment layer almost one meter thick composed by material dragged from the lakes' shores to their bottom due to the large earthquake and, finally, the common lake sediments, which are characterized by an alternation of clays and mosses," explained Santiago Giralt.One of the challenges that faced this study was to characterize the origin of the ashes produced during the volcanic eruption. For that, pressure and temperature conditions of the magmas that originated this eruption were calculated using the ashes present in the sediment cores. "Using this methodology, we were able to estimate the depth of all studied samples and to determine if they were part of the same magma and eruptive episode," said Antonio Álvarez Valero, researcher from the University of Salamanca (USAL) and co-author of this study.The study also estimates that the eruption had a Volcanic Explosive Index (VEI) around 6 which possibly makes it the largest known Holocene eruptive episode in the Antarctic continent."This colossal episode of eruptive caldera collapse ejected between 30 and 60 cubic kilometres of ash, comparable in volume to the eruption of the Tambora volcano in 1815, an event that is attributed to a global temperature cooling that resulted in a series of bad harvests in Europe, in what is known as the "year without summer," explains Adelina Geyer, ICTJA-CSIC researcher and co-author of the study."It is very important to be able to date this type of eruptions that allow us to understand the climatic changes caused by volcanic eruptions, in this particular case at high austral latitudes," adds the Geyer.As suggested by this study, this eruption could have had significant climatic and ecological impacts in a large area of the southern region, although more studies and new data are needed to precisely characterize what the real effects on the climate of this large eruptive event. | Earthquakes | 2,018 |
November 15, 2018 | https://www.sciencedaily.com/releases/2018/11/181115165023.htm | Climate, life and the movement of continents: New connections | A new study by The University of Texas at Austin has demonstrated a possible link between life on Earth and the movement of continents. The findings show that sediment, which is often composed of pieces of dead organisms, could play a key role in determining the speed of continental drift. In addition to challenging existing ideas about how plates interact, the findings are important because they describe potential feedback mechanisms between tectonic movement, climate and life on Earth. | The study, published Nov. 15, was led by Whitney Behr, a research fellow at the Jackson School and professor at ETH Zurich in Switzerland, and co-authored by Thorsten Becker, a professor at the UT Jackson School of Geosciences and research scientist at its Institute for Geophysics (UTIG). Sediment is created when wind, water and ice erode existing rock or when the shells and skeletons of microscopic organisms like plankton accumulate on the seafloor. Sediment entering subduction zones has long been known to influence geological activity such as the frequency of earthquakes, but until now it was thought to have little influence on continental movement. That's because the speed of subduction was believed to be dependent on the strength of the subducting plate as it bends and slides into the viscous mantle, the semi-molten layer of rock beneath Earth's crust. Continental movement is driven by one plate sinking under another so, in this scenario, the strength of the portion of the plate being pulled into Earth's mantle (and the energy required to bend it) would be the primary control for the speed of the plate movement, with sediment having little effect. However, prior research involving UTIG scientists had shown the subducting plates may be weaker and more sensitive to other influences than previously thought. This led researchers to look for other mechanisms that might impact plate velocity. They estimated how different types of rock might affect the plate interface -- the boundary along which the subducting plate slides beneath the overriding one. Subsequent modelling showed that rock made of sediment can create a lubricating effect between plates, accelerating subduction and increasing plate velocity. This mechanism could set in motion a complex feedback loop (a toy version is sketched after this article). As plate velocity increases, there would be less time for sediment to accumulate, so the amount of subducting sediment would be reduced. This leads to slower subduction, which may allow for mountains to grow at plate boundaries as the force of the two plates running into each other causes uplift. In turn, erosion of those mountains by wind, water and other forces can produce more sediments which feed back into the subduction zone and restart the cycle by increasing the speed of subduction. "The feedback mechanisms serve to regulate subduction speeds such that they don't 'runaway' with extremely fast velocities," said Behr. Behr and Becker's new model also offers a compelling explanation for variations found in plate speed, such as India's dramatic northward acceleration some 70 million years ago. The authors propose that as India moved through equatorial seas teeming with life, an abundance of sedimentary rock formed by organic matter settling on the seafloor created a lubricating effect in the subducting plate. 
India's march north accelerated from a stately 5 centimeters per year (about 2 inches) to an eye-watering 16 centimeters per year (about 6 inches). As the continent accelerated, the amount of sediment being subducted decreased and India slowed before finally colliding with Asia. Behr and Becker suggest these feedback mechanisms would have been very different in the early Earth before the formation of continents and the emergence of life. Although their model does not examine the origins of these feedback mechanisms, it does raise compelling questions about the interaction between continental movement and life on Earth. "What is becoming clear is that the geological history of the incoming plate matters," said Becker, who also holds the Shell Distinguished Chair in Geophysics at UT. "We will have to study in more detail how those possible feedback processes may work." | Earthquakes | 2,018
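The feedback loop described in the article -- faster plates subduct less sediment, thinner sediment slows the plate, slower plates accumulate more sediment -- can be illustrated with a toy iteration. All constants below are invented for the sketch; the real study used a full geodynamic model, not this two-line recursion.

# Toy feedback between plate speed v (cm/yr) and subducting sediment thickness.
v = 12.0                              # start with an artificially fast plate
for step in range(12):
    sediment = 10.0 / v               # faster plate -> less accumulated sediment
    v = 2.0 + 1.5 * sediment          # thicker lubricating layer -> faster plate
    print(f"step {step:2d}: sediment index = {sediment:5.2f}, v = {v:5.2f} cm/yr")
# The iteration settles near v = 5 cm/yr: the loop self-regulates rather than
# "running away" to extreme velocities, as Behr notes.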
November 14, 2018 | https://www.sciencedaily.com/releases/2018/11/181114132013.htm | Seismic study reveals huge amount of water dragged into Earth's interior | Slow-motion collisions of tectonic plates under the ocean drag about three times more water down into the deep Earth than previously estimated, according to a first-of-its-kind seismic study that spans the Mariana Trench. | The observations from the deepest ocean trench in the world have important implications for the global water cycle, according to researchers in Arts & Sciences at Washington University in St. Louis. "People knew that subduction zones could bring down water, but they didn't know how much water," said Chen Cai, who recently completed his doctoral studies at Washington University. Cai is the first author of the study, published Nov. 15. "This research shows that subduction zones move far more water into Earth's deep interior -- many miles below the surface -- than previously thought," said Candace Major, a program director in the National Science Foundation's Division of Ocean Sciences, which funded the study. "The results highlight the important role of subduction zones in Earth's water cycle." "Previous estimates vary widely in the amount of water that is subducted deeper than 60 miles," said Doug Wiens, the Robert S. Brookings Distinguished Professor in Earth and Planetary Sciences in Arts & Sciences and Cai's research advisor for the study. "The main source of uncertainty in these calculations was the initial water content of the subducting uppermost mantle." To conduct this study, researchers listened to more than one year's worth of Earth's rumblings -- from ambient noise to actual earthquakes -- using a network of 19 passive, ocean-bottom seismographs deployed across the Mariana Trench, along with seven island-based seismographs. The trench is where the western Pacific Ocean plate slides beneath the Mariana plate and sinks deep into the Earth's mantle as the plates slowly converge. The new seismic observations paint a more nuanced picture of the Pacific plate bending into the trench -- resolving its three-dimensional structure and tracking the relative speeds of types of rock that have different capabilities for holding water. Rock can grab and hold onto water in a variety of ways. Ocean water atop the plate runs down into the Earth's crust and upper mantle along the fault lines that lace the area where plates collide and bend. Then it gets trapped. Under certain temperature and pressure conditions, chemical reactions force the water into a non-liquid form as hydrous minerals -- wet rocks -- locking the water into the rock in the geologic plate. All the while, the plate continues to crawl ever deeper into the Earth's mantle, bringing the water along with it. Previous studies at subduction zones like the Mariana Trench have noted that the subducting plate could hold water. But they could not determine how much water it held and how deep it went. "Previous conventions were based on active source studies, which can only show the top 3-4 miles into the incoming plate," Cai said. He was referring to a type of seismic study that uses sound waves created with the blast of an air gun from aboard an ocean research vessel to create an image of the subsurface rock structure. "They could not be very precise about how thick it is, or how hydrated it is," Cai said. "Our study tried to constrain that. 
If water can penetrate deeper into the plate, it can stay there and be brought down to deeper depths." The seismic images that Cai and Wiens obtained show that the area of hydrated rock at the Mariana Trench extends almost 20 miles beneath the seafloor -- much deeper than previously thought. The amount of water that can be held in this block of hydrated rock is considerable. For the Mariana Trench region alone, four times more water subducts than previously calculated. These features can be extrapolated to predict the conditions under other ocean trenches worldwide. "If other old, cold subducting slabs contain similarly thick layers of hydrous mantle, then estimates of the global water flux into the mantle at depths greater than 60 miles must be increased by a factor of about three," Wiens said. And for water in the Earth, what goes down must come up. Sea levels have remained relatively stable over geologic time, varying by less than 1,000 ft. This means that all of the water that is going down into the Earth at subduction zones must be coming back up somehow, and not continuously piling up inside the Earth. Scientists believe that most of the water that goes down at the trench comes back from the Earth into the atmosphere as water vapor when volcanoes erupt hundreds of miles away. But with the revised estimates of water from the new study, the amount of water going into the earth seems to greatly exceed the amount of water coming out. "The estimates of water coming back out through the volcanic arc are probably very uncertain," said Wiens, who hopes that this study will encourage other researchers to reconsider their models for how water moves back out of the Earth. "This study will probably cause some re-evaluation." Moving beyond the Mariana Trench, Wiens along with a team of other scientists has recently deployed a similar seismic network offshore in Alaska to consider how water is moved down into the Earth there. "Does the amount of water vary substantially from one subduction zone to another, based on the kind of faulting that you have when the plate bends?" Wiens asked. "There's been suggestions of that in Alaska and in Central America. But nobody has looked at the deeper structure yet like we were able to do in the Mariana Trench." | Earthquakes | 2,018
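Wiens' factor-of-three revision ultimately rests on simple bookkeeping: the water flux scales with the thickness of the hydrated layer times the convergence rate. A rough back-of-the-envelope version follows; every input number is assumed for illustration rather than taken from the paper.

# Back-of-the-envelope water flux into the mantle at a trench (illustrative
# values only, not the study's calibrated numbers).
trench_length_m   = 2.5e6    # m, assumed trench segment length
hydrated_thick_m  = 30e3     # m, roughly the ~20 miles of hydrated slab reported
water_wt_fraction = 0.02     # assumed mean water content of the hydrated rock
rock_density      = 3300.0   # kg/m^3, typical upper-mantle rock
convergence_m_yr  = 0.05     # m/yr, assumed plate convergence rate

flux_kg_per_yr = (trench_length_m * hydrated_thick_m * convergence_m_yr
                  * rock_density * water_wt_fraction)
print(f"water flux in this toy setup: {flux_kg_per_yr:.2e} kg/yr")
# Doubling the hydrated thickness or water fraction scales the flux linearly,
# which is why a thicker hydrated layer forces the global estimate upward.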
November 6, 2018 | https://www.sciencedaily.com/releases/2018/11/181106150432.htm | Punctuated earthquakes for New Madrid area, Missouri, U.S. | Indianapolis, IN, USA: In 1811 and 1812, the region around New Madrid, Missouri, experienced a number of major earthquakes. The final and largest earthquake in this sequence occurred on the Reelfoot fault, and temporarily changed the course of the Mississippi River. These earthquakes are estimated to be just shy of magnitude 8.0 and devastated towns along the Mississippi River -- soil liquefied, houses collapsed, and chimneys toppled. | Because of the 1811-1812 earthquakes, the New Madrid area is recognized as a high-hazard zone for potential future seismic events. Previous investigations have also found evidence for multiple older earthquake events preserved in the geologic record. "We know there were also large earthquakes at ~1450 AD and at ~900 AD," says Ryan Gold of the U.S. Geological Survey (USGS), but frequent earthquakes along the fault may not be the norm. "If earthquakes happen on the Reelfoot fault every 500 years, and have been doing so for hundreds of thousands of years, we would expect to see a mountain range there -- but we don't," says Gold. Instead, he suggests the modest fault scarp associated with the Reelfoot fault indicates that the earthquakes haven't been sustained over a long period of time. To test this, USGS researchers wanted to look beyond the last few thousand years. Preserving long records of past earthquakes can be a challenge for the Reelfoot fault because natural processes like rain and occasional floods on the Mississippi River can conspire to erase the record of past earthquakes. "That's coupled with anthropogenic effects -- lots of farming, forestry, [and] construction," says Gold. Instead of studying the fault directly, the USGS team moved to the rolling hills around the Mississippi River, east of the Reelfoot fault. They noted a high concentration of depressions called sackungen (a German word meaning "to sag") near the fault, and hypothesized that these sags are cracks in the ground caused by strong shaking from large earthquakes. The USGS excavated a trench across one of the sackungen, which had formed in Peoria loess -- silt that was blown in during the last glacial period until as recently as around 11,000 years ago. Gold explained how the team hypothesized that a sackung crack forms during an earthquake, the middle of the crack falls downward, and sediment washes in from the shoulders -- thus recording the timing of the earthquake. Their trench revealed four distinct packages of sediment, says Gold, adding he was pleased to see such a long record. "I figured we would only see the 1811 and 1812 earthquake sequence." In the sackung, they dated all four packages of sediment and found they corresponded to previously identified earthquakes that occurred on the Reelfoot fault: 1812 AD, ~1450 AD, ~900 AD, and ~2300 BC. Importantly, they didn't find evidence for any additional earthquakes in the interval from ~4,300 to ~11,000 years ago. If the earthquake record preserved in the sackung is complete, "our record confirms that the tempo of earthquakes hasn't been sustained," says Gold. Gold will present their findings on Tuesday at the Geological Society of America's Annual meeting in Indianapolis, Indiana. "Our results will hopefully encourage the seismic hazard community to consider the possibility that the tempo of faulting may be variable," says Gold. 
"Sometimes there may be very long intervals between earthquakes and sometimes the earthquakes may be more closely spaced."The USGS team hopes their new results on New Madrid ruptures can provide insights to those who model risk and seismic hazard in the region. Gold says that refining and updating seismic hazards with more information on how a fault might rupture can help with building codes -- designing buildings just right to keep us safe, but not over-designed, which can waste resources. | Earthquakes | 2,018 |
November 6, 2018 | https://www.sciencedaily.com/releases/2018/11/181106124929.htm | White line of algae deaths marks uplift in 2016 Chilean earthquake | A bleached fringe of dead marine algae, strung along the coastlines of two islands off the coast of Chile, offers a unique glimpse at how the land rose during the 2016 magnitude 7.6 Chiloé earthquake, according to a new study. | Durham University researcher Ed Garrett and colleagues used the algal data to help confirm the amount of fault slip that occurred during the Chiloé quake, which took place in an area that had been seismically quiet since the 1960 magnitude 9.5 Valdivia earthquake -- the largest instrumentally recorded earthquake in the world. There are fewer records of more moderate earthquakes in the region, so "precisely quantifying the amount and distribution of slip in 2016 therefore helps us to understand the characteristics of these smaller events. This information helps us to better assess how faults accumulate and release strain during sequences of ruptures of different magnitudes," said Garrett. "Such insights in turn assist with efforts to assess future seismic hazards," he added. "While the 2016 earthquake occurred in a sparsely populated region, similar major earthquakes on the Chilean Subduction Zone could pose significant hazards to more populous regions in the future." Garrett and colleagues combined their calculations of the amount of uplift indicated by the algal data -- about 25.8 centimeters -- with satellite data of the crust's movement during the earthquake to determine that the maximum slip along the fault was approximately three meters. The slip is equivalent to about 80 percent of the maximum cumulative plate convergence since the 1960 Valdivia earthquake, they conclude, which is a result similar to other recent estimates of slip. Some of the earliest reports from the 2016 earthquake suggested that the maximum fault slip during the event was as much as five meters, which would have wiped out or exceeded all the seismic stress built up by plate convergence since the 1960 earthquake. When an earthquake rupture lifts coastal crust, it can strand organisms like algae and mussels that fix themselves to rocks, raising their homes above their normal waterline. The catastrophe leaves a distinctive line of dead organisms traced across the rock. The distance between the upper limit of this death zone and the upper limit of the zone containing living organisms offers an estimate of the vertical uplift of the crust. Researchers have long used the technique to measure abrupt vertical deformation of the crust. During the famous 19th century voyage of the HMS Beagle, Charles Darwin used a band of dead mussels to determine the uplift of Isla Santa María during the magnitude 8.5 Chilean earthquake of 1835. Ten months after the Chiloé earthquake, Garrett and his colleagues were studying the effects of the quake on coastal environments such as tidal marshes, looking for modern examples of how earthquakes affect these environments that they could use in their study of prehistoric earthquakes. "It was only once we reached Isla Quilán that we noticed the band of bleached coralline algae along the rocky shorelines and realized that we could use this marker to quantify the amount of uplift," Garrett said. The research team made hundreds of measurements of the bleached algae line where it appeared on Isla Quilán and Isla de Chiloé. 
The rich algal record was helpful in corroborating the amount of vertical uplift in a region of the world sparsely covered by instruments that measure crustal deformation. The study demonstrates that land-level changes as low as 25 centimeters can be determined using large numbers of "death-zone" measurements at sites that are sheltered from waves, the researchers note. The study was supported by the Natural Environment Research Council, the European Union/Durham University (COFUND) under the DIFeREns2 project and CYCLO, a Nucleus of the Millennium Scientific Initiative (ICM) of the Chilean Government. | Earthquakes | 2,018
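The core measurement is simply the elevation difference between the top of the bleached "death zone" and the top of the living algae, averaged over many points. A sketch of how such measurements reduce to an uplift estimate -- the numbers here are invented, chosen only to land near the reported ~25.8 cm:

import numpy as np

# Hypothetical field measurements (meters above a local datum) at one site:
# upper limit of the bleached "death zone" vs. upper limit of living algae.
dead_zone_upper   = np.array([1.93, 1.88, 1.95, 1.90, 1.97, 1.85, 1.92])
living_zone_upper = np.array([1.67, 1.63, 1.70, 1.61, 1.72, 1.60, 1.66])

uplift = dead_zone_upper - living_zone_upper
mean = uplift.mean()
sem  = uplift.std(ddof=1) / np.sqrt(uplift.size)  # standard error of the mean
print(f"coseismic uplift: {mean*100:.1f} +/- {sem*100:.1f} cm (n={uplift.size})")

With hundreds of such measurements at wave-sheltered sites, the standard error shrinks enough to resolve land-level changes of a few tens of centimeters, which is the point the authors make.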
November 6, 2018 | https://www.sciencedaily.com/releases/2018/11/181106111614.htm | Nobody wins in a landslide | A University of Cincinnati geologist is studying one of the lesser-known hazards of life in the West: landslides. | People who live in the Basin and Range of Nevada are accustomed to earthquakes, flash floods and wildfires. But UC professor Daniel Sturmer said this part of the United States has generated numerous large landslides as well. This landslide-prone region includes parts of California, Utah and Arizona. "Certainly, in the Basin and Range, it's a hazard that is vastly underestimated," he said. Landslides get far less attention than other natural disasters because they typically occur in less populous areas, Sturmer said. But they can be devastating. A landslide in Vajont, Italy, in 1963 killed more than 2,000 people after the face of a mountain crashed into a lake, creating a tsunami that scientists said had the force of a nuclear bomb. Sturmer, an assistant professor of geology in UC's McMicken College of Arts and Sciences, is working with the Nevada Bureau of Mines and Geology to add specific details on landslides to the state's map of known hazards. "These failures frequently occur in El Niño years when you have a lot of rain. Fires exacerbate the problem because you don't have vegetative roots holding the soil in place. And then you get heavy rains," he said. Sturmer presented the project to the Geological Society of America conference in Indianapolis in November. The U.S. Geological Survey (USGS) oversees a national landslide hazards program to reduce long-term losses from landslides and to understand what causes them. Landslides are responsible for as much as $3.5 billion in damage each year in the United States, according to the USGS. Rockfalls kill dozens of people every year, according to the agency. Landslides are never far from the popular imagination. The phrase "won by a landslide" has been in use in connection with elections since at least the 1840s. In his 1880 book "A Tramp Abroad," humorist Mark Twain wrote of a famous landslide in the Swiss countryside that destroyed four towns 74 years earlier. "A constant marvel with us as we sped along the bases of the steep mountains on this journey was not that avalanches occur, but that they are not occurring all the time," he wrote. "One does not understand why rocks and landslides do not plunge down these declivities daily." UC geology graduate student Nicholas Ferry has been studying the Blue Diamond landslide south of Las Vegas, Nevada. "Las Vegas has grown very rapidly over the last 20 years. Development is approaching this area where potentially another landslide could happen," Ferry said. But Ferry said the region with the most landslides ironically is home to the world's oldest mountains: the Appalachians. Ferry decided to come to UC to study geology because of its in-depth work on landslides. "Of all the schools I applied to, UC was doing the most interesting work. I jumped all over it," Ferry said. "It's been pretty exciting." Landslides hold tremendous potential energy. In the textbook "Geology of California," authors Robert Norris and Robert Webb wrote about the incredible force behind an Ice Age landslide in the San Bernardino mountains called Blackhawk. 
It's one of the most-studied landslides in the United States, Sturmer said. "As the slide moved down the canyon (at 170 mph), it passed over a ridge that crosses the canyon and was thus launched into the air -- a geologic version of a flying carpet," Norris and Webb wrote. Rocks from the slide reached top speeds of 270 mph as they fell more than 4,000 feet and came to rest more than 5 miles away. Sturmer said evidence of that slide is still evident today, despite thousands of years of erosion. "It's one of the most-studied landslides in the region. The mountain failed on a ridge. The falling rock was supported on a cushion of air as it traveled," Sturmer said. Nevada's Bureau of Mines and Geology gives residents detailed information about their local risks of natural hazards through an interactive online map, said Rachel Micander, a cartographer for the agency. "The web application contains data on earthquakes, floods and wildfires, the top three natural hazards in Nevada," she said. "We also have information on radon. But there is very little about landslides now." She studied geology under Sturmer at the University of Nevada, Reno. "We have a few types of landslides in Nevada and across the country. There are records in the geologic strata of these types of landslides happening across Nevada," she said. Landslides pose a legitimate risk because development is expanding to areas that can become unstable under the right conditions, she said. "Reno sits in a valley. As the population grows, we're building higher up on the mountain slopes," she said. "We've seen debris flows recently in the Carson City area." The hazard map gives residents information to prepare for the most likely natural disasters nearest them and to make smart decisions about where to build, she said. "That's the goal of the project, to develop a tool for Nevada citizens to make educated decisions about their community. Learn about hazard risks," she said. "A lot of people don't realize Nevada is earthquake country." Micander said people accustomed to earthquakes take precautions such as securing bookcases to the walls and not placing heavy objects like picture frames or unsecured shelving over beds. "A lot of these hazards are fairly easily mitigated," Micander said. "I know open shelving is trendy. But I keep the doors on my closets and cabinets. Mom always put a wooden spoon through the handles to secure the kitchen cupboards." In addition, the digital map identifies the risks of radon, a colorless, odorless toxic gas that can seep into homes from underground bedrock and the water table. It's one of the leading causes of lung cancer in the United States. The Centers for Disease Control and Prevention recommends people test their homes for radon exposure and make repairs, if necessary, to limit future exposure. Micander said she would like to incorporate Sturmer's new landslide data in the hazard map. So far, he has identified about 70 sites of landslides or landslide-prone areas. He has visited about a dozen of them but plans to study more in person. "Most of them either haven't been studied at all or have been studied very little," he said. "It's been a fun hunt to look for them. It seems every time I go looking for them, I find more." | Earthquakes | 2,018
November 5, 2018 | https://www.sciencedaily.com/releases/2018/11/181105105359.htm | Enhanced views of Earth tectonics | Scientists from Germany's Kiel University and British Antarctic Survey (BAS) have used data from the European Space Agency's (ESA) Gravity field and steady-state Ocean Circulation Explorer (GOCE) mission to unveil key geological features of the Earth's lithosphere -- the rigid outer layer that includes the crust and the upper mantle. | In the newly published study, the team works with GOCE measurements of differences in horizontal and vertical components of the gravity field -- known as gradients. These gradients can be complex to interpret, and so the authors combined them to produce simpler 'curvature images' that reveal large-scale tectonic features of the Earth more clearly. Lead author Prof. Jörg Ebbing of Kiel University said: "Our new satellite gravity gradient images improve our knowledge of Earth's deep structure. The satellite gravity data can be combined with seismological data to produce more consistent images of the crust and upper mantle in 3D. This is crucial to understanding how plate tectonics and deep mantle dynamics interact." Fausto Ferraccioli, Science Leader of Geology and Geophysics at the British Antarctic Survey and co-author of the study, said, "Satellite gravity is revolutionizing our ability to study the lithosphere of the entire Earth, including its least understood continent, Antarctica. In East Antarctica, for example, we now begin to see a more complex mosaic of ancient lithosphere provinces. GOCE shows us fundamental similarities but also unexpected differences between its lithosphere and other continents, to which it was joined until 160 million years ago." The new study presents a view of the Earth's continental crust and upper mantle not previously achievable using global seismic models alone. The authors noted that, despite their similar seismic characteristics, there are contrasts in the gravity signatures for ancient parts of the lithosphere (known as cratons), indicating differences in their deep structure and composition. These features are important. Because they form the oldest cores of the lithosphere, they hold key records of Earth's early history. | Earthquakes | 2,018
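'Curvature images' boil down to combinations of second spatial derivatives of the measured field. As a rough stand-in for what the authors did with GOCE's measured gradient tensor, here is a sketch that forms a simple curvature proxy (the horizontal Laplacian) from a synthetic gridded anomaly; the real work combines the satellite's measured gradients rather than numerical derivatives of a model grid.

import numpy as np

x = np.linspace(-1000, 1000, 201)                  # km
X, Y = np.meshgrid(x, x)
field = np.exp(-((X - 200)**2 + Y**2) / 2e5)       # synthetic "craton" anomaly

dy, dx = np.gradient(field, x, x)                  # first derivatives (axis0=y, axis1=x)
dyy, _ = np.gradient(dy, x, x)
_, dxx = np.gradient(dx, x, x)
curvature = dxx + dyy                              # Laplacian as a curvature proxy
peak = np.unravel_index(np.abs(curvature).argmax(), curvature.shape)
print("strongest curvature signal at grid point:", peak)
# Edges of the anomaly light up in the curvature map, which is why such
# attributes sharpen the outlines of lithospheric provinces.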
November 1, 2018 | https://www.sciencedaily.com/releases/2018/11/181101133823.htm | Where water goes after fracking is tied to earthquake risk | In addition to producing oil and gas, the energy industry produces a lot of water, about 10 barrels of water per barrel of oil on average. New research led by The University of Texas at Austin has found that where the produced water is stored underground influences the risk of induced earthquakes. | Beyond supporting the link between water disposal and induced seismicity, the research also describes factors that can help reduce earthquake risk. "If we want to manage seismicity, we really need to understand the controls," said lead author Bridget Scanlon, a senior research scientist at UT's Bureau of Economic Geology. The research was published Oct. 31. The researchers found that the increased pressure that is caused by storing produced water inside geologic formations raises the risk of induced seismicity. The risk increases with the volume of water injected, both at the well and regional scale, as well as the rate of injection. Researchers specifically looked at water stored near tight oil plays, including the Bakken, Eagle Ford and Permian shale plays, and Oklahoma overall, which has high levels of induced seismicity in concentrated areas. Researchers found marked differences in the degree of seismic activity associated with underground water storage. For example, the study found that in Oklahoma 56 percent of wells used to dispose of produced water are potentially associated with earthquakes. The next highest is the Eagle Ford Shale of South Texas, where 20 percent are potentially associated with earthquakes. The study reported that the different levels of induced seismic activity relate to, among other reasons, how the water is managed and where it is stored underground. In Oklahoma, the tendency to store water in deep geologic formations -- which are often connected to faults that can trigger earthquakes when stressed -- has increased the risk of induced seismicity. In the other areas, water is stored at shallower depths, which limits exposure to potentially risky faults. In conventional energy production, water is usually injected back into the reservoir that produced the oil and gas, which stabilizes pressure within the reservoir. However, water produced during hydraulic fracturing -- the method used to access energy in tight oil plays -- cannot be returned because the rock pores are too small for the water to be injected back into the rock. That water is usually injected into nearby geologic formations, which can increase pressure on the surrounding rock. The findings are consistent with directives issued by the Oklahoma Corporation Commission (OCC) in 2015 to mitigate seismicity, which included reducing injection rates and regional injection volumes by 40 percent in deep wells. This study confirmed the changes resulted in a 70 percent reduction in the number of earthquakes above magnitude 3.0 in 2017 compared with the peak year of 2015. "Everything they (the OCC) did is supported by what we have in this article," said study co-author Murray. "The decisions they made, the directives that they put out, are supported by statistical associations we found." The reduction in earthquakes in Oklahoma shows that subsurface management practices can influence seismic risk. However, Scanlon said the changes could come with trade-offs. 
For example, shallow disposal may help lower the risk of earthquakes, but the shallower storage depths could increase the risk of the produced water contaminating overlying aquifers. "There's no free lunch," Scanlon said. "You keep iterating and doing things, but you must keep watching to see what's happening." The study was funded by the Alfred P. Sloan Foundation, the Cynthia and George Mitchell Foundation, the Jackson School of Geosciences, and the Stanford Center for Induced and Triggered Seismicity. | Earthquakes | 2,018
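Statistics like "56 percent of wells potentially associated with earthquakes" come from a spatial (and, in practice, also temporal) association test between well locations and an earthquake catalog. A stripped-down sketch of the spatial part, with made-up coordinates and an assumed 15 km association radius -- the study's actual criteria are more involved:

import numpy as np

wells  = np.array([[35.2, -97.5], [35.9, -96.8], [36.4, -97.9]])   # lat, lon (invented)
quakes = np.array([[35.25, -97.48], [36.0, -98.5]])                # lat, lon (invented)
radius_km = 15.0                                                   # assumed criterion

def approx_km(p, q):
    # Small-angle flat-earth distance; adequate over tens of kilometers.
    dlat = (p[0] - q[0]) * 111.0
    dlon = (p[1] - q[1]) * 111.0 * np.cos(np.radians(p[0]))
    return np.hypot(dlat, dlon)

associated = [any(approx_km(w, q) < radius_km for q in quakes) for w in wells]
print(f"{100 * np.mean(associated):.0f}% of wells potentially associated")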
November 1, 2018 | https://www.sciencedaily.com/releases/2018/11/181101085116.htm | Micro-earthquakes preceding a 4.2 earthquake near Istanbul as early warning signs? | One of the high-risk geological structures lies near Istanbul, a megacity of 15 million people. The North Anatolian fault, separating the Eurasian and Anatolian tectonic plates, is a 1,200-kilometer-long fault zone running between eastern Turkey and the northern Aegean Sea. Since the beginning of the 20th century its seismic activity has caused more than 20,000 deaths. A large (Mw > 7) earthquake is overdue in the Marmara section of the fault, just south of Istanbul. | In a new study, led by Peter Malin and Marco Bohnhoff of the GFZ German Research Center for Geosciences, the authors report on the observation of foreshocks that, if analyzed accordingly and in real time, may possibly increase the early-warning time before a large earthquake from just a few seconds up to several hours. However, the authors caution: "The results are so far based on only one -- yet encouraging -- field example for an 'earthquake preparation sequence' typically known from repeated rock-deformation laboratory experiments under controlled conditions," says Marco Bohnhoff. The study from Peter Malin and Marco Bohnhoff, together with colleagues from the AFAD Disaster and Emergency Management Presidency in Turkey, uses waveform data from the recently implemented GONAF borehole seismic network. GONAF operates at a low magnitude-detection threshold, which allowed the identification of a series of micro-earthquakes prior to an earthquake of magnitude 4.2 that occurred in June 2016 south of Istanbul and was the largest event in the region in several years. "Our study shows a substantial increase in self-similarity of the micro-quakes during the hours before the mainshock," says Professor Bohnhoff of the GFZ, "while the current early-warning system in place in Istanbul relies on the arrival times of seismic waves emitted from the hypocentre to the city and is therefore restricted to a couple of seconds at maximum." While similar precursory activity has been detected for recent large earthquakes in Japan (2011 Mw9 Tohoku-Oki) and Chile (2014 Mw8.1 Iquique), this is at present by no means a ubiquitous observation and needs further testing before its implementation. | Earthquakes | 2,018
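The self-similarity measure at the heart of the study is, in essence, the peak normalized cross-correlation between successive micro-earthquake waveforms: the closer to 1, the more alike the events. A minimal sketch with synthetic waveforms that become progressively more similar, standing in for GONAF recordings:

import numpy as np

def similarity(a, b):
    """Peak normalized cross-correlation between two equal-length waveforms."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return np.correlate(a, b, mode="full").max()

rng = np.random.default_rng(1)
template = rng.standard_normal(500)
# Synthetic event sequence: noise shrinks toward the "mainshock", so later
# events look more and more alike.
events = [template + 0.8 * (1 - k / 10) * rng.standard_normal(500) for k in range(10)]

scores = [similarity(events[i], events[i + 1]) for i in range(len(events) - 1)]
print("pairwise similarity through time:", np.round(scores, 2))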
October 25, 2018 | https://www.sciencedaily.com/releases/2018/10/181025155708.htm | Relationship between tremors, water at the Cascadia margin | The earthquakes are so small and deep that someone standing in Seattle would never feel them. In fact, until the early 2000s, nobody knew they happened at all. Now, scientists at Rice University have unearthed details about the structure of Earth where these tiny tremors occur. | Rice postdoctoral researcher and seismologist Jonathan Delph and Earth scientists Fenglin Niu and Alan Levander make a case for the incursion of fluid related to slippage deep inside the Cascadia margin off the Pacific Northwest's coast. Their paper appears in an American Geophysical Union journal. "These aren't large, instantaneous events like a typical earthquake," Delph said. "They're seismically small, but there's a lot of them and they are part of the slow-slip type of earthquake that can last for weeks instead of seconds." Delph's paper is the first to show variations in the scale and extent of fluids that come from dehydrating minerals and how they relate to these low-velocity quakes. "We are finally at the point where we can address the incredible amount of research that's been done in the Pacific Northwest and try to bring it all together," he said. "The result is a better understanding of how the seismic velocity structure of the margin relates to other geologic and tectonic observations." The North American plate and Juan de Fuca plate, a small remnant of a much larger tectonic plate that used to subduct beneath North America, meet at the Cascadia subduction zone, which extends from the coast of northern California well into Canada. As the Juan de Fuca plate moves to the northeast, it sinks below the North American plate. Delph said fluids released from minerals as they heat up at depths of 30 to 80 kilometers propagate upward along the boundary of the plates in the northern and southern portions of the margin, and get trapped in sediments that are subducting beneath the Cascadia margin. "This underthrust sedimentary material is being stuck onto the bottom of the North American plate," he said. "This can allow fluids to infiltrate. We don't know why, exactly, but it correlates well with the spatial variations in tremor density we observe. We're starting to understand the structure of the margin where these tremors are more prevalent." Delph's research is based on extensive seismic records gathered over decades and housed at the National Science Foundation-backed IRIS seismic data repository, an institutional collaboration to make seismic data available to the public. "We didn't know these tremors existed until the early 2000s, when they were correlated with small changes in the direction of GPS stations at the surface," he said. "They're extremely difficult to spot. Basically, they don't look like earthquakes. They look like periods of higher noise on seismometers." "We needed high-accuracy GPS and seismometer measurements to see that these tremors accompany changes in GPS motion," Delph said. "We know from GPS records that some parts of the Pacific Northwest coast change direction over a period of weeks. That correlates with high-noise 'tremor' signals we see in the seismometers. We call these slow-slip events because they slip for much longer than traditional earthquakes, at much slower speeds." He said the phenomenon isn't present in all subduction zones. 
"This process is pretty constrained to what we call 'hot subduction zones,' where the subducting plate is relatively young and therefore warm," Delph said. "This allows for minerals that carry water to dehydrate at shallower depths."In 'colder' subduction zones, like central Chile or the Tohoku region of Japan, we don't see these tremors as much, and we think this is because minerals don't release their water until they're at greater depths," he said. "The Cascadia subduction zone seems to behave quite differently than these colder subduction zones, which generate large earthquakes more frequently than Cascadia. This could be related in some way to these slow-slip earthquakes, which can release as much energy as a magnitude 7 earthquake over their duration. This is an ongoing area of research." | Earthquakes | 2,018 |
October 25, 2018 | https://www.sciencedaily.com/releases/2018/10/181025103348.htm | Mexico's 2017 Tehuantepec quake suggests a new worry | Last September's magnitude 8.2 Tehuantepec earthquake happened deep, rupturing both mantle and crust, on the landward side of a major subduction zone in the Pacific Ocean off Mexico's far south coast. | Initially, it was believed the earthquake was related to a seismic gap, occurring where the Cocos ocean plate is being overridden by a continental plate, in an area that had not had a quake of such magnitude since 1787. Subduction zone megaquakes generally occur near the top of where plates converge. The hypocenter, however, was 46 kilometers (28 miles) deep in the Cocos plate, well under the overriding plate and where existing earthquake modeling had said it shouldn't happen, a 13-member research team reported Oct. 1. "We don't yet have an explanation for how this was possible," said the study's lead author Diego Melgar, an earth scientist at the University of Oregon. "We can only say that it contradicts the models that we have so far and indicates that we have to do more work to understand it." Earthquakes do occur in such locations, where a descending plate's own weight creates strong forces that stretch the slab as it dives down toward the mantle, but have been seen only under older and cooler subduction zones. The 1933 Sanriku, Japan, earthquake was one. It generated a 94-foot tsunami that killed 1,522 people and destroyed more than 7,000 homes. The Mexican quake ruptured the descending slab and generated a 6-foot tsunami, which likely was limited in size by the angle of the overriding continental plate so close to shore, Melgar said. "This subducting plate is still very young and warm, geologically speaking," he said. "It really shouldn't be breaking." Subduction zone ages and their temperatures relate to their distance from mid-ocean ridges, where plates are made in temperatures of 1,400 degrees Celsius (2,552 degrees Fahrenheit), Melgar said. The 25-million-year-old Cocos subduction zone is 600 miles from the mid-ocean ridge where it began. Japan's subduction zone is much further from the ridge and 130 million years old. Temperatures cool as plates move outward. Tension-related earthquakes, the researchers noted, have been restricted to older plates with temperatures that are cooler than 650 degrees Celsius (1,202 degrees Fahrenheit). Melgar's team theorizes that seawater infiltration into the fabric of the stressed and diving Cocos plate has possibly accelerated the cooling, making it susceptible to tension earthquakes previously seen only in older and colder locations. It's also possible, the researchers noted, that the magnitude 8.0 1933 Oaxaca earthquake, previously thought to be a traditional subduction zone event, was instead similar to the one that struck last year. If such water-driven cooling is possible, it could suggest other areas -- from Guatemala southward through Central America, and along the U.S. West Coast -- are susceptible to tension-zone earthquakes, Melgar said. The Cascadia subduction zone, from northern California to British Columbia, is 15 million years old and warmer than the similar geology along the Mexican-Central American coastlines, but could still be at risk. Building codes and hazard maps should reflect the potential danger, he added. "Our knowledge of these places where large earthquakes happen is still imperfect," Melgar said. "We can still be surprised. 
We need to think more carefully when we make hazard and warning maps. We still need to do a lot of work to be able to provide people with very accurate information about what they can expect in terms of shaking and in terms of tsunami hazard." | Earthquakes | 2,018 |
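The young-warm versus old-cold contrast Melgar invokes follows from the standard half-space cooling model of oceanic plates -- a textbook relation, not the study's own calculation. A sketch comparing plate interiors at an assumed depth of 40 km into the slab, for ages roughly matching Cascadia, Cocos, and Japan:

import numpy as np
from scipy.special import erf

# Half-space cooling: T(z, t) = T_mantle * erf(z / (2 * sqrt(kappa * t)))
kappa = 1e-6                     # thermal diffusivity, m^2/s (typical rock)
T_mantle = 1400.0                # degrees C, per the article
z = 40e3                         # m, assumed depth into the plate

for age_myr in (15, 25, 130):    # approximate plate ages from the article
    t = age_myr * 3.15e13        # seconds in ~1 million years
    T = T_mantle * erf(z / (2 * np.sqrt(kappa * t)))
    print(f"{age_myr:4d} Myr plate: T ~ {T:.0f} C at {z/1e3:.0f} km into the slab")
# The 130 Myr plate comes out well below the ~650 C threshold cited for
# tension-related earthquakes, while the young plates stay far hotter --
# which is exactly why Tehuantepec's rupture was a surprise.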
October 19, 2018 | https://www.sciencedaily.com/releases/2018/10/181019135124.htm | Earth's inner core is solid, 'J waves' suggest | A new study by researchers at The Australian National University (ANU) could help us understand how our planet was formed. | Associate Professor Hrvoje Tkalčić and PhD Scholar Than-Son Phạm are confident they now have direct proof that Earth's inner core is solid. They came up with a way to detect shear waves, or "J waves," in the inner core -- a type of wave which can only travel through solid objects. "We found the inner core is indeed solid, but we also found that it's softer than previously thought," Associate Professor Tkalčić said. "It turns out -- if our results are correct -- the inner core shares some similar elastic properties with gold and platinum. The inner core is like a time capsule; if we understand it, we'll understand how the planet was formed, and how it evolves." Inner core shear waves are so tiny and feeble they can't be observed directly. In fact, detecting them has been considered the "Holy Grail" of global seismology since scientists first predicted the inner core was solid in the 1930s and 40s. So the researchers had to come up with a creative approach. Their so-called correlation wavefield method looks at the similarities between the signals at two receivers after a major earthquake, rather than the direct wave arrivals. A similar technique has been used by the same team to measure the thickness of the ice in Antarctica. "We're throwing away the first three hours of the seismogram and what we're looking at is between three and 10 hours after a large earthquake happens. We want to get rid of the big signals," said Dr Tkalčić. "Using a global network of stations, we take every single receiver pair and every single large earthquake -- that's many combinations -- and we measure the similarity between the seismograms. That's called cross correlation, or the measure of similarity. From those similarities we construct a global correlogram -- a sort of fingerprint of the Earth." The study shows these results can then be used to demonstrate the existence of J waves and infer the shear wave speed in the inner core. While this specific information about shear waves is important, Dr Tkalčić says what this research tells us about the inner core is even more exciting. "For instance we don't know yet what the exact temperature of the inner core is, what the age of the inner core is, or how quickly it solidifies, but with these new advances in global seismology, we are slowly getting there." "The understanding of the Earth's inner core has direct consequences for the generation and maintenance of the geomagnetic field, and without that geomagnetic field there would be no life on the Earth's surface." The research has been published. | Earthquakes | 2,018
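Tkalčić's description maps directly onto a correlation-stacking recipe: cut the late coda (hours 3-10 after the quake), cross-correlate it between every receiver pair for every large event, and stack. The sketch below uses synthetic traces with a planted delay to show how a coherent arrival emerges from the stack; it illustrates the idea, not the authors' actual pipeline.

import numpy as np

def correlogram_trace(coda_a, coda_b, max_lag):
    """Normalized cross-correlation of the late coda at two stations."""
    a = (coda_a - coda_a.mean()) / coda_a.std()
    b = (coda_b - coda_b.mean()) / coda_b.std()
    full = np.correlate(a, b, mode="full") / len(a)
    mid = len(full) // 2
    return full[mid - max_lag: mid + max_lag + 1]

rng = np.random.default_rng(3)
stack = np.zeros(2 * 600 + 1)
for _ in range(100):                          # 100 quake/receiver-pair combinations
    coda = rng.standard_normal(36_000)        # stand-in for hours 3-10 of a seismogram
    delayed = np.roll(coda, 250) + 0.5 * rng.standard_normal(36_000)
    stack += correlogram_trace(coda, delayed, max_lag=600)
stack /= 100

# Coherent core-traversing energy builds up in the stack; random noise cancels.
print("lag of strongest coherent arrival:", abs(stack.argmax() - 600), "samples")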
October 16, 2018 | https://www.sciencedaily.com/releases/2018/10/181016132030.htm | Looking and listening for signals of navy test explosions off Florida coast | Underwater explosions detonated by the U.S. Navy to test the sturdiness of ships' hulls have provided seismologists with a test opportunity of their own: how much can we know about an underwater explosion from the seismic and acoustic data it generates? | In a newly published study, the authors write that the explosions conducted off the Florida coast offer a chance to test the capabilities of the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty Organization, deployed to detect secret nuclear test explosions. "Underwater explosions of this size where ground-truth data are available are quite rare," said Heyburn. IMS stations "do record signals from many underwater events, both naturally occurring, like earthquakes, and human-made, like air-gun surveys. However, ground-truth data from the human-made sources are often not available, and usually any underwater explosions observed are much smaller in size than the Florida explosions." The Navy explosion tests were conducted in 2001, 2008 and 2016 within the Navy's Mayport test area, located 120 kilometers off the coast of Jacksonville, Florida. The experiments tested the hull integrity of the near-shore combat ships the USS Jackson, the USS Milwaukee, the USS Winston Churchill and the USS Mesa Verde. During the experiments, 10,000-pound chemical charges were towed behind a vessel sailing alongside the combat ships and detonated, to test how the combat ship hulls responded to the shock waves produced by nearby detonations equivalent to a TNT charge weight of about 6759 kilograms. The explosions aren't the size of a nuclear test explosion, but they are much larger than many of the other underwater explosions that are available to study, explained Heyburn. "The explosions described in the paper are much smaller than a typical nuclear test explosion which is typically of the order of kilotons in TNT equivalent charge weight," he said. "For comparison, naval mines and anti-submarine depth charges typically contain explosive charges with TNT equivalent charge weights of the order of hundreds of kilograms." Heyburn and colleagues analyzed data from the explosions collected by IMS seismic stations around the globe and from a hydrophone station near Ascension Island in the South Atlantic Ocean for the tests in 2008 and 2016. The researchers used the location data provided by the Navy for several of the explosions as a way to improve the accuracy of calculations used to pinpoint the epicenter of explosions without available ground-truth data. Previous research on detonations in the Dead Sea and other underwater test sites has shown that there is a relationship between the local magnitude of an explosion and its charge weight. After taking into account the salinity of the seawater near Florida, Heyburn and colleagues were able to modify the Dead Sea calculations to estimate the Navy charge weights from published estimates of local magnitude for the explosions. Underwater explosions also produce unique seismic features that can be used to identify the source. One such notable feature is the bubble pulse, the underwater pulsing generated by the gas bubble created by the explosion as it expands and contracts in size. 
The researchers were able to identify the bubble pulse in the seismic records, confirming that the seismic signals were the result of an underwater explosion. The bubble pulse was not observed as clearly at the Ascension Island hydrophone station, and Heyburn and colleagues suggest that the shallow waters off the Florida Coast may have distorted the hydroacoustic signals of the explosion. "This showed that both the seismic and hydroacoustic data can be important when analyzing signals from underwater explosions," said Heyburn. | Earthquakes | 2,018
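The Dead Sea-style scaling the team adapted takes the form ml = a + b * log10(W), with W the TNT-equivalent charge weight in kilograms. The coefficients below are placeholders chosen so that the article's ~6,759 kg charge maps to a plausible magnitude -- they are not the calibrated values, which also carry a salinity correction for Florida seawater.

A, B = 1.6, 0.6     # placeholder coefficients (assumed, not from the paper)

def charge_from_magnitude(ml):
    """Invert ml = A + B*log10(W) for the charge weight W in kg TNT equivalent."""
    return 10 ** ((ml - A) / B)

for ml in (3.7, 3.8, 3.9):
    print(f"ml = {ml:.1f}  ->  W ~ {charge_from_magnitude(ml):,.0f} kg TNT equivalent")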
October 14, 2018 | https://www.sciencedaily.com/releases/2018/10/181014142703.htm | Fast, accurate estimation of the Earth's magnetic field for natural disaster detection | Researchers from Tokyo Metropolitan University have applied machine-learning techniques to achieve fast, accurate estimates of local geomagnetic fields using data taken at multiple observation points, potentially allowing detection of changes caused by earthquakes and tsunamis. A deep neural network (DNN) model was developed and trained using existing data; the result is a fast, efficient method for estimating magnetic fields for unprecedentedly early detection of natural disasters. This is vital for developing effective warning systems that might help reduce casualties and widespread damage. | The devastation caused by earthquakes and tsunamis leaves little doubt that an effective means to predict their incidence is of paramount importance. Certainly, systems already exist for warning people just before the arrival of seismic waves; yet, it is often the case that the S-wave (or secondary wave) -- the slower, more damaging phase -- arrives with very little lead time. It is known that earthquakes and tsunamis are accompanied by localized changes in the geomagnetic field. For earthquakes, it is primarily what is known as a piezo-magnetic effect, where the release of a massive amount of accumulated stress along a fault causes local changes in the geomagnetic field; for tsunamis, it is the sudden, vast movement of the sea that causes variations in atmospheric pressure. This in turn affects the ionosphere, subsequently changing the geomagnetic field. Both can be detected by a network of observation points at various locations. The major benefit of such an approach is speed; remembering that electromagnetic waves travel at the speed of light, we can instantaneously detect the incidence of an event by observing changes in the geomagnetic field. However, how can we tell whether the detected field is anomalous or not? The geomagnetic field at various locations is a fluctuating signal; the entire method is predicated on knowing what the "normal" field at a location is. Thus, Yuta Katori and Assoc. Prof. Kan Okubo from Tokyo Metropolitan University set out to develop a method to take measurements at multiple locations around Japan and create an estimate of the geomagnetic field at different, specific observation points. Specifically, they applied a state-of-the-art machine-learning algorithm known as a Deep Neural Network (DNN), modeled on how neurons are connected inside the human brain. By feeding the algorithm a vast amount of input taken from historical measurements, they let the algorithm create and optimize an extremely complex, multi-layered set of operations that most effectively maps the data to what was actually measured. Using half a million data points taken over 2015, they were able to create a network that can estimate the magnetic field at the observation point with unprecedented accuracy. Given the relatively low computational cost of DNNs, the system may potentially be paired with a network of high sensitivity detectors to achieve lightning-fast detection of earthquakes and tsunamis, delivering an effective warning system that can minimize damage and save lives. | Earthquakes | 2,018
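The estimation setup described above -- predict the field at a target observatory from simultaneous readings at other stations, then flag large residuals as anomalous -- can be sketched with an off-the-shelf regressor. The training pairs below are synthetic; the real model used roughly half a million samples from 2015 and its own network architecture.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
X = rng.standard_normal((5000, 6))            # readings at 6 remote stations (synthetic)
true_weights = np.array([0.5, -0.2, 0.8, 0.1, -0.6, 0.3])
y = np.tanh(X @ true_weights) + 0.05 * rng.standard_normal(5000)  # target-station field

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
model.fit(X_tr, y_tr)
print(f"R^2 on held-out data: {model.score(X_te, y_te):.3f}")
# In operation, the anomaly signal is the residual: observed field minus the
# network's estimate of the "normal" field at that site.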
October 4, 2018 | https://www.sciencedaily.com/releases/2018/10/181004143958.htm | Ground shaking during devastating flood offers new insights | A devastating wall of water gushed down the Bhotekoshi/Sunkoshi River in Nepal on July 5, 2016. It came from a lake that had been dammed by a glacial moraine but the dam broke and discharged more than 100,000 tons of water all at once. An international team of GFZ and Nepali scientists was able to record the sudden outburst with seismometers deployed the year before in the wake of the catastrophic Gorkha earthquake in April 2015. The team describes the event in a new study. | Kristen Cook, a member of Niels Hovius' team at the GFZ and first author, visited the valley before and after the flood. Several lucky circumstances surrounded the event: The flood occurred in the evening when people were around their homes but not asleep, and it was so powerful that it caused the ground to shake, says Kristen. She talked with two local residents who felt and heard the flood approaching. The younger one told her that he had thought of an earthquake but his elder companion remembered another flood from 1981 and urged him to climb away from the river. Other people reacted similarly and fled in time, so no one was hurt. However, there was substantial damage to infrastructure. From the scientists' point of view, the researchers were lucky to have deployed seismometers in the valley well before the flood occurred. The ground shaking felt by valley residents was also recorded by a network of seismic stations, the first time that a glacial lake outburst has been captured by seismometers in close vicinity and with such a high resolution. Using seismic data to study floods is a new and developing technique that lets scientists see things that were impossible to observe using traditional river monitoring. "We could identify two distinct pulses," says Kristen Cook. The first came from the wall of water, the second one only seconds later from rocks and coarse sediment within the water. The latter caused the biggest damage. Bridges were destroyed as well as hydro-power stations and roads. In the aftermath, a number of landslides came down as the river banks were destabilized by the erosive force of the water. As there are many glacial lakes, either dammed by glaciers or by land ridges, in the Himalayas and other mountain ranges worldwide, the findings of the team are of great importance. To calculate erosion rates and the risk of landslides, it is necessary to take into account factors such as earthquakes, melting glaciers, air temperature, and glacial recharge of lakes. Things get even more complicated: "Even where lake outburst flood-frequency can be linked to precipitation, the relationship between fluvial erosion and precipitation will become non-linear," conclude the authors. Climate change with rising temperatures may worsen the situation and increase the risk for communities in earthquake-prone mountainous regions. | Earthquakes | 2,018
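Picking the two pulses out of a seismogram is typically done on a smoothed amplitude envelope rather than the raw trace. A sketch using a Hilbert-transform envelope on a synthetic two-burst signal; the real analysis involved careful band-passing of the actual station records.

import numpy as np
from scipy.signal import hilbert

fs = 100                                   # samples per second
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(5)
trace = 0.1 * rng.standard_normal(t.size)                                   # background
trace += np.exp(-0.5 * ((t - 20) / 2) ** 2) * rng.standard_normal(t.size)   # water pulse
trace += 2 * np.exp(-0.5 * ((t - 28) / 3) ** 2) * rng.standard_normal(t.size)  # sediment pulse

envelope = np.abs(hilbert(trace))
smooth = np.convolve(envelope, np.ones(fs) / fs, mode="same")  # 1-second average
print(f"peak envelope at t = {t[smooth.argmax()]:.1f} s (the larger, second burst)")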
October 2, 2018 | https://www.sciencedaily.com/releases/2018/10/181002102733.htm | New concept to cool boiling surface may help prevent nuclear power plant accidents | A new University of Hawaii at Manoa study has demonstrated a boiling-based cooling technique that could help prevent nuclear power plant accidents. | Boiling is usually associated with heating; however, in many industrial applications involving extremely hot components, such as nuclear power plants and metal casting, boiling is used as an effective cooling mechanism. This is due to "latent heat," the heat absorbed to change water into vapor, which removes a huge amount of heat from a hot surface. There is a limit to the amount of heat that can be eliminated through boiling. Increasing this tolerable heat limit is important for many reasons, but especially for safety. Sangwoo Shin, an assistant professor in mechanical engineering at the UH Manoa College of Engineering, has demonstrated a novel concept that overcomes the tolerable heat limit, or what's known as the critical heat flux (CHF). He leads a research team that has come up with a new method that increased the CHF by 10 percent compared to approaches used in the past. According to Shin, this is important because, if the surface is extremely hot, the water near the surface will quickly change into vapor, leaving no liquid to use for cooling the surface. "The result of this failure of cooling leads to a meltdown of the heated surface, as witnessed from the disaster at the Fukushima nuclear power plant in 2011," explained Shin. The incident was prompted by the Tohoku earthquake that struck eastern Japan, which generated a tsunami and disabled the power and cooling systems of the plant's reactors. "In this regard, extensive efforts have been put into increasing the CHF," he said. To date, one of the most effective ways of enhancing the CHF is by roughening the surface with nanostructures, specifically nanowires. High surface roughness leads to an increased number of sites at which the bubbling occurs, thus resulting in enhanced CHF. The study found that boiling heat transfer was much more favorable with a new concept that involves coating the hot surface using nanoscale bimorphs -- thin strips of two bonded materials that bend when heated because the layers expand at different rates. The hot surface causes the bimorphs to deform spontaneously, which makes the surface condition more favorable for boiling. Shin says further CHF enhancement can be expected by choosing the right geometry and material for the nano-bimorphs, which may contribute to developing energy-efficient technologies for extremely hot systems. | Earthquakes | 2,018
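For context on the quantity being enhanced: the classical hydrodynamic estimate of CHF on a flat horizontal surface is Zuber's correlation -- a textbook result, not taken from this study:

```latex
q''_{\mathrm{CHF}} \;=\; C\, h_{fg}\, \rho_v^{1/2} \left[\sigma\, g\,(\rho_l - \rho_v)\right]^{1/4}, \qquad C \approx 0.131
```

Here h_fg is the latent heat of vaporization, rho_v and rho_l are the vapor and liquid densities, sigma is the surface tension and g is gravity. Surface engineering such as the nano-bimorph coating effectively raises the achievable heat flux above this hydrodynamic baseline.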
October 1, 2018 | https://www.sciencedaily.com/releases/2018/10/181001171151.htm | A wrench in Earth's engine: Stagnant slabs | Researchers at CU Boulder report that they may have solved a geophysical mystery, pinning down the likely cause of a phenomenon that resembles a wrench in the engine of the planet. | In a study published today, CU Boulder's Wei Mao and Shijie Zhong may have found the reason these slabs stall. Using computer simulations, the researchers examined a series of stagnant slabs in the Pacific Ocean near Japan and the Philippines. They discovered that these cold rocks seem to be sliding on a thin layer of weak material lying at the boundary of the planet's upper and lower mantle -- roughly 660 kilometers, or 410 miles, below the surface. And the stoppage is likely temporary: "Although we see these slabs stagnate, they are a fairly recent phenomenon, probably happening in the last 20 million years," said Zhong, a co-author of the new study and a professor in CU Boulder's Department of Physics. The findings matter for tectonics and volcanism on the Earth's surface. Zhong explained that the planet's mantle, which lies above the core, generates vast amounts of heat. To cool the globe down, hotter rocks rise up through the mantle and colder rocks sink. "You can think of this mantle convection as a big engine that drives all of what we see on Earth's surface: earthquakes, mountain building, plate tectonics, volcanos and even Earth's magnetic field," Zhong said. The existence of stagnant slabs, which geophysicists first located about a decade ago, however, complicates that metaphor, suggesting that Earth's engine may grind to a halt in some areas. That, in turn, may change how scientists think diverse features, such as East Asia's roiling volcanos, form over geologic time. Scientists have mostly located such slabs in the western Pacific Ocean, specifically off the east coast of Japan and deep below the Mariana Trench. They occur at the sites of subduction zones, or areas where oceanic plates at the surface of the planet plunge hundreds of miles below ground. Slabs seen at similar sites near North and South America behave in ways that geophysicists might expect: They dive through Earth's upper mantle and into the lower mantle where they heat up near the core. But around Asia, "they simply don't go down," Zhong said. Instead, the slabs spread out horizontally near the boundary between the upper and lower mantle, a point at which heat and pressure inside Earth cause minerals to change from one phase to another. To find out why slabs go stagnant, Zhong and Mao, a graduate student in physics, developed realistic simulations of how energy and rock cycle around the entire planet. They found that the only way they could explain the behavior of the stagnant slabs was if a thin layer of less-viscous rock was wedged in between the two halves of the mantle. While no one has directly observed such a layer, researchers have predicted that it exists by studying the effects of heat and pressure on rock. If it does, such a layer would act like a greasy puddle in the middle of the planet. "If you introduce a weak layer at that depth, somehow the reduced viscosity helps lubricate the region," Zhong said. "The slabs get deflected and can keep going for a long distance horizontally." Stagnant slabs seem to occur off the coast of Asia, but not the Americas, because the movement of the continents above gives those chunks of rock more room to slide.
Zhong, however, said that he doesn't think the slabs will stay stuck. With enough time, he suspects that they will break through the slick part of the mantle and continue their plunge toward the planet's core. The planet, in other words, would still behave like an engine -- just with a few sticky spots. "New research suggests that the story may be more complicated than we previously thought," Zhong said. | Earthquakes | 2,018
September 28, 2018 | https://www.sciencedaily.com/releases/2018/09/180928104523.htm | New approach offers high-resolution seismic monitoring of the shallow subsurface | Accurate monitoring of the ground beneath our feet for signs of seismic activity -- whether from earthquakes, volcanic eruptions or the leakage of fluids stored deep underground -- remains challenging. | Time-lapse 4-dimensional seismic monitoring surveys that employ an active seismic source can accurately map the subsurface, and comparing results from different surveys can show how fluids (such as carbon dioxide, CO2) move through the subsurface over time. In a recently published article, researchers describe how they achieved this with a continuously operating controlled seismic source known as ACROSS (Accurately Controlled, Routinely Operated Signal System). Obtaining a high-resolution characterization of the shallow subsurface has previously been held back by the limited number of ACROSS units; however, the researchers were able to overcome this obstacle. As the lead author Tatsunori Ikeda explains: "applying spatially windowed surface-wave analysis allowed us to study the spatial variation of surface wave velocities using data from a single ACROSS unit." The research team validated their method against data gathered from hundreds of geophone measuring devices located around the ACROSS unit and a computational model of the site. Their analysis of the surface waves shows spatial variation in the surface wave velocities, and the impact of seasonal weather on these velocities. Confirmation of the method's accuracy highlights its potential to identify changes in the shallow subsurface that may be caused by natural phenomena or fluids leaking from storage sites much deeper underground. As well as drawing together experts from a variety of organizations in Japan and Canada, the publication represents another step forward for researchers in Kyushu University's International Institute for Carbon-Neutral Energy Research (I2CNER). As co-author Takeshi Tsuji notes: "The approach contributes to our ongoing work in Kyushu University to develop a downsized, continuous and controlled seismic monitoring system." The researchers have been operating the downsized monitoring system at the Kuju geothermal and volcanological research station on Japan's Kyushu Island. | Earthquakes | 2,018
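A minimal sketch of the kind of measurement underlying surface-wave analysis with a continuous controlled source: the phase delay between two nearby receivers at a known frequency yields the local phase velocity. The geometry, frequency, and noise levels below are assumptions, and real processing must additionally handle dispersion and cycle ambiguity.

```python
import numpy as np

# Two receivers a known distance apart record the same narrow-band surface
# wave from a continuous controlled source; the inter-station phase delay at
# frequency f gives the phase velocity c = 2*pi*f*dx / dphi.
fs, f0, dx, c_true = 200.0, 5.0, 40.0, 500.0   # Hz, Hz, m, m/s (assumed)
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(2)
s1 = np.sin(2 * np.pi * f0 * t) + 0.1 * rng.normal(size=t.size)
s2 = np.sin(2 * np.pi * f0 * (t - dx / c_true)) + 0.1 * rng.normal(size=t.size)

# Cross-spectrum phase at f0. Note dx must be under half a wavelength
# (here lambda = c/f = 100 m) to avoid cycle skipping.
S1, S2 = np.fft.rfft(s1), np.fft.rfft(s2)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
k = np.argmin(np.abs(freqs - f0))
dphi = np.angle(S1[k] * np.conj(S2[k]))

c_est = 2 * np.pi * f0 * dx / dphi
print(f"estimated phase velocity: {c_est:.0f} m/s (true {c_true:.0f})")
```

Repeating this measurement in sliding spatial windows around the source is, loosely, what "spatially windowed surface-wave analysis" refers to; tracking the velocities through time then reveals seasonal or fluid-related changes.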
September 27, 2018 | https://www.sciencedaily.com/releases/2018/09/180927083353.htm | Regional seismic data help locate September 2017 North Korean nuclear test | The epicenter of the Sept. 3, 2017 nuclear test explosion in North Korea occurred about 3.6 kilometers northwest of the country's first nuclear test in October 2006, according to a new high-precision analysis of the explosion and its aftermath. | The paper is published as part of the journal's special focus section on the September 2017 North Korean explosion. The body-wave magnitude 6.1 underground test by the Democratic People's Republic of Korea (DPRK) is the largest such test in more than 20 years, and is the sixth declared nuclear test by the DPRK since 2006. The September explosion is an order of magnitude larger than the next largest test by the country, which occurred in September 2016. Zhao and colleagues used seismic wave data from 255 seismograph stations in the China National Digital Seismic Network, Global Seismic Network, International Federation of Digital Seismograph Networks and Full Range Seismograph Network in Japan to investigate the explosion and three other seismic events that occurred in the minutes and days after. Although global seismic networks may pick up the signal of underground nuclear tests, the signals they detect are often too weak to be used in the kind of location analysis performed by Zhao and colleagues. "The closer to the sources the better," said Zhao. "However, seismometers cannot be deployed in the North Korean test site due to political issues. Thus, seismologists have developed methods that can be applied to regional seismic data to investigate seismic characteristics of the underground nuclear explosions." The researchers used seismic data from the first DPRK nuclear test as the "master event" to calibrate their location analysis, since the epicenter of that small explosion could be visually located using satellite images of localized ground surface damage. The much larger September 2017 explosion produced surface damage over an area about nine square kilometers, however, in ground that was already disturbed by previous nuclear tests. "For example, after the sixth North Korean nuclear test, large displacements occurred on the west and south flank [of the test site] and debris flows were localized in pre-existing channels," Zhao explained. "These spatially distributed phenomena made it difficult for us to directly determine the epicenter of the explosion." Zhao and colleagues used regional seismic data instead to calculate that the epicenter of the September 2017 explosion was at 41.3018°N and 129.0696°E. A seismic event that took place about eight minutes after the explosion occurred very close to the explosion epicenter -- less than 200 meters away -- and probably represents the seismic signature of the collapse of a cavity left by the underground explosion, the researchers suggested. Two subsequent seismic events, one on 23 September and one on 12 October, were located about eight kilometers northeast of the nuclear test site. Zhao and colleagues said that the seismic signatures of these two events indicate that they are not explosions, but may have resulted from mechanisms such as landslide or ground collapse. They may also be very shallow natural earthquakes that were triggered by the explosion, they noted, a possibility that will require more research on the pre- and post-explosion stresses on the faults where the events occurred. | Earthquakes | 2,018
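The master-event idea can be sketched in a few lines: grid-search the epicenter offset that best explains the differential arrival times between the calibrated 2006 "master" and the 2017 explosion, after removing any origin-time shift. The station geometry, velocity, and noise below are invented for illustration; the published analysis is far more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(3)

# Station coordinates (km) relative to the master event's known epicenter.
stations = np.array([[300, 50], [-250, 180], [90, -320],
                     [-140, -260], [220, 240]], float)
v = 8.0  # assumed constant P-wave velocity, km/s

def travel_times(src):
    return np.linalg.norm(stations - src, axis=1) / v

# Synthetic "observed" differential times for a target event offset ~3.6 km
# NW of the master, with a little picking noise.
true_offset = np.array([-2.5, 2.6])
dt_obs = travel_times(true_offset) - travel_times(np.zeros(2))
dt_obs += 0.02 * rng.normal(size=len(stations))

# Grid search over epicenter offsets; demeaning the residuals absorbs any
# unknown origin-time difference between the two events.
best, best_misfit = None, np.inf
for x in np.arange(-5, 5, 0.05):
    for y in np.arange(-5, 5, 0.05):
        dt = travel_times(np.array([x, y])) - travel_times(np.zeros(2))
        r = dt_obs - dt
        r -= r.mean()
        m = np.sum(r**2)
        if m < best_misfit:
            best, best_misfit = (x, y), m
print("estimated offset from master event (km):", best)
```

Because only differential times are used, errors in the regional velocity model largely cancel, which is why relative locations can be far more precise than absolute ones.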
September 26, 2018 | https://www.sciencedaily.com/releases/2018/09/180926192044.htm | Plate tectonics may have been active on Earth since the very beginning | A new study suggests that plate tectonics -- a scientific theory that divides Earth into large chunks of crust that move slowly over hot viscous mantle rock -- could have been active from the planet's very beginning. The new findings defy previous beliefs that tectonic plates developed over the course of billions of years. | "Plate tectonics set up the conditions for life," said Nick Dygert, assistant professor of petrology and geochemistry in UT's Department of Earth and Planetary Sciences and coauthor of the study. "The more we know about ancient plate tectonics, the better we can understand how Earth got to be the way it is now." For the research, Dygert and his team looked into the distribution of two very specific noble gas isotopes: Helium-3 and Neon-22. Noble gases are those that don't react to any other chemical element. Previous models have explained Earth's current Helium-3/Neon-22 ratio by arguing that a series of large-scale impacts (like the one that produced our moon) resulted in massive magma oceans, which degassed and incrementally increased the ratio each time. However, Dygert believes the scenario is unlikely. "While there is no conclusive evidence that this didn't happen," he said, "it could have only raised the Earth's Helium-3/Neon-22 ratio under very specific conditions." Instead, Dygert and his team believe the Helium-3/Neon-22 ratio rose in a different way. As Earth's crust is continuously formed, the ratio of helium to neon in the mantle beneath the crust increases. By calculating this ratio in the mantle beneath the crust, and considering how this process would affect the bulk Earth over long periods of time, a rough timeline of Earth's tectonic plate cycling can be established. "Helium-3 and Neon-22 were produced during the formation of the solar system and not by other means," Dygert said. "As such, they provide valuable insight into Earth's earliest conditions and subsequent geologic activity." | Earthquakes | 2,018
September 26, 2018 | https://www.sciencedaily.com/releases/2018/09/180926082734.htm | Researchers map susceptibility to human-made earthquakes | Earthquakes in Oklahoma and Kansas had been on the rise due to injection of wastewater -- a byproduct of oil and gas operations -- before regulations started limiting injections. Now a new model developed by Stanford University researchers incorporates earthquake physics and the Earth's hydrogeologic response to wastewater injection to forecast a decrease in human-made earthquakes in Oklahoma and Kansas through 2020. | The model is based on publicly available data on wastewater injection into the Arbuckle formation, a nearly 7,000-foot-deep sedimentary formation underlying north-central Oklahoma and southern Kansas. Assuming wastewater injection from oil and gas operations continues at its current rate, researchers mapped the likelihood that the region will experience future earthquakes. The research appears Sept. 26. "We've created a detailed model that allows regulators to know where most of the problems are likely to occur," said co-author Mark Zoback, the Benjamin M. Page Professor of Geophysics at Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth). "This can be used in Oklahoma, or elsewhere, to provide a scientific basis for regulatory action." Oil and gas operations often produce large volumes of salty water that they dispose of by injecting it deep underground to protect water in aquifers near the surface used for drinking, livestock and irrigation. Fluid injected into the Arbuckle formation increases pressure that spreads over a large area. This pressure is problematic because it can affect large faults nearby that are already under stress from tectonic processes. These faults are capable of producing widely felt and potentially damaging earthquakes, if reached by the pressure increase caused by injection. The same pressure increase in different areas can cause up to 100 times the number of earthquakes, according to lead author Cornelius Langenbruch, a postdoctoral researcher at Stanford Earth. The earthquakes are not necessarily concentrated in areas where the pressure change is highest. In order to understand where earthquakes will or will not occur on a local scale, the new model evaluates the pressure increase in the context of the area's vulnerable, pre-existing faults. "It was surprising for me to see that the local susceptibility to earthquakes fluctuates by such a large amount," Langenbruch said. "The example of Oklahoma shows that the key to managing seismic hazards related to these human-made, induced earthquakes is managing how much and where the wastewater is injected." Oklahoma's induced earthquakes increased drastically around 2009 and peaked in 2015, with nearly 1,000 widely felt earthquakes spread across the northern and central parts of the state. Oklahoma's public utilities commission, the Oklahoma Corporation Commission, mandated a 40 percent water injection reduction in early 2016 and the number of earthquakes declined thereafter. The new model -- which includes data from 809 injection wells from 2000 through 2018 -- shows there will be a 32 percent, 24 percent and 19 percent probability of potentially damaging earthquakes of magnitude 5.0 or above in 2018, 2019 and 2020, respectively, suggesting that Oklahoma's policies are working.
If current injection practices in north-central Oklahoma and southern Kansas continue, one potentially damaging magnitude 5.0 or larger earthquake is expected to occur through 2020. "The result of the new study is definitely good news -- it shows injection rate reductions are still effective. In 2015 and 2016 the probabilities were as high as 70 percent," Langenbruch said. "However, the problem is that earthquake probabilities in some areas are still much higher than historic rates." The predictive maps from the study allow residents of Oklahoma and Kansas to see the probability that potentially damaging earthquakes will strike close to their homes. The new model can also be used to evaluate future injection scenarios intended to mitigate seismic hazards. "The nice thing about the methodology is not only the predictions it makes, but its ability to make new predictions based on new measures that might be taken by the regulators," Zoback said. "It turns out you can do these analyses fairly early in the process." The researchers hope this model will also be used in other areas with expanding oil and gas operations. Places like the Permian Basin in West Texas are being developed at an incredible rate and water injection is probably resulting in earthquakes in that area already, Zoback said. A co-author of the paper is Matthew Weingarten, an assistant professor at San Diego State University who conducted research for the study as a postdoctoral researcher at Stanford. Zoback is also a senior fellow at Stanford's Precourt Institute for Energy, an affiliate of the Stanford Woods Institute for the Environment and the director of the Stanford Natural Gas Initiative. Funding for the study was provided by the Stanford Center for Induced and Triggered Seismicity (SCITS), an industrial affiliates program involving 10 Stanford professors. | Earthquakes | 2,018
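A heavily simplified sketch of the hydrogeologic ingredient of such models: the pressure field from a continuous point injector in a uniform diffusive medium, using the standard heat-conduction analogue. This is a toy kernel, not the Stanford model; the diffusivity and source strength are assumed.

```python
import numpy as np
from scipy.special import erfc

# Continuous point injection into a uniform diffusive medium (heat-conduction
# analogue): p(r, t) = (Q / (4*pi*D*r)) * erfc(r / sqrt(4*D*t)).
D = 0.5       # hydraulic diffusivity, m^2/s (assumed)
Q = 1.0e-2    # source strength, normalized units (assumed)

def pressure(r_m, t_s):
    return Q / (4 * np.pi * D * r_m) * erfc(r_m / np.sqrt(4 * D * t_s))

year = 3.15e7
for r_km in (2, 5, 10, 20):
    p = pressure(r_km * 1e3, 5 * year)
    print(f"r = {r_km:2d} km after 5 yr: relative pressure {p:.3e}")
```

Superposing such kernels over hundreds of wells with time-varying injection rates gives a regional pressure history, which a physics-based forecast then feeds into an earthquake-nucleation model (for example, a rate-and-state seismicity-rate formulation) together with the mapped density of pre-existing faults.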
September 25, 2018 | https://www.sciencedaily.com/releases/2018/09/180925140407.htm | Seasonal reservoir filling in India deforms rock, may trigger earthquakes | The seasonal filling and emptying of reservoirs in India can cause measurable deformation of the surrounding rock, reducing the strength of nearby faults and potentially triggering earthquakes, according to two new papers. | Researchers in India used global positioning system (GPS) observations along with satellite-based radar data called InSAR to track the movement -- particularly the uplift and subsidence -- of rock layers around the reservoirs. The teams measured crustal deformation around the Koyna and Warna reservoirs in western India, as well as the Tehri, Ukai and Dharoi reservoirs elsewhere in India. The studies are included in the journal's special issue on reservoir-triggered seismicity around the globe, 51 years after a magnitude 6.3 earthquake in the Koyna-Warna region of India became the largest seismic event ever attributed to reservoir filling. Occurring in a region where there had been no previous record of significant seismicity, the Koyna earthquake caused 180 deaths and more than 2000 injuries, and thousands of smaller earthquakes still impact the region today. Reservoir-triggered seismicity is thought to be the result of shifting water levels in the reservoirs impounded behind the dams, which may increase stresses on faults by altering the stress and pore pressure in rocks along the fault. Reservoir filling accounts for 23 percent of the total number of earthquakes caused by human activity, according to the Human-Induced Earthquake Database. Vineet Gahalaut of the National Center for Seismology in India and colleagues used data collected by five Koyna-Warna GPS stations since 2013, along with InSAR observations, to learn more about the underlying tectonic movements in the region. The researchers found that the eastern block of the Koyna-Warna fault zone appears to be moving faster to the northeast compared to the western block, a motion that may be increasing stress on faults in the region. At the same time, seasonal filling and emptying of the Koyna and Warna reservoirs are deforming the surrounding rock layers, said Gahalaut. "All the reservoirs in the Indian subcontinent are fed by precipitation during the monsoon season, which starts during May and continues until August," he explained. "From September onward, the water in the reservoirs starts decreasing due to release for irrigation, electricity generation and urban usage." "It is inferred that seasonal deformation due to reservoir water level changes leads to a reduction in the strength of the critically stressed faults in the Koyna and Warna seismic zones, leading to triggering of earthquakes," said Gahalaut. A reservoir can have a stabilizing or destabilizing influence on nearby faults, depending on the kind of slip along those faults and where they are located with respect to the reservoir. "In the case of the Koyna-Warna reservoir-triggered seismicity, it has been found that the reservoir impoundments decreased the strength of the identified faults in the region," said Gahalaut. "It has also been noticed that earthquakes tend to occur during the filling and high-water stand in the reservoirs when the destabilizing influence is relatively stronger." In a second study, Gahalaut and colleagues quantified crustal deformation related to seasonal filling at Koyna and three other reservoirs in west central and central India: Tehri, Ukai and Dharoi.
The reservoirs are located in different geological zones and are of varying sizes, offering the researchers a chance to look for potential variation in deformation caused by reservoir filling. They found that even the smallest reservoir -- Dharoi, at 0.9 cubic kilometers (roughly equivalent to 730,000 acre feet) -- could cause deformation capable of triggering an earthquake, if there are critically stressed faults in the region. Gahalaut said more GPS observations gathered over longer periods of time will help to quantify the deformation rate around Indian reservoirs, which in turn could help refine our understanding of triggered earthquakes. He said it will also be important to collect more information about the surrounding crustal structure, including changes in the thickness and rock types around the faults, to fully understand how reservoir filling may trigger nearby earthquakes. | Earthquakes | 2,018
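The stabilizing-versus-destabilizing bookkeeping described above is conventionally summarized by the Coulomb failure stress change, a standard relation not reproduced from these papers:

```latex
\Delta \mathrm{CFS} \;=\; \Delta\tau \;+\; \mu \left( \Delta\sigma_n + \Delta p \right)
```

Here Delta tau is the shear-stress change resolved onto the fault's slip direction, Delta sigma_n the normal-stress change (positive in tension, i.e., unclamping), Delta p the pore-pressure change, and mu the friction coefficient. Positive Delta CFS moves a fault toward failure; because reservoir loading and water-level cycling enter through all three terms, the same reservoir can stabilize one fault while destabilizing another, depending on fault geometry and location.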
September 24, 2018 | https://www.sciencedaily.com/releases/2018/09/180924174500.htm | New earthquake risk model could better inform disaster planning | Researchers have developed a new way to model seismic risk, which they hope will better inform disaster risk reduction planning in earthquake-prone areas. | The study describes an approach the team calls 'ensemble modelling', which allows the researchers to estimate whether particular impacts are specific to certain earthquakes, or occur irrespective of the location or magnitude of an earthquake. The team hopes that this method will provide contingency planners with a more complete picture of earthquake risk and potentially help guide the use of limited resources available for earthquake risk reduction. The ensemble modelling method is novel as it goes beyond the standard probabilistic (identifying all possible earthquake scenarios at a given site) and deterministic (worst-case-event) approaches, focusing instead on the impacts of multiple possible earthquake scenarios. Dr Tom Robinson, Durham University Department of Geography, said: "Earthquakes remain one of the deadliest natural hazards in the world and are a significant planning challenge for governments and aid agencies. "Traditional assessments of seismic risk focus primarily on improving understanding of earthquake hazard, in terms of potential ground shaking, but for contingency planning, it is the potential impacts of an earthquake that are of more importance. "Our method provides critical information on the likelihood, and probable scale, of impacts in future earthquakes. We hope this can help better inform how governments and aid agencies direct limited disaster mitigation resources, for example how they distribute resources geographically." The research team hope that the ensemble modelling method will help planners to better understand where risks are greater, for example because of the relative vulnerability of communities, or their location in relation to identified likely earthquake impacts, and direct resources in a more targeted, informed way. As part of their study the research team worked with colleagues at Nepal's National Society of Earthquake Technology to use Nepal as a case study for their modelling approach. Together the team modelled fatalities from 90 different scenario earthquakes and established whether or not the impacts were specific to a certain scenario. Dr Robinson said: "The results showed that for most districts in Nepal similar impacts occurred irrespective of the scenario earthquake and that impacts were typically closer to the minimum rather than the worst-case scenario. "This suggests that planning for the worst-case scenario in Nepal may place an unnecessarily large burden on the limited resources available. "Our results also showed that the most at-risk districts are predominantly in rural western Nepal and that there are around 9.5 million Nepalese people who live in districts that are at a higher seismic risk than the capital, Kathmandu. "Disaster risk reduction planning therefore needs to focus on rural, as well as urban, communities, as our modelling shows they are at higher risk." The results of the case study allow the team to demonstrate that a sole planning focus on urban earthquake risk in Kathmandu could be inappropriate, as many rural populations within Nepal are at greater relative risk. However, the new modelling approach is not only relevant to Nepal and can be applied anywhere, to help inform earthquake disaster risk reduction
planning. | Earthquakes | 2,018 |
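A minimal sketch of the ensemble idea from the study above: simulate impacts for many scenario earthquakes and report the per-district distribution instead of a single worst case. The district names and impact numbers below are synthetic placeholders, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(4)
districts = [f"district_{i}" for i in range(6)]
n_scenarios = 90   # cf. the 90 scenario earthquakes modelled for Nepal

# Stand-in impact model: fatalities per district per scenario (synthetic
# lognormal draws; a real model would combine shaking, exposure and
# vulnerability for each scenario).
impacts = rng.lognormal(mean=3.0, sigma=1.2,
                        size=(n_scenarios, len(districts)))

# Ensemble summary: report the spread of outcomes rather than one worst case.
for j, name in enumerate(districts):
    col = impacts[:, j]
    print(f"{name}: min {col.min():6.0f}  median {np.median(col):6.0f}  "
          f"max {col.max():6.0f}  fraction of scenarios > 100: "
          f"{(col > 100).mean():.2f}")
```

A district whose impacts are similar across most scenarios is scenario-independent -- the paper's key diagnostic -- whereas a district whose losses spike only under a few scenarios calls for scenario-specific planning.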
September 24, 2018 | https://www.sciencedaily.com/releases/2018/09/180924153409.htm | North Korea's 2017 bomb test set off later earthquakes, new analysis finds | Using newly refined analysis methods, scientists have discovered that a North Korean nuclear bomb test last fall set off aftershocks over a period of eight months. The shocks, which occurred on a previously unmapped nearby fault, are a window into both the physics of nuclear explosions and how natural earthquakes can be triggered. The findings are described in two papers just published online. | The September 3, 2017 underground test was North Korea's sixth, and by far largest yet, yielding some 250 kilotons, or about 17 times the size of the bomb that destroyed Hiroshima. Many experts believe the device was a hydrogen bomb -- if true, a significant advance from cruder atomic devices the regime previously exploded. The explosion itself produced a magnitude 6.3 earthquake. This was followed 8.5 minutes later by a magnitude 4 quake, apparently created when an area above the test site on the country's Mt. Mantap collapsed into an underground cavity occupied by the bomb. The test and collapse were picked up by seismometers around the world and widely reported at the time. But later, without fanfare, seismic stations run by China, South Korea and the United States picked up 10 smaller shocks, all apparently scattered within 5 or 10 kilometers around the test site. The first two came on Sept. 23, 2017; the most recent was April 22, 2018. Scientists assumed the bomb had shaken up the earth, and it was taking a while to settle back down. "It's not likely that there would be so many events in that small area over a small period of time," said the lead author of one of the studies, Won-Young Kim, a seismologist at Columbia University's Lamont-Doherty Earth Observatory. "These are probably triggered due to the explosion." After looking at the series of aftershock reports, Kim's group sifted more closely through the data and spotted three other aftershocks that had not previously been recognized, for a total of 13. The tremors were all modest, between magnitude 2.1 and 3.4, and almost certainly harmless. In the past they would have been hard to pick out using far-off seismometers, he said. However, under new international cooperation agreements, he and colleagues obtained recordings from relatively nearby instruments including ones in Ussuriysk, Russia, a borehole in South Korea, and Mudanjiang, northeast China. The group then used a new analysis method developed in part by Lamont seismologist David Schaff that looks at seismic waves that are much lower frequency and slower-moving than those used in conventional earthquake analyses. These slow-moving waves allowed Schaff and the rest of the team to pinpoint the locations of the quakes with far greater precision than with conventional recordings. Instead of the random scatter initially seen, the quake locations lined up in a neat 700-meter-long row about 5 kilometers northwest of the blast -- indication of a hidden fracture. Seismometers have long been routinely used to verify nuclear test treaties, and scientists have become increasingly confident that they can detect even small tests and distinguish them from natural earthquakes. But the link between explosions and subsequent quakes is less studied. Seismologists documented a handful of apparent aftershocks near a Nevada test site in the 1970s, and near a Soviet test site in Kazakhstan in 1989.
However, they were not able to pinpoint the locations of these quakes with the technology then available. With more instruments and the new analysis method, "now we can see everything," said Paul Richards, a Lamont seismologist who coauthored the papers. "It's a radical improvement in cataloging even tiny, tiny earthquakes. It shows not just what we can do with natural earthquakes, but that we can monitor what the North Koreans are doing. North Korea can't do anything at all now [in secret] and expect to get away with it." Richards said the exact location of tiny quakes could also help in the so far largely fruitless quest by some seismologists to predict bigger quakes. Richards did not assert that quakes could eventually be predicted, but said, "If you're ever going to do this, you have to understand locations, and how one earthquake affects its neighbors." This spring, the North Koreans made a show of blowing up part of the Mt. Mantap site, though it may already have become largely unusable due to the destruction caused by previous explosions. And no nuclear tests have been detected since North Korean leader Kim Jong Un and U.S. president Donald Trump met in June to discuss ending North Korea's tests. However, despite boasts by Trump that North Korea's program has been neutralized, U.S. diplomats have noted evidence suggesting that the North continues to quietly develop its weapons. Lamont scientists have studied previous North Korean tests, including ones in 2013 and 2009; they concluded that a reported test in 2010 was a false alarm. The current studies were coauthored by Eunyoung Jo and Yonggyu Ryoo of the Korea Meteorological Administration. | Earthquakes | 2,018
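The precision gain described above rests on measuring differential arrival times between events recorded at a common station. A toy version with synthetic waveforms, using cross-correlation plus parabolic interpolation for subsample timing (the published method differs in detail):

```python
import numpy as np

fs = 50.0
t = np.arange(0, 40, 1 / fs)
rng = np.random.default_rng(5)

def wavelet(t0):
    # Simple synthetic arrival: a Gaussian-windowed low-frequency oscillation.
    return np.exp(-((t - t0) / 1.5) ** 2) * np.sin(2 * np.pi * 0.8 * (t - t0))

# Two events recorded at the same station; true differential time 0.34 s.
w1 = wavelet(15.0) + 0.05 * rng.normal(size=t.size)
w2 = wavelet(15.34) + 0.05 * rng.normal(size=t.size)

# Full cross-correlation; the lag of the peak is the differential travel time.
cc = np.correlate(w2, w1, mode="full")
lags = (np.arange(cc.size) - (t.size - 1)) / fs
k = np.argmax(cc)

# Parabolic interpolation around the peak gives subsample precision.
y0, y1, y2 = cc[k - 1], cc[k], cc[k + 1]
frac = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
print(f"differential time: {lags[k] + frac / fs:.3f} s (true 0.340)")
```

Collecting such differential times across several stations and inverting them jointly is what turns an apparently random scatter of epicenters into the tight 700-meter alignment reported above.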
September 18, 2018 | https://www.sciencedaily.com/releases/2018/09/180918131713.htm | Geoscientists find unexpected 'deep creep' near San Andreas, San Jacinto faults | A new analysis of thousands of very small earthquakes that have occurred in the San Bernardino basin near the San Andreas and San Jacinto faults suggests that the unusual deformation of some -- they move in a different way than expected -- may be due to "deep creep" 10 km below the Earth's surface, say geoscientists at the University of Massachusetts Amherst. | The new understanding should support more refined assessments of fault loading and earthquake rupture risk in the region, they add. Writing in the current online issue, Cooke says, "These little earthquakes are a really rich data set to work with, and going forward if we pay more attention than we have in the past to the details they are telling us, we can learn more about active fault behavior that will help us better understand the loading that leads up to large damaging earthquakes." Over the past 36 years, the authors point out, seismic stations have recorded the style of deformation for thousands of small earthquakes in California's San Bernardino basin. They state, "Findings of this study demonstrate that small earthquakes that occur adjacent to and between faults can have very different style of deformation than the large ground rupturing earthquakes produced along active faults. This means that scientists should not use the information recorded by these small earthquakes in the San Bernardino basin to predict loading of the nearby San Andreas and San Jacinto faults." Cooke explains that the usual type of fault in the region is called a strike-slip fault, where the motion is one of blocks sliding past each other. The less common kind, with "anomalous slip-sense," is an extending fault, where the motion between blocks is like a wave pulling away from the beach, one block dropping at an angle away from the other, "extending" the fault. "These only occur in this one small area, and nobody knew why," she points out. "We did the modeling that helps to explain the enigmatic data." This is an area where Cooke, an expert in 3D fault modeling, has done research of her own and where she is familiar with the broader research field, so she decided to try to model what is happening. She began with a hypothesis based on her earlier 3D modeling in the area that had replicated long-term deformation over thousands of years. "I noticed that this basin was in extension in those models unlike the surrounding regions of strike-slip," she says. "The extension was limited to within the basin just like the pattern of the anomalous extensional earthquakes. That gave me a clue that maybe those faults weren't locked as they should be between big earthquakes, but that at depths below 10 km, they were creeping." "The typical way we look for creep is to use GPS stations set up on each side of the fault. Over time, you can note that there is movement; the faults are creeping slowly apart. The problem here is that the San Andreas and the San Jacinto faults are so close together that the GPS is unable to resolve if there is creep or not. That's why no one had seen this before. The traditional way to detect it was not able to do so." Cooke adds, "In this paper we've shown that there is a way to have these weird tiny earthquakes all the time next to the San Jacinto Fault below 10 km, which is where deep creep may be happening. We show that it's plausible and can account for nearby enigmatic earthquakes.
The model may not be perfectly correct, but it's consistent with observations." As noted, this work has implications for assessing fault loading, Beyer and Cooke point out. Until now, seismologists have assumed that faults in the region are locked -- no creep is taking place -- and they use data from all the little earthquakes to infer loading on the primary faults. However, Cooke and Beyer write, "scientists should not use the information recorded by these small earthquakes in the San Bernardino basin to predict loading of the nearby San Andreas and San Jacinto faults." Cooke adds, "Our earthquake catalog is growing every year; we can see smaller and smaller ones every year, so we thought why not take advantage of the networks we've built and we can look at them in more detail. We don't want to wait around for the faults to move in a damaging earthquake, we want to take advantage of all the tinier earthquakes happening all the time in order to understand how the San Andreas and San Jacinto are loaded. If we can understand how they are being loaded maybe we can understand better when these faults are going to rupture." This research was supported by the Southern California Earthquake Center, which is funded by cooperative agreements with the National Science Foundation and U.S. Geological Survey. | Earthquakes | 2,018
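Why the GPS network cannot separate the two faults can be illustrated with the classic Savage-Burford interseismic model, v(x) = (s/pi) * arctan(x/D), for a strike-slip fault creeping below locking depth D at deep slip rate s. The slip rates, locking depths, and 15 km fault separation below are assumed round numbers, not values from the study.

```python
import numpy as np

# Surface velocity across a strike-slip fault creeping below locking depth D
# at deep slip rate s (classic screw-dislocation model):
#     v(x) = (s / pi) * arctan(x / D)
def v_profile(x_km, s_mm_yr, D_km, x0_km=0.0):
    return (s_mm_yr / np.pi) * np.arctan((x_km - x0_km) / D_km)

x = np.linspace(-60, 60, 7)   # GPS sites spanning the fault zone, km

# Two nearby faults (assumed ~15 km apart, as a stand-in for the San Jacinto
# and San Andreas here) versus a single fault carrying the combined slip:
two_faults = v_profile(x, 12, 10, x0_km=-7.5) + v_profile(x, 12, 10, x0_km=7.5)
one_fault = v_profile(x, 24, 12, x0_km=0.0)

for xi, a, b in zip(x, two_faults, one_fault):
    print(f"x = {xi:6.1f} km   two-fault {a:6.2f}   one-fault {b:6.2f} mm/yr")
```

The two velocity columns differ by well under typical GPS uncertainty, which is why geodesy alone cannot say whether one of the pair is creeping at depth -- and why the focal mechanisms of the tiny basin earthquakes carry such useful extra information.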
September 4, 2018 | https://www.sciencedaily.com/releases/2018/09/180904093838.htm | New imagery solves mystery of why Mount St. Helens is out of line with other volcanoes | Some of the clearest, most comprehensive images of the top several miles of the Earth's crust have helped scientists solve the mystery of why Mount St. Helens is located outside the main line of the Cascade Arc of volcanoes. | A giant subsurface rock formation some 20-30 miles in diameter, known as the Spirit Lake batholith, appears to have diverted magma and partially melted rock outside of the arc and to the west, forming the region's most active volcano. Results of the study, which was supported by the National Science Foundation and carried out in collaboration with the U.S. Geological Survey, are being published this week. Previous imaging studies have primarily utilized seismic methods. During natural earthquakes and artificially induced tremors -- by setting off explosions -- scientists can image some of the properties of subsurface rocks by tracking the sound waves. This method provides clues to the structure, density and temperature of the rocks. More recently, researchers have been using "magnetotelluric," or MT, data, which measures the Earth's subsurface electrical conductivity. Variations in the geomagnetic and geoelectric fields can reveal much about the subsurface structure and temperature, as well as the presence of fluids such as magma. "Either method by itself can lead to a level of uncertainty, but when you layer them together as we have done in this project you get a much clearer picture of what lies below," said Adam Schultz, an Oregon State University geophysicist who is principal investigator on the NSF grant to OSU and co-author on the paper. "The longer you run the measurements, the crisper the images and the deeper you can 'see' the subsurface. We were focusing on the upper 12-15 kilometers of the crust, but with a longer experiment we could see 200 to 300 kilometers below the surface." Understanding the formation of Mount St. Helens begins with plate tectonics. Similar to the present day, where the Juan de Fuca plate is being subducted beneath North America, in the past crustal blocks with marine sediments were "slammed into the continent, where they accreted," Schultz said. "This material is more permeable than surrounding rock and allows the magma to move through it," he noted. "The big batholith acts kind of like a plug in the crust and diverted magma that normally would have erupted in line with the other major Cascade volcanoes, resulting in St. Helens forming to the west of the Cascadia Arc, and Mt. Adams slightly to the east." Mount St. Helens experienced a major eruption in May of 1980 and since has gone through periods of dome-building (2004-08) and dormancy. A study in 2006 by researchers from the University of Canterbury in New Zealand provided some images of the volcano's subsurface. During the next year, Schultz and the author of the 2006 study will use magnetotelluric technology to gather new and hopefully crisper images to see how much has changed since that study. Schultz said that the images from the latest study are clear enough that by continuously monitoring the geoelectric and geomagnetic fields, they may be able to detect changes in the movement of magma beneath Mount St. Helens, and perhaps other volcanoes. "This may give us a new tool to monitor the magma cycle so we don't have to wait for the dome-building phase to tell us conditions are changing," Schultz said. | Earthquakes | 2,018
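The basic magnetotelluric measurement can be sketched from standard formulas (not this study's inversion): the impedance Z = E/H estimated from orthogonal electric and magnetic field records gives an apparent resistivity rho_a = |Z|^2 / (omega * mu0). Everything below is synthetic and, for simplicity, ignores the 45-degree phase a real half-space impedance carries.

```python
import numpy as np

mu0 = 4e-7 * np.pi

fs = 10.0                      # sampling rate, Hz
t = np.arange(0, 4096) / fs
rng = np.random.default_rng(6)

rho_true = 100.0               # ohm-m uniform half-space (assumed)
f0 = 0.1                       # analysis frequency, Hz
w = 2 * np.pi * f0
Z_mag = np.sqrt(rho_true * w * mu0)   # |Z| for a uniform half-space

# Synthetic orthogonal records: magnetic field B (tesla) and the electric
# field E (V/m) it induces, each with a little instrument noise.
B = 1e-9 * np.sin(w * t) + 1e-11 * rng.normal(size=t.size)
E = Z_mag * (B / mu0) + 1e-8 * rng.normal(size=t.size)

# Spectral ratio at f0 -> impedance -> apparent resistivity.
Ef, Bf = np.fft.rfft(E), np.fft.rfft(B)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
k = np.argmin(np.abs(freqs - f0))
Z_est = Ef[k] / (Bf[k] / mu0)
rho_a = np.abs(Z_est) ** 2 / (w * mu0)
print(f"apparent resistivity: {rho_a:.1f} ohm-m (true {rho_true})")
```

Repeating the estimate across a band of frequencies probes progressively greater depths (lower frequency, deeper penetration), which is why longer recording campaigns see deeper, as Schultz notes above.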
August 30, 2018 | https://www.sciencedaily.com/releases/2018/08/180830143220.htm | Injection wells can induce earthquakes miles away from the well | A study of earthquakes induced by injecting fluids deep underground has revealed surprising patterns, suggesting that current recommendations for hydraulic fracturing, wastewater disposal, and geothermal wells may need to be revised. | Researchers at UC Santa Cruz compiled and analyzed data from around the world for earthquakes clearly associated with injection wells. They found that a single injection well can cause earthquakes at distances more than 6 miles (10 kilometers) from the well. They also found that, in general, injecting fluids into sedimentary rock can cause larger, more distant earthquakes than injecting into the underlying basement rock. "This is problematic, since the current advice is to preferentially inject into the sedimentary sequence as a theoretically safer alternative to the basement rock," said Emily Brodsky, professor of Earth and planetary sciences at UC Santa Cruz. Postdoctoral researcher Thomas Goebel said the key issue is the spatial footprint of induced seismicity around the injection well. "It's not that the basement rock is safe, because there is still the possibility of encountering a fault in the basement rock that can cause a large earthquake, but the probability is reduced because the spatial footprint is smaller," he said. The researchers report their findings in a paper published August 31. The physical mechanism by which injection wells induce earthquakes was thought to be a direct result of increased fluid pressure in the pores of the rock, causing faults to slip more easily. This mechanism can account for the spatial pattern of seismicity seen with injection into basement rock, Goebel said. But the pattern seen with injection into sedimentary rock suggests a different mechanism resulting from efficient "poroelastic coupling," which controls the ability of the rock to transmit fluid stresses into the solid rock matrix. "When you inject water into the ground, it pushes on the surrounding rock and creates elastic stress in the rock, which can put pressure on faults at a distance without putting water into those faults. So if poroelasticity is dominant, you end up with a larger footprint because it's loading neighboring faults beyond the area of increased pore pressure," Brodsky said. According to Goebel, the crystalline basement rock is stiffer and has lower porosity than sedimentary rock. "Therefore, the increase in pore pressure is limited to isolated pockets around the well, and the coupling of that with the overall stress field is low," he said. Goebel said their findings help explain the extent of induced seismicity in regions such as Oklahoma where there are many injection sites in oil and gas fields. Oklahoma has seen a dramatic surge in earthquakes since 2010, to the extent that there are now more earthquakes each year in Oklahoma than in California. Goebel and Brodsky did not include sites in Oklahoma in their study, however, because there are so many injection wells they couldn't isolate the effects of individual wells. "In Oklahoma, they are injecting into the high-porosity sedimentary unit above the basement, but these elastic stresses can be transmitted over a large distance, so you could activate a large basement fault at a distance of 10 kilometers," Goebel said. "That may be what we're seeing in places like Oklahoma." | Earthquakes | 2,018
August 29, 2018 | https://www.sciencedaily.com/releases/2018/08/180829143753.htm | Earthquakes: Attacking aftershocks | In the weeks and months following a major earthquake, the surrounding area is often wracked by powerful aftershocks that can leave an already damaged community reeling and significantly hamper recovery efforts. | While scientists have developed empirical laws, like Båth's Law and Omori's Law, to describe the likely size and timing of those aftershocks, methods for forecasting their location have been harder to grasp. But sparked by a suggestion from researchers at Google, Brendan Meade, a Professor of Earth and Planetary Sciences, and Phoebe DeVries, a post-doctoral fellow working in his lab, are using artificial intelligence technology to try to get a handle on the problem. Using deep learning algorithms, the pair analyzed a database of earthquakes from around the world to try to predict where aftershocks might occur, and developed a system that, while still imprecise, was able to forecast aftershocks significantly better than random assignment. The work is described in an August 30 paper. "There are three things you want to know about earthquakes -- you want to know when they are going to occur, how big they're going to be and where they're going to be," Meade said. "Prior to this work we had empirical laws for when they would occur and how big they were going to be, and now we're working the third leg, where they might occur." "I'm very excited for the potential for machine learning going forward with these kind of problems -- it's a very important problem to go after," DeVries said. "Aftershock forecasting in particular is a challenge that's well-suited to machine learning because there are so many physical phenomena that could influence aftershock behavior and machine learning is extremely good at teasing out those relationships. I think we've really just scratched the surface of what could be done with aftershock forecasting...and that's really exciting." The notion of using artificially intelligent neural networks to try to predict aftershocks first came up several years ago, during the first of Meade's two sabbaticals at Google in Cambridge. While working on a related problem with a team of researchers, Meade said, a colleague suggested that the then-emerging "deep learning" algorithms might make the problem more tractable. Meade would later partner with DeVries, who had been using neural networks to transform high performance computing code into algorithms that could run on a laptop, to focus on aftershocks. "The goal is to complete the picture and we hope we've contributed to that," Meade said. To do it, Meade and DeVries began by accessing a database of observations made following more than 199 major earthquakes. "After earthquakes of magnitude 5 or larger, people spend a great deal of time mapping which part of the fault slipped and how much it moved," Meade said. "Many studies might use observations from one or two earthquakes, but we used the whole database...and we combined it with a physics-based model of how the Earth will be stressed and strained after the earthquake, with the idea being that the stresses and strains caused by the main shock may be what trigger the aftershocks." Armed with that information, they then separated the area around each earthquake into 5-kilometer-square grids.
In each grid, the system checks whether there was an aftershock, and asks the neural network to look for correlations between locations where aftershocks occurred and the stresses generated by the main earthquake. "The question is what combination of factors might be predictive," Meade said. "There are many theories, but one thing this paper does is clearly upend the most dominant theory -- it shows it has negligible predictive power, and it instead comes up with one that has significantly better predictive power." What the system pointed to, Meade said, is a quantity known as the second invariant of the deviatoric stress tensor -- better known simply as J2. "This is a quantity that occurs in metallurgy and other theories, but has never been popular in earthquake science," Meade said. "But what that means is the neural network didn't come up with something crazy, it came up with something that was highly interpretable. It was able to identify what physics we should be looking at, which is pretty cool." That interpretability, DeVries said, is critical because artificial intelligence systems have long been viewed by many scientists as black boxes that produce an answer from data without revealing how they arrived at it. "This was one of the most important steps in our process," she said. "When we first trained the neural network, we noticed it did pretty well at predicting the locations of aftershocks, but we thought it would be important if we could interpret what factors it was finding were important or useful for that forecast." Taking on such a challenge with highly complex real-world data, however, would be a daunting task, so the pair instead asked the system to create forecasts for synthetic, highly-idealized earthquakes and then examined the predictions. "We looked at the output of the neural network and then we looked at what we would expect if different quantities controlled aftershock forecasting," she said. "By comparing them spatially, we were able to show that J2 seems to be important in forecasting." And because the network was trained using earthquakes and aftershocks from around the globe, Meade said, the resulting system worked for many different types of faults. "Faults in different parts of the world have different geometry," Meade said. "In California, most are strike-slip faults, but in other places, like Japan, they have very shallow subduction zones. But what's cool about this system is you can train it on one, and it will predict on the other, so it's really generalizable." "We're still a long way from actually being able to forecast them," she said. "We're a very long way from doing it in any real-time sense, but I think machine learning has huge potential here." Going forward, Meade said, he is working on efforts to predict the magnitude of earthquakes themselves using artificial intelligence technology with the goal of one day helping to prevent the devastating impacts of the disasters. "Orthodox seismologists are largely pathologists," Meade said. "They study what happens after the catastrophic event. I don't want to do that -- I want to be an epidemiologist. I want to understand the triggers, causes and transfers that lead to these events." Ultimately, Meade said, the study serves to highlight the potential for deep learning algorithms to answer questions that -- until recently -- scientists barely knew how to ask. "I think there's a quiet revolution in thinking about earthquake prediction," he said. "It's not an idea that's totally out there anymore.
And while this result is interesting, I think this is part of a revolution in general about rebuilding all of science in the artificial intelligence era." "Problems that are dauntingly hard are extremely accessible these days," he continued. "That's not just due to computing power -- the scientific community is going to benefit tremendously from this because...AI sounds extremely daunting, but it's actually not. It's an extraordinarily democratizing type of computing, and I think a lot of people are beginning to get that." | Earthquakes | 2,018
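For the curious, the quantity the network singled out is easy to compute. A small helper using the standard definition J2 = (1/2) s_ij s_ij, with s the deviatoric part of the stress tensor (the example stress values are invented):

```python
import numpy as np

def j2_invariant(sigma):
    """Second invariant of the deviatoric stress tensor, J2 = 0.5 * s_ij s_ij,
    where s = sigma - (tr(sigma)/3) * I. Standard definition from plasticity
    theory; units are stress squared."""
    sigma = np.asarray(sigma, dtype=float)
    s = sigma - np.trace(sigma) / 3.0 * np.eye(3)
    return 0.5 * np.tensordot(s, s)   # double contraction s_ij * s_ij

# Example: a mainshock stress change (MPa) at one 5 km grid cell (assumed).
d_sigma = np.array([[ 1.2, -0.4,  0.1],
                    [-0.4, -0.8,  0.3],
                    [ 0.1,  0.3, -0.4]])
print(f"J2 = {j2_invariant(d_sigma):.3f} MPa^2")
```

Per the paper's findings, mapping J2 of the mainshock stress change over the grid serves as a surprisingly strong single-feature predictor of where aftershocks cluster.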
August 23, 2018 | https://www.sciencedaily.com/releases/2018/08/180823095952.htm | Research into deadly 2016 Italian earthquakes could improve future seismic forecasts | The timing and size of three deadly earthquakes that struck Italy in 2016 may have been pre-determined, according to new research that could improve future earthquake forecasts. | A joint British-Italian team of geologists and seismologists have shown that the clustering of the three quakes might have been caused by the arrangement of a cross-cutting network of underground faults. The findings show that although all three earthquakes occurred on the same major fault, several smaller faults prevented a single massive earthquake from occurring instead and also acted as pathways for naturally occurring fluids that triggered later earthquakes. The three earthquakes in the cluster, termed a "seismic sequence" by seismologists, each had magnitudes greater than six and killed more than 300 people in Italy's Apennine mountains between 24 August and 30 October 2016. The research, led by Durham University, UK, comes ahead of the second anniversary of the start of the earthquake sequence. The study has now been published. The researchers say the findings could have wider implications for the study of seismic hazards, enabling scientists to better understand potential earthquake sequences following a quake. Dr Richard Walters, Assistant Professor in the Department of Earth Sciences, Durham University, said: "These results address a long-standing mystery in earthquake science -- why a major fault system sometimes fails in a single large earthquake that ruptures its entire length, versus failing in multiple smaller earthquakes drawn-out over months or years. "Our results imply that even though we couldn't have predicted when the earthquake sequence would start, once it got going, both the size and timing of the major earthquakes may have been pre-determined by the arrangement of faults at depth. "This is all information we could hypothetically know before the event, and therefore, this could be a hugely important avenue for improving future earthquake forecasts." Dr Walters and the team used satellite data to estimate which part of the fault failed in each earthquake, and compared this pattern with the location and timing of thousands of tiny aftershocks throughout the seismic sequence. They found that intersections of small faults with the main fault system separated each of the three largest earthquakes, strongly suggesting these intersections stop the growth of each earthquake and prevent the faults failing in a single large event. But in addition, the scientists also found that after the first earthquake, thousands of aftershocks crept northwards along these same fault intersections at a rate of around 100 metres per day, in a manner consistent with naturally occurring water and gas being pumped along the faults by the first earthquake on 24 August, 2016. The second earthquake, on 26 October, occurred exactly when these fluids reached its location, therefore controlling the relative timing of failure. Dr Walters added: "It was a big surprise that these relatively small faults were having such a huge influence over the whole sequence. "They stop the first earthquake in its tracks, and then they channel the fluids that start the sequence up again months later.
No-one's ever seen this before." Co-author Dr Laura Gregory, in the School of Earth and Environment, at the University of Leeds, UK, said it was important to understand whether or not a fault fails in a seismic sequence, and that the team's results were only made possible by combining a varied array of different datasets. Dr Gregory said: "A seismic sequence has vastly different implications for seismic hazard compared to a single large earthquake. If the faults in Italy in 2016 had failed together in one big event, the impact on the local population would have been much worse. "This is the first time we've ever had this quality of modern data over one of these earthquake sequences, and bringing together a range of specialists was key for unpicking how the earthquakes related to one another. "I was scrambling over the mountainside immediately after each earthquake with British and Italian colleagues, measuring the metre-high cliffs that had suddenly formed. Meanwhile, other members of our team were analysing data from seismometers stationed around the world, or were mapping the tiny bending of the ground around the faults using satellites orbiting the planet at 500 miles altitude." The research was partly supported by the UK's Natural Environment Research Council, via an Urgency Grant, and through the Centre for the Observation and Modelling of Earthquakes, Volcanoes and Tectonics (COMET). | Earthquakes | 2,018
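The roughly 100 metres per day migration reported above invites a standard back-of-envelope check used for fluid-driven seismicity: if the triggering front follows r(t) = sqrt(4*pi*D*t) (a Shapiro-style diffusion front), the observations imply a hydraulic diffusivity D. The distances and times below are assumed for illustration, not the paper's data.

```python
import numpy as np

# Triggering-front diagnostic for fluid-driven seismicity:
#     r(t) = sqrt(4 * pi * D * t)   =>   D = r^2 / (4 * pi * t)
day = 86400.0
obs = [(10, 1.0e3), (30, 1.7e3), (60, 2.4e3)]   # (days, front distance m)

for days, r in obs:
    t = days * day
    D = r**2 / (4 * np.pi * t)
    print(f"t = {days:2d} d, r = {r/1e3:.1f} km  ->  D ~ {D:.2f} m^2/s")
```

Roughly constant D across the observation times (here ~0.09 m^2/s, a plausible crustal value) is what supports a diffusive, fluid-driven interpretation of the migrating aftershocks, as opposed to, say, slow aseismic slip.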
August 22, 2018 | https://www.sciencedaily.com/releases/2018/08/180822150815.htm | Seismic noise tracks water levels in underground aquifers | Seismic noise -- the low-level vibrations caused by everything from subway trains to waves crashing on the beach -- is most often something seismologists work to avoid. They factor it out of models and create algorithms aimed at eliminating it so they can identify the signals of earthquakes. | But Tim Clements thinks it might be a tool to monitor one of the most precious resources in the world -- water. A graduate student working in the lab of Assistant Professor of Earth and Planetary Sciences Marine Denolle, Clements is the lead author of a recent study that used seismic noise to measure the size and the water levels in underground aquifers in California. The technique could even be used to track whether and how aquifers rebound following precipitation, and understand geological changes that might occur as water is pumped out. The study is described in a recently published paper in "The way this would commonly be done today would be to take a measurement at a groundwater well," Clements said. "And if you have a network of those wells, you can develop a model where you assume a number of hydrological parameters...and that allows you to measure the health of the aquifer. "But what we showed is we can just directly measure these waves that are travelling through the entire aquifer," he continued. "So we don't have to make those assumptions, because we can directly measure the waves." Using those measurements, researchers were able to measure the water depth of the San Gabriel Valley aquifer, located just outside Los Angeles, to within a centimeter. Efforts to measure the size of the aquifer were limited by the existing seismic network, Clements said, and so were accurate only to about a kilometer. "That gives us a way to begin thinking about volume," Denolle said. "What we found is that using this method the volume we calculated as having been pumped out of the aquifer equaled the volume that was published." "We estimated it at about half a cubic kilometer," Clements said. "And that's exactly what the San Gabriel water master said they pumped out during the drought to meet demand." That drought, Clements said, was one reason researchers chose to focus on the San Gabriel Valley. "They had experienced a massive drought over the last five years, and there are over 1 million people who live in this relatively small area outside Los Angeles who depend on the groundwater for all their water-use needs," he said. "Over the past five years, they had lost a large amount of ground water, and there's a large financial cost to that, so our goal was to understand if we can use seismic waves to understand what's happening with the aquifer." The region is also already equipped with a network of seismographs, he said, making it relatively easy to obtain seismic noise data and use it to examine the aquifer. While the study wasn't the first to hit upon the idea of using seismic noise to study groundwater, Denolle said earlier efforts were hampered because they relied on a signal that was relatively weak in comparison to environmental factors like temperature and pressure. "This was a large signal we looked at," she said.
"The aquifer oscillated with 20 meters of water-height changes in a couple years, so it's a bigger signal than any environmental influence."The system could also be a useful tool for anyone involved in water resource management, Clements said, because it can give them a moment-to-moment view of precisely what is happening in an underground aquifer."This could be used for water management," Clements said. "In this study, we looked at about 17 years of data, from 2000 to 2017, but going forward this could be used in a water management application, so you could get a picture of what's happening with the aquifer on a daily basis."Aside from providing groundwater measurements, the technique can also be used to monitor the health of an aquifer over time."If we had the data, we may be able to use this technology to look back at what aquifers looked like the past and study the long-term evolution of an aquifer," Denolle said. "One of the challenges for people who manage water resources is whether aquifers still respond elastically, meaning can we recharge it with the same storagage capacity or is it losing capacity over time as we pump water out? Using seismic waves, we can potentially find out whether these aquifers are elastic or not."Going forward, Clements said, he plans to pursue ways to improve the resolution of the system at both the micro and macro levels.Working in collaboration with faculty at Tufts University, he installed wells and seismometers on campus to track changes as groundwater is pumped to the surface to irrigate sports fields. Other efforts are focused on using the existing seismometer network in California to improve ways to measure the overall size of aquifers.This research was supported with funding from Harvard University. | Earthquakes | 2,018 |
August 22, 2018 | https://www.sciencedaily.com/releases/2018/08/180822141037.htm | A milestone for forecasting earthquake hazards | Earthquakes pose a profound danger to people and cities worldwide, but with the right hazard-mitigation efforts, from stricter building requirements to careful zoning, the potential for catastrophic collapses of roads and buildings and loss of human lives can be limited. | All of these measures depend on science delivering high-quality seismic hazard models. And yet, current models depend on a list of uncertain assumptions, with predictions that are difficult to test in the real world due to the long intervals between big earthquakes. Now, a team of researchers from Columbia University's Lamont-Doherty Earth Observatory, the University of Southern California, the University of California at Riverside and the U.S. Geological Survey has come up with a physics-based model that marks a turning point in earthquake forecasting. Their results appear in the new issue of "Whether a big earthquake happens next week or 10 years from now, engineers need to build for the long run," says the study's lead author, Bruce Shaw, a geophysicist at Lamont-Doherty. "We now have a physical model that tells us what the long-term hazards are." Simulating nearly 500,000 years of California earthquakes on a supercomputer, researchers were able to match hazard estimates from the state's leading statistical model based on a hundred years of instrumental data. The mutually validating results add support for California's current hazard projections, which help to set insurance rates and building design standards across the state. The results also suggest a growing role for physics-based models in forecasting earthquake hazard and evaluating competing models in California and other earthquake-prone regions. The earthquake simulator used in the study, RSQSim, simplifies California's statistical model by eliminating many of the assumptions that go into estimating the likelihood of an earthquake of a certain size hitting a specific region. The researchers, in fact, were surprised when the simulator, programmed with relatively basic physics, was able to reproduce estimates from a model that has improved steadily for decades. "This shows our simulator is ready for prime time," says Shaw. Seismologists can now use RSQSim to test the statistical model's region-specific predictions. Accurate hazard estimates are especially important to government regulators in high-risk cities like Los Angeles and San Francisco, who write and revise building codes based on the latest science. In a state with a severe housing shortage, regulators are under pressure to make buildings strong enough to withstand heavy shaking while keeping construction costs down. A second tool to confirm hazard estimates gives the numbers added credibility. "If you can get similar results with different techniques, that builds confidence you're doing something right," says study coauthor Tom Jordan, a geophysicist at USC. A hallmark of the simulator is its use of rate- and state-dependent friction to approximate how real-world faults break and transfer stress to other faults, sometimes setting off even bigger quakes (a minimal sketch of this friction law follows this record). Developed at UC Riverside more than a decade ago, and refined further in the current study, RSQSim is the first physics-based model to replicate California's most recent rupture forecast, UCERF3.
When results from both models were fed into California's statistical ground-shaking model, they came up with similar hazard profiles. John Vidale, director of the Southern California Earthquake Center, which helped fund the study, says the new model has created a realistic 500,000-year history of earthquakes along California's faults for researchers to explore. Vidale predicted the model would improve as computing power grows and more physics are added to the software. "Details such as earthquakes in unexpected places, the evolution of earthquake faults over geological time, and the viscous flow deep under the tectonic plates are not yet built in," he said. The researchers plan to use the model to learn more about aftershocks, and how they unfold on California's faults, and to study other fault systems globally. They are also working on incorporating the simulator into a physics-based ground-motion model, called CyberShake, to see if it can reproduce shaking estimates from the current statistical model. "As we improve the physics in our simulations and computers become more powerful, we will better understand where and when the really destructive earthquakes are likely to strike," says study coauthor Kevin Milner, a researcher at USC. | Earthquakes | 2018 |
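The rate- and state-dependent friction at the heart of RSQSim can be written down compactly. The sketch below uses the standard Dieterich "aging-law" formulation with generic laboratory-scale parameters (MU0, A, B, V0, DC are illustrative choices, not values from RSQSim or the paper); the point is the velocity-weakening behavior (b > a) that lets modeled faults nucleate earthquakes:

```python
import numpy as np

# Dieterich rate-and-state friction:
#   mu = mu0 + a*ln(V/V0) + b*ln(V0*theta/Dc),   d(theta)/dt = 1 - V*theta/Dc
# At steady state theta_ss = Dc/V, so mu_ss = mu0 + (a - b)*ln(V/V0):
# with b > a, friction drops as slip accelerates (velocity weakening).
MU0, A, B = 0.6, 0.010, 0.015   # illustrative lab-scale values
V0, DC = 1e-6, 1e-5             # reference slip rate (m/s), slip distance (m)

def friction(v, theta):
    """Friction coefficient at slip rate v (m/s) and state variable theta (s)."""
    return MU0 + A * np.log(v / V0) + B * np.log(V0 * theta / DC)

for v in (1e-8, 1e-6, 1e-4):
    theta_ss = DC / v           # steady-state value of the state variable
    print(f"V = {v:.0e} m/s -> steady-state mu = {friction(v, theta_ss):.4f}")
```

Running this prints friction falling from about 0.623 to 0.577 as slip rate rises four orders of magnitude, the instability that, embedded in a fault-system simulator with suitable quasi-static approximations, produces the synthetic 500,000-year earthquake catalogs described above.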