Date | Link | Title | Summary | Body | Category | Year |
---|---|---|---|---|---|---|
August 21, 2018 | https://www.sciencedaily.com/releases/2018/08/180821145226.htm | Steps to keep buildings functioning after natural hazards | After an earthquake, hurricane, tornado or other natural hazard, it's considered a win if no one gets hurt and buildings stay standing. But an even bigger victory is possible: keeping those structures operational. This outcome could become more likely with improved standards and codes for the construction of residential and commercial buildings, according to a new report recently delivered to the U.S. Congress by the National Institute of Standards and Technology (NIST). | "Current standards and codes focus on preserving lives by reducing the likelihood of significant building damage or structural collapse from hazards," said Steven McCabe, director of the NIST-led, multiagency National Earthquake Hazards Reduction Program (NEHRP) and one of the authors of the new publication. "But they generally don't address the additional need to preserve quality of life by keeping buildings habitable and functioning as normally as possible, what we call 'immediate occupancy.' The goal of our report is to put the nation on track to achieve this performance outcome."
The impact of a natural hazard on a community is usually most evident in the lives lost and physical destruction, but the accompanying economic shock, social disruptions and reduced quality of life can often be devastating as well. "Cities and towns can be rebuilt, but lifestyles are damaged, sometimes permanently, if businesses, schools, utilities, transportation and other essential operations are out of service for an extended period," said Therese McAllister, manager of NIST's Community Resilience Program and another report author.
The infamous 1906 San Francisco earthquake provides a dramatic example of that impact. In the half-century following the 1840s Gold Rush in California, San Francisco was the fastest growing metropolitan area in the region. That all changed on April 18, 1906, when the quake and resulting fires destroyed 80 percent of the city, killed some 3,000 residents and left nearly 300,000 people -- three-fourths of the population -- homeless, out of work and without essential services. Though San Francisco would rebuild quickly, the disaster diverted trade, industry and people south to Los Angeles, which then supplanted the "City by the Bay" as the largest, most important urban center in the western United States.
Even with modern building codes and standards in place, there is still room for improvement, as evidenced by the massive damage from the May 2011 tornado in Joplin, Missouri, and the three major 2017 hurricanes striking Texas, Florida and Puerto Rico.
"Immediate occupancy performance measures would help avoid catastrophes because they could build up a community's resiliency against natural hazards so that people still can live at home, still can go to work and still can have the supporting infrastructure providing them services such as water and electricity," McCabe said.
In 2017, Congress tasked NIST to define what it would take to achieve immediate occupancy performance codes and standards for all buildings in all types of natural hazards, specifically in terms of fundamental research needs, possible technological applications based on that research and key strategies that could be used to implement any resulting regulations.
The result of that effort is the new NIST report, Research Needs to Support Immediate Occupancy Building Performance Objective Following Natural Hazard Events. The publication identifies a large portfolio of research and implementation activities that target enhanced performance objectives for residential and commercial buildings.
"The report provides valuable information about steps that could be taken to achieve immediate occupancy in the future," McAllister said.
The potential research activities presented in the report to Congress were developed with the assistance of a steering committee of recognized experts and stakeholder input obtained during a national workshop hosted by NIST in January 2018. The workshop participants identified four key areas that they believe must be considered when developing plans to achieve immediate occupancy performance: building design, community needs, economic and social impacts, and fostering acceptance and use of new practices.
For example, the report states that immediate occupancy performance measures must be developed, established and implemented with a sensitivity to how they will economically affect building owners, business operators, occupants and even whole communities. "You have to make sure that the cost of keeping buildings functional after natural hazards remains reasonable enough that everyone will be able to afford them," McCabe said.
The report also discusses key challenges facing the effort to make buildings functional in the wake of natural hazards, such as motivating communities to make the investment, managing how costs and benefits are balanced, and garnering public support.
Finally, the report concludes by recognizing that "increasing the performance goals for buildings would not be easily achieved, but the advantages may be substantial" and that making such objectives a reality "would entail a significant shift in practice for development, construction and maintenance or retrofit of buildings."
The report, its authors state, is the first step toward creating an action plan to achieve immediate occupancy across the nation with coordinated and detailed research goals and implementation activities.
"Our report outlines the steps that could be taken for a big raise of the bar -- perhaps the biggest change in building standards and codes in 50 years -- but one we believe is possible," McCabe said. | Earthquakes | 2018 |
August 21, 2018 | https://www.sciencedaily.com/releases/2018/08/180821094150.htm | New Antarctic rift data has implications for volcanic evolution | New data reveal that two tectonic plates fused to form a single Antarctic Plate 15 million years later than originally predicted. This extra motion has major implications for understanding the tectono-volcanic activity surrounding the Pacific Ocean, from the Alpine mountains in New Zealand to the California geological setting, according to research from Ben-Gurion University of the Negev (BGU). | "Since Antarctica tectonically connects the Pacific Plate to the rest of the world, these results have important ramifications for understanding the tectonic evolution around the Pacific Ocean -- the rise of New Zealand's Alpine Mountains, motions along the San Andreas Fault in California, and more," says Dr. Granot.
Over 200 million years ago, a rift bisected Antarctica. The motion between the East Antarctic and West Antarctic Plates, accommodated along the length of this rift, created one of the longest mountain ranges in the world (the Transantarctic Mountains). It also caused the eruption of hundreds of volcanoes, mostly under the ice sheets, and shaped the sub-ice topography. These motions dictated, and still dictate, the rate of heat flow that the crust releases under the ice and are one of the factors controlling the rate at which the glaciers are advancing toward the surrounding southern ocean.
GPS data and a lack of seismic activity suggest that the rift in Antarctica is no longer tectonically active. According to the researchers, one of the key unanswered questions was: How did the plates drift relative to each other over the last 26 million years, and when did the rift stop being active?
New marine geophysical data recorded during two excursions on a French icebreaker enabled Drs. Roi Granot and Jérôme Dyment to date the ocean floor and calculate the relative motion between the Antarctic Plates and the Australian Plate.
"Antarctica forms an important link in the global plate tectonic circuits which enable us to calculate the motion along different plate boundaries. Understanding past plate motions between East and West Antarctica therefore affects our ability to accurately predict the kinematic evolutions of other plate boundaries," says Dr. Granot. | Earthquakes | 2018 |
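Dr. Granot's point about plate circuits is at heart computational: relative motions across a circuit are obtained by composing finite rotations about Euler poles. A minimal Python sketch of that composition via Rodrigues' formula -- the poles and angles below are placeholders, not the rotations from the BGU study:

```python
import numpy as np

def rotation(pole_lat: float, pole_lon: float, angle_deg: float) -> np.ndarray:
    """Finite plate rotation about an Euler pole (Rodrigues' formula)."""
    la, lo = np.radians(pole_lat), np.radians(pole_lon)
    k = np.array([np.cos(la) * np.cos(lo), np.cos(la) * np.sin(lo), np.sin(la)])
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    a = np.radians(angle_deg)
    return np.eye(3) + np.sin(a) * K + (1 - np.cos(a)) * (K @ K)

# A plate circuit composes by matrix multiplication, e.g.
# Pacific -> West Antarctica -> East Antarctica -> Australia.
# Poles and angles here are illustrative placeholders only.
R_total = rotation(64, -80, 2.0) @ rotation(-10, 40, 0.5) @ rotation(12, 35, 1.5)
print(np.round(R_total, 4))
```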
August 15, 2018 | https://www.sciencedaily.com/releases/2018/08/180815141444.htm | Climate change sea level rises could increase risk for more devastating tsunamis worldwide | As sea levels rise due to climate change, so do the global hazards and potential devastating damages from tsunamis, according to a new study by a partnership that included Virginia Tech. | Even minor sea-level rise, by as much as a foot, poses greater risks of tsunamis for coastal communities worldwide.
The threat of rising sea levels to coastal cities and communities throughout the world is well known, but new findings show the likely increase of flooding farther inland from tsunamis following earthquakes. Think of the tsunami that devastated a portion of northern Japan after the 2011 Tohoku-Oki earthquake, causing a nuclear plant to melt down and spread radioactive contamination.
These findings are at the center of a new study. "Our research shows that sea-level rise can significantly increase the tsunami hazard, which means that smaller tsunamis in the future can have the same adverse impacts as big tsunamis would today," Weiss said, adding that smaller tsunamis generated by earthquakes with smaller magnitudes occur frequently and regularly around the world. For the study, Weiss was critical in helping create computational models and data analytics frameworks.
At Virginia Tech, Weiss serves as director of the National Science Foundation-funded Disaster Resilience and Risk Management graduate education program and is co-lead of Coastal@VT, comprised of 45 Virginia Tech faculty from 13 departments focusing on contemporary and emerging coastal zone issues, such as disaster resilience, migration, sensitive ecosystems, hazard assessment, and natural infrastructure.
For the study, Weiss and his partners, including Lin Lin Li, a senior research fellow, and Adam Switzer, an associate professor, at the Earth Observatory of Singapore, created computer-simulated tsunamis at current sea level and with sea-level increases of 1.5 feet and 3 feet in the Chinese territory of Macau. Macau is a densely populated coastal region located in South China that is generally safe from current tsunami risks.
At current sea level, an earthquake would need to tip past a magnitude of 8.8 to cause widespread tsunami inundation in Macau. But with the simulated sea-level rises, the results surprised the team.
The sea-level rise dramatically increased the frequency of tsunami-induced flooding by 1.2 to 2.4 times for the 1.5-foot increase and from 1.5 to 4.7 times for the 3-foot increase. "We found that the increased inundation frequency was contributed by earthquakes of smaller magnitudes, which posed no threat at current sea level, but could cause significant inundation at higher sea-level conditions," Li said.
In the simulated study of Macau -- population 613,000 -- Switzer said, "We produced a series of tsunami inundation maps for Macau using more than 5,000 tsunami simulations generated from synthetic earthquakes prepared for the Manila Trench." It is estimated that sea levels in the Macau region will increase by 1.5 feet by 2060 and 3 feet by 2100, according to the team of U.S.-Chinese scientists.
The hazard of large tsunamis in the South China Sea region primarily comes from the Manila Trench, a megathrust system that stretches from offshore Luzon in the Philippines to southern Taiwan. The Manila Trench megathrust has not experienced an earthquake larger than a magnitude 7.8 since the 1560s. Yet, study co-author Wang Yu, from the National Taiwan University, cautioned that the region shares many of the characteristics of the source areas that resulted in the 2004 Sumatra-Andaman earthquake, as well as the 2011 earthquake in northern Japan, both causing massive loss of life.
These increased dangers from tsunamis build on already known difficulties facing coastal communities worldwide: the gradual loss of land directly near coasts and increased chances of flooding even during high tides, as sea levels increase as the Earth warms.
"The South China Sea is an excellent starting point for such a study because it is an ocean with rapid sea-level rise and also the location of many mega cities with significant worldwide consequences if impacted. The study is the first of its kind on the level of detail, and many will follow our example," Weiss said.
Policymakers, town planners, emergency services, and insurance firms must work together to create or insure safer coastlines, Weiss added.
"Sea-level rise needs to be taken into account for planning purposes, for example for reclamation efforts but also for designing protective measures, such as seawalls or green infrastructure."
He added, "What we assumed to be the absolute worst case a few years ago now appears to be modest for what is predicted in some locations. We need to study local sea-level change more comprehensively in order to create better predictive models that help to make investments in infrastructure that are or near sustainable." | Earthquakes | 2018 |
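The frequency multipliers above come from counting how often simulated tsunamis overtop coastal ground once a higher baseline sea level is added. A minimal sketch of that bookkeeping -- the amplitude distribution, flooding threshold, and ensemble span are invented for illustration, not taken from the Macau simulation outputs:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical ensemble of maximum tsunami amplitudes (meters) at one
# coastal site, one value per synthetic earthquake scenario.
max_amplitudes = rng.lognormal(mean=0.0, sigma=0.8, size=5000)

flood_threshold_m = 3.0               # illustrative ground level to overtop
scenarios_per_year = 5000 / 10000.0   # assume the ensemble spans 10,000 years

def inundation_rate(sea_level_rise_m: float) -> float:
    """Events per year that overtop the threshold once the tsunami
    amplitude rides on a higher baseline sea level."""
    effective = max_amplitudes + sea_level_rise_m
    return np.mean(effective > flood_threshold_m) * scenarios_per_year

base = inundation_rate(0.0)
for slr_ft, slr_m in [(1.5, 0.46), (3.0, 0.91)]:
    print(f"SLR {slr_ft} ft: inundation frequency x{inundation_rate(slr_m) / base:.2f}")
```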
August 15, 2018 | https://www.sciencedaily.com/releases/2018/08/180815102909.htm | Predicting landslide boundaries two weeks before they happen | University of Melbourne researchers have developed a software tool that uses applied mathematics and big data analytics to predict the boundary of where a landslide will occur, two weeks in advance. | Professor Antoinette Tordesillas from the School of Mathematics and Statistics said there are always warning signs in the lead-up to a collapse or 'failure'; the tricky part is identifying what they are.
"These warnings can be subtle. Identifying them requires fundamental knowledge of failure at the microstructure level -- the movement of individual grains of earth," Professor Tordesillas said.
"Of course, we cannot possibly see the movement of individual grains in a landslide or earthquake that stretches for kilometres, but if we can identify the properties that characterise failure in the small-scale, we can shed light on how failure evolves in time, no matter the size of the area we are observing."
These early clues include patterns of motion that change over time and become synchronised.
"In the beginning, the movement is highly disordered," said Professor Tordesillas. "But as we get closer to the point of failure -- the collapse of a sand castle, crack in the pavement or slip in an open pit mine -- motion becomes ordered as different locations suddenly move in similar ways.
"Our model decodes this data on movement and turns it into a network, allowing us to extract the hidden patterns on motion and how they are changing in space and time. The trick is to detect the ordered motions in the network as early as possible, when movements are very subtle."
Professor Robin Batterham from the Department of Chemical and Biomolecular Engineering said the new software focuses on turning algorithms and big data into risk assessment and management actions that can save lives.
"People have gone somewhat overboard on so-called data analytics, machine learning and so on," said Professor Batterham.
"While we've been doing this sort of stuff for 40 years, this software harnesses the computer power and memory available to look not just at the surface movement, but extract the relevant data patterns. We're able to do things that were just unimaginable in a mathematical sense 30 years ago.
"We can now predict when a rubbish landfill might break in a developing country, when a building will crack or the foundation will move, when a dam could break or a mudslide occur. This software could really make a difference." | Earthquakes | 2018 |
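The "ordered motion" signal Professor Tordesillas describes can be approximated by watching pairwise correlations between monitoring points rise as failure approaches. A rough sketch of such an order parameter -- the windowing and correlation measure are illustrative guesses, not the Melbourne team's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(7)

def mean_synchronization(displacements: np.ndarray, window: int = 50) -> np.ndarray:
    """displacements: (n_times, n_points) array of surface-motion series.
    Returns, for each window end, the mean absolute pairwise correlation
    of motion increments -- a crude 'ordered motion' indicator."""
    increments = np.diff(displacements, axis=0)
    order = []
    for t in range(window, increments.shape[0]):
        c = np.corrcoef(increments[t - window:t].T)   # (n_points, n_points)
        iu = np.triu_indices_from(c, k=1)
        order.append(np.nanmean(np.abs(c[iu])))
    return np.array(order)

# Demo on random-walk "monitoring points"; real input would be radar or
# GPS displacement series from the slope being watched.
demo = np.cumsum(rng.normal(size=(500, 8)), axis=0)
print(mean_synchronization(demo)[-5:])
```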
August 6, 2018 | https://www.sciencedaily.com/releases/2018/08/180806095201.htm | Earthquakes can be weakened by groundwater | Around 100,000 earthquakes are recorded worldwide every year, but not all are naturally occurring. Some of the weaker ones are triggered by human activity underground -- this is referred to as induced seismicity. Researchers from EPFL's Laboratory of Experimental Rock Mechanics (LEMR) and the Ecole Normale Supérieure in Paris have just completed a study into the role of fluids in the propagation of induced earthquakes in an effort to decipher the underlying mechanisms. Their findings include the extremely counterintuitive discovery that highly pressurized water in the vicinity of an earthquake tends to limit -- rather than increase -- its intensity. These results were published today. | Induced earthquakes can be the result of activities like mining, gas and oil extraction, toxic waste disposal or CO2 storage.
Geothermal energy is captured by tapping into subterranean heat. Highly pressurized water is pumped into the earth's crust at a depth of between two and four kilometers. The water is then recovered as steam and used to drive an electricity-producing turbine. "Injecting water can affect water-rock equilibria and disrupt nearby faults, thus triggering earthquakes in the area," says Marie Violay, who runs LEMR.
This type of earthquake is a thorn in the side of geothermal proponents, notes Mateo Acosta, a PhD student at LEMR and the study's lead author: "These earthquakes may be low in intensity, but they can cause damage and affect public opinion -- to the point of derailing projects."
Acosta ran tests in which he sought to replicate earthquake conditions in order to study the impact of different levels of underground water pressure on fault dynamics. He focused mainly on earthquake propagation, which is when the two plates in a fault rub against each other, sending seismic waves out into the surrounding area.
"Rock friction generates a significant amount of heat, which further fuels the propagation effect," says the PhD student. "Some of this heat is absorbed by the water in the surrounding rock, and the amount absorbed depends to a large extent on the water's thermodynamic parameters. What we learned from our experiments is that the closer the fluid's initial pressure is to the critical pressure of water, the weaker the earthquake will be."
"This research shows that the initial fluid pressure in the rocks is crucial, especially at depths commonly reached by geothermal activities. Geothermal models need to take this into account," says François-Xavier Passelègue, an LEMR researcher and the study's second author.
The laboratory recently acquired sophisticated equipment that can be used to simulate pressure and temperature levels at a depth of 10 to 15 kilometers in the earth's crust. The researchers plan to use this equipment to more accurately measure the impact of groundwater on earthquake intensity. | Earthquakes | 2018 |
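For scale, the critical pressure of water is reached by a hydrostatic column at roughly the depths where geothermal injection operates, consistent with the 2-4 km range mentioned above. A back-of-envelope check:

```python
# Depth at which a hydrostatic water column reaches the critical
# pressure of water (critical point: 22.064 MPa, 647.1 K).
RHO_WATER = 1000.0   # kg/m^3
G = 9.81             # m/s^2
P_CRIT = 22.064e6    # Pa

depth_km = P_CRIT / (RHO_WATER * G) / 1000.0
print(f"~{depth_km:.1f} km")   # about 2.2 km -- inside the 2-4 km range
```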
August 2, 2018 | https://www.sciencedaily.com/releases/2018/08/180802102352.htm | Earthquakes can systematically trigger other ones on opposite side of Earth | New research shows that a big earthquake can not only cause other quakes, but large ones, and on the opposite side of the Earth. | The findings were published Aug. 2. Scientists at Oregon State University looked at 44 years of seismic data and found clear evidence that temblors of magnitude 6.5 or larger trigger other quakes of magnitude 5.0 or larger.
It had been thought that aftershocks -- smaller magnitude quakes that occur in the same region as the initial quake as the surrounding crust adjusts after the fault perturbation -- and smaller earthquakes at great distances were the main global effects of very large earthquakes.
But the OSU analysis of seismic data from 1973 through 2016 -- an analysis that excluded data from aftershock zones -- using larger time windows than in previous studies, provided discernible evidence that in the three days following one large quake, other earthquakes were more likely to occur.
Each test case in the study represented a single three-day window "injected" with a large-magnitude (6.5 or greater) earthquake suspected of inducing other quakes, and accompanying each case was a control group of 5,355 three-day periods that didn't have the quake injection.
"The test cases showed a clearly detectable increase over background rates," said the study's corresponding author, Robert O'Malley, a researcher in the OSU College of Agricultural Sciences. "Earthquakes are part of a cycle of tectonic stress buildup and release. As fault zones near the end of this seismic cycle, tipping points may be reached and triggering can occur."
The higher the magnitude, the more likely a quake is to trigger another quake. Higher-magnitude quakes, which have been happening with more frequency in recent years, also seem to be triggered more often than lower-magnitude ones.
A temblor is most likely to induce another quake within 30 degrees of the original quake's antipode -- the point directly opposite it on the other side of the globe.
"The understanding of the mechanics of how one earthquake could initiate another while being widely separated in distance and time is still largely speculative," O'Malley said. "But irrespective of the specific mechanics involved, evidence shows that triggering does take place, followed by a period of quiescence and recharge."
Earthquake magnitude is measured on a logarithmic 1-10 scale -- each whole number represents a 10-fold increase in measured amplitude, and a roughly 31-fold increase in released energy.
The largest recorded earthquake was a 1960 temblor in Chile that measured 9.5. The 2011 quake that ravaged the Fukushima nuclear power plant in Japan measured 9.0.
In 1700, an approximate magnitude 9.0 earthquake hit the Cascadia Subduction Zone -- a fault that stretches along the West Coast of North America from British Columbia to California.
Collaborating with O'Malley were Michael Behrenfeld of the College of Agricultural Sciences, Debashis Mondal of the College of Science and Chris Goldfinger of the College of Earth, Ocean and Atmospheric Sciences. | Earthquakes | 2018 |
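The 30-degrees-of-antipode criterion is easy to operationalize: compute the mainshock's antipode, then the great-circle separation to each candidate event. A small sketch (the epicenter and candidate coordinates are illustrative):

```python
import math

def antipode(lat: float, lon: float) -> tuple:
    """Point diametrically opposite (lat, lon) on the globe."""
    return -lat, (lon - 180.0 if lon > 0 else lon + 180.0)

def angular_distance_deg(lat1, lon1, lat2, lon2) -> float:
    """Great-circle separation in degrees (spherical law of cosines)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    return math.degrees(math.acos(
        math.sin(p1) * math.sin(p2)
        + math.cos(p1) * math.cos(p2) * math.cos(dl)))

# Was a candidate event within 30 degrees of a mainshock's antipode?
a_lat, a_lon = antipode(38.3, 142.4)   # approx. 2011 Tohoku epicenter
print(angular_distance_deg(a_lat, a_lon, -38.0, -38.0) <= 30.0)  # True

# Magnitude scaling: +1 magnitude ~ 10x amplitude, ~10^1.5 = 31.6x energy.
print(10 ** 1.5)
```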
July 31, 2018 | https://www.sciencedaily.com/releases/2018/07/180731125543.htm | Urban geophone array offers new look at northern Los Angeles basin | Using an array of coffee-can sized geophones deployed for about a month in backyards, golf courses and public parks, researchers collected enough data to allow them to map the depth and shape of the San Gabriel and San Bernardino sedimentary basins of Los Angeles, California. | Seismologists think these sedimentary basins may act as a "waveguide" to focus and trap energy from an earthquake on the southern San Andreas Fault, so understanding their structure is important to help predict how well they might channel the energy from such an earthquake into downtown Los Angeles.
The research team, led by Patricia Persaud of Louisiana State University and Robert Clayton from the California Institute of Technology, was able to map the two basins in greater detail than previous studies, showing that the San Gabriel basin is deeper than the San Bernardino basin and that the San Bernardino basin has an irregular shape. Persaud and colleagues also uncovered signs of deep offsets in layers of the earth's crust that could be related to two faults -- the Red Hill and Raymond faults -- that have been previously mapped in nearby areas at the surface.
"It is currently too early to say how our results will change how we might think of these basins' ability to channel seismic energy," Persaud said. "However, we are collecting more data in the area that will be used to further refine the basin structure."
Geophones are instruments that convert the velocity of ground motion into voltage, which can be used to determine the geometry of structures beneath the earth's surface. Visualizing the details of sedimentary basin structure requires a large number of seismic stations that are closely spaced in order to capture important changes in structure laterally across the basin. Geophone arrays offer an inexpensive and feasible way to collect this data in a densely-populated urban area, compared to the complications and expense of deploying broadband seismometers, Persaud noted.
Each of the 202 nodes deployed in the study, in three lines spanning the northern basins, is about the size of a coffee can. "They weigh about six pounds and have a data logger, battery and recorder all in one container," Persaud explained. "To place them in the ground we dig a small hole that will allow the nodes to be covered with about two inches of soil once they are firmly planted. Most Los Angeles area residents tell us to put them wherever we want, some even help us dig the holes; so we choose a site in their yards and in about five minutes we have the node in place and recording."
In most cases, property owners were "extremely friendly and accommodating" during the current study, said Persaud. "What's interesting is when we got a positive response it was almost immediate. Los Angeles residents are very much aware of the elevated seismic hazard in this region, and are often curious about our study and the nodes, and they want to find out more. Some offer to spread the word about our study through social media and encourage their friends and neighbors to participate as well."
The nodes collected data continuously for 35 days. In this time, they detected ground motion from magnitude 6 and greater earthquakes that occurred thousands of kilometers away from Los Angeles. Seismic waveform data from these teleseismic earthquakes can be used with a method called the receiver function technique to map the thickness of the crust and shallow crustal structures below a seismic station. The receiver functions calculated from the nodal arrays are similar to those calculated from broadband data, the researchers concluded, but the nodal array offers a higher-resolution look at crustal structures such as the boundary between the earth's crust and mantle and the interface between sediments and basement rock across the basins.
This summer, the research team is back in California placing nodes "along new lines that are intended to fill in any areas where there might be a change in basin shape," said Persaud. "We have just deployed three new profiles and will then compile the results from all of our profiles to produce an updated structural model for the basins." | Earthquakes | 2018 |
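The receiver function technique mentioned above is, at its core, a deconvolution of the radial component of ground motion by the vertical one. A minimal water-level deconvolution sketch -- real workflows typically add a Gaussian low-pass filter and more careful stabilization:

```python
import numpy as np

def receiver_function(radial: np.ndarray, vertical: np.ndarray,
                      water_level: float = 0.01) -> np.ndarray:
    """Water-level deconvolution of radial by vertical component to
    isolate P-to-S conversions from interfaces beneath a station."""
    R, V = np.fft.rfft(radial), np.fft.rfft(vertical)
    denom = (V * np.conj(V)).real
    denom = np.maximum(denom, water_level * denom.max())
    return np.fft.irfft(R * np.conj(V) / denom, n=len(radial))

# Synthetic check: a 0.5 s delayed, scaled copy of the vertical pulse
# should deconvolve to a spike at 0.5 s (sampling here is 100 Hz).
t = np.arange(1024) / 100.0
vertical = np.exp(-((t - 2.0) ** 2) / 0.01)
radial = 0.3 * np.exp(-((t - 2.5) ** 2) / 0.01)
rf = receiver_function(radial, vertical)
print(np.argmax(rf) / 100.0, "s")   # ~0.5 s
```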
July 30, 2018 | https://www.sciencedaily.com/releases/2018/07/180730120329.htm | New understanding of deep earthquakes | Researchers have known for decades that deep earthquakes -- those deeper than 60 kilometers, or about 37 miles below the Earth's surface -- radiate seismic energy differently than those that originate closer to the surface. But a systematic approach to understanding why has been lacking. | Now a team of researchers from the University of Houston has reported a way to analyze seismic wave radiation patterns in deep earthquakes, suggesting that deep earthquakes worldwide occur in anisotropic rocks -- something that had not previously been done. Rock anisotropy refers to differences in seismic wave propagation speeds when measured along different directions.
Their findings were published Monday, July 30. Most earthquakes occur at shallow depths, according to the U.S. Geological Survey, and they generally cause more damage than deeper earthquakes. But there are still substantial questions about the causes of deep earthquakes.
Normal rocks are ductile, or pliable, at these great depths because of high temperature and thus aren't able to rupture in an abrupt fashion to produce deep earthquakes, which occur below subduction zones where two tectonic plates collide at ocean trenches. The plate being pushed under is referred to as the subducting slab. The fact that deep earthquakes occur only in these slabs suggests some unusual process is happening within the slab.
Yingcai Zheng, assistant professor of seismic imaging in the UH College of Natural Sciences and Mathematics and corresponding author for the paper, said seismologists have sought to understand deep earthquakes since the phenomenon was discovered in 1926. Hypotheses include the effect of fluids, runaway thermal heating or solid-phase change due to sudden collapse of the mineral crystal structure.
In addition to Zheng, researchers involved in the work include the first author Jiaxuan Li, a Ph.D. candidate in the Department of Earth and Atmospheric Sciences; Leon Thomsen, research professor of geophysics; Thomas J. Lapen, professor of geology; and Xinding Fang, adjunct professor at UH and concurrently associate professor at the Southern University of Science and Technology, China.
"Over the past 50 years, there has been growing evidence that a large proportion of deep earthquakes do not follow the double-couple radiation pattern seen in most shallow earthquakes," Zheng said. "We set out to look at why that happens." The double-couple pattern is caused by a shear rupture of a pre-existing fault.
The work, funded by the National Science Foundation, looked at potential reasons for the differing radiation patterns; Zheng said earlier theories suggest that deep earthquakes stem from a different rupture mechanism and possibly different physical and chemical processes than those that spark shallow earthquakes.
But after studying the radiation patterns of 1,057 deep earthquakes at six subduction zones worldwide, the researchers determined another explanation. They found that the surrounding rock fabric enclosing the deep quake alters the seismic radiation into a non-double-couple pattern. "Both the common double-couple radiation patterns and uncommon patterns of deep earthquakes can be explained simultaneously by shear rupture in a laminated rock fabric," Li said.
Before the subducting plate enters the trench, it can absorb sea water to form hydrated anisotropic minerals. As the slab descends in the Earth's mantle, the water can be expelled due to high pressure and high temperature conditions, a process known as dehydration. The dehydration and strong shearing along the slab interface can make the rock brittle and lead to rupture in intermediate-depth earthquakes, defined as those between 60 kilometers and 300 kilometers deep (37 miles to 186 miles).
"We found at these depths that the anisotropic rock fabric is always parallel to the slab surface, although the slab can change directions greatly from place to place," Li said.
Anisotropy is also found in rocks at even greater depths, which suggests materials such as magnesite or aligned carbonatite melt pockets may be involved in generating the deep ruptures, the researchers said. Because the inferred anisotropy is high -- about 25 percent -- the widely believed meta-stable solid phase change mechanism is not able to provide the needed anisotropy inferred by the researchers. | Earthquakes | 2018 |
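The double-couple pattern the researchers refer to has a standard far-field form: P-wave amplitude varies as gamma^T M gamma for ray direction gamma and moment tensor M, producing the classic four-lobed pattern. A short illustration of that textbook radiation formula (this is not the UH team's analysis code):

```python
import numpy as np

def p_wave_amplitude(takeoff_deg: float, azimuth_deg: float,
                     moment_tensor: np.ndarray) -> float:
    """Far-field P amplitude ~ gamma^T M gamma for unit ray direction
    gamma leaving the source."""
    t, a = np.radians(takeoff_deg), np.radians(azimuth_deg)
    gamma = np.array([np.sin(t) * np.cos(a),
                      np.sin(t) * np.sin(a),
                      np.cos(t)])
    return gamma @ moment_tensor @ gamma

M0 = 1.0
double_couple = np.array([[0, M0, 0],    # vertical strike-slip double couple
                          [M0, 0, 0],
                          [0, 0, 0.0]])
# Classic four-lobed pattern: the sign flips every 90 degrees of azimuth.
for az in (45, 135, 225, 315):
    print(az, round(p_wave_amplitude(90, az, double_couple), 3))
```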
July 26, 2018 | https://www.sciencedaily.com/releases/2018/07/180726085827.htm | Yellowstone super-volcano eruptions were produced by gigantic ancient oceanic plate, study finds | The long-dormant Yellowstone super-volcano in the American West has a different history than previously thought, according to a new study by a Virginia Tech geoscientist. | Scientists have long thought that Yellowstone Caldera, part of the Rocky Mountains and located mostly in Wyoming, is powered by heat from the Earth's core, similar to most volcanoes such as the recently active Kilauea volcano in Hawaii. However, new research published in Nature Geoscience by Ying Zhou, an associate professor with the Virginia Tech College of Science's Department of Geosciences, shows a different past.
"In this research, there was no evidence of heat coming directly up from the Earth's core to power the surface volcano at Yellowstone," Zhou said. "Instead, the underground images we captured suggest that Yellowstone volcanoes were produced by a gigantic ancient oceanic plate that dove under the Western United States about 30 million years ago. This ancient oceanic plate broke into pieces, resulting in perturbations of unusual rocks in the mantle which led to volcanic eruptions in the past 16 million years."
The eruptions were very explosive, Zhou added. A theoretical seismologist, Zhou created X-ray-like images of the Earth's deep interior from USArray -- part of the Earthscope project funded by the National Science Foundation -- and discovered an anomalous underground structure at a depth of about 250 to 400 miles right beneath the line of volcanoes.
"This evidence was in direct contradiction to the plume model," Zhou said.
In her study, Zhou found the new images of the Earth's deep interior showed that the oceanic Farallon plate, which used to be where the Pacific Ocean is now, wedged itself beneath the present-day Western United States. The ancient oceanic plate was broken into pieces just like the seafloor in the Pacific today. A section of the subducted oceanic plate started tearing off and sinking down to the deep earth.
The sinking section of oceanic plate slowly pushed hot materials upward to form the volcanoes that now make up Yellowstone. Further, the series of volcanoes that make up Yellowstone have been slowly moving, achingly so, ever since. "The process started at the Oregon-Idaho border about 16 million years ago and propagated northeastward, forming a line of volcanoes that are progressively younger as they stretched northeast to present-day Wyoming," Zhou added.
The previously-held plume model was used to explain the unique Yellowstone hotspot track -- the line of volcanoes in Oregon, Idaho, and Wyoming that dots part of the Midwest. "If the North American plate was moving slowly over a position-fixed plume at Yellowstone, it will displace older volcanoes towards the Oregon-Idaho border and form a line of volcanoes, but such a deep plume has not been found," Zhou said.
So, what caused the track? Zhou intends to find out. "It has always been a problem there, and scientists have tried to come up with different ways to explain the cause of Yellowstone volcanoes, but it has been unsuccessful," she said, adding that hotspot tracks are more popular in oceans, such as the Hawaii islands.
The frequent geyser eruptions at Yellowstone are of course not volcanic eruptions with magma, but are due to super-heated water. The last Yellowstone super-eruption was about 630,000 years ago, according to experts. Zhou has no predictions on when or if Yellowstone could erupt again.
The use of the X-ray-like images for this study is unique in itself. Just as humans can see objects in a room when a light is on, Zhou said, seismometers can see structures deep within the earth when an earthquake occurs. The vibrations spread out and create waves when they hit rocks. The waves are detected by seismometers and used in what is known as diffraction tomography.
"This is the first time the new imaging theory has been applied to this type of seismic data, which allowed us to see anomalous structures in the Earth's mantle that would otherwise not be resolvable using traditional methods," Zhou said.
Zhou will continue her study of Yellowstone. "The next step will be to increase the resolution of the X-ray-like images of the underground rock," she added. "More detailed images of the unusual rocks in the deep earth will allow us to use computer simulation to recreate the fragmentation of the gigantic oceanic plate and test different scenarios of how rock melting and magma feeding system work for the Yellowstone volcanoes." | Earthquakes | 2018 |
July 16, 2018 | https://www.sciencedaily.com/releases/2018/07/180716114538.htm | Sound waves reveal enormous diamond cache deep in Earth's interior | There may be more than a quadrillion tons of diamond hidden in the Earth's interior, according to a new study from MIT and other universities. But the new results are unlikely to set off a diamond rush. The scientists estimate the precious minerals are buried more than 100 miles below the surface, far deeper than any drilling expedition has ever reached. | The ultradeep cache may be scattered within cratonic roots -- the oldest and most immovable sections of rock that lie beneath the center of most continental tectonic plates. Shaped like inverted mountains, cratons can stretch as deep as 200 miles through the Earth's crust and into its mantle; geologists refer to their deepest sections as "roots."
In the new study, scientists estimate that cratonic roots may contain 1 to 2 percent diamond. Considering the total volume of cratonic roots in the Earth, the team figures that about a quadrillion (10^16) tons of diamond are scattered within these ancient rocks, 90 to 150 miles below the surface.
"This shows that diamond is not perhaps this exotic mineral, but on the [geological] scale of things, it's relatively common," says Ulrich Faul, a research scientist in MIT's Department of Earth, Atmospheric, and Planetary Sciences. "We can't get at them, but still, there is much more diamond there than we have ever thought before."
Faul's co-authors include scientists from the University of California at Santa Barbara, the Institut de Physique du Globe de Paris, the University of California at Berkeley, Ecole Polytechnique, the Carnegie Institution of Washington, Harvard University, the University of Science and Technology of China, the University of Bayreuth, the University of Melbourne, and University College London.
Faul and his colleagues came to their conclusion after puzzling over an anomaly in seismic data. For the past few decades, agencies such as the United States Geological Survey have kept global records of seismic activity -- essentially, sound waves traveling through the Earth that are triggered by earthquakes, tsunamis, explosions, and other ground-shaking sources. Seismic receivers around the world pick up sound waves from such sources, at various speeds and intensities, which seismologists can use to determine where, for example, an earthquake originated.
Scientists can also use this seismic data to construct an image of what the Earth's interior might look like. Sound waves move at various speeds through the Earth, depending on the temperature, density, and composition of the rocks through which they travel. Scientists have used this relationship between seismic velocity and rock composition to estimate the types of rocks that make up the Earth's crust and parts of the upper mantle, also known as the lithosphere.
However, in using seismic data to map the Earth's interior, scientists have been unable to explain a curious anomaly: Sound waves tend to speed up significantly when passing through the roots of ancient cratons. Cratons are known to be colder and less dense than the surrounding mantle, which would in turn yield slightly faster sound waves, but not quite as fast as what has been measured.
"The velocities that are measured are faster than what we think we can reproduce with reasonable assumptions about what is there," Faul says. "Then we have to say, 'There is a problem.' That's how this project started."
The team aimed to identify the composition of cratonic roots that might explain the spikes in seismic speeds. To do this, seismologists on the team first used seismic data from the USGS and other sources to generate a three-dimensional model of the velocities of seismic waves traveling through the Earth's major cratons.
Next, Faul and others, who in the past have measured sound speeds through many different types of minerals in the laboratory, used this knowledge to assemble virtual rocks, made from various combinations of minerals. Then the team calculated how fast sound waves would travel through each virtual rock, and found only one type of rock that produced the same velocities as what the seismologists measured: one that contains 1 to 2 percent diamond, in addition to peridotite (the predominant rock type of the Earth's upper mantle) and minor amounts of eclogite (representing subducted oceanic crust). This scenario represents at least 1,000 times more diamond than people had previously expected.
"Diamond in many ways is special," Faul says. "One of its special properties is, the sound velocity in diamond is more than twice as fast as in the dominant mineral in upper mantle rocks, olivine."
The researchers found that a rock composition of 1 to 2 percent diamond would be just enough to produce the higher sound velocities that the seismologists measured. This small fraction of diamond would also not change the overall density of a craton, which is naturally less dense than the surrounding mantle.
"They are like pieces of wood, floating on water," Faul says. "Cratons are a tiny bit less dense than their surroundings, so they don't get subducted back into the Earth but stay floating on the surface. This is how they preserve the oldest rocks. So we found that you just need 1 to 2 percent diamond for cratons to be stable and not sink."
In a way, Faul says, cratonic roots made partly of diamond make sense. Diamonds are forged in the high-pressure, high-temperature environment of the deep Earth and only make it close to the surface through volcanic eruptions that occur every few tens of millions of years. These eruptions carve out geologic "pipes" made of a type of rock called kimberlite (named after the town of Kimberley, South Africa, where the first diamonds in this type of rock were found). Diamond, along with magma from deep in the Earth, can spew out through kimberlite pipes, onto the surface of the Earth.
For the most part, kimberlite pipes have been found at the edges of cratonic roots, such as in certain parts of Canada, Siberia, Australia, and South Africa. It would make sense, then, that cratonic roots should contain some diamond in their makeup.
"It's circumstantial evidence, but we've pieced it all together," Faul says. "We went through all the different possibilities, from every angle, and this is the only one that's left as a reasonable explanation."
This research was supported, in part, by the National Science Foundation. | Earthquakes | 2018 |
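The "virtual rock" reasoning can be caricatured with a simple mixing calculation: because diamond carries waves far faster than mantle minerals, even a percent or two shifts the bulk velocity measurably. The sketch below uses an oversimplified time-average mix and illustrative shear velocities; the study itself mixed laboratory-measured elastic properties at mantle pressure and temperature:

```python
# Illustrative shear-wave speeds (km/s); real work would average elastic
# moduli (e.g., Voigt-Reuss-Hill) at in-situ conditions.
VS_PERIDOTITE = 4.7
VS_DIAMOND = 12.0   # roughly 2.5x typical upper-mantle minerals

def mixture_vs(diamond_fraction: float) -> float:
    """Crude time-average (Wyllie) mix: slownesses add by volume."""
    slowness = (diamond_fraction / VS_DIAMOND
                + (1 - diamond_fraction) / VS_PERIDOTITE)
    return 1.0 / slowness

for f in (0.0, 0.01, 0.02):
    print(f"{f:.0%} diamond -> Vs ≈ {mixture_vs(f):.3f} km/s")
```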
July 5, 2018 | https://www.sciencedaily.com/releases/2018/07/180705113954.htm | Reconstruction of underwater avalanche sheds light on geohazards that threaten underwater telecommunication cables | As part of an international team, a researcher from the University of Liverpool reconstructed the 1929 Grand Banks underwater avalanche to better understand these common geohazards, which threaten critical seafloor infrastructure. | The 1929 Grand Banks underwater avalanche, which was triggered by an earthquake off the coast of Newfoundland, is the first and only underwater avalanche of this size to have been directly measured.
Despite being extremely common, little is known about underwater avalanches as they are exceptionally difficult to measure: inaccessible and destructive. However, they pose a major geohazard to seafloor infrastructure, such as telecommunication cables that carry more than 95% of global internet traffic, and oil and gas pipelines.
Dr Chris Stevenson, a lecturer in Quantitative Sedimentology at the University's School of Earth Sciences, was part of the team that revisited the area in order to reconstruct the avalanche.
The team mapped the bathymetry of the seafloor where the 1929 avalanche passed through and collected core samples of deposits that it left behind. They then combined this forensic evidence with the historic measurements of flow speed from the old cable breaks to reconstruct the properties of the avalanche.
Dr Stevenson, who was chief sedimentologist on the research cruise, said: "It is awe inspiring when you piece together how big and powerful this avalanche was: 230 m thick, which is about the height of Canary Wharf in London, moving at 40 mph, and highly concentrated with fist-sized boulders, gravel, sand and mud. It would not have been a good place to be at the time.
"Underwater avalanches are a bit of a mystery to scientists because they are really difficult to measure directly. What tends to happen is that avalanches destroy the measuring equipment you place in their path.
"The Grand Banks avalanche was the first, and remains the only, giant underwater avalanche that has been directly measured. At the time, it transformed how scientists viewed the seafloor and it's taken almost 90 years for us to revisit the area and confidently piece together its fundamental properties.
"This research cruise has enabled us to reconstruct the fundamental properties of this underwater avalanche which has implications for seafloor infrastructure. It can help provide engineers and modellers with the information they need to design expensive seafloor installations to withstand similar flows around the world, or build them out of harm's way.
"It also provides the first real-world example of a giant avalanche which scientists can use to validate their theories and models."
Triggered by a 7.2 magnitude earthquake, the Grand Banks underwater avalanche was huge: it generated a tsunami that killed 28 people and buried an area the size of the UK in half a metre of sand and mud. It was highly destructive and broke seafloor telecommunications cables along its path.
The exact location and timings of the cable breaks were recorded, meaning that the speed of the avalanche could be calculated. | Earthquakes | 2018 |
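The speed calculation from cable breaks is straightforward: distance between successive breaks divided by the time between the recorded failures. A sketch with invented positions and times (the historical values live in the 1929 records and are not reproduced here):

```python
# Flow speed from successive cable breaks along the avalanche path.
# Positions and times below are illustrative, not the 1929 survey data.
breaks_km = [0.0, 120.0, 310.0]   # distance downslope of each broken cable
times_hr = [0.0, 2.2, 6.0]        # break times after the earthquake

for i in range(1, len(breaks_km)):
    v_kmh = (breaks_km[i] - breaks_km[i - 1]) / (times_hr[i] - times_hr[i - 1])
    print(f"segment {i}: {v_kmh:.0f} km/h ≈ {v_kmh * 0.621:.0f} mph")
```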
July 5, 2018 | https://www.sciencedaily.com/releases/2018/07/180705100430.htm | Upper and lower plate controls on the 2011 Tohoku-oki earthquake | Researchers at Tohoku University's Department of Geophysics have been studying the great Tohoku-oki earthquake which occurred on March 11, 2011, to the east of Japan's Honshu Island. | The earthquake, which registered with a moment magnitude (Mw) of 9.0, was the most powerful earthquake ever recorded in Japan, and the fourth most powerful earthquake in the world since modern record-keeping began in 1900. It triggered powerful tsunami waves causing over 18,000 casualties. The tsunami caused nuclear accidents at the Fukushima Daiichi Nuclear Power Plant, and subsequent evacuations affected hundreds of thousands of residents. This earthquake has attracted great interest among researchers, because few experts expected such a large earthquake would occur in that area.
In Northeast Japan (Tohoku), the Pacific plate is subducting northwestward beneath the Okhotsk plate, causing the 2011 Tohoku-oki earthquake. Subduction is a process where one of Earth's tectonic plates sinks under another. To date, many researchers have investigated the causal mechanism of the Tohoku-oki earthquake, and a key question has arisen: Which plate controlled this huge earthquake? The upper Okhotsk plate or the lower Pacific plate? There have been conflicting results, because the detailed structure in and around the source zone is still unclear.
The Tohoku University team, comprising Dapeng Zhao and Xin Liu (now at Ocean University of China), applied a method of seismic tomography*1 to over 144,000 P-wave arrival-time data recorded by the dense Japanese seismic network to determine a high-resolution tomography beneath the Tohoku-oki region. They also used seafloor topography and gravity data to constrain the structure of the source zone.
Seismic tomography is an effective tool for investigating the three-dimensional (3-D) structure of the Earth's interior, in particular, for clarifying the detailed structure of large earthquake source areas. Using this method, the team obtained clear 3-D images of the Tohoku-oki source zone, and showed that the 2011 Tohoku-oki earthquake occurred in an area with high seismic velocity in the Tohoku megathrust zone*2. This high-velocity area reflects a mechanically strong (hard) patch which was responsible for the 2011 Tohoku-oki earthquake. This hard patch results from both granitic batholiths*3 in the overriding Okhotsk plate and hard rocks atop the subducting Pacific plate.
These results indicate that structural anomalies in and around the Tohoku megathrust originate from both the upper Okhotsk plate and the lower Pacific plate, which controlled the generation and rupture processes of the 2011 Tohoku-oki earthquake. This huge earthquake was caused by collision of harder rocks in both the upper and lower plates. This work sheds new light on the causal mechanism of megathrust earthquakes. It also suggests that the location of a future great earthquake may be pinpointed by investigating the detailed structure of the megathrust zone.
*1 Seismic tomography: A method to image the three-dimensional structure of the Earth's interior by inverting abundant seismic wave data generated by many earthquakes and recorded at many seismic stations.
*2 Megathrust: A great thrust fault where a tectonic plate subducts beneath another plate. The lower plate is called the subducting plate, and the upper one is called the overriding plate. In Tohoku, the upper and lower plates are the Okhotsk and Pacific plates, respectively.
*3 Granitic batholith: A batholith is a large mass of intrusive igneous rock, larger than 100 square kilometers in area, that forms from cooled magma deep in the Earth's crust. Granitic batholiths are much harder than other rocks such as sedimentary materials. | Earthquakes | 2018 |
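Travel-time tomography of the kind described in footnote *1 linearizes to t = G s, with ray-path lengths in G and cell slownesses in s, solved by damped least squares. A toy three-cell version with fabricated geometry, standing in for the 144,000-arrival inversion:

```python
import numpy as np

# Toy travel-time tomography: t = G s, where G[i, j] is the path length
# (km) of ray i in cell j and s holds cell slownesses (s/km).
G = np.array([[10.0, 10.0, 0.0],
              [0.0, 10.0, 10.0],
              [14.1, 0.0, 14.1]])          # made-up ray geometry
s_true = np.array([1 / 6.0, 1 / 8.0, 1 / 6.5])   # Vp of 6, 8, 6.5 km/s
t_obs = G @ s_true                          # synthetic arrival times

damping = 1e-6                              # regularization for stability
s_hat = np.linalg.solve(G.T @ G + damping * np.eye(3), G.T @ t_obs)
print("recovered Vp (km/s):", np.round(1 / s_hat, 2))
```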
July 3, 2018 | https://www.sciencedaily.com/releases/2018/07/180703112844.htm | A bright and vibrant future for seismology | Fibre-optic cables can be used to detect earthquakes and other ground movements. The data cables can also pick up seismic signals from hammer shots, passing cars or wave movements in the ocean. This is the result of a newly published study. | The scientists sent pulses of laser light through an optical fibre, which was part of a 15 kilometer long cable deployed in 1994, within the telecommunication network on Reykjanes peninsula, SW Iceland, crossing a well-known geological fault zone in the rift between the Eurasian and American tectonic plates. The light signal was analyzed and compared to data sets from a dense network of seismographs. The results amazed even experts: "Our measurements revealed structural features in the underground with unprecedented resolution and yielded signals equaling data points every four meters," says Philippe Jousset from the GFZ. He adds: "This is denser than any seismological network worldwide." After presenting preliminary ideas at several conferences since 2016, Philippe was told that the new method was a 'game changer for seismology'. Although the method is not new in other applications (it has been used for years in boreholes for reservoir monitoring), the team is the first worldwide to carry out such measurements along the surface of the ground for seismological objectives, and with such a long cable.
Their current study not only shows well-known faults and volcanic dykes. The scientists also found a previously unknown fault below the ground surface. Furthermore, the team measured subsurface deformation taking place over a period of several minutes. Small local earthquakes, waves originating from large distant quakes, and microseism of the ocean floor were also recorded via fibre-optic cable. "We only need one strand of a modern fibre-optic line," says Charlotte Krawczyk, Director of GFZ's Geophysics Department.
The advantages of the new method are enormous as there are countless fibre-optic cables spanning the globe in the dense telecommunication network. Especially beneath megacities with high seismic hazards, such as San Francisco, Mexico City, Tokyo, or Istanbul, and many others, such cables could provide a cost-efficient and widely spread addition to existing seismological measuring devices.
Future studies are planned to investigate whether deep-sea cables can also be used for seismic measurements. The scientists are optimistic. They think that the cables on the sea floor will detect submarine earthquakes, ground motions of tectonic plates, and also variations of the water pressure. Thus, the new method will help seismologists as well as oceanographers. | Earthquakes | 2018 |
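The fibre measurement behind this study (distributed acoustic sensing) converts differential optical phase along a gauge length into strain. A minimal conversion sketch using typical telecom-fiber constants, not the GFZ survey's actual parameters:

```python
import numpy as np

# Phase-to-strain conversion used in distributed acoustic sensing:
# strain over one gauge length scales with differential optical phase.
WAVELENGTH = 1550e-9   # m, typical telecom laser
N_FIBER = 1.468        # refractive index of silica fiber
XI = 0.78              # photoelastic scaling factor (typical value)
GAUGE = 10.0           # m, gauge length

def phase_to_strain(dphase_rad: np.ndarray) -> np.ndarray:
    return dphase_rad * WAVELENGTH / (4 * np.pi * N_FIBER * XI * GAUGE)

print(phase_to_strain(np.array([0.1])))  # ~1.1e-9 strain for 0.1 rad
```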
June 27, 2018 | https://www.sciencedaily.com/releases/2018/06/180627160232.htm | Seismologists use massive earthquakes to unlock secrets of the outer core | By applying new data and Princeton's supercomputers to the classic question of what lies beneath our feet, Princeton seismologist Jessica Irving and an international team of colleagues have developed a new model for the Earth's outer core, a liquid iron region deep in the Earth. | The outer core is churning constantly, sustaining the planet's magnetic field and providing heat to the mantle. "Understanding the outer core is crucial for understanding the history of the magnetic field," said Irving, an assistant professor of geosciences. Her team's work was published today.
"The model we have produced, EPOC -- Elastic Parameters of the Outer Core -- is going to be the background model, the one thing that underlies everything else," said Irving. The researchers describe EPOC as an outer core update of the existing Preliminary Reference Earth Model (PREM), a model of how fundamental Earth properties vary with depth, which was developed almost 40 years ago.
The key data in the research came from "normal modes," which are standing waves that can be measured after the very largest earthquakes, typically magnitude 7.5 or higher. Unlike the body waves and surface waves that most seismologists study, normal modes are "the vibration of the whole Earth at once, which is kind of an amazing thing to think about," Irving said. "We could say that the Earth 'rings like a bell,' at characteristic frequencies."
The new model, EPOC, was first envisioned at a four-week summer science workshop where Irving was housed with fellow seismologists Sanne Cottaar, at the University of Cambridge, and Vedran Lekić, at the University of Maryland-College Park.
"PREM is a venerable, very simple, well-regarded model, but it can't represent any small-scale structures," Irving said. "We thought, 'Can we make a simple model, with even fewer parameters than PREM, that does the job just as well?' It turned out we could make a model that does the job much better."
For one, EPOC reduces the need for a "complicated little layer" at the boundary between the core and the mantle, she said. Researchers in recent decades had found discrepancies between the PREM-predicted body wave velocity and the data they were finding, especially at the top of the core, and some had argued that there must be an anomalously slow layer hidden there. They debated how thick it should be -- estimates range from 50 to 300 miles -- and exactly what it must be composed of.
Her team's model doesn't offer any more specifics than PREM, Irving said, "but we suggest that because EPOC fits the data better, maybe you don't need this little layer." And additionally, it provides information about the material properties of the outer core.
The outer core is vitally important to the thermal history of the planet and its magnetic field, said Irving, but "it's not tangible. We can't show you a rock from the outer core. But at the same time, it is such a huge section of our planet. The core has roughly 30 percent of the mass of the planet in it. The crust is insignificant by comparison. There is so much that we don't understand about the deep earth -- and these aren't even the complicated properties. We're just looking for the very slowly varying bulk properties."
To create their model, Irving and fellow seismologists pooled their skills. Cottaar had experience with equations of state -- the physics explaining the connections between temperature, pressure, volume and other fundamental characteristics -- and Lekić was fluent in Bayesian techniques, a probabilistic approach that helped the team sift through countless possible models and find the most likely ones. And because of her background with normal mode seismology, Irving knew how to work with the newly updated dataset.
"So all three of us were seismologists with different specialized skill sets, and we liked to have coffee at breakfast together," Irving said. "It's so much fun doing science with friends."
The researchers fed the equations of state into Princeton's Tiger supercomputer cluster to generate millions of possible models of the outer core. "Every six seconds we created a new model," Irving said. "Some we rejected because they looked wrong. We have scientific tests for 'wrong,' for models that say things like, 'The mass of the Earth should be twice what we think it is.'"
The team then took the best of the models and used them to predict what frequencies the whole Earth would shake at after a massive earthquake. The researchers compared the measured frequencies of normal modes to the predictions from their models until they found their preferred model.
When teaching about normal modes, Irving uses the metaphor of two bells, one of brass and one of steel, both painted white. "If you hit those bells, you'll get different notes out of them, and that will tell you that you have different materials in there," she said. "The exact frequencies -- the exact pitch at which the Earth shakes after these very large earthquakes -- depend on the material properties of the Earth. Just like we can't see through the paint on the bells, we can't see through the planet, but we can listen for the pitch, the frequencies of these whole-Earth observations, and use them to make inferences about what's going on deep in the Earth." | Earthquakes | 2018 |
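The propose-predict-compare loop Irving describes is essentially Bayesian sampling: generate candidate models, predict normal-mode frequencies with a forward model, and keep the models that fit. A toy Metropolis sketch with a fabricated linear forward model standing in for the real normal-mode calculation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fabricated stand-in for the EPOC workflow: theta plays the role of
# equation-of-state parameters; forward() plays the normal-mode code.
def forward(theta):
    A = np.array([[1.0, 0.3], [0.5, 1.2], [0.8, 0.9]])
    return A @ theta                      # "predicted mode frequencies"

f_obs = forward(np.array([2.0, 1.0])) + rng.normal(0, 0.01, 3)
sigma = 0.01

def log_like(theta):
    r = forward(theta) - f_obs
    return -0.5 * np.sum((r / sigma) ** 2)

theta = np.array([1.0, 1.0])
samples = []
for _ in range(20000):                    # Metropolis random walk
    prop = theta + rng.normal(0, 0.02, 2)
    if np.log(rng.uniform()) < log_like(prop) - log_like(theta):
        theta = prop                      # accept better/plausible models
    samples.append(theta)
print("posterior mean:", np.round(np.mean(samples[5000:], axis=0), 3))
```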
June 27, 2018 | https://www.sciencedaily.com/releases/2018/06/180627160212.htm | Study yields a new scale of earthquake understanding | Nanoscale knowledge of the relationships between water, friction and mineral chemistry could lead to a better understanding of earthquake dynamics, researchers said in a new study. Engineers at the University of Illinois at Urbana-Champaign used microscopic friction measurements to confirm that, under the right conditions, some rocks can dissolve and may cause faults to slip. | "Water is everywhere in these systems," said Rosa Espinosa-Marzal, a civil and environmental engineering professor and co-author of the study. "There is water on the surface of minerals and in the pore spaces between mineral grains in rocks. This is especially true with calcite-containing rocks because of water's affinity to the mineral." According to the researchers, other studies have correlated the presence of water with fault movement and earthquakes, but the exact mechanism remained elusive. This connection is particularly relevant in areas where fracking operations are taking place -- a process that involves large volumes of water. The study focuses on calcite-rich rocks in the presence of brine -- naturally occurring salty groundwater -- along fault surfaces. The rock surfaces that slide past each other along faults are not smooth. The researchers zoomed in on the naturally occurring tiny imperfections or unevenness on rocks' surfaces, called asperities, at which friction and wear originate when the two surfaces slide past each other. "The chemical and physical properties of faulted rocks and mechanical conditions in these systems are variable and complex, making it difficult to take every detail into account when trying to answer these types of questions," Espinosa-Marzal said. "So, to help understand water's role in fault dynamics, we looked at a scaled-down, simplified model by examining single asperities on individual calcite crystals." For the experiments, the team submerged calcite crystals in brine solutions at various concentrations and subjected them to different pressures to simulate a natural fault setting. Once the crystals were in equilibrium with the solution, they used an atomic force microscope to drag a tiny arm with a silicon tip -- to simulate the asperity -- across the crystal to measure changes in friction. In most of the experiments, the researchers first found what they expected: As the pressure applied on the crystals increased, it became more difficult to drag the tip across the crystal's surface. However, when they increased pressure to a certain point and the tip was moved slowly enough, the tip began to slide more easily across the crystal. "This tells us that something has happened to this tiny asperity under higher pressures that caused a decrease in friction," said graduate student and co-author Yijue Diao. "The atomic force microscope also allows us to image the crystal surface, and we can see that the groove increased in size, confirming that calcite had dissolved under pressure. The dissolved mineral and water acted as a good lubricant, thereby causing the observed weakening of the single-asperity contact." "This shows that studies such as these warrant serious consideration in future work," Espinosa-Marzal said. The researchers acknowledge that there are still many questions to address related to this research.
However, their work demonstrates that certain brine-calcite interactions, under applied stress, induce dissolution and decrease the frictional strength at the single-asperity scale. "Our research also suggests that it might be possible to mitigate earthquake risk by purposely changing brine compositions in areas that contain calcite-rich rocks. This consideration could be beneficial in areas where fracking is taking place, but this concept requires much more careful investigation," Espinosa-Marzal said. "As a young scientist who works at the nanoscale, I never thought that earthquake dynamics would be the type of thing I would be researching," Diao said. "However, we have learned so much about things at the macroscale that nanoscale studies like ours can reveal new critical insights into many large-scale natural phenomena." | Earthquakes | 2,018
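To make the single-asperity observation concrete, here is a toy rendering of the experiment's logic with invented numbers: lateral (friction) force climbs with normal load until, past some threshold, pressure-driven dissolution lubricates the contact and friction drops. Nothing below comes from the study's data; the loads and forces are purely illustrative.

import numpy as np

normal_load_nN = np.array([50, 100, 150, 200, 250, 300, 350])
lateral_force_nN = np.array([12, 25, 38, 52, 60, 55, 48])  # invented AFM readings

mu = lateral_force_nN / normal_load_nN                  # effective friction coefficient
drops = np.where(np.diff(lateral_force_nN) < 0)[0] + 1  # indices where friction falls

for load, force, m in zip(normal_load_nN, lateral_force_nN, mu):
    print(f"load {load:3d} nN   friction {force:2d} nN   mu {m:.2f}")
if drops.size:
    print(f"frictional weakening begins near {normal_load_nN[drops[0]]} nN")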
June 26, 2018 | https://www.sciencedaily.com/releases/2018/06/180626113328.htm | Geologists detail likely site of San Andreas Fault's next major quake | Back in 1905, the Colorado River, swollen with heavy rainfall and snowmelt, surged into a dry lake bed along California's San Andreas Fault and formed the Salton Sea. The flood waters submerged most of the small town of Salton, along with nearby tribal lands. The inundation also covered a key, seismically active stretch of the San Andreas Fault's southern tip in silt, hiding evidence of its potential volatility. | Utah State University geologist Susanne Jänecke began hypothesizing the location and geometry of the sediment-obscured fault zone more than a decade ago. After securing funding from the Southern California Earthquake Center in 2011, she, along with USU graduate student Dan Markowski and colleagues, embarked on the painstaking task of documenting the uplifted, highly folded and faulted area with geologic mapping and analysis. The geologists' persistence revealed a nearly 15.5-mile-long sheared zone with two nearly parallel master faults and hundreds of smaller, rung-like cross faults. Dubbed the "Durmid Ladder" by the team, the well-organized structure could be the site of the region's next major earthquake. Jänecke, Markowski, USU colleague Jim Evans, Patricia Persaud of Louisiana State University and Miles Kenney of California's Kenney GeoScience reported their findings in a paper published online June 19, 2018. The discovery of the Durmid Ladder reveals that the southern tip of the San Andreas Fault changes fairly gradually into the ladder-like Brawley Seismic zone. The structure trends northwest, extending from the well-known main trace of the San Andreas Fault along the Salton Sea's northeastern shore to the newly identified East Shoreline Fault Zone on the San Andreas' opposite edge. "We now have critical evidence about the possible nucleation site of the next major earthquake on the San Andreas Fault," says Jänecke, professor in USU's Department of Geology. "That possible nucleation site was thought to be a small area near Bombay Beach, California, but our work suggests there may be an additional, longer 'fuse' south of the Durmid Ladder within the 37-mile-long Brawley Seismic zone." Future earthquakes in that zone or near the San Andreas Fault could potentially trigger a cascade of earthquakes leading to the overdue major quake scientists expect along the southern San Andreas fault zone, she says. "Fortunately, the northern continuation of the newly identified East Shoreline strand of the San Andreas Fault is farther away from major population centers than we first thought," Jänecke says. "The fault lies along the eastern edge of Coachella Valley. In addition, the broken rock throughout the ladder structure could dampen ground shaking associated with the next large earthquake." On the other hand, she says, the Durmid Ladder presents an increase in the surface-rupture hazard in Durmid Hill and, if the Brawley Seismic Zone is involved, the next large earthquake might be slightly larger than scientists previously expected. Among the tools Jänecke and her team used to identify the fault were high-resolution aerial photography and false-color imaging. "Many months of fieldwork were critical to the research," she says.
"We relied on this imagery to integrate the field study into our map of the complex ladder structure."Geophysical imaging and drilling confirmed the northward extend and identified the tilted fault zone in the subsurface near Palm Springs."On the ground and to our eyes, all of the tan-colored sediment looks the same," Jänecke says. "But further analysis with digital imaging tools highlighted the slight color differences of distinctive marker units."These markers, she says, allowed the team to recognize the hundreds of faults that displace the 3-0.2 million-year-old sedimentary rocks of the Durmid Ladder."The new maps and analysis revealed the ladder structure, which is a particular type of 'step-over,' where overlapping fault strands have many connecting cross faults," Jänecke says. "It's not clear now past earthquakes interacted with this structure and that makes its future behavior difficult to predict."Until now, the main trace of the San Andreas Fault has been the only well-studied active fault this area, she says. "We need further study of the Durmid Ladder, the East Shoreline Fault and other fault zones of this area to identify the potential for surface-faulting hazards, ground sharing and cascading ruptures, to determine how to mitigate the risk posed by these important structures." | Earthquakes | 2,018 |
June 19, 2018 | https://www.sciencedaily.com/releases/2018/06/180619164153.htm | Site of the next major earthquake on the San Andreas Fault? | Many researchers hypothesize that the southern tip of the 1300-km-long San Andreas fault zone (SAFZ) could be the nucleation site of the next major earthquake on the fault, yet geoscientists cannot evaluate this hazard until the location and geometry of the fault zone are documented. | In their new paper, the researchers document the newly identified Durmid ladder structure: it is at least 25 km long, has tens of master right-lateral and right-reverse faults along its edges, and has hundreds of left- and right-lateral cross faults in between. The Durmid ladder structure trends northwest and extends from the well-known main trace of the San Andreas fault (mSAF) on the northeast side to a newly identified East Shoreline fault zone (ESF) on the opposite edge. Many years of detailed field study validated the team's 2011 hypothesis about the existence of the East Shoreline strand of the SAFZ northeast of the margin of the Salton Sea, and this paper documents this previously unknown active fault using geophysical and geologic datasets along the entire northeast margin of Coachella Valley, California. The East Shoreline fault, say the authors, probably becomes the Garnet Hills fault, north of Palm Springs, and together they parallel the mSAF for >100 km. Uplifted, highly folded and faulted Pliocene to Holocene sedimentary rocks, evidence for pervasive shortening, map-scale damage zones, and extremely rapid block rotation indicate that convergence across the Durmid ladder structure of the SAFZ is the smaller, secondary component that accompanies more rapid right-lateral motions. Small amounts of shallow creep and triggered slip regularly produce hairline fractures along the mSAF, and Jänecke and colleagues recognize identical features within the ESF and along some cross faults of the Durmid ladder structure. It is not clear how past earthquakes interacted with this well-organized multi-fault structure, and, notes Jänecke, this makes future behavior difficult to predict. The mSAF was the only active fault considered by the geoscience community in this crucial area prior to the team's detailed study. New and published geophysical data sets and drill hole data in Coachella Valley show that the East Shoreline fault is a voluminous fault zone that extends in all three dimensions. It is well-imaged southwest of the mSAF and appears to persist into the subsurface at the southwest edge of a flower structure that may converge and simplify at depth. In such an interpretation, the ESF is steep, dips northeast, and is a key structure at the basinward edge of an asymmetric flower-like structure identified by Fuis et al. (2017) directly northwest of this study area. Southward, the Durmid ladder structure widens gradually as it bends and interacts with the even wider Brawley Seismic zone. The component of shortening across the southernmost San Andreas fault zone gives way along strike to components of extension in the Brawley Seismic Zone within a defined transition zone. This geometry makes it likely that both fault zones could fail during a single earthquake, as suggested by prior research. Several-kilometer-wide strike-slip fault zones, like the southern 30 km of the SAFZ, occur along many active faults and underlie metropolitan areas. The 2016 Mw 7.8 Kaikoura earthquake in New Zealand revealed that ladder-like fault zones can be enormous (at least 25 km wide and 150 km long) and fail in a piecemeal fashion.
The surface-faulting hazards, ground shaking, and cascading ruptures that might arise from interactions among faults in active, voluminous fault zones are not well understood or quantified, and much research is needed to mitigate the risk posed by this important type of structure. | Earthquakes | 2,018
June 18, 2018 | https://www.sciencedaily.com/releases/2018/06/180618112859.htm | 'Slow earthquakes' on San Andreas Fault increase risk of large quakes | Geologists have long thought that the central section of California's famed San Andreas Fault -- from San Juan Bautista southward to Parkfield, a distance of about 80 miles -- has a steady creeping movement that provides a safe release of energy. | Creep on the central San Andreas during the past several decades, so the thinking goes, has reduced the chance of a big quake that ruptures the entire fault from north to south. However, new research by two Arizona State University geophysicists shows that the earth movements along this central section have not been smooth and steady, as previously thought. Instead, the activity has been a sequence of small stick-and-slip movements -- sometimes called "slow earthquakes" -- that release energy over a period of months. Although these slow earthquakes pass unnoticed by people, the researchers say they can trigger large destructive quakes in their surroundings. One such quake was the magnitude 6 event that shook Parkfield in 2004. "What looked like steady, continuous creep was actually made of episodes of acceleration and deceleration along the fault," says Mostafa Khoshmanesh, a graduate research assistant in ASU's School of Earth and Space Exploration (SESE). He is the lead author of the study. "We found that movement on the fault began every one to two years and lasted for several months before stopping," says Manoochehr Shirzaei, assistant professor in SESE and co-author of the paper. "These episodic slow earthquakes lead to increased stress on the locked segments of the fault to the north and south of the central section," Shirzaei says. He points out that these flanking sections experienced two magnitude 7.9 earthquakes, in 1857 (Fort Tejon) and 1906 (San Francisco). The scientists also suggest a mechanism that might cause the stop-and-go movements. "Fault rocks contain a fluid phase that's trapped in gaps between particles, called pore spaces," Khoshmanesh says. "Periodic compacting of fault materials causes a brief rise in fluid pressure, which unclamps the fault and eases the movement." The two scientists used synthetic aperture radar data from orbit for the years 2003 to 2010. These data let them map month-to-month changes in the ground along the central part of the San Andreas. They combined the detailed ground movement observations with seismic records into a mathematical model. The model let them explore the driving mechanism of slow earthquakes and their link to big nearby quakes. "We found that this part of the fault has an average movement of about three centimeters a year, a little more than an inch," says Khoshmanesh. "But at times the movement stops entirely, and at other times it has moved as much as 10 centimeters a year, or about four inches." The picture of the central San Andreas Fault emerging from their work suggests that its stick-and-slip motion is, on a much smaller timescale, similar to the way other parts of the San Andreas Fault move. They note that the new observation is significant because it uncovers a new type of fault motion and earthquake-triggering mechanism, which is not accounted for in current models of earthquake hazards used for California. As Shirzaei explains, "Based on our observations, we believe that seismic hazard in California is something that varies over time and is probably higher than what people have thought up to now."
He adds that accurate estimates of this varying hazard are essential to include in operational earthquake forecasting systems. As Khoshmanesh says, "Based on current time-independent models, there's a 75% chance for an earthquake of magnitude 7 or larger in both northern and southern California within the next 30 years." | Earthquakes | 2,018
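Khoshmanesh's 75-percent figure is easy to unpack, because a time-independent forecast treats earthquakes as a Poisson process with a fixed annual rate, so the probability of at least one event in T years is P = 1 - exp(-rate x T). The short example below only rearranges the quoted number; it is not an independent hazard estimate.

import math

P_30yr, T = 0.75, 30.0
rate = -math.log(1.0 - P_30yr) / T   # implied annual rate of M7+ events
print(f"implied annual rate: {rate:.4f} per year (mean recurrence ~{1 / rate:.0f} years)")

for window in (1, 5, 10, 30, 50):    # probability over other time windows
    p = 1.0 - math.exp(-rate * window)
    print(f"P(at least one M7+ within {window:2d} yr) = {p:.2f}")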
June 6, 2018 | https://www.sciencedaily.com/releases/2018/06/180606090513.htm | Disaster recovery requires rebuilding livelihoods | The short-term losses people suffer when natural disasters strike can turn into long-term poverty if reconstruction policies don't consider how people are going to make a living. | Areas rich in biodiversity raise those stakes even more as people's needs compete with environmental protections, according to a new Michigan State University (MSU) study that is the first to focus on the "livelihood portfolio" of families when evaluating disaster recovery. The study was published this month. "This poverty may, in turn, prompt destructive use of natural resources and lead to poverty-environment traps in which poverty exacerbates environmental degradation and environmental degradation worsens poverty," the article authors write. More people across the world are exposed to more natural disasters thanks to ecological degradation and climate change. Massive hurricanes, floods and earthquakes not only wreak substantial damage on ecosystems; they often cause tremendous socioeconomic losses to human communities as well. A group of natural and social scientists at MSU looked at the well-being of people after a disaster. Their focus was on the recovery from the massive 2008 Wenchuan Earthquake in the Wolong Nature Reserve in southwestern China, but the results have relevance for areas worldwide that seek to balance sustainable recovery with the long-term well-being of people. In Wolong, governmental agencies were eager to replace homes and infrastructure and move people out of the mountainous hills of the nature reserve and closer to flatter areas such as towns and roads. The intentions were good and provided what seemed like multiple levels of solutions. People once living in the high mountains had more access to work in towns, and people were moved farther from the fragile biodiversity that is home to the threatened and iconic giant pandas and other valuable animals and plants. Yet the scientists, led by PhD student Hongbo Yang of the MSU Center for Systems Integration and Sustainability (CSIS), measured what constituted well-being for the quake's human victims and found as-yet-unidentified losses. Building new housing eliminated cropland, and with it opportunities for income from farming. Many people from Wolong were able to find work in distant urban areas, but city work came with hardships and stress, such as increased living expenses, an unfair education system for their children, and a lack of a sense of belonging in the cities. Unintended consequences, in essence, were in many ways eroding their well-being. These were examples of how factors distant from the earthquake's epicenter could have an impact on the people there. These socioeconomic and environmental interactions over distances, known as telecouplings, can significantly influence disaster recovery, both in Wolong and in areas devastated by other natural disasters such as hurricanes, volcanoes and floods. "In these times of tremendous global connectivity, a natural disaster will be felt far from its point of occurrence, and as people strive for recovery, they will be affected by solutions far away," said senior author Jianguo "Jack" Liu, MSU Rachel Carson Chair in Sustainability and CSIS director.
"It has become imperative we are able to look at solutions systemically and holistically to avoid unintended consequences and make sure we don't just rebuild -- but also we preserve the well-being of people." | Earthquakes | 2,018 |
June 5, 2018 | https://www.sciencedaily.com/releases/2018/06/180605103503.htm | Scientists find pre-earthquake activity in central Alaska | Earth scientists have long searched for a reliable way to forecast earthquakes. New research from University of Alaska Fairbanks Geophysical Institute professor Carl Tape may help in that endeavor, due to a unique set of circumstances. | "Our observations have recorded an unequivocally interesting sequence of events," Tape said. Tape and his colleagues found evidence for accelerating activity before a 2016 earthquake in a laterally moving fault zone in central Alaska. The activity included a phenomenon known as very low-frequency earthquakes, referring to the type of energy waves associated with it. Typical earthquakes have two associated energy waves, called the P and S waves. Very low-frequency earthquakes do not have such signals. Instead, their waves occur at much lower frequencies. "Most earthquakes start abruptly, but not always," said Luciana Astiz, a program director in the National Science Foundation's Division of Earth Sciences, which supported the research. "A fault zone in central Alaska monitored by new scientific instruments offers a look at a more complex process. This study reports the first observations of a slow process that transitions into an earthquake -- something previously observed only in laboratory experiments. These new observations contribute toward understanding the physics of earthquakes." In 2015, Tape installed 13 seismic stations in the Minto Flats of central Alaska to capture the area's fault activity. Nine days later, the instruments recorded a long-duration, very low-frequency process, normally only seen in deep subduction zones. This event showed a small amount of activity gathering, or nucleating, in a central area below the surface. It did not lead to an earthquake. A second, similar event in 2016 led to a key observation. At Minto Flats, a magnitude 3.7 quake occurred at a depth of about 10.5 miles, not an unusual event in itself. However, the event was preceded by a 12-hour accelerating sequence of earthquakes and 22 seconds of distinct high- and low-frequency waves in a concentrated area. Tape said that this kind of slow event transitioning into a rupture had previously only been seen in laboratory experiments. "The rupture process started, then it found a patch of the fault that was ready to go, and that's what people have not seen. It's really exciting," Tape said. "The leap we make, and maybe the more controversial thing, is that this emergent long-period signal only seen on top of the fault is a low-frequency signal that can sometimes turn into an earthquake and sometimes not," Tape said. Tape and his colleagues may have seen this kind of activity before. In 2012, there was a similar small event recorded in central Alaska. At that time, a magnitude 8.6 earthquake took place under the Indian Ocean and its energy was felt around the world. Because of the magnitude of this event, the smaller activity from central Alaska was overshadowed. Whatever signal the Minto Flats site gave off could not be confirmed. However, it was intriguing enough to help justify putting sensors in the area. "Never in my wildest dreams did I expect we'd see something like that again," Tape said. "I assumed that the conditions that happened in 2012 were somehow unique and that huge surface waves led to this nucleation. Even though I proposed putting instruments in the area, it was the last item I put on the proposal. I thought,
'Maybe we'll see something crazy out there.'" By 2016, Tape had high-quality stations on top of the Minto Flats faults, around 18 miles from the main events, and no triggering earthquake to complicate the data. "We are staring right at this process, and what it showed was that exactly during the tremor-like signal there is this emergent long-duration signal that hints at what's driving this nucleation phase," he said. Geologists have been looking for something like this for a long time. So why hasn't anyone seen it? "I'm left saying 'I don't know,'" Tape said. "I'm going to assume everyone has been looking for something before the P wave forever. It leads me to believe there is something special about this fault zone." Minto Flats has a deep sedimentary basin, strike-slip faulting, active tectonics and deep earthquakes; it is an unusual site. "In some ways, I wish there wasn't anything special. I wish it was a global phenomenon that we discovered, but it's not," Tape said. "It appears there is something special about the conditions in Minto Flats." The results of the research have now been published. The project was primarily funded by a National Science Foundation CAREER project that supported Tape and his student co-authors, Vipul Silwal and Kyle Smith. | Earthquakes | 2,018
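One way to see the kind of analysis implied here is to band-pass a seismogram into separate long-period and ordinary-earthquake bands and track each band's energy through time; an emergent low-frequency precursor would raise the first band without the second. The sketch below runs on a synthetic trace, not the Minto Flats data, and the band edges and window lengths are illustrative choices.

import numpy as np
from scipy.signal import butter, filtfilt

fs = 50.0                                   # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
trace = 0.1 * rng.standard_normal(t.size)               # background noise
trace += (t > 20) * 0.5 * np.sin(2 * np.pi * 0.5 * t)   # emergent 0.5 Hz signal
trace += (t > 40) * 2.0 * np.sin(2 * np.pi * 8.0 * t)   # "earthquake" energy at 8 Hz

def band_rms(x, lo, hi, win_s=5.0):
    # RMS amplitude of the band-passed trace in consecutive windows
    b, a = butter(2, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    y = filtfilt(b, a, x)
    win = int(win_s * fs)
    return [np.sqrt(np.mean(y[i:i + win] ** 2)) for i in range(0, x.size - win + 1, win)]

low = band_rms(trace, 0.2, 1.0)    # long-period band
high = band_rms(trace, 5.0, 15.0)  # typical local-earthquake band
for i, (l, h) in enumerate(zip(low, high)):
    print(f"{i * 5:3d}-{i * 5 + 5:3d} s   low-band {l:.3f}   high-band {h:.3f}")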
June 4, 2018 | https://www.sciencedaily.com/releases/2018/06/180604112606.htm | Long thought silent because of ice, study shows East Antarctica seismically active | Because instruments were finally installed there, scientists can no longer say that East Antarctica is unusually seismically silent. | Since the first earthquake was detected in 1982, there have been just eight more seismic events recorded in East Antarctica. But after a team that included Amanda Lough, PhD -- then a student but now an assistant professor in Drexel University's College of Arts and Sciences -- set up the first winter-through-summer seismic array, 27 earthquakes were recorded in 2009 alone, tripling the total number of events recorded on East Antarctica's section of the Earth's crust. So instead of being exceptionally stable, it appears East Antarctica just wasn't being watched closely enough. "Ultimately, the lack of recorded seismicity wasn't due to a lack of events but a lack of instruments close enough to record the events," explained Lough, who is the lead author on a study discussing the array's results. To scientists, East Antarctica is what's known as a "craton": a large, stable piece of rock on the Earth's crust that forms significant pieces of continents. But East Antarctica was always considered to be unusual for its lack of seismic activity. Many believed this was due to suppression via the incredible weight of Antarctica's thick ice. Recordings from the array, however, show that East Antarctica is similar to other cratons across the world, especially the Canadian Shield, which stretches from the Northwest Territories to Quebec. The ice did not appear to be the culprit behind the silence; it was solely a lack of ability to record what was happening. Not having an array wasn't for lack of motivation, though. Setting up the array was no easy task, and it took a lot of determination and effort from the team Lough was on. That included living in Antarctica's frigid, harsh environment, and flying from point to point across Antarctica's icy expanse (often having to dig out their own runways) to place the seismic recording equipment (which required even more digging). Those flights were usually cramped, with Lough wedging herself in among equipment and even fuel. But the efforts paid off once the 27 earthquakes were all measured in 2009, each ranging in magnitude from 2.1 to 3.9. Most seismic activity was recorded in basins near the Gamburtsev Subglacial Mountains, which Lough and others believe are part of an ancient continental rift system, older than the type in West Africa.
"Rifts" are areas where the Earth's crust is being pulled apart from each other."There is a study we cite in the paper that shows that more than 52 percent of seismic events in continental areas occur in rifted crust, so it's not unexpected that we see the correlation here," Lough said.Very little seismic activity was recorded outside of the rifts, which are similar to the New Madrid Seismic Zone in the southern Midwest United States."The rifts provide zones of weakness that enable faulting to occur more easily, and it may be that the situation here is such that activity is occurring preferentially along these areas of preexisting weakness," Lough said, though she emphasized "we only have one year of data" and more needs to be observed before they have a "full picture."But the study from this array will contribute to gaining a fuller picture of what happens inside of the Earth's tectonic plates."East Antarctica is basically another piece of the puzzle," Lough said.And while Antarctica is largely home to penguins, seals and not much else, what is learned there can provide lessons for the places on Earth where people do live. The big one: You don't know what you don't measure."Antarctica is the least-instrumented continent, but other areas of the globe also lack sufficient instrumentation," Lough said. "There are some obvious holes in coverage in the Global Seismic Network. For example, the ocean covers 71 percent of the planet, but it is expensive and very difficult to get instruments there. We need to think about improving coverage and then improving the density of it." | Earthquakes | 2,018 |
June 4, 2018 | https://www.sciencedaily.com/releases/2018/06/180604112555.htm | Doubt cast on the predictive value of earthquake foreshocks | No one can predict when or where an earthquake will strike, but in 2011 scientists thought they had evidence that tiny underground tremors called foreshocks could provide important clues. If true, it suggested seismologists could one day warn people of impending temblors. | But a new study published online June 4 casts doubt on that evidence. The previous evidence came from a 7.6 magnitude earthquake in 1999 near Izmit, Turkey, that killed more than 17,000 people. A 2011 study in the journal Science found that the deadly quake was preceded by a series of small foreshocks -- potential warning signs that a big seismic event was imminent. "We've gone back to the Izmit earthquake and applied new techniques looking at seismic data that weren't available in 2011," said lead author William Ellsworth, a professor (research) of geophysics at Stanford School of Earth, Energy & Environmental Sciences. "We found that the foreshocks were just like other small earthquakes. There was nothing diagnostic in their occurrence that would suggest that a major earthquake was about to happen." "We'd all like to find a scientifically valid way to warn the public before an earthquake begins," said co-author Fatih Bulut, an assistant professor of geodesy at Boğaziçi University's Kandilli Observatory and Earthquake Research Institute. "Unfortunately, our study doesn't lead to new optimism about the science of earthquake prediction." Scientists including Ellsworth have proposed two ideas of how major earthquakes form, one of which -- if its signals could be detected -- could warn of a larger quake. "About half of all major earthquakes are preceded by smaller foreshocks," Ellsworth said. "But foreshocks only have predictive value if they can be distinguished from ordinary earthquakes." One idea, known as the cascade model, suggests that foreshocks are ordinary earthquakes that travel along a fault, one quake triggering another one nearby. A series of smaller cascading quakes could randomly trigger a major earthquake, but could just as easily peter out. In this model, a series of small earthquakes wouldn't necessarily predict a major quake. "It's a bit like dominos," Bulut said. "If you put dominos on a table at random and knock one over, it might trigger a second or third one to fall down, but the chain may stop. Sometimes you hit that magic one that causes the whole row to fall." Another theory suggests that foreshocks are not ordinary seismic events but distinct signals of a pending earthquake driven by slow slip of the fault. In this model, foreshocks repeatedly rupture the same part of the fault, causing it to slowly slip and eventually trigger a large earthquake. In the slow-slip model, repeating foreshocks emanating from the same location could be early warnings that a big quake is coming.
The question had been whether scientists could detect slow slip when it is happening and distinguish it from any other series of small earthquakes. In 2011, a team argued in Science that the foreshocks preceding the 1999 quake in Izmit were driven by slow slip, and could have been detected with the right equipment -- the first evidence that foreshocks would be useful for predicting a major earthquake. "That result has had a large influence in thinking about the question of whether foreshocks can be predictive," Ellsworth said. The city of Izmit is located on the North Anatolian Fault, which stretches about 900 miles (1,500 kilometers) across Turkey. For the 2011 study, a team analyzed data from a single seismic station several miles from the earthquake epicenter, which ultimately recorded seismograms of 18 foreshocks occurring about 9 miles (15 kilometers) below the surface -- very close to where the larger earthquake began -- and each with similar waveforms. Those similarities led the authors to conclude that all of the foreshocks repeatedly broke the same spot on the fault, driven by slow slip that ultimately triggered the major earthquake. They concluded that monitoring similar events could provide timely warning that a big quake is imminent. "The Science paper concluded that there was a lot of slow slip, and had we been there with the right instruments we might have seen it," Ellsworth said. "We decided to test that idea that the foreshocks were co-located." Instead of relying on data from one seismic station, Ellsworth and Bulut analyzed seismograms recorded at nine additional stations during the 1999 earthquake. With more stations, Ellsworth and Bulut identified a total of 26 foreshocks. None were identical, and the largest ones progressively moved from west to east along the fault. This finding is consistent with the cascade model, where an ordinary earthquake triggers another quake on a neighboring part of the fault, but doesn't necessarily predict a major quake. Bulut and Ellsworth found no evidence that slow slip played a role in triggering the Izmit earthquake. "The authors of the Science paper were quite optimistic, but what they proposed had happened did not happen," Ellsworth said. What the researchers don't know is why this series of cascading foreshocks triggered a massive earthquake when so many others don't. Ellsworth said that without better seismic instrumentation, important challenges like predicting earthquakes will remain out of reach. "We're not giving up on foreshocks just because we currently can't tell them apart from other earthquakes," Ellsworth said. "We want to understand if they have predictive value and if not, why not. To answer that question will require observations made close to the action, deep in the heart of the earthquake machine, not as we currently do from the surface where we're blind to changes deep underground." | Earthquakes | 2,018
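The co-location argument rests on waveform similarity: events that rupture the same patch of fault produce nearly identical seismograms at a given station, so the standard test is the peak of the normalized cross-correlation between traces. Below is a minimal sketch with synthetic signals, not the Izmit records; real analyses add alignment, filtering, and station-by-station comparisons.

import numpy as np

def max_norm_xcorr(a, b):
    # Peak of the normalized cross-correlation between two equal-length traces
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return np.max(np.correlate(a, b, mode="full"))

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 500)
base = np.sin(2 * np.pi * 5 * t) * np.exp(-3 * t)                 # synthetic "event"
twin = base + 0.05 * rng.standard_normal(t.size)                  # same source, noisy copy
other = np.sin(2 * np.pi * 9 * t) * np.exp(-2 * t) + 0.05 * rng.standard_normal(t.size)

print("repeating pair:", round(max_norm_xcorr(base, twin), 2))   # near 1.0
print("distinct pair: ", round(max_norm_xcorr(base, other), 2))  # clearly lower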
May 31, 2018 | https://www.sciencedaily.com/releases/2018/05/180531102812.htm | Widespread methane seeps off Oregon coast | For the past two years, scientists from Oregon State University and the National Oceanic and Atmospheric Administration (NOAA) have surveyed the Pacific Northwest near-shore region mapping sites where underwater bubble streams signify methane gas is being released from the seafloor. | And what they have found is eye-opening. Since the first evidence of underwater methane was discovered in the late 1980s, only about 100 "seep sites" had been identified along the Northwest coast through 2015. They often were discovered by accident, when fishermen would spot anomalies on their fish-finders that turned out to be acoustic reflections of the bubbling methane gas. But over the past two years the scientists -- aided by new sonar technology on the Exploration Vessel (E/V) Nautilus, owned and operated by the Ocean Exploration Trust -- have purposefully gone seeking evidence of underwater methane and have expanded the total number of offshore seep emission sites to a whopping 1,000 locations. It is not yet clear whether the methane presents an opportunity for a new source of energy or a potentially serious environmental threat, but for now the researchers want to map the distribution of the sites and conduct research on the composition and sources of the gas. They believe they will discover new methane seeps this summer when they utilize several research vessels to conduct additional mapping off the Northwest coast. "Using this new sonar system, we've mapped only about 38 percent of the seafloor and about 25 percent of the water column data from Washington to Northern California," said Susan Merle, an Oregon State oceanographer who works out of OSU's Hatfield Marine Science Center in Newport, Ore. "No doubt, there are more sites out there, and we hope to find them." The researchers will embark on another expedition this June aboard the E/V Nautilus and use a remotely operated vehicle during daytime to collect samples of gas, methane hydrate, seepwater, fauna and rocks right where the methane is exiting the seafloor. Investigating these samples will help tell the story of the origin of the seeping methane and its impact on life at these sites and in the overlying ocean. During the night, they will do extensive mapping of areas not yet surveyed in an attempt to locate more methane seeps. The potentially vast storehouse of this potent greenhouse gas could provide an intriguing energy source, but it is widely distributed and may not be economical to extract, noted Robert Embley, an OSU geophysicist who spent much of his career with NOAA's Pacific Marine Environmental Laboratory. "It is very tricky and potentially hazardous to attempt to extract methane," Embley said. "Mining would have all sorts of implications. When methane appears as a hydrate it can look like a chunk of ice or snow. But when you try to bring it to the surface, it immediately begins to decompose. In addition to hydrates, we've found hundreds of bubble streams, but their origin and scope remain to be seen." In their survey, the researchers have found that, together with some free gas, much of the methane below an ocean depth of 500 meters (about 1,600 feet) is found as solidified hydrates.
Above 500 meters, it usually appears as a gas in bubble streams. "One question we'd like to address is whether hydrates are formed by methane gas seeping out of the Earth and meeting the cold, deep seawater, or are the bubbles we're seeing a result of the hydrates breaking down and releasing gas," said John Lupton, a chemical oceanographer with NOAA/PMEL. Just how much methane is off the Northwest coast is uncertain, the researchers say. But it appears to be a lot, and it could cause environmental problems. "One concern is what would happen during a major Cascadia Subduction Zone earthquake," Embley pointed out. "It would add more permeability to the seafloor, add more pathways for the methane to escape, and increase the potential for its release to the atmosphere." So what is happening with all of the methane that is bubbling up out of the seafloor and into the Pacific Ocean waters? Tamara Baumberger, a researcher with the Cooperative Institute for Marine Resources Studies -- a joint OSU/NOAA center based at Hatfield -- has sampled some of the bubbles from the site and found different chemical signatures that helped the researchers pinpoint the origin of the methane. Some of it was "thermogenic" -- the result of organic material like dead plankton being heated up and transformed into the gas. Some was "biogenic," in which the organic material was altered by microbial activity. "When methane is in seawater, it often is oxidized into carbon dioxide by microbial activity, which can keep much of it from reaching the atmosphere," Baumberger said. "The downside, of course, is that the newly formed CO2 is also a problem and it can both reach the atmosphere and increase ocean acidification." Baumberger said that methane released into shallow water can get into the atmosphere more quickly because the bacteria don't have enough time to oxidize it. However, the researchers are unsure how many methane seeps lie in shallow water, which they define as less than about 150 meters. "We know very little about methane seep distribution in shallow water because it is very difficult to map there," Merle said. "But everywhere we looked for seeps, we found them. That's one of our goals this June -- to get a better handle on how prevalent the shallower seeps might be." OSU's La Verne Kulm was one of the first to discover methane seeps off the coast in the 1980s, and a decade later, researchers documented ample methane at Hydrate Ridge off the Oregon coast. University of Washington scientist Paul Johnson mapped many of the Washington locations, and OSU's Marta Torres found more hydrates off the Heceta Bank in 1998. Beginning in 2016, though, the search began in earnest and the researchers have found a large aggregation of methane seep sites off the Coquille Bank near Coos Bay, as well as in the Astoria Canyon, "which is full of them," Merle said. "Wherever we find canyons, we seem to find methane." They also discovered methane seeps off Newport, Oregon, in water that was only 40 meters deep. Some of the methane samples included traces of helium, which is only found in the mantle, the researchers noted. "This research has raised some interesting questions," Baumberger said. "How common is mantle gas in the Cascadia Margin methane seeps? How stable is the system during an earthquake? Will a warming ocean lead to an increase in the release of methane? What we're trying to do is identify how much is out there and establish a baseline.
Then we can address these and other scientific questions." The NOAA Office of Ocean Exploration and Research and the Ocean Exploration Trust, Inc., supported the research. | Earthquakes | 2,018
May 30, 2018 | https://www.sciencedaily.com/releases/2018/05/180530113243.htm | Seismometer readings could offer debris flow early warning | First came the fire, then the rain -- and finally, the devastating mud. | In the wake of the largest wildfire in California's history, the December 2017 Thomas Fire, a powerful storm dumped about five inches of rain on the denuded hillsides of Ventura County, triggering debris flows on January 9 that killed 21 people and destroyed hundreds of homes in the Montecito and San Ysidro Creek areas. Seismologists at Caltech noticed that the rumble and roar of the mudslide was detected by a seismometer about 1.5 kilometers away from the worst of the damage. Significantly, they found that the seismogram generated by the event reveals information about debris-flow speed, the width of the flow and the size of boulders it carried, and the location of the event -- results suggesting that the current generation of seismometers in the field could be used to provide an early warning of an incoming debris flow to residents in mudslide-prone areas. Their study was published online on May 30. The research was led by Victor Tsai (BS '04), corresponding author of the paper and professor of geophysics. Tsai has long been interested in exploring what information can be gathered from seismometers beyond the usual earthquake signals they were designed to detect. "The motion of the ground can indicate a lot of things, from the detonation of a warhead to the motion of a glacier. The trick is determining what the signal means," he says. As such, he had already started working on a model that predicted what a mudslide should look like on a seismometer, based on existing models of sediment transported by water. Collaborating with Tsai, graduate student Voon Hui Lai gathered data from the three seismometers located within a few kilometers of the mudslide to search for the signal predicted by Tsai's model. Due to proximity and technical issues, two of the seismometers did not robustly record the slide. The third, however, did. "It wasn't immediately obvious, but after a while, we found it," Lai says. The signal, a rumbling lasting for nearly 20 minutes, showed up in the 5-10 hertz frequency band, which is at the lower threshold of human hearing. The team was able to determine that the signal was indeed the mudslide based on its timing and by ruling out other potential sources. It almost perfectly matched the predictions made by Tsai's model. Analyzing the resulting seismogram, Tsai and Lai were further able to show how the signal could be used to estimate key elements of a debris flow (size, speed, and intensity) based on how they influence the shaking of the ground. The signal indicated that the debris flow "snout" (where the largest-sized boulders are) covered about a 50 meter by 50 meter area; that the boulders carried by the flow reached up to around 1.3 meters in diameter; and that the flow speed was about 2.4 meters per second. Now that they know what to look for and have a model for what the seismogram is indicating, scientists can use this to develop an early warning system based on existing seismometers, Tsai says. "Debris flows move much slower than earthquakes, so we could potentially develop an early warning system that would offer important warnings for residents and first responders," he says. "Like earthquakes, debris flows occur infrequently and are dangerous to observe directly," adds co-author Michael Lamb, professor of geology.
"By measuring ground shaking at a safe distance, our study shows that seismology has great potential to improve our understanding of when, where, and why debris flows happen."The researchers plan to keep testing and fine-tuning the model using controlled experiments that yield more precise measurements.The study is titled "The Seismic Signature of Debris Flows: Flow Mechanics and Early Warnings at Montecito, California." Co-authors include Thomas Ulizio, laboratory manager and research assistant at Caltech, and Alexander Beer, postdoctoral scholar in geology. This research was supported by the National Science Foundation and the Swiss National Science Foundation. | Earthquakes | 2,018 |
May 29, 2018 | https://www.sciencedaily.com/releases/2018/05/180529131958.htm | Flow in the asthenosphere drags tectonic plates along | New simulations of Earth's asthenosphere find that convective cycling and pressure-driven flow can sometimes cause the planet's most fluid layer of mantle to move even faster than the tectonic plates that ride atop it. | That's one conclusion from a new study by Rice University geophysicists who modeled flow in the 100-mile-thick layer of mantle that begins at the base of Earth's tectonic plates, or lithosphere. The study is available online. "Tectonic plates float on top of the asthenosphere, and the leading theory for the past 40 years is that the lithosphere moves independently of the asthenosphere, and the asthenosphere only moves because the plates are dragging it along," said graduate student Alana Semple, lead co-author of the new study. "Detailed observations of the asthenosphere from a Lamont research group returned a more nuanced picture and suggested, among other things, that the asthenosphere has a constant speed at its center but is changing speeds at its top and base, and that it sometimes appears to flow in a different direction than the lithosphere." Computational modeling carried out at Rice offers a theoretical framework that can explain these puzzling observations, said Adrian Lenardic, a study co-author and professor of Earth, environmental and planetary sciences at Rice. "We've shown how these situations can occur through a combination of plate- and pressure-driven flow in the asthenosphere," he said. "The key was realizing that a theory developed by former Rice postdoc Tobias Höink had the potential to explain the Lamont observations if a more accurate representation of the asthenosphere's viscosity was allowed for. Alana's numerical simulations incorporated that type of viscosity and showed that the modified model could explain the new observations. In the process, this offered a new way of thinking about the relationship between the lithosphere and asthenosphere." Though the asthenosphere is made of rock, it is under intense pressure that can cause its contents to flow. "Thermal convection in Earth's mantle generates dynamic pressure variations," Semple said. "The weakness of the asthenosphere, relative to tectonic plates above, allows it to respond differently to the pressure variations. Our models show how this can lead to asthenosphere velocities that exceed those of plates above. The models also show how flow in the asthenosphere can be offset from that of plates, in line with the observations from the Lamont group." The oceanic lithosphere is formed at mid-ocean ridges and flows toward subduction zones where one tectonic plate slides beneath another. In the process, the lithosphere cools and heat from Earth's interior is transferred to its surface. Subduction recycles cooler lithospheric material into the mantle, and the cooling currents flow back into the deep interior. Semple's 3D model simulates both this convective cycle and the asthenosphere. She credited Rice's Center for Research Computing (CRC) for its help running simulations -- some of which took as long as six weeks -- on Rice's DAVinCI supercomputer. Semple said the simulations show how convective cycling and pressure-driven flow can drive tectonic movement. "Our paper suggests that pressure-driven flow in the asthenosphere can contribute to the motion of tectonic plates by dragging plates along with it," she said.
"A notable contribution does come from 'slab-pull,' a gravity-driven process that pulls plates toward subduction zones. Slab-pull can still be the dominant process that moves plates, but our models show that asthenosphere flow provides a more significant contribution to plate movement than previously thought."The research was supported by the National Science Foundation. DAVinCI is administered by CRC and was procured in partnership with Rice's Ken Kennedy Institute for Information Technology. | Earthquakes | 2,018 |
May 24, 2018 | https://www.sciencedaily.com/releases/2018/05/180524112401.htm | Cold production of new seafloor | A mountain range with a total length of 65,000 kilometers runs through all the oceans. It marks the boundaries of tectonic plates. Through the gap between the plates, material from the Earth's interior emerges, forming new seafloor, building up the submarine mountains and spreading the plates apart. Very often, these mid-ocean ridges are described as a huge, elongated volcano. But this image is only partly correct, because the material forming the new seafloor is not always magmatic. At some spreading centres material from the Earth's mantle reaches the surface without being melted. The proportion of seabed formed this way was previously unknown. | Scientists from the Universities of Kiel (Germany), Austin (Texas, USA) and Durham (Great Britain) have now published data quantifying this proportion at one such spreading centre, located in the Cayman Trough south of the island of Grand Cayman in the Caribbean. In 2015, the researchers used the German research vessel METEOR to investigate the seafloor seismically, i.e. by using sound waves. Sound signals sent through different rock or sediment layers are reflected and refracted in different ways by each layer. Rock that has been melted and solidified on the seabed has a different signature in the seismic signal than rock from the Earth's mantle, which has not been melted. But until now scientists faced a problem: contact with seawater changes the mantle rocks. "After this process, called serpentinisation, mantle rocks are barely distinguishable from magmatic rocks in seismic data," says Professor Ingo Grevemeyer. Until now, mantle rock on the seabed could only be detected by taking samples directly from the seafloor and analyzing them in the laboratory. "But that way you only get information about a tiny spot. Large-scale or in-depth information on the composition of the seabed cannot be achieved," says Grevemeyer. However, during the expedition in 2015, the team not only used the energy of ordinary sound waves -- it also detected so-called shear waves, which occur only in solid materials. They could be recorded very clearly thanks to a clever selection of measuring points. From the ratio of the speeds of the two types of waves, the scientists were able to differentiate mantle material from magmatic material. "So we could prove for the first time with seismic methods that up to 25 percent of the young ocean floor is not magmatic at the ultra-slow spreading centre in the Cayman trough," says Grevemeyer. Since there are similar spreading centres in other regions, such as the Arctic or Indian Ocean, these results are of great importance for the general idea about the global composition of the seabed. "This is relevant if we want to create global models on the interactions between seabed and seawater or on processes of plate tectonics," summarizes Grevemeyer. | Earthquakes | 2,018
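The discrimination step can be sketched schematically: serpentinisation lowers shear-wave speed more than P-wave speed, so the Vp/Vs ratio separates altered mantle rock from magmatic crust even where P-wave speed alone is ambiguous. The velocities and the cutoff below are illustrative values, not the study's measurements.

samples = {
    "site A": (6.5, 3.7),  # (Vp, Vs) in km/s; hypothetical gabbro-like magmatic crust
    "site B": (6.4, 3.1),  # hypothetical serpentinised mantle rock
}
VP_VS_CUTOFF = 1.9         # assumed threshold; serpentinites tend toward higher Vp/Vs

for site, (vp, vs) in samples.items():
    ratio = vp / vs
    kind = "serpentinised mantle?" if ratio > VP_VS_CUTOFF else "magmatic crust?"
    print(f"{site}: Vp/Vs = {ratio:.2f} -> {kind}")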
May 23, 2018 | https://www.sciencedaily.com/releases/2018/05/180523150010.htm | Applying machine learning tools to earthquake data offers new insights | For all that seismologists have learned about earthquakes, new technologies show how much remains to be discovered. | In a new study, researchers show that machine learning can reveal previously unseen patterns in earthquake recordings from The Geysers geothermal field in California. "It's a totally new way of studying earthquakes," said study coauthor Benjamin Holtzman, a geophysicist at Columbia's Lamont-Doherty Earth Observatory. "These machine learning methods pick out very subtle differences in the raw data that we're just learning to interpret." The approach is novel in several ways. The researchers assembled a catalog of 46,000 earthquake recordings, each represented as energy waves in a seismogram. They then mapped changes in the waves' frequency through time, which they plotted as a spectrogram -- a kind of musical roadmap of the waves' changing pitches, were they to be converted to sound. Seismologists typically analyze seismograms to estimate an earthquake's magnitude and where it originated. But looking at an earthquake's frequency information instead allowed the researchers to apply machine-learning tools that can pick out patterns in music and human speech with minimal human input. With these tools, the researchers reduced each earthquake to a spectral "fingerprint" reflecting its subtle differences from the other quakes, and then used a clustering algorithm to sort the fingerprints into groups. The machine-learning assist helped researchers make the link between earthquake signals and the fluctuating amounts of water injected belowground during the energy-extraction process, giving the researchers a possible explanation for why the computer clustered the signals as it did. "The work now is to examine these clusters with traditional methods and see if we can understand the physics behind them," said study coauthor Felix Waldhauser, a seismologist at Lamont-Doherty. "Usually you have a hypothesis and test it. Here you're building a hypothesis from a pattern the machine has found." If the earthquakes in different clusters can be linked to the three mechanisms that typically generate earthquakes in a geothermal reservoir -- shear fracture, thermal fracture and hydraulic cracking -- it could be possible, the researchers say, to boost power output in geothermal reservoirs. If engineers can understand what's happening in the reservoir in near real-time, they can experiment with controlling water flows to create more small cracks, and thus more heated water to generate steam and eventually electricity. These methods could also help reduce the likelihood of triggering larger earthquakes -- at The Geysers, and anywhere else fluid is pumped underground, including at fracking-fluid disposal sites. Finally, the tools could help identify the warning signs of a big one on its way -- one of the holy grails of seismology. The research grew out of an unusual artistic collaboration. As a musician, Holtzman had long been attuned to the strange sounds of earthquakes. With sound designer Jason Candler, Holtzman had converted the seismic waves of recordings of notable earthquakes into sounds, and then sped them up to make them intelligible to the human ear. Their collaboration, with study coauthor Douglas Repetto, became the basis for Seismodome, a recurring show at the American Museum of Natural History's Hayden Planetarium that puts people inside the earth to experience the living planet. As the exhibit evolved, Holtzman began to wonder if the human ear might have an intuitive grasp of earthquake physics.
In a series of experiments, he and study coauthor Arthur Paté, then a postdoctoral researcher at Lamont-Doherty, confirmed that humans could distinguish between temblors propagating through the seafloor or more rigid continental crust, and originating from a thrust or strike-slip fault. Encouraged, and looking to expand the research, Holtzman reached out to study coauthor John Paisley, an electrical engineering professor at Columbia Engineering and Columbia's Data Science Institute. Holtzman wanted to know if machine-learning tools might detect something new in a gigantic dataset of earthquakes. He decided to start with data from The Geysers because of a longstanding interest in geothermal energy. "It was a typical clustering problem," says Paisley. "But with 46,000 earthquakes it was not a straightforward task." Paisley came up with a three-step solution. First, a type of topic modeling algorithm picked out the most common frequencies in the dataset. Next, another algorithm identified the most common frequency combinations in each 10-second spectrogram to calculate its unique acoustic fingerprint. Finally, a clustering algorithm, without being told how to organize the data, grouped the 46,000 fingerprints by similarity. Number crunching that might have taken a computer cluster several weeks was done in a few hours on a laptop thanks to another tool, stochastic variational inference, which Paisley had earlier helped develop. When the researchers matched the clusters against average monthly water-injection volumes across The Geysers, a pattern jumped out: A high injection rate in winter, as cities send more run-off water to the area, was associated with more earthquakes and one type of signal. A low summertime injection rate corresponded to fewer earthquakes, and a separate signal, with transitional signals in spring and fall. The researchers plan to next apply these methods to recordings of other naturally occurring earthquakes as well as those simulated in the lab to see if they can link signal types with different faulting processes. Another study published last year in Geophysical Research Letters suggests they are on a promising track. A team led by Los Alamos researcher Paul Johnson showed that machine learning tools could pick out a subtle acoustic signal in data from laboratory experiments and predict when the next microscopic earthquake would occur. Though natural faults are more complex, the research suggests that machine learning could lead to insights for identifying precursors to big earthquakes. The current research was funded with a 2016 RISE grant from Columbia's Office of the Executive Vice President. It even inspired a new course, "Sonic and Visual Representation of Data," which Holtzman and Paisley taught last spring in Columbia's Music Department and developed with a Columbia Collaboratory grant: "The Search for Meaning in Big Data."
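The three-step pipeline can be approximated in a short script. The sketch below assumes the 46,000 events are already cut into 10-second waveform snippets; it stands in for the study's probabilistic topic models (fit with stochastic variational inference) using off-the-shelf non-negative matrix factorization and k-means from scikit-learn, so all sizes, names and parameters are hypothetical.

```python
# Minimal sketch of the three-step fingerprint-and-cluster pipeline.
import numpy as np
from scipy.signal import spectrogram
from sklearn.decomposition import NMF
from sklearn.cluster import KMeans

def spectral_fingerprints(waveforms, fs=100.0, n_topics=30):
    # Steps 1+2: map each snippet to a spectrogram, then describe it as a
    # mixture of common frequency patterns ("topics") shared by the dataset.
    specs = []
    for w in waveforms:
        _, _, sxx = spectrogram(w, fs=fs)
        specs.append(sxx.flatten())
    specs = np.array(specs)  # spectrogram power is non-negative, as NMF needs
    nmf = NMF(n_components=n_topics, init="nndsvd", max_iter=400)
    return nmf.fit_transform(specs)  # one row = one event's fingerprint

def cluster_fingerprints(fingerprints, n_clusters=6):
    # Step 3: group fingerprints by similarity, with no labels supplied.
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(fingerprints)

# Usage with fake data standing in for real 10-second seismic snippets:
rng = np.random.default_rng(0)
fake_waveforms = rng.normal(size=(200, 1000))  # 200 events, 10 s at 100 Hz
labels = cluster_fingerprints(spectral_fingerprints(fake_waveforms))
```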
May 17, 2018 | https://www.sciencedaily.com/releases/2018/05/180517123252.htm | Repeating seismic events offer clues about Costa Rican volcanic eruptions | Repeating seismic events -- events that have the same frequency content and waveform shapes -- may offer a glimpse at the movement of magma and volcanic gases underneath Turrialba and Poas, two well-known active volcanoes in Costa Rica. | At the 2018 SSA Annual Meeting, Rebecca Salvage of the Observatorio Vulcanologico y Sismologico de Costa Rica presented an analysis of these repeating signals from the volcanoes since July 2016. When these repeating events are identified at a seismic station, researchers assume that these "events are all produced by a single mechanism and at a similar location at depth ... and by a source which is either non-destructive or able to quickly renew itself," Salvage noted. "Therefore, the identification and an understanding of repeating seismicity may allow us some insight into which parts of the volcanic system at depth are active, and the frequency content of the repeating seismicity may be indicative of processes occurring at depth." At Turrialba, for instance, Salvage and her colleagues identified a type of repeating event called "drumbeat seismicity," characterized by a very short time interval between events. In January 2017, drumbeat seismicity at the volcano lasted less than three hours but contained hundreds of events. Eight hours later, there was a small eruption at Turrialba. In this case, the drumbeat seismicity may have been a "precursor signal" of the eruption, related to magma moving toward the surface, Salvage said. "However, not all eruptions are preceded by these types of earthquakes, and often these earthquakes occur with no identifiable eruptive activity," she added. "A better understanding of drumbeats in terms of the conditions under which they do occur, and statistical analysis on inter-event times and occurrence rates will allow us to better assess whether these can actually be used as a warning tool." At Poas, the researchers noted an interesting halt in six families of repeating seismic events, just two hours after a swarm of magnitude 2.7 and higher earthquakes was recorded very near the volcano. In this case, Salvage and her colleagues think that the earthquakes may have influenced the stress field around the volcano in a way that halted the repeating events. The stress field may have changed when the earthquakes generated small displacements on local faults that created similar small diversions in magmatic gas and ash rising to the surface. | Earthquakes | 2,018
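A common way to group events into repeating "families" like those described above is to slide a template waveform over continuous data and keep windows whose normalized cross-correlation exceeds a similarity threshold. The sketch below shows that generic approach; it is not necessarily the procedure Salvage's team used, and the threshold and step size are illustrative.

```python
# Sketch: template matching for repeating seismic events via normalized
# cross-correlation. Threshold and step are assumed values.
import numpy as np

def normalized_xcorr(template: np.ndarray, window: np.ndarray) -> float:
    """Normalized correlation coefficient between two equal-length windows."""
    t = (template - template.mean()) / (template.std() + 1e-12)
    w = (window - window.mean()) / (window.std() + 1e-12)
    return float(np.correlate(t, w, mode="valid")[0] / len(t))

def find_repeats(continuous, template, threshold=0.9, step=10):
    """Slide the template over continuous data; return matching sample offsets."""
    n = len(template)
    hits = []
    for i in range(0, len(continuous) - n, step):
        if normalized_xcorr(template, continuous[i:i + n]) >= threshold:
            hits.append(i)
    return hits
```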
May 17, 2018 | https://www.sciencedaily.com/releases/2018/05/180517113810.htm | Evaluating active pressure management of induced earthquakes | Can altering the amount or rate of fluid injection and production in an oil and gas field or carbon storage site affect induced earthquakes in that field? A physics-based simulation presented at the 2018 SSA Annual Meeting suggests that this type of "active pressure management" can be useful in controlling induced seismicity at certain wells. | The experiments conducted by Lawrence Livermore National Laboratory researcher Kayla Kroll and her colleagues were prompted by a recent spike in induced earthquake activity related to oil and gas production in the U.S. and Canada. The rise in induced earthquakes has some scientists proposing changes in injection or production processes to reduce the fluid pressures that destabilize faults in these regions. In their simulations, Kroll and colleagues "found that active management was most advantageous for wells that were closest to a fault. This scenario is most successful at reducing the total number of seismic events and also the maximum magnitude of those events," Kroll said. In their simulations, a "close well" was one to four meters away from a fault. The researchers also noted fewer seismic events of lower magnitude when they reduced fluid injection rates over a constant period of time, compared to injections at a higher constant rate. Kroll and her colleagues used a 3-D computer simulator called RSQSim to create a catalog of earthquakes that might occur due to carbon dioxide injection into a saline reservoir adjacent to a fault. Through the simulator, they experimented with different injection and production scenarios, producing various fractions of the total injection volume from wells at different distances from the fault. "With a physics simulation we can manipulate all the physical parameters that we think are important in the earthquake cycle, things that actually give rise to earthquake occurrence," Kroll explained. "We can't exactly know or test the parameters in the field." "Right now all the regulations about induced seismicity, especially in areas like Oklahoma, are reactive," Kroll said. "For example, they wait for the occurrence of a large magnitude event and then they regulate injection operations in that area." Learning more about active pressure management through the simulator, she said, "could someday help us reduce the hazard of induced seismicity in real time and maybe even before an injection operation starts." The researchers do not have any site-specific pressure recommendations for well managers yet, said Kroll. She noted that scientists would need much more data on local geology, the flow parameters of fluid reservoirs, and data from dense seismic and pressure monitoring networks to create useful active management guidelines for a particular oil or carbon storage field. | Earthquakes | 2,018
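The intuition behind injection-rate management can be shown with a far simpler model than RSQSim: the classic Theis line-source solution from well hydraulics, which gives pressure buildup at a distance from a constant-rate well. The sketch below compares a constant schedule with one that ramps down halfway through the year; the transmissivity, storativity, rates and distances are all invented for illustration.

```python
# Toy stand-in for a physics-based injection simulator: Theis solution for
# pressure-head change at distance r after injecting at rate q. All
# parameter values are illustrative assumptions.
import numpy as np
from scipy.special import exp1  # Theis well function W(u) = exp1(u)

def theis_pressure_head(q_m3_per_s, r_m, t_s, T=1e-5, S=1e-4):
    """Head change (m) at distance r and time t for constant injection rate q.

    T: transmissivity (m^2/s), S: storativity -- assumed aquifer properties.
    """
    u = (r_m**2 * S) / (4.0 * T * t_s)
    return (q_m3_per_s / (4.0 * np.pi * T)) * exp1(u)

one_year = 365.25 * 24 * 3600.0
for r in (2.0, 500.0):  # a "close" well (meters from the fault) vs a distant one
    constant = theis_pressure_head(1e-3, r, one_year)
    # Ramp-down approximated by superposing a negative source that switches
    # on halfway through the year (rate drops by 50%).
    ramped = constant - theis_pressure_head(0.5e-3, r, 0.5 * one_year)
    print(f"r={r:6.1f} m  constant: {constant:8.1f} m  ramped-down: {ramped:8.1f} m")
```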
May 17, 2018 | https://www.sciencedaily.com/releases/2018/05/180517113315.htm | Continental shelf shape leads to long-lasting tsunami edge waves during Mexican earthquake | The shape of the continental shelf off the southern Mexican coast played a role in the formation of long-lasting tsunami edge waves that appeared after last September's magnitude 8.2 earthquake, according to researchers speaking at the SSA 2018 Annual Meeting. | Edge waves are coastal waves generated by a larger tsunami wave. They travel back and forth parallel to a shoreline. They can be an important part of overall tsunami hazard, depending on how big the edge waves are and how long they last, said University of Oregon researcher Diego Melgar. "They make a bad problem worse," he said. "When a tsunami happens, you get one big wave because of the earthquake, but then if this edge wave problem is present, you're going to get large waves that follow it ... it's like sloshing in a kiddie pool." During the September Tehuantepec earthquake, said Melgar, high amplitude edge waves lasted for an unusually long time, around 48 hours. "These edge waves have been seen with pretty much every tsunami, but they're usually not very pronounced, they're usually small players," he explained. "We were surprised to see that these lasted two days." To understand this phenomenon, Melgar and his colleagues modeled the tsunami's effects in relation to the shape of the region's continental shelf, which had been mapped out previously using satellite data and sonar soundings from ships. Their models show that because the shelf is longer and flatter than other continental shelves worldwide, the shape of the shelf efficiently "trapped" the edge waves along the Mexican coast. Melgar said it's possible that similar edge wave trapping could be a tsunami hazard in other places around the world where there are large flat continental shelves, such as the U.S. Pacific Northwest, Alaska near Anchorage, and northern Japan. "We need to make a systematic survey to see whether the edge waves can be trapped efficiently in other places," he noted. The 2018 Annual Meeting, held May 14-17 in Miami, Florida, is a joint conference between the Seismological Society of America and the Latin American and Caribbean Seismological Commission (LACSC). | Earthquakes | 2,018
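The role of shelf slope can be illustrated with Eckart's shallow-water dispersion relation for edge waves on a uniformly sloping beach, omega^2 = g*k*(2n+1)*tan(beta): a flatter shelf (smaller beta) gives slower, longer-period trapped modes. This is a textbook idealization, not Melgar's model, and the slope and wavelength values below are illustrative.

```python
# Sketch: edge-wave periods from Eckart's dispersion relation for a plane
# sloping beach. Slopes and the alongshore wavelength are assumed values.
import numpy as np

g = 9.81  # gravitational acceleration, m/s^2

def edge_wave_period(wavelength_m: float, slope_beta_rad: float, mode: int = 0) -> float:
    """Period (s) of edge-wave mode n for a given alongshore wavelength."""
    k = 2.0 * np.pi / wavelength_m
    omega = np.sqrt(g * k * (2 * mode + 1) * np.tan(slope_beta_rad))
    return 2.0 * np.pi / omega

for beta_deg in (0.3, 1.0, 3.0):  # very flat shelf vs steeper shelves
    T = edge_wave_period(wavelength_m=50_000.0, slope_beta_rad=np.radians(beta_deg))
    print(f"slope {beta_deg:>4} deg: fundamental-mode period ~ {T/60:5.1f} minutes")
```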
May 16, 2018 | https://www.sciencedaily.com/releases/2018/05/180516102302.htm | How large can a tsunami be in the Caribbean? | The 2004 Indian Ocean tsunami has researchers reevaluating whether a magnitude 9.0 megathrust earthquake and resulting tsunami might also be a likely risk for the Caribbean region, seismologists reported at the SSA 2018 Annual Meeting. | "Before 2004, we thought an earthquake of about 8.0 was about right for the largest we might see in the Caribbean, based on the history of earthquakes there and the length and motion of the faults," said Christa von Hillebrandt-Andrade of the National Oceanic and Atmospheric Administration (NOAA). "But now some think that several faults in the region could be capable of producing earthquakes of 8.6, and the catastrophic planning by our emergency management community is considering 8.5 and 9.0 earthquakes," she noted. "It's been a long time since a big earthquake and tsunami have hit the region, but almost 3500 people have lost their lives in the past 500 years from tsunamis in the Caribbean," said von Hillebrandt-Andrade. "The vulnerability is just huge because so much of our population and infrastructure is located right along the coast." The region contains several large subduction zones and faults, most of which are located offshore and are challenging to study. One particular focus is the subduction zone associated with the Puerto Rico Trench, located north of Puerto Rico and roughly on the border of the Caribbean Sea and Atlantic Ocean. A large earthquake on the Trench could produce a tsunami that reaches Puerto Rico within 20 minutes, and might be felt as far away as the U.S. Eastern seaboard. There is no historical evidence of any megathrust 9.0 earthquakes taking place on the trench, and the fault there has "a bit of oblique motion that takes up some of the energy and doesn't create the big offset of the seafloor creating the tsunami" that would be expected from a straightforward subduction zone, von Hillebrandt-Andrade said. Another concern is a tsunami generated by submarine landslides that might occur after a more moderate-sized earthquake, she noted, adding that researchers have uncovered traces of very large landslides along the steep parts of the seafloor along the trench. Other regions that seismologists are keeping a closer eye on include the Lesser Antilles, the Dominican Republic, and the area offshore north of Panama. von Hillebrandt-Andrade said emergency management planners often work with "scenarios" -- how a magnitude 9.0 earthquake and tsunami would impact a region -- instead of focusing on the probabilities of a certain-level tsunami taking place within a specified time range.
As seismologists continue to work out the recurrence rate of earthquakes in the region, von Hillebrandt-Andrade said researchers also are beginning to look at probabilistic tsunami hazards, beginning in the American Caribbean. "It would really be helpful to have more ocean bottom seismology," she said, "where we can place seismometers on the seafloor close to these faults so that we can appreciate more of their movement, and to extrapolate [motion] from smaller earthquakes." More Caribbean paleoseismology -- identifying and analyzing the evidence of past earthquakes -- would also help researchers pinpoint the recurrence times of possible large earthquakes, von Hillebrandt-Andrade said. Since 2005, 48 countries and territories organized under the UNESCO IOC Intergovernmental Coordination Group for Tsunamis and Other Coastal Hazards for the Caribbean and Adjacent Regions have been developing a tsunami warning system. Tsunami response exercises, such as the annual CARIBE Wave, draw on seismic hazard research findings to guide their practice scenarios. "There is a big challenge in dealing with such an infrequent hazard, which can be forgotten or overlooked because people can be concerned with more immediate-type events like annual hurricane season," von Hillebrandt-Andrade said. "But tsunamis have the potential of killing so many people if we do not respond appropriately." The 2018 Annual Meeting, held May 14-17 in Miami, Florida, is a joint conference between the Seismological Society of America and the Latin American and Caribbean Seismological Commission (LACSC). | Earthquakes | 2,018
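The probabilistic framing mentioned above typically combines a Gutenberg-Richter recurrence model with a Poisson occurrence assumption. The sketch below turns an assumed recurrence law into the probability of at least one event above a given magnitude in a time window; the a- and b-values are invented for illustration and are not estimates for the Puerto Rico Trench or any other Caribbean fault.

```python
# Sketch: probability of >= 1 large earthquake under a Gutenberg-Richter
# recurrence model and Poisson occurrence. a and b are placeholder values.
import math

def prob_at_least_one(magnitude, years, a=4.5, b=1.0):
    """P(at least one event of M >= magnitude within `years`).

    log10(N) = a - b*M gives the annual rate N of events at or above M.
    """
    annual_rate = 10.0 ** (a - b * magnitude)
    return 1.0 - math.exp(-annual_rate * years)

for m in (8.0, 8.5, 9.0):
    print(f"M>={m}: {100 * prob_at_least_one(m, years=50):5.2f}% in 50 years")
```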
May 15, 2018 | https://www.sciencedaily.com/releases/2018/05/180515105622.htm | Eyewitness accounts fill in details of 1946 Dominican Republic tsunami | Almost 70 years later, the man remembers the August day in Playa Rincon, when he clung to the top of an almond tree to survive a tsunami whose waters rushed about 700 meters inland after a magnitude 8.1 earthquake. | His recollections and other astonishing eyewitness accounts of the tsunami that struck the Dominican Republic in 1946 are being used to reconstruct the tsunami's heights and inundation distances, said Georgia Tech researcher Hermann Fritz at the 2018 SSA Annual Meeting. Fritz and his colleagues carried out the eyewitness surveys in 2014 and 2016, hoping to learn more about one of the strongest earthquakes ever reported in the Caribbean. The 1946 tsunami was detected by tide gauges as far away as Atlantic City, New Jersey. The 1946 quake suggests that earthquakes along the region's Hispaniola and Puerto Rico trench subduction zones "pose a significant tsunami hazard not just for the islands themselves, but these events are also relevant for the seaboard of the eastern United States," Fritz said. The 2010 magnitude 7.0 earthquake in Haiti prompted Fritz and others to look more closely at the Dominican Republic's tsunami potential. Although the Haitian quake produced only a one-two meter tsunami at the Haiti-Dominican Republic border, "there was a complete lack of tsunami preparedness and awareness," Fritz said. The eyewitness survey was inspired by a Dominican meteorologist at the Oficina Nacional de Meteorologia, who told Fritz that his grandfather recalled "palm trees bouncing from one side to the other side" during the 1946 earthquake and tsunami. The eyewitness surveys covered about 300 kilometers of observations along the Dominican Republic's north coast, allowing the researchers to make 29 runup and tsunami height measurements at 21 locations. Locations between Cabrera and El Limon took the brunt of the waves, with tsunami heights over five meters. The tsunami flooded inland at distances of 600 meters or more at places like Las Terrenas and Playa Rincon. Although much of the coast has been changed by erosion, the researchers were able to find eyewitnesses who could remember where the water reached during the midday event. In the town of Matanzas, for instance, the survey team spoke with a woman named Patria, who was a teenager when the tsunami came ashore. "We went to the beach with her where there was a palm tree that had survived the tsunami, and she was able to show us using the tree how high the tsunami was at that location," Fritz explained. The team's reconstruction of a two-meter tsunami height and other information from their survey has even changed official accounts of the earthquake at Matanzas, Fritz said. The tsunami was thought to have washed away most of the town and killed hundreds of residents. But with the new information uncovered by the survey team, the town has changed a local plaque commemorating the event to note "sin victimas" ["no deaths"] from the tsunami. "I was surprised, interviewing these people, by how lucid some of these accounts were," Fritz said. "It almost brought the tsunami event back to life." The 2018 Annual Meeting, held May 14-17 in Miami, Florida, is a joint conference between the Seismological Society of America and the Latin American and Caribbean Seismological Commission (LACSC). | Earthquakes | 2,018
May 10, 2018 | https://www.sciencedaily.com/releases/2018/05/180510150045.htm | Radar reveals details of mountain collapse after North Korea's most recent nuclear test | As North Korea's president pledges to "denuclearize" the Korean peninsula, an international team of scientists is publishing the most detailed view yet of the site of the country's latest and largest underground nuclear test on Sept. 3, 2017. | The new picture of how the explosion altered the mountain above the detonation highlights the importance of using satellite radar imaging, called SAR (synthetic aperture radar), in addition to seismic recordings to more precisely monitor the location and yield of nuclear tests in North Korea and around the world. The researchers -- Teng Wang, Qibin Shi, Shengji Wei and Sylvain Barbot from Nanyang Technological University in Singapore, Douglas Dreger and Roland Bürgmann from the University of California, Berkeley, Mehdi Nikkhoo from the German Research Centre for Geosciences in Potsdam, Mahdi Motagh from the Leibniz Universität Hannover, and Qi-Fu Chen from the Chinese Academy of Sciences in Beijing -- will report their results online this week in advance of journal publication. That explosion took place under Mt. Mantap at the Punggye-ri nuclear test site in the country's north, rocking the area like a 5.2-magnitude earthquake. Based on seismic recordings from global and regional networks, and before-and-after radar measurements of the ground surface from Germany's TerraSAR-X and Japan's ALOS-2 radar imaging satellites, the team showed that the underground nuclear blast pushed the surface of Mt. Mantap outward by as much as 11 feet (3.5 meters) and left the mountain about 20 inches (0.5 meters) shorter. By modelling the event on a computer, they were able to pinpoint the location of the explosion, directly under the mile-high summit, and its depth, between a quarter and a third of a mile (400-600 meters) below the peak. They also located more precisely another seismic event, or aftershock, that occurred 8.5 minutes after the nuclear explosion, putting it some 2,300 feet (700 meters) south of the bomb blast. This is about halfway between the site of the nuclear detonation and an access tunnel entrance and may have been caused by the collapse of part of the tunnel or of a cavity remaining from a previous nuclear explosion. "This is the first time the complete three-dimensional surface displacements associated with an underground nuclear test were imaged and presented to the public," said lead author Teng Wang of the Earth Observatory of Singapore at Nanyang Technological University. Putting all of this together, the researchers estimate that the nuclear test, North Korea's sixth and the fifth inside Mt. Mantap, had a yield between 120 and 300 kilotons, about 10 times the strength of the bomb dropped by the United States on Hiroshima during World War II.
That makes it either a small hydrogen, or fusion, bomb or a large atomic, or fission, bomb. The new scenario differs from two reports last week, one of which has been accepted for publication in the journal Geophysical Research Letters, that pinpointed the blast nearly a kilometer to the northwest of the site identified in the new paper, and concluded that the blast rendered the entire mountain unfit for future nuclear tests. "SAR really has a unique role to play in monitoring explosions because it is direct imaging of the local ground surface, unlike seismology, where you learn the nature of the source analyzing waves radiating outward from the event at distant stations," said Dreger, a UC Berkeley professor of earth and planetary science and a member of the Berkeley Seismological Laboratory. "SAR provides some measure of ground truthing of the location of the event, a very challenging thing to get at. This is the first time anyone has actually modeled the mechanics of an underground explosion using satellite and seismic data together." "As opposed to standard optical imaging satellite imagery, SAR can be used to measure earth deformation day and night and under all weather conditions," added Dreger's colleague and co-author Roland Bürgmann, a UC Berkeley professor of earth and planetary science. "By precisely tracking the image pixel offsets in multiple directions, we were able to measure the full three-dimensional surface deformation of Mt. Mantap." According to Dreger, the new information suggests the following scenario: The explosion occurred more than a quarter mile (450 meters) below the summit of Mt. Mantap, vaporizing granite rock within a cavity about 160 feet (50 meters) across -- the size of a football stadium -- and damaging a volume of rock about 1,000 feet (300 meters) across. The blast likely raised the mountain six feet (2 meters) and pushed it outward up to 11 feet (3-4 meters), though within minutes, hours or days the rock above the cavity collapsed to form a depression. Eight and a half minutes after the bomb blast, a nearby underground cavity collapsed, producing the 4.5-magnitude aftershock with the characteristics of an implosion. Subsequently, a much larger volume of fractured rock, perhaps 1 mile (1-2 kilometers) across, compacted, causing the mountain to subside to about 1.5 feet (0.5 meters) lower than before the blast. "There may be continuing post-explosion compaction at the mountain. It takes time for these aseismic processes to occur," Dreger said. While it is possible to discriminate explosions from natural earthquakes using seismic waveforms, the uncertainty can be large, Dreger said. Explosions often trigger nearby earthquake faults or other natural rock movements that make the seismic signals look earthquake-like, confusing the analysis. The SAR data revealed that additional constraints from the local static displacement can help to narrow down the source. "I am hoping that by jointly analyzing the geodetic and seismic data, we will be able to improve discrimination between earthquakes and explosions, and certainly help with estimating the yield of an explosion and improving our estimation of source depth," Dreger said. "This study demonstrates the capability of spaceborne remote sensing to help characterize large underground nuclear tests, if any, in the future," Wang said.
"While surveillance of clandestine nuclear tests relies on a global seismic network, the potential of spaceborne monitoring has been underexploited."The work was supported by the Singapore Ministry of Education and the National Research Foundation of Singapore, as well as the U.S. Air Force Research Laboratory. | Earthquakes | 2,018 |
May 9, 2018 | https://www.sciencedaily.com/releases/2018/05/180509105004.htm | 500-year-old Leaning Tower of Pisa mystery unveiled by engineers | Why has the Leaning Tower of Pisa survived the strong earthquakes that have hit the region since the middle ages? This is a long-standing question a research group of 16 engineers has investigated, including a leading expert in earthquake engineering and soil-structure interaction from the University of Bristol. | Professor George Mylonakis, from Bristol's Department of Civil Engineering, was invited to join a 16-member research team, led by Professor Camillo Nuti at Roma Tre University, to explore this Leaning Tower of Pisa mystery that has puzzled engineers for many years. Despite leaning precariously at a five-degree angle, leading to an offset at the top of over five metres, the 58-metre tall Tower has managed to survive, undamaged, at least four strong earthquakes that have hit the region since 1280. Given the vulnerability of the structure, which barely manages to stand vertically, it was expected to sustain serious damage or even collapse because of moderate seismic activity. Surprisingly, this hasn't happened, and until now the reason has mystified engineers. After studying available seismological, geotechnical and structural information, the research team concluded that the survival of the Tower can be attributed to a phenomenon known as dynamic soil-structure interaction (DSSI). The considerable height and stiffness of the Tower, combined with the softness of the foundation soil, cause the vibrational characteristics of the structure to be modified substantially, in such a way that the Tower does not resonate with earthquake ground motion. This has been the key to its survival. The unique combination of these characteristics gives the Tower of Pisa the world record in DSSI effects. Professor Mylonakis, Chair in Geotechnics and Soil-Structure Interaction, and Head of the Earthquake and Geotechnical Engineering Research Group in the Department of Civil Engineering at the University of Bristol, said: "Ironically, the very same soil that caused the leaning instability and brought the Tower to the verge of collapse can be credited for helping it survive these seismic events." Results from the study have been presented to international workshops and will be formally announced at the 16th European Conference in Earthquake Engineering taking place in Thessaloniki, Greece next month [18 to 21 June 2018]. | Earthquakes | 2,018
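The period-lengthening effect at the heart of DSSI can be sketched with the classic Veletsos-type estimate: a compliant foundation adds sway and rocking flexibility, so the flexible-base period is longer than the fixed-base one, shifting the structure away from the frequencies where shaking is strongest. All numbers below are rough illustrations, not the Tower's actual properties.

```python
# Sketch of the standard soil-structure interaction period-lengthening
# formula: T_ssi = T * sqrt(1 + k/k_sway + k*h^2/k_rock). Values assumed.
import math

def flexible_base_period(T_fixed, mass, height, k_sway, k_rock):
    """Fundamental period of a structure on compliant soil.

    T_fixed: fixed-base period (s); mass (kg); height of the effective mass
    (m); k_sway: foundation horizontal stiffness (N/m); k_rock: foundation
    rocking stiffness (N*m/rad).
    """
    k_struct = 4.0 * math.pi**2 * mass / T_fixed**2
    factor = math.sqrt(1.0 + k_struct / k_sway + k_struct * height**2 / k_rock)
    return T_fixed * factor

# Illustrative values: a stiff masonry tower on very soft soil.
T_ssi = flexible_base_period(T_fixed=0.3, mass=1.45e7, height=22.0,
                             k_sway=5e8, k_rock=2e11)
print(f"fixed-base 0.30 s -> flexible-base {T_ssi:.2f} s")
```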
May 7, 2018 | https://www.sciencedaily.com/releases/2018/05/180507153141.htm | 'Snowball Earth' resulted from plate tectonics | About 700 million years ago, the Earth experienced unusual episodes of global cooling that geologists refer to as "Snowball Earth." | Several theories have been proposed to explain what triggered this dramatic cool down, which occurred during a geological era called the Neoproterozoic. Now two geologists at The University of Texas at Dallas and UT Austin suggest that those major climate changes can be linked to one thing: the advent of plate tectonics. The research was published online in December 2017 and in the April print edition. Plate tectonics is a theory formulated in the late 1960s that states the Earth's crust and upper mantle -- a layer called the lithosphere -- is broken into moving pieces, or plates. These plates move very slowly -- about as fast as your fingernails and hair grow -- causing earthquakes, mountain ranges and volcanoes. "Earth is the only body in our solar system known to currently have plate tectonics, where the lithosphere is fragmented like puzzle pieces that move independently," said Dr. Robert Stern, professor of geosciences in UT Dallas' School of Natural Sciences and Mathematics, and co-author of the study, along with Dr. Nathaniel Miller, a research scientist in UT Austin's Jackson School of Geosciences who earned his PhD in geosciences from UT Dallas in 1995. "It is much more common for planets to have an outer solid shell that is not fragmented, which is known as 'single lid tectonics'," Stern said. Geoscientists disagree about when the Earth changed from single lid to plate tectonics, with the plate fragmenting from one plate to two plates and so on to the present global system of seven major and many smaller plates. But Stern highlights geological and theoretical evidence that plate tectonics began between 800 million and 600 million years ago, and has published several articles arguing for this timing. In the new study, Stern and Miller provide new insights by suggesting that the onset of plate tectonics likely initiated the changes on Earth's surface that led to Snowball Earth. They argue that plate tectonics is the event that can explain 22 theories that other scientists have advanced as triggers of the Neoproterozoic Snowball Earth. "We went through the literature and examined all the mechanisms that have been put forward for Snowball Earth," Stern said. "The start of plate tectonics could be responsible for each of these explanations." The onset of plate tectonics should have disturbed the oceans and the atmosphere by redistributing continents, increasing explosive arc volcanism and stimulating mantle plumes, Stern said. "The fact that strong climate and oceanographic effects are observed in the Neoproterozoic time is a powerful supporting argument that this is indeed the time of the transition from single lid to plate tectonics," Stern said. "It's an argument that, to our knowledge, hasn't yet been considered." "In the present day, climate is in the news because we're changing it by putting more carbon dioxide into the atmosphere," Stern said. "But imagine a time when Earth didn't have plate tectonics, and it then evolved to have plate tectonics -- that would have been a major shift in the Earth's operating system, and it would have had a huge effect on climate, too." | Earthquakes | 2,018
May 7, 2018 | https://www.sciencedaily.com/releases/2018/05/180507111858.htm | Could seismology equipment help to protect elephants from poachers? | Using tools developed to monitor earthquakes, an interdisciplinary team of researchers has shown that elephant behaviors such as walking and calling can be detected and classified from the vibrations they generate in the ground. | "We were surprised by the size of the forces acting on the ground that were generated by elephants when they vocalize," says Beth Mortimer of the Universities of Oxford and Bristol, UK. "We found that the forces generated through elephant calls were comparable to the forces generated by a fast elephant walk. This means that elephant calls can travel significant distances through the ground and, in favorable conditions, further than the distance that calls travel through the air." Mortimer is generally interested in animals that use vibrations through materials for information. Her earlier studies focused on spiders and their webs. In the new study, she teamed up with geophysicist colleagues including Tarje Nissen-Meyer at the University of Oxford, UK, to look to animals at the other end of the size spectrum. There had been hints that elephants might use ground vibrations to communicate, but their transmission through the ground wasn't well understood. To explore this in the new study, Mortimer and Will Rees, a Masters student in Nissen-Meyer's seismology lab, recorded vibrations generated by wild elephants in Kenya while they displayed different behaviors, including walking and calling. They relied on techniques more commonly used to investigate earth processes, such as earthquakes. Their goal was to learn how far elephant-generated vibrations travel and how they are affected by human-generated noise, terrain type, and other factors. The researchers found that by recording the vibrations that elephants generate through the ground, they could classify particular elephant behaviors. Computer models showed they could detect and classify particular behaviors over kilometers. However, their abilities were influenced by physical factors including noise and terrain type. The findings suggest that human-generated noise may interfere with elephants' ability to communicate through vibrations. They may also have practical implications for helping to keep elephants safe. "We suggest that monitoring ground-based vibrations can be used in a practical context to not only detect elephants, but determine their behaviors," Mortimer says. "Using multiple seismic recorders in remote locations, we suggest that detection, location, and classification algorithms can be generated that allow monitoring of elephants in real-time." While more experimental data are needed to confirm the potential over long distances, the findings could lead to the development of seismic monitoring systems to listen in for signs of elephants in trouble. "We hope to build on these initial findings to develop a comprehensive approach for monitoring and understanding the behavior of large mammals in these pristine, changing, and fragile environments," Nissen-Meyer says. He says they now plan to deploy a larger, long-term network of seismic sensors along with aerial, visual, and acoustic instruments to examine elephants' reactions when researchers echo their own recorded signals back to them. | Earthquakes | 2,018
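The detection step that such a monitoring network would need is routinely handled in seismology with a short-term/long-term average (STA/LTA) trigger. The sketch below applies that standard detector to a single geophone channel to flag candidate transients, such as footfalls; the window lengths and trigger ratio are illustrative assumptions, not values from the study.

```python
# Sketch: STA/LTA event trigger on one seismic channel. Window lengths and
# the on-ratio are assumed tuning values.
import numpy as np

def sta_lta_triggers(signal, fs, sta_s=0.5, lta_s=10.0, ratio_on=4.0):
    """Return sample indices where the STA/LTA ratio first exceeds ratio_on."""
    energy = signal.astype(float) ** 2
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same") + 1e-20
    above = (sta / lta) > ratio_on
    return np.flatnonzero(above & ~np.roll(above, 1))  # rising edges only

# Fake record: noise with one strong transient standing in for a footfall.
fs = 200
trace = np.random.default_rng(1).normal(0, 1, 60 * fs)
trace[30 * fs: 30 * fs + 100] += 20.0
print(sta_lta_triggers(trace, fs) / fs)  # trigger times in seconds (~30 s)
```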
May 3, 2018 | https://www.sciencedaily.com/releases/2018/05/180503085601.htm | Earthquake aftermath: Life-threatening blood clots in legs and lungs from sitting in cars for extended periods | Japanese physicians highlight the risks and clinical significance for individuals who remain seated and immobile in vehicles for prolonged periods. They call for preventive awareness activities and education about the risk of venous thromboembolisms (VTE) in a Letter to the Editor. | Earlier reports have identified a sharp increase in sudden cardiac death following natural disasters like earthquakes, but less is known about the risk of other secondary health damage such as acute cerebral and cardiovascular diseases, and in particular, the risk of VTE as a result of being confined in a car for a long time. Following the Kumamoto earthquake in April 2016, there was a high number of night aftershocks. Because many people were afraid to return to their homes, they chose to evacuate. Although some individuals reached a public evacuation shelter, many others were forced to stay in their vehicles overnight. In order to assess the impact of remaining seated in cars for extended periods of time, the Kumamoto Earthquake Thrombosis and Embolism Protection (KEEP) project investigators gathered data from the aftermath of the Kumamoto earthquakes. They found an "epidemic" of blood clots developing in the legs, and in numerous cases going to the lungs, in many of the people forced to evacuate. Analysis of questionnaires from 21 local medical institutions established that 51 patients were hospitalized following the earthquakes due to VTE. Of these, 42 patients (82.4 percent) had spent the night in a vehicle. VTE was complicated by pulmonary thromboembolism (PTE) in 35 cases. "Preventive awareness activities by professional medical teams, supported by education in the media about the risk of VTEs after spending the night in a vehicle, and raising awareness of evacuation centers, could lead to a reduced number of victims of VTE," noted lead investigator Seiji Hokimoto, MD, PhD, from the Department of Cardiovascular Medicine, Graduate School of Medical Sciences, Kumamoto University, Kumamoto, Japan. "This is a dramatic example of the risks inherent in spending prolonged periods immobilized in a cramped position," commented Stanley Nattel, MD, the journal's Editor-in-Chief. | Earthquakes | 2,018
May 2, 2018 | https://www.sciencedaily.com/releases/2018/05/180501130848.htm | Historical records help uncover new mechanism in deadly 1906 Taiwan quake | Researchers reexamining historical seismograms from the 1906 Meishan earthquake have uncovered a new mechanism for the quake, one of the deadliest to ever strike Taiwan. | The new mechanism provides a better fit for the fault rupture expected from the magnitude 7.1 earthquake, as well as a better fit for the distribution of recorded damage and aftershocks, according to the report. The findings will help seismologists improve their understanding of the complex fault systems in this area, "which is important to provide better information for seismic hazard assessment," said Kuo-Fong Ma of the Earthquake Disaster & Risk Evaluation and Management (E-DREaM) Center in Taiwan. The result might also encourage others to use historical records in tectonically active countries to "explore the full fault system rather than a single fault segment for seismic hazard evaluation," she added. The 1906 Meishan earthquake caused 1258 deaths, 2385 injuries and leveled 6769 houses near its namesake village. Ground shaking has been estimated at a Mercalli intensity of IX or "violent," meaning that there would have been significant damage to large buildings, including buildings shifted off their foundations. Field studies conducted after the earthquake suggested that the earthquake was related to an east-west strike-slip rupture on the Meishan Fault. However, there has been a longstanding debate about whether this rupture represents the earthquake's true origins. The surface rupture was short compared to what might be expected for such an intense quake, for instance, and reports of damage and aftershocks were confined in a north-south pattern, instead of an east-west pattern. With these inconsistencies in mind, Ma and her colleagues decided to reexamine the historical records of the earthquake. The researchers used the original seismogram recordings from three seismic stations in Taiwan, which were archived at the Earthquake Research Institute of Tokyo University, along with historical literature collected with the help of Taiwan's Central Weather Bureau. The researchers used these original records to create new artificial waveform simulations for the earthquake, to evaluate several models of how the fault might have ruptured. They created artificial waveforms from the original seismograms because of the difficulties in accounting for unknown instrument responses and in digitizing the historical records, Ma explained. The simulated waveform data uncovered a discrepancy between the first motions of P-waves and S-waves (the first and second waves detected by a seismograph in the event of an earthquake) that suggested the earthquake's mechanism might not have been a pure strike-slip rupture, the researchers said. The scientists then hunted for an alternative fault motion in the region that could better explain the data. They concluded that the preferred mechanism would be a thrust fault oriented in a northeast-southwest direction, with a small right-lateral component. Rupture along this type of thrust fault is more consistent with the level of shaking intensity and the pattern of aftershocks seen after the 1906 earthquake, Ma and colleagues said. | Earthquakes | 2,018
May 1, 2018 | https://www.sciencedaily.com/releases/2018/05/180501085147.htm | Small earthquakes caused by migrating gasses in the underground | The metropolitan area of Istanbul with around 15 million inhabitants is considered to be particularly earthquake-prone. In order to be able to assess the risk correctly, researchers must decipher the processes underground. Now further progress has been made by an international team, to which Marco Bohnhoff from the GFZ German Research Center for Geosciences belongs. Below the Marmara Sea, they detected earthquakes that were not directly caused by tectonic stresses but by rising natural gas. | The team led by Louis Geli of the French Research Center Ifremer analyzed seismic data recorded after an earthquake in the western part of the Marmara Sea on 25 July 2011 with a magnitude of 5.1. As expected, several aftershocks occurred in the following days and weeks, but they were less severe. "A stronger earthquake changes the stress in the underground. This results in further shocks -- so-called aftershocks -- in which the stress changes are then compensated again," explains Bohnhoff. This happened in the summer of 2011 below the Marmara Sea near Istanbul. It was striking, however, that only a few aftershocks occurred in the crystalline basement where the main earthquake had its origin. "Instead, we recorded a lot of tremors at very shallow depths below the seafloor," says Bohnhoff, who was involved in the localization and analysis of the shallow quakes. "This was quite surprising, because these layers consist of soft sediment that typically deforms aseismically under tectonic stress and does not make abrupt movements typical for earthquakes." In fact, there is another underlying mechanism, as the authors explain: the M5.1 earthquake disturbed the stress field like the striking of a bell, so that a natural gas reservoir in close proximity to the tectonic disturbance came under increased pressure. As a result, gas escaped and moved upwards, where it triggered weaker earthquakes. "Different processes come into question. Small shear fractures may have been activated, or the outgassing may have caused oscillations of water-filled cavities, a process also known from volcanoes or gas leaks." The exact processes taking place below the bottom of the Marmara Sea cannot be resolved from the available data, says the geophysicist. This requires seismometers installed even closer to the source location, for example in boreholes. Until now, these were missing. Bohnhoff and his colleagues from GFZ and other international partner institutes have set up such downhole instrumentation further east in the greater Istanbul area as part of the GONAF observatory (Geophysical Observatory at the North Anatolian Fault). They are designed to detect the ongoing deformation of the tectonic plates, tensions in the Earth's crust and vibrations very accurately and thus ultimately allow for a more realistic risk analysis for the upcoming strong earthquake at the gates of the mega-city. There, the probability of a magnitude 7 or larger earthquake by the year 2040 is 35 to 70 percent. "The seismic hazard and risk for the metropolitan region of Istanbul does not necessarily change as a result of the new findings. But they must be included in various earthquake scenarios to make them more realistic," says Bohnhoff. "In this way, we also highlight an aspect hitherto completely ignored by the public, i.e.
that the spatial proximity of the North Anatolian Fault Zone and the gas deposit poses an additional hazard potential." Because of the deposit, large gas tanks are located a short distance away on land. In the case of a strong earthquake, there is an increased risk of explosion or gas leaks. Bohnhoff: "Such hazards increase the risk to the population of being harmed as a result of an earthquake." | Earthquakes | 2,018
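The physics of pressure-triggered slip invoked here is usually expressed through the Coulomb failure criterion with pore pressure, CFS = tau - mu * (sigma_n - P): rising fluid pressure P lowers the effective normal stress clamping a fault, so migrating gas can push small faults toward failure without extra tectonic loading. The sketch below shows the criterion with invented stress values, not data from the Marmara study.

```python
# Sketch: Coulomb failure stress with pore pressure. All MPa values are
# illustrative assumptions.
def coulomb_failure_stress(shear_mpa, normal_mpa, pore_mpa, mu=0.6, cohesion_mpa=0.0):
    """Positive values indicate the fault patch is at or beyond failure."""
    return shear_mpa - cohesion_mpa - mu * (normal_mpa - pore_mpa)

# A shallow fault patch: stable at background pressure, critical after a
# hypothetical 6 MPa pressure rise from upward gas migration.
for p in (10.0, 16.0):
    cfs = coulomb_failure_stress(shear_mpa=8.0, normal_mpa=25.0, pore_mpa=p)
    print(f"P={p:4.1f} MPa -> CFS = {cfs:+.1f} MPa")
```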
April 30, 2018 | https://www.sciencedaily.com/releases/2018/04/180430131831.htm | Ample warning of supervolcano eruptions likely, experts say | Concern over potentially imminent eruptions of Earth's supervolcanoes, like Taupo in New Zealand or Yellowstone in the United States, may be quelled by the results of a new study suggesting that geological signs pointing to a catastrophic eruption would be clear far in advance. | To help forecast supervolcano eruptions, the study led by the University of Illinois has quantified the often-overlooked effects of tectonic stress on the rocks that house these sleeping giants, and suggests that people need not be quick to panic -- at least not yet. In the study, researchers set out to investigate regional-scale tectonic stress and unexpectedly found that their models could help forecast supervolcano eruption timing and inform experts on what to expect, geologically, well before an eruption. "Traditionally, it is thought that eruptions occur when the pressure caused by hot magma overtakes the strength of a volcano's roof rock," said geology professor Patricia Gregg. "But supervolcanoes tend to occur in areas of significant tectonic stress, where plates are moving toward, past or away from each other. That plate motion will affect model calculations." Gregg, graduate student Haley Cabaniss and Pomona College geology professor Eric Grosfils published their findings. The team created a model based on the Taupo Volcanic Zone in northern New Zealand. They chose this system because of its relatively uncomplicated extensional tectonic setting -- the type of area often associated with supervolcanoes. However, their models found that any tectonic stress would have a profound effect on the stability of supervolcanoes. "It does not matter if it is extensional, compressional or shear stress," Cabaniss said. "Any tectonic stress will help destabilize rock and trigger eruptions, just on slightly different timescales. The remarkable thing we found is that the timing seems to depend not only on tectonic stress, but also on whether magma is being actively supplied to the volcano." Using their model, the team looked at scenarios with different amounts of stress, tectonic plate movement and magma supply. They found that in any given tectonic setting, the magma reservoirs inside of supervolcanoes appear to remain stable for hundreds to thousands of years while new magma is being actively supplied to the system. "We were initially surprised by this very short timeframe of hundreds to thousands of years," Gregg said. "But it is important to realize that supervolcanoes can lie dormant for a very long time, sometimes a million years or more. In other words, they may remain stable, doing almost nothing for 999,000 years, then start a period of rejuvenation leading to a large-scale eruption." Of course, panic sets in whenever Yellowstone or Taupo experience any change in seismic or geyser activity, but this research suggests that the precursors to catastrophic eruption will be far greater and longer-lasting than anything yet documented, the researchers said. "When new magma starts to rejuvenate a supervolcano system, we can expect to see massive uplift, faulting and earthquake activity," Gregg said. "Far greater than the meter-scale events we have seen in recent time. We are talking in the range of tens to hundreds of meters of uplift.
Even then, our models predict that the system would inflate for hundreds to thousands of years before we witness catastrophic eruption." "People need to keep in mind that sites like Yellowstone are very well-monitored," Cabaniss said. "It is also important to note that our research suggests that the whole rejuvenation-to-eruption process will take place over several or more human lifetimes. Our models indicate that there should be plenty of warning." | Earthquakes | 2,018
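The reported behavior can be caricatured with a toy stability check: a reservoir fails its roof once accumulated overpressure crosses a critical value, and tectonic stress of any sign lowers that threshold. This is only a schematic in the spirit of the study's findings; the supply rate, failure threshold and stress penalty below are invented numbers.

```python
# Toy model: years of steady magma supply before roof failure, with any
# tectonic stress (extension, compression or shear) reducing the threshold.
# All parameters are illustrative assumptions.
def years_to_failure(supply_mpa_per_century, critical_mpa=20.0,
                     tectonic_mpa=0.0, stress_penalty=0.5):
    """Time to roof failure in this toy model; inf if no magma is supplied."""
    effective = critical_mpa - stress_penalty * abs(tectonic_mpa)
    if supply_mpa_per_century <= 0:
        return float("inf")  # dormant system: no rejuvenation, no eruption
    return 100.0 * effective / supply_mpa_per_century

print(years_to_failure(2.0))                     # no tectonic stress: 1000 yr
print(years_to_failure(2.0, tectonic_mpa=10.0))  # stressed setting: 750 yr
print(years_to_failure(0.0))                     # dormant: stays stable
```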
April 26, 2018 | https://www.sciencedaily.com/releases/2018/04/180426180017.htm | Sub-sea rift spills secrets to seismic probe | The first study to spring from a Rice University-led 2013 international expedition to map the sea floor off the coast of Spain has revealed details about the evolution of the fault that separates the continental and oceanic plates. | The crust at the margin is remarkably thin, and that thinness made it easier to capture 3-D data for about 525 square miles of the Galicia, the first transition zone in the world so analyzed. Sophisticated seismic reflection tools towed behind a ship and on the ocean floor enabled the researchers to model the Galicia. Though the rift is buried under several hundred meters of powdered rock and invisible to optical instruments, seismic tools fire sound into the formation. The sounds that bounce back tell researchers what kind of rock lies underneath and how it's configured. Among the data are the first seismic images of what geologists call the S-reflector, a prominent detachment fault within the continent-ocean transition zone. They believe this fault accommodated slipping along the zone in a way that helped keep the crust thin. "The S-reflector, which has been studied since the '70s, is a very low-angle, normal fault, which means the slip happens due to extension," Schuba said. "What's interesting is that because it's at a low angle, it shouldn't be able to slip. But it did." "One mechanism people have postulated is called the rolling hinge," she said. "The assumption is that an initially steep fault slipped over millions of years. Because the continental crust there is so thin, the material underneath it is hot and domed up in the middle. The initially steep fault started rolling and became almost horizontal." "So with the help of the doming of the material coming from below and also the continuous slip, that's how it is likely to have happened," Schuba said. The large data set also provided clues about interactions between the detachment fault and the serpentinized mantle, the dome of softer rock that presses upward on the fault and lowers friction during slippage. The researchers believe that led the Galicia to evolve differently, weakening faults and allowing for longer durations of activity. The research is relevant to geologists who study land as well as sea because detachment faults are common above the water, Schuba said. "One of my advisers, adjunct faculty member Gary Gray, is jazzed about this because he says you can see these faults in Death Valley and Northern California, but you can't ever see them fully because the faults keep going underground. You can't see how deep they go or how the fault zones change or how they're associated with other faults." "But a 3-D dataset is like having an MRI," she said. "We can bisect it any way we want. It makes me happy that this was the first paper to come out of the Galicia data and the fact that we can see things no one else could see before." | Earthquakes | 2,018
April 18, 2018 | https://www.sciencedaily.com/releases/2018/04/180418141432.htm | Portable device to sniff out trapped humans | The first step after buildings collapse from an earthquake, bombing or other disaster is to rescue people who could be trapped in the rubble. But finding entrapped humans among the ruins can be challenging. Scientists now report the development of an inexpensive, portable sensor array that can detect the chemical signature of trapped humans. | In the hours following a destruction-causing event, the survival rate of people stuck in the rubble rapidly drops, so it's critical to get in there fast. Current approaches include the use of human-sniffing dogs and acoustic probes that can detect cries for help. But these methods have drawbacks, such as the limited availability of canines and the silence of unconscious victims. Devices that detect a human chemical signature, which includes molecules that are exhaled or that waft off the skin, are promising. But so far, these devices are too bulky and expensive for wide implementation, and they can miss signals that are present at low concentrations. So, Sotiris E. Pratsinis and colleagues wanted to develop an affordable, compact sensor array to detect even the faintest signs of life. The researchers built their palm-sized sensor array from three existing gas sensors, each tailored to detect a specific chemical emitted by breath or skin: acetone, ammonia or isoprene. They also included two commercially available sensors for detecting humidity and CO2. | Earthquakes | 2,018
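One way such a five-channel array could be turned into a "human present" decision is simple threshold fusion across the channels. The sketch below is a guess at that logic, not the device's actual firmware or calibration: the channel names mirror the article, and every threshold is a placeholder that real lab data would have to set.

```python
# Sketch: fusing three metabolic-gas channels with humidity and CO2 rises
# into a trapped-human flag. Thresholds are invented placeholders.
def human_signature(readings_ppb, humidity_rise, co2_rise_ppm,
                    gas_thresholds_ppb=None):
    """Flag a plausible trapped-human signature from combined sensor rises.

    readings_ppb: dict with 'acetone', 'ammonia', 'isoprene' concentration
    rises above background, in parts per billion.
    """
    thresholds = gas_thresholds_ppb or {"acetone": 2, "ammonia": 10, "isoprene": 2}
    gas_hits = sum(readings_ppb[g] >= t for g, t in thresholds.items())
    # Require at least two metabolic gases plus corroborating breath signals.
    return gas_hits >= 2 and humidity_rise > 0.5 and co2_rise_ppm > 200

print(human_signature({"acetone": 3, "ammonia": 12, "isoprene": 1},
                      humidity_rise=1.2, co2_rise_ppm=350))  # -> True
```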
April 17, 2018 | https://www.sciencedaily.com/releases/2018/04/180417115641.htm | Can your dog predict an earthquake? Evidence is shaky, say researchers | For centuries people have claimed that strange behavior by their cats, dogs and even cows can predict an imminent earthquake, but the first rigorous analysis of the phenomenon concludes that there is no strong evidence behind the claim. | Heiko Woith and colleagues at the GFZ German Research Centre for Geosciences say scientists must determine whether the link between the animal behavior and the earthquake is based on clearly defined rules (such as the animal's distance from earthquakes of a certain magnitude), whether the animal behavior has ever been observed and not followed by an earthquake, whether there is a statistical testing hypothesis in place to examine the evidence, and whether the animal population is healthy, among other questions. These questions are rarely asked, making it difficult to systematically analyze the evidence for animal prediction, the researchers concluded after studying 729 reports of abnormal animal behavior related to 160 earthquakes. "Many review papers on the potential of animals as earthquake precursors exist, but to the best of our knowledge, this is the first time that a statistical approach was used to evaluate the data," said Woith. The researchers collected reports on potential earthquake predictions across a variety of animals, from elephants to silkworms. Most reports were anecdotes rather than experimental studies, and the majority of the reports came from three events: the 2010 Darfield earthquake in New Zealand, the 1984 Nagano-ken Seibu earthquake in Japan, and the 2009 L'Aquila earthquake in Italy. The unusual behaviors occurred anywhere from seconds to months prior to the earthquakes, and at distances from a few to hundreds of kilometers from the earthquake origins. Only 14 of the reports record a series of observations of the animals over time -- most reports are single observations. These weaknesses in the data make it difficult to confirm that these behaviors are predictive -- meaning they signal an earthquake event before the event begins -- rather than random occurrences or behaviors linked to the initial stages of an earthquake, such as foreshocks. Foreshocks and abnormal animal behavior strongly cluster together in the statistical analysis by Woith and colleagues, suggesting that at least some of the behaviors may be related to physical phenomena from a seismic event already underway. "The animals may sense seismic waves -- it could be P, S or surface waves -- generated by foreshocks," Woith suggested. "Another option could be secondary effects triggered by the foreshocks, like changes in groundwater or release of gases from the ground which might be sensed by the animals." One of the biggest problems with the animal data, Woith says, is the lack of continuous, long-term observations of animals experiencing earthquakes.
"Up to now, only very few time series with animal behavior exist at all, the longest being just one year."Without a long record, Woith said, researchers cannot be sure that their observations relate to an earthquake and not some other kind of environmental change or long-term fluctuation in the health of an animal population or its predators.For instance, one study analyzed by Woith and colleagues found that toads were behaving "abnormally" for half of the total observation time recorded in the study -- both before and after the earthquake.Future studies should include a stricter, quantitative definition of just what constitutes "unusual or abnormal behavior" on the part of the animals, as well as a physical explanation for the change in behavior, the researchers note. | Earthquakes | 2,018 |
April 12, 2018 | https://www.sciencedaily.com/releases/2018/04/180412102945.htm | Tsunamis could cause beach tourism to lose hundreds of millions of dollars every year | Going to the beach this summer? European tourists are more frequently going to places all over the world with significant tsunami risk, researchers have found. A global tourism destination risk index for tsunamis was released today at the 2018 Annual Conference of the European Geosciences Union (EGU) in Vienna, based on a study led by Andreas Schaefer of Karlsruhe Institute of Technology (KIT). This study examined all prominent tourism destinations globally with regard to the potential tourism loss impact for businesses given the loss of beaches post-tsunami. | Andreas Schaefer, an engineering geophysicist at the Geophysical Institute at KIT, presented the team's findings showing that the equivalent of over 250 million USD (ca. 200 million €) is lost annually to beach economies around the world. Based on the simulation model "TsuPy," the team examined over 24,000 beaches and their contributions to over 10,000 tourism destinations globally to rank the risk of each destination in terms of their beach-related business value. "In absolute terms, Hawaii is by far the highest risk area on the globe for tourism risk to tsunamis, as it can be affected by many possible tsunami sources from Japan, Alaska, South America and other regions," said Schaefer, "most of the loss would be monetary, however, due to significant investment in warnings." The famous beach economy in Hawaii would have significant issues through the loss of infrastructure, erosion and other effects. The last major tsunami there occurred from the 1960 Chile earthquake with over 60 fatalities and around $500 million damage in today's terms. There are, however, many other locations globally where a devastating tsunami can cause damaging waves within minutes at beach resorts and towns, Schaefer said. Following the tsunami across the Indian Ocean in 2004, 228,000 people were killed, two thousand Europeans among them, and over $10 billion damage was caused. In 2011, in the Tohoku tsunami in Japan, despite warnings and seawalls, around 22,000 people died. Using the tsunami simulation model "TsuPy," developed by Schaefer, the KIT scientists "hit" 24,000 beaches with thousands of potential tsunamis. In this way, many tsunamis that are possible, but have not actually happened, were analysed. This allowed the research team to then evaluate the impacts of all the potential tsunamis on the local economy based around each beach. Schaefer's interest was piqued by the 2011 tsunami, when he was a young engineering student; he has since been working on the simulation model. The Top 10 tourism locations in terms of possible absolute tsunami losses to beach tourism: 1. Hawaii, USA; 2. Lima, Peru; 3. Valparaiso, Chile; 4. Guerrero, Mexico; 5. Bali, Indonesia; 6. Greater Los Angeles, USA; 7. Phuket, Thailand; 8. Southwest Turkey; 9. Bio-Bio, Chile; 10. Puntarenas, Costa Rica. Over a billion US dollars is likely to be lost in the tourism sector somewhere around the world due to tsunamis every ten years, the researchers found. The beach-related business value at each tourism destination was developed from state, province and county tourism data from each country.
"It was important to get the latest and best tourism and hotel information," James Daniell, a Natural Hazards Risk Engineer who is part of the research team at KIT said, "not just international, but also domestic tourism plays a major role in the number of people at the tourism destinations -- tourism contributes over 6 trillion USD directly and indirectly to the global economy every year." The economic data for tourism, hotels and revenue was collected for over 10,000 states, provinces and counties globally in over 200 countries by the research group and finds a significant increase in the numbers of tourists heading to most vulnerable locations from Europe and abroad.The researchers also studied the places on earth with the highest economic losses per dollar of tourism-related business. The top 5 consisted of 1) Guam, 2) Galapagos Islands in Ecuador, 3) Vanuatu, 4) Tonga and 5) Valparaiso in Chile. "These locations are most likely those to suffer most should a big tsunami strike as they are mostly small island nations with a significant need for tourist dollars," Daniell said.Whether tourists are likely to go to different places in the future is difficult to say, says Andreas Schäfer. "Every country is different, and it depends on the location and size of the country. In some past events, such as in the Indian Ocean, significant numbers of tourists stayed away from the entire region and prices decreased, due to beach loss, hotel damage and infrastructure issues.""In comparison to the significant rewards beach tourism offers globally, tsunami risk seems small, however, for those locations that are hit, the losses can be devastating," says Schaefer. In the Maldives, more than 20% of the beach resorts closed down after the 2004 Indian Ocean earthquake and tsunami. In Phang Nga and Phuket in Thailand, around two thirds and a quarter of hotels respectively were gone within six months after the disaster."Some countries, Japan among them, are employing extreme measures such as increasing sea wall heights for coastal protection. However, such measures to prevent possible fatalities in the next tsunami are not available for most other, and often poorer places in the world" says Mr. Schaefer, "The best councils, businesses and hotels near the beaches can do is being adequately prepared and work on emergency and evacuation planning to save the lives of the people going there. I hope that our risk index can provide a first step to alerting certain locations to their potential risk for tsunamis."Andreas Schaefer presented this research in the session "Global and continental scale risk assessment for natural hazards" at the 2018 Annual Conference of the European Geosciences Union (EGU) in Vienna. | Earthquakes | 2,018 |
April 9, 2018 | https://www.sciencedaily.com/releases/2018/04/180409112610.htm | Shaking up megathrust earthquakes with slow slip and fluid drainage | Megathrust earthquakes are the most powerful type of earthquake, occurring at subduction zones -- where one tectonic plate is pushed beneath another. By contrast, slow slip events (SSEs) release seismic stress at a lower rate than large earthquakes, re-occurring in cycles (across months to years). These processes can take place along the megathrust and other planes of weakness in response to loading, releasing low frequency seismic waves. Researchers at Tokyo Institute of Technology (Tokyo Tech) and Tohoku University consider the fluid drainage processes that can occur from SSEs and their impact on seismic activity. | While the drainage of fluids during megathrust earthquakes has been thought to occur when the megathrusts open up new pathways for fluid drainage through deformation, little has been understood about whether such fluid movements occur as a result of SSEs. Professor Junichi Nakajima at Tokyo Tech and Associate Professor Naoki Uchida at Tohoku University have suggested that fluid drainage resulting from slow slip may be an additional contributor to megathrust seismic activity.
The team investigated the relationship between SSEs and seismic activity while analyzing a rich dataset of seismic events around the Philippine Sea Plate. In their recent publication, they discuss how pore-fluid pressures play a role, emphasizing that areas of slow slip tend to have extremely high pore-fluid pressures and thus have a high potential to release fluids into other portions of the rock bodies. It is suggested that SSEs could cause the movement of fluid into overlying rock units (if there were enough fracture or pore space to do so), inducing weakness in these areas and triggering seismicity.
Based on this idea, the scientists speculate that if the overlying plate were impermeable (with no suitable spaces for fluid to move into), then fluid would be forced to travel through the megathrust itself (rather than through surrounding rock pores or fractures). This could, in turn, help to trigger megathrust earthquakes. Therefore, slow slip could catalyze seismic activity in megathrusts. While stress modulation is an important contributor to megathrust-induced seismic activity, fluid transfer by episodic SSEs may play a greater role than previously thought. | Earthquakes | 2,018
April 2, 2018 | https://www.sciencedaily.com/releases/2018/04/180402192654.htm | Water pressure a critical factor for mega-earthquakes | The 2016 Mw 7.6 earthquake of Southern Chile was the first large earthquake to occur within the rupture bounds of the great 1960 Mw 9.5 Valdivia earthquake, the largest ever observed in historical times. Using GPS, InSAR, gravity, seismic reflection, and geological data, Marcos Moreno and colleagues from GFZ as well as from Chile show that the 2016 earthquake occurred at the deep boundary of a persistent asperity on the interface between the subducting Nazca and overriding South American plates, where both plates are coupled and not sliding past each other in spite of the high convergence velocity of 68 mm/year. This asperity broke during the 1960 Chile earthquake but has since healed and recovered. | According to the model presented in their study, the shallower failure is representative of a great event (1960-class) and the deeper event represents a large earthquake (2016-class). Given the lag time of 56 years since the 1960 event, the model suggests that the pressure of fluid (i.e., largely water) at the plate interface zone is close to lithostatic at the deeper interface and is slightly lower at the shallower interface. If the water pressure at the plate interface zone becomes as high as the pressure of the overlying rock column, the strength of the rocks at the plate interface becomes practically zero -- an effect akin to aquaplaning will initiate, eventually triggering an earthquake. It is proposed that the development of this modelling strategy could enable the estimation of critical failure thresholds for other mapped subduction asperities where subducting and overriding plates are currently locked. | Earthquakes | 2,018
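The aquaplaning analogy corresponds to the standard effective-stress description of fault strength. The Python sketch below is a minimal illustration of that relationship, not the authors' model; the density, friction coefficient and depth are assumed round numbers.

RHO_ROCK = 2800.0  # assumed rock density, kg/m^3
G = 9.81           # gravitational acceleration, m/s^2
MU = 0.6           # assumed friction coefficient (typical laboratory value)

def fault_strength_pa(depth_m, pore_pressure_ratio):
    # Coulomb frictional strength: tau = mu * (sigma_lithostatic - p_fluid).
    # pore_pressure_ratio is p_fluid / sigma_lithostatic; at 1.0 (lithostatic
    # pore pressure) the effective normal stress, and hence strength, vanishes.
    sigma_litho = RHO_ROCK * G * depth_m
    p_fluid = pore_pressure_ratio * sigma_litho
    return MU * (sigma_litho - p_fluid)

for ratio in (0.4, 0.9, 0.99, 1.0):
    tau = fault_strength_pa(depth_m=25_000, pore_pressure_ratio=ratio)
    print(f"p_fluid/sigma = {ratio:4.2f} -> strength ~ {tau / 1e6:8.1f} MPa")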
April 2, 2018 | https://www.sciencedaily.com/releases/2018/04/180402123244.htm | Modeling future earthquake and tsunami risk in southwestern Japan | Geoscience researchers at the University of Massachusetts Amherst, Smith College and the Japanese Agency for Marine-Earth Science and Technology this week unveiled new, GPS-based methods for modeling earthquake-induced tsunamis for southwestern Japan along the Nankai Trough. A Nankai-induced tsunami is likely to hit there in the next few decades, says lead author Hannah Baranes at UMass Amherst, and has the potential to displace four times the number of people affected by the massive Tohoku tsunami of 2011. | She and her doctoral advisor Jonathan Woodruff, with Smith College professor Jack Loveless and Mamoru Hyodo at the Japanese agency, report the details in a newly published paper.
As she explains, after the unexpectedly devastating 2011 quake and tsunami, Japan's government called for hazard-assessment research to define the nation's worst-case scenarios for earthquakes and tsunamis. Baranes notes, "The government guideline has focused attention on the Nankai Trough. It's a fault offshore of southern Japan that is predicted to generate a magnitude 8 to 9 earthquake within the next few decades."
The team's research, supported by the National Science Foundation and a NASA graduate fellowship, began with a study of coastal lake sediments in Japan to establish long-term records of tsunami flooding. Between 2012 and 2014, Baranes and Woodruff collected sediment cores from lakes, looking for marine sand layers washed onshore by past extreme coastal floods. "These sand deposits get trapped and preserved at the bottoms of coastal lakes," she says. "We can visit these sites hundreds or even thousands of years later and find geologic evidence for past major flood events."
Results from Lake Ryuuoo, a small lake on an island in the Bungo Channel, show a surprising sand layer washed into Lake Ryuuoo by seawater rushing over a 13-foot-high barrier beach. "We were able to date the layer to the early 1700s, which is consistent with the known Nankai Trough tsunami event of record from 1707," Baranes says.
She adds, "We were a bit puzzled. The Bungo Channel is tucked between two of Japan's main islands and is relatively sheltered from Nankai Trough-generated tsunamis. Given recent tsunamis in the region, a minimum 13-foot tsunami in the channel seemed very unlikely." Further, she points out, the Bungo Channel area today has a great deal of sensitive and critical infrastructure, including the only nuclear power plant on the island of Shikoku. This gave the researchers "particular concern" about tsunami hazard there, so they decided to investigate their original finding further using numerical modeling techniques.
As Baranes explains, an earthquake is caused by plates slipping past each other along faults in the earth's crust. That slip causes the earth's surface to deform, to uplift in some places and sink, or subside, in others. "When earthquake-induced uplift occurs on the sea floor, it displaces the entire column of water above it and generates the wave that we call a tsunami," she adds. "We can simulate that process with numerical models."
She and Woodruff tried using one of the most widely cited models for the 1707 Nankai Trough earthquake to flood Lake Ryuuoo, but this only generated a six-foot tsunami that came nowhere near overtopping the 13-foot barrier beach.
"At that point, we were still stumped," says Baranes.
"But it wasn't long before we had a stroke of good luck in learning that a leading expert on tectonic modeling in Japan, Jack Loveless, is a professor just down the road at Smith College." Loveless uses very precise GPS measurements of earth surface motion to model the extent and spatial distribution of frictional locking that causes fault stress to build up between earthquakes.With Loveless, the team created earthquake scenarios based on GPS estimates of present-day frictional locking along the Nankai Trough and for the first time rigorously tested methods for creating potential future earthquake scenarios from the GPS measurements. They tested various methods for creating a suite of GPS-based earthquake scenarios and simulated the resulting ground surface displacement and tsunami inundation.Baranes reports that they found GPS measurements of present-day earth surface motion around the Nankai Trough yield an earthquake of a similar magnitude and extent as the 1707 event, and their simulated tsunami heights are consistent with historical accounts of the 1707 event. As for matching the Lake Ryuuoo geologic record, she adds, "Our model earthquake scenarios showed the Bungo Channel region subsiding seven feet and lowering Lake Ryuuoo's barrier beach from 13 to six feet, such that a tsunami with a feasible height for an inland region easily flooded the lake."Woodruff, who conducted the study as part of a Fulbright fellowship, says, "Although our methodology was well received, our result for the Bungo Channel was met with a lot of skepticism. We needed to find an independent method for validating it." They enlisted Hyodo, who had previously published earthquake scenarios based on models of the Nankai Trough's physical characteristics. His physical model yielded the same focused subsidence in the Bungo Channel, Woodruff reports.Baranes adds, "His model was also consistent with our GPS-based model in terms of earthquake magnitude, ground surface displacement and tsunami inundation. This was a really neat result because in addition to providing an independent line of evidence for significant tsunami hazard in the Bungo Channel, we demonstrated a connection between the Nankai Trough's physical characteristics and GPS measurements of surface motion." | Earthquakes | 2,018 |
March 27, 2018 | https://www.sciencedaily.com/releases/2018/03/180327132032.htm | Sediment core from sluice pond contains evidence for 1755 New England earthquake | Signs of a 1755 earthquake that was strong enough to topple steeples and chimneys in Boston can be seen in a sediment core drawn from eastern Massachusetts' Sluice Pond, according to a new report. | Katrin Monecke of Wellesley College and her colleagues were able to identify a layer of light brown organic-rich mud within the core, deposited between 1740 and 1810, as part of an underwater landslide, possibly unleashed by the 1755 Cape Ann earthquake.
The Cape Ann earthquake is the most damaging historic earthquake in New England. While its epicenter was probably located offshore in the Atlantic, the shaking was felt along the North American eastern seaboard from Nova Scotia to South Carolina. Based on contemporary descriptions of damage from Boston and nearby villages, the shaking has been classified at modified Mercalli intensities of "strong" to "very strong" (VI-VII), meaning that it would have caused slight to moderate damage to ordinary structures.
New England is located within a tectonic plate, so "it is not as seismically active as places like California, at an active tectonic plate margin," said Monecke. "There are zones of weakness mid-plate in New England and you do build up tectonic stress here; you just don't build it up at the same rate that would occur at a plate boundary."
With few faults to study, however, researchers like Monecke and her colleagues are looking for signs of seismically induced landslides or the deformation of soft soils to trace the historic and prehistoric record of earthquakes in the region.
Monecke hopes that the new Sluice Pond core will give seismologists a way "to calibrate the sedimentary record of earthquakes in regional lakes."
"It is important to see what an earthquake signature looks like in these sediments, so that we can start looking at deeper, older records in the region and then figure out whether 1755-type earthquakes take place, for example, every 1000 years or every 2000 years," Monecke added.
The researchers chose Sluice Pond to look for signs of the Cape Ann earthquake for a variety of reasons. First, the lake is located within the area of greatest shaking from the 1755 event, "and we know from other studies of lakes that have been carried out elsewhere that you need intensities of approximately VII to cause any deformation within the lake sediments," Monecke said.
Sluice Pond also has steep sides to its center basin, which would make it susceptible to landsliding or underwater sliding during an earthquake with significant shaking. The deep basin, with a depth of close to 65 feet, also harbored a relatively undisturbed accumulation of sediments for coring.
Through a painstaking analysis of sediment size and composition, pollen and plant material and even industrial contaminants, the research team was able to identify changes in sediment layers over time in the core. The light brown layer deposited at the time of the Cape Ann quake caught their eye, as it contained a coarser mix of sediments and a slightly different mix of plant microfossils.
"These were our main indicators that something had happened in the lake.
We saw these near-shore sediments and fragments of near-shore vegetation that appear to have been washed into the deep basin" by strong shaking, said Monecke.
In an interesting twist, land clearing by early settlers from as far back as 1630 may have made the underwater slopes more susceptible to shaking, Monecke said. Sediment washed into the lake from cleared land loads up the underwater slopes and makes them more prone to failure during an earthquake, she noted.
For that reason, the sediment signature linked to prehistoric earthquakes may look a little different from that seen with the Cape Ann event, and Monecke and her colleagues are hoping to sample even older layers of New England lakes to continue building their record of past earthquakes.
The research team is taking a closer look at a more famous New England body of water: Walden Pond. "It got slightly less ground shaking [than Sluice Pond] in 1755, but it might have been affected by a 1638 earthquake in southern New Hampshire," Monecke explained. "We already have sediment cores from that lake, and now we are unraveling its sedimentary history and trying to get an age model there as well." | Earthquakes | 2,018
March 21, 2018 | https://www.sciencedaily.com/releases/2018/03/180321141417.htm | Seismologists introduce new measure of earthquake ruptures | A team of seismologists has developed a new measurement of seismic energy release that can be applied to large earthquakes. Called the Radiated Energy Enhancement Factor (REEF), it provides a measure of earthquake rupture complexity that better captures variations in the amount and duration of slip along the fault for events that may have similar magnitudes. | Magnitude is a measure of the relative size of an earthquake. There are several different magnitude scales (including the original Richter scale), with the "moment magnitude" now the most widely used measure because it is uniformly applicable to all sizes of earthquakes. The seismic energy released in an earthquake can also be measured directly from recorded ground shaking, providing a distinct measure of the earthquake process. Earthquakes of a given magnitude can have very different radiated seismic energy.
Researchers at UC Santa Cruz and California Institute of Technology (Caltech) devised REEF in an effort to understand variations in the rupture characteristics of the largest and most destructive earthquakes, such as the 2004 Sumatra earthquake (magnitude 9.2) and the 2011 Tohoku earthquake in Japan (magnitude 9.1). They introduced the new measurement in a paper published March 21.
REEF is measured by the ratio of the earthquake's actual measured radiated energy (in seismic waves recorded around the world) to the minimum possible energy that an event of equal seismic moment and rupture duration would produce. If the rupture is jerky and irregular, it radiates more seismic energy, especially at high frequencies, and this indicates frictional conditions and dynamic processes on the fault plane during rupture, Lay explained.
The researchers made systematic measurements of REEF for 119 recent major earthquakes of magnitudes 7.0 to 9.2. They found clear regional patterns, with some subduction zones having higher-REEF ruptures on average than other zones.
"This indicates, for the first time, that energy release is influenced by regional properties of each fault zone," said Lay, a professor of Earth and planetary sciences at UCSC.
The precise cause of some regions radiating higher energy in events of a given size is still under investigation, but may be linked to regional differences in the roughness of the faults, in the fluid distributions on the faults, or in the sediments trapped in the fault zone, he said.
Further research using REEF could help seismologists achieve better understanding of earthquake mechanics and earthquake hazards around the world.
This research was supported by the National Science Foundation of China, Chinese Academy of Sciences, and U.S. National Science Foundation. | Earthquakes | 2,018
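The ratio defining REEF can be made concrete with a small calculation. The Python sketch below assumes the minimum-energy reference event has a smooth parabolic moment-rate function -- a common choice, but an assumption here, since the summary above does not spell out the paper's exact reference definition -- and uses illustrative density, wave speeds and event numbers.

import math

RHO = 3300.0  # density, kg/m^3 (assumed)
VP = 7500.0   # P-wave speed, m/s (assumed)
VS = 4300.0   # S-wave speed, m/s (assumed)

# Point-source radiated energy: E = K * integral of (d2M/dt2)^2 dt.
K = 1.0 / (15 * math.pi * RHO * VP**5) + 1.0 / (10 * math.pi * RHO * VS**5)

def minimum_radiated_energy(m0_nm, duration_s):
    # For the parabolic moment-rate function Mdot(t) = 6*M0*t*(T - t)/T^3,
    # the integral of Mdotdot^2 over 0..T evaluates to 12*M0^2/T^3.
    return 12.0 * K * m0_nm**2 / duration_s**3

def reef(radiated_energy_j, m0_nm, duration_s):
    return radiated_energy_j / minimum_radiated_energy(m0_nm, duration_s)

# Illustrative event (made-up numbers): M0 = 1e21 N.m (~Mw 7.9),
# 60 s rupture, 2e16 J of measured radiated energy.
print(f"REEF ~ {reef(2.0e16, 1.0e21, 60.0):.0f}")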
March 21, 2018 | https://www.sciencedaily.com/releases/2018/03/180321110855.htm | Radar images show large swath of Texas oil patch is heaving and sinking at alarming rates | Two giant sinkholes near Wink, Texas, may just be the tip of the iceberg, according to a new study that found alarming rates of new ground movement extending far beyond the infamous sinkholes. | That's the finding of a geophysical team from Southern Methodist University, Dallas, that previously reported the rapid rate at which the sinkholes are expanding and new ones forming.
Now the team has discovered that various locations in large portions of four Texas counties are also sinking and uplifting.
Radar satellite images show significant movement of the ground across a 4000-square-mile area -- in one place as much as 40 inches over the past two-and-a-half years, say the geophysicists.
"The ground movement we're seeing is not normal. The ground doesn't typically do this without some cause," said geophysicist Zhong Lu, a professor in the Roy M. Huffington Department of Earth Sciences at SMU and a global expert in satellite radar imagery analysis.
"These hazards represent a danger to residents, roads, railroads, levees, dams, and oil and gas pipelines, as well as potential pollution of ground water," Lu said. "Proactive, continuous detailed monitoring from space is critical to secure the safety of people and property."
The scientists made the discovery with analysis of medium-resolution (15 feet to 65 feet) radar imagery taken between November 2014 and April 2017. The images cover portions of four oil-patch counties where there's heavy production of hydrocarbons from the oil-rich West Texas Permian Basin.
The imagery, coupled with oil-well production data from the Texas Railroad Commission, suggests the area's unstable ground is associated with decades of oil activity and its effect on rocks below the surface of the earth.
The SMU researchers caution that ground movement may extend beyond what radar observed in the four-county area. The entire region is highly vulnerable to human activity due to its geology -- water-soluble salt and limestone formations, and shale formations.
"Our analysis looked at just this 4000-square-mile area," said study co-author Jin-Woo Kim, a research scientist in the SMU Department of Earth Sciences.
"We're fairly certain that when we look further, and we are, that we'll find there's ground movement even beyond that," Kim said. "This region of Texas has been punctured like a pin cushion with oil wells and injection wells since the 1940s and our findings associate that activity with ground movement."
Lu, Shuler-Foscue Chair at SMU, and Kim reported their findings in a Nature publication.
The researchers analyzed satellite radar images that were made public by the European Space Agency, and supplemented that with oil activity data from the Texas Railroad Commission.
The study is among the first of its kind to identify small-scale deformation signals over a vast region by drawing from big data sets spanning a number of years and then adding supplementary information.
The research is supported by the NASA Earth Surface and Interior Program, and the Shuler-Foscue Endowment at SMU.
The SMU geophysicists focused their analysis on small, localized, rapidly developing hazardous ground movements in portions of Winkler, Ward, Reeves and Pecos counties, an area nearly the size of Connecticut.
The study area includes the towns of Pecos, Monahans, Fort Stockton, Imperial, Wink and Kermit.
The images from the European Space Agency are the result of satellite radar interferometry from recently launched open-source orbiting satellites that make radar images freely available to the public.
With interferometric synthetic aperture radar, or InSAR for short, the satellites allow scientists to detect changes that aren't visible to the naked eye and that might otherwise go undetected.
The satellite technology can capture ground deformation with an accuracy of sub-inches or better, at a spatial resolution of a few yards or better over thousands of miles, say the researchers.
The SMU researchers found a significant relationship between ground movement and oil activities that include pressurized fluid injection into the region's geologically unstable rock formations.
Fluid injection includes waste saltwater injection into nearby wells, and carbon dioxide flooding of depleting reservoirs to stimulate oil recovery.
Injected fluids increase the pore pressure in the rocks, and the release of the stress is followed by ground uplift. The researchers found that ground movement coincided with sequences of wastewater injection rates and volumes, and with CO2 injection, in nearby wells.
Also related to the ground's sinking and upheaval are dissolving salt formations due to freshwater leaking into abandoned underground oil facilities, as well as the extraction of oil.
As might be expected, the most significant subsidence is about a half-mile east of the huge Wink No. 2 sinkhole, where there are two subsidence bowls, one of which has sunk more than 15.5 inches a year. The rapid sinking is most likely caused by water leaking through abandoned wells into the Salado formation and dissolving salt layers, threatening possible ground collapse.
At two wastewater injection wells 9.3 miles west of Wink and Kermit, the radar detected upheaval of about 2.1 inches that coincided with increases in injection volume. The injection wells extend about 4,921 feet to 5,577 feet deep into a sandstone formation.
In the vicinity of 11 CO2 injection wells nearly seven miles southwest of Monahans, the radar analysis detected surface uplift of more than 1 inch. The wells are about 2,460 feet to 2,657 feet deep. As with wastewater injection, CO2 injection increased pore pressure in the rocks, so when stress was relieved it was followed by uplift of about 1 inch at the surface.
The researchers also looked at an area 4.3 miles southwest of Imperial, where significant subsidence from fresh water flowing through cracked well casings, corroded steel pipes and unplugged abandoned wells has been widely reported.
Water there has leaked into the easily dissolved Salado formation, created voids, and caused the ground to sink and water to rise from the subsurface, including creating Boehmer Lake, which didn't exist before 2003.
Radar analysis by the SMU team detected rapid subsidence ranging from three-fourths of an inch to nearly 4 inches around active wells, abandoned wells and orphaned wells.
"Movements around the roads and oil facilities to the southwest of Imperial, Texas, should be thoroughly monitored to mitigate potential catastrophes," the researchers write in the study.
About 5.5 miles south of Pecos, their radar analysis detected more than 1 inch of subsidence near new wells drilled via hydraulic fracturing and in production since early 2015.
There have also been six small earthquakes recorded there in recent years, suggesting the deformation of the ground generated accumulated stress and caused existing faults to slip.
"We have seen a surge of seismic activity around Pecos in the last five to six years. Before 2012, earthquakes had not been recorded there. At the same time, our results clearly indicate that ground deformation near Pecos is occurring," Kim said. "Although earthquakes and surface subsidence could be coincidence, we cannot exclude the possibility that these earthquakes were induced by hydrocarbon production activities."
Kim stated the need for improved earthquake locations and detection thresholds through an expanded network of seismic stations, along with continuous surface monitoring with the demonstrated radar remote sensing methods.
"This is necessary to learn the cause of recent increased seismic activity," Kim said. "Our efforts to continuously monitor West Texas with this advanced satellite technique can help sustain safe, ongoing oil production."
The satellite radar datasets allowed the SMU geophysicists to detect east-west deformation of the ground as well as vertical deformation.
Lu, a leading scientist in InSAR applications, is a member of the Science Team for the dedicated U.S. and Indian NASA-ISRO (called NISAR) InSAR mission, set for launch in 2021 to study hazards and global environmental change.
InSAR accesses a series of images captured by a read-out radar instrument mounted on the orbiting Sentinel-1A/B satellites. The satellites orbit 435 miles above the Earth's surface. Sentinel-1A was launched in 2014 and Sentinel-1B in 2016 as part of the European Union's Copernicus program.
The Sentinel-1A/B constellation bounces a radar signal off the earth, then records the signal as it bounces back, delivering measurements. The measurements allow geophysicists to determine the distance from the satellite to the ground, revealing how features on the Earth's surface change over time.
"Near real-time monitoring of ground deformation at high spatial and temporal resolutions is possible in a few years, using multiple satellites such as Sentinel-1A/B, NISAR and others," said Lu. "This will revolutionize our capability to characterize human-induced and natural hazards, and reduce their damage to humanity, infrastructure and the energy industry." | Earthquakes | 2,018
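As a concrete illustration of how the radar measurements described above become displacement estimates: interferometric phase differences map to line-of-sight motion through the radar wavelength. The Python sketch below uses the Sentinel-1 C-band wavelength and an assumed incidence angle; sign conventions vary between processors, so treat the signs as illustrative.

import math

WAVELENGTH_M = 0.0555  # Sentinel-1 C-band wavelength, about 5.55 cm

def los_displacement_m(delta_phase_rad):
    # One full interferometric fringe (2*pi) corresponds to half a
    # wavelength of motion along the line of sight (two-way path).
    return -WAVELENGTH_M / (4.0 * math.pi) * delta_phase_rad

def vertical_displacement_m(delta_phase_rad, incidence_deg):
    # Crude projection to vertical, assuming purely vertical ground motion;
    # real studies combine ascending and descending tracks to separate
    # vertical from east-west components, as this study did.
    return los_displacement_m(delta_phase_rad) / math.cos(math.radians(incidence_deg))

# Example: three full fringes of phase change at a 39-degree incidence angle.
dphi = 3 * 2 * math.pi
print(f"line-of-sight: {los_displacement_m(dphi) * 100:.1f} cm")
print(f"vertical (assumed): {vertical_displacement_m(dphi, 39.0) * 100:.1f} cm")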
March 19, 2018 | https://www.sciencedaily.com/releases/2018/03/180319144601.htm | Historians to climate researchers: Let's talk | History can tell us a lot about environmental upheaval, say Princeton University historians John Haldon and Lee Mordechai. What is missing in today's debate about climate change is using what we know about how past societies handled environmental stresses to help inform our own situation. | Developing policies to address the challenges of modern, global climate change requires understanding the science and the contemporary politics, as well as understanding how societies through history have responded to the climate changes they encountered.
Ours is not the first society to be confronted by environmental change, Haldon, Mordechai and an international team of co-authors noted in a paper published in the current issue of the Proceedings of the National Academy of Sciences. Over the course of history, some societies have been destroyed by natural disasters, like the eruption of Pompeii, while others have learned how to accommodate floods, droughts, volcanic eruptions and other natural hazards.
The key is "how a society plans for and interacts with the stress from nature," said Mordechai, who earned his Ph.D. in history from Princeton in 2017. He cautioned that policymakers looking at how to prepare for global climate change "should understand that it is not going to be a short-term process. It will take time. We, collectively, as a society, need to prepare for these things in advance."
"Human societies are much more resilient, much more adaptive to change than we would expect," he said. He pointed out that many societies developed precisely in locations where the environment was difficult to control, such as the flood basins of the Nile and Euphrates rivers or earthquake-prone areas like Constantinople (now Istanbul).
Similar catastrophic natural events can play out very differently, Mordechai said, depending on how well prepared the society is to handle the occurrence and its aftermath. "In 2010, there were two very similar earthquakes: one in Christchurch, New Zealand, and the other in Port au Prince, Haiti," he said. "The Haiti earthquake killed anywhere between 46,000 and 316,000 people. The Christchurch earthquake, at the same magnitude, killed one person. And it's actually debatable [if that is what] killed him or not."
The differences in the outcomes in Haiti and in New Zealand highlight the host of factors that come into play when examining the connection between a society and its environment. Too often, researchers will spot a correlation between the climate record and the historical record and leap to a too-simple conclusion, said Haldon, the Shelby Cullom Davis '30 Professor of European History and a professor of history and Hellenic studies.
In their article, Haldon and his co-authors looked at four case studies to examine some of the ways societies have and have not coped with natural stresses: the Mediterranean in the Early Middle Ages (600-900), Europe during the Carolingian Era (750-950), Central America in the Classic Period (650-900) and Poland during the Little Ice Age (1340-1700).
In each case, they showed how a simple, environmental interpretation of events overlooked the key context.
"If I would have to summarize what history has to contribute: it adds nuance to our interpretation of past events," said Mordechai.In the case of the Mayans in Caracol, Belize, for example, the authors noted that before its apparent collapse, Mayan society had withstood some 2,000 years of climate variations in a challenging environment. Other authors have suggested that a severe drought ended the civilization, but Haldon's research team correlated archaeological data, written hieroglyphic history and the projected drought cycles and found that the community actually expanded after each drought.So what else could be responsible for the abrupt end to the massive city? The researchers saw that after a century of warfare, Caracol's elite had adjusted long-standing economic and social policies to widen the divide between themselves and the commoners. The research team concluded that socioeconomic factors, accompanied by warfare, were more responsible for the city's abrupt demise than drought.Economic inequality is nothing new, said Mordechai, who is now a Byzantine studies postdoctoral fellow at the University of Notre Dame. "You find this over and over again," he said. "Disasters serve, in a way, to emphasize differences in our human society. [After a hazardous event], rich people suffer less. You see that all over the place."In this and their other three case studies, the researchers argued that historians have a vital contribution to make to conversations between archaeologists and climate scientists, because written documents can unlock what they call the "cultural logic" of a society: how people understand what is happening, which in turn determines how they respond to it.Historians bring "nuance to the search for 'tipping points,'" said Monica Green, a history professor at Arizona State University and a 1985 Ph.D. graduate of Princeton who was not involved in this research. "We want to know which straw broke the camel's back. But sometimes, we realize that the answer lies not in identifying a specific straw, but something about the camel or the ambient environment."Others have also called for this convergence of history with science, but Haldon's group is the first to show exactly what that might look like, said Carrie Hritz, associate director of research for the National Socio-Environmental Synthesis Center in Annapolis, Maryland, who was not involved in this research. "Past work has centered around calls for integration with somewhat vague statements about how history and archaeological data can be relevant to current studies of the human dimensions of climate change. This paper is unique in that it [provides] detailed examples that link these data to current topics."In recent years, archaeologists have begun incorporating scientific data sets -- such as pollen deposits that reveal crop choices and tree rings that reflect good and bad growing seasons -- even as biologists have started writing history books that argue for "environmental determinism," the idea that natural events often determine the course of societies.Neither side has the whole story, said Haldon, who is also an associated faculty member with the Princeton Environmental Institute and the director of the Sharmin and Bijan Mossavar-Rahmani Center for Iran and Persian Gulf Studies at Princeton."There's a danger that we perceived that historians who didn't understand the methodologies and problems of the sciences could easily misuse the science," Haldon said. 
"And we also saw that the same problem works the other way around. Scientists don't really understand how social scientists work and why we ask the questions we ask, so they're often in danger of misusing history and archaeology."To bring historians, archaeologists and paleoclimate scientists into conversation, Haldon helped launch the Climate Change and History Research Initiative, which funds field research, public lectures, workshops and more.Since 2013, its collaborators have addressed the question of "how do we get scientists and social scientists to work together and not misunderstand each other or misuse each other's work?" by creating cross-disciplinary research teams that pose and tackle research questions together.After several years of semiannual, face-to-face group meetings with a growing set of researchers, "we knew we were doing something that nobody else does, but we hadn't thought of how to publicize what we were doing other than through the regular social science approach of writing rather long, boring articles and publishing them in journals nobody reads," said Haldon with a chuckle.At the suggestion of one of their science collaborators, Haldon and Mordechai distilled their research into a paper for the scientific community."The paper is of extremely high importance, because it addresses the lack of true interdisciplinary research in the field of historical environmental studies," said Sabine Ladstätter, director of the Austrian Archaeological Institute, who was not involved in the research. "Complex historical phenomena are currently often discussed without historians in the scientific community as well as in public. This situation in turn leads to simplifying explanatory models, that do not withstand a critical evaluation by historians. The required cooperation between historians, archaeologists and natural scientists (in this case paleo-environmental sciences) is to be welcomed and urgently needed."If he could leave policymakers with one key piece of advice, said Haldon, he would urge them to resist simplistic conclusions and easy explanations:"We're trying to explain how societies can respond in differently resilient ways to stresses and strains, and therefore, it's not that climate and environment don't have a direct impact on society, but rather that the way in which societies respond is often very different, and what is catastrophic for one society might be perfectly well-managed by another one, right next door.""History meets palaeoscience: Consilience and collaboration in studying past societal responses to environmental change," by John Haldon, Lee Mordechai, Timothy Newfield, Arlen Chase, Adam Izdebski, Piotr Guzowski, Inga Labuhn and Neil Roberts was published March 12 in the | Earthquakes | 2,018 |
March 19, 2018 | https://www.sciencedaily.com/releases/2018/03/180319091031.htm | Listening for micro earthquakes, hearing mega whales in the Arctic | How does the sound of a tiny tremor of the earth differ from the sound of a huge passing whale? That is one of the things that scientists had to figure out while listening for the sound of methane release from the sea floor. | A recent study captured both kinds of sounds. "We can't say if the micro-earthquakes or other micro seismic events are causing the methane leaks or if it is the other way around. A possible explanation is that the build-up of methane below the ocean floor creates bubbles. They could cause very weak tremors as they migrate, expand and release gas into the water column in the area," says Peter Franek, first author of the study and a researcher at CAGE Centre for Arctic Gas Hydrate, Environment and Climate.
The study area is host to hundreds, if not thousands, of known methane leaks. They are associated with temporal changes in the dissociation of gas hydrates -- the icy substance that contains huge amounts of methane. This causes methane release from the seafloor, and potentially into the atmosphere.
The tremors described in this particular study are called short duration events, and are definitely not caused by the same mechanisms as proper earthquakes. Short duration events are only detectable due to a novel use of highly sensitive listening devices.
"This study is unique because we are using instrumentation and techniques commonly used for earthquake research. We use them to identify tiny earth movements generated by the circulation and release of gas from the seafloor. Very few in the world are working with this approach at such a scale," says researcher and co-author Andreia Plaza Faverola at CAGE.
The underwater world is full of different natural sounds that can be recorded by cutting-edge underwater technology such as ocean bottom seismometers (OBS), which include hydro- and geophones. An OBS was placed on the ocean floor at 400-meter water depth offshore Western Svalbard, and recorded every sound from the seabed and ocean for a full year. It was then retrieved by the research vessel Helmer Hanssen from UiT The Arctic University of Norway.
All sounds within a given frequency range were recorded for this study. Scientists then needed to differentiate between the sounds of methane-associated tremors and other recurrent sounds.
"It turns out that fin whales are abundant offshore Western Svalbard and communicate with each other within the range of our investigation. It was an added pleasure for us to record their activity as well as the activity of the methane seeps themselves," says study co-author professor Jürgen Mienert at CAGE.
Fin whales are some of the largest animals in the world, only surpassed in size by the blue whale. Peter Franek says that the scientists were clearly able to make out the calls of the fin whales in such detail that the recordings might be useful even to biologists who wish to study the movement and sound communication patterns of these majestic animals.
"OBS is a non-invasive tool primarily designed for recording natural, non-biological or man-made seismic and acoustic signals. Unexpectedly, it can be used to study communication between whales. It can give a more detailed insight on call patterns, respiration times and swimming speeds of fin whales in their natural habitat, without disturbing them with any human presence." | Earthquakes | 2,018
March 14, 2018 | https://www.sciencedaily.com/releases/2018/03/180314144449.htm | Scientists helping to improve understanding of plate tectonics | Scientists at The Australian National University (ANU) are helping to improve understanding of how rocks in Earth's hot, deep interior enable the motions of tectonic plates, which regulate the water cycle that is critical for a habitable planet. | Research team leader Professor Ian Jackson said tectonic plates were continuously created at mid-ocean ridges and destroyed when they sank back into the Earth's mantle.
"Plate tectonics is responsible for diverse geological phenomena including continental drift, mountain building and the occurrence of volcanoes and earthquakes," said Professor Jackson from the ANU Research School of Earth Sciences.
The stirring of the Earth's interior, which is responsible for the plate motions at the surface, has resulted in the Earth's gradual cooling over its 4.5 billion-year life.
He said defects allowed the normally strong and hard minerals of the Earth's deep interior to change their shape and flow like a viscous fluid on geological timescales.
"We have found that flaws in the regular atomic packing in the dominant upper-mantle mineral, called olivine, which become more prevalent under oxidising conditions, substantially reduce the speeds of seismic waves," Professor Jackson said.
Seismic waves, caused by earthquakes, are used to image the Earth's deep interior in a manner similar to medical CAT scanning.
"Our new findings challenge a long-held theory that defects involving water absorption in these normally dry rocks could control both their viscosity and seismic properties," Professor Jackson said.
ANU Research School of Earth Sciences (RSES) PhD scholar Chris Cline is the lead author of the study, undertaken in collaboration with RSES colleagues and Professor Ulrich Faul at the Massachusetts Institute of Technology in the United States.
The team used specialised equipment in a laboratory at ANU to make synthetic specimens similar to upper mantle rocks and measured their rigidity, which controls seismic wave speeds, under conditions simulating those of the Earth's mantle.
Professor Jackson said the research was particularly relevant to environments where old, cold, and oxidised tectonic plates sink into the Earth's hot interior.
"We have the potential to help map the extent of oxidised regions of the Earth's mantle that play such an important role in the chemical evolution of Earth," he said. | Earthquakes | 2,018
March 13, 2018 | https://www.sciencedaily.com/releases/2018/03/180313091840.htm | Urban planning can help develop cities with reduced seismic risk | Researchers from Universidad Politécnica de Madrid (UPM) suggest a new methodology to establish the urban modifiers that affect building habitability in seismic risk areas. | What types of buildings are more predisposed to suffer damage after an earthquake? This is what a team of researchers from the Research Group on Earthquake Engineering at the School of Land Surveying, Geodesy and Mapping Engineering of UPM is trying to find out.
For this purpose, the researchers carried out a study that took into account urban-planning parameters (urban modifiers), using them to identify building typologies and classify them according to their habitability after an earthquake. Applying this methodology, one could create a visual catalog of buildings and rapidly identify those that could become uninhabitable after an earthquake.
The seismic vulnerability of a building refers to its predisposition to suffer damage after an earthquake. Today there are many studies that classify buildings according to their seismic vulnerability, and there are also numerous methodologies that use these typologies, or add new ones, to estimate building damage after an event of this type.
The aim of the project carried out by the UPM researchers was to find out whether the urban modifiers were related to habitability. To this end, they developed a classification of the urban parameters that increase damage after an earthquake; later, after a statistical study and a grouping of the modifiers, they classified the building typologies that could remain uninhabitable after an earthquake.
Sandra Martínez Cuevas, a researcher involved in this study, says: "This information would be very valuable for councils and regions located in seismic risk areas, since it would allow them to catalog their building stock, as well as for civil protection, which could predict which buildings would remain uninhabitable."
In order to determine the link between the urban modifiers and the damage, the researchers carried out an exploratory study of such modifiers in the city of Lorca to establish their correlation with the damage caused by the 2011 earthquake. They selected three areas of study in the city, with a total of 816 buildings, and conducted exhaustive fieldwork to assess seismic vulnerability. They directly observed the characteristics of each building unit.
The buildings were classified according to their structure (concrete or masonry) and their urban modifiers. In addition, the researchers took into account the ground on which the buildings were located and the type of damage, in order to homogeneously compare the data and carry out a statistical study. As a result, an extensive and complete database with information about the constructions of Lorca was obtained and implemented in a geographic information system.
Later, they conducted a correlation analysis to determine the link between the urban modifiers and habitability, and defined a discrimination index.
With this first analysis in hand, they calibrated the urban modifiers for each type of ground and each structural typology.
Sandra Martínez Cuevas says: "This first grading of the urban modifiers in relation to damage will allow us to influence urban regulation in the city of Lorca, give recommendations for urban planning, and reduce the damage from possible future earthquakes."
Finally, the researchers used a contingency table to assess the dependence between two qualitative (nominal or ordinal) variables. From this contingency table, the link between the urban modifiers and habitability can be assessed.
With the obtained results, the researchers built a habitability scale and assigned to its levels the associated probability of damage greater than or equal to 70% for each structural typology (concrete and masonry) and each type of ground (hard and soft). Sandra Martínez Cuevas says: "As a result, we obtained diverse building typologies that would give the habitability cartography with at least 70% matching. In other words, we could say with 70% reliability which buildings would be uninhabitable in Lorca if an earthquake with the same characteristics as the one of May 2011 occurred."
She adds: "By using this method, we can map the habitability of buildings, which can provide an approach of great interest for mitigation tasks and early-response planning." It is essential to transfer the results to the bodies responsible for city planning and for civil protection and emergency management in order to develop cities with less seismic risk. | Earthquakes | 2,018
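The contingency-table step described above is, in generic form, a test of independence between a qualitative urban modifier and a habitability outcome. The following Python sketch shows one standard way to run such a test; the counts are invented, not the Lorca data.

from scipy.stats import chi2_contingency

# Rows: urban modifier present / absent (e.g., a soft first story).
# Columns: building habitable / uninhabitable after the earthquake.
# Counts are hypothetical placeholders.
table = [
    [34, 51],  # modifier present
    [88, 22],  # modifier absent
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.2g}")
if p_value < 0.05:
    print("The modifier and habitability appear dependent (5% level).")
else:
    print("No evidence of dependence at the 5% level.")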
February 27, 2018 | https://www.sciencedaily.com/releases/2018/02/180227233301.htm | Human-made earthquake risk reduced if fracking is 895m from faults | The risk of human-made earthquakes due to fracking is greatly reduced if high-pressure fluid injection used to crack underground rocks is 895m away from faults in the Earth's crust, according to new research. | The recommendation, from the ReFINE (Researching Fracking) consortium, is based on published microseismic data from 109 fracking operations carried out predominantly in the USA.
Jointly led by Durham and Newcastle Universities, UK, the research looked at reducing the risk of reactivating geological faults by fluid injection in boreholes.
Researchers used microseismic data to estimate how far fracking-induced fractures in rock extended horizontally from borehole injection points.
The results indicated there was a one per cent chance that fractures from fracking activity could extend horizontally beyond 895m in shale rocks.
There was also a 32 per cent chance of fractures extending horizontally beyond 433m, which had been previously suggested as a horizontal separation distance between fluid injection points and faults in an earlier study.
The research is published in an academic journal.
Fracking -- or hydraulic fracturing -- is a process in which rocks are deliberately fractured to release oil or gas by injecting highly pressurised fluid into a borehole. This fluid is usually a mixture of water, chemicals and sand.
In 2011 tremors in Blackpool, UK, were caused when injected fluid used in the fracking process reached a previously unknown geological fault at the Preese Hall fracking site.
Fracking is now recommencing onshore in the UK after it was halted because of fracking-induced earthquakes.
Research lead author Miles Wilson, a PhD student in Durham University's Department of Earth Sciences, said: "Induced earthquakes can sometimes occur if fracking fluids reach geological faults. Induced earthquakes can be a problem and, if they are large enough, could damage buildings and put the public's safety at risk.
"Furthermore, because some faults allow fluids to flow along them, there are also concerns that if injected fluids reach a geological fault there is an increased risk they could travel upwards and potentially contaminate shallow groundwater resources such as drinking water.
"Our research shows that this risk is greatly reduced if injection points in fracking boreholes are situated at least 895m away from geological faults."
The latest findings go further than a 2017 ReFINE study, which recommended a separation distance of 433m between horizontal boreholes and geological faults. That research was based upon numerical modelling in which a number of factors, including fluid injection volume and rate, and fracture orientation and depth, were kept constant.
Researchers behind the latest study said that changing these parameters might lead to different horizontal extents of fractures from fluid injection points.
The researchers added that this did not mean the modelling results of the previous study were wrong.
Instead, they said the previous study was approaching the same problem using a different method, and that the new study provided further context.
In the latest research, the team used data from previous fracking operations to measure the distance between the furthest detected microseismic event -- a small earthquake caused by hydraulic fracturing of the rock or fault reactivation -- and the injection point in the fracking borehole.
From the 109 fracking operations analysed, the researchers found that the horizontal extent reached by hydraulic fractures ranged from 59m to 720m.
There were 12 examples of fracking operations where hydraulic fractures extended beyond the 433m proposed in the 2017 study.
According to the new study, the chance of a hydraulic fracture extending beyond 433m in shale was 32 per cent, and beyond 895m was one per cent.
The researchers also found that fracking operations in shale rock generally had their furthest detected microseismic events at greater distances than those in coal and sandstone rocks.
Microseismic data was used in previous Durham University research from 2012. This suggested a minimum vertical distance of 600m between the depth of fracking and aquifers used for drinking water, which now forms the basis of hydraulic fracturing regulation in the UK's Infrastructure Act 2015.
Professor Richard Davies, Newcastle University, who leads the ReFINE project, said: "We strongly recommend that for the time being, fracking is not carried out where faults are within 895m of the fracked borehole, to avoid the risk of fracking causing earthquakes, and that this guideline is adopted world-wide."
ReFINE is led jointly by Durham and Newcastle Universities and has been funded by the Natural Environment Research Council (UK), Total, Shell, Chevron, GDF Suez, Centrica and Ineos.
Working closely with a global network of leading scientists and institutions, ReFINE focuses on researching the potential environmental risks of hydraulic fracturing for shale gas and oil exploitation. | Earthquakes | 2,018
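The one-per-cent and 32-per-cent figures above are exceedance probabilities estimated from the measured fracture extents. As a hedged sketch of how such numbers can be derived -- using a synthetic sample and an assumed lognormal fit rather than the study's 109 measurements or its actual statistical method -- one might write the following Python:

import math
import random

random.seed(1)
# Synthetic maximum horizontal extents (m), standing in for the study's
# measurements (which ranged from 59m to 720m); parameters are chosen so
# the sample loosely mimics the reported percentages.
extents_m = [random.lognormvariate(5.89, 0.39) for _ in range(109)]

logs = [math.log(x) for x in extents_m]
mu = sum(logs) / len(logs)
sigma = (sum((v - mu) ** 2 for v in logs) / (len(logs) - 1)) ** 0.5

def exceedance_probability(distance_m):
    # P(extent > distance) under the fitted lognormal distribution.
    z = (math.log(distance_m) - mu) / (sigma * math.sqrt(2.0))
    return 0.5 * math.erfc(z)

for d in (433.0, 895.0):
    print(f"P(extent > {d:.0f} m) ~ {100 * exceedance_probability(d):.1f}%")

A fitted distribution is one way to estimate small tail probabilities beyond the largest observed value (720m), which simple counting of the sample cannot do.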
February 26, 2018 | https://www.sciencedaily.com/releases/2018/02/180226122503.htm | Health staff 'too stressed' to deal with disasters | Increasing stress and a lack of motivation among healthcare staff could result in hospitals having to shut down in the wake of a major incident such as flooding or an earthquake, according to new research. | The research, led by Anglia Ruskin University, examined studies from across the world. It found that the capacity of clinical and non-clinical staff in hospitals and clinics to deal with incidents such as floods, earthquakes or other natural hazards is severely limited by a high workload and challenging targets, which result in high levels of psychological stress.
The findings also suggested that some staff are left feeling unmotivated and unattached to their workplace, meaning they are less likely to take the initiative in such a scenario and may even avoid coming into work. Only 21% of participants in the research expressed complete satisfaction with their jobs and workplace.
Dr Nebil Achour, lead author and Senior Lecturer in Healthcare Management at Anglia Ruskin University, said: "Healthcare services in many countries across the world are under severe strain, which leaves little opportunity for staff to be trained in disaster resilience. Yet healthcare is among the most critical services in any country during and after a major incident has occurred.
"Staff suffer from increasing workload and stricter performance measures with less flexibility. This has caused psychological and physical stress and makes them unable to respond to any further stress associated with major hazards.
"Many staff members do not feel attached to their workplace and do not feel that they have enough flexibility to take the initiative and lead their own way. This in turn also makes them less motivated to learn the extra skills needed to deal with a catastrophic event.
"Combined, these factors expose healthcare services to major risk of staff shortage and thus inoperability when a major hazard does strike." | Earthquakes | 2,018
February 20, 2018 | https://www.sciencedaily.com/releases/2018/02/180219124758.htm | Earthquakes follow wastewater disposal patterns in southern Kansas | Wastewater created during oil and gas production and disposed of by deep injection into underlying rock layers is the probable cause for a surge in earthquakes in southern Kansas since 2013, a new report in the | Until 2013, earthquakes were nearly unheard of in Harper and Sumner counties, the site of recent intensive oil and gas production. But between 2013 and 2016, 127 earthquakes of magnitude 3 or greater occurred in Kansas, with 115 of them taking place in Harper and Sumner counties. Prior to 1973, there were no felt earthquakes reported in the area, and only one magnitude 2.0 earthquake between 1973 and 2012. Using data collected by a network of seismic stations installed by the U.S. Geological Survey, lead researcher Justin Rubinstein and his colleagues analyzed 6,845 earthquakes that occurred in the counties between March 2014 and December 2016. They found that the dramatic uptick in seismicity correlated in time and location with increases in wastewater disposal that began in 2012 -- and that decreases in seismicity during that time also corresponded to decreases in disposal rates. Between 1974 and 2012, there were no magnitude 4 or greater earthquakes in the study area. Between 2012 and 2016, six such earthquakes occurred. "The probability of this rate change occurring randomly is approximately 0.16%," Rubinstein and colleagues write in the BSSA study. Kansas had the second-highest statewide earthquake rate in the central United States between 2013 and 2016, coming in behind Oklahoma, where a similar dramatic increase in seismicity also has been linked to wastewater injection. In the southern Kansas study area, wastewater injection decreased significantly in 2015, falling from an average of 5 million barrels a month from July to December 2014 to 3.8 million barrels per month in March 2015. This decrease was likely due in part to a drop in oil and gas production as the prices for those commodities dropped, the researchers said, noting that the price of a barrel of oil fell by half between August 2014 and January 2015. During the same time, the Kansas Corporation Commission developed rules to limit wastewater injection in the study area, with the rules going into full effect in July 2015. Since then, wastewater injection in the study area has dropped by almost 50 percent. There is a corresponding decrease in seismicity in 2015, but Rubinstein said it is difficult to tell how much economic shifts or new regulation contributed to that trend. "We can't fully disentangle it. But there's no question that economics plays a role here because injection started to drop off before the new rules went into place." "It certainly seems probable that regulations have an effect," he said, "but we would need to speak to individual [oil and gas] operators to determine the extent of that effect." The researchers say that fluids injected into the crystalline rock basement below the Kansas oil and gas sites increased pressures in the rock pores and reduced friction along faults to trigger these induced earthquakes. Not all wastewater injection disposal sites have earthquakes associated with them, Rubinstein noted.
In some cases, the fluid pressures may not be able to get to depths where earthquakes occur, or there may be some places in the rock basement that are more susceptible than others to fluid effects. It's also difficult to know how long these stresses might continue to produce earthquakes, he added. Seismicity related to an injection well in Youngstown, Ohio in 2013 lasted only a few weeks, while fluid injection at Rocky Mountain Arsenal in Colorado stopped in 1966 but significant seismicity continued in the area until the 1980s. "It's hard to say how long it's going to last given that what we're looking at in Kansas is a much higher rate of injection than in the places where seismicity slowed quickly, and many, many more wells," said Rubinstein. "If they shut off all the injection, the decay could still take years, just because there's been such a dramatic change in the regional pressure field." Rubinstein will continue to work in Kansas to learn more about whether seismologists can consistently see foreshocks and earthquake swarms in the seismic record. "We have an incredible network there, and one of the best documented cases of induced seismicity with publicly available seismic data," he said. | Earthquakes | 2,018
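The 0.16% figure quoted in the Kansas article is a statement about how unlikely the observed jump in magnitude 4+ earthquakes would be if the background rate had never changed. The paper's exact statistical formulation is not given here; the sketch below is a much simpler Poisson-style version of that kind of test, with the event counts taken from the article:

```python
from scipy.stats import poisson

# Numbers from the article: no M4+ events in the study area from 1974
# to 2012, then six such events between roughly 2012 and 2016.
years_total = 2016 - 1974                    # whole observation window
events_total = 6
rate_per_year = events_total / years_total   # rate if nothing ever changed

# Chance of all six events landing in one 4-year window purely by luck
# under that constant rate (a far simpler test than the paper's own
# formulation, which reports 0.16%).
window_years = 4
expected = rate_per_year * window_years
p_value = poisson.sf(events_total - 1, expected)  # P(N >= 6)
print(f"P(>=6 events in {window_years} yr at constant rate) = {p_value:.1e}")
```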
February 14, 2018 | https://www.sciencedaily.com/releases/2018/02/180214150155.htm | Analysis of major earthquakes supports stress reduction assumptions | A comprehensive analysis of 101 major earthquakes around the Pacific ring of fire between 1990 and 2016 shows that most of the aftershock activity occurred on the margins of the areas where the faults slipped a lot during the main earthquakes. The findings support the idea that the area of large slip during a major earthquake is unlikely to rupture again for a substantial time. | The idea that earthquakes relieve stress on faults in the Earth's crust makes intuitive sense and underlies the common assumption that the portion of a fault that has just experienced an earthquake is relatively safe for some time. But not all studies have supported this, according to Thorne Lay, professor of Earth and planetary sciences at UC Santa Cruz. "This intuition has been challenged by statistical treatments of seismic data that indicate that, based on the clustering of earthquakes in space and time, the area that has just slipped is actually more likely to have another failure," Lay said. "The truth appears to be more nuanced. Yes, the area that slipped a lot is unlikely to slip again, as the residual stress on the fault has been lowered to well below the failure level, but the surrounding areas have been pushed toward failure in many cases, giving rise to aftershocks and the possibility of an adjacent large rupture sooner rather than later." In the new study, published February 14 in "This produces a halo of aftershocks surrounding the rupture and indicates that the large-slip zone is not likely to have immediate rerupture," Lay said. These findings indicate that the stress reduction during a major earthquake is large and pervasive over the ruptured surface of the fault. Stress will eventually build up again on that portion of the fault through frictional resistance to the gradual motions of the tectonic plates of Earth's crust, but that's a very slow process. Although immediate rerupture of the large-slip zone is unlikely, regional clustering of earthquakes is likely to occur due to the increased stress outside the main slip zone. The findings also suggest that if unusually intense aftershock activity is observed within the high-slip zone, a larger earthquake in the immediate vicinity of the first event might still be possible. The authors noted that earthquake sequences are highly complex and involve variable amounts of slip and stress reduction. | Earthquakes | 2,018
February 13, 2018 | https://www.sciencedaily.com/releases/2018/02/180213145820.htm | Earthquakes continue for years after gas field wastewater injection stops, study finds | Efforts to stop human-caused earthquakes by shutting down wastewater injection wells that serve adjacent oil and gas fields may oversimplify the challenge, according to a new study from seismologists at Southern Methodist University, Dallas. | The seismologists analyzed a sequence of earthquakes at Dallas Fort Worth International Airport and found that even though wastewater injection was halted after a year, the earthquakes continued. The sequence of quakes began in 2008, and wastewater injection was halted in 2009. But earthquakes continued for at least seven more years. "This tells us that high-volume injection, even if it's just for a short time, when it's near a critically stressed fault, can induce long-lasting seismicity," said SMU seismologist Paul O. Ogwari, who developed a unique method of data analysis that yielded the study results. The earthquakes may be continuing even now, said Ogwari, whose analysis extended through 2015. The study's findings indicate that shutting down injection wells in reaction to earthquakes, as some states such as Oklahoma and Arkansas are doing, may not have the desired effect of immediately stopping further earthquakes, said seismologist Heather DeShon, a co-author on the study and an associate professor in the SMU Earth Sciences Department. "The DFW earthquake sequence began on Halloween in 2008 -- before Oklahoma seismicity rates had notably increased," said DeShon. "This study revisits what was technically the very first modern induced earthquake sequence in this region and shows that even though the wastewater injector in this case had been shut off very quickly, the injection activity still perturbed the fault, so that generated earthquakes even seven years later." That phenomenon is not unheard of. Seismologists saw that type of earthquake response from a rash of human-induced earthquakes in Colorado after wastewater injection during the 1960s at the Rocky Mountain Arsenal near Denver. Similarly in that case, injection was started and stopped, but earthquakes continued. Such a possibility has not been well understood outside scientific circles, said DeShon. She is a member of the SMU seismology team that has studied and published extensively on their scientific findings related to the unusual spate of human-induced earthquakes in North Texas. "The perception is that if the oil and gas wastewater injectors are leading to this, then you should just shut the injection wells down," DeShon said. "But Paul's study shows that there's a lot to be learned about the physics of the process, and by monitoring continuously for years." Ogwari, DeShon and fellow SMU seismologist Matthew J. Hornbach reported the findings in the peer-reviewed The DFW Airport's unprecedented earthquake clusters were the first ever documented in the history of the North Texas region's oil-rich geological system known as the Fort Worth Basin. The quakes are also the first of multiple sequences in the basin tied to large-scale subsurface disposal of waste fluids from oil and gas operations. The DFW Airport earthquakes began in 2008, as did high-volume wastewater injection of brine. Most of the seismic activity occurred in the first two months after injection began, primarily within 0.62 miles, or 1 kilometer, from the well. Other clusters then migrated further to the northeast of the well over the next seven years.
The quakes were triggered on a pre-existing regional fault that trends 3.7 miles, or 6 kilometers, northeast to southwest. Ogwari, a post-doctoral researcher in the SMU Roy M. Huffington Earth Sciences Department, analyzed years of existing seismic data from the region to take a deeper look at the DFW Airport sequence, which totaled 412 earthquakes through 2015. Looking at the data for those quakes, Ogwari discovered that they had continued for at least seven years into 2015 along 80% of the fault, even though injection was stopped after only 11 months in August of 2009. In another important finding from the study, Ogwari found that the magnitude of the DFW Airport earthquakes didn't lessen over time, but instead held steady. Magnitude ranged from 0.5 to 3.4, with the largest one occurring three years after injection at the well was stopped. "What we've seen here is that the magnitude is consistent over time within the fault," Ogwari said. "We expect to see the bigger events during injection or immediately after injection, followed by abrupt decay. But instead we're seeing the fault continue to produce earthquakes with similar magnitudes that we saw during injection." While the rate of earthquakes declined -- there were 23 events a month from 2008 to 2009, but only 1 event a month after May 2010 -- the magnitude stayed the same. That indicates the fault doesn't heal completely. "We don't know why that is," Ogwari said. "I think that's a question that is out there and may need more research." Answering that question, and others, about the complex characteristics and behavior of faults and earthquakes, requires more extensive monitoring than is currently possible given the funding allotted to monitor quakes. Monitoring the faults involves strategically placed stations that "listen" and record waves of intense energy echoing through the ground, DeShon said. The Fort Worth Basin includes the Barnett shale, a major gas producing geological formation, atop the deep Ellenberger formation used for wastewater storage, which overlays a granite basement layer. The ancient Airport fault system extends through all units. Friction prevented the fault from slipping for millions of years, but in 2008 high volumes of injected wastewater disturbed the Airport fault. That caused the fault to slip, releasing stored-up energy in waves. The most powerful waves were "felt" as the earth shaking. "The detailed physical equations relating wastewater processes to fault processes is still a bit of a question," DeShon said. "But generally the favored hypothesis is that the injected fluid changes the pressure enough to change the ratio of the downward stress to the horizontal stresses, which allows the fault to slip." Earthquakes in North Texas were unheard of until 2008, so when they began to be felt, seismologists scrambled to install monitors. When the quakes died down, the monitoring stations were removed. "As it stands now, we miss the beginning of the quakes. The monitors are removed when the earthquakes stop being felt," DeShon said. "But this study tells us that there's more to it than the 'felt' earthquakes. We need to know how the sequences start, and also how they end. If we're ever going to understand what's happening, we need the beginning, the middle -- and the end.
Not just the middle, after they are felt." Monitors the SMU team installed at the DFW Airport were removed when seismic activity appeared to have died down in 2009. Ogwari hypothesized he could look at historical data from distant monitoring stations still in place to extract information and document the history of the DFW Airport earthquakes. The distant stations are a part of the U.S. permanent network monitored and maintained by the U.S. Geological Survey. The nearest one is 152 miles, 245 kilometers, away. Earthquake waveforms, like human fingerprints, are unique. Ogwari used the local station monitoring data to train software to identify DFW earthquakes on the distant stations. Ogwari took each earthquake's digital fingerprint and searched through years of data, cross-correlating waveforms from both the near and regional stations and identified the 412 DFW Airport events. "The earthquakes are small, less than magnitude three," DeShon said. "So on the really distant stations it's like searching for a needle in a haystack, sifting them from all the other tiny earthquakes happening all across the United States." Each path is unique for every earthquake, and seismologists record each wave's movement up and down, north to south, and east to west. From that Ogwari analyzed the evolution of seismicity on the DFW airport fault over space and time. He was able to look at data from the distant monitors and find seismic activity at the airport as recent as 2015. "Earthquakes occurring close in space usually have a higher degree of similarity," Ogwari said. "As the separation distance increases the similarity decreases." To understand the stress on the fault, the researchers also modeled the location and timing of the pressure in the pores of the rock as the injected water infiltrated. For the various earthquake clusters, the researchers found that pore pressure increased along the fault at varying rates, depending on how far the clusters were from the injection well, the rate and timing of injection, and hydraulic permeability of the fault. The analysis showed pore-pressure changes to the fault from the injection well where the earthquakes started in 2008; at the location of the May 2010 quakes along the fault; and at the northern edge of the seismicity. Will the DFW Airport fault continue to slip and trigger earthquakes? "We don't know," Ogwari said. "We can't tell how long it will continue. SMU and TexNet, the Texas Seismic Network, continue to monitor both the DFW Airport faults and other faults in the Basin." | Earthquakes | 2,018
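The detection method this article describes -- treating each earthquake waveform as a fingerprint and scanning years of continuous data from distant stations for matches -- is commonly implemented as normalized cross-correlation template matching. Below is a minimal single-channel sketch of that idea, not the study's actual code; a real detector would stack correlations across many stations and components.

```python
import numpy as np

def match_template(template, data, threshold=0.7):
    """Slide an earthquake waveform template along continuous data and
    return the sample indices where the normalized correlation
    coefficient (Pearson r, between -1 and 1) exceeds `threshold`."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    scores = np.empty(len(data) - n + 1)
    for i in range(len(scores)):
        window = data[i:i + n]
        sd = window.std()
        scores[i] = 0.0 if sd == 0 else np.dot(t, (window - window.mean()) / sd)
    return np.flatnonzero(scores > threshold), scores

# Toy demonstration: bury a scaled copy of a "master event" in noise
# and recover it, the way a repeating earthquake would be found.
rng = np.random.default_rng(1)
template = rng.standard_normal(200)
data = 0.5 * rng.standard_normal(10_000)
data[6_000:6_200] += 0.8 * template
hits, _ = match_template(template, data)
print(hits)  # indices clustered near sample 6000
```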
February 12, 2018 | https://www.sciencedaily.com/releases/2018/02/180212125801.htm | Acoustic imaging reveals hidden features of megathrust fault off Costa Rica | Geophysicists have obtained detailed three-dimensional images of a dangerous megathrust fault west of Costa Rica where two plates of the Earth's crust collide. The images reveal features of the fault surface, including long grooves or corrugations, that may determine how the fault will slip in an earthquake. | The study, published February 12 in "Our new imagery shows large variability in the conditions along the megathrust, which may be linked to a number of earthquake phenomena we observe in the region," Edwards said. Megathrusts, the huge continuous faults found in subduction zones, are responsible for Earth's largest earthquakes. Megathrust earthquakes can generate destructive tsunamis and are a serious hazard facing communities located near subduction zones. Understanding the mechanisms at work along these faults is vital for disaster management around the globe. Edwards worked with a team of geophysicists at UC Santa Cruz, the U.S. Geological Survey, the University of Texas-Austin, and McGill University to obtain 3-dimensional imagery of the fault interface using cutting-edge acoustic imaging technology. The long grooves, or corrugations, they observed along the interface are similar in size to those found along the base of fast-flowing glaciers and along some ocean ridges. The images also showed varying amounts of smoothness and corrugations on different portions of the fault. "This study produced an unprecedented view of the megathrust. Such 3-D information is critical to our ability to better understand megathrust faults and associated hazards worldwide," said coauthor Jared Kluesner, a geophysicist at the USGS in Santa Cruz. The acoustic dataset was collected in spring 2011 on the academic research vessel Marcus G. Langseth. The ship towed an array of underwater microphones and sound sources behind it as it made a series of overlapping loops over the area of the fault. The data were processed over the next 2 years and have since been used in a number of studies looking at different aspects of the subduction zone process. This particular study focused on the interface between the sliding plates, which serves as a record of slip and slip processes. "The 3-D site selection was really good and the resulting acoustic dataset showed extraordinary detail," said Edwards, noting that coauthor Emily Brodsky, professor of Earth and planetary sciences, was the first to recognize the corrugations. Such features had been observed in exposed faults on land, but never before in a fault deep beneath the surface. "I had an early rendition of the interface that vaguely showed long grooves, and during my qualifying exam, Brodsky saw them and asked, 'are those corrugations!?' I didn't know, but I knew they were real features. Slip-derived corrugations was a really neat hypothesis, and we dug into it after that," he said. The area in this study had long been a target for drilling into the megathrust by the Costa Rica Seismic Project (CRISP). Coauthor Eli Silver, professor emeritus of Earth and planetary sciences at UC Santa Cruz, and others in the CRISP program decided to pursue a 3-D seismic study, which must precede any deep drilling, and the project was funded by the National Science Foundation in 2009.
"At present, two drilling expeditions have been accomplished with shallower targets, and, though not yet scheduled, we are hopeful that the deep drilling will occur," Silver said.Researchers hope to use similar imaging techniques on other subduction zones, such as the Cascadia margin along the northern U.S. west coast, where there is a long history of large megathrust earthquakes and related tsunamis. "Conducting this type of 3-D study along the Cascadia margin could provide us with key information along the megathrust, a plate boundary that poses a substantial hazard risk to the U.S. west coast," Kluesner said. | Earthquakes | 2,018 |
February 9, 2018 | https://www.sciencedaily.com/releases/2018/02/180209100713.htm | Giant lava dome confirmed in Japan's Kikai Caldera | Since the Kobe Ocean Bottom Exploration Center (KOBEC) was established in 2015, the Center has carried out three survey voyages to the Kikai Caldera, south of Japan's main islands. Based on these voyages, researchers have confirmed that a giant lava dome was created after the caldera-forming supereruption 7300 years ago. The dome is in the world's largest class of post-caldera volcano, with a volume of over 32 cubic kilometers. The composition of this lava dome is different from the magma that caused the giant caldera to erupt -- it shows the same chemical characteristics as the current post-caldera volcano on the nearby Satsuma Iwo-jima Island. It is possible that currently a giant magma buildup may exist under the Kikai Caldera. | These findings were published in the online edition of | There is roughly a 1% chance of a giant caldera-forming eruption occurring within the Japanese archipelago during the next 100 years. An eruption like this would see over 40 cubic kilometers of magma released in one burst, causing enormous damage. The mechanism behind this and how to predict this event are urgent questions. Researchers equipped training ship Fukae Maru, part of the Kobe University Graduate School of Maritime Sciences, with the latest observation equipment to survey the Kikai Caldera. They chose this volcano for two main reasons. Firstly, for land-based volcanoes it is hard to carry out large-scale observations using artificial earthquakes because of the population density, and it is also difficult to detect giant magma buildups with precise visualization because they are often at relatively low depths (roughly 10km). Secondly, the Kikai Caldera caused the most recent giant caldera-forming eruption in the Japanese archipelago (7300 years ago), and there is a high possibility that a large buildup of magma may exist inside it. During the three survey voyages, KOBEC carried out detailed underwater geological surveys, seismic reflection, observations by underwater robots, samples and analysis of rocks, and observations using underwater seismographs and electromagnetometers. In their upcoming March 2018 voyage, researchers plan to use seismic reflection and underwater robots to clarify the formation process of the double caldera revealed in previous surveys and the mechanism that causes a giant caldera eruption. They will also use seismic and electromagnetic methods to determine the existence of a giant magma buildup, and in collaboration with the Japan Agency for Marine-Earth Science and Technology will carry out a large-scale underground survey, attempting to capture high-resolution visualizations of the magma system within the Earth's crust (at a depth of approximately 30km). Based on results from these surveys, the team plans to continue monitoring and aims to pioneer a method for predicting giant caldera-forming eruptions. Formation of metallic ore deposits is predicted to accompany the underwater hydrothermal activity, so the team also plan to evaluate these undersea resources. 1. Caldera: a depression in the land formed when a volcano erupts. 2. Giant caldera-forming eruption: an eruption that releases a large amount of magma (>40 km³). 3. Seismic reflection survey: causing an artificial earthquake with air guns or similar, receiving the seismic waves that have reflected or refracted below ground, and estimating the subsurface structure. | Earthquakes | 2,018
February 8, 2018 | https://www.sciencedaily.com/releases/2018/02/180208145149.htm | Earthquake simulations of California's Hayward fault | In the next 30 years, there is a one-in-three chance that California's Hayward fault will rupture with a 6.7 magnitude or higher earthquake, according to the United States Geologic Survey (USGS). Such an earthquake will cause widespread damage to structures, transportation and utilities, as well as economic and social disruption in the eastern region of the San Francisco Bay Area (East Bay). | Lawrence Livermore and Lawrence Berkeley national laboratory scientists have used some of the world's most powerful supercomputers to model ground shaking for a magnitude (M) 7.0 earthquake on the Hayward fault and show more realistic motions than ever before. The research appears in Past simulations resolved ground motions from low frequencies up to 0.5-1 Hertz (vibrations per second). The new simulations are resolved up to 4-5 Hertz (Hz), representing a four to eight times increase in the resolved frequencies. Motions with these frequencies can be used to evaluate how buildings respond to shaking. The simulations rely on the LLNL-developed SW4 seismic simulation program and the current best representation of the three-dimensional (3D) earth (geology and surface topography from the USGS) to compute seismic wave ground shaking throughout the San Francisco Bay Area. Importantly, the results are, on average, consistent with models based on actual recorded earthquake motions from around the world. "This study shows that powerful supercomputing can be used to calculate earthquake shaking on a large, regional scale with more realism than we've ever been able to produce before," said Artie Rodgers, LLNL seismologist and lead author of the paper. The Hayward fault is a major strike-slip fault on the eastern side of the Bay Area. This fault is capable of M 7 earthquakes and presents significant ground motion hazard to the heavily populated East Bay, including the cities of Oakland, Berkeley, Hayward and Fremont. The last major rupture occurred in 1868 with an M 6.8-7.0 event. Instrumental observations of this earthquake were not available at the time, however historical reports from the few thousand people who lived in the East Bay at the time indicate major damage to structures. The recent study reports ground motions simulated for a so-called scenario earthquake, one of many possibilities. "We're not expecting to forecast the specifics of shaking from a future M 7 Hayward fault earthquake, but this study demonstrates that fully deterministic 3D simulations with frequencies up to 4 Hz are now possible. We get good agreement with ground motion models derived from actual recordings and we can investigate the impact of source, path and site effects on ground motions," Rodgers said. As these simulations become easier with improvements in SW4 and computing power, the team will sample a range of possible ruptures and investigate how motions vary. The team also is working on improvements to SW4 that will enable simulations to 8-10 Hz for even more realistic motions. For residents of the East Bay, the simulations specifically show stronger ground motions on the eastern side of the fault (Orinda, Moraga) compared to the western side (Berkeley, Oakland). This results from different geologic materials -- deep weaker sedimentary rocks that form the East Bay Hills. Evaluation and improvement of the current 3D earth model is the subject of current research, for example using the Jan.
4, 2018 M 4.4 Berkeley earthquake that was widely felt around the northern Hayward fault. Ground motion simulations of large earthquakes are gaining acceptance as computational methods improve, computing resources become more powerful and representations of 3D earth structure and earthquake sources become more realistic. Rodgers adds: "It's essential to demonstrate that high-performance computing simulations can generate realistic results and our team will work with engineers to evaluate the computed motions, so they can be used to understand the resulting distribution of risk to infrastructure and ultimately to design safer energy systems, buildings and other infrastructure." Other Livermore authors include seismologist Arben Pitarka, mathematicians Anders Petersson and Bjorn Sjogreen, along with project leader and structural engineer David McCallen of the University of California Office of the President and LBNL. This work is part of the DOE's Exascale Computing Project (ECP). The ECP is focused on accelerating the delivery of a capable exascale computing ecosystem that delivers 50 times more computational science and data analytic application power than possible with DOE HPC systems such as Titan (ORNL) and Sequoia (LLNL), with the goal to launch a U.S. exascale ecosystem by 2021. The ECP is a collaborative effort of two Department of Energy organizations -- the DOE Office of Science and the National Nuclear Security Administration. Simulations were performed using a Computing Grand Challenge allocation on the Quartz supercomputer at LLNL and with an Exascale Computing Project allocation on Cori Phase-2 at the National Energy Research Scientific Computing Center (NERSC) at LBNL. | Earthquakes | 2,018
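The jump from 0.5-1 Hz to 4-5 Hz described in this article is computationally expensive because finite-difference codes such as SW4 need several grid points per shortest wavelength, and that wavelength shrinks as frequency rises. The sketch below shows the standard back-of-envelope scaling; the minimum shear-wave speed and points-per-wavelength values are illustrative assumptions, not numbers from the paper.

```python
def required_grid_spacing(v_min_mps, f_max_hz, points_per_wavelength=8.0):
    """Grid spacing h (metres) needed so that the shortest wavelength,
    v_min / f_max, is sampled by `points_per_wavelength` grid nodes."""
    return v_min_mps / (f_max_hz * points_per_wavelength)

def relative_cost(f_low_hz, f_high_hz):
    """A 3D grid grows as (1/h)^3 and the stable time step shrinks
    with h, so total work grows roughly as (f_high / f_low)^4."""
    return (f_high_hz / f_low_hz) ** 4

v_min = 500.0  # assumed slowest near-surface shear-wave speed, m/s
for f_max in (1.0, 5.0):
    h = required_grid_spacing(v_min, f_max)
    print(f"{f_max:.0f} Hz -> grid spacing of roughly {h:.1f} m")
print(f"cost ratio, 1 Hz vs 5 Hz: about {relative_cost(1.0, 5.0):.0f}x")
```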
February 8, 2018 | https://www.sciencedaily.com/releases/2018/02/180208141415.htm | New map profiles induced earthquake risk for West Texas | Stanford geophysicists have developed a detailed map of the stresses that act in the Earth throughout the Permian Basin in West Texas and southeastern New Mexico, highlighting areas of the oil-rich region that could be at greater risk for future earthquakes induced by production operations. | The new study, published this month in the journal Previous Stanford research has shown that wastewater injected as a step in hydraulic fracturing (fracking) underlies an increase in seismic activity in parts of the central and eastern U.S., particularly in Oklahoma, starting in 2005. While none of these small-to-moderate earthquakes has yet caused significant property damage or injury, they represent an increased probability of larger earthquakes. Now, Texas is poised to take center stage as the Permian Basin is becoming the country's most important oil- and gas-producing region. In the 1920s, energy companies began extracting the basin's bountiful petroleum deposits during a boom that lasted decades. More recently, the advance of hydraulic fracturing techniques has spurred a new development frenzy. Hundreds of thousands of wells could be drilled in the region in the next few decades. "We want to get out ahead of the problem in Texas," said study co-author Mark Zoback, the Benjamin M. Page Professor of Geophysics in Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth), who led a number of the Stanford studies in Oklahoma. "We want to stop fluid injection from triggering even small earthquakes in Texas so that the probability of larger earthquakes is significantly reduced." To gauge the risk of future quakes, researchers must first understand the direction of the stresses in a region and their approximate magnitude. When the stress field aligns with a pre-existing fault in a certain manner, the fault can slip, potentially producing an earthquake. In regions such as the central and eastern U.S., far from tectonic plate boundaries such as the San Andreas Fault, this slippage occurs as a natural process, but very rarely. But increasing fluid pressure at depth reduces the friction along the fault, sometimes triggering an earthquake. "Fluid injection can cause a quake on a fault that might not produce a natural earthquake for thousands of years from now," said study lead author Jens-Erik Lund Snee, a PhD student in the Department of Geophysics at Stanford Earth. In a previous study, Zoback and postdoctoral scholar Cornelius Langenbruch found that in Oklahoma, fluid injection caused about 6,000 years of natural earthquakes to occur in about five years. Building on previous efforts to create maps of stress and seismic potential in the Permian Basin, the Stanford researchers added hundreds of new data points from West Texas and southeastern New Mexico, much of the data being provided by the oil and gas industry. Their findings paint a complicated picture of the Permian Basin, which features some relatively consistent horizontal stress areas along with others that show dramatic directional rotations. "We were surprised to see such high variability," said Lund Snee. "It raises a lot of questions about how you can have rotations like that in the middle of a continental plate, far from a plate boundary." "This is one of the most interesting stress fields I've ever seen," Zoback said.
"While the stress field in this region is surprisingly complex, the data is excellent and having documented what it is, we can now take action on this information and try to prevent the Permian Basin from becoming Oklahoma 2.0."The Stanford researchers said the new stress map provides oil companies with detailed quantitative data to inform decisions on more effective drilling operations in the Permian Basin. "This is the most complete picture of stress orientation and relative magnitude that they've ever had," Zoback said. "They can use these data every day in deciding the best direction to drill and how to carry out optimal hydraulic fracturing operations."Future studies will focus on improving knowledge of fault lines in the region and gaining a better understanding of fluid pressure, specifically how the amount of water injection (both now and in the past) has impacted the geological mechanisms at work in the area."There is the potential for a lot of earthquakes in this area," said Lund Snee. "We want to understand what's causing them and provide companies with the tools to avoid triggering them."Zoback is also a senior fellow at the Stanford Precourt Institute for Energy, co-director of the Stanford Center for Induced and Triggered Seismicity and director of the Stanford Natural Gas Initiative.The study was supported by the Stanford Center for Induced and Triggered Seismicity, an industrial affiliates program that studies scientific and operational issues associated with triggered and induced earthquakes. | Earthquakes | 2,018 |
February 7, 2018 | https://www.sciencedaily.com/releases/2018/02/180207142713.htm | A one-two punch may have helped deck the dinosaurs | The debate goes on: What killed off the dinosaurs? | New University of Oregon research has identified gravity-related fluctuations dating to 66 million years ago along deep ocean ridges that point to a "one-two punch" from the big meteor that struck off Mexico's Yucatan peninsula, possibly triggering a worldwide release of volcanic magma that could have helped seal the dinosaurs' fate. "We found evidence for a previously unknown period of globally heightened volcanic activity during the mass-extinction event," said former UO doctoral student Joseph Byrnes. The study by Byrnes and Leif Karlstrom, a professor in the UO's Department of Earth Sciences, was published Feb. 7 in The findings of the UO's National Science Foundation-supported study, Karlstrom said, point to a pulse of accelerated worldwide volcanic activity that includes enhanced eruptions at India's Deccan Traps after the Chicxulub impact. The Deccan Traps, in west-central India, formed during a period of massive eruptions that poured out layers of molten rock thousands of feet deep, creating one of the largest volcanic features on Earth. The Deccan Traps region has been in and out of the dinosaur debate. Rare volcanic events at such a scale are known to cause catastrophic disturbances to Earth's climate, and, when they occur, they are often linked to mass extinctions. Huge volcanic events can eject so much ash and gas into the atmosphere that few plants survive, disrupting the food chain and causing animals to go extinct. Since evidence of the meteor strike near present-day Chicxulub first emerged, researchers have debated the relative roles of the impact and the Deccan Traps eruptions in the extinction. Progressively improving dating methods indicate that the Deccan Traps volcanoes already were active when the meteor struck. Resulting seismic waves moving through the planet from the meteor strike, Karlstrom said, probably fueled an acceleration of those eruptions. "Our work suggests a connection between these exceedingly rare and catastrophic events, distributed over the entire planet," Karlstrom said. "The meteorite's impact may have influenced volcanic eruptions that were already going on, making for a one-two punch." That idea gained strength in 2015 when researchers at the University of California, Berkeley, proposed that the two events might be connected. That team, which included Karlstrom, suggested that the meteorite may have modulated distant volcanism by generating powerful seismic waves that produced shaking worldwide. Similar to the impacts that normal tectonic earthquakes sometimes have on wells and streams, Karlstrom said, the study proposed that seismic shaking liberated magma stored in the mantle beneath the Deccan Traps and caused the largest eruptions there. The new findings at the UO extend this eruption-triggering in India to ocean basins worldwide. Byrnes, now a postdoctoral researcher at the University of Minnesota, analyzed publicly available global data sets on free-air gravity, ocean floor topography and tectonic spreading rates. In his analyses, he divided the seafloor into 1-million-year age groupings, constructing a record back to 100 million years ago. At about 66 million years, he found evidence for a "short-lived pulse of marine magmatism" along ancient ocean ridges.
This pulse is suggested by a spike in the rate of occurrence of free-air gravity anomalies seen in the data set. Free-air gravity anomalies, measured in tiny increments called milligals, account for variations in gravitational acceleration, found from satellite measurements of additional seawater collecting where the Earth's gravity is stronger. Byrnes found changes in free-air gravity anomalies of between five and 20 milligals associated with seafloor created in the first million years after the meteor impact. | Earthquakes | 2,018
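Byrnes's analysis as described here -- grouping seafloor by crustal age in 1-million-year bins and looking for a spike in gravity-anomaly variability near 66 million years -- amounts to a binned statistic over two co-registered data sets (seafloor age and free-air gravity). Below is a schematic sketch with made-up arrays; the real work used global satellite-gravity and seafloor-age grids.

```python
import numpy as np

# Stand-in samples: crustal age (Ma) and free-air gravity anomaly (mGal)
# at the same seafloor locations, with an artificial "pulse" injected
# into crust that formed around 66 Ma.
rng = np.random.default_rng(2)
age_ma = rng.uniform(0.0, 100.0, size=50_000)
gravity_mgal = rng.normal(0.0, 10.0, size=50_000)
pulse = np.abs(age_ma - 66.0) < 0.5
gravity_mgal[pulse] += rng.normal(0.0, 12.0, size=pulse.sum())

# One-million-year age groupings, as in the article.
edges = np.arange(0.0, 101.0, 1.0)
bin_index = np.digitize(age_ma, edges) - 1
roughness = np.array([gravity_mgal[bin_index == b].std()
                      for b in range(len(edges) - 1)])
peak = int(np.argmax(roughness))
print(f"roughest gravity bin: {edges[peak]:.0f}-{edges[peak] + 1:.0f} Ma")
```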
February 6, 2018 | https://www.sciencedaily.com/releases/2018/02/180206121024.htm | September 2017 earthquakes highlight successes of Mexico's early warning system | Mexico's earthquake early warning system gave Mexico City's residents almost two minutes of warning prior to the arrival of strong seismic waves from the September 7, 2017 Tehuantepec earthquake centered off the southern coast of Mexico, according to a report in the journal | The magnitude 8.2 earthquake is the largest earthquake detected by the alert system, known as SASMEX, since it began operations in 1993. SASMEX also sent an alert for the magnitude 7.1 Morelos earthquake that occurred on September 19, but the epicenter of the Morelos earthquake was much closer to Mexico City, allowing only a few seconds of warning prior to strong shaking. The alerts highlighted how some recent improvements to the system may help decrease the time needed to receive, detect and broadcast the alerts, but they also point to places where the system can improve in the future, said Gerardo Suárez, a researcher at the Universidad Nacional Autónoma de México (UNAM). SASMEX's main focus is on earthquakes originating in the subduction zone off the southern coast of Mexico, where the Cocos tectonic plate subducts below the North American Plate. SASMEX tracks seismicity in the subduction zone through 97 seismic monitoring stations. The system also includes strong motion instruments placed further inland, closer to Mexico City, that monitor "in slab" seismicity; that is, earthquakes within the part of the Cocos plate that is already buried beneath the North American Plate. The Tehuantepec earthquake originated offshore in the subduction zone, while the Morelos earthquake was an example of in-slab seismicity. The monitoring systems communicate with data collection centers in cities receiving SASMEX alerts, through a redundant system of satellite links, Internet links and radio, in case any of these communication lines falls silent during an earthquake. The alerts to the public -- a siren sound -- are then sent through radio receivers, television and radio stations in several cities (including Mexico City, Oaxaca City, and Acapulco) that have subscribed to receive the alerts, and by municipal loudspeakers in Mexico City. Since 1993, the network has recorded 6,896 earthquakes and issued 158 seismic early warnings. After the September earthquakes, the Mexico City government launched a cell phone app called 911 CDMX that could someday be included as part of the alert network. "My major concern is whether the cell phone system in Mexico is capable of issuing a timely alert to the thousands of subscribers without delays," said Suárez, explaining that the app is still at an experimental stage. Like all earthquake early warning systems, SASMEX uses algorithms to process incoming seismic waves from earthquakes to determine the magnitude of the earthquake.
Researchers at Centro de Instrumentación y Registro Sísmico (CIRES) have recently begun testing a new algorithm that would reduce the processing time needed to detect and decode seismic data, to deliver alerts faster, said Suárez. The tests of the new algorithm show, he noted, that SASMEX would have been able to speed up its alert for the Morelos earthquake, "giving about eight to ten seconds of warning in Mexico City." Even with faster alerts in the future, Suárez cautions that even the best earthquake early warning systems "give very little time of opportunity for people and government to react." "To me, this shows that we should not be enamored of the technology and simply install seismic early warning systems without thinking of the social issues. Any early warning system should have a clearly thought-out strategy as to who will be warned," he said. "After more than 25 years, the Mexican seismic early warning system is still lacking these procedures that I believe should be a governmental responsibility." | Earthquakes | 2,018
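The contrast this article draws -- almost two minutes of warning for the distant Tehuantepec earthquake versus a few seconds for the nearby Morelos event -- falls out of simple travel-time arithmetic: damaging S waves cross the epicentral distance at a few kilometres per second, while detection, processing and broadcasting consume a fixed overhead. A back-of-envelope sketch follows; the wave speeds, station distance and overhead are rough assumptions, not SASMEX's internal parameters.

```python
def warning_time_s(epicentral_km, station_km=50.0,
                   p_speed_kms=6.5, s_speed_kms=3.5, overhead_s=10.0):
    """Approximate seconds between the alert and the S-wave arrival:
    S travel time to the city, minus P travel time to the nearest
    sensor, minus detection/processing/broadcast overhead."""
    s_arrival = epicentral_km / s_speed_kms
    alert_issued = station_km / p_speed_kms + overhead_s
    return s_arrival - alert_issued

# Rough epicentral distances to Mexico City: Tehuantepec ~700 km,
# Morelos ~120 km (illustrative values only).
print(f"Tehuantepec-like event: ~{warning_time_s(700.0):.0f} s of warning")
print(f"Morelos-like event:     ~{warning_time_s(120.0):.0f} s of warning")
```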
February 6, 2018 | https://www.sciencedaily.com/releases/2018/02/180206121020.htm | Satellite-based earthquake early warning system tested against Chilean great quakes | Researchers testing a satellite-based earthquake early warning system developed for the U.S. West Coast found that the system performed well in a "replay" of three large earthquakes that occurred in Chile between 2010 and 2015. Their results, reported in the journal | The early warning module, called G-FAST, uses ground motion data measured by Global Navigation Satellite Systems (GNSS) to estimate the magnitude and epicenter for large earthquakes -- those magnitude 8 and greater. These great quakes often take place at subducting tectonic plate boundaries, where one plate thrusts beneath another plate, as is the case off the coast of Chile and the U.S. Pacific Northwest. Using data collected by Chile's more than 150 GNSS stations, Brendan Crowell of the University of Washington and his colleagues tested G-FAST's performance against three large megathrust earthquakes in the country: the 2010 magnitude 8.8 Maule, the 2014 magnitude 8.2 Iquique, and the 2015 magnitude 8.3 Illapel earthquakes. G-FAST was able to provide magnitude estimates between 40 and 60 seconds after the origin time of all three quakes, with estimates that were within 0.3 units of the known magnitudes. The system also provided estimates of the epicenter and fault slip for each earthquake that agreed with the actual measurements, and were available 60 to 90 seconds after each earthquake's origin time. "We were surprised at how fast G-FAST was able to converge to the correct answers and how accurately we were able to characterize all three earthquakes," said Crowell. Most earthquake early warning systems measure properties of seismic waves to quickly characterize an earthquake. These systems often cannot collect enough information to determine how a large earthquake will grow and as a result may underestimate the earthquake magnitude -- a problem that can be avoided with satellite-based systems such as G-FAST. It's difficult to test these types of early warning systems, Crowell noted, because magnitude 8+ earthquakes are relatively rare. "We decided to look at the Chilean earthquakes because they included several greater than magnitude 8 earthquakes, recorded with an excellent and consistent GNSS network. In doing so, we would be able to better categorize the strengths and weaknesses in G-FAST." The Chilean tests will play a part in further developing G-FAST for use in the U.S., where Crowell and colleagues have been working to include it in the prototype earthquake early warning system called ShakeAlert, now operating in California, Oregon and Washington. The Chilean earthquakes, Crowell said, represent about half of magnitude 8 events in the recorded catalog of earthquakes that are used to test G-FAST and other geodetic algorithms for inclusion in ShakeAlert. Ten magnitude 8 or greater earthquakes have occurred along the Chilean coast in the past 100 years, including the 1960 magnitude 9.5 Valdivia earthquake, which is the largest earthquake recorded by instruments. After the 2010 Maule earthquake, the country began installing a network of digital broadband seismic and ground motion stations, Global Positioning System stations, and GNSS stations to provide accurate information for tsunami warnings and damage assessment.
Since 2012, the Centro Sismológico Nacional at the Universidad de Chile has operated more than 100 stations, and has recently begun to operate almost 300 strong-motion accelerometers that measure ground shaking. In a second paper published in | Earthquakes | 2,018
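GNSS-based modules such as G-FAST avoid the magnitude saturation mentioned above because peak ground displacement (PGD) keeps growing with earthquake size. Published PGD studies regress log-PGD against magnitude and distance; the sketch below inverts a scaling law of that general form for magnitude. The coefficients are of the size reported in the literature but should be treated as illustrative rather than as G-FAST's exact values.

```python
import math

# Illustrative PGD scaling law:
#   log10(PGD_cm) = A + B * Mw + C * Mw * log10(R_km)
A, B, C = -4.434, 1.047, -0.138  # assumed, literature-sized coefficients

def magnitude_from_pgd(pgd_cm, hypocentral_km):
    """Invert the scaling law for moment magnitude, given the peak
    ground displacement (cm) at one GNSS station and its hypocentral
    distance (km). Real systems average estimates over many stations."""
    log_r = math.log10(hypocentral_km)
    return (math.log10(pgd_cm) - A) / (B + C * log_r)

# A station 100 km from the source recording 40 cm of peak displacement:
print(f"Mw ~ {magnitude_from_pgd(40.0, 100.0):.1f}")
```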
February 1, 2018 | https://www.sciencedaily.com/releases/2018/02/180201141519.htm | Oklahoma's earthquakes strongly linked to wastewater injection depth | Human-made earthquakes in Oklahoma, USA, are strongly linked to the depth at which wastewater from the oil and gas industry is injected into the ground, according to a new study led by the University of Bristol. | Oklahoma has been a seismic hotspot for the past decade, with the number of damaging earthquakes -- including the magnitude 5.8 Pawnee earthquake in 2016 -- regularly impacting on the lives of residents, leading to litigation against well operators. The human-made, or induced, earthquakes pose an increased risk to critical infrastructure such as a major commercial oil storage facility at Cushing, making them a national security threat. The connection between 'seismicity' -- the frequency of earthquakes -- and deep fluid injection into underground rock formations is well established, but scientists, policymakers, and the oil and gas industry have been bewildered by the unprecedented surge in earthquake activity. At its peak, the annual number of earthquakes in Oklahoma had increased approximately 800-fold compared with the years before 2011. Oklahoma's well operators have injected on average 2.3 billion barrels of fluids per year into the ground since 2011. Wastewater is typically disposed of at depths of one to two km below the ground surface, well below the level of fresh ground water supplies. Also, saltwater is injected deep underground to enable recovery of oil and gas. Now a major study by the University of Bristol and involving the University of Southampton, Delft University of Technology and Resources for the Future, published today in the journal Lead author of the study, Dr Thea Hincks, Senior Research Associate at the University of Bristol's School of Earth Sciences, said: "Our new modelling framework provides a targeted, evidential basis for managing a substantial reduction in induced seismicity in Oklahoma, with extensive possibilities for application elsewhere in the world. This marks a step forward in understanding the evolution of seismicity in the Oklahoma region." Using a powerful computer model incorporating injection well records and earthquake data from the US Geological Survey, the team examined the connections between injection volume, depth, and location, as well as geological features, over a six-year period. The study used innovative new software, Uninet, which was developed by co-author Professor Roger Cooke's group at Delft University of Technology and is freely available for academic users from LightTwist Software. Uninet has previously been used to develop causal risk models for the aviation industry. The team found that the joint effects of depth and volume are critical, and that injection volume becomes more influential -- and more likely to cause earthquakes -- at depths where layered sedimentary rocks meet crystalline basement rocks.
This is because deeper wells allow easier access for fluids into fractured basement rocks that are much more prone to earthquakes. Dr Tom Gernon, Associate Professor in Earth Science at the University of Southampton, and co-author on the study, said: "The underlying causes of Oklahoma's induced earthquakes are an open and complex issue, not least because there are over 10,000 injection wells, with many different operators and operating characteristics, all in an area of complex geology." "Thanks to an innovative model capable of analysing large and complex data sets, our study establishes for the first time a clear link between seismicity and fluid injection depth." The study also shows how raising injection well depths to above the basement rocks in key areas could significantly reduce the annual energy released by earthquakes -- thereby reducing the relative likelihoods of larger, damaging earthquakes. Current regulatory interventions include requiring operators to either reduce injection or raise wells above the basement, often by an unspecified amount. Professor Willy Aspinall, of the University of Bristol and Aspinall & Associates, who conceived the study, added: "This new diagnostic finding has potential implications for scientists, regulators and civil authorities concerned about induced seismicity, both in the US and internationally. The research addresses a growing need for a broader understanding of how operational, spatial and geologic parameters combine to influence induced seismic risk." "Our analysis allows regulatory actions to be evaluated on a rational, quantitative basis in terms of seismic effects." Thea Hincks and Willy Aspinall were supported in part by the CREDIBLE consortium (NERC Grant NE/J017299/1). | Earthquakes | 2,018
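The study's central result -- that injected volume matters more when wells sit near the crystalline basement -- is a statistical interaction between depth and volume. The paper itself used the Uninet Bayesian-network software, which is not reproduced here; the toy logistic model below, with entirely made-up coefficients, only illustrates how such a depth-volume interaction can be written down.

```python
import math

def p_induced_event(volume_bbl_month, depth_km, b0=-8.0,
                    b_vol=1.2e-6, b_depth=0.5, b_interact=8.0e-7):
    """Toy logistic model: the probability of associated seismicity
    rises with injected volume and with depth, and the interaction
    term makes volume count for more at greater depth. Every
    coefficient here is invented for illustration."""
    z = (b0 + b_vol * volume_bbl_month + b_depth * depth_km
         + b_interact * volume_bbl_month * depth_km)
    return 1.0 / (1.0 + math.exp(-z))

# The same monthly volume injected shallow versus near the basement:
for depth in (1.0, 2.5):
    print(f"depth {depth} km: P = {p_induced_event(2_000_000, depth):.2f}")
```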
January 30, 2018 | https://www.sciencedaily.com/releases/2018/01/180130091241.htm | Giant earthquakes: Not as random as thought | By analyzing sediment cores from Chilean lakes, an international team of scientists discovered that giant earthquakes recur at relatively regular intervals. When also taking into account smaller earthquakes, the repeat interval becomes increasingly irregular, to a level where earthquakes happen randomly in time. | "In 1960, South-Central Chile was hit by the largest known quake on earth with a magnitude of 9.5. Its tsunami was so massive that -- in addition to inundating the Chilean coastline -- it travelled across the Pacific Ocean and even killed about 200 persons in Japan," says Jasper Moernaut, an assistant professor at the University of Innsbruck, Austria, and lead author of the study. "Understanding when and where such devastating giant earthquakes may occur in the future is a crucial task for the geoscientific community." It is generally believed that giant earthquakes release so much energy that several centuries of stress accumulation are needed to produce a new big one. Therefore, seismological data or historical documents simply do not go back far enough in time to reveal the patterns of their recurrence. "It is an ongoing topic of very vivid debate whether we should model large earthquake recurrence as a quasi-regular or random process in time. Of course, the model choice has very large repercussions on how we evaluate the actual seismic hazard in Chile for the coming decades to centuries." In their recent paper in Earth and Planetary Science Letters, Moernaut's team of Belgian, Chilean and Swiss researchers presented a new approach to tackle the problem of large earthquake recurrence. By analyzing sediments on the bottom of two Chilean lakes, they recognized that each strong earthquake produces underwater landslides which get preserved in the sedimentary layers accumulating on the lake floor. By sampling these layers in up to 8 m long sediment cores, they retrieved the complete earthquake history over the last 5000 years, including up to 35 great earthquakes of a magnitude larger than 7.7. "What is truly exceptional is the fact that in one lake the underwater landslides only happen during the strongest shaking events (like a M9 earthquake), whereas the other lake also reacted to "smaller" M8 earthquakes," says Maarten Van Daele from Ghent University, Belgium. "In this way we were able to compare the patterns in which earthquakes of different magnitudes take place. We did not have to guess which model is the best, we could just derive it from our data." With this approach, the team found that giant earthquakes (like the one in 1960) reoccur every 292 ± 93 years and thus the probability for such giant events remains very low in the next 50-100 years. However, the "smaller" (~M8) earthquakes took place every 139 ± 69 years and there is a 29.5% chance that such an event may occur in the next 50 years. Since 1960, the area has been seismically very quiet, but a recent M7.6 earthquake (on 25 December 2016) near Chiloé Island suggests a reawakening of great earthquakes in South-Central Chile. "These Chilean lakes form a fantastic opportunity to study earthquake recurrence," says Moernaut. "Glacial erosion during the last Ice Age resulted in a chain of large and deep lakes above the subduction zone, where the most powerful earthquakes are getting generated. We hope to extend our approach along South America, which may allow us to discover whether e.g.
earthquakes always rupture in the same segments, or whether other areas in the country are capable of producing giant M9+ earthquakes." "In the meanwhile, we already initiated similar studies on Alaskan, Sumatran and Japanese lakes," says Marc De Batist from Ghent University. "We are looking forward to some exciting comparisons between the data from these settings, and see if the Chilean patterns hold for other areas that have experienced giant M9+ earthquakes in the past." | Earthquakes | 2,018
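The hazard numbers quoted in this article -- a 29.5% chance of a ~M8 event in the next 50 years but a very low chance of another 1960-style giant -- are conditional probabilities from a fitted recurrence distribution. A minimal sketch using a normal renewal model is below; the paper's fitted distribution will differ, so these numbers only approximate its 29.5%.

```python
from scipy.stats import norm

def p_event_within(years_ahead, elapsed_years, mean_interval, sd_interval):
    """P(next event within `years_ahead` years), given that
    `elapsed_years` have already passed since the last one, for a
    normally distributed recurrence interval."""
    dist = norm(loc=mean_interval, scale=sd_interval)
    survived = dist.sf(elapsed_years)  # P(interval > elapsed so far)
    window = dist.cdf(elapsed_years + years_ahead) - dist.cdf(elapsed_years)
    return window / survived

elapsed = 2018 - 1960  # years since the last great rupture
print(f"~M8 event in next 50 yr:   {p_event_within(50, elapsed, 139, 69):.1%}")
print(f"giant event in next 50 yr: {p_event_within(50, elapsed, 292, 93):.1%}")
```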
January 24, 2018 | https://www.sciencedaily.com/releases/2018/01/180124123127.htm | Could underwater sound waves be the key to early tsunami warnings? | Mathematicians have devised a way of calculating the size of a tsunami and its destructive force well in advance of it making landfall by measuring fast-moving underwater sound waves, opening up the possibility of a real-time early warning system. | The sound waves, known as acoustic gravity waves (AGWs), are naturally occurring and can be generated in the deep ocean after tsunami trigger events, such as underwater earthquakes. They can travel over 10 times faster than tsunamis and spread out in all directions, regardless of the trajectory of the tsunami, making them easy to pick up using standard underwater hydrophones and an ideal source of information for early warning systems. In a new study published in the More importantly, once the fault characteristics are found, calculating the tsunami amplitude and potential destructive force becomes much simpler, the researchers state. Lead author of the study Dr Usama Kadri, from Cardiff University's School of Mathematics, said: "By taking measurements of acoustic gravity waves, we basically have everything we need to set off a tsunami alarm." Underwater earthquakes are triggered by the movement of tectonic plates on the ocean floor and are the main cause of tsunamis. Tsunamis are currently detected via DART buoys -- floating devices that are able to measure pressure changes in the ocean caused by tsunamis. However, the technology relies on a tsunami physically reaching the DART buoys, which could be problematic if the buoys are close to the shoreline. The current technology also requires the distribution of a huge number of buoys in oceans all around the world, which is very costly. "Though we can currently measure earthquakes using seismic sensors, these do not tell us if tsunamis are likely to follow," Dr Kadri continued. "Using sound signals in the water, we can identify the characteristics of the earthquake fault, from which we can then calculate the characteristics of a tsunami. Since our solution is analytical, everything can be calculated in near real-time." "Our aim is to be able to set off a tsunami alarm within a few minutes from recording the sound signals in a hydrophone station." AGWs are naturally occurring sound waves that move through the deep ocean at the speed of sound and can travel thousands of metres below the surface. AGWs can measure tens or even hundreds of kilometres in length and it is thought that certain lifeforms such as plankton, which are unable to swim against a current, rely on the waves to aid their movement, enhancing their ability to find food. | Earthquakes | 2,018
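The speed advantage claimed for acoustic gravity waves is easy to check with textbook numbers: sound in seawater travels at roughly 1,500 m/s, while a tsunami in water of depth h moves at about sqrt(g*h) in the shallow-water approximation. A quick sketch of the resulting lead time follows; the depths and propagation distance are illustrative.

```python
import math

SOUND_SPEED = 1500.0  # m/s, approximate speed of sound in seawater
G = 9.81              # m/s^2, gravitational acceleration

def tsunami_speed(depth_m):
    """Shallow-water tsunami phase speed, sqrt(g * h)."""
    return math.sqrt(G * depth_m)

distance_km = 1000.0
for depth in (4000.0, 1000.0):
    v = tsunami_speed(depth)
    lead_min = distance_km * 1000.0 * (1.0 / v - 1.0 / SOUND_SPEED) / 60.0
    print(f"depth {depth:.0f} m: tsunami {v:.0f} m/s "
          f"({SOUND_SPEED / v:.1f}x slower than sound); "
          f"AGW lead over {distance_km:.0f} km ~ {lead_min:.0f} min")
```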
January 18, 2018 | https://www.sciencedaily.com/releases/2018/01/180118142706.htm | Fox Creek earthquakes linked to completion volume and location of hydraulic fracturing | The volume of hydraulic fracturing fluid and the location of well pads control the frequency and occurrence of measurable earthquakes, new Alberta Geological Survey and UAlberta research has found. | Ryan Schultz has been studying earthquakes in the Fox Creek, Alberta area since they started in December 2013. The seismologist -- who works at the Alberta Geological Survey (a branch of the Alberta Energy Regulator) and with the University of Alberta -- wanted to better understand what was causing the quakes. Schultz and his colleagues found that when increased volumes are injected at susceptible locations (i.e., locations connected to a nearby slip-ready fault), increased pressure is transmitted to the fault, leading to more numerous measurable earthquakes. It's not as simple as more volume equals more earthquakes, though -- a link that scientists have long identified in the history of induced seismicity, dating back to the 1950s. There is another factor at play in the Fox Creek area, and it's all about location, explained Schultz. "If there is a pre-existing fault, but you're not connected to it by some sort of fluid pathway, you can hydraulically fracture the formation, and you're probably not going to cause a significant earthquake," said Schultz. "It's conceptually quite simple, but actually determining those things underground is really hard to do in practice." Since 2013, there has been a marked increase in the rate of earthquakes near Fox Creek, ranging up to magnitude 4. While other research has pointed to industry activity as contributing to the quakes, this study is the first to identify specific factors causing the seismic activity. Schultz said the next step for the scientists is to build on these findings to better understand the geological factors at work in this concentrated area of the Duvernay Formation, with the future goal of better predicting the places where hydraulic fracturing is least likely to cause these earthquakes. To answer these questions, the UAlberta alumnus continues to work with Jeff Gu, geophysics professor in the Department of Physics and Schultz's former graduate supervisor, and colleagues at the Alberta Geological Survey. "We want to characterize everything we can about these earthquakes so that we can describe them in as much detail as possible," said Schultz. "But when you answer questions, more questions come up." The study, "Hydraulic fracturing volume is associated with induced earthquake productivity in the Duvernay play," will be published on January 19. | Earthquakes | 2,018
January 17, 2018 | https://www.sciencedaily.com/releases/2018/01/180117135133.htm | New details emerge on temperature, mobility of Earth's lower crust in Rocky Mountains | Everything on the surface of the Earth rests on massive tectonic plates that resemble a jelly sandwich, with two rigid pieces -- the upper crust and the upper mantle -- enclosing a gooey middle layer of very hot rocks, which is the lower crust. The plates move ever so slowly around the globe over a deeper hot layer called the asthenosphere. | Temperature plays a fundamental role in determining the strength, thickness, and buoyancy of the lower crust. A research team led by Colorado State University has mapped the temperature and viscosity of the lower crust for the first time and found that, under much of the western United States, the layer is hot enough to be near its initial melting point and, therefore, quite runny. This new research shows that significant regions of the lower crust have little strength -- a weakness that, over several million years, could lead to many mountains in the western U.S. being flattened. "Having a map of the temperature gives us an understanding of how strong the plate is," said Derek Schutt, associate professor in CSU's Department of Geosciences. "What we found is that there are places where the crust is not strong enough to hold the topography." Imagine three slices of Silly Putty, two frozen pieces lying on the top and bottom of one that is room temperature. When you push on the top layer, the warm Silly Putty will be squeezed flat. Similar mechanics are at work in the Earth's crust. "Mountains are formed by forces pushing things around, and weak areas collapsing," according to Schutt. Outside forces could potentially push on the crust, and that force could be transferred deep inland -- a process called orogenic float, he said. The new study suggests this process may occur more often than previously thought. "That can cause mountains to form at a great distance from where you're pushing on things," Schutt said. "Because the lower crust is mobile, force can be transmitted over a large distance." Scientists generally think of tectonic plates, or lithosphere, as being made up of the crust and a cold uppermost mantle. But in this new analysis, the team saw something akin to ball bearings slipping between the crust and mantle. While not unexpected, this study was able to map the extent of the areas resembling ball bearings. "The 'ball bearings' keep separate what's happening in the mantle from what's happening in the crust," said Schutt. Researchers calculated temperatures at the bottom of the crust, which varies in thickness, by measuring the velocity of seismic waves that travel near the interface between the lower crust and uppermost mantle. In the western U.S., the crust is very hot, which is what makes it so weak. "We know in general that the lower crust in the western United States seems hot," said Schutt. "But this is the first time we've been able to really ascribe a temperature to a specific location." The findings, he said, are not too surprising. But the research could lead to more insight about why mountains exist and, more specifically, why they exist in places where the temperatures in the lower crust are so high. Schutt and the research team will continue to explore the causes of variations in tectonic plate strength as part of an ongoing project between Colorado State University, Utah State University, and Scripps Institution of Oceanography at the University of California, San Diego.
This research is funded by the National Science Foundation's EarthScope Program. | Earthquakes | 2,018
January 16, 2018 | https://www.sciencedaily.com/releases/2018/01/180116131319.htm | Rates of great earthquakes not affected by moon phases, day of year | There is an enduring myth that large earthquakes tend to happen during certain phases of the Moon or at certain times during the year. But a new analysis finds no evidence to support it. | After matching dates and lunar phases to 204 earthquakes of magnitude 8 or larger, Susan Hough of the U.S. Geological Survey concluded that there is no evidence that the rates of these great earthquakes are affected by the position of the Earth relative to either the Moon or the Sun. In fact, the patterns that some observers see as linking large earthquakes with specific parts of the lunar cycle "are no different from the kinds of patterns you would get if the data are completely random," Hough noted. To determine this, Hough looked at both the day of the year and the lunar phase for 204 large earthquakes from the global earthquake catalog, dating back to the 1600s. To avoid detecting clusters of earthquakes within the data that are related to other factors, she chose to look at larger earthquakes because they are less likely to be an aftershock of a bigger earthquake. Looking at only large earthquakes also allowed Hough to pare down the list to a manageable number that could be matched to lunar phase information found in online databases. Her analysis did turn up some clusters of earthquakes on certain days, but to test for any significance in the patterns she was observing, she randomized the dates of the earthquakes to find out what kind of patterns would appear in these random data. The patterns in the random data were no different from the kinds of patterns showing up in the original data set, she found. This isn't an unusual finding, Hough noted. "When you have random data, you can get all sorts of apparent signals, just like when you flip a coin, you sometimes end up with five heads in a row." Hough did see some unusual "signals" in the original data; for instance, the highest number of earthquakes (16) occurring on a single day came seven days after the new moon. But this signal was not statistically significant, "and the lunar tides would be at a minimum at this point, so it doesn't make any physical sense," she noted. Hough said that the Moon and Sun do cause solid Earth tidal stresses -- ripples through the Earth itself, and not the waters hitting the coastline -- and these could be among the stresses that contribute in a small way to earthquake nucleation. Some researchers have shown that "there is in some cases a weak effect, where there are more earthquakes when tidal stresses are high," she said. "But if you read those papers, you'll see that the authors are very careful. They never claim that the data can be used for prediction, because the modulation is always very small." The idea that the Sun and Moon's positions in the sky can modulate earthquake rates has a long history, she said. "I've read Charles Richter's files, the amateur predictors who wrote to him in droves, because he was the one person that people knew to write to ... and if you read the letters, they're similar to what people are saying now, it's all the same ideas." "Sooner or later there is going to be another big earthquake on a full moon, and the lore will pop back up," said Hough. "The hope is that this will give people a solid study to point to, to show that over time, there isn't a track record of big earthquakes happening on a full moon." | Earthquakes | 2,018
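Hough's randomization test is straightforward to emulate. The sketch below is a simplified stand-in -- it bins 204 synthetic earthquakes into 30 uniform lunar-phase bins, which is coarser than the study's actual treatment -- but it shows how purely random data generate apparent "signals" such as one unusually busy day:

```python
import numpy as np

rng = np.random.default_rng(42)

N_QUAKES = 204    # catalog size used in the study
N_BINS = 30       # ~days in a lunar cycle (a simplification)
N_TRIALS = 10000

# Null hypothesis: each earthquake falls in a uniformly random lunar-phase bin.
max_counts = np.empty(N_TRIALS)
for i in range(N_TRIALS):
    days = rng.integers(0, N_BINS, size=N_QUAKES)
    max_counts[i] = np.bincount(days, minlength=N_BINS).max()

print("Null distribution of the count in the busiest bin:")
for q in (50, 90, 99):
    print(f"  {q}th percentile: {np.percentile(max_counts, q):.0f} quakes")
```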
January 10, 2018 | https://www.sciencedaily.com/releases/2018/01/180110112930.htm | Further reducing injections of oilfield wastewater can prevent larger earthquakes | In a new study, Virginia Tech researchers have found that efforts to curb earthquakes triggered by the injection of oilfield wastewater into the ground in Oklahoma are not targeting the most dangerous temblors, and that a larger reduction in injection volumes is needed. | Prior to 2011, Oklahoma averaged one magnitude 3-plus earthquake per year, but in 2015 there were more than 900 such earthquakes, making Oklahoma the most seismically active state in the mainland United States. Increased seismic activity has occurred simultaneously with the increased retrieval of unconventional oil and gas, which uses hydrofracturing, commonly referred to as fracking, to unlock previously inaccessible oil and gas resources. This rapid proliferation of unconventional oil and gas recovery has also resulted in millions of gallons of highly brackish wastewater, which comes up with the retrieved oil and gas. To dispose of this wastewater, the liquid is re-injected into geologic formations deep underground. Such wastewater injections have been taking place for decades, but the rapid increase in oil and gas production via fracking means substantially more oilfield wastewater is now being re-injected. In Oklahoma, the injections triggering earthquakes are taking place in the Arbuckle formation, a deep and highly porous sedimentary rock layer. The new study shows that locations that experienced earthquakes are tied, in proximity and timing, to high-volume wastewater injection sites. Further, the study indicates that tracking annual data on the injection well locations can help predict how corresponding earthquake activity will change. This new finding builds on previous studies showing that earthquake activity increases when wastewater injections increase. "Our results show that average annual injection well locations are a predictor of increasing earthquake activity," said Ryan M. Pollyea, an assistant professor with the Virginia Tech College of Science's geosciences department and director of the Computational Geofluids Laboratory, who spearheaded the study, which was published online. The research team includes Martin Chapman, a research associate professor of geosciences and director of the Virginia Tech Seismological Observatory; and Neda Mohammadi, a postdoctoral scholar, and John E. Taylor, a professor and director of the Network Dynamics Lab, both at Georgia Tech. Earthquakes became so frequent in Oklahoma that in April 2015 the Oklahoma Geological Survey acknowledged that wastewater injections are likely triggering earthquakes in the north-central portions of the state. In April 2016, Oklahoma Gov. Mary Fallin signed a law authorizing the Oklahoma Corporation Commission to limit injection volumes in the immediate aftermath of earthquake swarms. "The number of magnitude 3-plus earthquakes decreased in 2016, but there were still more than 600 earthquakes, far above the historical average," said Pollyea, who specializes in groundwater modeling and simulation. "Previous research in this area used seismological and groundwater modeling methods, but we thought that geospatial analysis might help us understand the situation on the ground." The results showed that the spatial correlation between earthquakes and wastewater injection volume indeed decreased in 2016.
However, the decrease applies only to small-magnitude earthquakes; there was little to no effect on larger earthquakes, those above magnitude 3. "When we compared the spatial correlation using datasets that include only magnitude 3-plus earthquakes, there was no change," said Pollyea, adding that a larger reduction in wastewater injection volumes is needed to reduce the dangers of large-magnitude earthquakes. Pollyea and his team caution that while earthquakes cannot be predicted, the study indicates that geologists can test hypothetical wastewater injection scenarios before wells come online to estimate the potential impact on earthquake activity. "This study differs from and complements other studies on wastewater-injection-induced seismicity," said Shemin Ge, professor and chair of the Department of Geological Sciences at the University of Colorado. (Ge was not involved with the study.) "Applying a geospatial approach, Pollyea and colleagues examined the linkage between spatially averaged injection data and seismicity occurrence. They offer a wider-angle view of the linkage at a spatial scale that is larger than previously thought. Advocating reducing injection volume over time, this study offers new insights into the rationale for a continuing call for reducing injection rate in order to reduce seismicity." The new study is part of a larger effort that Pollyea and his collaborators are pursuing to study fluid-triggered earthquakes and wastewater injection volume and their relation to population centers in Oklahoma, building codes, policy decisions, and industry regulations. | Earthquakes | 2,018
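The geospatial analysis at the heart of the study can be sketched as a correlation between a smoothed injection-volume field and a smoothed earthquake-density field, evaluated at several smoothing lengths. The code below uses synthetic stand-ins for the well and epicenter data and a plain Gaussian kernel; the published method differs in detail, so treat this only as an illustration of the idea:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the study's inputs; coordinates in km.
wells = rng.uniform(0, 200, size=(150, 2))
volumes = rng.lognormal(0.0, 1.0, size=150)
# Earthquakes clustered loosely around wells, for the demo only.
quakes = wells[rng.integers(0, 150, size=300)] + rng.normal(0, 15, size=(300, 2))

def smoothed_field(points, weights, grid, sigma_km):
    """Gaussian-kernel field of `weights` evaluated at each grid node."""
    d2 = ((grid[:, None, :] - points[None, :, :]) ** 2).sum(axis=-1)
    return (weights * np.exp(-d2 / (2.0 * sigma_km ** 2))).sum(axis=1)

axis = np.linspace(0, 200, 40)
grid = np.stack(np.meshgrid(axis, axis), axis=-1).reshape(-1, 2)

for sigma in (10.0, 50.0, 125.0):
    injection = smoothed_field(wells, volumes, grid, sigma)
    seismicity = smoothed_field(quakes, np.ones(len(quakes)), grid, sigma)
    r = np.corrcoef(injection, seismicity)[0, 1]
    print(f"smoothing length {sigma:5.0f} km: correlation r = {r:.2f}")
```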
January 10, 2018 | https://www.sciencedaily.com/releases/2018/01/180110100958.htm | Earthquakes as a driver for the deep-ocean carbon cycle | In a recently published paper, researchers show that deep-ocean sediments from the Japan Trench record the region's earthquake history and reveal the role of earthquakes in the deep-ocean carbon cycle. | At a depth of 7,542 metres below sea level, the team took a core sample from the Japan Trench, an 800-km-long oceanic trench in the northwestern part of the Pacific Ocean. The seismically active trench was the site of the epicentre of the Tohoku earthquake in 2011, which made headlines when it caused the nuclear meltdown at Fukushima. Such earthquakes wash enormous amounts of organic matter from the shallows down into deeper waters. The resulting sediment layers can thus be used later to glean information about the history of earthquakes and the carbon cycle in the deep ocean. The current study provided the researchers with a breakthrough. They analysed the carbon-rich sediments using radiocarbon dating. This method -- measuring the amount of organic carbon as well as radioactive carbon (14C) in mineralised compounds -- has long been a means of determining the age of individual sediment layers. Until now, however, it has not been possible to analyse samples from deeper than 5,000 metres below the surface, because the mineralised compounds dissolve under increased water pressure. Strasser and his team therefore had to use new methods for their analysis. One of these was what is known as the online gas radiocarbon method, developed by ETH doctoral student Rui Bao and the Biogeoscience Group at ETH Zurich. This greatly increases efficiency, since it takes just a single core sample to make more than one hundred 14C age measurements directly on the organic matter contained within the sediment. In addition, the researchers applied the Ramped PyrOx measurement method (pyrolysis) for the first time in the dating of deep-ocean sediment layers. This was done in cooperation with the Woods Hole Oceanographic Institution (U.S.), which developed the method. The process involves burning organic matter at different temperatures. Because older organic matter contains stronger chemical bonds, it requires higher temperatures to burn. What makes this method novel is that the relative age variation of the individual temperature fractions between two samples very precisely distinguishes the age difference between sediment levels in the deep sea. Thanks to these two innovative methods, the researchers could determine the relative age of organic matter in individual sediment layers with a high degree of precision. The core sample they tested contained older organic matter in three places, as well as higher rates of carbon export to the deep ocean. These places correspond to three historically documented yet hitherto imprecisely dated seismic events in the Japan Trench: the Tohoku earthquake in 2011, an unnamed earthquake in 1454, and the Sanriku earthquake in 869. At the moment, Strasser is working on a large-scale geological map of the origin and frequency of sediments in deep-ocean trenches. To do so, he is analysing multiple core samples taken during a follow-up expedition to the Japan Trench in 2016. "The identification and dating of tectonically triggered sediment deposits is also important for future forecasts about the likelihood of earthquakes," Strasser says. "With our new methods, we can predict the recurrence of earthquakes with much more accuracy." | Earthquakes | 2,018
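Each of the one-hundred-plus 14C measurements per core converts a measured 14C level into an age via the standard radiocarbon relation, using the conventional Libby mean life of 8,033 years (calibration to calendar years is a separate step not shown here). A minimal sketch:

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years, conventional radiocarbon mean life

def radiocarbon_age(fraction_modern):
    """Conventional 14C age (years BP) from the measured 14C level,
    expressed as a fraction of the modern standard."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# Example: a sediment layer retaining 55% of the modern 14C level
# comes out near the old end of the study's ~5,000-year record.
print(f"{radiocarbon_age(0.55):.0f} 14C years BP")
```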
January 5, 2018 | https://www.sciencedaily.com/releases/2018/01/180105082335.htm | Shakedown in Oklahoma: To cut the number of bigger earthquakes, inject less saltwater | In Oklahoma, reducing the amount of saltwater (highly brackish water produced during oil and gas recovery) pumped into the ground seems to be decreasing the number of small fluid-triggered earthquakes. But a new study shows why it wasn't enough to ease bigger earthquakes. The study, led by Ryan M. Pollyea of Virginia Tech in Blacksburg, Virginia, was published online ahead of print. | Starting around 2009, saltwater disposal (SWD) volume began increasing dramatically as unconventional oil and gas production increased rapidly throughout Oklahoma. As a result, the number of magnitude 3-plus earthquakes rattling the state has jumped from about one per year before 2011 to more than 900 in 2015. "Fluids are basically lubricating existing faults," Pollyea explains. Oklahoma is now the most seismically active state in the lower 48 United States. Previous studies linked Oklahoma SWD wells and seismic activity in time. Instead, Pollyea and colleagues studied that correlation in space, analyzing earthquake epicenters and SWD well locations. The team focused on the Arbuckle Group, a porous geologic formation in north-central Oklahoma used extensively for saltwater disposal. The earthquakes originate in the basement rock directly below the Arbuckle, at a depth of 4 to 8 kilometers. The correlation was clear: "When we plotted the average annual well locations and earthquake epicenters, they moved together in space," says Pollyea. The researchers also found that SWD volume and earthquake occurrence are spatially correlated up to 125 km. That's the distance within which there seems to be a connection between injection volume, fluid movement, and earthquake occurrence. By separating data by year from 2011 through 2016, Pollyea and colleagues also found that the spatial correlation for smaller earthquakes weakened in 2016, when new regulations reduced pumping volumes. Yet the spatial correlation for M3.0+ earthquakes persists unabated. In fact, two particularly alarming earthquakes shook the region in September 2016 and November 2016. The first, M5.8, was the largest ever recorded in Oklahoma. The second, M5.0, rocked the area surrounding the nation's largest oil storage facility, containing millions of barrels of oil. Pollyea's theory for why reduced fluid pressure has only affected small faults: "It's like the traffic on the freeway is still moving, but the smaller arterials are cut off." He emphasizes that so far, they can't predict single earthquakes or even blame specific wells for specific shaking. But to reduce large fluid-triggered earthquakes, Pollyea concludes, "It appears that the way to do it is to inject less water." | Earthquakes | 2,018
December 18, 2017 | https://www.sciencedaily.com/releases/2017/12/171218120327.htm | Heat from below Pacific Ocean fuels Yellowstone, study finds | Recent stories in the national media are magnifying fears of a catastrophic eruption of the Yellowstone volcanic area, but scientists remain uncertain about the likelihood of such an event. To better understand the region's subsurface geology, University of Illinois geologists have rewound and played back a portion of its geologic history, finding that Yellowstone volcanism is far more complex and dynamic than previously thought. | "The heat needed to drive volcanism usually occurs in areas where tectonic plates meet and one slab of crust slides, or subducts, under another. However, Yellowstone and other volcanic areas of the inland western U.S. are far away from the active plate boundaries along the west coast," said geology professor Lijun Liu, who led the new research. "In these inland cases, a deep-seated heat source known as a mantle plume is suspected of driving crustal melting and surface volcanism." In the new study, the team built just such a model of the region's subsurface history. "Our goal is to develop a model that matches up with what we see both below ground and on the surface today," Zhou said. "We call it a hybrid geodynamic model because most of the earlier models either start with an initial condition and move forward, or start with the current conditions and move backward. Our model does both, which gives us more control over the relevant mantle processes." One of the many variables the team entered into their model was heat. Hot subsurface material -- like that in a mantle plume -- should rise vertically toward the surface, but that was not what the researchers saw in their models. "It appears that the mantle plume under the western U.S. is sinking deeper into Earth through time, which seems counterintuitive," Liu said. "This suggests that something closer to the surface -- an oceanic slab originating from the western tectonic boundary -- is interfering with the rise of the plume." The mantle plume hypothesis has been controversial for many years, and the new findings add to the evidence for a revised tectonic scenario, the researchers said. "A robust result from these models is that the heat source behind the extensive inland volcanism actually originated from the shallow oceanic mantle to the west of the Pacific Northwest coast," Liu said. "This directly challenges the traditional view that most of the heat came from the plume below Yellowstone." "Eventually, we hope to consider the chemical data from the volcanic rocks in our model," Zhou said. "That will help us further constrain the source of the magma because rocks from deep mantle plumes and near-surface tectonic plates could have different chemistries." As for the likelihood of a violent demise of Yellowstone occurring anytime soon, the researchers say it is still too early to know. "Of course, our model can't predict specific future super-eruptions. However, looking back through 20 million years of history, we do not see anything that makes the present-day Yellowstone region particularly special -- at least not enough to make us suspect that it may do something different from the past, when many catastrophic eruptions have occurred," Liu said. "More importantly, this work will give us a better understanding of some of the mysterious processes deep within Earth, which will help us better understand the consequences of plate tectonics, including the mechanism of earthquakes and volcanoes." | Earthquakes | 2,017
December 13, 2017 | https://www.sciencedaily.com/releases/2017/12/171213120037.htm | Residual strain despite mega earthquake | On 22 May 1960, an earthquake shook the southern Chilean continental margin along a length of about 1,000 kilometers. Estimates suggest that around 1,600 people died as a direct result of the quake and the following tsunami, leaving around two million people homeless. With a strength of 9.5 on the moment magnitude scale, the Valdivia earthquake of 1960 still ranks number one on the list of strongest earthquakes ever measured. | More than half a century later, on 25 December 2016, the earth trembled around the southern Chilean island of Chiloé. With a strength of 7.5 Mw, this event can be described as rather moderate by Chilean standards. But the fact that it broke the same section of the Chilean subduction zone as the 1960 earthquake is quite interesting for scientists. Researchers from the GEOMAR Helmholtz Centre for Ocean Research Kiel and the Universidad de Chile have now published a study of the event. To understand why Chile is hit so frequently by heavy earthquakes, one has to look at the seabed off the coast. It belongs to the so-called Nazca plate, a tectonic plate which moves eastwards at a rate of 6.6 cm per year. Off the Chilean coast it collides with the South American plate and is submerged beneath it. In this process, strains build up between the plates -- until they break and the earth trembles. During such an earthquake, the strain is released within minutes. During the 1960 earthquake, for example, the plates shifted by more than 30 meters against each other. As a result, landmasses were lifted up or down several meters, fundamentally changing Chilean landscapes and the coastline. "The scale of the slip also gives information about the accumulated energy between the two plates," explains Dr. Lange. From the time interval (56 years), the known speed of the Nazca plate, and further knowledge of the subduction zone, the German-Chilean team calculated the accumulated energy, and thus the theoretical slip of the 2016 earthquake, to be about 3.4 meters. But the analysis of seismic data and GPS surveys showed a slip of more than 4.5 m. "The strain must have accumulated for more than 56 years. It is older than the last earthquake in the same region," says Dr. Lange. Similar results have recently been obtained in other subduction zones. Along with them, the new study suggests that for risk assessment in earthquake-prone areas, not just a single seismic cycle from one earthquake to the next should be considered. "The energy can be greater than that resulting from the usual calculations, which can, for example, have an impact on recommendations for earthquake-proof construction," says Dr. Lange. | Earthquakes | 2,017
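The team's slip-budget argument is first-order arithmetic: plate convergence rate times elapsed time gives the slip available for release. The sketch below uses only the numbers quoted in the article; the naive product slightly exceeds the paper's 3.4 m because the full calculation also folds in fault geometry and the degree of plate locking:

```python
def slip_deficit_m(plate_rate_cm_per_yr, years, coupling=1.0):
    """First-order slip accumulated on a locked plate interface, in metres."""
    return plate_rate_cm_per_yr / 100.0 * years * coupling

accumulated = slip_deficit_m(6.6, 2016 - 1960)   # Nazca-South America rate
observed = 4.5                                    # m, slip in the 2016 event
print(f"accumulated 1960-2016: ~{accumulated:.1f} m "
      f"(the paper's fuller calculation gives ~3.4 m)")
print(f"observed slip: {observed:.1f} m -> strain predating 1960 is required")
```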
December 11, 2017 | https://www.sciencedaily.com/releases/2017/12/171211120349.htm | The origin of the Andes unravelled | Why do the Andes exist? Why is the region not one of lowlands or narrow seas? Wouter Schellart, a geophysicist at the Vrije Universiteit Amsterdam, has been pondering these questions for more than a decade. Now, he has found the answers using an advanced computer model. "It's a matter of enormous size, longevity and great depth," he said. "These aspects made the Andes the longest and second-highest mountain belt in the world." | All the other major mountain belts on Earth, such as the Himalaya and the Alps, were formed by colliding continents. But there are no colliding continents in the Andes; rather, the Andes are located at a so-called subduction zone, a place where an oceanic tectonic plate sinks below another plate (in this case the Nazca plate sinking below the South American plate) into the Earth's interior, the mantle. There are numerous other subduction zones on Earth, such as in Greece and Indonesia, but these locations are characterized by small seas (such as the Aegean Sea) and tropical lowlands, not massive mountain chains. So the big question is: Why did a massive mountain chain form in South America? Schellart's model, which took more than two years to complete on Australia's supercomputer Raijin, has reproduced the evolution of the South American subduction zone, from start to present (initiating some 200 million years ago, making it the oldest subduction zone in the world), to investigate the origin of the Andes. What came out? The size of the subduction zone, some 7,000 km and thereby the largest in the world, is crucial for mountain building. What else came out? The first signs of crustal shortening and mountain formation appeared as early as the mid Cretaceous, some 120-80 million years ago. Before this time there were elongated narrow seas at the western edge of South America rather than mountains. From the mid Cretaceous onwards, the subduction zone was deep enough to induce large-scale flow in the deep mantle, down to 2,900 km, the boundary between the Earth's mantle and core. These flows dragged South America westward, causing the continent to collide with the subduction zone and thereby forming the Andes. Because the South American subduction zone is so wide, it offers great resistance to lateral migration, in particular in the centre. This is why the collisional forces between the South American continent and the subduction zone are largest in the centre, resulting in the highest mountains in the Central Andes and the formation of the Altiplano, a high plateau 4 km above sea level, but much lower mountains in the north and south. | Earthquakes | 2,017
December 6, 2017 | https://www.sciencedaily.com/releases/2017/12/171206162305.htm | West coast earthquake early warning system continues progress toward public use | A decade after beginning work on an earthquake early warning system, scientists and engineers are fine-tuning a U.S. West Coast prototype that could be in limited public use in 2018. | In two papers published December 6 in SRL (Seismological Research Letters), researchers take stock of the prototype, called ShakeAlert. Its development has shown that a dense network of seismic stations, swift transfer of seismic data to a central processing and alert station, speedy paths for distributing alert information to users, and education and training on how to use the alerts are all necessary for a robust early warning system, said Monica Kohler, a Caltech research faculty member in the Department of Mechanical and Civil Engineering. Although ShakeAlert was developed for the West Coast, a similar system could be used for earthquake early warning in places such as Alaska, Hawaii and even in places such as Oklahoma, where induced earthquakes are the main source of seismic damage. "We have been getting questions along the lines of 'can we try your system where we live?' or 'can we port your system over?'" Kohler said. "The answer in theory is yes, but there have to be these certain key elements in place." ShakeAlert was developed using seismic data collected by the Advanced National Seismic System (ANSS), a national collection of seismic networks supported by the U.S. Geological Survey. About 760 seismic stations now contribute to ShakeAlert. One of the "top of the list" priorities for improving West Coast early warning, said Kohler, would be the addition of almost 1,000 more stations in these networks. "Parts of Los Angeles and San Francisco are covered pretty well by regional network stations, but many areas of California, Washington and Oregon are not covered very well," Kohler explained. "There's some funding in the works to get some new stations in place, but right now it's not enough to complete the regional arrays that are necessary for earthquake early warning in its most robust form." To remedy some of this shortfall, the ShakeAlert system is testing the use of "volunteer" accelerometers, mostly in the Los Angeles area, that are sensitive enough to detect earthquakes and can be plugged in at a business or home, bypassing the more expensive permitting and installation needed for a full-scale seismic station. Even in places where a dense network of stations already exists, some stations lack high-speed and reliable telemetry -- the data communication capabilities that include everything from radio waves to satellite and commercial internet to send seismic signals to a central data processing center in real time. "With earthquake early warning, the name of the game is fast," Kohler said. "The signals to central processing need to be fast, and the processing center in turn has to be able to send out alerts to its users fast enough to be useful.
Some of the funding that is needed for earthquake early warning needs to be put into upgrading the existing telemetry." ShakeAlert's algorithms now estimate a "point source" for an earthquake, but "we are working on the ability to incorporate algorithms that can handle very large earthquakes that happen in a way that can't be approximated to a single point," Kohler said. The current version of the system, she said, is less successful in providing useful warnings about earthquakes that rupture a long section of a fault and evolve over time, such as the M9.1 2011 Tohoku earthquake in Japan. The ShakeAlert team has also been testing the system against past and real-time earthquake events, looking at its ability to determine parameters such as earthquake magnitude, epicenter and origin time, and how long the delay is between the start of the earthquake and the time a user gets an alert. "But what end users really care about is how severe the ground shaking will be at their location, so we also are now testing algorithms that could provide these data," Kohler said. "We fully expect that a future iteration of ShakeAlert will have ground shaking assessment as one of the most basic parts of the alert." When ShakeAlert expanded from California to include Washington and Oregon in April 2017, the developers also created a single set of algorithms that could be used in the Pacific Northwest and Northern and Southern California. The challenge, said Kohler, was to develop a standard set of algorithms that could account for the different tectonic environments in each region, including the offshore subduction zone in the Pacific Northwest and the mostly on-shore faults in California. From the start, ShakeAlert has had a steady group of beta users -- ranging from government agencies to businesses -- who have helped the team with valuable feedback for the system. For instance, the ShakeAlert team is learning more from its beta users about how much uncertainty the users will tolerate in the alerts and how fast versus how accurate they want those alerts to be. ShakeAlert works with third-party companies to decide how to disseminate the alerts, what form the alerts should take, and what kind of information is in the alerts -- and there isn't a one-size-fits-all approach, Kohler noted. For instance, a school may have higher tolerance for false alarms -- students will lose a few minutes of instruction by ducking under their desks -- but a railroad or a machinery assembly line may want to shut down operations only for the most high-certainty alerts. "We are learning what their tolerance of alert time versus accuracy is," Kohler said. "Because the sooner you want the alert, the less accurate it will be. If you wait for more data to produce the alert, the more accurate it will be." As ShakeAlert moves toward wider public use, Kohler and her colleagues hope their SRL papers will serve as a good reminder of all the elements necessary for earthquake early warning systems. "Sometimes there is a misperception that all you need is a good algorithm, or all you need is a small array of seismometers in your region of interest, but either one of those or even both of them isn't going to get you very far in terms of producing a robust system over a large seismically active area," Kohler said. | Earthquakes | 2,017
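Why station density and telemetry speed dominate the alert budget can be seen from a back-of-the-envelope timing model: an alert can only go out once P waves reach the nearest station and the data are telemetered and processed, while damaging S waves race outward in the meantime. The wave speeds and latency below are illustrative assumptions, not ShakeAlert's actual parameters; note the near-epicenter "blind zone" where no useful warning is possible.

```python
VP, VS = 6.1, 3.5   # km/s, representative crustal P- and S-wave speeds (assumed)

def warning_time_s(user_dist_km, nearest_station_km=10.0, latency_s=4.0):
    """Seconds of warning before strong (S-wave) shaking reaches a user.

    The alert is issued once P waves reach the nearest station plus an
    assumed total telemetry-and-processing delay of `latency_s` seconds.
    """
    alert_time = nearest_station_km / VP + latency_s
    s_arrival = user_dist_km / VS
    return s_arrival - alert_time

for d in (20, 50, 100, 200):
    print(f"user at {d:3d} km from epicenter: ~{warning_time_s(d):5.1f} s of warning")
```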
December 6, 2017 | https://www.sciencedaily.com/releases/2017/12/171206122525.htm | Unearthing the underground effects of earthquakes and volcanoes | Most of what we know about earthquakes and volcanoes is based on what we can observe at the Earth's surface. However, most of the action -- especially early activity that could help with disaster prediction and preparedness -- occurs deep underground. | Developing a clearer picture of changes in subsurface conditions, together with continuous monitoring, could provide life-saving information in advance of future disasters. In earthquake-prone Japan, especially, there is an ongoing need for effective means of foreseeing seismic activity. Japan's National Research Institute for Earth Science and Disaster Prevention (NIED) has developed the Hi-net network of hundreds of high-sensitivity seismographs evenly distributed across the country. High-resolution seismic data from Hi-net shed light on the workings far below the country's surface. A key source of information from Hi-net is the velocity of seismic waves as they travel between stations. Faults, fractures, and fluids in the subsurface, among other factors, can influence seismic velocity. Thus, changes in seismic velocity can signal changes occurring underground but not yet apparent at the surface. Until recently, little variation in seismic velocity had been detected in central Kyushu, Japan's southernmost major island. However, in April 2016, the Mw 7.0 Kumamoto earthquake struck the region, shortly after an Mw 6.2 foreshock. These destructive earthquakes were followed by eruptions of Japan's largest active volcano, Mount Aso, in April, May, and October of the same year. A trio of researchers at Kyushu University and its International Institute for Carbon-Neutral Energy Research (I2CNER) investigated Hi-net seismic velocity data, collected continuously from December 2015 to November 2016, to better understand the subsurface conditions associated with these disasters, and recently reported their findings. "We applied seismic interferometry to the ambient noise recorded at 36 Hi-net seismic stations," Tatsunori Ikeda explains. "We found that during the earthquake, velocity slowed significantly, which may have been related to damage and pressure changes around the deep rupture fault. This was followed by gradual 'healing' of the fault over the following months, although different areas recovered to different extents." The earthquakes also may have mobilized fluids around Aso's magma body. Velocity below the caldera decreased when the earthquake struck, but recovered relatively rapidly after the eruptions; this may have released pressure. "Although past studies have used similar approaches for velocity estimation, the higher spatial resolution we achieved over a broad area allowed us to identify the spatial distribution of the damage zone or stress state," corresponding author Takeshi Tsuji says. "Denser deployment allows local anomalies to be more accurately resolved. Velocity changes thus identified could be useful in the estimation of future earthquakes and volcanic activity." | Earthquakes | 2,017
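Velocity changes of the kind Ikeda describes are commonly extracted from ambient-noise correlations with the so-called stretching method: find the stretch factor that best maps the current waveform onto a reference waveform, with dv/v read off from that factor. A self-contained sketch on synthetic data (the study's actual processing chain is more involved):

```python
import numpy as np

def stretching_dvv(reference, current, t, max_eps=0.05, n_grid=201):
    """Grid-search the stretch factor eps that best maps `current` onto
    `reference`; under this convention dv/v = -eps."""
    best_eps, best_cc = 0.0, -np.inf
    for eps in np.linspace(-max_eps, max_eps, n_grid):
        stretched = np.interp(t * (1.0 + eps), t, current)
        cc = np.corrcoef(reference, stretched)[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return -best_eps, best_cc

# Synthetic demo: impose a ~1% velocity drop (coda arrivals ~1% later).
t = np.linspace(0.0, 10.0, 2001)
reference = np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.3 * t)
current = np.interp(t * (1.0 - 0.01), t, reference)
dvv, cc = stretching_dvv(reference, current, t)
print(f"recovered dv/v = {dvv:+.2%} (correlation {cc:.3f})")
```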
December 5, 2017 | https://www.sciencedaily.com/releases/2017/12/171205093656.htm | Dark fiber: Using sensors beneath our feet to tell us about earthquakes, water, and other geophysical data | Scientists at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) have shown for the first time that dark fiber -- the vast network of unused fiber-optic cables installed throughout the country and the world -- can be used as sensors for detecting earthquakes, the presence of groundwater, changes in permafrost conditions, and a variety of other subsurface activity. | In a pair of recently published papers, a team led by Berkeley Lab researcher Jonathan Ajo-Franklin announced they had successfully combined a technology called "distributed acoustic sensing," which measures seismic waves using fiber-optic cables, with novel processing techniques to allow reliable seismic monitoring, achieving results comparable to what conventional seismometers can measure. "This has huge potential because you can just imagine long stretches of fibers being turned into a massive seismic network," said Shan Dou, a Berkeley Lab postdoctoral fellow. "The idea is that by using fiber that can be buried underground for a long time, we can transform traffic noise or other ambient vibrations into usable seismic signals that can help us to monitor near-surface changes such as permafrost thaw and groundwater-level fluctuations." Dou is the lead author of "Distributed Acoustic Sensing for Seismic Monitoring of the Near Surface: A Traffic-Noise Interferometry Case Study," which was published in September in a Nature-family journal. Dark fiber refers to unused fiber-optic cable, of which there is a glut thanks to a huge rush by telecommunications companies to install the cable in the early 1990s. Just as the cables were buried underground, the technology for transmitting data improved significantly, so that fewer cables were needed. There are now dense corridors of dark fiber crisscrossing the entire country. Distributed acoustic sensing (DAS) is a novel technology that measures seismic wavefields by shooting short laser pulses across the length of the fiber. "The basic idea is, the laser light gets scattered by tiny impurities in the fiber," said Ajo-Franklin. "When fiber is deformed, we will see distortions in the backscattered light, and from these distortions, we can measure how the fiber itself is being squeezed or pulled." Using a test array they installed in Richmond, California -- with fiber-optic cable placed in a shallow L-shaped trench, one leg of about 100 meters parallel to the road and another perpendicular -- the researchers verified that they could use seismic waves generated by urban traffic, such as cars and trains, to image and monitor the mechanical properties of shallow soil layers. The measurements give information on how "squishy" the soil is at any given point, making it possible to infer a great deal of information about the soil properties, such as its water content or texture. "Imagine a slinky -- it can compress or wiggle," Ajo-Franklin said. "Those correspond to different ways you can squeeze the soil, and how much energy it takes to reduce its volume or shear it." He added: "The neat thing about it is that you're making measurements across each little unit of fiber. All the reflections come back to you. By knowing all of them and knowing how long it takes for laser light to travel back and forth on the fiber, you can back out what's happening at each location.
So it's a truly distributed measurement." Having proven the concept under controlled conditions, the team said they expect the technique to work on a variety of existing telecommunications networks, and they are currently conducting follow-up experiments across California to demonstrate this. Ongoing research in Alaska is also exploring the same technique for monitoring the stability of Arctic permafrost. Added Dou: "We can monitor the near surface really well by using nothing but traffic noise. It could be fluctuations in groundwater levels, or changes that could provide early warnings for a variety of geohazards such as permafrost thaw, sinkhole formation, and landslides." Building on five years of Berkeley Lab-led research exploring the use of DAS for subsurface monitoring using non-earthquake seismic sources, Ajo-Franklin's group has now pushed the envelope and shown that DAS is a powerful tool for earthquake monitoring as well. In the GRL (Geophysical Research Letters) study led by Lindsey in collaboration with Stanford graduate student Eileen Martin, the research team took measurements using the DAS technique on fiber-optic arrays in three locations -- two in California and one in Alaska. In all cases, DAS proved comparable in sensitivity to conventional seismometers, despite its higher noise levels. Using the DAS arrays, they assembled a catalog of local, regional, and distant earthquakes and showed that processing techniques could take advantage of DAS' many channels to help understand where earthquakes originate. Ajo-Franklin said that dark fiber has the advantage of being nearly ubiquitous, whereas traditional seismometers, because they are expensive, are sparsely installed, and subsea installations are particularly scarce. Additionally, fiber allows for dense spatial sampling, meaning data points are only meters apart, whereas seismometers typically are separated by many kilometers. Lindsey added: "Fiber has a lot of implications for earthquake detection, location, and early warning. Fiber goes out in the ocean, and it's all over the land, so this technology increases the likelihood that a sensor is near the rupture when an earthquake happens, which translates into finding small events, improved earthquake locations, and extra time for early warning." The GRL paper notes other potential applications of the dark fiber, including urban seismic hazard analysis, global seismic imaging, offshore submarine volcano detection, nuclear explosion monitoring, and microearthquake characterization. | Earthquakes | 2,017
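The "back out what's happening at each location" step rests on simple pulse timing: the two-way travel time of the backscattered light maps to a position along the fiber, which is what yields sensing points only meters apart. A sketch using a nominal group index for silica fiber (an assumed value, not one from the papers):

```python
C = 299_792_458.0   # speed of light in vacuum, m/s
N_GROUP = 1.468     # group refractive index of silica fiber (assumed nominal value)

def scatter_position_m(two_way_time_s):
    """Distance along the fiber to a backscattering point, from the
    round-trip travel time of the laser pulse."""
    return (C / N_GROUP) * two_way_time_s / 2.0

# A 100-ns sampling interval on the backscattered light implies:
spacing = scatter_position_m(100e-9)
print(f"channel spacing ~ {spacing:.1f} m")
print(f"a 10 km dark-fiber segment yields ~ {int(10000 / spacing)} sensing points")
```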
December 4, 2017 | https://www.sciencedaily.com/releases/2017/12/171204091833.htm | Earthquakes in the Himalaya are bigger than in the Alps because tectonic plates collide faster | Densely populated mountainous regions formed by fast tectonic-plate collisions, such as the Himalaya, host bigger earthquakes than regions formed by slower collisions, such as the Alps, according to a new study. | The new study shows that the frequency and magnitude of large earthquakes in the densely populated regions close to mountain chains -- such as the Alps, Apennines, Himalaya and Zagros -- depend on the collision rate of the smaller tectonic plates. In 2015, a magnitude 7.8 earthquake struck Gorkha-Nepal, and a year later, Norcia, Italy suffered a magnitude 6.2 earthquake. Previous research has attempted to explain the physical causes of earthquakes like these, but with ambiguous results. For the first time, the new study shows that the rate at which tectonic plates collide controls the magnitude of earthquakes in mountainous regions. "The impact of large earthquakes in mountain belts is devastating," commented Luca Dal Zilio, lead author of the study, from Geophysical Fluid Dynamics -- ETH Zürich. "Understanding the physical parameters behind the frequency and magnitude of earthquakes is important to improve the seismic hazard assessment. By combining classical earthquake statistics and newly developed numerical models, our contribution addresses a crucial aspect of the seismic hazard, providing an intuitive physical explanation for a global-scale problem. Our scientific contribution can help the society to develop a more complete view of earthquake hazard in one of the most densely populated seismic zones of the world and ultimately take action accordingly." There are seven large tectonic plates and several smaller ones in the earth's lithosphere -- its outermost layers. These plates move, sliding and colliding, and that movement causes mountains and volcanoes to form, and earthquakes to happen. The researchers developed 2D models that simulate the way the tectonic plates move and collide. The seismo-thermo-mechanical (STM) modelling approach uses long-timescale processes to explain short-timescale behaviour -- namely, to replicate the results observed in historical earthquake catalogues. It also shows graphically the distribution of earthquakes by magnitude and frequency that are caused by movement in the orogeny -- a belt of the earth's crust involved in the formation of mountains. The simulations suggest that the magnitude and frequency of the earthquakes in mountainous regions are directly related to the rate at which the tectonic plates collide. The researchers say this is because the faster they collide, the cooler the temperatures and the larger the areas that generate earthquakes. This increases the relative number of large earthquakes. The team confirmed the link by comparing earthquakes recorded in four mountain ranges: the Alps, Apennines, Himalaya and Zagros. Their results imply that the plate collisions in the Alps are more ductile than those in the Himalaya, reducing the hazard of earthquakes. | Earthquakes | 2,017
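One standard way to quantify a shift toward relatively more large earthquakes is the Gutenberg-Richter b-value: a lower b means proportionally more big events in a catalogue. This framing is mine, not necessarily the exact metric used in the paper, but the maximum-likelihood estimator (Aki, 1965) is only a few lines:

```python
import numpy as np

def b_value(mags, m_c):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki, 1965) for
    magnitudes at or above the completeness magnitude m_c."""
    m = np.asarray(mags)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - m_c)

# Synthetic catalogue drawn from a G-R law with b = 1 above Mc = 3:
# magnitudes above Mc are exponentially distributed with rate b*ln(10).
rng = np.random.default_rng(1)
mags = 3.0 + rng.exponential(scale=1.0 / (1.0 * np.log(10)), size=5000)
print(f"estimated b = {b_value(mags, 3.0):.2f}")
```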
November 30, 2017 | https://www.sciencedaily.com/releases/2017/11/171130141045.htm | New early gravity signals to quantify the magnitude of strong earthquakes | After an earthquake, there is a disturbance in the field of gravity almost instantaneously. This could be recorded before the seismic waves that seismologists usually analyze. In a new study, researchers show how these early signals could be used to rapidly quantify the magnitude of strong earthquakes. | This work came out of the interaction between seismologists who wanted to better understand earthquakes and physicists who were developing fine gravity measurements with a view to detecting gravitational waves. Earthquakes abruptly change the equilibrium of forces on Earth and emit seismic waves whose consequences may be devastating. But these same waves also disturb Earth's field of gravity, which emits a different signal. This is particularly interesting for fast quantification of tremors, because the gravity signal moves at the speed of light, unlike tremor waves, which propagate at speeds between 3 and 10 km/s. So seismometers at a station located 1,000 km from the epicenter may potentially detect this signal more than two minutes before the seismic waves arrive. The work presented here, which follows on from a 2016 study that demonstrated this signal for the first time, greatly improves understanding of it. First, the scientists did indeed observe these signals in the data from about ten seismometers located between 500 and 3,000 km from the epicenter of the 2011 Japanese earthquake (magnitude 9.1). From their observations, the researchers then demonstrated that these signals were due to two effects. The first is the gravity change that occurs at the location of the seismometer, which changes the equilibrium position of the instrument's mass. The second effect, which is indirect, is due to the gravity change everywhere on Earth, which disturbs the equilibrium of the forces and produces new seismic waves that will reach the seismometer. Taking account of these two effects, the researchers have shown that this gravity-related signal is very sensitive to the earthquake's magnitude, which makes it a good candidate for rapidly quantifying the magnitude of strong earthquakes. The future challenge is to manage to exploit this signal for magnitudes below about 8 to 8.5, because below this threshold the signal is too weak relative to the seismic noise emitted naturally by Earth, and dissociating it from this noise is complicated. So several technologies, including some inspired by instruments developed to detect gravitational waves, are being envisaged to take a new step forward in the detection of these precious signals. | Earthquakes | 2,017
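Because the gravity perturbation travels at the speed of light, its head start over the seismic waves is essentially the seismic travel time itself (light covers 1,000 km in about 3 ms, which is negligible). With representative wave speeds inside the article's quoted 3-10 km/s range (chosen here purely for illustration):

```python
# Representative seismic wave speeds, km/s (assumed within the quoted range).
WAVE_SPEEDS = {"P wave": 8.0, "S wave": 4.0}

def gravity_lead_time_s(distance_km, wave_speed_km_s):
    """Head start of the near-instantaneous gravity signal over a seismic phase."""
    return distance_km / wave_speed_km_s

for phase, v in WAVE_SPEEDS.items():
    print(f"at 1000 km, the gravity signal leads the {phase} by "
          f"~{gravity_lead_time_s(1000.0, v):.0f} s")
```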
November 30, 2017 | https://www.sciencedaily.com/releases/2017/11/171130094142.htm | Mass of warm rock rising beneath New England | Slowly but steadily, an enormous mass of warm rock is rising beneath part of New England, although a major volcanic eruption isn't likely for millions of years, a Rutgers University-led study suggests. The research is unprecedented in its scope and challenges textbook concepts of geology. | "The upwelling we detected is like a hot air balloon, and we infer that something is rising up through the deeper part of our planet under New England," said lead author Vadim Levin, a geophysicist and professor in the Department of Earth and Planetary Sciences at Rutgers University-New Brunswick. "It is not Yellowstone (National Park)-like, but it's a distant relative in the sense that something relatively small -- no more than a couple hundred miles across -- is happening." The study, which tapped seismic data through the National Science Foundation's EarthScope program, was published online today. "Our study challenges the established notion of how the continents on which we live behave," Levin said. "It challenges the textbook concepts taught in introductory geology classes." Through EarthScope, thousands of seismic measurement devices, spaced 46.6 miles apart, covered the continental United States for two years. Nothing on Earth has been done on this scale, Levin said. The EarthScope program seeks to reveal the structure and evolution of the North American continent and the processes that cause earthquakes and volcanic eruptions, the NSF says. Levin studies seismic waves, or the vibrations that pass through our planet following earthquakes. Seismic waves provide a window into the Earth's interior by revealing the shapes of objects, changes in the state of materials and clues about their texture. The Rutgers-led study focused on New England, where scientists had previously documented an area of great warmth (hundreds of degrees Celsius warmer than neighboring areas) in the Earth's upper mantle. The lithosphere, Earth's solid outer shell, consists of the upper mantle and the crust that includes the surface. "We're interested in what happens at the interface between tectonic plates -- thick, solid parts that cover our planet -- and material in the upper mantle beneath the plates," Levin said. "We want to see how North America is gliding over the deeper parts of our planet. It is a very large and relatively stable region, but we found an irregular pattern with rather abrupt changes in it." Levin thinks the upwelling pattern detected is largely beneath central Vermont and western New Hampshire, but it's also under western Massachusetts. It may be present elsewhere, but the study's findings were based on available seismic observations. "The Atlantic margin of North America did not experience intense geologic activity for nearly 200 million years," Levin said. "It is now a so-called 'passive margin' -- a region where slow loss of heat within the Earth and erosion by wind and water on the surface are the primary change agents. So we did not expect to find abrupt changes in physical properties beneath this region, and the likely explanation points to a much more dynamic regime underneath this old, geologically quiet area." "It will likely take millions of years for the upwelling to get where it's going," he added. "The next step is to try to understand how exactly it's happening." | Earthquakes | 2,017
November 29, 2017 | https://www.sciencedaily.com/releases/2017/11/171129143349.htm | North Texas earthquakes occurring on 'dead' faults, seismology research shows | Recent earthquakes in the Fort Worth Basin -- in the rural community of Venus and the Dallas suburb of Irving -- occurred on faults that had not been active for at least 300 million years, according to research led by SMU seismologist Beatrice Magnani. | The research supports the assertion that recent North Texas earthquakes were induced, rather than natural -- a conclusion entirely independent of previous analyses correlating seismicity to the timing of wastewater injection practices, but one that corroborates those earlier findings. The full study, "Discriminating between natural vs induced seismicity from long-term deformation history of intraplate faults," has been published. "To our knowledge this is the first study to discriminate natural and induced seismicity using classical structural geology analysis techniques," said Magnani, associate professor of geophysics in SMU's Huffington Department of Earth Sciences. Co-authors for the study include Michael L. Blanpied, associate coordinator of the USGS Earthquake Hazards program, and SMU seismologists Heather DeShon and Matthew Hornbach. The results were drawn from analyzing the history of fault slip (displacement) over the lifetime of the faults. The authors analyzed seismic reflection data, which allow "mapping" of Earth's subsurface from reflected, artificially generated seismic waves. Magnani's team compared data from the North Texas area, where several swarms of felt earthquakes have been occurring since 2008, to data from the Midwestern U.S. region that experienced major earthquakes in 1811 and 1812 in the New Madrid seismic zone. Frequent small earthquakes are still recorded in the New Madrid seismic zone, which is believed to hold the potential for larger earthquakes in the future. "These North Texas faults are nothing like the ones in the New Madrid Zone -- the faults in the Fort Worth Basin are dead," Magnani said. "The most likely explanation for them to be active today is that they are being anthropogenically induced to move." In the New Madrid seismic zone, the team found that motion along the faults that are currently active has been occurring over many millions of years. This has resulted in fault displacements that grow with increasing age of sedimentary formations. In the Fort Worth Basin, along faults that are currently seismically active, there is no evidence of prior motion over the past (approximately) 300 million years. "The study's findings suggest that the recent Fort Worth Basin earthquakes, which involve swarms of activity on several faults in the region, have been induced by human activity," said USGS scientist Blanpied. The findings further suggest that these North Texas earthquakes are not simply happening somewhat sooner than they would have otherwise on faults continually active over long time periods. Instead, Blanpied said, the study indicates reactivation of long-dormant faults as a consequence of waste fluid injection. Seismic reflection profiles in the Venus region used for this study were provided by the U.S. Geological Survey Earthquake Hazards Program. Seismic reflection profiles for the Irving area are proprietary. Magnani and another team of scientists collected seismic reflection data used for this research during a 2008-2011 project in the northern Mississippi embayment, home to the New Madrid seismic zone. | Earthquakes | 2,017
November 29, 2017 | https://www.sciencedaily.com/releases/2017/11/171129120251.htm | Parkfield segment of San Andreas fault may host occasional large earthquakes | Although magnitude 6 earthquakes occur about every 25 years along the Parkfield Segment of the San Andreas Fault, geophysical data suggest that the seismic slip induced by those magnitude 6 earthquakes alone does not match the long-term slip rates on this part of the San Andreas fault, researchers report November 28 in the Bulletin of the Seismological Society of America (BSSA). | The Parkfield section of the fault could rupture simultaneously with a magnitude 7.7 earthquake on the fault segment immediately to the south. These southern earthquakes -- the latest of which was the 1857 Fort Tejon earthquake -- appear to occur about every 140 to 300 years. Using these data, Sylvain Michel of the University of Cambridge, UK and colleagues calculate that an earthquake occurring on the Parkfield segment during these simultaneous ruptures could reach the equivalent of a magnitude 6.4 to 7.5 earthquake, and help to close the "slip budget" on the fault. Michel and colleagues compared the amount of slip in earthquakes on the Parkfield segment of the fault with the between-earthquake accumulation of seismic moment (a measure of earthquake size that is related to the fault area, amount of fault slip, and the material strength). The buildup of this seismic moment between earthquakes is called the "moment deficit," which is available for release during the next earthquake. The seismic moment released from the six earthquakes of about magnitude 6 that have occurred on the Parkfield fault segment since 1857 would only account for about 12 percent of the available moment deficit, Michel said. "This analysis shows that balancing the moment budget on the Parkfield segment of the San Andreas fault probably requires more frequent or larger earthquakes than what the instrumental and historical data suggest," he and his colleagues write in the BSSA paper. The Parkfield segment has been studied intensely by seismologists, especially as it forms the transition zone between the "creeping" northern half of the fault and its "locked" southern portion. Michel and colleagues took advantage of the wealth of geophysical data that have been collected in this region, using a catalog of earthquakes that have occurred in the area and models of the fault slip rate inferred from surface deformation given by Global Positioning System (GPS) and satellite observations of ground changes. The detailed information allowed the researchers to apply the slip budget concept to assessing the seismic potential of the fault, and thus the frequency of earthquakes. After concluding that the Parkfield segment must host occasional large earthquakes under the slip budget model, they calculated the likely occurrence of these large earthquakes over 30-year and 200-year periods. Michel and colleagues report that the probability of an earthquake of magnitude 6 or more is equal to about 43 percent over the span of 30 years, and 96 percent over the span of 200 years. The findings will help seismologists further examine how earthquakes on the Parkfield segment might occur in the future, the researchers said. For instance, their data could be used to explore whether locked patches of the fault separately host magnitude 6 or smaller earthquakes, and if larger, less frequent earthquakes might rupture across patches. | Earthquakes | 2,017
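The slip-budget arithmetic in the Parkfield record above can be checked with a short back-of-envelope script. The sketch below is illustrative only: it assumes the standard Hanks-Kanamori moment-magnitude relation and a time-independent Poisson recurrence model, whereas the BSSA paper's own probability model may differ in detail.

```python
import math

# Seismic moment (N*m) from moment magnitude, via the standard
# Hanks-Kanamori relation: log10(M0) = 1.5*Mw + 9.05.
def moment_from_mw(mw):
    return 10 ** (1.5 * mw + 9.05)

# Six ~M6 Parkfield earthquakes since 1857 released only ~12 percent
# of the accumulated moment deficit, per the article.
released = 6 * moment_from_mw(6.0)
deficit = released / 0.12
mw_equiv = (math.log10(deficit) - 9.05) / 1.5
print(f"implied deficit ~ Mw {mw_equiv:.1f}")   # ~7.1, inside the 6.4-7.5 range

# Check the quoted probabilities with a Poisson model:
# P(at least one event in t years) = 1 - exp(-rate * t).
rate = -math.log(1 - 0.43) / 30          # annual rate implied by 43% in 30 yr
p200 = 1 - math.exp(-rate * 200)
print(f"P(200 yr) ~ {p200:.2f}")         # ~0.98, close to the reported 96%
```

That a single annual rate roughly reproduces both quoted probabilities suggests the reported numbers are mutually consistent under a near-time-independent hazard assumption.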
November 28, 2017 | https://www.sciencedaily.com/releases/2017/11/171128135641.htm | Geophysicists uncover new evidence for an alternative style of plate tectonics | When renowned University of Toronto (U of T) geophysicist J. Tuzo Wilson cemented concepts in the emerging field of plate tectonics in the 1960s, he revolutionized the study of Earth's physical characteristics and behaviours. Decades later, successor researchers at U of T and Istanbul Technical University have determined that a series of volcanoes and a mountain plateau across central Turkey formed not solely by the collision of tectonic plates, but instead by a massive drip and then detachment of the lower tectonic plate beneath Earth's surface. | The researchers propose that the reason the Central Anatolian (Turkish) Plateau has risen by as much as one kilometre over the past 10 million years is because the planet's crust and upper mantle -- the lithosphere -- has thickened and dripped below the region. As the lithosphere sank into the lower mantle, it first formed a basin at the surface, which later sprang up when the weight below broke off and sank further into the deeper depths of the mantle. "It seems the heavy base of the tectonic plate has 'dripped' off into the mantle, leaving a massive gap in the plate beneath Central Anatolia. Essentially, by dropping this dense lithospheric anchor, there has been an upward bobbing of the entire land mass across hundreds of kilometres," said Professor Oğuz H. Göğüş of the Eurasia Institute of Earth Sciences at Istanbul Technical University (ITU), lead author of a study reporting the findings. It's a new idea where plate shortening initially squeezed and folded a mountain belt, triggering the thickening and dripping of the deep lithosphere, and then increasing the elevation of most of central Turkey. Puzzled by the presence of such a process at a significant distance away from regular plate tectonic boundaries, the research team set about identifying why, in an area of high heating and high elevation, the lithosphere below is completely gone -- something that was recently discovered from seismology. They tested high-performance computational models against known geological and geophysical observations of the Central Anatolian Plateau, and demonstrated that a drip of lithospheric material below the surface can account for the measured elevation changes across the region. "It's a new variation on the fundamental concepts of plate tectonics," said Professor Russell Pysklywec, chair of the Department of Earth Sciences at U of T and one of the study's coauthors. "It gives us some insight into the connection between the slow circulation of near-solid rock in Earth's mantle caused by convection currents carrying heat upwards from the planet's interior, and observed active plate tectonics at the surface." "This is part of the holy grail of plate tectonics -- linking the two processes to understand how the crust responds to the mantle thermal engine of the planet." Pysklywec carried out the study with Göğüş, who received his PhD from U of T in 2010, and fellow researchers at ITU including Professor A. M. C. Şengör, and Erkan Gün of the Eurasia Institute of Earth Sciences at Istanbul Technical University. Gün is also now a current graduate student at U of T, supervised by Pysklywec. The research adds to decades of groundbreaking work in plate tectonics at U of T, and builds on Wilson's seminal work. "Tuzo Wilson is a towering figure in geophysics internationally and the person most responsible for pioneering the ideas of plate tectonics in the 1960s," said Pysklywec. "I am pleased that we are continuing his legacy in geophysics with our work." While Pysklywec notes there are many locations on Earth missing their lithosphere below, he is quick to reassure that no place is in imminent danger of sinking into the mantle or boosting upwards overnight. "Our results show that the Central Anatolian Plateau rose over a period of millions of years. We're talking about mantle fluid motions and uplift at the pace at which fingernails grow." Göğüş highlights the links of the tectonics with human history, saying, "The findings are exciting also because of the link with the remarkable historical human activity of Central Anatolia where some of the earliest known civilizations have existed. For example, Central Anatolia is described as an elevated, dry, cold plain of Lycaonia in Strabo's Geographika in 7 BC, and even cave paintings in the region dating to approximately 7000 BC record active volcanic eruptions on the plateau." | Earthquakes | 2,017
November 28, 2017 | https://www.sciencedaily.com/releases/2017/11/171128113544.htm | Is Agung going to blow? | Simon Carn studies carbon dioxide and sulfur dioxide emissions from volcanoes using remote sensing. | Carn notes that monitoring emissions from volcanoes is a useful indicator to predict when volcanoes will erupt. With Mount Agung on eruption watch in Bali, he notes that monitoring emissions from the volcano may aid volcanologists in determining whether or not an Agung eruption is imminent. "Something is going on under the volcano, probably a magma intrusion," Carn says. "The big question we can't really answer is if it will erupt or if it will subside and go quiet again. Fumaroles and gas emissions can be seen from satellites. High-resolution satellite images of the crater allow us to see that there were some changes in late September." Agung has erupted about once per century for the past 5,000 years. It last erupted in 1963, killing more than 1,000 people. A detectable drop in planetary temperature -- a few tenths of a degree -- in 1964 followed. A seismic swarm, an increase in seismic activity around the volcano, has been detectable for weeks by seismographic equipment, and several earthquakes have been felt by humans. Carn says better understanding gas emissions from volcanoes could lead to better eruption predictions. "They have evacuated a lot of people from around the volcano, but volcanic unrest can persist for months or even decades without an eruption." | Earthquakes | 2,017
November 22, 2017 | https://www.sciencedaily.com/releases/2017/11/171122131429.htm | Mysterious deep-Earth seismic signature explained | New research on oxygen and iron chemistry under the extreme conditions found deep inside Earth could explain a longstanding seismic mystery called ultralow velocity zones. | Sitting at the boundary between the lower mantle and the core, 1,800 miles beneath Earth's surface, ultralow velocity zones (UVZ) are known to scientists because of their unusual seismic signatures. Although this region is far too deep for researchers to ever observe directly, instruments that can measure the propagation of seismic waves caused by earthquakes allow them to visualize changes in Earth's interior structure; similar to how ultrasound measurements let medical professionals look inside of our bodies. These seismic measurements enabled scientists to visualize these ultralow velocity zones in some regions along the core-mantle boundary, by observing the slowing down of seismic waves passing through them. But knowing UVZs exist didn't explain what caused them. However, recent findings about iron and oxygen chemistry under deep-Earth conditions provide an answer to this longstanding mystery. It turns out that water contained in some minerals that get pulled down into Earth due to plate tectonic activity could, under extreme pressures and temperatures, split up -- liberating hydrogen and enabling the residual oxygen to combine with iron metal from the core to create a novel high-pressure mineral, iron peroxide. Led by Carnegie's Ho-kwang "Dave" Mao, the research team believes that as much as 300 million tons of water could be carried down into Earth's interior every year and generate deep, massive reservoirs of iron dioxide, which could be the source of the ultralow velocity zones that slow down seismic waves at the core-mantle boundary. To test this idea, the team used sophisticated tools at Argonne National Laboratory to examine the propagation of seismic waves through samples of iron peroxide that were created under deep-Earth-mimicking pressure and temperature conditions employing a laser-heated diamond anvil cell. They found that a mixture of normal mantle rock with 40 to 50 percent iron peroxide had the same seismic signature as the enigmatic ultralow velocity zones. For the research team, one of the most-exciting aspects of this finding is the potential of a reservoir of oxygen deep in the planet's interior, which if periodically released to Earth's surface could significantly alter Earth's early atmosphere, potentially explaining the dramatic increase in atmospheric oxygen that occurred about 2.4 billion years ago according to the geologic record. "Finding the existence of a giant internal oxygen reservoir has many far-reaching implications," Mao explained. "Now we should reconsider the consequences of sporadic oxygen outbursts and their correlations to other major events in Earth's history, such as the banded-iron formation, snowball Earth, mass extinctions, flood basalts, and supercontinent rifts." | Earthquakes | 2,017
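The way an ultralow velocity zone reveals itself -- waves arriving late after crossing a thin slow layer -- can be illustrated with a one-line delay estimate. All numbers below are generic values of the kind reported for such zones in the literature, not measurements from this study.

```python
# Extra travel time for a shear wave crossing a thin low-velocity zone:
# delta_t = h * (1/v_slow - 1/v_ref). Values are illustrative only.
v_ref = 7.3            # km/s, typical lowermost-mantle S-wave speed
v_uvz = 0.7 * v_ref    # ~30 percent reduction, often cited for these zones
h = 20.0               # km, assumed zone thickness

delay = h * (1.0 / v_uvz - 1.0 / v_ref)
print(f"extra S-wave delay ~ {delay:.2f} s")   # ~1.2 s, visible on seismograms
```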
November 20, 2017 | https://www.sciencedaily.com/releases/2017/11/171120124511.htm | Seafloor sediments appear to enhance earthquake and tsunami danger in Pacific Northwest | The Cascadia Subduction Zone off the coast of the Pacific Northwest has all the ingredients for making powerful earthquakes -- and according to the geological record, the region is due for its next "big one." | A new study led by The University of Texas at Austin has found that the occurrence of these big, destructive quakes and associated devastating tsunamis may be linked to compact sediments along large portions of the subduction zone. In particular, they found that big, destructive quakes may have a better chance of occurring offshore of Washington and northern Oregon than farther south along the subduction zone -- although any large quake would impact the surrounding area. "We observed very compact sediments offshore of Washington and northern Oregon that could support earthquake rupture over a long distance and close to the trench, which increases both earthquake and tsunami hazards," said lead author Shuoshuo Han, a postdoctoral fellow at the University of Texas Institute for Geophysics (UTIG). UTIG is a research unit of the Jackson School of Geosciences. The findings have been published. Subduction zones are areas where one tectonic plate dives or "subducts" beneath another plate. The world's most powerful earthquakes are produced at the interface between the two plates. At certain subduction zones, such as those in Cascadia, Sumatra and eastern Alaska, a thick sediment layer overlies the subducting oceanic plate. Some of the sediment is scraped off during subduction and piled up on the top plate, forming a thick wedge of material, while the rest of the sediment travels down with the bottom plate. How the stress is built up and released at the plate interface is greatly influenced by the degree of compaction of both the sediment wedge and the sediment between the plates. To understand sediment compaction along Cascadia, Han and her collaborators conducted a seismic survey off the coast of Washington and Oregon that allowed the researchers to see up to four miles of sediment layers overlaying the subduction zone. This was accomplished by using nearly five-mile-long seismic streamers, a scientific tool used to image the seafloor using soundwaves. "These kinds of long-streamer marine seismic studies provide the best tools available to the science community to efficiently probe subduction zones in high resolution," said co-author Suzanne Carbotte, a research professor at Columbia University. Combining the seismic data with measurements from sediment samples previously retrieved from this region through ocean drilling, they found that while the thickness of the incoming sediment is similar offshore of Washington and Oregon, the compaction is very different. Off the coast of Washington and northern Oregon, where almost all of the sediments glom on to the top plate and are incorporated into the wedge, the sediments were tightly packed together without much water in the pore space between the sediment grains -- an arrangement that can make the plates more prone to sticking to each other and building up high stress that can be released as a large earthquake. In turn, the compacted sediments could boost the ability of large earthquakes to trigger large tsunamis because the sediments are able to stick and move together during earthquakes. This can boost their ability to move massive amounts of overlying seawater. "That combination of both storing more stress and the ability for it to propagate farther is important for both generating large earthquakes and for propagating to very shallow depths," said Nathan Bangs, a senior research scientist at UTIG and study co-author. The propagation of earthquakes into shallow depths is what causes large tsunamis like the one that followed the magnitude 9.0 earthquake that struck Tohoku, Japan in 2011. In contrast, off the coast of central Oregon, the thick layer of subducting sediments is less compact, with water in the pore space between the grains. This arrangement prevents the plates from sticking as much, and allows them to rupture with less stress accumulated -- thereby generating smaller earthquakes. The Cascadia Subduction Zone generates a large earthquake roughly every 200 to 530 years. And with the last large earthquake occurring in 1700, scientists are expecting a large quake to occur in the future, although it's impossible to pinpoint the timing exactly. The research findings can help scientists understand more about the features that make some areas of subduction zones better earthquake incubators than others. "The results are consistent with existing constraints on earthquake behavior, offer an explanation for differences in structural style along the margin, and may provide clues about the propensity for shallow earthquake slip in different regions," said co-author Demian Saffer, a Penn State University professor. The study was funded by the National Science Foundation. | Earthquakes | 2,017
November 15, 2017 | https://www.sciencedaily.com/releases/2017/11/171115133850.htm | Structure and origins of glacial polish on Yosemite's rocks | The glaciers that carved Yosemite Valley left highly polished surfaces on many of the region's rock formations. These smooth, shiny surfaces, known as glacial polish, are common in the Sierra Nevada and other glaciated landscapes. | Geologists at UC Santa Cruz have now taken a close look at the structure and chemistry of glacial polish and found that it consists of a thin coating smeared onto the rock as the glacier moved over it. The new findings were published in November. This smooth layer coating the rock at the base of glaciers may influence how fast the glaciers slide. It also helps explain why glacial polish is so resistant to weathering long after the glaciers that created it are gone. According to coauthor Emily Brodsky, professor of Earth and planetary sciences at UC Santa Cruz, this ultrathin coating can help glaciologists better understand the mechanics of how glaciers move, and it provides a potential archive for dating when the material was pasted onto the rock. "This is incredibly important now, as we think about the stability of ice sheets," Brodsky said. "It is pretty hard to get to the base of a glacier to see what's going on there, but the glacial polish can tell us about the composition of the gunk on the bottoms of glaciers and when the polish was formed." Lead author Shalev Siman-Tov, a postdoctoral researcher at UC Santa Cruz, had previously studied the highly polished surfaces found on some earthquake faults. "I wanted to apply what we know from fault zones and earthquakes to glaciology," Siman-Tov said. "I was not familiar with glaciated landscapes, and I was very interested to conduct a significant field study outside of my home country of Israel." To investigate glacial polish, he teamed up with Greg Stock, who earned his Ph.D. at UC Santa Cruz and is now the park geologist at Yosemite National Park. He and Stock hiked into Yosemite National Park to collect small samples of glacial polish from dozens of sites. They chose samples from three sites for detailed analyses. One site (Daff Dome near Tuolumne Meadows) emerged from beneath the glaciers at the end of the last ice age around 15,000 years ago. The other two sites are in Lyell Canyon near small modern glaciers that formed during the Little Ice Age around 300 years ago. Lyell Glacier is no longer active, but McClure Glacier is still moving and has an ice cave at its toe that enabled the researchers to collect fresh polish from an area of active sliding and abrasion. The researchers used an ion beam to slice off thin sections from the samples, and they used electron microscopy techniques to image the samples and perform elemental analyses. The results showed that the tiny fragments in the coating were a mixture of all the minerals found in granodiorite bedrock. This suggests a process in which the glacier scrapes material from the rocks and grinds it into a fine paste, then spreads it across the rock surface to form a very thin layer only a few microns thick. "Abrasive wear removes material and makes the surface smoother, while simultaneously producing the wear products that become the construction material for the coating layer," the researchers wrote in the paper. Siman-Tov now wants to date the layer and confirm the time when the glacier eroded the rock surface. He is also conducting laboratory experiments to try to recreate the same structures observed in the coating layer. The researchers will continue to work with Stock in Yosemite to study the chemical weathering of glacial polish surfaces compared to regular, exposed granodiorite. | Earthquakes | 2,017
November 13, 2017 | https://www.sciencedaily.com/releases/2017/11/171113111046.htm | Largest, longest multiphysics earthquake simulation created to date | Just before 8:00 a.m. local time on December 26, 2004, people in southeast Asia were starting their days when the third strongest recorded earthquake in history ripped a 1,500-kilometer tear in the ocean floor off the coast of the Indonesian island of Sumatra. | The earthquake lasted between 8 and 10 minutes (one of the longest ever recorded), and lifted the ocean floor several meters, creating a tsunami with 30-meter waves that devastated whole communities. The event caused nearly 200,000 deaths across 15 countries, and released as much energy above and below ground as multiple centuries of US energy usage. The Sumatra-Andaman Earthquake, as it is called, was as surprising as it was violent. Despite major advancements in earthquake monitoring and warning systems over the last 50 years, earth scientists were unable to predict it because relatively little data exists about such large-scale seismological events. Researchers have a wealth of information related to semi-regular, lower-to-medium-strength earthquakes, but disasters such as the Sumatra-Andaman -- events that only happen every couple hundred years -- are too rare to create reliable data sets. In order to more fully understand these events, and hopefully provide better prediction and mitigation methods, a team of researchers from the Ludwig-Maximilians-Universität Munich (LMU) and Technical University of Munich (TUM) is using supercomputing resources at the Leibniz Supercomputing Centre (LRZ) to better understand these rare, extremely dangerous seismic phenomena. "Our general motivation is to better understand the entire process of why some earthquakes and resulting tsunamis are so much bigger than others," said TUM Professor Dr. Michael Bader. "Sometimes we see relatively small tsunamis when earthquakes are large, or surprisingly large tsunamis connected with relatively small earthquakes. Simulation is one of the tools to get insight into these events." The team strives for "coupled" simulations of both earthquakes and subsequent tsunamis. It recently completed its largest earthquake simulation yet. Using the SuperMUC supercomputer at LRZ, the team was able to simulate 1,500 kilometers of non-linear fracture mechanics -- the earthquake source -- coupled to seismic waves traveling up to India and Thailand over a little more than 8 minutes of the Sumatra-Andaman earthquake. Through several in-house computational innovations, the team achieved a 13-fold improvement in time to solution. In recognition of this achievement, the project was nominated for the best paper award at SC17, one of the world's premier supercomputing conferences, held this year on November 12-17 in Denver, Colorado. Earthquakes happen as rock below Earth's surface breaks suddenly, often as a result of the slow movement of tectonic plates. One rough predictor of an ocean-based earthquake's ability to unleash a large tsunami is whether plates are grinding against one another or colliding head-on. If two or more plates collide, one plate will often force the other below it. Regions where this process occurs are called subduction zones and can host very large, shallowly dipping faults -- so-called "megathrusts." Energy release across such huge zones of weakness tends to create violent tsunamis, as the ocean floor rises a significant amount, temporarily displacing large amounts of water. Until recently, though, researchers doing computational geophysics had great difficulties simulating subduction earthquakes at the necessary level of detail and accuracy. Large-scale earthquake simulations are difficult generally, but subduction events are even more complex. "Modeling earthquakes is a multiscale problem in both space and time," said Dr. Alice Gabriel, the lead researcher from the LMU side of the team. "Reality is complex, meaning that incorporating the observed complexity of earthquake sources invariably involves the use of numerical methods, highly efficient simulation software, and, of course, high-performance computing (HPC). Only by exploiting HPC can we create models that can both resolve the dynamic stress release and ruptures happening with an earthquake while also simulating seafloor displacement over thousands of kilometers." When researchers simulate an earthquake, they use a computational grid to divide the simulation into many small pieces. They then compute specific equations for various aspects of the simulation, such as generated seismic shaking or ocean floor displacement, among others, over "time steps," or simulation snapshots over time that help put it in motion, much like a flip book. The finer the grid, the more accurate the simulation, but the more computationally demanding it becomes. In addition, the more complex the geometry of the earthquake, the more complex the grid becomes, further complicating the computation. To simulate subduction earthquakes, computational scientists have to create a large grid that can also accurately represent the very shallow angles at which the two continental plates meet. This requires the grid cells around the subduction area to be extra small, and often slim in shape. Unlike continental earthquakes, which have been better documented through computation and observation, subduction events often happen deep in the ocean, meaning that it is much more difficult to constrain a simulation by ground shaking observations and detailed, reliable data from direct observation and laboratory experiments. Furthermore, computing a coupled, large-scale earthquake-tsunami simulation requires using data from a wide variety of sources. Researchers must take into account the seafloor shape, the shape and strength of the plate boundary ruptured by the earthquake and the material behaviour of Earth's crust at each level, among other aspects. The team has spent the last several years developing methods to more efficiently integrate these disparate data sources into a consistent model. To reduce the enormous computing time, the team exploited a method called "local time stepping." In areas where the simulations require much more spatial detail, researchers also must "slow down" the simulation by performing more time steps in these areas. Other sections that require less detail may execute much bigger -- and thus far fewer -- time steps. If the team had to run its entire simulation at a uniform small time step, it would have required roughly 3 million individual iterations. However, only a few cells of the computational grid required this time step size. Major parts could be computed with much larger time steps, some requiring only 3000 time steps. This reduced the computational demand significantly and led to much of the team's 13-fold speedup. This advancement also led to the team's simulation being the largest, longest first-principles simulation of an earthquake of this type. Due to its close collaboration with LRZ staff, the team had opportunities to use the entire SuperMUC machine for its simulations. Bader indicated that these extremely large-scale runs are invaluable for the team to gain deeper insights in its research. "There is a big difference if you run on a quarter of a machine or a full machine, as that last factor of 4 often reveals the critical bottlenecks," he said. The team's ability to take full advantage of current-generation supercomputing resources has it excited about the future. It's not necessarily important that next-generation machines offer the opportunity for the LMU-TUM researchers to run "larger" simulations -- current simulations can effectively simulate a large enough geographic area. Rather, the team is excited about the opportunity to modify the input data and run many more iterations during a set amount of computing time. "We have been doing one individual simulation, trying to accurately guess the starting configuration, such as the initial stresses and forces, but all of these are still uncertain," Bader said. "So we would like to run our simulation with many different settings to see how slight changes in the fault system or other factors would impact the study. These would be larger parameter studies, which is another layer of performance that a computer would need to provide." Gabriel also mentioned that next-generation machines will hopefully be able to simulate urgent, real-time scenarios that can help predict hazards as they relate to likely aftershock regions. The team is excited to see the next-generation architectures at LRZ and the other Gauss Centre for Supercomputing centres, the High-Performance Computing Center Stuttgart and the Jülich Supercomputing Centre. In Bader's view, the team's recent work not only represents its largest-scale simulation to date, but also the increasingly strong collaboration between the domain scientists and computational scientists in the group. "This paper has a strong seismology component and a strong HPC component," he said. "This is really a 50-50 paper for us. Our collaboration has been going nicely, and it is because it isn't about getting ours or theirs. Both groups profit, and this is really nice joint work." This work was carried out using Gauss Centre for Supercomputing resources based at the Leibniz Supercomputing Centre. | Earthquakes | 2,017
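Local time stepping is the key optimization in the record above. The toy cost model below uses synthetic element sizes and a power-of-two rate-clustering scheme (an assumed simplification; the team's production solver is more sophisticated) to show why letting each cell advance at its own CFL-limited step beats forcing the global minimum step on every cell.

```python
import math
import random

# Toy cost model for local time stepping (LTS). Each element's stable step
# is CFL-limited by its size; with global stepping, every element must use
# the smallest step in the whole mesh. Element sizes here are synthetic.
random.seed(0)
wave_speed, cfl, t_end = 6.0, 0.5, 500.0                          # km/s, -, s
sizes = [random.lognormvariate(1.0, 1.0) for _ in range(100_000)] # km

dts = [cfl * h / wave_speed for h in sizes]   # per-element stable steps
dt_min = min(dts)

# Global stepping: every element advances with dt_min.
global_updates = len(dts) * math.ceil(t_end / dt_min)

# LTS with power-of-two clusters: an element steps with dt_min * 2**k,
# where k is the largest integer keeping it within its own stable step.
lts_updates = sum(
    math.ceil(t_end / (dt_min * 2 ** int(math.log2(d / dt_min)))) for d in dts
)

print(f"element updates, global: {global_updates:.3e}")
print(f"element updates, LTS:    {lts_updates:.3e}")
print(f"speedup ~ {global_updates / lts_updates:.1f}x")
```

The more skewed the element-size distribution (as in a mesh refined around a shallowly dipping megathrust), the larger the speedup, which is consistent with the article's point that only a few cells actually needed the smallest step.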
November 9, 2017 | https://www.sciencedaily.com/releases/2017/11/171109093825.htm | Why did the Earth's ancient oceans disappear? | We think of oceans as being stable and permanent. However, they move at about the same speed as your fingernails grow. Geoscientists at CEED, University of Oslo have found a novel way of mapping the Earth's ancient oceans. | The surface of the Earth is in constant motion. New crust is formed at mid-oceanic ridges, such as the Mid-Atlantic Ridge, and older crust is destroyed. If we go millions of years back in time, the oceans and the continents of planet Earth were very different. Oceans that once existed are now buried deep inside the interior of the Earth, in the mantle. Seismic tomography uses earthquakes to image Earth's interior down to approximately 2,800 km. Models based on this technique are used to show how the surface of our planet may have looked up to 200 million years ago. Grace Shephard at the Centre for Earth Evolution and Dynamics (CEED), University of Oslo has found a simple, yet powerful way to combine images from alternative seismic tomography models. She presents the approach in a new study. "There are many different ways of creating such models, and lots of different data input can be used," explains Grace Shephard, who has been a postdoctoral researcher at CEED since she took her Ph.D. at the University of Sydney four years ago. "We wanted a quick and simple way to see which features are common across all of the models. By comparing up to 14 different models, for instance, we can visualize where they agree and thus identify what we call the most robust anomalies. This gives more accurate and more easily available information about the movements of ocean basins and continents back in time -- and the interaction between the Earth's crust and the mantle." The tomography models are used to reconstruct movements of continents and oceans. The novel and open way of displaying the models takes away some of the decision making for scientists studying the dynamics of the Earth. "With this tool, geoscientists can choose which models to use, how deep into the mantle to go, and a few other parameters," explains Shephard. "Thus, they can zoom into their area of interest. However, we must remember that the maps are only as good as the tomography models they are built upon." Grace Shephard and colleagues have also studied whether there is more agreement between the various tomography models at certain depths of the mantle. They have made discoveries that suggest more paleoseafloor can be found at around 1,000 -- 1,400 km beneath the surface than at other depths. "If these depths are translated to time -- and we presuppose that the seafloor sinks into the mantle at a rate of 1 centimeter per year -- it could mean that there was a period around 100-140 million years ago that experienced more ocean destruction. However, it could also identify a controversial region in the Earth that is more viscous, or 'sticky,' and causes sinking features to pile up, a bit like a traffic jam. These findings, and the reasons behind them, bear critical information about the surface and interior evolution of our planet," explains Shephard. To understand the evolution of the Earth, it is essential to study the subduction zones. The tectonic plates of the oceans are being subducted under the continental plates, or under other oceanic plates. Examples include the Pacific Ocean moving under Japan, and subductions within the Mediterranean region. Plate reconstruction models generally agree that about 130 million years ago, there was a peak in the amount of subduction happening. So the maps by Shephard and colleagues could provide independent evidence for this event. Reversing the evolution: Grace Shephard shows us computer animations reversing these evolutionary processes. She brings back to the surface oceans that have been buried deep inside the mantle for millions of years. It may look like a game, but it illustrates an important point: "Studying these processes in new ways opens up new questions. That is something we welcome, because we need to find out what questions to ask and what to focus on in order to understand the development of the Earth. We always have to keep in mind what is an observation and what is a model. The models need to be tested against observations, to make way for new and improved models. It is an iterative procedure." "It is a fundamental way of understanding more about our planet, the configuration of continents and oceans, climate change, mountain building, the location of precious resources, biology, etc. Lines of evidence in the past can be crucial for insight into what will happen in the future, and is critical for the interaction of society and the natural environment." "If you look at Earth from space, the distribution of continents and oceans will then look much the same, even though life, the climate and sea level may have dramatically changed. If we move even further ahead, say 10 or 100 million years, it is very hard to say how oceans may be opening and closing, but we have some clues. Some people think that the Atlantic will close, and others think the Arctic or Indian oceans will close. We can follow the rules of the past when we look to the future, but this task keeps geoscientists very busy." | Earthquakes | 2,017
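Shephard's depth-to-age rule of thumb is simple enough to verify directly; the sketch below just restates the article's 1 cm/yr sinking-rate assumption.

```python
# Convert slab depth to sinking age using the article's rule of thumb:
# seafloor sinking at ~1 cm/yr reaches depth z after z / rate years.
SINK_RATE_CM_PER_YR = 1.0

for depth_km in (1000, 1400, 2800):
    age_myr = depth_km * 1e5 / SINK_RATE_CM_PER_YR / 1e6  # km -> cm, yr -> Myr
    print(f"{depth_km:>4} km  ->  ~{age_myr:.0f} million years ago")
# 1,000-1,400 km maps to ~100-140 Myr, the proposed period of extra ocean
# destruction -- consistent with the ~130 Myr subduction peak noted above.
```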
October 26, 2017 | https://www.sciencedaily.com/releases/2017/10/171026135244.htm | Japanese earthquake zone strongly influenced by the effects of friction | The islands of the Japanese archipelago are affected both by frequent, low-magnitude earthquakes and tremors and by larger, highly destructive events. One of the largest quakes to strike Japan occurred in 1944, leading to the loss of more than 1,200 lives on the main and most populated island of Honshu. Its strength resulted from the abrupt release of plate tectonic forces, a process known as subduction, centered on an area beneath Honshu where it slides over the top of oceanic crust. | Highly destructive earthquakes caused by subduction occur because of excessive friction that develops during the sliding process, resulting in a build-up of stress. Sudden release of this stress, a condition called rupturing, leads to the violent shaking felt during an earthquake. A recent study, led by the International Institute for Carbon-Neutral Energy Research (I2CNER) at Kyushu University in Japan, has now been published. "Our understanding of the dynamic behavior of plate boundary faults has advanced," the study's lead author Takeshi Tsuji says. "Yet the factors that control the build-up of friction and stress along plate interfaces and in co-seismic zones are less established." The researchers used advanced 2D and 3D seismic profiles to reveal the detailed structure of the Nankai Trough, particularly of an ancient accretionary prism -- a large mass of rock and sediment accumulated in the trough. The added mass of this rock and sediment has impeded subduction, ultimately causing stress to build up over time. This stress build-up, and rupturing, was the root cause of the massive 1944 Tonankai earthquake and the smaller Off-Mie earthquake that struck almost the same area on April 1, 2016. "Along with evidence of frictional obstruction to subduction," Tsuji says, "the fault structure appears to have also impacted earthquake location and behavior. We found that aftershocks of the 2016 quake only occurred in front of the accretionary prism, where stress accumulation is greatest." The long-term implications of the study hinge on evidence that the pre-existing faults from the 1944 earthquake have strongly influenced the orientation and location of rupturing during the 2016 event, suggesting that large earthquakes in Japan are most likely to occur in this very same region of the Nankai Trough in the future. | Earthquakes | 2,017
October 24, 2017 | https://www.sciencedaily.com/releases/2017/10/171024141730.htm | Raton Basin earthquakes linked to oil and gas fluid injections | A rash of earthquakes in southern Colorado and northern New Mexico recorded between 2008 and 2010 was likely due to fluids pumped deep underground during oil and gas wastewater disposal, says a new University of Colorado Boulder study. | The study, which took place in the 2,200-square-mile Raton Basin along the central Colorado-northern New Mexico border, found more than 1,800 earthquakes up to magnitude 4.3 during that period, linking most to wastewater injection well activity. Such wells are used to pump water back in the ground after it has been extracted during the collection of methane gas from subterranean coal beds. One key piece of the new study was the use of hydrogeological modeling of pore pressure in what is called the "basement rock" of the Raton Basin -- rock several miles deep that underlies the oldest stratified layers. Pore pressure is the fluid pressure within rock fractures and rock pores. While two previous studies have linked earthquakes in the Raton Basin to wastewater injection wells, this is the first to show that elevated pore pressures deep underground are well above earthquake-triggering thresholds, said CU Boulder doctoral student Jenny Nakai, lead study author. The northern edges of the Raton Basin border Trinidad, Colorado, and Raton, New Mexico. "We have shown for the first time a plausible causative mechanism for these earthquakes," said Nakai of the Department of Geological Sciences. "The spatial patterns of seismicity we observed are reflected in the distribution of wastewater injection and our modeled pore pressure change." A paper on the study has been published. The Raton Basin earthquakes between 2008 and 2010 were measured by seismometers from the EarthScope USArray Transportable Array, a program funded by the National Science Foundation (NSF) to measure earthquakes and map Earth's interior across the country. The team also used seismic data from the Colorado Rockies Experiment and Seismic Transects (CREST), also funded by NSF. As part of the research, the team simulated in 3-D a 12-mile-long fault gleaned from seismicity data in the Vermejo Park region in the Raton Basin. The seismicity patterns also suggest a second, smaller fault in the Raton Basin that was active from 2008-2010. Nakai said the research team did not look at the relationship between the Raton Basin earthquakes and hydraulic fracturing, or fracking. The new study also showed the number of earthquakes in the Raton Basin correlates with the cumulative volume of wastewater injected in wells up to about 9 miles away from the individual earthquakes. There are 28 "Class II" wastewater disposal wells -- wells that are used to dispose of waste fluids associated with oil and natural gas production -- in the Raton Basin, and at least 200 million barrels of wastewater have been injected underground there by the oil and gas industry since 1994. "Basement rock is typically more brittle and fractured than the rock layers above it," said study co-author Anne Sheehan, also a fellow at CU's Cooperative Institute for Research in Environmental Sciences. "When pore pressure increases in basement rock, it can cause earthquakes." There is still a lot to learn about the Raton Basin earthquakes, said the CU Boulder researchers. While the oil and gas industry has monitored seismic activity with seismometers in the Raton Basin for years and mapped some sub-surface faults, such data are not made available to researchers or the public. The earthquake patterns in the Raton Basin are similar to other U.S. regions that have shown "induced seismicity" likely caused by wastewater injection wells, said Nakai. Previous studies involving CU Boulder showed that injection wells likely caused earthquakes near Greeley, Colorado, in Oklahoma and in the mid-continent region of the United States in recent years. | Earthquakes | 2,017
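The record above does not reproduce the study's hydrogeological model, but the idea of injection-driven pore-pressure rise can be sketched with a textbook radial-diffusion (Theis) solution. Every parameter value below is hypothetical, chosen only to show how injection rate, distance, and time enter the calculation.

```python
import numpy as np
from scipy.special import exp1   # exponential integral E1 = Theis W(u)

# Theis solution for head rise around a well injecting at constant rate:
# s(r, t) = Q / (4 pi T) * W(u),  with  u = r^2 * S / (4 * T * t).
Q = 2000.0 / 86400.0         # injection rate, m^3/s (~2,000 m^3/day, hypothetical)
T = 1e-4                     # transmissivity, m^2/s (hypothetical)
S = 1e-4                     # storativity, dimensionless (hypothetical)
t = 10.0 * 365.25 * 86400.0  # ten years of continuous injection, s

for r in (1_000.0, 5_000.0, 15_000.0):      # distance from the well, m
    u = r ** 2 * S / (4.0 * T * t)
    head = Q / (4.0 * np.pi * T) * exp1(u)  # head rise in meters of water
    print(f"r = {r / 1000:4.0f} km: ~{head:6.1f} m (~{head * 9.81e-3:.2f} MPa)")
```

Even at 15 km -- close to the ~9-mile correlation distance reported in the study -- these made-up parameters give pressure changes of a few tenths of a MPa, above the ~0.01-0.1 MPa perturbations often cited as sufficient to trigger slip on critically stressed faults.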
October 24, 2017 | https://www.sciencedaily.com/releases/2017/10/171024133733.htm | Supercomputers help scientists improve seismic forecasts for California | Southern California has the highest earthquake risk of any region in the U.S., but exactly how risky and where the greatest risks lie remains an open question. | Earthquakes occur infrequently and depend on complex geological factors deep underground, making them hard to reliably predict in advance. For that reason, forecasting earthquakes means relying on massive computer models and multifaceted simulations, which recreate the rock physics and regional geology and require big supercomputers to execute. In June 2017, a team of researchers from the U.S. Geological Survey and the Southern California Earthquake Center (SCEC) released a major paper. The results relied on computations performed on the original Stampede supercomputer at the Texas Advanced Computing Center, resources at the University of Southern California Center for High-Performance Computing, as well as the newly deployed Stampede2 supercomputer, to which the research team had early access. (Stampede1 and Stampede2 are supported by grants from the National Science Foundation.) "High-performance computing on TACC's Stampede system, and during the early user period of Stampede2, allowed us to create what is, by all measures, the most advanced earthquake forecast in the world," said Thomas H. Jordan, director of the Southern California Earthquake Center and one of the lead authors on the paper. The new forecast is the first fault-based model to provide self-consistent rupture probabilities from the very short-term -- over a period of less than an hour -- to the very long term -- up to more than a century. It is also the first model capable of evaluating the short-term hazards that result from multi-event sequences of complex faulting. To derive the model, the researchers ran 250,000 rupture scenarios of the state of California, vastly more than in the previous model, which simulated 8,000 ruptures. Among its novel findings, the researchers' simulations showed that in the week following a magnitude 7.0 earthquake, the likelihood of another magnitude 7.0 quake would be up to 300 times greater than the week beforehand. This scenario of 'cascading' ruptures was demonstrated in the 2002 magnitude 7.9 Denali, Alaska, and the 2016 magnitude 7.8 Kaikoura, New Zealand earthquakes, according to David Jacobson and Ross Stein of Temblor. The dramatic increase in the likelihood of powerful aftershocks is due to the inclusion of a new class of models that assess short-term changes in seismic hazard based on what is known about earthquake clustering and aftershock excitations. These factors have never been used in a comprehensive, statewide model like this one. The current model also takes into account the likelihood of ruptures jumping from one fault to a nearby one, which has been observed in California's highly interconnected fault system. Based on these and other new factors, the new model increases the likelihood of powerful aftershocks but downgrades the predicted frequency of earthquakes between magnitude 6.5 and 7.0, which did not match historical records. Importantly, UCERF3 can be updated with observed seismicity -- real-time data based on earthquakes in action -- to capture the static or dynamic triggering effects that play out during a particular sequence of events. The framework is adaptable to many other continental fault systems, and the short-term component might be applicable to the forecasting of minor earthquakes and tremors that are caused by human activity. The impact of such an improved model goes beyond the fundamental scientific improvement it represents. It has the potential to impact building codes, insurance rates, and the state's response to a powerful earthquake. Said Jordan, "The U.S. Geological Survey has included UCERF3 as the California component of the National Seismic Hazard Model, and the model is being evaluated for use in operational earthquake forecasting on timescales from hours to decades." In addition to forecasting the likelihood of an earthquake, models like UCERF3 help predict the associated costs of earthquakes in the region. In recent months, the researchers used UCERF3 and Stampede2 to create a prototype operational loss model, which they described in a paper posted online to Earthquake Spectra in August. The model estimates the statewide financial losses to the region (the costs to repair buildings and other damages) caused by an earthquake and its aftershocks. The risk metric is based on a vulnerability function and the total replacement cost of asset types in a given census tract. The model found that the expected loss per year when averaged over many years would be $4.0 billion statewide. More importantly, the model was able to quantify how expected losses change with time due to recent seismic activity. For example, the expected losses in a year following a magnitude 7.1 main shock spike to $24 billion due to potentially damaging aftershocks, a factor of six greater than during "normal" times. Being able to quantify such fluctuations will enable financial institutions, such as earthquake insurance providers, to adjust their business decisions accordingly. "It's all about providing tools that will help make society more resilient to damaging earthquake sequences," said Ned Field of the USGS, another lead author of the two studies. Though there's a great deal of uncertainty in both the seismicity and the loss estimates, the model is an important step at quantifying earthquake risk and potential devastation in the region, thereby helping decision-makers determine whether and how to respond. | Earthquakes | 2,017
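The loss metric described at the end of this record is, at its core, a rate-weighted sum over rupture scenarios. The sketch below shows that structure with invented rates and losses; the real model sums over UCERF3's 250,000 scenarios with census-tract vulnerability functions, and its published figures are $4.0B/yr long-term and ~$24B in the year after a magnitude 7.1.

```python
# Expected annual loss as a rate-weighted sum over rupture scenarios:
# E[loss/yr] = sum_i rate_i * loss_i. All numbers below are invented.
scenarios = [
    # (annual occurrence rate, statewide loss in $ billions)
    (1 / 50.0,   20.0),   # moderate urban earthquake
    (1 / 150.0, 110.0),   # large multi-fault rupture
    (1 / 400.0, 290.0),   # rare "big one"
]

base = sum(rate * loss for rate, loss in scenarios)
print(f"long-term expected loss ~ ${base:.1f}B per year")

# Operational forecasting: after a main shock, short-term scenario rates
# are elevated by aftershock triggering, so the same sum spikes.
AFTERSHOCK_RATE_GAIN = 6.0   # illustrative factor, echoing the article
print(f"year following a main shock ~ ${base * AFTERSHOCK_RATE_GAIN:.1f}B")
```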
October 23, 2017 | https://www.sciencedaily.com/releases/2017/10/171023101803.htm | Western US Quake? Fifty simulations of the 'Really Big One' show how a 9.0 Cascadia earthquake could play out | One of the worst nightmares for many Pacific Northwest residents is a huge earthquake along the offshore Cascadia Subduction Zone, which would unleash damaging and likely deadly shaking in coastal Washington, Oregon, British Columbia and northern California. | The last time this happened was in 1700, before seismic instruments were around to record the event. So what will happen when it ruptures next is largely unknown. A University of Washington research project, to be presented Oct. 24 at the Geological Society of America's annual meeting in Seattle, simulates 50 different ways that a magnitude-9.0 earthquake on the Cascadia subduction zone could unfold. "There had been just a handful of detailed simulations of a magnitude-9 Cascadia earthquake, and it was hard to know if they were showing the full range," said Erin Wirth, who led the project as a UW postdoctoral researcher in Earth and space sciences. "With just a few simulations you didn't know if you were seeing a best-case, a worst-case or an average scenario. This project has really allowed us to be more confident in saying that we're seeing the full range of possibilities." Off the Oregon and Washington coast, the Juan de Fuca oceanic plate is slowly moving under the North American plate. Geological clues show that it last jolted and unleashed a major earthquake in 1700, and that it does so roughly once every 500 years. It could happen any day. Wirth's project ran simulations using different combinations for three key factors: the epicenter of the earthquake; how far inland the earthquake will rupture; and which sections of the fault will generate the strongest shaking. Results show that the intensity of shaking can be less for Seattle if the epicenter is fairly close to beneath the city. From that starting point, seismic waves will radiate away from Seattle, sending the biggest shakes in the direction of travel of the rupture. "Surprisingly, Seattle experiences less severe shaking if the epicenter is located just beneath the tip of northwest Washington," Wirth said. "The reason is because the rupture is propagating away from Seattle, so it's most affecting sites offshore. But when the epicenter is located pretty far offshore, the rupture travels inland and all of that strong ground shaking piles up on its way to Seattle, to make the shaking in Seattle much stronger." The research effort began by establishing which factors most influence the pattern of ground shaking during a Cascadia earthquake. One, of course, is the epicenter, or more specifically the "hypocenter," which locates the earthquake's starting point in three-dimensional space. Another factor they found to be important is how far inland the fault slips. A magnitude-9.0 earthquake would likely give way along the whole north-south extent of the subduction zone, but it's not well known how far east the shake-producing area would extend, approaching the area beneath major cities such as Seattle and Portland. The third factor is a new idea relating to a subduction zone's stickiness. Earthquake researchers have become aware of the importance of "sticky points," or areas between the plates that can catch and generate more shaking. This is still an area of current research, but comparisons of different seismic stations during the 2010 Chile earthquake and the 2011 Tohoku earthquake show that some parts of the fault released more strong shaking than others. Wirth simulated a magnitude-9.0 earthquake, about the middle of the range of estimates for the magnitude of the 1700 earthquake. Her 50 simulations used variables spanning realistic values for the depth of the slip, and had randomly placed hypocenters and sticky points. The high-resolution simulations were run on supercomputers at the Pacific Northwest National Laboratory and the University of Texas, Austin. Overall, the results confirm that coastal areas would be hardest hit, and locations in sediment-filled basins like downtown Seattle would shake more than hard, rocky mountaintops. But within that general framework, the picture can vary a lot; depending on the scenario, the intensity of shaking can vary by a factor of 10. But none of the pictures is rosy. "We are finding large amplification of ground shaking by the Seattle basin," said collaborator Art Frankel, a U.S. Geological Survey seismologist and affiliate faculty member at the UW. "The average duration of strong shaking in Seattle is about 100 seconds, about four times as long as from the 2001 Nisqually earthquake." The research was done as part of the M9 Project, a National Science Foundation-funded effort to figure out what a magnitude-9 earthquake might look like in the Pacific Northwest and how people can prepare. Two publications are being reviewed by the USGS, and engineers are already using the simulation results to assess how tall buildings in Seattle might respond to the predicted pattern of shaking. As a new employee of the USGS, Wirth will now use geological clues to narrow down the possible earthquake scenarios. "We've identified what parameters we think are important," Wirth said. "I think there's a future in using geologic evidence to constrain these parameters, and maybe improve our estimate of seismic hazard in the Pacific Northwest." | Earthquakes | 2,017
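A schematic of how 50 scenarios spanning the study's three factors might be drawn is shown below. The parameter names and ranges are placeholders invented for illustration; the M9 Project's actual rupture parameterization is richer than this.

```python
import random

# Draw rupture scenarios varying the three factors the study identified:
# hypocenter, down-dip (inland) rupture limit, and "sticky point" patches.
# All ranges are invented placeholders, not the project's real values.
random.seed(42)

def draw_scenario():
    return {
        "hypocenter_along_strike_km": random.uniform(0.0, 1000.0),
        "hypocenter_depth_km": random.uniform(5.0, 30.0),
        "downdip_limit_km": random.uniform(40.0, 120.0),
        "sticky_patches": [
            (random.uniform(0.0, 1000.0), random.uniform(0.0, 100.0))
            for _ in range(random.randint(2, 6))
        ],
    }

scenarios = [draw_scenario() for _ in range(50)]
print(scenarios[0])
# Each dictionary would seed one deterministic wave-propagation run, and
# the ensemble of 50 synthetic shaking maps brackets the plausible range.
```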
October 23, 2017 | https://www.sciencedaily.com/releases/2017/10/171023094627.htm | Geophysicist finds teaching opportunities in movie mistakes | Few scientists regard the 1997 movie Volcano, in which flaming magma suddenly spews from the La Brea tar pits and incinerates much of Los Angeles, as a means to foster scientific literacy. After all, Southern California has no magma to spew. But geophysicist Seth Stein sees it differently. | "Scientists have a choice," said Stein, William Deering Professor in the Department of Earth & Planetary Sciences at Northwestern University. "We can complain about all the horrible mistakes that are in a lot of movies, or we can say, "hey, this is a really great opportunity to get the class interested.'"Stein argues that scientific errors in movies, from impossibly large tsunamis to caverns in Earth's mantle, can be used to teach scientific lessons and foster a healthy sense of skepticism. By training students to spot the errors and seek out true explanations, Stein often incorporates scientifically disastrous disaster movies into his classroom lessons on tectonics, Earth's interior, and geo-physical data analysis.Stein and his coauthors, geologists Reece Elling, Amir Salaree, and geophysicist Michael Wysession of Washington University, plan to share their approach at the Geological Society of America's Annual Meeting on Monday, 23 October, in Seattle, Washington.There's no shortage of scientific unlikelihood to choose from. In the 2003 movie The Core, for example, a team of "terranauts" venture into Earth's core inside a vessel made of "unobtanium," where they encounter gaping voids in our planet's interior and crash through fields of sparkling, jagged minerals. The 2004 television miniseries "10.5" portrays destruction brought on by an earthquake of an absurdly large magnitude.By identifying these errors and learning why they're inaccurate (caverns and the minerals shown simply couldn't exist at those depths and pressures, nor would an M10.5 earthquake strike our planet), Stein said he offers students an entertaining way to connect with their inner skeptic: a vital trait for young scientists in training."Scientists are supposed to be skeptical," said Stein. "We're not supposed to believe what authorities tell us. We're supposed to question and challenge everything."Stein said the errors extend beyond film. He's spotted inaccuracies in widely-used educational software, science museum animations, popular geology textbooks, and even drafting errors in his own books.Among the most pervasive movie errors, Stein said, is the ubiquitous "lack of geological constraints on the world," and that "you can just have volcanoes popping up everywhere." It looks as if Stein will receive more teaching opportunities just before GSA 2017, as the new movie Geostorm, in which a weather-controlling satellite system turns nefarious, instantly freezing an entire Afghani village and unleashing packs of tornadoes on unsuspecting civilians, comes out 20 October.Yet Stein's presentation comes at a time when movie and television crews often hire scientists as consultants, who do everything from writing equations on background classroom boards to reviewing scripts for gross scientific inaccuracies. Stein refers to The Martian, whose production was thoroughly informed by NASA specialists, as one of the more accurate films of late."There was only one thing they had to fudge," said Stein, "which was the initial scene where the mission was destroyed by the wind storm. 
It's true that the wind speeds on Mars are very high, but because the density of the atmosphere is so low it wouldn't actually matter." | Earthquakes | 2,017 |
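Stein's closing point about The Martian can be checked with a back-of-the-envelope calculation: the force wind exerts scales with dynamic pressure, q = ½ρv², and Mars' surface atmosphere is far less dense than Earth's. A minimal sketch in Python, using approximate textbook densities:

```python
# Back-of-the-envelope check of the Martian wind-storm scene.
# Dynamic pressure q = 0.5 * rho * v^2 governs the force wind exerts.
RHO_EARTH = 1.225  # kg/m^3, sea-level air density on Earth (approx.)
RHO_MARS = 0.020   # kg/m^3, typical surface air density on Mars (approx.)

def dynamic_pressure(rho, v):
    """Dynamic pressure in pascals for air density rho (kg/m^3), speed v (m/s)."""
    return 0.5 * rho * v**2

v_mars = 27.0  # m/s, a strong Martian gust (~60 mph)
q_mars = dynamic_pressure(RHO_MARS, v_mars)

# Earth wind speed that produces the same dynamic pressure:
v_earth_equiv = (2 * q_mars / RHO_EARTH) ** 0.5
print(f"Mars gust: {q_mars:.1f} Pa, feels like a {v_earth_equiv:.1f} m/s "
      f"(~{v_earth_equiv * 2.237:.0f} mph) breeze on Earth")
```

Running this shows that even a strong Martian gust exerts roughly the dynamic pressure of a gentle Earth breeze, which is why the storm scene counts as a fudge.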
October 17, 2017 | https://www.sciencedaily.com/releases/2017/10/171017114344.htm | Scientists determine source of world's largest mud eruption | On May 29, 2006, mud started erupting from several sites on the Indonesian island of Java. Boiling mud, water, rocks and gas poured from newly created vents in the ground, burying entire towns and compelling many Indonesians to flee. By September 2006, the largest eruption site reached a peak, and enough mud gushed to the surface to fill 72 Olympic-sized swimming pools daily. | Indonesians frantically built levees to contain the mud and save the surrounding settlements and rice fields from being covered. The eruption, known as Lusi, is still ongoing and has become the most destructive ongoing mud eruption in history. The relentless sea of mud has buried some villages 40 meters (130 feet) deep and forced nearly 60,000 people from their homes. The volcano still periodically spurts jets of rocks and gas into the air like a geyser. It is now oozing around 80,000 cubic meters (3 million cubic feet) of mud each day -- enough to fill 32 Olympic-sized pools.

Now, more than 11 years after it first erupted, researchers may have figured out why the mudflows haven't stopped: deep underground, Lusi is connected to a nearby volcanic system.

In a new study, researchers applied a technique geophysicists use to map Earth's interior to image the area beneath Lusi. The images show that the conduit supplying mud to Lusi is connected to the magma chambers of the nearby Arjuno-Welirang volcanic complex through a system of faults 6 kilometers (4 miles) below the surface.

Volcanoes can be connected to each other deep underground, and scientists suspected Lusi and the Arjuno-Welirang volcanic complex were somehow linked, because previous research showed some of the gas Lusi expels is typically found in magma. But no one had yet shown that Lusi is physically connected to Arjuno-Welirang.

The researchers discovered that the scorching magma from the Arjuno-Welirang volcano has essentially been "baking" the organic-rich sediments underneath Lusi. This process builds pressure by generating gas that becomes trapped below the surface. In Lusi's case, the pressure grew until an earthquake triggered the eruption.

Studying the connection between these two systems could help scientists better understand how volcanic systems evolve, whether they erupt magma, mud or hydrothermal fluids. "We clearly show the evidence that the two systems are connected at depth," said Adriano Mazzini, a geoscientist at CEED, University of Oslo, and lead author of the new study.

Java is part of a volcanic island arc, formed when one tectonic plate subducts below another. As the island rose upward out of the sea, volcanoes formed along its spine, with basins of shallow water between them. Lusi's mud comes from sediments laid down in those basins while the island was still partially submerged.

Mazzini has been studying Lusi since soon after the eruption began. Two years ago, the study's authors installed a network of 31 seismometers around Lusi and the neighboring volcanic complex. Researchers typically use seismometers to measure ground shaking during earthquakes, but scientists can also use them to create three-dimensional images of the areas underneath volcanoes.

Using 10 months of data recorded by the seismometers, Mazzini and his colleagues imaged the area below Lusi and the surrounding volcanoes.
The images showed a tunnel protruding from the northernmost of Arjuno-Welirang's magma chambers into the sedimentary basin where Lusi is located. This allows magma and hydrothermal fluids originating in the mantle to intrude into Lusi's sediments, triggering reactions that create gas and generate high pressure below Earth's surface. Any perturbation -- like an earthquake -- can then trigger this system to erupt.

"It's just a matter of reactivating or opening these faults and whatever overpressure you have gathered in the subsurface will inevitably want to escape and come to the surface, and you have a manifestation on the surface, and that is Lusi," Mazzini said.

Mazzini and other researchers suspect a magnitude 6.3 earthquake that struck Java two days before the mud started flowing triggered the Lusi eruption by reactivating the fault system that connects it to Arjuno-Welirang.

By allowing magma to flow into Lusi's sedimentary basin, the fault system could be an avenue for moving the entire volcanic system northward, said Stephen Miller, a professor of geodynamics at the University of Neuchâtel in Neuchâtel, Switzerland, who was not connected to the study.

"It looks like this might be the initial stages of this march forward of this volcanic arc," Miller said. "Ultimately, it's bringing all this heat over toward Lusi, which is driving that continuous system."

Mazzini and other scientists are unsure how much longer Lusi will continue to erupt. While mud volcanoes are fairly common on Java, Lusi is a hybrid between a mud volcano and a hydrothermal vent, and its connection to the nearby volcano will keep the sediments cooking for years to come.

"So what it means to me is that Lusi's not going to stop anytime soon," Miller said. | Earthquakes | 2,017
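The imaging behind these results, seismic travel-time tomography, reduces to a linear inverse problem: observed travel times constrain the slowness (inverse velocity) of each cell a ray crosses. The Python toy below illustrates the idea with two cells and three rays; it is not the study's method or data, and every number is invented for illustration.

```python
import numpy as np

# Toy travel-time tomography: two subsurface cells, three crossing rays.
# t = G @ s, where s is slowness (s/km) per cell and G holds the
# path length (km) each ray spends in each cell. All values invented.
G = np.array([
    [10.0,  0.0],   # ray 1 travels only through cell 1
    [ 0.0, 10.0],   # ray 2 travels only through cell 2
    [ 5.0,  5.0],   # ray 3 crosses both cells
])
t_obs = np.array([2.0, 3.3, 2.7])  # observed travel times in seconds

# Least-squares estimate of slowness, then convert to velocity.
s_est, *_ = np.linalg.lstsq(G, t_obs, rcond=None)
v_est = 1.0 / s_est
print("estimated velocities (km/s):", np.round(v_est, 2))
```

In a real survey, thousands of rays crossing thousands of cells are inverted the same way, and low-velocity cells can mark hot, fluid-rich rock such as the conduit found beneath Lusi.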
October 16, 2017 | https://www.sciencedaily.com/releases/2017/10/171016081950.htm | Waves in lakes make waves in the Earth | Beneath the peaceful rolling waves of a lake is a rumble, imperceptible to all but seismometers, that ripples into the earth like the waves ripple along the shore. | In a newly published study, University of Utah researchers describe the tiny seismic signals that waves on lakes send into the ground. "It's kind of a new phenomenon," says Keith Koper, director of the University of Utah Seismograph Stations and co-author of the study. "We don't really know how it's created."

Seismologists have long known that wind-driven ocean waves generate small seismic waves, called microseisms. These microseisms are generated as waves drag across the ocean floor or interact with each other. They are part of the background seismic noise in coastal areas.

"We've recently found that the waves on lakes actually generate these microseisms too," Koper says. Lake microseisms had previously been recorded near the Great Lakes, Canada's Great Slave Lake and Utah's own Great Salt Lake. In the paper, Koper and colleagues present additional observations from Yellowstone Lake and three lakes in China, exploring the characteristics of each lake's microseisms.

Koper says the tremors are very small. "You wouldn't be able to feel 'em, that's for sure," he says. But by averaging seismic signals over a long period -- six months, for example -- a consistent signal emerges.

The signal can be used to produce what Koper calls a "CT scan of the Earth," or seismic tomography. Seismic waves travel through different geological materials at different speeds, so observing how waves change as they emanate from a source can reveal subsurface geology. Researchers can create these seismic sources with methods like a hammer on a metal plate, an explosion, or a specially outfitted truck with a vibrating plate. Lakes, Koper says, provide a natural, regular source. "It would take quite a bit of effort and work to generate this level of energy."

The area that could be explored using lake microseisms is limited to the region close to a lake, but Koper writes that microseisms emanating from the Great Salt Lake might reach far enough to visualize how seismic waves would move beneath Salt Lake City, which sits on the Wasatch Fault, in a major earthquake. Likewise, Lake Tahoe microseisms could extend to Reno, Nevada, and Lake Michigan could provide microseisms to image the geology beneath the Chicago area.

Microseisms can perform another function, says Aini Mokhdhari, a senior majoring in geology. Because the tremors are caused by wind-driven waves, microseisms cease when a lake freezes over in winter and resume when it thaws in the spring. Thus, rather than relying on satellite or eyewitness observations, lake freezing and thawing could be monitored by an autonomous seismometer.

Mokhdhari looked at microseismic data from Yellowstone Lake, a well-observed lake for which the freezing and thawing dates are known. "We compare the data we got from the seismograph to see if it's the same," she says. "So far it is." Seismological observations may not be needed at Yellowstone Lake, but they could be useful for monitoring more remote lakes for long-term changes in ice-cover duration. Mokhdhari will present the results of her work on lake microseisms at the Fall Meeting of the American Geophysical Union, to be held Dec. 11-15 in New Orleans.

Next summer, Mokhdhari and Koper will join colleagues in a further seismic study of Yellowstone Lake.
They'll place an array of small seismometers called geophones around the perimeter of the lake and an array of special waterproof seismometers on the lake floor. Additionally, they will use a buoy on the lake to measure wind and wave conditions. Their colleagues are looking to understand the hydrothermal vents in Yellowstone Lake, but Mokhdhari and Koper are much more interested in capturing microseisms from all angles.

"If we can record at the same time on land and underwater," Koper says, "we can get a better idea of how these things are generated." | Earthquakes | 2,017
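Both techniques in this article, averaging noisy records over long windows until a consistent microseism signal emerges and flagging freeze-over when that signal vanishes, come down to tracking long-window noise amplitude. A minimal sketch follows, with synthetic data standing in for a real seismometer feed; the window length, threshold, and freeze dates are arbitrary choices, not values from the study.

```python
import random
import math

random.seed(0)

# Synthetic daily microseism amplitudes for one year: open water produces
# noise; a frozen lake (days 0-60 and 330-365, say) goes quiet.
def daily_amplitude(day):
    frozen = day < 60 or day >= 330    # invented freeze season
    base = 0.1 if frozen else 1.0      # arbitrary units
    return abs(random.gauss(base, 0.1 * base))

amps = [daily_amplitude(d) for d in range(365)]

# 30-day running RMS: long-window averaging makes the weak, consistent
# microseism signal stand out from day-to-day fluctuations.
def running_rms(x, window=30):
    out = []
    for i in range(len(x) - window + 1):
        chunk = x[i:i + window]
        out.append(math.sqrt(sum(v * v for v in chunk) / window))
    return out

rms = running_rms(amps)

# Flag freeze-over when the smoothed amplitude drops below a threshold.
THRESHOLD = 0.5  # arbitrary; would be calibrated against known freeze dates
frozen_windows = [i for i, v in enumerate(rms) if v < THRESHOLD]
print(f"{len(frozen_windows)} of {len(rms)} windows look frozen")
```

With real data, the threshold would be calibrated against known freeze and thaw dates, much as Mokhdhari checked seismic records against the observed dates at Yellowstone Lake.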