Date | Link | Title | Summary | Body | Category | Year |
---|---|---|---|---|---|---|
July 8, 2020 | https://www.sciencedaily.com/releases/2020/07/200708110010.htm | Study could rewrite Earth's history | Curtin University-led research has found new evidence to suggest that the Earth's first continents were not formed by subduction in a modern-like plate tectonics environment as previously thought, and instead may have been created by an entirely different process. | The findings were published in a scientific journal. Lead author Dr Luc-Serge Doucet, from the Earth Dynamics Research Group in Curtin's School of Earth and Planetary Sciences, said the first continents were formed early in Earth's history more than three billion years ago, but how they were formed is still open to debate. "Previous research has suggested that the first supercontinents formed through subduction and plate tectonics, which is when the Earth's plates move under one another shaping the mountains and oceans," Dr Doucet said. "Our research found that the chemical makeup of the rock fragments was not consistent with what we would usually see when subduction occurs. If the continents were formed through subduction and plate tectonics we would expect the ratio of iron and zinc isotopes to be either very high or very low, but our analyses instead found the ratio of isotopes was similar to that found in non-subduction rocks." Dr Doucet said the team used a relatively new technique known as the non-traditional stable isotope method, which has been used to pinpoint the processes that formed continental and mantle rocks. "Our research provides a new, but unknown theory as to how the Earth's continents formed more than three billion years ago. Further research will be needed to determine what the unknown explanation is," Dr Doucet said. The research was co-authored by researchers from Curtin's Earth Dynamics Research Group, Université Libre de Bruxelles in Belgium, Institute for Geochemistry and Petrology in Switzerland, and Université de Montpellier in France. | Earthquakes | 2,020 |
July 6, 2020 | https://www.sciencedaily.com/releases/2020/07/200706101837.htm | The sixth sense of animals: An early warning system for earthquakes? | Even today, nobody can reliably predict when and where an earthquake will occur. However, eyewitnesses have repeatedly reported that animals behave unusually before an earthquake. In an international cooperation project, researchers from the Max Planck Institute of Animal Behavior in Konstanz/Radolfzell and the Cluster of Excellence Centre for the Advanced Study of Collective Behaviour at the University of Konstanz, have investigated whether cows, sheep, and dogs can actually detect early signs of earthquakes. To do so, they attached sensors to the animals in an earthquake-prone area in Northern Italy and recorded their movements over several months. The movement data show that the animals were unusually restless in the hours before the earthquakes. The closer the animals were to the epicentre of the impending quake, the earlier they started behaving unusually. The movement profiles of different animal species in different regions could therefore provide clues with respect to the place and time of an impending earthquake. | Experts disagree about whether earthquakes can be exactly predicted. Nevertheless, animals seem to sense the impending danger hours in advance. For example, there are reports that wild animals leave their sleeping and nesting places immediately before strong quakes and that pets become restless. However, these anecdotal accounts often do not stand up to scientific scrutiny because the definition of unusual behaviour is often too unclear and the observation period too short. Other factors could also explain the behaviour of the animals.In order to be able to use animal activity patterns as a kind of early warning system for earthquakes, the animals would have to show measurable behavioural changes. Moreover, if they do indeed react to weak physical changes immediately before an earthquake, they should react more strongly the closer they are to the epicentre of the quake.In an international cooperation project, researchers from the Max Planck Institute of Animal Behavior in Radolfzell/Konstanz and the Centre for the Advanced Study of Collective Behaviour, a Cluster of Excellence at the University of Konstanz, have investigated whether animals really do this. On an Italian farm in an earthquake-prone area, they attached accelerometers to the collars of six cows, five sheep, and two dogs that had already displayed unusual behaviour before earthquakes. The researchers then recorded their movements continuously over several months. During this period, official authorities reported about 18,000 earthquakes in the region. In addition to many small and hardly noticeable quakes, there were also 12 earthquakes with a strength of 4 or higher on the Richter scale.The researchers then selected the quakes that triggered statistically relevant earth movements on the farm. These included strong quakes up to 28 km away as well as weaker quakes, the epicentres of which were very close to the farm. However, instead of explicitly looking for abnormal behaviours in the period before these events, the researchers chose a more cautious approach. They first marked all behavioural changes of the animals that were unusual according to objective, statistical criteria. 
"In this way, we ensure that we not only establish correlations retrospectively but also that we really do have a model that can be used for predictions," says Martin Wikelski, director at the Max Planck Institute of Animal Behavior and Principal Investigator at the Centre for the Advanced Study of Collective Behaviour.The data -- measured as body acceleration of each farm animal (indicating activity level) -- were evaluated using statistical models drawn from financial econometrics. "Because every animal reacts differently in size, speed and according to species, the animal data resemble data on heterogenous financial investors," explains co-author Winfried Pohlmeier, Professor of Econometrics at the University of Konstanz and Principal Investigator at the Centre for the Advanced Study of Collective Behaviour. The scientists also considered other disturbance factors such as natural changes in animal activity patterns over the day.In this way, the researchers discovered unusual behavioural patterns up to 20 hours before an earthquake. "The closer the animals were to the epicentre of the impending shock, the earlier they changed their behaviour. This is exactly what you would expect when physical changes occur more frequently at the epicentre of the impending earthquake and become weaker with increasing distance," explains Wikelski. However, this effect was clear only when the researchers looked at all animals together. "Collectively, the animals seem to show abilities that are not so easily recognized on an individual level," says Wikelski.It is still unclear how animals can sense impending earthquakes. Animals may sense the ionization of the air caused by the large rock pressures in earthquake zones with their fur. It is also conceivable that animals can smell gases released from quartz crystals before an earthquake.Real-time data measured by the researchers and recorded since December 2019 show what an animal earthquake early warning system could look like: a chip on the collar sends the movement data to a central computer every three minutes. This triggers a warning signal if it registers a significantly increased activity of the animals for at least 45 minutes.The researchers have once received such a warning. "Three hours later, a small quake shook the region," says Wikelski. "The epicentre was directly below the stables of the animals."However, before the behaviour of animals can be used to predict earthquakes, researchers need to observe a larger number of animals over longer periods of time in different earthquake zones around the world. For this, they want to use the global animal observation system Icarus on the International Space Station ISS, which will start its scientific operation in a few weeks.Icarus, a scientific project directed by Martin Wikelski, is a joint project funded and carried out by the German Aerospace Center (DLR) and the Russian space agency Roskosmos and is supported by the European Space Agency (ESA). | Earthquakes | 2,020 |
July 2, 2020 | https://www.sciencedaily.com/releases/2020/07/200702113658.htm | Typhoon changed earthquake patterns | The Earth's crust is under constant stress. Every now and then this stress is discharged in heavy earthquakes, mostly caused by the slow movement of Earth's crustal plates. There is, however, another influencing factor that has received little attention so far: intensive erosion can temporarily change the earthquake activity (seismicity) of a region significantly. This has now been shown for Taiwan by researchers from the GFZ German Research Centre for Geosciences in cooperation with international colleagues. They report on this in the journal | The island in the western Pacific Ocean is anyway one of the most tectonically active regions in the world, as the Philippine Sea Plate collides with the edge of the Asian continent. 11 years ago, Typhoon Morakot reached the coast of Taiwan. This tropical cyclone is considered the one of the worst in Taiwan's recorded history.Within only three days in August 2009, three thousand litres of rain fell per square metre. As a comparison, Berlin and Brandenburg receive an average of around 550 liters per square meter in one year. The water masses caused catastrophic flooding and widespread landsliding. More than 600 people died and the immediate economic damage amounted to the equivalent of around 3 billion euros.The international team led by Philippe Steer of the University of Rennes, France, evaluated the earthquakes following this erosion event statistically. They showed that there were significantly more small-magnitude and shallow earthquakes during the 2.5 years after typhoon Morakot than before, and that this change occurred only in the area showing extensive erosion. GFZ researcher and senior author Niels Hovius says: "We explain this change in seismicity by an increase in crustal stresses at shallow depth, less than 15 kilometres, in conjunction with surface erosion." The numerous landslides have moved enormous loads, rivers transported the material from the devastated regions. "The progressive removal of these loads changes the state of the stress in the upper part of the Earth's crust to such an extent that there are more earthquakes on thrust faults," explains Hovius.So-called active mountain ranges, such as those found in Taiwan, are characterized by "thrust faults" in the underground, where one unit of rocks moves up and over another unit. The rock breaks when the stress becomes too great. Usually it is the continuous pressure of the moving and interlocking crustal plates that causes faults to move. The resulting earthquakes in turn often cause landslides and massively increased erosion. The work of the GFZ researchers and their colleagues now shows for the first time that the reverse is also possible: massive erosion influences seismicity -- and does so in a geological instant. Niels Hovius: "Surface processes and tectonics are connected in the blink of an eye." The researcher continues: "Earthquakes are among the most dangerous and destructive natural hazards. Better understanding earthquake triggering by tectonics and by external processes is crucial for a more realistic assessment of earthquake hazards, especially in densely populated regions." | Earthquakes | 2,020 |
July 2, 2020 | https://www.sciencedaily.com/releases/2020/07/200702113713.htm | Cause of abnormal groundwater rise after large earthquake | Increases in groundwater levels and volumes after large earthquakes have been observed around the world, but the details of this process have remained unclear due to a lack of groundwater data directly before and after an earthquake strikes. Fortunately, researchers from Kumamoto and Kwansei Gakuin Universities (Japan) and UC Berkley (US) realized that they had a unique research opportunity to analyze groundwater level changes around Kumamoto City after large earthquakes struck the area in 2016 . | Changes in the hydrological environment after an earthquake, like ponds or wells drying-up, the sudden appearance of running water, or a rise in water levels have been recorded since Roman times. Various theories have been proposed for the cause of such changes, such as fluctuations in pore water pressure (the pressure of groundwater held in the pores or gaps of rocks and soil), increased water permeability, and water movement through new cracks. To identify the actual cause, data must be collected from observation sites in wells, water sources, and rivers. However, especially in the case of inland earthquakes, it is generally rare for these sites to be spatiotemporally arranged in an area where a large earthquake has occurred. Additionally, it is even rarer to have enough data to compare before and after the disaster. These difficulties have been a roadblock to obtaining a clear picture of how hydrological environments change after earthquakes.Kumamoto City, on the southern Japanese island of Kyushu, is famous for its water. Nearly 100% of the city's drinking water is sourced from groundwater in the area so there are many observation wells in the area that continuously record water level and quality data. In the early morning (Japan time) of April 16, 2016, a magnitude 7.0 earthquake struck the city which resulted in a wealth of groundwater data both before and after the earthquake. Kumamoto University researchers recognized this unique opportunity to assess how earthquakes can change hydrological environments in more detail than ever before, so they established an international collaboration to study the event.An abnormal rise in groundwater level occurred after the main shock and was particularly noticeable in the recharge area of the groundwater flow system. The water levels peaked within a year after the main shock at around 10 meters and, although it has calmed down thereafter, water levels were still high more than three years later. This was thought to be due to an inflow of water from a place not part of the pre-earthquake hydrological cycle, so researchers attempted to determine the sources by using stable isotope ratios of water.The stable isotope ratios of water on Earth's surface change slightly with various processes (evaporation, condensation, etc.) so they become unique marker values depending on location. These markers make it possible to determine the processes that affected a water sample as well as its source.A comparison of the before-and-after sets of stable isotope ratios revealed that, prior to the earthquake, groundwater in the Kumamoto City area came mainly from low-elevation mountain aquifers, soil water in recharge areas, and seepage from the central Shirakawa river area. After the earthquake, the researchers believe that seismic fractures on the west side of Mt. 
Aso increased the permeability of the mountain aquifer, which released groundwater toward the recharge area of the flow system and increased water levels. Furthermore, groundwater levels in the outflow area that had dropped immediately after the main shock were nearly restored within just one year. "Our research is the first to capture the hydrological environment changes caused by a large earthquake in detail," said study leader Associate Professor Takahiro Hosono. "The phenomenon we discovered can occur anywhere on Earth in areas with climate and geological conditions similar to Kumamoto. We hope our research will be useful both for academics and the establishment of guidelines for regional water use in a disaster." | Earthquakes | 2,020 |
June 29, 2020 | https://www.sciencedaily.com/releases/2020/06/200629164145.htm | Researchers catch a wave to determine how forces control granular material properties | Stress wave propagation through grainy, or granular, materials is important for detecting the magnitude of earthquakes, locating oil and gas reservoirs, designing acoustic insulation and designing materials for compacting powders. | A team of researchers led by a Johns Hopkins mechanical engineering professor used X-ray measurements and analyses to show that velocity scaling and dispersion in wave transmission are based on particle arrangements and chains of force between them, while reduction of wave intensity is caused mainly by particle arrangements alone. The research appears in the June 29 edition of a scientific journal. "Our study provides a better understanding of how the fine-scale structure of a granular material is related to the behavior of waves propagating through them," said Ryan Hurley, assistant professor of mechanical engineering at Johns Hopkins Whiting School of Engineering. "This knowledge is of fundamental importance in the study of seismic signals from landslides and earthquakes, in the nondestructive evaluation of soils in civil engineering, and in the fabrication of materials with desired wave properties in materials science." Hurley conceived of this research while a postdoc at Lawrence Livermore National Laboratory, collaborating with a team that included LLNL physicist Eric Herbold. The experiments and analysis were later performed by Hurley and Whiting School postdoc Chongpu Zhai after Hurley moved to JHU, with experimental assistance and continued discussions with Herbold. Structure-property relations of granular materials are governed by the arrangement of particles and the chains of forces between them. These relations enable design of wave damping materials and non-destructive testing technologies. Wave transmission in granular materials has been extensively studied and demonstrates unique features: power-law velocity scaling, dispersion and attenuation (the reduction of the amplitude of a signal, electric current, or other oscillation). Earlier research, dating back to the late 1950s, described "what" may be happening to the material underlying wave propagation, but the new research provides evidence for "why." "The novel experimental aspect of this work is the use of in-situ X-ray measurements to obtain packing structure, particle stress and inter-particle forces throughout a granular material during the simultaneous measurement of ultrasound transmission," said Hurley. "These measurements are the highest fidelity dataset to date investigating ultrasound, forces and structure in granular materials." "These experiments, along with the supporting simulations, allow us to reveal why wave speeds in granular materials change as a function of pressure and to quantify the effects of particular particle-scale phenomena on macroscopic wave behavior," said Zhai, who led the data analysis efforts and was that paper's first author. The research provides new insight into time- and frequency-domain features of wave propagation in randomly packed grainy materials, shedding light on the fundamental mechanisms controlling wave velocities, dispersion and attenuation in these systems. This research was funded by the Johns Hopkins Whiting School of Engineering and LLNL's Laboratory Directed Research and Development program, and was carried out at the Advanced Photon Source, an Office of Science User Facility, operated by Argonne National Laboratory. | Earthquakes | 2,020 |
June 24, 2020 | https://www.sciencedaily.com/releases/2020/06/200624120450.htm | How water in the deep Earth triggers earthquakes and tsunamis | In a new study, published in a scientific journal, researchers trace how water carried into the deep Earth at subduction zones influences earthquakes and tsunamis. | Water (H2O) is continuously cycled between Earth's surface and its deep interior. Subduction zones, where tectonic plates converge and one plate sinks beneath another, are the most important parts of the cycle -- with large volumes of water going in and coming out, mainly through volcanic eruptions. Yet, just how (and how much) water is transported via subduction, and its effect on natural hazards and the formation of natural resources, has historically been poorly understood. Lead author of the study, Dr George Cooper, Honorary Research Fellow at the University of Bristol's School of Earth Sciences, said: "As plates journey from where they are first made at mid-ocean ridges to subduction zones, seawater enters the rocks through cracks, faults and by binding to minerals. Upon reaching a subduction zone, the sinking plate heats up and gets squeezed, resulting in the gradual release of some or all of its water. As water is released it lowers the melting point of the surrounding rocks and generates magma. This magma is buoyant and moves upwards, ultimately leading to eruptions in the overlying volcanic arc. These eruptions are potentially explosive because of the volatiles contained in the melt. The same process can trigger earthquakes and may affect key properties such as their magnitude and whether they trigger tsunamis or not." Exactly where and how volatiles are released and how they modify the host rock remains an area of intense research. Most studies have focused on subduction along the Pacific Ring of Fire. However, this research focused on the Atlantic plate, and more specifically, the Lesser Antilles volcanic arc, located at the eastern edge of the Caribbean Sea. "This is one of only two zones that currently subduct plates formed by slow spreading. We expect this to be hydrated more pervasively and heterogeneously than the fast spreading Pacific plate, and for expressions of water release to be more pronounced," said Prof. Saskia Goes, Imperial College London. The Volatile Recycling in the Lesser Antilles (VoiLA) project brings together a large multidisciplinary team of researchers including geophysicists, geochemists and geodynamicists from Durham University, Imperial College London, University of Southampton, University of Bristol, Liverpool University, Karlsruhe Institute of Technology, the University of Leeds, The Natural History Museum, The Institute de Physique du Globe in Paris, and the University of the West Indies. "We collected data over two marine scientific cruises on the RRS James Cook, temporary deployments of seismic stations that recorded earthquakes beneath the islands, geological fieldwork, chemical and mineral analyses of rock samples, and numerical modelling," said Dr Cooper. To trace the influence of water along the length of the subduction zone, the scientists studied boron compositions and isotopes of melt inclusions (tiny pockets of trapped magma within volcanic crystals). Boron fingerprints revealed that the water-rich mineral serpentine, contained in the sinking plate, is a dominant supplier of water to the central region of the Lesser Antilles arc. "By studying these micron-scale measurements it is possible to better understand large-scale processes.
Our combined geochemical and geophysical data provide the clearest indication to date that the structure and amount of water of the sinking plate are directly connected to the volcanic evolution of the arc and its associated hazards," said Prof. Colin Macpherson, Durham University. "The wettest parts of the downgoing plate are where there are major cracks (or fracture zones). By making a numerical model of the history of fracture zone subduction below the islands, we found a direct link to the locations of the highest rates of small earthquakes and low shear wave velocities (which indicate fluids) in the subsurface," said Prof. Saskia Goes. The history of subduction of water-rich fracture zones can also explain why the central islands of the arc are the largest and why, over geologic history, they have produced the most magma. "Our study provides conclusive evidence that directly links the water-in and water-out parts of the cycle and its expressions in terms of magmatic productivity and earthquake activity. This may encourage studies at other subduction zones to find such water-bearing fault structures on the subducting plate to help understand patterns in volcanic and earthquake hazards," said Dr Cooper. "In this research we found that variations in water correlate with the distribution of smaller earthquakes, but we would really like to know how this pattern of water release may affect the potential -- and act as a warning system -- for larger earthquakes and possible tsunami," said Prof. Colin Macpherson. | Earthquakes | 2,020 |
June 18, 2020 | https://www.sciencedaily.com/releases/2020/06/200618150241.htm | Natural fluid injections triggered Cahuilla earthquake swarm | A naturally occurring injection of underground fluids drove a four-year-long earthquake swarm near Cahuilla, California, according to a new seismological study that utilizes advances in earthquake monitoring with a machine-learning algorithm. In contrast to mainshock/aftershock sequences, where a large earthquake is followed by many smaller aftershocks, swarms typically do not have a single standout event. | The study will be published on June 19 in a scientific journal. The Cahuilla swarm, as it is known, is a series of small temblors that occurred between 2016 and 2019 near Mt. San Jacinto in Southern California. To better understand what was causing the shaking, Ross and colleagues from Caltech, the United States Geological Survey (USGS), and the University of Texas at Austin used earthquake-detection algorithms with deep neural networks to produce a highly detailed catalog of more than 22,000 seismic events in the area ranging in magnitude from 0.7 to 4.4. When compiled, the catalog revealed a complex but narrow fault zone, just 50 meters wide with steep curves when viewed in profile. Plotting those curves, Ross says, was crucial to understanding the reason for the years of regular seismic activity. Typically, faults are thought to either act as conduits for or barriers to the flow of underground fluids, depending on their orientation to the direction of the flow. While Ross's research supports that generally, he and his colleagues found that the architecture of the fault created complex conditions for underground fluids flowing within it. The researchers noted the fault zone contained undulating subterranean channels that connected with an underground reservoir of fluid that was initially sealed off from the fault. When that seal broke, fluids were injected into the fault zone and diffused through the channels, triggering earthquakes. This natural injection process was sustained over about four years, the team found. "These observations bring us closer to providing concrete explanations for how and why earthquake swarms start, grow, and terminate," Ross says. Next, the team plans to build off these new insights and characterize the role of this type of process throughout the whole of Southern California. | Earthquakes | 2,020 |
June 17, 2020 | https://www.sciencedaily.com/releases/2020/06/200617145937.htm | Geoscientists create deeper look at processes below Earth's surface with 3D images | Geoscientists at The University of Texas at Dallas recently used massive amounts of earthquake data and supercomputers to generate high-resolution, 3D images of the dynamic geological processes taking place far below the Earth's surface. | In a study published April 29, the team describes its use of an advanced seismic imaging method known as full waveform inversion (FWI). "This is the first comprehensive seismic study to directly image 3D mantle flow fields in actual subduction environments using advanced FWI technology," said Dr. Hejun Zhu, corresponding author of the study and assistant professor of geosciences in the School of Natural Sciences and Mathematics. Dr. Jidong Yang, who earned his PhD in geosciences from UT Dallas in May, and Dr. Robert Stern, professor of geosciences, are the study's co-authors. Between the relatively thin layer of the Earth's crust and its inner core lies the thickest part of the planet, the mantle. Over short time periods, the mantle can be considered solid rock, but on the geological time scale of millions of years, the mantle flows like a viscous fluid. Earth's crust is broken into pieces called tectonic plates. These plates move across and into the mantle very slowly -- about as fast as fingernails grow. At regions called subduction zones, one plate descends under another into the mantle. "The sinking of oceanic plates into the Earth's mantle at subduction zones is what causes the Earth's tectonic plates to move and is one of the most important processes taking place in our planet," Zhu said. "Subduction zones are also the source of many natural hazards, such as earthquakes, volcanoes and tsunamis. But the pattern of mantle flow and deformation around descending plates is still poorly understood. The information our techniques yield is crucial for understanding our dynamic planet." Zhu and his colleagues tackled the problem using a geophysical measurement called seismic anisotropy, which measures the difference in how fast mechanical waves generated by earthquakes travel in different directions inside the Earth. Seismic anisotropy can reveal how the mantle moves around the subducting plate. Similar technology is also used by the energy industry to locate oil and gas resources. "When a diver dives into water, the water separates, and that separation in turn affects the way the water moves around the swimmer," Zhu said. "It's similar with oceanic plates: When they dive into hot mantle, that action induces mantle separation and flow around the plates." The research team created the images using high-fidelity data recorded over a 10-year period from 180 earthquakes by some 4,500 seismic stations located in a grid across the U.S.
The numerical calculations for the FWI algorithm were performed on the high-performance computing clusters at the National Science Foundation (NSF)-supported Texas Advanced Computing Center at UT Austin, as well as on supercomputers at UT Dallas. "Previously we couldn't 'see' under the Earth's surface, but by using this technology and this very wonderful data set, we are able to delineate the 3D distribution of various seismic phenomena and tell at what depths they are occurring," Zhu said. The images confirmed that the plates in the study region are not large, solid pieces but rather are fragmented into smaller slabs. "This looks different from the textbook depictions of tectonic plates coming together, with one solid piece of oceanic plate descending under another solid piece," Zhu said. "Some researchers have hypothesized that this fragmentation occurs, and our imaging and modeling provides evidence that supports that view." Zhu's 3D model shows complex mantle flow patterns around a number of descending fragments and in the gaps between slabs. Such chunky, fragmented pieces are seen in regions throughout the world, Zhu said. In the northwestern U.S., for example, the Juan de Fuca Plate is also fragmented into two pieces where it descends under the North American Plate in the Cascadia subduction zone, an area where strong earthquakes have occurred over the centuries. "We know that most earthquakes happen at the interface between a slab and the mantle. If there is a gap between these fragments, what's called a window region, you wouldn't expect earthquakes there," Zhu said. "If you look at the earthquake distribution along the Cascadia subduction zone, there is a span where you do not have earthquakes. That is probably a region where there is a gap in the subducting oceanic plate." "The Middle America Trench that we studied has its own unique, dynamic properties. In the future, we plan to shift our attention to other subduction zones, including the Kermadec-Tonga subduction zone in the region of the Australian and Pacific plates." The research was funded by a grant to Zhu from the NSF's Division of Earth Sciences. | Earthquakes | 2,020 |
June 12, 2020 | https://www.sciencedaily.com/releases/2020/06/200612111430.htm | Remixed mantle suggests early start of plate tectonics | New Curtin University research on the remixing of Earth's stratified deep interior suggests that global plate tectonic processes, which played a pivotal role in the existence of life on Earth, started to operate at least 3.2 billion years ago. | Published in one of Nature's journals, the research was led by PhD candidate Mr Hamed Gamal El Dien, from Curtin's School of Earth and Planetary Sciences, who said there was much scientific debate over the exact start date of plate tectonics on Earth. "Some scientists believe plate tectonics only began to operate from around 800 million years ago, whereas others think it could go as far back as four billion years ago, soon after the formation of our planet," Mr Gamal El Dien said. "So far nearly all the evidence used in this debate came from scarcely preserved surface geological proxies, and little attention has been paid to the record kept by Earth's deep mantle -- this is where our research comes in." "For the first time, we were able to demonstrate that a significant shift in mantle composition (or a major mantle remixing) started around 3.2 billion years ago, indicating a global recycling of the planet's crustal materials back into its mantle layer, which we believe shows the start of global plate tectonic activity." During the earliest stages of Earth's planetary differentiation, the planet was divided into three main layers: the core, the mantle and the crust. Scientists believe there would have been very little remixing between the lighter crust and the much denser mantle, until the onset of plate tectonics. However, through the ongoing process of subduction, some lighter crustal materials are carried back into the denser deep Earth and remixed with the mantle. The question the researchers then asked was, when did this global and whole-mantle remixing process start? "Keeping the basic process of subduction in mind, we hypothesise that ancient rock samples found on the crust, that are ultimately sourced from the deep mantle, should show evidence of the first major 'stirring up' in the mantle layer, marking the start of plate subduction as a vital component of plate tectonic processes," Mr Gamal El Dien said. To complete this research, the team looked at the time variation of the isotopic and chemical composition of approximately 6,000 mantle-derived basaltic and komatiitic lava rocks, dated to be between two and four billion years old. Research co-author John Curtin Distinguished Professor and Australian Laureate Fellow Professor Zheng-Xiang Li, head of the Earth Dynamics Research Group, said the research is highly significant in understanding the dynamic evolution of our planet. "Plate tectonic activity on the planet is responsible for the formation of mineral and energy resources. It also plays a vital role for the very existence of humankind.
Plate tectonics are found uniquely operative on Earth, the only known habitable planet," Professor Li said. "Through our retrospective analysis of mantle-derived samples, we discovered that after the initial chemical stratification and formation of a hard shell in the first billion years of Earth's 4.5 billion year history, there was indeed a major chemical 'stir up' some 3.2 billion years ago. "We take this 'stir up' as the first direct evidence from deep Earth that plate tectonics started over 3 billion years ago, leading to a step change in mantle composition, followed by the oxygenation of our atmosphere and the evolution of life." | Earthquakes | 2,020 |
June 11, 2020 | https://www.sciencedaily.com/releases/2020/06/200611143101.htm | Scientists detect unexpected widespread structures near Earth's core | University of Maryland geophysicists analyzed thousands of recordings of seismic waves, sound waves traveling through the Earth, to identify echoes from the boundary between Earth's molten core and the solid mantle layer above it. The echoes revealed more widespread, heterogeneous structures -- areas of unusually dense, hot rock -- at the core-mantle boundary than previously known. | Scientists are unsure of the composition of these structures, and previous studies have provided only a limited view of them. Better understanding their shape and extent can help reveal the geologic processes happening deep inside Earth. This knowledge may provide clues to the workings of plate tectonics and the evolution of our planet. The new research provides the first comprehensive view of the core-mantle boundary over a wide area with such detailed resolution. The study was published in the June 12, 2020, issue of a scientific journal. The researchers focused on echoes of seismic waves traveling beneath the Pacific Ocean basin. Their analysis revealed a previously unknown structure beneath the volcanic Marquesas Islands in the South Pacific and showed that the structure beneath the Hawaiian Islands is much larger than previously known. "By looking at thousands of core-mantle boundary echoes at once, instead of focusing on a few at a time, as is usually done, we have gotten a totally new perspective," said Doyeon Kim, a postdoctoral fellow in the UMD Department of Geology and the lead author of the paper. "This is showing us that the core-mantle boundary region has lots of structures that can produce these echoes, and that was something we didn't realize before because we only had a narrow view." Earthquakes generate seismic waves below Earth's surface that travel thousands of miles. When the waves encounter changes in rock density, temperature or composition, they change speed, bend or scatter, producing echoes that can be detected. Echoes from nearby structures arrive more quickly, while those from larger structures are louder. By measuring the travel time and amplitude of these echoes as they arrive at seismometers in different locations, scientists can develop models of the physical properties of rock hidden below the surface. This process is similar to the way bats echolocate to map their environment. For this study, Kim and his colleagues looked for echoes generated by a specific type of wave, called a shear wave, as it travels along the core-mantle boundary. In a recording from a single earthquake, known as a seismogram, echoes from diffracted shear waves can be hard to distinguish from random noise. But looking at many seismograms from many earthquakes at once can reveal similarities and patterns that identify the echoes hidden in the data. Using a machine learning algorithm called Sequencer, the researchers analyzed 7,000 seismograms from hundreds of earthquakes of 6.5 magnitude and greater occurring around the Pacific Ocean basin from 1990 to 2018. Sequencer was developed by the new study's co-authors from Johns Hopkins University and Tel Aviv University to find patterns in radiation from distant stars and galaxies.
When applied to seismograms from earthquakes, the algorithm discovered a large number of shear wave echoes. "Machine learning in earth science is growing rapidly and a method like Sequencer allows us to be able to systematically detect seismic echoes and get new insights into the structures at the base of the mantle, which have remained largely enigmatic," Kim said. The study revealed a few surprises in the structure of the core-mantle boundary. "We found echoes on about 40% of all seismic wave paths," said Vedran Lekić, an associate professor of geology at UMD and a co-author of the study. "That was surprising because we were expecting them to be more rare, and what that means is the anomalous structures at the core-mantle boundary are much more widespread than previously thought." The scientists found that the large patch of very dense, hot material at the core-mantle boundary beneath Hawaii produced uniquely loud echoes, indicating that it is even larger than previous estimates. Known as ultralow-velocity zones (ULVZs), such patches are found at the roots of volcanic plumes, where hot rock rises from the core-mantle boundary region to produce volcanic islands. The ULVZ beneath Hawaii is the largest known. This study also found a previously unknown ULVZ beneath the Marquesas Islands. "We were surprised to find such a big feature beneath the Marquesas Islands that we didn't even know existed before," Lekić said. "This is really exciting, because it shows how the Sequencer algorithm can help us to contextualize seismogram data across the globe in a way we couldn't before." | Earthquakes | 2,020 |
June 11, 2020 | https://www.sciencedaily.com/releases/2020/06/200611143052.htm | Utah's arches continue to whisper their secrets | Two new studies from University of Utah researchers show what can be learned from a short seismic checkup of natural rock arches and how erosion sculpts some arches -- like the iconic Delicate Arch -- into shapes that lend added strength. | A study published in Geophysical Research Letters begins with thorough measurements of vibrations at an arch in Utah, and applies those measurements to glean insights from 17 other arches with minimal scientific equipment required.The second study, published in Geomorphology, compares the strength of arch shapes, specifically beam-like shapes versus inverted catenary shapes (like Delicate Arch or Rainbow Bridge).The Geohazards Research Group at the University of Utah measures small vibrations in rock structures, which come from earthquakes, wind and other sources both natural and human-made, to construct 3-D models of how the structures resonate.Part of the reason for these measurements is to assess the structural health of the rock feature. In studying 17 natural arches, doctoral candidates Paul Geimer, Riley Finnegan and their colleagues set seismometers on the arches for a few hours to a few days. The data from those measurements, coupled with the 3-D models, gave important information about the modes, or major movement directions, of the arches as well as the frequencies for those modes of vibration."This is all possible using noninvasive methods," Geimer says, "that form the first step in improving our ability to detecting and identifying damage within arches and similar features." The noninvasive nature of the tests -- with the seismometers sitting on the arch's surface without damaging the rock -- is important, as many of Utah's rock arches are culturally significant.The studies of the 17 arches used just one or two seismometers each, so with permission from the National Park Service, the researchers went to Musselman Arch in Canyonlands National Park to verify their earlier measurements. The arch is flat across the top and easily accessible, so they dotted it with 30 seismometers and listened."This added wealth of information helped us to confirm our assumptions that arch resonant modes closely follow simple predictive models, and surrounding bedrock acts as rigid support," Geimer says. "To my knowledge, it was the first measurement of its kind for a natural span, after decades of similar efforts at human-made bridges."All of the arches studied exhibited the property of low damping, Geimer says, which means that they continued to vibrate long after a gust of wind, for example, or a seismic wave from a far-off earthquake. The results also help researchers infer the mechanical properties of rocks without having to drill into the rock to take a sample. For example, the stiffness of the Navajo Sandstone, widespread in Southern Utah, seems to be related to the amount of iron in the rock.Natural arches come in a range of shapes, including beam-like spans that stretch between two rock masses and classic freestanding or partly freestanding inverted catenary arches. A catenary is the arc formed by a hanging chain or rope -- so flip it upside down and you've got an inverted catenary."In its ideal form, the inverted catenary eliminates all tensile stresses," Geimer says, creating a stable curved span supported solely by compression, which the host sandstone can resist most strongly. 
The idea that inverted catenary arches are sculpted by erosion into strong shapes is not new. But the U team's approach to analyzing them is. Returning to their 3-D models of arches and analysis of their vibration modes, the researchers simulated the gravitational stresses in detail on each arch and calculated a number, called the mean principal stress ratio, or MSR, that classifies whether the arch is more like a beam or more like an inverted catenary. The structure of the rock in which the arch is carved can also influence its shape. Inverted catenary arches are more likely to form in thick massive rock formations. "This allows gravitational stresses to be the dominant sculpting agent," Geimer says, "leaving behind a smooth arc of rock held in compression." Beam-like arches typically form in rock formations with multiple layers with varying strengths. "Weaker layers are removed by erosion more quickly," he adds, "leaving behind a layer of stronger material too thin to form a catenary curve." While the inverted catenary shape can lend an arch stability in its current form, Geimer and associate professor Jeff Moore are quick to point out that the arch is still vulnerable to other means of eventual collapse. "At Delicate Arch," Moore says, "the arch rests on a very thin easily eroded clayey layer, which provides weak connection to the ground, while Rainbow Bridge is restrained from falling over by being slightly connected to an adjoining rock knoll." Still, the MSR metric can help researchers and public lands managers evaluate an arch's stability due to its shape. The Geohazards Research Group is continuing to study other factors that can influence rock features' stability, including how cracks grow in rock and how arches have collapsed in the past. The group has also made its 3-D arch models available, along with a video showing (and letting you hear) how Moonshine Arch near Vernal, Utah, moves. | Earthquakes | 2,020 |
June 11, 2020 | https://www.sciencedaily.com/releases/2020/06/200611094140.htm | What controls the height of mountains? Surprisingly, it is not erosion | Which forces and mechanisms determine the height of mountains? A group of researchers from Münster and Potsdam has now found a surprising answer: It is not erosion and weathering of rocks that determine the upper limit of mountain massifs, but rather an equilibrium of forces in the Earth's crust. This is a fundamentally new and important finding for the earth sciences. The researchers report on it in a scientific journal. | The highest mountain ranges on Earth -- such as the Himalayas or the Andes -- arise along convergent plate boundaries. At such plate boundaries two tectonic plates move toward each other, and one of the plates is forced beneath the other into the Earth's mantle. During this process of subduction, strong earthquakes repeatedly occur on the plate interface, and over millions of years mountain ranges are built at the edges of the continents. Whether the height of mountain ranges is mainly determined by tectonic processes in the Earth's interior or by erosional processes sculpting the Earth's surface has long been debated in the geosciences. A new study led by Armin Dielforder of the GFZ German Research Centre for Geosciences now shows that erosion by rivers and glaciers has no significant influence on the height of mountain ranges. Together with scientists from the GFZ and the University of Münster (Germany), he resolved the longstanding debate by analysing the strength of various plate boundaries and calculating the forces acting along the plate interfaces. The researchers arrived at this surprising result by calculating the forces along different plate boundaries on the Earth. They used data that provide information about the strength of plate boundaries. These data are derived, for example, from heat flow measurements in the subsurface. The heat flow at convergent plate boundaries is in turn influenced by the frictional energy at the interfaces of the continental plates. One can imagine the formation of mountains using a tablecloth. If you place both hands under the cloth on the table top and push it, the cloth folds and at the same time it slides a little over the back of your hands. The emerging folds would correspond, for instance, to the Andes, and the sliding over the back of the hands to the friction in the subsurface. Depending on the characteristics of the rock, stresses also build up deep underground and are released in severe earthquakes, especially in subduction zones. The researchers collected worldwide data from the literature on friction in the subsurface of mountain ranges of different heights (Himalayas, Andes, Sumatra, Japan) and calculated the resulting stress and thus the forces that lead to the uplift of the respective mountains. In this way they showed that in active mountains the force on the plate boundary and the forces resulting from the weight and height of the mountains are in balance. Such a balance of forces exists in all the mountain ranges studied, although they are located in different climatic zones with widely varying erosion rates. This result shows that mountain ranges are able to react to processes on the Earth's surface and to grow with rapid erosion in such a way that the balance of forces and the height of the mountain range are maintained. This fundamentally new finding opens up numerous opportunities to study the long-term development and growth of mountains in greater detail.
| Earthquakes | 2,020 |
June 10, 2020 | https://www.sciencedaily.com/releases/2020/06/200610135059.htm | Proposed seismic surveys in Arctic Refuge likely to cause lasting damage | Winter vehicle travel can cause long-lasting damage to the tundra, according to a new paper by University of Alaska Fairbanks researchers published in a scientific journal. | Scars from seismic surveys for oil and gas exploration in the Arctic National Wildlife Refuge remained for decades, according to the study. The findings counter assertions made by the Bureau of Land Management in 2018 that seismic exploration causes no "significant impacts" on the landscape. That BLM determination would allow a less-stringent environmental review process of seismic exploration in the Arctic Refuge 1002 Area. UAF's Martha Raynolds, the lead author of the study, said she and other scientists have documented lasting impacts of winter trails throughout years of field research. Their paper, authored by an interdisciplinary team with expertise in Arctic vegetation, snow, hydrology and permafrost, summarizes what is currently known about the effects of Arctic seismic exploration and what additional information is needed to effectively regulate winter travel to minimize impacts. A grid pattern of seismic survey lines is used to study underground geology. These trails, as well as trails caused by camps that support workers, damage the underlying tundra, even when limited to frozen, snow-covered conditions. Some of the existing scars on the tundra date back more than three decades, when winter 2D seismic surveys were initiated. Modern 3D surveying requires a tighter network of survey lines, with larger crews and more vehicles. The proposed 1002 Area survey would result in over 39,000 miles of tracks. "Winter tundra travel is not a technology that has changed much since the '80s," said Raynolds, who studies Arctic vegetation at UAF's Institute of Arctic Biology. "The impacts are going to be as bad or worse, and they are proposing many, many more miles of trails." Conditions for winter tundra travel have become more difficult, due to a mean annual temperature increase of 7-9 degrees F on Alaska's Arctic coastal plain since 1986. Those warmer conditions have contributed to changing snow cover and thawing permafrost. The impact of tracks on the vegetation, soils and permafrost eventually changes the hydrology and habitat of the tundra, which affects people and wildlife who rely on the ecosystem. The paper argues that more data are needed before proceeding with Arctic Refuge exploration efforts. That includes better information about the impacts of 3D seismic exploration; better weather records in the region, particularly wind and snow data; and high-resolution maps of the area's ground ice and hydrology. The study also emphasizes that the varied terrain and topography in the 1002 Area are different from other parts of the North Slope, making it more vulnerable to damage from seismic exploration. Other contributors to the paper included UAF's Mikhail Kanevskiy, Matthew Sturm and Donald "Skip" Walker; Anna Liljedahl, UAF and Woods Hole Research Center; Janet Jorgenson, U.S. Fish and Wildlife Service; Torre Jorgenson, Alaska Ecoscience; and Matthew Nolan, Fairbanks Fodar. | Earthquakes | 2,020 |
June 9, 2020 | https://www.sciencedaily.com/releases/2020/06/200609095108.htm | New hints of volcanism under the heart of northern Europe | Scientists have discovered new evidence for active volcanism next door to some of the most densely populated areas of Europe. The study 'crowd-sourced' GPS monitoring data from antennae across western Europe to track subtle movements in the Earth's surface, thought to be caused by a rising subsurface mantle plume. The work is published in a scientific journal. | The Eifel region lies roughly between the cities of Aachen, Trier, and Koblenz, in west-central Germany. It is home to many ancient volcanic features, including the circular lakes known as 'maars'. These are the remnants of violent volcanic eruptions, such as the one which created Laacher See, the largest lake in the area. The explosion that created this is thought to have occurred around 13,000 years ago, with a similar explosive power to the cataclysmic Mount Pinatubo eruption in 1991. The mantle plume that likely fed this ancient activity is thought to still be present, extending up to 400 km down into the Earth. However, whether or not it is still active is unknown: "Most scientists had assumed that volcanic activity in the Eifel was a thing of the past," said Prof. Corné Kreemer, lead author of the new study. "But connecting the dots, it seems clear that something is brewing underneath the heart of northwest Europe." In the new study, the team -- based at the University of Nevada, Reno and the University of California, Los Angeles in the United States -- used data from thousands of commercial and state-owned GPS antennae all over western Europe, to map out how the ground is moving vertically and horizontally as the Earth's crust is pushed, stretched and sheared. The research revealed that the region's land surface is moving upward and outward over a large area centred on the Eifel, and including Luxembourg, eastern Belgium and the southernmost province of the Netherlands, Limburg. "The Eifel area is the only region in the study where the ground motion appeared significantly greater than expected," adds Prof. Kreemer. "The results indicate that a rising plume could explain the observed patterns and rate of ground movement." The new results complement those of a previous study. The implication of this study is that there may not only be an increased volcanic risk, but also a long-term seismic risk in this part of Europe. The researchers urge caution, however: "This does not mean that an explosion or earthquake is imminent, or even possible again in this area. We and other scientists plan to continue monitoring the area using a variety of geophysical and geochemical techniques, in order to better understand and quantify any potential risks." | Earthquakes | 2,020 |
June 8, 2020 | https://www.sciencedaily.com/releases/2020/06/200608092937.htm | Why the Victoria Plate in Africa rotates | The East African Rift System (EARS) is a newly forming plate tectonic boundary at which the African continent is being separated into several plates. This is not a clean break. The system includes several rift arms and one or more smaller so-called microplates. According to GPS data, the Victoria microplate is moving in a counterclockwise rotation relative to Africa in contrast to the other plates involved. | Previous hypotheses suggested that this rotation is driven by the interaction of a mantle plume -- an upward flow of hot rock within the Earth's mantle -- with the microplate's thick craton and the rift system. But now, researchers from the German Research Centre for Geosciences (GFZ) in Potsdam, led by Anne Glerum, have found evidence that suggests that the configuration of weaker and stronger lithospheric regions predominantly controls the rotation of continental microplates and Victoria in particular. Their findings were published in a scientific journal. In the paper, the researchers argue that a particular configuration of mechanically weaker mobile belts and stronger lithospheric regions in the EARS leads to curved, overlapping rift branches that, under extensional motion of the major tectonic plates, induce a rotation. They used 3D numerical models on the scale of the whole EARS to compute the lithosphere and upper mantle dynamics of the last 10 million years. "Such large models run on high performance computing clusters," says Anne Glerum, main author of the study. "We tested the predictive strength of our models by comparing their predictions of velocity with GPS-derived data, and our stress predictions with the World Stress Map, a global compilation of information on the present-day crustal stress field maintained since 2009. This showed that the best fit was obtained with a model that incorporates the first order strength distributions of the EARS' lithosphere like the one we prepared." There are many more continental microplates and fragments on Earth that are thought to rotate or have rotated. The lithosphere-driven mechanism of microplate rotation suggested in the new paper helps interpret these observed rotations and reconstruct plate tectonic motions throughout the history of the Earth. | Earthquakes | 2,020 |
June 5, 2020 | https://www.sciencedaily.com/releases/2020/06/200605105414.htm | Study shows diamonds aren't forever | Diamonds, those precious, sparkling jewels, are known as the hardest materials on Earth. They are a high-pressure form of carbon and found deep in the ground. | While diamonds are commonly thought of as hard and stable, carbon from about 100 miles beneath the African plate is being brought to shallower levels where diamond will become unstable. Molten rock (magma) brings the excess carbon towards the surface, and earthquakes open cracks that allow the carbon to be released into the air as carbon dioxide. PhD student Sarah Jaye Oliva and Professor of Earth and Environmental Sciences and Marshall-Heape Chair in Geology Cynthia Ebinger are among a group of international researchers who co-authored the paper "Displaced cratonic mantle concentrates deep carbon during continental rifting." "Somewhat amusedly," Ebinger said, "the paper is evidence that Diamonds Aren't Forever." The pair report on their findings about the African continent splitting in two and the massive amounts of CO2 being released. Ebinger said of her student, "Sarah Jaye contributed to the gas measurements, and she analyzed the deep structure and state-of-stress data that enabled us to deduce the process leading to the excess CO2." Oliva participated in a month-long campaign in 2018 to sample gases released diffusely through the soil and at springs that dot the East African Rift System in Tanzania. Through the sampling, Oliva and other researchers found that CO2 release is concentrated near the steep edge of the plate. Oliva said this made sense because the steep edge of the bottom of the plate is "where we expect magmas (molten rock material within the Earth that will cool to form igneous rock) to form and where faulting and fracture networks should be most intense." "The resulting faults and fissures, we think, act as conduits through the crust that concentrate fluxes of CO2." Modeling by the researchers also suggests that the mantle underneath the study region may be enriched in carbon due to the local erosion of the cratonic lithosphere that may even contain diamonds. (A craton is an old and stable part of the continental lithosphere, which consists of the Earth's two topmost layers, the crust and the uppermost mantle.) "The eroded material could melt as it moves towards thinner lithosphere, and this would be another factor in increasing the CO2 flux." She added, "Participating in this project was extremely rewarding for me. We, as seismologists, geodynamicists, structural geologists and geochemists all came together to understand how rifts help mobilize CO2." | Earthquakes | 2,020
June 4, 2020 | https://www.sciencedaily.com/releases/2020/06/200604111637.htm | Australia's ancient geology controls the pathways of modern earthquakes | Seismological and geological studies led by University of Melbourne researchers show the 2016 magnitude 6.0 Petermann earthquake produced a landscape-shifting 21 km surface rupture. The dimensions and slip of the fault plane were guided by zones of weak rocks that formed more than 500 million years ago. | The unusually long and smooth rupture produced by this earthquake initially puzzled scientists, as Australia's typically strong ancient cratons tend to host shorter and rougher earthquakes with greater displacements at this magnitude. "We found that in regions where weaker rocks are present, earthquakes may rupture faults under low friction," said University of Melbourne Research Fellow Dr Januka Attanayake. "This means that structural properties of rocks obtained from geologic mapping can help us to forecast the possible geometry and slip distributions of future earthquakes, which ultimately allow us to better understand the seismic hazard posed by our many potentially active faults." Australia regularly incurs earthquakes of this magnitude that could, if located close to our urban centers, create catastrophic damage similar to that incurred in the fatal 2011 magnitude 6.2 Christchurch earthquake in New Zealand. Luckily, most of these earthquakes in Australia have occurred in remote areas. The Petermann Ranges, extending 320 km from east central Western Australia to the southwest corner of the Northern Territory, started forming about 600 million years ago, when an Australian intracontinental mountain-building event termed the Petermann Orogeny occurred. Dr Attanayake said seismic and geologic data collected from the near-field investigation of the Petermann earthquake four years ago by a research team comprising Dr Tamarah King, Associate Professor Mark Quigley, Gary Gibson, and Abe Jones in the School of Earth Sciences helped determine that weak rock layers embedded in the strong crust may have played a role in setting off the rare earthquake. Despite a major desert storm severely hampering field work, the geologists scoured the land for evidence of a surface rupture, both on foot and using a drone, which they eventually located two weeks into their field work. As a result, researchers were able to map in detail the deformation associated with a 21-kilometre-long trace of a surface rupture, along which the ground had uplifted with a maximum vertical displacement of one metre. Seismologists rapidly deployed broadband seismometers to detect and locate aftershocks that provide independent information to estimate the geometry of the fault plane that ruptured. Dr Attanayake said, "The Petermann earthquake is a rare example where we've been able to link earthquakes with pre-existing geologic structure by combining seismological modelling and geological field mapping." "With this insight about what caused Central Australia's old, strong, and cold cratonic crust to break and produce this significant earthquake, seismic and geologic data might help us infer possible geometries of fault planes present beneath our urban centres and forecast seismic hazard." | Earthquakes | 2,020
June 2, 2020 | https://www.sciencedaily.com/releases/2020/06/200602183415.htm | Areas where earthquakes are less likely to occur | Scientists from Cardiff University have discovered specific conditions that occur along the ocean floor where two tectonic plates are more likely to slowly creep past one another as opposed to drastically slipping and creating catastrophic earthquakes. | The team have shown that where fractures lie on the ocean floor, at the junction of two tectonic plates, sufficient water is able to enter those fractures and trigger the formation of weak minerals, which in turn helps the two tectonic plates to slowly slide past one another. The new findings have now been published. This, in turn, could potentially contribute to solving one of the greatest challenges that face seismologists, which is to be able to forecast earthquakes with enough precision to save lives and reduce the economic damage that is caused. Earth's outer layer, the lithosphere, is made up of tectonic plates that shift over the underlying asthenosphere like floats on a swimming pool at rates of centimetres per year. Stresses begin to build up where these plates meet and are relieved at certain times either by earthquakes, where one plate catastrophically slips beneath the other at a rate of meters per second, or by creeping, whereby the plates slip slowly past one another at a rate of centimetres per year. Scientists have for a long time been trying to work out what causes a particular plate boundary to either creep or to produce an earthquake. It is commonly believed that the slip of tectonic plates at the juncture of an oceanic and continental plate is caused by a weak layer of sedimentary rock on the top of the ocean floor; however, new evidence has suggested that the rocks deeper beneath the surface in the oceanic crust could also play a part and that they may be responsible for creep as opposed to earthquakes. In their study, the team from Cardiff University and Tsukuba University in Japan looked for geological evidence of creep in rocks along the Japan coast, specifically in rocks from oceanic crust that had been deeply buried in a subduction zone, but through uplift and erosion were now visible on the Earth's surface. Using state-of-the-art imaging techniques, the team were able to observe the microscopic structure of the rocks within the oceanic crust and use them to estimate the amount of stress that was present at the tectonic plate boundary. Their results showed that the oceanic crust was in fact far weaker than previously assumed by scientists. "This means that, at least in the ancient Japanese subduction zone, slow creep within weak, wet oceanic crust could allow the ocean lithosphere to slip underneath the overlying continent without earthquakes being generated," said lead author of the study Christopher Tulley, from Cardiff University's School of Earth and Ocean Sciences. "Our study therefore confirms that oceanic crust, typically thought to be strong and prone to deforming by earthquakes, may instead commonly deform by creep, providing it is sufficiently hydrated." | Earthquakes | 2,020
May 27, 2020 | https://www.sciencedaily.com/releases/2020/05/200527150157.htm | New clues to deep earthquake mystery | A new understanding of our planet's deepest earthquakes could help unravel one of the most mysterious geophysical processes on Earth. | Deep earthquakes -- those at least 300 kilometers below the surface -- don't typically cause damage, but they are often widely felt. These earthquakes can provide vital clues to understanding plate tectonics and the structure of the Earth's interior. Due to the extremely high temperature and pressures where deep earthquakes occur, they likely stem from different physical and chemical processes than earthquakes near the surface. But it's hard to gather information about deep earthquakes, so scientists don't have a solid explanation for what causes them."We can't directly see what's happening where deep earthquakes occur," said Magali Billen, professor of geophysics in the UC Davis Department of Earth and Planetary Sciences.Billen builds numerical simulations of subduction zones, where one plate sinks below another, to better understand the forces controlling plate tectonics. Her recent work helps explain the distribution of deep earthquakes, showing that they most often strike in regions of "high strain" where a sinking tectonic plate bends and folds."These models provide compelling evidence that strain rate is an important factor in controlling where deep earthquakes occur," she said.The new understanding that deformation is a major factor in deep earthquakes should help scientists resolve which mechanisms trigger deep earthquakes and can provide new constraints on subduction zone structure and dynamics, Billen said."Once we understand deep earthquake physics better, we will be able to extract even more information about the dynamics of subduction, the key driver of plate tectonics," she said.Her findings were published May 27 in the journal Deep earthquakes occur in subduction zones -- where one of the tectonic plates floating on the surface of the Earth dives under another and is "subducted" into the mantle. Within the sinking slabs of crust, earthquakes cluster at some depths and are sparse in others. For example, many slabs exhibit large gaps in seismic activity below 410 kilometers in depth.The gaps in seismicity line up with regions of the slab that are deforming more slowly in the numerical models, Billen said."Deformation is not the same everywhere in the plate," Billen said. "That's really what's new here."Billen's research was not originally intended to investigate deep earthquakes. Rather, she was trying to understand the slow back-and forth motion of deep ocean trenches, where plates bend downward in subduction zones."I decided out of curiosity to plot the deformation in the plate, and when I looked at the plot, the first thing that popped in my mind was 'wow, this looks like the distribution of deep earthquakes,'" she said. "It was a total surprise."Billen's model incorporates the latest data about phenomena such as the density of minerals, different layers in the sinking plate, and experimental observations of how rocks behave at high temperatures and pressures."This is the first model that really brings together the physical equations that describe the sinking of the plates and key physical properties of the rocks," Billen said.The results cannot distinguish between possible causes for deep earthquakes. 
However, they do provide new ways to explore what causes them, Billen said."Taking into account the added constraint of strain-rate should help to resolve which mechanisms are active in the subducting lithosphere, with the possibility that multiple mechanisms may be required," she said.The project was supported by a fellowship from the Alexander von Humboldt Foundation and an award from the National Science Foundation. The Computational Infrastructure for Geodynamics supports the CitcomS software used for the numerical simulations. | Earthquakes | 2,020 |
May 27, 2020 | https://www.sciencedaily.com/releases/2020/05/200527123334.htm | Study finds a (much) earlier birth date for tectonic plates | Yale geophysicists reported that Earth's ever-shifting, underground network of tectonic plates was firmly in place more than 4 billion years ago -- at least a billion years earlier than scientists generally thought. | Tectonic plates are large slabs of rock embedded in the Earth's crust and upper mantle, the next layer down. The interactions of these plates shape all modern land masses and influence the major features of planetary geology -- from earthquakes and volcanoes to the emergence of continents."Understanding when plate tectonics started on Earth has long been a fundamentally difficult problem," said Jun Korenaga, a professor of earth and planetary sciences in Yale's Faculty of Arts and Sciences and senior author of the new study, published in One promising proxy for determining if tectonic plates were operational is the growth of continents, Korenaga said. This is because the only way to build up a continent-sized chunk of land is for surrounding surface rock to keep sinking deeply over a long period -- a process called subduction that is possible only through plate tectonics.In the new study, Korenaga and Yale graduate student Meng Guo found evidence of continental growth starting as early as 4.4 billion years ago. They devised a geochemical simulation of the early Earth based on the element argon -- an inert gas that land masses emit into the atmosphere. Argon is too heavy to escape Earth's gravity, so it remains in the atmosphere like a geochemical ledger."Because of the peculiar characteristics of argon, we can deduce what has happened to the solid Earth by studying this atmospheric argon," Korenaga said. "This makes it an excellent bookkeeper of ancient events."Most of the argon in Earth's atmosphere is 40Ar -- a product of the radioactive decay of 40K (potassium), which is found in the crust and mantle of continents. The researchers said their model looked at the atmospheric argon that has gradually accumulated over the history of the planet to determine the age of continental growth.Part of the challenge in creating their simulation, the researchers said, was incorporating the effects of a geological process called "crustal recycling." This refers to the cycle by which continental crust builds up, then is eroded into sediments, and eventually carried back underground by tectonic plate movements -- until the cycle renews itself.The simulation thus had to account for argon gas emissions that were not part of continental growth."The making of continental crust is not a one-way process," Korenaga said. | Earthquakes | 2,020 |
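The argon bookkeeping argument lends itself to a toy calculation. The Python sketch below is not the authors' geochemical simulation; it assumes a single crustal reservoir of 40K fixed at the onset of continental growth, a 40K half-life of about 1.25 billion years, a roughly 10.7 percent decay branch to 40Ar, and complete degassing of the radiogenic argon to the atmosphere -- simplifications chosen only to show why earlier continental growth leaves more 40Ar in today's air.

import numpy as np

HALF_LIFE_GYR = 1.25   # 40K half-life in billions of years
DECAY_CONST = np.log(2) / HALF_LIFE_GYR
BRANCH_TO_AR = 0.107   # fraction of 40K decays that produce 40Ar

def atmospheric_ar40(onset_gyr_ago, k40_at_onset=1.0):
    # Toy model: all 40Ar produced in the crust since 'onset_gyr_ago' is
    # eventually degassed to the atmosphere (relative units).
    decayed = k40_at_onset * (1.0 - np.exp(-DECAY_CONST * onset_gyr_ago))
    return BRANCH_TO_AR * decayed

for onset in (4.4, 3.0, 1.0):   # candidate onsets of continental growth, Gyr ago
    print(f"growth since {onset} Gyr ago -> relative atmospheric 40Ar "
          f"= {atmospheric_ar40(onset):.3f}")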
May 26, 2020 | https://www.sciencedaily.com/releases/2020/05/200526173821.htm | Designing a flexible material to protect buildings, military personnel | Shake, rattle and roll. | Even though they are miles from the epicenter of an earthquake, buildings can collapse due to how an earthquake's energy makes the ground shake and rattle. Now, a team of engineers led by Guoliang Huang, a James C. Dowell Professor in the Mechanical and Aerospace Engineering Department at the University of Missouri College of Engineering, has designed a flexible material that can help buildings withstand multiple waves of energy traveling through a solid material, including the simultaneous forward-and-backward and side-to-side motions found in earthquakes. "Our elastic material can stretch and form to a particular surface, similarly to a wrap on a vehicle," Huang said. "It can be applied to the surface of an existing building to allow it to flex in an earthquake. What is unique about the structured lattice-type material is that it protects against both types of energy waves -- longitudinal and shear -- that can travel through the ground." Huang said the material also can be used by the defense industry to protect against vibration in mechanical parts, such as aircraft or submarine engines. "For over 20 years, no one had a natural solution for this issue in a solid material," Huang said. "Now, we've designed, modeled and fabricated a new material with properties that do not exist naturally for what we believe is a nearly perfect protective device." The Army Research Office, which provided funding for the basic research effort at the University of Missouri associated with this project, is encouraged by the results from Huang's team. "The results that the University of Missouri team has recently published are encouraging," said Dan Cole, the program manager at the Army Research Office, a part of the U.S. Army Combat Capabilities Development Command's Army Research Laboratory. "This research could lead to new strategies for steering mechanical waves away from critical regions in solid objects, which could enable novel capabilities in soldier protection and maneuvering." | Earthquakes | 2,020
May 20, 2020 | https://www.sciencedaily.com/releases/2020/05/200520120735.htm | Caves tell us that Australia's mountains are still growing | Australia has often been unfairly portrayed as an old and idle continent with little geological activity, but new research suggests that we remain geologically active and that some of our mountains are still growing. | The University of Melbourne study reveals that parts of the Eastern Highlands of Victoria, including popular skiing destinations such as Mt Baw Baw and Mt Buller, may be as young as five million years, not 90 million years as originally thought.John Engel is one of four scientists from the Isotope Geochemistry Group in the School of Earth Sciences who studied the stalagmites, stalactites, and flowstones -- technically called 'speleothems' -- in the nearby Buchan Caves to produce the findings."At least 250 meters of additional height in the East Victorian Highlands appears to have been gained in the last few million years," Mr Engel said.With the help of Parks Victoria Rangers, the team visited 10 caves, climbing down through the passages and crawling through tight squeezes to collect small fragments of speleothem 'rubble' to take back to the lab to determine their age, using radiometric U-Pb dating."Our research shows a clear trend between oldest speleothem (cave age) and height in the landscape," Mr Engel said. "The data suggests that the Buchan region has been steadily uplifting at a rate of 76 meters every million years, beginning at least 3.5 million years ago and continuing today. This means that some speleothems have been sitting in dark caves undisturbed for 3.5 million years."Evidence suggests the Highlands originally rose up about 90 million years ago when the Tasman Sea between Australia and New Zealand opened up.Researchers say the cause of the more recent uplift is debated but a leading theory points to the friendly rivalry with New Zealand."The Australian and Pacific plates share a common boundary and many of the forces involved at this boundary may be propagated into the Australian plate as tectonic stress. Some of this tectonically-induced stress is then released as uplift of the mountains in South East Australia," Mr Engel said."This is why East Gippsland may still feel effects related to these tectonic forces. This subtle modification of classical plate tectonic theory can help explain the frequent, small earthquakes observed along South East Australia."Mr Engel said while mountains like the Himalaya and the Swiss Alps are admired for their aesthetic appeal, uncovering the secret stories surrounding when and how mountains form provides a layer of appreciation as well as an active field of research for geologists."Our research showcases a new -- and rather unique -- method for measuring the uplift of mountains. This technique of using speleothem is likely to also work in other caves across the world for regions with 'recent' tectonic activity, offering geologists great opportunities to share more stories about these impressive and unchanging features of our landscape." | Earthquakes | 2,020 |
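The uplift estimate described above is, in essence, a straight-line fit of cave elevation against the age of the oldest speleothem found in each cave. A minimal Python sketch with invented (age, elevation) pairs -- not the Buchan measurements -- shows how a roughly 76 metres-per-million-years rate would be recovered:

import numpy as np

# Hypothetical pairs of (oldest speleothem age in Myr, cave elevation in m);
# illustrative values only, not data from the Buchan caves.
ages_myr = np.array([0.4, 1.1, 1.9, 2.6, 3.5])
elev_m = np.array([35.0, 90.0, 150.0, 195.0, 275.0])

rate, intercept = np.polyfit(ages_myr, elev_m, 1)
print(f"inferred uplift rate ~ {rate:.0f} m per million years")
print(f"implied height gained over 3.5 Myr ~ {rate * 3.5:.0f} m")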
May 18, 2020 | https://www.sciencedaily.com/releases/2020/05/200518162655.htm | Unknown submarine landslides discovered in Gulf of Mexico | A Florida State University researcher has used new detection methods to identify 85 previously unknown submarine landslides that occurred in the Gulf of Mexico between 2008 and 2015, leading to questions about the stability of oil rigs and other structures, such as pipelines built in the region. | Assistant Professor Wenyuan Fan in the Department of Earth, Ocean and Atmospheric Science has published a new paper in the journal "The observed landslides suggest a possible tsunami hazard for coastal communities along the Gulf of Mexico and that seabed infrastructure in the Gulf of Mexico, including oil platforms and pipelines, is also at risk from the landslides," Fan said.Fan and his colleagues measured data from seismic stations across the United States. They found that out of the 85 landslides they identified, 10 occurred spontaneously without preceding earthquakes. The other 75 occurred almost instantly after the passage of surface waves caused by distant earthquakes. Some of these were considered rather small earthquakes, Fan added.The finding was a surprise for Fan and his colleagues, he noted. In trying to better understand lesser-known earthquake processes, he had designed a method to capture earthquake data that would help him get a better look at continuous waveforms. That led him to seismic sources in the Gulf of Mexico."There are few active faults in the Gulf, and the seismicity is scarce in the region," he said. "This puzzled me and concerned me because we live close to the Gulf. With the question and the concern, I looked into the details of these seismic sources and eventually concluded that they are likely to be submarine landslides."Fan said currently he and his colleagues do not have any real-time data related to damage from these events and that most of the landslides were in the deep-water region of the Gulf. The ability to detect and locate these submarine landslides suggests that scientists may be able to adapt researchers' methods for hazard monitoring in the future though. | Earthquakes | 2,020 |
May 14, 2020 | https://www.sciencedaily.com/releases/2020/05/200514143502.htm | New model to accurately date historic earthquakes | Three earthquakes in the Monterey Bay Area, occurring in 1838, 1890 and 1906, happened without a doubt on the San Andreas Fault, according to a new paper by a Portland State University researcher. | The paper, "New Insights into Paleoseismic Age Models on the Northern San Andreas Fault: Charcoal In-built ages and Updated Earthquake Correlations," was recently published. Assistant Professor of Geology at PSU Ashley Streig said the new research confirms what her team first discovered in 2014: three earthquakes occurred within a 68-year period in the Bay Area on the San Andreas Fault. "This is the first time there's been geologic evidence of a surface rupture from the historic 1838 and 1890 earthquakes that we knew about from newspapers and other historical documents," Streig said. "It basically meant that the 1800s were a century of doom." Building on the 2014 study, Streig said they were able to excavate a redwood slab from a tree felled by early Europeans, from one meter below the surface in the Bay Area. The tree was toppled before the three earthquakes in question occurred. That slab was used to determine the precise date logging first occurred in the area, and pinpointed the historic dates of the earthquakes. Further, they were able to use the slab to develop a new model for determining recurrence intervals and more exact dating. Streig used the dating technique of wiggle matching for several measured carbon-14 samples from the tree slab and compared them with fluctuations in atmospheric carbon-14 concentrations over time to fingerprint the exact date of death of the tree and confirm the timing of the earthquakes. Because the researchers had an exact age from the slab, they were able to test how well the most commonly used material, charcoal, works in earthquake age models. Charcoal is commonly used for dating and to constrain the ages of prehistoric earthquakes and develop an earthquake recurrence interval, but Streig said the charcoal can be hundreds of years older than the stratigraphic layer containing it, yielding an offset between what has been dated and the actual age of the earthquake. The new technique accounts for inbuilt charcoal ages -- which account for the difference in time between the wood's formation and the fire that generated said charcoal -- and can better estimate the age of the event being studied. "We were able to evaluate the inbuilt age of the charcoal incorporated in the deposits and find that charcoal ages are approximately 322 years older than the actual age of the deposit -- so previous earthquake age models in this area using detrital charcoal would be offset roughly by this amount," she said. New earthquake age modeling using a method to correct for this charcoal inbuilt age, and age results from the tree stump, are what give Streig absolute certainty that the 1838 and 1890 earthquakes in question occurred on the San Andreas Fault and during those years. "We put the nail in the coffin," she added. | Earthquakes | 2,020
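At its core, the inbuilt-age correction is a subtraction with propagated uncertainty. The Python sketch below is not the authors' age model; it uses invented numbers only to illustrate how a roughly 322-year charcoal inbuilt age shifts (and widens) an earthquake age estimate in a simple Monte Carlo:

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical calibrated charcoal age for an event horizon (calendar years CE)
charcoal_age_ce = rng.normal(loc=1520.0, scale=40.0, size=n)
# Inbuilt age: how much older the charcoal is than the deposit that contains it
inbuilt_age_yr = rng.normal(loc=322.0, scale=80.0, size=n)

# The deposit (and the earthquake it records) is younger than the charcoal,
# so the corrected calendar age moves forward in time by the inbuilt age.
event_age_ce = charcoal_age_ce + inbuilt_age_yr

lo, mid, hi = np.percentile(event_age_ce, [2.5, 50.0, 97.5])
print(f"corrected event age ~ {mid:.0f} CE (95% range {lo:.0f}-{hi:.0f} CE)")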
May 14, 2020 | https://www.sciencedaily.com/releases/2020/05/200514092638.htm | 'Lettere patenti' help assess intensity of historic central Italian earthquakes | Three hundred-year-old administrative documents from the Roman government, granting residents permission to repair damage to their buildings, can help modern-day seismologists calculate intensities for a notable sequence of earthquakes that struck central Italy in 1703. | Details gleaned from these "Lettere Patenti" offer a unique glimpse at the geographical spread and types of building damage caused by the 1703 earthquakes, according to the report in Seismic intensity -- a way to describe an earthquake's severity through its effects on humans and structures -- is assessed with reference to statistically representative buildings. Using information from the historical documents, the three main earthquakes in the 1703 sequence were likely between intensity V and VI, as measured by the European Macroseismic Scale-98.Intensity V or "strong" means that the earthquake was felt indoors by most people, trembling buildings and tipping over top-heavy objects. Intensity VI or "slightly damaging" earthquakes can cause minor to moderate damage to buildings, like cracking and falling pieces of plaster.The administrative documents provide a more realistic view into the earthquakes' impact than historical reports of damage to monumental buildings such as churches and Rome's Colosseum, the researchers conclude.The new study's "fundamental contribution -- information about residential housing -- is crucial for assessing intensity, especially when using the EMS-98," said Tertulliani. "This macroseismic scale suggests, if possible, to not consider monumental buildings in the intensity evaluation because they are not statistically significant. The Lettere Patenti are a source of information on the residential building stock of Rome."The 1703 sequence, including earthquakes on 14 January, 16 January and 2 February, were widely felt and feared by people in central Italy. The January earthquakes were located in the Umbria region, which at the time was part of the Papal State, and the February earthquake struck the Abrutium region in the Kingdom of Naples.They are among the few earthquakes to have produced significant damage to monumental structures in the city of Rome, Tertulliani and colleagues noted. The damage included toppled chimneys, falling rubble and cracks in the domes and vaults of several churches, including St. Peter's and St. Paul's Basilicas, and the church of St. Andrea della Valle. Two or three arches at The Colosseum collapsed, and parts of the Aurelian Walls were damaged.The researchers point out, however, that churches and towers are often the structures that are damaged the most during earthquakes, due to their structural complexity. To expand their view of the earthquakes' impact, Tertulliani and colleagues built on a previous study looking at administrative documents from the time period, called Lettere Patenti or literally, "letter of permission or license." The documents issued by the Roman government authorize individuals to complete maintenance on the external parts of their buildings."The work of the historical seismologist has something of both the archaeologist and the detective, so it is necessary to look for a wide typology of documents," Tertulliani explained. 
"Before the end of 18th century, most of the reports or chronicles that were written after a natural phenomenon were not conceived with the primary aim of describing its effects, so the seismologist has to search for administrative documents."With the help of the Lettere Patenti and a 1748 property map, the researchers were able to pinpoint 93 new damage points on civil buildings spread throughout the historic center of Rome. Most of the repairs authorized by the documents involved demolition and reconstructions of walls or parts of buildings after moderate structure damage. Tertulliani hopes to study other historical Italian earthquakes using these document-based methods, especially as new archival material becomes available. But he cautioned that "this kind of research is extremely time-consuming, and it does not always lead to appreciable results." | Earthquakes | 2,020 |
May 12, 2020 | https://www.sciencedaily.com/releases/2020/05/200512134530.htm | Growing mountains or shifting ground: What is going on in Earth's inner core? | Exhaustive seismic data from repeating earthquakes and new data-processing methods have yielded the best evidence yet that the Earth's inner core is rotating -- revealing a better understanding of the hotly debated processes that control the planet's magnetic field. | The new study by researchers from the University of Illinois at Urbana-Champaign is published in the journal Geologists do not fully understand how the Earth's magnetic field generator works, but suspect it is closely linked to dynamic processes near the inner core-outer core boundary area, the researchers said. Shifts in the location of the magnetic poles, changes in field strength and anomalous seismic data have prompted researchers to take a closer look."In 1996, a small but systematic change of seismic waves passing through the inner core was first detected by our group, which we interpreted as evidence for differential rotation of the inner core relative to the Earth's surface," said geology professor and study co-author Xiaodong Song, who is now at Peking University. "However, some studies believe that what we interpret as movement is instead the result of seismic waves reflecting off an alternately enlarging and shrinking inner core boundary, like growing mountains and cutting canyons."The researchers present seismic data from a range of geographic locations and repeating earthquakes, called doublets, that occur in the same spot over time. "Having data from the same location but different times allows us to differentiate between seismic signals that change due to localized variation in relief from those that change due to movement and rotation," said Yi Yang, a graduate student and lead author of the study.The team found that some of the earthquake-generated seismic waves penetrate through the iron body below the inner core boundary and change over time, which would not happen if the inner core were stationary, the researchers said. "Importantly, we are seeing that these refracted waves change before the reflected waves bounce off the inner core boundary, implying that the changes are coming from inside the inner core," Song said.The basis of the debate lies in the fact the prior studies looked at a relatively small pool of somewhat ambiguous data generated from a method that is highly dependent on accurate clock time, the researchers said."What makes our analysis different is our precise method for determining exactly when the changes in seismic signals occur and arrive at the various seismic stations across the globe," Yang said. "We use a seismic wave that did not reach inner core as a reference wave in our calculations, which eliminates a lot of the ambiguity."This precise arrival time analysis, an extensive collection of the best quality data and careful statistical analysis performed by Yang, are what give this study its power, Song said. "This work confirms that the temporal changes come mostly, if not all, from the body of the inner core, and the idea that inner core surface changes are the sole source of the signal changes can now be ruled out," he said. | Earthquakes | 2,020 |
May 6, 2020 | https://www.sciencedaily.com/releases/2020/05/200506123731.htm | Fiber optics capture seismic signatures of the rose parade | Yes, there's a prize for the most beautiful flower-filled float in the Rose Parade each year, but how about a prize for the most ground-shaking marching band? According to a new study, the 2020 honors go to the Southern University and A&M College, followed closely by the hometown Pasadena City College Honor band. | These bragging rights and other interesting signatures of the Rose Parade were captured by fiber optic telecommunications cable lying below the parade route. The technique, called distributed acoustic sensing (DAS), uses the tiny internal flaws in a long optical fiber as thousands of seismic sensors. An instrument at one end of the fiber sends laser pulses down the cable that are reflected off the fiber flaws and bounced back to the instrument. When an earthquake disturbs the fiber, researchers can examine changes in the size, frequency and phase of the reflected pulses to learn more about the resulting seismic waves. For the Rose Parade project, Zhan and colleagues examined data from a 2.5-kilometer (1.6 mile) stretch of cable under the parade route that contained about 400 seismic sensors. In this case, the disturbance to the cables was the compression and flexure of the roads by parade participants. "The main goal of the Pasadena Array is to detect small earthquakes and image the geological structure underneath the city. It has been operating only since November 2019, so we actually do not have any good-sized earthquake in the city yet," explained Zhan. "The Rose Parade, as a well-controlled event -- no other traffic except the parade, traveling all in one direction at almost constant speed -- provides a rare opportunity for network calibration." Their seismic readout "turned out to be quite broadband," Zhan said. The array captured the distinct signals of zig-zagging police motorcycles clearing the route, the bend of the road as heavy floats weighing 16,000 to 18,000 kilograms (17.6 to 19.9 tons) passed overhead, and a series of harmonic frequencies that correspond to the even stepping of the marching bands. The "heaviest" float measured in this way was the Amazon studios float, which contained a bus and rocket mounted on a truck. The researchers were even able to see a gap in the DAS record when the "Mrs. Meyer's Clean Day" float got stuck at a tight turn in the route and backed up the parade traffic for six minutes. "This project inspires us that in the future we will probably use heavy vehicles for calibrations of DAS arrays in other cities," Zhan said. The annual Rose Parade has been held on New Year's Day since 1890, and more than 700,000 spectators crowd the curbsides each year. The event takes place before the Rose Bowl, an American college football game. | Earthquakes | 2,020
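Reading a marching band's step rate out of a DAS channel is essentially a spectral measurement. The Python sketch below uses a purely synthetic strain-rate trace (an assumed ~1.8 Hz stepping fundamental plus harmonics and noise) rather than the Pasadena Array data, and it is not the authors' processing chain; it only shows how the harmonic frequencies of even stepping can be read off a Fourier spectrum:

import numpy as np

fs = 100.0                       # assumed sample rate, Hz
t = np.arange(0.0, 60.0, 1.0 / fs)
step_hz = 1.8                    # hypothetical stepping rate of a band

rng = np.random.default_rng(1)
trace = (np.sin(2 * np.pi * step_hz * t)              # stepping fundamental
         + 0.5 * np.sin(2 * np.pi * 2 * step_hz * t)  # first harmonic
         + 0.3 * np.sin(2 * np.pi * 3 * step_hz * t)  # second harmonic
         + 0.5 * rng.standard_normal(t.size))         # background noise

spectrum = np.abs(np.fft.rfft(trace * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

band = freqs < 10.0
fundamental = freqs[band][np.argmax(spectrum[band])]
print(f"stepping fundamental ~ {fundamental:.2f} Hz; "
      f"harmonics expected near {2 * fundamental:.1f} and {3 * fundamental:.1f} Hz")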
April 29, 2020 | https://www.sciencedaily.com/releases/2020/04/200429133957.htm | 'Wobble' may precede some great earthquakes | The land masses of Japan shifted from east to west to east again in the months before the strongest earthquake in the country's recorded history, a 2011 magnitude-9 earthquake that killed more than 15,500 people, new research shows. | Those movements, what researchers are calling a "wobble," may have the potential to alert seismologists to greater risk of future large subduction-zone earthquakes. These destructive events occur where one of Earth's tectonic plates slides under another one. That underthrusting jams up or binds the earth, until the jam is finally torn or broken and an earthquake results.The findings were published today (April 30) in the journal "What happened in Japan was an enormous but very slow wobble -- something never observed before," said Michael Bevis, a co-author of the paper and professor of earth sciences at The Ohio State University."But are all giant earthquakes preceded by wobbles of this kind? We don't know because we don't have enough data. This is one more thing to watch for when assessing seismic risk in subduction zones like those in Japan, Sumatra, the Andes and Alaska."The wobble would have been imperceptible to people standing on the island, Bevis said, moving the equivalent of just a few millimeters per month over a period of five to seven months. But the movement was obvious in data recorded by more than 1,000 GPS stations distributed throughout Japan, in the months leading up to the March 11 Tohoku-oki earthquake.The research team, which included scientists from Germany, Chile and the United States, analyzed that data and saw a reversing shift in the land -- about 4 to 8 millimeters east, then to the west, then back to the east. Those movements were markedly different from the steady and cyclical shifts the Earth's land masses continuously make."The world is broken up into plates that are always moving in one way or another," Bevis said. "Movement is not unusual. It's this style of movement that's unusual."Bevis said the wobble could indicate that in the months before the earthquake, the plate under the Philippine Sea began something called a "slow slip event," a relatively gentle and "silent" underthrusting of two adjacent oceanic plates beneath Japan, that eventually triggered a massive westward and downward lurch that drove the Pacific plate and slab under Japan, generating powerful seismic waves that shook the whole country.That 2011 earthquake caused widespread damage throughout Japan. It permanently shifted large parts of Japan's main island, Honshu, several meters to the east. It launched tsunami waves more than 40 meters high. More than 450,000 people lost their homes. Several nuclear reactors melted down at the Fukushima Daiichi Nuclear Power Plant, sending a steady stream of toxic, radioactive materials into the atmosphere and forcing thousands nearby to flee their homes. It was the worst nuclear disaster since Chernobyl.Researchers who study earthquakes and plate tectonics try to pinpoint the approximate magnitude of the next large earthquakes and predict where and when they might occur. 
The "when" is much harder than the "where."But it won't be possible to use the findings of this study to predict earthquakes in some subduction zones around the world because they don't have the GPS systems needed, said Jonathan Bedford, lead author of this study and a researcher at the GFZ German Research Centre for Geosciences.In 2011, Japan had one of the largest and most robust GPS monitoring systems in the world. That system provided ample data, and allowed the research team to identify the swing the land mass made in the months leading up to the earthquake.Other countries, including Chile and Sumatra, which were hit by devastating earthquakes and tsunamis in 2010 and 2004, respectively, had much less-comprehensive systems at the time of those disasters.The researchers analyzed similar data from the 2010 Chile earthquake, and found evidence of a similar wobble; Bedford said the data was "only just good enough to capture the signal.""We really need to be monitoring all major subduction zones with high-density GPS networks as soon as possible," he said. | Earthquakes | 2,020 |
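Seeing a millimetre-scale reversal in GPS positions mostly comes down to removing the steady plate motion and smoothing what is left. The Python sketch below is an illustration of that detrending idea with a synthetic east-component time series (a constant secular velocity, an invented eastward-westward-eastward wobble in the final months, and daily scatter); it is not the analysis performed by the GFZ and Ohio State team:

import numpy as np

rng = np.random.default_rng(2)
days = np.arange(730)                       # two years of daily positions
secular_mm_per_yr = -25.0                   # hypothetical steady east velocity
east_mm = secular_mm_per_yr * days / 365.25

# Invented wobble over the final ~7 months: a few mm east, then west, then east
t = np.arange(210)
wobble = np.zeros(days.size)
wobble[-210:] = 5.0 * np.sin(2 * np.pi * t / 140.0)
east_mm = east_mm + wobble + rng.normal(0.0, 2.0, days.size)

# Estimate the secular trend from the quiet first year, then remove and smooth
trend = np.polyfit(days[:365], east_mm[:365], 1)
residual = east_mm - np.polyval(trend, days)
smoothed = np.convolve(residual, np.ones(30) / 30.0, mode="same")

print("30-day mean residual east offset (mm) at days 555, 625, 695:",
      np.round(smoothed[[555, 625, 695]], 1))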
April 27, 2020 | https://www.sciencedaily.com/releases/2020/04/200427125126.htm | Fracking and earthquake risk | Hydraulic fracturing for oil and gas production can trigger earthquakes, large and small. A new approach to managing the risk of these quakes could help operators and regulators hit the brakes early enough to prevent nuisance and reduce the chance of property damage and injury. | The approach, developed by four Stanford University researchers and published April 28 in the Hydraulic fracturing, or fracking, involves pumping fluids at high pressure into wells drilled down into and across rock formations thousands of feet underground. The pressure creates small earthquakes that break the rock, forcing open existing fractures or creating new ones. Petroleum then flows more easily out of the cracked rocks and into the well. "The goal is to make many tiny earthquakes, but sometimes they are larger than planned," said study co-author William Ellsworth, a geophysics professor at Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth).By taking the local risk of nuisance-level shaking as its starting point, the new strategy contrasts with the current common practice for managing fracking-related quakes based on size. Under a system known as a traffic-light protocol, operators have a green light to proceed as long as earthquakes remain relatively small. Larger earthquakes may require an operator to adjust or stop work. The system is widely used to manage hazards of fracking for oil and gas in the United States, Canada, China and Europe, and also for geothermal energy development in South Korea, Europe and the United States."Implicitly, I think regulators have had risk in the back of their mind," said study co-author Greg Beroza, a geophysics professor at Stanford. "But risk-based frameworks have not been used previously -- perhaps because it requires a bit of extra analysis."Earthquake size offers a rough proxy for how much damage can be expected, and it's a measure that regulators and operators can monitor in real time. The problem is quakes of the same size can present very different risks from one location to another due to differences in population density. "A project located in a virtually uninhabited area of west Texas would pose a much lower risk than a similar project located near a city," Ellsworth explained.In addition, geological factors including earthquake depth, fault geometry and local soil conditions can influence how an earthquake's energy -- and potential to do damage -- becomes amplified or peters out as it travels underground. All of this context is key to honing in on a tolerable amount of shaking and establishing traffic-light thresholds accordingly."Areas such as Oklahoma, with buildings that were not designed to resist strong shaking, or areas that anticipate amplified shaking due to soft soils, can account for their community needs with this approach," said study co-author Jack Baker, a professor of civil and environmental engineering who leads the Stanford Center for Induced and Triggered Seismicity with Beroza, Ellsworth and Stanford geophysicist Mark Zoback.The Stanford researchers developed mathematical techniques to account for the web of risk factors that shape the probability of an earthquake generating noticeable or damaging shaking in a specific location. They built upon these techniques to make a translation to earthquake magnitude. 
This allowed them to create guidelines for devising new traffic-light protocols that still use earthquake size to clearly delineate between the green, yellow and red zones, but with much more tailoring to local concerns and geology."If you tell me what exposure you have in a certain area -- population density, site amplification, distance to towns or critical infrastructure -- our analysis can spit out numbers for green-, yellow- and red-light thresholds that are fairly well informed by real-world risks," said lead study author Ryan Schultz, a PhD student in geophysics.The analysis also makes it possible, he added, to start out with some level of risk deemed tolerable -- say, a 50 percent chance of nuisance-level shaking at the nearest household -- and calculate the maximum earthquake magnitude that would keep risk at or below that level. "This is about making it clearer what choices are being made," Schultz said, "and facilitating a conversation between operators, regulators and the public."In general, the authors recommend setting yellow-light thresholds approximately two magnitude units below the red light. According to their analysis, this would result in 1 percent of cases jumping from the green zone straight to red. "If you stop the operation right at or before the threshold for damage, you're assuming you have perfect control, and often that's not the reality," Schultz said. "Often, the biggest earthquakes happen after you've turned off the pumps." | Earthquakes | 2,020 |
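The magnitude-threshold inversion described above can be illustrated with a deliberately simplified ground-motion model. In the Python sketch below, the attenuation coefficients, the 0.05 g nuisance shaking level, the lognormal scatter and the distances are all placeholder assumptions -- this is not the published Stanford framework, only the logic of turning a tolerable probability of nuisance shaking into red- and yellow-light magnitudes (with the yellow light set two magnitude units below the red, as the authors recommend):

import math

def p_nuisance(mag, dist_km, pga_threshold_g=0.05,
               a=-5.46, b=1.1, c=-1.2, sigma_ln=0.6):
    # Toy attenuation: ln(PGA) = a + b*M + c*ln(R), with lognormal scatter.
    # Returns the probability that PGA exceeds the nuisance threshold.
    ln_median = a + b * mag + c * math.log(dist_km)
    z = (math.log(pga_threshold_g) - ln_median) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def red_light_magnitude(dist_km, tolerable_prob=0.5):
    # Smallest magnitude whose nuisance-shaking probability reaches the target.
    mag = 0.0
    while p_nuisance(mag, dist_km) < tolerable_prob and mag < 8.0:
        mag += 0.01
    return mag

for dist in (2.0, 10.0, 30.0):          # distance to the nearest household, km
    red = red_light_magnitude(dist)
    print(f"R = {dist:4.0f} km: red light ~ M{red:.1f}, yellow light ~ M{red - 2.0:.1f}")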
April 24, 2020 | https://www.sciencedaily.com/releases/2020/04/200424150728.htm | Fault roughness and magnitude of earthquakes | A new study led by McGill University has found that tectonic plates beneath the Earth's surface can show varying degrees of roughness and could help explain why certain earthquakes are stronger than others. | Earthquakes happen when the rocks beneath the Earth's surface break along geological fault lines and slide past each other. The properties of these faults -- such as the roughness of their surface -- can have an influence on the size of seismic events; however, their study has been challenging because they are buried deep beneath the Earth's surface. In order to have a better understanding of the characteristics of these faults, researchers from McGill University, the University of California Santa Cruz and Ruhr University Bochum in Germany used high-resolution seismic reflection data to map and measure the roughness of 350 km2 of a plate boundary fault located off the Pacific coast of Costa Rica. "We already knew that the roughness of a fault was an important factor, but we did not know how rough faults in the subsurface truly are, nor how variable the roughness is for a single fault," says James Kirkpatrick, a professor in McGill's Department of Earth and Planetary Sciences. Historically, the earthquakes that have occurred in this part of the world have been moderately large (M7), and Kirkpatrick, who is also the study's first author, believes the rough patches they found might be the reason why. "These rough patches are stronger and more resistant to earthquake slip," he says. "The historical record of earthquakes is relatively short, so we can't say with certainty that larger ones have not occurred. Future seismic events in the area, which will be recorded with modern equipment, should help us determine if they show the same limited magnitude." Kirkpatrick and his colleagues also hope to apply their methods to other subduction zones where similar geophysical data is available to start to evaluate whether their conclusions are generally applicable. "This connection between the fault roughness and earthquake magnitude might one day help us understand the size and style of earthquakes most likely to occur on a given fault." | Earthquakes | 2,020
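"Roughness" in this context is a statistical property of the mapped fault surface. As a stand-alone illustration (not the method used in the study), the Python sketch below detrends a synthetic along-strike relief profile of a fault and compares the root-mean-square relief of a smoother patch with that of a rougher one:

import numpy as np

rng = np.random.default_rng(3)
x_km = np.linspace(0.0, 50.0, 2001)          # 50 km profile, 25 m spacing

# Synthetic relief: a gentle regional slope plus short-wavelength roughness
# whose amplitude grows along strike (all values invented).
regional = 2.0 * x_km                        # 2 m of relief per km
amplitude = 5.0 + 20.0 * x_km / 50.0
rough = amplitude * np.sin(2 * np.pi * x_km / 2.0) + 3.0 * rng.standard_normal(x_km.size)
relief_m = regional + rough

# Remove the regional trend, then compare patch-scale RMS relief
detrended = relief_m - np.polyval(np.polyfit(x_km, relief_m, 1), x_km)

def rms(a):
    return float(np.sqrt(np.mean(a ** 2)))

print(f"RMS relief, first 10 km (smoother patch): {rms(detrended[x_km < 10]):.1f} m")
print(f"RMS relief, last 10 km (rougher patch):   {rms(detrended[x_km > 40]):.1f} m")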
April 23, 2020 | https://www.sciencedaily.com/releases/2020/04/200423082229.htm | Seismic map of North America reveals geologic clues, earthquake hazards | How do mountains form? What forces are needed to carve out a basin? Why does the Earth tremble and quake? | Earth scientists pursue these fundamental questions to gain a better understanding of our planet's deep past and present workings. Their discoveries also help us plan for the future by preparing us for earthquakes, determining where to drill for oil and gas, and more. Now, in a new, expanded map of the tectonic stresses acting on North America, Stanford researchers present the most comprehensive view yet of the forces at play beneath the Earth's surface.The findings, published in "Understanding the forces in the Earth's crust is fundamental science," said study co-author Mark Zoback, the Benjamin M. Page Professor of Geophysics in Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth). "In some cases, it has immediate application, in others, it may be applied decades later to practical questions that do not exist today."The new research provides the first quantitative synthesis of faulting across the entire continent, as well as hundreds of measurements of compressive stress directions -- the direction from which the greatest pressure occurs in the Earth's crust. The map was produced by compiling new and previously published measurements from boreholes as well as inferences about kinds or "styles" of faults based on earthquakes that have occurred in the past.The three possible styles of faulting include extensional, or normal faulting, in which the crust extends horizontally; strike-slip faulting, in which the Earth slides past itself, like in the San Andreas fault; and reverse, or thrust, faulting in which the Earth moves over itself. Each one causes very different shaking from a hazard point of view."In our hazards maps right now, in most places, we don't have direct evidence of what kind of earthquake mechanisms could occur," said Jack Baker, a professor of civil and environmental engineering who was not involved with the study. "It's exciting that we have switched from this blind assumption of anything is possible to having some location-specific inferences about what types of earthquakes we might expect."In addition to presenting a continent-level view of the processes governing the North American plate, the data -- which incorporates nearly 2,000 stress orientations, 300 of which are new to this study -- offer regional clues about the behavior of the subsurface."If you know an orientation of any fault and the state of stress nearby, you know how likely it is to fail and whether you should be concerned about it in both naturally-triggered and industry-triggered earthquake scenarios," said lead author Jens-Erik Lund Snee, PhD '20, now a postdoctoral fellow with the United States Geological Survey (USGS) in Lakewood, Colorado. "We've detailed a few places where previously published geodynamic models agree very well with the new data, and others where the models don't agree well at all."In the Eastern U.S., for example, the style of faulting revealed by the study is exactly the opposite of what would be expected as the surface slowly "rebounds" following the melting of the ice sheets that covered most of Canada and the northern U.S. some 20,000 years ago, according to Lund Snee. 
The discovery that the rebound stresses are much less than those already stored in the crust from plate tectonics will advance scientists' understanding of the earthquake potential in that area.In the Western U.S., the researchers were surprised to see changes in stress types and orientations over short distances, with major rotations occurring over only tens of miles -- a feature that current models of Earth dynamics do not reveal."It's just much clearer now how stress can systematically vary on the scale of a sedimentary basin in some areas," Zoback said. "We see things we've never seen before that require geologic explanation. This will teach us new things about how the Earth works."Zoback is also a senior fellow at the Stanford Precourt Institute for Energy, co-director of the Stanford Center for Induced and Triggered Seismicity (SCITS) and director of the Stanford Natural Gas Initiative. Baker is also an affiliate at the Stanford Precourt Institute for Energy.The study was supported by SCITS, an industrial affiliates program that studies scientific and operational issues associated with triggered and induced earthquakes. | Earthquakes | 2,020 |
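The three faulting styles named in this article map onto a standard bookkeeping rule (Andersonian faulting theory): whichever of the three principal stresses is closest to vertical sets the style. The study itself infers styles from earthquakes and borehole measurements; the Python helper below, with made-up stress values, only spells out that classification logic:

def faulting_style(s_vertical, s_hmax, s_hmin):
    # Classify faulting style from the vertical stress and the maximum and
    # minimum horizontal stresses (same units). Standard Andersonian scheme;
    # the example inputs below are hypothetical, not values from the stress map.
    if s_vertical >= s_hmax >= s_hmin:
        return "normal (extensional)"   # vertical stress is the largest
    if s_hmax >= s_vertical >= s_hmin:
        return "strike-slip"            # vertical stress is intermediate
    if s_hmax >= s_hmin >= s_vertical:
        return "reverse (thrust)"       # vertical stress is the smallest
    return "indeterminate (expects s_hmax >= s_hmin)"

# Example stresses in MPa at some depth (made-up numbers)
print(faulting_style(70, 60, 40))   # normal faulting regime
print(faulting_style(70, 90, 50))   # strike-slip regime
print(faulting_style(40, 90, 60))   # reverse faulting regime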
April 22, 2020 | https://www.sciencedaily.com/releases/2020/04/200422151310.htm | Tectonic plates started shifting earlier than previously thought | An enduring question in geology is when Earth's tectonic plates began pushing and pulling in a process that helped the planet evolve and shaped its continents into the ones that exist today. Some researchers theorize it happened around four billion years ago, while others think it was closer to one billion. | A research team led by Harvard researchers looked for clues in ancient rocks (older than 3 billion years) from Australia and South Africa, and found that these plates were moving at least 3.2 billion years ago on the early Earth. In a portion of the Pilbara Craton in Western Australia, one of the oldest pieces of the Earth's crust, scientists found a latitudinal drift of about 2.5 centimeters a year, and dated the motion to 3.2 billion years ago. The researchers believe this shift is the earliest proof that modern-like plate motion happened between two and four billion years ago. It adds to growing research that tectonic movement occurred on the early Earth. The findings have now been published. "Basically, this is one piece of geological evidence to extend the record of plate tectonics on Earth farther back in Earth history," said Alec Brenner, one of the paper's lead authors and a member of Harvard's Paleomagnetics Lab. "Based on the evidence we found, it looks like plate tectonics is a much more likely process to have occurred on the early Earth and that argues for an Earth that looks a lot more similar to today's than a lot of people think." Plate tectonics is key to the evolution of life and the development of the planet. Today, the Earth's outer shell consists of about 15 rigid blocks of crust. On them sit the planet's continents and oceans. The movement of these plates shaped the location of the continents. It helped form new ones and it created unique landforms like mountain ranges. It also exposed new rocks to the atmosphere, which led to chemical reactions that stabilized Earth's surface temperature over billions of years. A stable climate is crucial to the evolution of life. When the first shifts occurred has long been an issue of considerable debate in geology. Any information that sheds light on it is valuable. The study, published on Earth Day, helps fill in some of the gaps. It also loosely suggests the earliest forms of life developed in a more moderate environment. "We're trying to understand the geophysical principles that drive the Earth," said Roger Fu, one of the paper's lead authors and an assistant professor of earth and planetary sciences in the Faculty of Arts and Sciences. "Plate tectonics cycles elements that are necessary for life into the Earth and out of it." Plate tectonics helps planetary scientists understand worlds beyond this one, too. "Currently, Earth is the only known planetary body that has robustly established plate tectonics of any kind," said Brenner, a third-year graduate student in the Graduate School of Arts and Sciences. "It really behooves us as we search for planets in other solar systems to understand the whole set of processes that led to plate tectonics on Earth and what driving forces transpired to initiate it.
That hopefully would give us a sense of how easy it is for plate tectonics to happen on other worlds, especially given all the linkages between plate tectonics, the evolution of life and the stabilization of climate."For the study, members of the project traveled to Pilbara Craton in Western Australia. A craton is a primordial, thick, and very stable piece of crust. They are usually found in the middle of tectonic plates and are the ancient hearts of the Earth's continents.This makes them the natural place to go to study the early Earth. The Pilbara Craton stretches about 300 miles across, covering approximately the same area as the state of Pennsylvania. Rocks there formed as early as 3.5 billion years ago.In 2017, Fu and Brenner took samples from a portion called the Honeyeater Basalt. They drilled into the rocks there and collected core samples about an inch wide.They brought the samples back to Fu's lab in Cambridge, where they placed the samples into magnetometers and demagnetizing equipment. These instruments told them the rock's magnetic history. The oldest, most stable bit of that history is hopefully when the rock formed. In this case, it was 3.2 billion years ago.The team then used their data and data from other researchers, who've demagnetized rocks in nearby areas, to date when the rocks shifted from one point to another. They found a drift of 2.5 centimeters a year.Fu and Brenner's work differs from most studies because the scientists focused on measuring the position of the rocks over time while other work tends to focus on chemical structures in the rocks that suggest tectonic movement.Researchers used the novel Quantum Diamond Microscope to confirm their findings from 3.2 billion years ago. The microscope images the magnetic fields and particles of a sample. It was developed in collaboration between researchers at Harvard and MIT.In the paper, the researchers point out they weren't able to rule out a phenomenon called "true polar wander." It can also cause the Earth's surface to shift. Their results lean more towards plate tectonic motion because of the time interval of this geological movement.Fu and Brenner plan to keep analyzing data from the Pilbara Craton and other samples from around the world in future experiments. A love of the outdoors drives both of them, and so does an academic need to understand the Earth's planetary history."This is part of our heritage," Brenner said. | Earthquakes | 2,020 |
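The drift rate quoted above comes from comparing paleolatitudes recorded by rocks of different ages. Under the standard geocentric axial dipole assumption, the magnetic inclination I frozen into a rock gives its paleolatitude through tan(I) = 2 tan(latitude). The Python sketch below uses invented inclinations and ages -- not the Honeyeater Basalt results -- purely to show how a centimetres-per-year latitudinal drift falls out of such measurements:

import math

def paleolatitude_deg(inclination_deg):
    # Geocentric axial dipole relation: tan(I) = 2 * tan(latitude)
    return math.degrees(math.atan(math.tan(math.radians(inclination_deg)) / 2.0))

# Hypothetical paired measurements: (magnetic inclination in degrees, age in Myr)
older = (-55.0, 3350.0)     # invented
younger = (-35.0, 3280.0)   # invented

lat_old = paleolatitude_deg(older[0])
lat_new = paleolatitude_deg(younger[0])

km_per_degree = 111.0       # approximate length of one degree of latitude
drift_km = abs(lat_new - lat_old) * km_per_degree
rate_cm_per_yr = drift_km * 1.0e5 / (abs(older[1] - younger[1]) * 1.0e6)

print(f"paleolatitude change: {lat_old:.1f} deg -> {lat_new:.1f} deg")
print(f"implied latitudinal drift ~ {rate_cm_per_yr:.1f} cm per year")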
April 21, 2020 | https://www.sciencedaily.com/releases/2020/04/200421094301.htm | Holistic way to measure the economic fallout from earthquakes | When an earthquake or other natural disaster strikes, government relief agencies, insurers and other responders converge to take stock of fatalities and injuries, and to assess the extent and cost of damage to public infrastructure and personal property. | But until now, such post-disaster assessment procedures have focused on the dollar value of damages to property while failing to account for something that is equally important but harder to quantify: namely, that the poorer someone or their family is, the harder it is for them to recover and regain their former standard of living. Now, civil engineers at Stanford, working with economists from the World Bank, have devised the first disaster assessment model that combines the well-understood property damage estimates with a way to calculate two previously nebulous variables -- the community-wide economic impacts caused by disruptions to industry and jobs, and the social costs to individuals and families. Although such an analysis might seem obvious, no one had ever put brick-and-mortar losses together with pain and suffering consequences until the Stanford civil engineers teamed up with World Bank economists Stephane Hallegatte and Brian Walsh to create this holistic damage assessment model. In a study published March 30, the researchers tailored their approach specifically for earthquakes, but they hope experts in hurricanes, tornados, floods and other disasters will also adopt the new economic and sociological measures in order to give policymakers new tools to plan for disasters. Study first author Maryia Markhvida, a former graduate student of Baker's, said the researchers started with traditional models of property damage assessment, and added on top of these a second layer of analysis to quantify a concept called "well-being," which they borrowed from economists and sociologists. The model calculates the incomes and consumption levels of people in different socioeconomic strata to assign a numerical value to well-being, which can be thought of as how people feel about daily life as they recover from a disaster. The researchers combined the physical damage tools with economic and well-being assessments to create a more holistic model of disaster effects. For example, should the property damage part of their system show that a particular building is likely to collapse, the second layer of analysis would kick in to extrapolate how such structural damage would affect where people work and how it would affect a variety of industries in ways that would trickle down to impact people's incomes and spending power, thus diminishing their sense of well-being. Much of the income, expense and spending data for the analysis was derived from Census data, which enabled the researchers to tie their well-being calculations to the relative poverty or prosperity of people living in different neighborhoods. When they estimated the relative loss of well-being for people in each of four income brackets, they found that those at the bottom felt a roughly 60 percent loss of well-being as a fraction of the Bay Area's average annual income, relative to something closer to 25 percent for those at the top (see Chart A). The researchers also compared three types of losses for 10 cities in the San Francisco Bay Area (see Chart B). 
These calculations factored in the vulnerability of each city's building stock (the number and types of homes, offices and other structures), its proximity to the hypothesized earthquake and other factors such as what sort of savings and insurance people had as safety cushions. The chart shows that, even when property damages are roughly equal, well-being losses are larger in cities that have lower-income populations and lower household savings. "It makes sense that the people who have less in the first place feel that life becomes that much harder when they lose some of what little they had," Markhvida said. The researchers envision that policymakers will use the model to consider in advance how to mitigate the impacts of a quake and speed the region's recovery afterward. They might, for instance, run "what if" exercises to weigh the relative benefits of measures such as tightening building codes, providing incentives to do retrofits or get earthquake insurance, or make contingency plans to extend or expand unemployment benefits. "This model could help government officials decide which policies provide the best bang for the buck, and also see how they might affect not just potential property damages, but losses to people's sense of well-being," Baker said. | Earthquakes | 2020
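The following toy Python sketch illustrates the general idea that identical dollar losses translate into larger well-being losses for poorer households. The isoelastic utility form and the parameter values are illustrative assumptions, not the actual Stanford/World Bank model.

```python
# Toy sketch of the idea that equal dollar losses translate into unequal
# well-being losses across income groups. The isoelastic-utility form and
# the elasticity value are illustrative assumptions, not the model used
# in the Stanford/World Bank study.
def wellbeing_loss(income, asset_loss, elasticity=1.5):
    """Utility-based loss from a drop in consumption equal to asset_loss."""
    def utility(c):
        return c ** (1 - elasticity) / (1 - elasticity)
    return utility(income) - utility(income - asset_loss)

loss = 10_000  # same dollar loss for both households (hypothetical)
for income in (30_000, 150_000):
    print(f"income ${income:>7,}: well-being loss {wellbeing_loss(income, loss):.2e}")
```

With these assumed values, the lower-income household's well-being loss comes out roughly an order of magnitude larger for the same dollar loss, which is the qualitative pattern the study describes.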
April 14, 2020 | https://www.sciencedaily.com/releases/2020/04/200414125746.htm | Timing of large earthquakes follows a 'devil's staircase' pattern | At the regional level and worldwide, the occurrence of large shallow earthquakes appears to follow a mathematical pattern called the Devil's Staircase, where clusters of earthquake events are separated by long but irregular intervals of seismic quiet. | The researchers note that their finding could have implications for seismic hazard assessment. For instance, they found that these large earthquake sequences (those with events of magnitude 6.0 or greater) are "burstier" than expected, meaning that the clustering of earthquakes in time results in a higher probability of repeating seismic events soon after a large earthquake. The irregular gap between event bursts also makes it more difficult to predict an average recurrence time between big earthquakes. Seismologists' catalogs for large earthquakes in a region might include too few earthquakes over too short a time to capture the whole staircase pattern, making it "difficult to know whether the few events in a catalog occurred within an earthquake cluster or spanned both clusters and quiescent intervals," Chen and his colleagues noted. "For this same reason, we need to be cautious when assessing whether an event is 'overdue' just because the time measured from the previous event has passed some 'mean recurrence time' based on an incomplete catalog," they added. The Devil's Staircase, sometimes called a Cantor function, is a fractal demonstrated by nonlinear dynamic systems, in which a change in any part could affect the behavior of the whole system. In nature, the pattern can be found in sedimentation sequences, changes in uplift and erosion rates and reversals in Earth's magnetic field, among other examples. Chen's Ph.D. advisor Mian Liu had an unusual introduction to the Devil's Staircase. "I stumbled into this topic a few years ago when I read about two UCLA researchers' study of the temporal pattern of a notorious serial killer, Andrei Chikatilo, who killed at least 52 people from 1979 to 1990 in the former Soviet Union," he explained. "The time pattern of his killings is a Devil's staircase. The researchers were trying to understand how the criminal's mind worked, how neurons stimulate each other in the brain. I was intrigued because I realized that earthquakes work in a similar way, that a fault rupture could stimulate activity on other faults by stress transfer." "Conceptually, we also know that many large earthquakes, which involve rupture of multiple and variable fault segments in each rupture, violate the basic assumption of the periodic earthquakes model, which is based on repeated accumulation and release of energy on a given fault plane," Liu added. The factors controlling the clustered events are complex, and could involve the stress that stimulates an earthquake, changes in frictional properties and stress transfer between faults or fault segments during a rupture, among other factors, said Gang Luo of Wuhan University. He noted that the intervals appear to be inversely related to the background tectonic strain rate for a region. | Earthquakes | 2020
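A small Python sketch of one standard way to quantify the "burstiness" of a catalog's inter-event times is shown below. The burstiness coefficient used here is a common measure from the time-series literature, not necessarily the statistic used by Chen and colleagues, and the event times are made up.

```python
# Small sketch: quantifying how "bursty" a sequence of earthquake origin
# times is. The burstiness coefficient B = (sigma - mu) / (sigma + mu) of the
# inter-event times is a standard measure (B near 0 for a Poisson process,
# B approaching 1 for strongly clustered events); it is not necessarily the
# exact statistic used in the study, and the event times below are made up.
import numpy as np

event_times_yr = np.array([0.0, 0.2, 0.3, 0.35, 12.0, 12.1, 12.4, 40.0, 40.2, 71.0])
intervals = np.diff(event_times_yr)

mu, sigma = intervals.mean(), intervals.std()
burstiness = (sigma - mu) / (sigma + mu)
print(f"mean interval: {mu:.2f} yr, std: {sigma:.2f} yr, burstiness B = {burstiness:.2f}")
```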
April 7, 2020 | https://www.sciencedaily.com/releases/2020/04/200407150819.htm | Making a connection: Two ways that fault segments may overcome their separation | In complex fault zones, multiple seemingly disconnected faults can potentially rupture at once, increasing the chance of a large damaging earthquake. Recent earthquakes including the 1992 Landers, 1999 Hector Mine and 2019 Ridgecrest earthquakes in California, among others, ruptured in this way. But how can seismologists predict whether individual fault segments might be connected and rupture together during a seismic event? | One way might be to look for clues that the segments are connected below the surface, according to David Oglesby, a researcher at the University of California, Riverside. His study is published in the Bulletin of the Seismological Society of America (BSSA). In a second paper, also published in BSSA, Hui Wang of the Chinese Earthquake Administration and colleagues conclude that a rupture along a stepover fault, where parallel fault segments overlap in the direction of a rupture, might be able to "jump" over a wider gap between the fault segments than previously thought. In both cases, making the connection between fault segments could have a significant impact on assessing seismic hazards for a region. "The potential maximum rupture length, hence the maximum magnitude [of an earthquake], is an important parameter for assessing seismic hazards," said Mian Liu of the University of Missouri-Columbia, a co-author on the Wang study. "The details of connectivity can have a controlling influence on whether you get a big earthquake that jumps across what appear to be multiple fault segments or a small earthquake that remains on a small segment," Oglesby said. Oglesby began thinking about this problem of discerning connections at depth after a conference where one of the speakers suggested that completely disconnected faults would have different slip patterns than faults connected at depth. Modeling that looked at slip distribution -- broadly, where slip occurs along a fault -- might be useful, he thought. In his 3D dynamic rupture modeling of fault segments disconnected by gaps, Oglesby looked in particular at how rapidly the slip decays to zero at the edge of a fault segment on the surface. Does the amount of slip gradually decrease toward zero at the edge, or does it quickly decrease to zero? The models suggest that "all things being equal, if a fault appears to be disconnected at the surface but is connected at relatively shallow depth, then typically the slip will decay very rapidly to zero at the edge of the fault segment," Oglesby said. Shallow depth in this case means that the segments are connected at about 1 to 2 kilometers (0.6 to 1.2 miles) below the surface, he noted. If the fault remains completely disconnected or is connected deeper than 1 to 2 kilometers, "then the slip will not decay to zero as rapidly at the edge of the surface fault segment," Oglesby explained, since the deeper connection is too far away to have a strong effect on surface slip distribution. Oglesby stressed that his models are simplified, and don't account for other factors such as the high stress and strain and potential rock failure around the edges of fault segments. "And just because you get this rapid decay, it doesn't necessarily mean that [a fault] is connected at depth," he noted. "There are lots of factors that affect fault slip. 
It's a clue, but not a smoking gun."In their modeling study, Wang and colleagues took a closer look at what factors might influence a rupture's jump between parallel fault segments in a stepover system. They were prompted by events such as the 2016 magnitude 7.8 Kaikoura, New Zealand, earthquake, where rupture jumped between nearly parallel fault segments as much as 15 to 20 kilometers apart.The researchers found that by including the background effects of changes in stress in a stepover, ruptures could jump over a wider space than the 5 kilometers (about 3.1 miles) predicted by some earlier studies.Wang and colleagues' models suggest instead that a rupture may jump more than 15 kilometers (9.3 miles) in a releasing or extensional stepover, or 7 kilometers (4.3 miles) in a restraining or compressive stepover fault.Their models combine data on long-term tectonic stress changes with changes in stress predicted by fault dynamic rupture models, providing a fuller picture of stress changes along a fault over a timescale of both millions of years and a few seconds. "We realized that we needed to bridge these different fault models to better understand fault mechanics," said Liu.Liu also cautioned that their models only measure one aspect of complex fault geometry. "Although many factors could contribute to rupture propagation across stepovers, the step width is perhaps one of the easiest to measure, so hopefully our results would lead to more studies and a better understanding of complex fault systems." | Earthquakes | 2,020 |
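As a rough illustration of the diagnostic Oglesby describes -- how rapidly surface slip decays to zero at the end of a segment -- the Python sketch below compares two synthetic slip tapers. The profiles and the decay-length metric are illustrative only and are not drawn from his rupture models.

```python
# Minimal sketch of the diagnostic described by Oglesby: measuring how
# rapidly surface slip decays to zero toward the end of a fault segment.
# The two synthetic slip profiles and the "decay length" metric below are
# purely illustrative; they are not output from his rupture models.
import numpy as np

x = np.linspace(0.0, 10.0, 201)          # distance from segment end, km

gentle_taper = np.tanh(x / 3.0)          # slip decays slowly toward the end
abrupt_taper = np.tanh(x / 0.5)          # slip drops to zero very rapidly

def decay_length(x, slip, fraction=0.8):
    """Distance from the segment end at which slip reaches `fraction` of its max."""
    target = fraction * slip.max()
    return x[np.argmax(slip >= target)]

for name, slip in [("gentle taper", gentle_taper), ("abrupt taper", abrupt_taper)]:
    print(f"{name}: slip reaches 80% of max within {decay_length(x, slip):.2f} km of the end")
```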
March 31, 2020 | https://www.sciencedaily.com/releases/2020/03/200331162247.htm | Sediments may control location, magnitude of megaquakes | The world's most powerful earthquakes strike at subduction zones, areas where enormous amounts of stress build up as one tectonic plate dives beneath another. When suddenly released, this stress can cause devastating "megaquakes" like the 2011 Mw 9.0 Tohoku event, which killed nearly 16,000 people and crippled Japan's Fukushima Dai-ichi Nuclear Power Plant. Now a study published in Geology suggests that sediments atop the downgoing slab can play a key role in determining the magnitude and location of these catastrophic events. | In this newly published study, a team led by Gou Fujie, a senior scientist at the Japan Agency for Marine-Earth Science and Technology, used a trio of geophysical methods to image the subducting sediments in the northeastern Japan arc, where the Tohoku event occurred. The findings suggest that variations caused by volcanic rocks intruded into these sediments can substantially influence the nature of subduction zone earthquakes."Our imaging shows that the enormous amount of slip that occurred during the 2011 Tohoku earthquake stopped in an area of thin sediments that are just starting to subduct," says Fujie. "These results indicate that by disturbing local sediment layers, volcanic activity that occurred prior to subduction can affect the size and the distribution of interplate earthquakes after the layers have been subducted."Researchers first began to suspect that variations in subducting sediments could influence megaquakes after the 2011 Tohoku event, when international drilling in the northeastern Japan arc showed that giant amounts of slip during the earthquake occurred in a slippery, clay-rich layer located within the subducting sediments. To better understand the nature of the downgoing slab in this region, Fujie's team combined several imaging techniques to paint a clearer picture of the subseafloor structure.The researchers discovered there are what Fujie calls "remarkable regional variations" in the sediments atop the downgoing plate, even where the seafloor topography seems to be flat. There are places, he says, where the sediment layer appears to be extremely thin due to the presence of an ancient lava flow or other volcanic rocks. These volcanic intrusions have heavily disturbed, and in places thermally metamorphosed, the clay layer in which much of the seismic slip occurred.Because the type of volcanism that caused sediment thinning in the northeastern Japan arc has also been found in many areas, says Fujie, the research suggests such thinning is ubiquitous -- and that this type of volcanic activity has also affected other seismic events. "Regional variations in sediments atop descending oceanic plates appear to strongly influence devastating subduction zone earthquakes," he concludes. | Earthquakes | 2,020 |
March 31, 2020 | https://www.sciencedaily.com/releases/2020/03/200331130008.htm | Potential for using fiber-optic networks to assess ground motions during earthquakes | A new study from a University of Michigan researcher and colleagues at three institutions demonstrates the potential for using existing networks of buried optical fibers as an inexpensive observatory for monitoring and studying earthquakes. | The study provides new evidence that the same optical fibers that deliver high-speed internet and HD video to our homes could one day double as seismic sensors. "Fiber-optic cables are the backbone of modern telecommunications, and we have demonstrated that we can turn existing networks into extensive seismic arrays to assess ground motions during earthquakes," said U-M seismologist Zack Spica, first author of a paper published online Feb. 12. The study was conducted using a prototype array at Stanford University, where Spica was a postdoctoral fellow for several years before recently joining the U-M faculty as an assistant professor in the Department of Earth and Environmental Sciences. Co-authors include researchers at Stanford and from Mexico and Virginia. "This is the first time that fiber-optic seismology has been used to derive a standard measure of subsurface properties that is used by earthquake engineers to anticipate the severity of shaking," said geophysicist Greg Beroza, a co-author on the paper and the Wayne Loel Professor in Stanford's School of Earth, Energy & Environmental Sciences. To transform a fiber-optic cable into a seismic sensor, the researchers connect an instrument called a laser interrogator to one end of the cable. It shoots pulses of laser light down the fiber. The light bounces back when it encounters impurities along the fiber, creating a "backscatter signal" that is analyzed by a device called an interferometer. Changes in the backscatter signal can reveal how the fiber stretches or compresses in response to passing disturbances, including seismic waves from earthquakes. The technique is called distributed acoustic sensing, or DAS, and has been used for years to monitor the health of pipelines and wells in the oil and gas industry. The new study demonstrates that optical fibers can be used to sense seismic waves and obtain velocity models and resonance frequencies of the ground -- two parameters that are essential for ground-motion prediction and seismic-hazard assessment. Spica and his colleagues say their results are in good agreement with an independent survey that used traditional techniques, thereby validating the methodology of fiber-optic seismology. This approach appears to have great potential for use in large, earthquake-threatened cities such as San Francisco, Los Angeles, Tokyo and Mexico City, where thousands of miles of optical cables are buried beneath the surface. "What's great about using fiber for this is that cities already have it as part of their infrastructure, so all we have to do is tap into it," Beroza said. Many of these urban centers are built atop soft sediments that amplify and extend earthquake shaking. 
The near-surface geology can vary considerably from neighborhood to neighborhood, highlighting the need for detailed, site-specific information.Yet getting that kind of information can be a challenge with traditional techniques, which involve the deployment of large seismometer arrays -- thousands of such instruments in the Los Angeles area, for example."In urban areas, it is very difficult to find a place to install seismic stations because asphalt is everywhere," Spica said. "In addition, many of these lands are private and not accessible, and you cannot always leave a seismic station standing alone because of the risk of theft."Fiber optics could someday mark the end of such large scale and expensive experiments. The cables are buried under the asphalt and crisscross the entire city, with none of the disadvantages of surface seismic stations."The technique would likely be fairly inexpensive, as well, Spica said. Typically, commercial fiber-optic cables contain unused fibers that can be leased for other purposes, including seismology.For the moment, traditional seismometers provide better performance than prototype systems that use fiber-optic sensing. Also, seismometers sense ground movements in three directions, while optical fibers only sense along the direction of the fiber.The 3-mile Stanford fiber-optic array and data acquisition were made possible by a collective effort from Stanford IT services, Stanford Geophysics, and OptaSense Ltd. Financial support was provided by the Stanford Exploration Project, the U.S. Department of Energy and the Schlumberger Fellowship.The next phase of the project involves a much larger test array. A 27-mile loop was formed recently by linking optical fibers on Stanford's historic campus with fibers at several other nearby locations.The other authors of the | Earthquakes | 2,020 |
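One of the site parameters mentioned above, the resonance frequency of the ground, is often estimated with the standard quarter-wavelength rule. The short Python sketch below applies that rule to hypothetical values; the numbers are not results from the Stanford array.

```python
# Brief sketch (not from the study): the standard quarter-wavelength estimate
# of a sediment layer's fundamental resonance frequency, one of the site
# parameters the fiber-optic survey was used to constrain. The layer thickness
# and shear-wave velocity below are hypothetical, not results from the array.
def resonance_frequency_hz(shear_velocity_m_s, layer_thickness_m):
    """Fundamental resonance frequency f0 = Vs / (4 * H) of a soft layer over bedrock."""
    return shear_velocity_m_s / (4.0 * layer_thickness_m)

# Hypothetical soft-sediment site: Vs = 250 m/s, thickness = 40 m
print(f"f0 = {resonance_frequency_hz(250.0, 40.0):.2f} Hz")
```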
March 25, 2020 | https://www.sciencedaily.com/releases/2020/03/200325154052.htm | Eclectic rocks influence earthquake types | New Zealand's largest fault is a jumble of mixed-up rocks of all shapes, sizes, compositions and origins. According to research from a global team of scientists, this motley mixture could help explain why the fault generates slow-motion earthquakes known as "slow slip events" as well as destructive, tsunami-generating tremors. | "One thing that really surprised us was the sheer diversity of rock types," said Laura Wallace, a research scientist at the University of Texas Institute for Geophysics (UTIG) and co-chief scientist on the expedition that retrieved rock samples from the fault. "These rocks that are being mashed up together all behave very differently in terms of their earthquake-generating potential." The finding was described in a paper published March 25, 2020. Subduction zones -- places where one tectonic plate dives beneath another -- are where the world's largest and most damaging earthquakes occur. Scientists have long debated why quakes are more powerful or more frequent at some subduction zones than at others, and whether there may be a connection with the slow slip events, which can take weeks or months to unfold. Although they are not felt by people on the surface, the energy they release into the Earth is comparable to powerful earthquakes. "It has become apparent only in the last few years that slow slip events happen at many different types of faults, and some at depths in the Earth much shallower than previously thought," said the paper's lead author, Philip Barnes of the New Zealand Institute for Water and Atmospheric Research (NIWA). "It's raised a lot of big questions about why they happen, and how they affect other kinds of earthquakes." To answer these questions, Barnes, Wallace, and UTIG Director Demian Saffer led two scientific ocean drilling expeditions to a region off the coast of New Zealand, where they drilled into and recovered rocks from the vicinity of the tremors' source. UTIG is a research unit of the UT Jackson School of Geosciences. "The earthquake and geological science community has speculated about what goes into a subduction zone where slow earthquakes occur," said Saffer, who was co-chief scientist on the second expedition. "But this was the first time we've literally held those rocks -- and physical evidence for any of those ideas -- in our hands." The team drilled into the remains of a buried, ancient sea mountain where they found pieces of volcanic rock, hard, chalky, carbonate rocks, clay-like mudrocks, and layers of sediments eroded from the mountain's surface. Kelin Wang, an expert in earthquake physics and slow slip events at the Geological Survey of Canada, said that the paper was effectively a breakthrough in understanding how the same fault can generate different types of earthquakes. "In addition to helping us understand the geology of slow slip events, this paper also helps explain how the same fault can exhibit complex slip behavior, including tsunami-generating earthquakes," said Wang, who was not part of the study. Efforts to understand the connection between slow slip events and more destructive earthquakes are already underway. 
These studies, which are being led by other UTIG researchers, include detailed seismic imaging -- which is similar to a geological CAT scan -- of the slow slip zone in New Zealand, and an ongoing effort to track the behavior of subduction zones around the world by installing sensors on and beneath the seafloor. The goal of the work is to develop a better understanding of the events that lead up to a slow slip event versus a tsunami-generating earthquake."The next needed steps are to continue installing offshore instruments at subduction zones in New Zealand and elsewhere so we can closely monitor these large offshore faults, ultimately helping communities to be better prepared for future earthquakes and tsunami," said Wallace, who also works at GNS Science, New Zealand's government-funded geosciences research institute. | Earthquakes | 2,020 |
March 16, 2020 | https://www.sciencedaily.com/releases/2020/03/200316141510.htm | Shifts in deep geologic structure may have magnified great 2011 Japan tsunami | On March 11, 2011, a magnitude 9 earthquake struck under the seabed off Japan -- the most powerful quake to hit the country in modern times, and the fourth most powerful in the world since modern record keeping began. It generated a series of tsunami waves that reached an extraordinary 125 to 130 feet high in places. The waves devastated much of Japan's populous coastline, caused three nuclear reactors to melt down, and killed close to 20,000 people. | The tsunami's obvious cause: the quake occurred in a subduction zone, where the tectonic plate underlying the Pacific Ocean was trying to slide under the adjoining continental plate holding up Japan and other landmasses. The plates had been largely stuck against each other for centuries, and pressure built up. Finally, something gave. Hundreds of square miles of seafloor suddenly lurched horizontally some 160 feet, and thrust upward by up to 33 feet. Scientists call this a megathrust. Like a hand waved vigorously underwater in a bathtub, the lurch propagated to the sea surface and translated into waves. As they approached shallow coastal waters, their energy concentrated, and they grew in height. The rest is history. But scientists soon realized that something did not add up. Tsunami sizes tend to mirror earthquake magnitudes on a predictable scale; this one produced waves three or four times bigger than expected. Just months later, Japanese scientists identified another, highly unusual fault some 30 miles closer to shore that seemed to have moved in tandem with the megathrust. This fault, they reasoned, could have magnified the tsunami. But exactly how it came to develop there, they could not say. Now, a new study offers an explanation. The study's authors, based at Columbia University's Lamont-Doherty Earth Observatory, examined a wide variety of data collected by other researchers before the quake and after. This included seafloor topographic maps, sediments from underwater boreholes, and records of seismic shocks apart from the megathrust. The unusual fault in question is a so-called extensional fault -- one in which the Earth's crust is pulled apart rather than being pushed together. Following the megathrust, the area around the extensional fault moved some 200 feet seaward, and a series of scarps 10 to 15 feet high could be seen there, indicating a sudden, powerful break. The area around the extensional fault was also warmer than the surrounding seabed, indicating friction from a very recent movement; that suggested the extensional fault had been jolted loose when the megathrust struck. This in turn would have added to the tsunami's power. Extensional faults are in fact common around subduction zones -- but only in oceanic plates, not the overriding continental ones, where this one was found. How did it get there? And, might such dangerous features lurk in other parts of the world? The authors of the new paper believe the answer is the angle at which the ocean plate dives under the continental; they say it has been gradually shallowing out over millions of years. "Most people would say it was the megathrust that caused the tsunami, but we and some others are saying there may have been something else at work on top of that," said Lamont PhD student Bar Oryan, the paper's lead author. 
"What's new here is we explain the mechanism of how the fault developed."The researchers say that long ago, the oceanic plate was moving down at a steeper angle, and could drop fairly easily, without disturbing the seafloor on the overriding continental plate. Any extensional faulting was probably confined to the oceanic plate behind the trench -- the zone where the two plates meet. Then, starting maybe 4 million or 5 million years ago, it appears that angle of subduction began declining. As a result, the oceanic plate began exerting pressure on sediments atop the continental plate. This pushed the sediments into a huge, subtle hump between the trench and Japan's shoreline. Once the hump got big and compressed enough, it was bound to break, and that was probably what happened when the megathrust quake shook things loose. The researchers used computer models to show how long-term changes in the dip of the plate could produce major changes in the short-term deformation during an earthquake.There are multiple lines of evidence. For one, material taken from boreholes before the quake show that sediments had been squeezed upward about midway between the land and the trench, while those closer to both the land and the trench had been subsiding -- similar to what might happen if one laid a piece of paper flat on a table and then slowly pushed in on it from opposite sides. Also, recordings of aftershocks in the six months after the big quake showed scores of extensional-fault-type earthquakes carpeting the seabed over the continental plate. This suggests that the big extensional fault is only the most obvious one; strain was being released everywhere in smaller, similar quakes in surrounding areas, as the hump relaxed.Furthermore, on land, Japan hosts numerous volcanoes arranged in a neat north-south arc. These are fueled by magma generated 50 or 60 miles down, at the interface between the subducting slab and the continental plate. Over the same 4 million to 5 million years, this arc has been migrating westward, away from the trench. Since magma generation tends to take place at a fairly constant depth, this adds to the evidence that the angle of subduction has gradually been growing shallower, pushing the magma-generating zone further inland.Lamont geophysicist and coauthor Roger Buck said that the study and the earlier ones it builds on have global implications. "If we can go and find out if the subduction angle is moving up or down, and see if sediments are undergoing this same kind of deformation, we might be better able to say where this kind of risk exists," he said. Candidates for such investigation would include areas off Nicaragua, Alaska, Java and others in the earthquake zones of the Pacific Ring of Fire. "These are areas that matter to millions of people," he said. | Earthquakes | 2,020 |
March 12, 2020 | https://www.sciencedaily.com/releases/2020/03/200312142323.htm | 'Fossil earthquakes' offer new insight into seismic activity deep below Earth's surface | A major international study has shed new light on the mechanisms through which earthquakes are triggered up to 40km beneath the earth's surface. | While such earthquakes are unusual, because rocks at those depths are expected to creep slowly and aseismically, they account for around 30 per cent of intracontinental seismic activity. Recent examples include a significant proportion of seismicity in the Himalaya as well as aftershocks associated with the 2001 Bhuj earthquake in India. However, very little is presently known about what causes them, in large part due to the fact that any effects are normally hidden deep underground. In the current study, the researchers showed that earthquake ruptures may be encouraged by the interaction of different shear zones that are creeping slowly and aseismically. This interaction loads the adjacent blocks of stiff rocks in the deep crust, until they cannot sustain the rising stress anymore, and snap -- generating earthquakes. Emphasising observations of quite complex networks created by earthquake-generated faults, they suggest that this context is characterised by repeating cycles of deformation, with long-term slow creep on the shear zones punctuated by episodic earthquakes. Although only a transient component of such deformation cycles, the earthquakes release a significant proportion of the accumulated stress across the region. The research was led by the University of Plymouth (UK) and University of Oslo (Norway), with scientists conducting geological observations of seismic structures in exhumed lower crustal rocks on the Lofoten Islands. The region is home to one of the few well-exposed large sections of exhumed continental lower crust in the world, exposed during the opening of the North Atlantic Ocean. Scientists spent several months in the region, conducting a detailed analysis of the exposed rock and in particular pristine pseudotachylytes (solidified melt produced during seismic slip, regarded as 'fossil earthquakes') which decorate fault sets linking adjacent or intersecting shear zones. They also collected samples from the region which were then analysed using cutting-edge technology in the University's Plymouth Electron Microscopy Centre. Lead author Dr Lucy Campbell, Post-Doctoral Research Fellow at the University of Plymouth, said: "The Lofoten Islands provide an almost unique location in which to examine the impact of earthquakes in the lower crust. But by looking at sections of exposed rock less than 15 metres wide, we were able to see examples of slow-forming rock deformation working to trigger earthquakes generated up to 30km beneath the surface. The model we have now developed provides a novel explanation of the causes and effects of such earthquakes that could be applied at many locations where they occur." Project lead Dr Luca Menegon, Associate Professor at the University of Plymouth and the University of Oslo, added: "Deep earthquakes can be as destructive as those nucleating closer to the Earth's surface. They often occur in highly populated areas in the interior of the continents, like in Central Asia for example. But while a lot is known about what causes seismic activity in the upper crust, we know far less about those which occur lower. 
This study gives us a fascinating insight into what is happening deep below the Earth's surface, and our challenge is now to take this research forward and see if we can use it to make at-risk communities more aware of the dangers posed by such activity."As part of the study, scientists also worked with University of Plymouth filmmaker Heidi Morstang to produce a 60-minute documentary film about their work. Pseudotachylyte premiered at the 2019 Bergen International Film Festival, and will be distributed internationally once it has screened at various other festivals globally. | Earthquakes | 2,020 |
March 12, 2020 | https://www.sciencedaily.com/releases/2020/03/200312123637.htm | Separations between earthquakes reveal clear patterns | When large earthquakes occur, seismologists are well aware that subsequent, smaller tremors are likely to take place afterwards in the surrounding geographical region. So far, however, few studies have explored how the similarity between these inter-earthquake times and distances is related to their separation from initial events. In a new study published in EPJ B, researchers led by Min Lin at the Ocean University of China in Qingdao show for the first time that the two values become increasingly correlated the closer they are in time and space to previous, larger earthquakes. | Earthquakes are among Earth's most familiar natural disasters, and this new mathematical insight into their occurrence could better inform policymakers about how to prepare for them. The team's work leads on from previous models, which were developed to understand the mechanisms and dynamics underlying earthquake occurrence following large, initial seismic events. Over a wide range of time and distance scales, Lin and colleagues revealed a strong 'cross-correlation' between inter-earthquake distances and times -- a quantity describing the similarity between the two values as a function of their relative separation in time and space from an original event. The researchers achieved their results through 'detrended cross-correlation analysis', performed on data gathered in the earthquake-prone regions of California and Sumatra between 1990 and 2013. Lin's team also accounted for the evolution in cross-correlation over time, revealing that the relationship remains strong in the time following large earthquakes but weakens both before and after this period. Their insights could help seismologists to better understand the patterns which unfold after large initial earthquakes. In turn, this could enable governments and local communities to better safeguard their populations against the worst effects of large seismic events. | Earthquakes | 2020
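For readers unfamiliar with the method, the Python sketch below shows a generic, textbook-style detrended cross-correlation coefficient applied to two synthetic series standing in for inter-earthquake times and distances. It is not the authors' code, and the window size and data are arbitrary.

```python
# Compact sketch of detrended cross-correlation analysis (DCCA) between two
# equally long series, e.g. inter-event times and inter-event distances.
# Generic textbook-style implementation with synthetic data, not the
# authors' code; the window size is an arbitrary choice.
import numpy as np

def dcca_coefficient(x, y, window):
    """DCCA cross-correlation coefficient for one window size."""
    X = np.cumsum(x - x.mean())          # integrated profiles
    Y = np.cumsum(y - y.mean())
    n_boxes = len(X) // window
    f_xy = f_xx = f_yy = 0.0
    for i in range(n_boxes):
        seg = slice(i * window, (i + 1) * window)
        t = np.arange(window)
        rx = X[seg] - np.polyval(np.polyfit(t, X[seg], 1), t)   # detrended residuals
        ry = Y[seg] - np.polyval(np.polyfit(t, Y[seg], 1), t)
        f_xy += np.mean(rx * ry)
        f_xx += np.mean(rx * rx)
        f_yy += np.mean(ry * ry)
    return f_xy / np.sqrt(f_xx * f_yy)

rng = np.random.default_rng(0)
common = rng.standard_normal(500)
inter_times = common + 0.5 * rng.standard_normal(500)   # synthetic, correlated pair
inter_dists = common + 0.5 * rng.standard_normal(500)
print(f"rho_DCCA = {dcca_coefficient(inter_times, inter_dists, window=20):.2f}")
```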
March 10, 2020 | https://www.sciencedaily.com/releases/2020/03/200310114715.htm | Injection strategies are crucial for geothermal projects | Geothermal energy, with its significant baseload capacity, has long been investigated as a potential complement and long-term replacement for traditional fossil fuels in electricity and heat production. In order to develop deep geothermal reservoirs where there are not enough natural fluid pathways, the formation needs to be hydraulically stimulated. Creation of so-called Enhanced Geothermal Systems (EGS) opens fluid flow paths by injecting large quantities of water at elevated pressures. This is typically accompanied by induced seismicity. | Some especially large induced earthquakes have led to the termination or suspension of several EGS projects in Europe, such as the deep heat mining projects in Basel and St. Gallen, both in Switzerland. Recently, the occurrence of an MW 5.5 earthquake in 2017 near Pohang, South Korea, has been linked to a nearby EGS project. As such, there now exists substantial public concern about EGS projects in densely populated areas. Developing new coupled monitoring and injection strategies to minimize the seismic risk is therefore key to the safe development of urban geothermal resources and to restoring public faith in this clean and renewable energy. In a new study, the researchers show that induced seismicity and magnitudes could be managed by changes in the injection strategy. Stimulations that reveal an unbounded increase in seismic moment suggest that, in these cases, the evolution of seismicity is mainly controlled by regional tectonics. During injection, a pressure-controlled rupture may become unstable, with the maximum expected magnitude then being limited only by the size of tectonic faults and fault connectivity. Close, near-real-time monitoring of how the seismic moment evolves with injected fluid could help to identify stress-controlled stimulations at the early stages of injection, or potentially diagnose critical changes in the stimulated system during injection, allowing an immediate adjustment of the stimulation strategy. | Earthquakes | 2020
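As an illustration of the kind of near-real-time check described above, the Python sketch below tracks cumulative seismic moment against a McGarr-type volume bound. That bound is a commonly used reference relation rather than necessarily the criterion used in this study, and all magnitudes, volumes and the shear modulus are made-up example values.

```python
# Illustrative sketch: tracking cumulative seismic moment against injected
# fluid volume. The McGarr-type bound M0_max ~ G * dV is a commonly used
# reference relation, not necessarily the exact criterion in this paper,
# and all numbers below (magnitudes, volume, shear modulus) are made up.
SHEAR_MODULUS_PA = 3e10                      # typical crustal rigidity, ~30 GPa

def seismic_moment_nm(mw):
    """Seismic moment (N*m) from moment magnitude, M0 = 10^(1.5*Mw + 9.1)."""
    return 10 ** (1.5 * mw + 9.1)

event_magnitudes = [0.8, 1.1, 0.5, 1.6, 2.0]     # hypothetical induced events
injected_volume_m3 = 5_000.0                     # hypothetical cumulative injection

cumulative_moment = sum(seismic_moment_nm(m) for m in event_magnitudes)
volume_bound = SHEAR_MODULUS_PA * injected_volume_m3

print(f"cumulative moment: {cumulative_moment:.2e} N*m")
print(f"volume-based bound: {volume_bound:.2e} N*m")
print("within pressure-controlled bound" if cumulative_moment <= volume_bound
      else "exceeds bound -- possibly tectonically controlled")
```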
March 3, 2020 | https://www.sciencedaily.com/releases/2020/03/200303140138.htm | Researchers develop new explanation for destructive earthquake vibrations | Earthquakes produce seismic waves with a range of frequencies, from the long, rolling motions that make skyscrapers sway, to the jerky, high-frequency vibrations that cause tremendous damage to houses and other smaller structures. A pair of Brown University geophysicists has a new explanation for how those high-frequency vibrations may be produced. | In a new paper, the pair propose that collisions between rocks within the fault zone could be a source of those vibrations. "The way we normally think of earthquakes is that stress builds up on a fault until it eventually fails, the two sides slip against each other, and that slip alone is what causes all the ground motions we observe," said Tsai, an associate professor in Brown's Department of Earth, Environmental and Planetary Sciences. "The idea of this paper is to evaluate whether there's something other than just slip. The basic question is: If you have objects colliding inside the fault zone as it slips, what physics could result from that?" Drawing from mathematical models that describe the collisions of rocks during landslides and other debris flows, Tsai and Hirth developed a model that predicts the potential effects of rock collisions in fault zones. The model suggested the collisions could indeed be the principal driver of high-frequency vibrations. And combining the collision model with more traditional frictional slip models offers reasonable explanations for earthquake observations that don't quite fit the traditional model alone, the researchers say. For example, the combined model helps explain repeating earthquakes -- quakes that happen at the same place in a fault and have nearly identical seismic wave forms. The odd thing about these quakes is that they often have very different magnitudes, yet still produce ground motions that are nearly identical. That's difficult to explain by slip alone, but makes more sense with the collision model added, the researchers say. "If you have two earthquakes in the same fault zone, it's the same rocks that are banging together -- or at least rocks of basically the same size," Tsai said. "So if collisions are producing these high-frequency vibrations, it's not surprising that you'd get the same ground motions at those frequencies regardless of the amount of slip that occurs." The collision model also may help explain why quakes at more mature fault zones -- ones that have had lots of quakes over a long period of time -- tend to produce less damage compared to quakes of the same magnitude at more immature faults. Over time, repeated quakes tend to grind down the rocks in a fault, making the faults smoother. The collision model predicts that smoother faults with less jagged rocks colliding would produce weaker high-frequency vibrations. Tsai says that more work needs to be done to fully validate the model, but this initial work suggests the idea is promising. If the model does indeed prove valid, it could be helpful in classifying which faults are likely to produce more or less damaging quakes. "People have made some observations that particular types of faults seem to generate more or less high-frequency motion than others, but it has not been clear why faults fall into one category or the other," he said. "What we're providing is a potential framework for understanding that, and we could potentially generalize this to all faults around the world. 
Smoother faults with rounded internal structures may generally produce less high-frequency motions, while rougher faults would tend to produce more."The research also suggests that some long-held ideas about how earthquakes work might need revising."In some sense it might mean that we know less about certain aspects of earthquakes than we thought," Tsai said. "If fault slip isn't the whole story, then we need a better understanding of fault zone structure." | Earthquakes | 2,020 |
March 2, 2020 | https://www.sciencedaily.com/releases/2020/03/200302113353.htm | Sinking sea mountains make and muffle earthquakes | Subduction zones -- places where one tectonic plate dives beneath another -- are where the world's largest and most damaging earthquakes occur. A new study has found that when underwater mountains -- also known as seamounts -- are pulled into subduction zones, not only do they set the stage for these powerful quakes, but also create conditions that end up dampening them. | The findings mean that scientists should more carefully monitor particular areas around a subducting seamount, researchers said. The practice could help scientists better understand and predict where future earthquakes are most likely to occur. "The Earth ahead of the subducting seamount becomes brittle, favoring powerful earthquakes while the material behind it remains soft and weak, allowing stress to be released more gently," said co-author Demian Saffer, director of the University of Texas Institute for Geophysics (UTIG), a research unit of The University of Texas at Austin Jackson School of Geosciences. The study was published on March 2. The researchers used a computer model to simulate what happens when seamounts enter ocean trenches created by subduction zones. According to the model, when a seamount sinks into a trench, the ground ahead of it becomes brittle, as its slow advance squeezes out water and compacts the Earth. But in its wake, the seamount leaves a trail of softer wet sediment. The hard, brittle rock can be a source for powerful earthquakes, as forces generated by the subducting plate build up in it -- but the weakened, wet material behind the seamount creates an opposite, dampening effect on these quakes and tremors. Although seamounts are found all over the ocean floor, the extraordinary depths at which subduction occurs mean that studying or imaging a subducting seamount is extremely difficult. This is why, until now, scientists were not sure whether seamounts could affect the style and magnitude of subduction zone earthquakes. The current research tackled the problem by creating a realistic computer simulation of a subducting seamount and measuring the effects on the surrounding rock and sediment, including the complex interactions between stresses in the Earth and fluid pressure in the surrounding material. Getting realistic data for the model involved conducting experiments on rock samples collected from subduction zones by scientific ocean drilling offshore Japan. The scientists said the model's results took them completely by surprise. They had expected water pressure and stress to break up material at the head of the seamount and thus weaken the rocks, not strengthen them. "The seamount creates a feedback loop in the way fluids get squeezed out and the mechanical response of the rock to changes in fluid pressure," said Ellis, who co-developed the numerical code at the heart of the study. The scientists are satisfied their model is robust because the earthquake behavior it predicts consistently matches the behavior of real earthquakes. While the weakened rock left in the wake of seamounts may dampen large earthquakes, the researchers believe that it could be an important factor in a type of earthquake known as a slow slip event. 
These slow-motion quakes are unique because they can take days, weeks and even months to unfold.Laura Wallace, a research scientist at UTIG and GNS Science, who was the first to document New Zealand slow slip events, said that the research was a demonstration of how geological structures in the Earth's crust, such as seamounts, could influence a whole spectrum of seismic activity."The predictions from the model agree very nicely with what we are seeing in New Zealand in terms of where small earthquakes and tremors are happening relative to the seamount," said Wallace, who was not part of the current study.Sun believes that their investigations have helped address a knowledge gap about seamounts, but that research will benefit from more measurements."We still need high resolution geophysical imaging and offshore earthquake monitoring to better understand patterns of seismic activity," said Sun.The research was funded by the Seismogenesis at Hikurangi Integrated Research Experiment (SHIRE), an international project co-led by UT Austin to investigate the origin of earthquakes in subduction zones.The study was also supported by the National Science Foundation, the New Zealand Ministry of Business, Innovation and Employment, and GNS Science. | Earthquakes | 2,020 |
February 21, 2020 | https://www.sciencedaily.com/releases/2020/02/200221102118.htm | How earthquakes deform gravity | Lightning -- one, two, three -- and thunder. For centuries, people have estimated the distance of a thunderstorm from the time between lightning and thunder. The greater the time gap between the two signals, the further away the observer is from the location of the lightning. This is because lightning propagates at the speed of light with almost no time delay, while thunder propagates at the much slower speed of sound of around 340 metres per second. | Earthquakes also send out signals that propagate at the speed of light (300,000 kilometers per second) and can be recorded long before the relatively slow seismic waves (about 8 kilometers per second). However, the signals that travel at the speed of light are not lightning bolts, but sudden changes in gravity caused by a shift in the earth's internal mass. Only recently, these so-called PEGS signals (PEGS = Prompt elasto-gravity signals) were detected by seismic measurements. With the help of these signals, it might be possible to detect an earthquake very early, before the arrival of the destructive earthquake or tsunami waves. However, the gravitational effect of this phenomenon is very small. It amounts to less than one billionth of the earth's gravity. Therefore, PEGS signals could only be recorded for the strongest earthquakes. In addition, the process of their generation is complex: they are not only generated directly at the source of the earthquake, but also continuously as the earthquake waves propagate through the earth's interior. Until now, there has been no direct and exact method to reliably simulate the generation of PEGS signals in the computer. The algorithm now proposed by the GFZ researchers led by Rongjiang Wang can, for the first time, calculate PEGS signals with high accuracy and without much effort. The researchers were also able to show that the signals allow conclusions to be drawn about the strength, duration and mechanism of very large earthquakes. The study has now been published. An earthquake shifts the rock slabs in the earth's interior abruptly, and thus changes the mass distribution in the earth. In strong earthquakes, this displacement can amount to several meters. "Since the gravity that can be measured locally depends on the mass distribution in the vicinity of the measuring point, every earthquake generates a small but immediate change in gravity," says Rongjiang Wang, scientific coordinator of the new study. However, every earthquake also generates waves in the earth itself, which in turn change the density of the rocks and thus the gravitation a little bit for a short time -- the earth's gravity oscillates to some extent in sync with the earthquake. Furthermore, this oscillating gravity produces a short-term force effect on the rock, which in turn triggers secondary seismic waves. Some of these gravitationally triggered secondary seismic waves can be observed even before the arrival of the primary seismic waves. "We faced the problem of integrating these multiple interactions to make more accurate estimates and predictions about the strength of the signals," says Torsten Dahm, head of the section Physics of Earthquakes and Volcanoes at GFZ. 
"Rongjiang Wang had the ingenious idea of adapting an algorithm we had developed earlier to the PEGS problem -- and succeeded.""We first applied our new algorithm to the Tohoku quake off Japan in 2011, which was also the cause of the Fukushima tsunami," says Sebastian Heimann, program developer and data analyst at GFZ. "There, measurements on the strength of the PEGS signal were already available. The consistency was perfect. This gave us certainty for the prediction of other earthquakes and the potential of the signals for new applications."In the future, by evaluating the changes in gravity many hundreds of kilometres away from the epicentre of an earthquake off the coast, this method could be used to determine, even during the earthquake itself, whether a strong earthquake is involved that could trigger a tsunami, according to the researchers. "However, there is still a long way to go," says Rongjiang Wang. "Today's measuring instruments are not yet sensitive enough, and the environmentally induced interference signals are too great for the PEGS signals to be directly integrated into a functioning tsunami early warning system." | Earthquakes | 2,020 |
February 20, 2020 | https://www.sciencedaily.com/releases/2020/02/200220101121.htm | Earthquakes disrupt sperm whales' ability to find food | Otago scientists studying sperm whales off the coast of Kaikōura discovered earthquakes affect their ability to find food for at least a year. | The University of Otago-led research is the first to examine the impact of a large earthquake on a population of marine mammals, and offers new insight into how top predators such as sperm whales react and adapt to a large-scale natural disturbance. The study, 'Changes in habitat use by a deep-diving predator in response to a coastal earthquake', has recently been published. Earthquakes and aftershocks can affect sperm whales in several ways, the study explains. The whales depend on sound for communication, detection of prey and navigation, and are also highly sensitive to noise. Earthquakes produce among the loudest underwater sounds, which can induce injuries, hearing damage, displacement and behavioural modifications. While earthquakes and other extreme natural events are rare occurrences, they can really shift the state of ecosystems by wiping out animals and plants, lead author and Marine Sciences Teaching Fellow Dr Marta Guerra says. "Understanding how wild populations respond to earthquakes helps us figure out their level of resilience, and whether we need to adjust management of these populations while they are more vulnerable." The fatal 7.8 magnitude Kaikōura earthquake on November 14, 2016, produced strong ground shaking which triggered widespread underwater mudslides in the underwater canyon off the coastline. This caused what's known as 'canyon flushing', which in the case of the Kaikōura earthquake involved high-energy currents flushing 850 tonnes of sediment from the underwater canyon into the ocean. The Kaikōura canyon is an important year-round foraging ground for sperm whales, which have an important ecological role as top predators and are a key attraction for the local tourism industry -- the main driver of the town's economy. Just why the canyon is important to sperm whales is "a piece of the puzzle we are still trying to nut out," says Dr Guerra. "But it's likely related to the immense productivity of the canyon's seabed, and a combination of how the currents interact with the steep topography of the submarine canyon." Scientists examined data collected on the behaviour of 54 sperm whales between January 2014 and January 2018 -- a timeframe which allowed an opportunity to determine any significant changes in pre- and post-earthquake whale foraging behaviour. "We really didn't know what to expect, as there is so little known about how marine animals react to earthquakes," Dr Guerra says. The researchers found clear changes in the whales' behaviour in the year following the earthquake: most noticeably, whales spent about 25 per cent more time at the surface -- which potentially meant they needed to spend more effort searching for prey, either by diving deeper or for longer times. There are two main reasons the whales may have expanded their search effort, the study explains. Firstly, benthic invertebrate communities which lived in the upper canyon may have been removed by the canyon flushing event, resulting in sparser prey and reduced foraging abilities. Secondly, sediment deposition and erosion may have required sperm whales to 're-familiarise' with a modified habitat, increasing the effort to navigate and locate prey whose location may have changed. "The flushing of almost 40,000 tonnes of biomass from the canyon's 
seabed probably meant that the animals that normally fed on the seabed had a short supply of food, possibly moving away," Dr Guerra says. "This would have indirectly affected the prey of sperm whales (deep-water fish and squid), becoming scarce and making it harder for the whales to find food." Scientists were particularly surprised by how clear the changes were, especially in terms of where the sperm whales were feeding. "The head of the Kaikōura canyon, where we used to frequently find sperm whales foraging, was quiet as a desert," Dr Guerra says. Although earthquakes happen relatively frequently in areas where marine mammals live, this study was the first to document the impact on a population, thanks to a long-term monitoring programme which has been in place since 1990. Globally, there have been isolated observations, such as a fin whale displaying an 'escape response' after an earthquake in the Gulf of California, or particularly low sightings of humpback whales coinciding with the months following an earthquake off Alaska, Dr Guerra says. "Deep-sea systems are so out of sight that we rarely consider the consequences of them being disturbed, whether by natural or human impacts. I think our results emphasise how far-reaching the impacts to the sea bed can be, affecting even animals at the top of the food chain such as sperm whales." The study found the whales' behavioural changes lasted about a year after the 2016 earthquake and returned to normal levels in the summer of 2017-18. Dr Guerra believes this study also highlights the importance of long-term monitoring of marine wildlife and ecosystems, without which scientists wouldn't be able to detect changes that occur after marine mammals are exposed to disturbance. | Earthquakes | 2020
February 10, 2020 | https://www.sciencedaily.com/releases/2020/02/200210095308.htm | Geothermal energy: Drilling a 3,000-meter deep well | Although stopping climate change is challenging, it is imperative to slow it down as soon as possible by reducing greenhouse gas emissions. But how can we meet the growing energy demand while reducing our use of polluting fossil fuels? Geothermal energy is an efficient, non-polluting solution but in certain cases geothermal operations must be handled with care. Reaching the most powerful sources of available energy means drilling deep into the layers of the earth's crust to find geothermal fluids with high energy content (hot water and gas released by magma). Yet, the deeper we drill the greater are the subsurface unknowns controlling the stability of the Earth's crust. Destabilising the precarious equilibrium at depth with geothermal wells may reactivate the geological layers, causing earthquakes. | Researchers at the University of Geneva (UNIGE), Switzerland, working in collaboration with the University of Florence and the National Research Council (CNR) in Italy, have studied the seismic activity linked to a geothermal drilling in search of supercritical fluids. They discovered that the drilling did not cause uncontrolled seismic activity. This drilling under such critical conditions suggests that the technology is on the verge of mastering geothermal energy, paving the way for new sources of non-polluting heat and electricity. The scientific community agrees that CO2 emissions must be reduced as quickly as possible. The Larderello geothermal field in Tuscany -- the world's oldest -- currently produces 10% of the world's total geothermal electricity supply. We know that at about 3,000 metres depth, we reach a geological layer marked by a seismic reflector, where it is thought that supercritical fluids may be found. Supercritical fluids yield an enormous amount of renewable energy. The term supercritical implies an undefined phase state -- neither liquid nor gaseous -- with a very powerful energy content. "Engineers have been trying since the 1970s to drill down to this famous level at 3,000 metres in Larderello but they still haven't succeeded," explains Riccardo Minetto, a researcher in UNIGE's Department of Earth Sciences. "What's more, we still don't know exactly what this bed is made up of: is it a transition between molten and solid rocks? Or does it consist of cooled granites releasing fluids trapped at this level?" The technology is becoming ever more sophisticated. Because of this, geothermal drilling in search of supercritical conditions has been attempted once more at Larderello-Tavale. The aim? Deepening a wellbore a few centimetres wide to a depth of 3,000 metres to tap these supercritical fluids. "This drilling, which formed part of the European DESCRAMBLE project, was unique because it targeted the suggested transition between rocks in a solid and molten state," continues professor Lupi. The Geneva team set up eight seismic stations around the well within a radius of eight kilometres to measure the impact of the drilling on seismic activity. As the drilling progressed, the geophysicists collected the data and analysed each difficulty that was encountered. "The good news is that for the very first time, drilling in search of supercritical fluids caused only minimal seismic disturbance, which was a feat in such conditions and a strong sign of the technological progress that has been made," explains professor Lupi.
His team used the eight seismic stations to distinguish between the natural seismic activity and the very weak events caused by the drilling. The threshold of 3,000 metres, however, was not reached. "The engineers had to stop about 250 metres from this level as a result of the extremely high temperature increase -- over 500 degrees. There's still room for technical progress on this point," says Minetto. This study indicates that the supercritical drilling went well and that the technology is close to being mastered. "Until now, anyone who had tried to sink a well in supercritical conditions did not succeed because of the high temperatures but the results here are extremely encouraging," says professor Lupi. Switzerland is itself very active in promoting geothermal energy. This renewable source of energy, if developed further, would share some of the burden carried by the country's hydropower, solar and wind power. "Geothermal energy could be one of the main sources of energy of our future, so it's only right to promote future investments to develop it further and safely," concludes the Geneva-based researcher. | Earthquakes | 2,020
February 4, 2020 | https://www.sciencedaily.com/releases/2020/02/200204121502.htm | Peeking at the plumbing of one of the Aleutian's most-active volcanoes | A new approach to analyzing seismic data reveals deep vertical zones of low seismic velocity in the plumbing system underlying Alaska's Cleveland volcano, one of the most-active of the more than 70 Aleutian volcanoes. The findings are reported in a newly published study. | Arc volcanoes like Cleveland form over plate boundaries where one tectonic plate slides beneath another. They are linked to the Earth's mantle by complex subsurface structures that cross the full thickness of the planet's crust. These structures are more complex than the large chambers of molten rock that resemble a textbook illustration of a volcano. Rather, they comprise an interlaced array of solid rock and a "mushy" mix of partially molten rock and solid crystals. Resolving this subterranean architecture is crucial for emergency planning and saving human lives. But these regions have been difficult to image. Since it's impossible for humans to directly observe the depths of our planet's interior, scientists need instruments to help them visualize what's happening down there. Traditionally, a variety of geophysical and geochemical approaches are deployed to determine the structures that exist beneath a volcano. For example, the seismic waves caused by earthquakes can be used like an ultrasound to map the Earth's interior. But for this to work, the waves must reach the subterranean structures that the scientists want to study. Although Cleveland has frequent gas emissions, explosions, and ash deposits at its surface, there is very little evidence of seismic activity deep beneath the volcano. This makes imaging the architecture of the lower and middle crust below Cleveland very challenging. Until now, the number of instruments needed to use seismic waves traveling from more-distant earthquakes for imaging was prohibitive. In this work, Janiszewski demonstrated a novel technique that uses seismic waves coming from distant earthquakes but isolates just the part of them that is affected by moving through the boundary between the Earth's mantle and crust. This allowed Janiszewski to build models that better distinguish the partially molten regions from the surrounding solid rock in these difficult-to-reach depths beneath Cleveland volcano without requiring a much-more-extensive number of seismic stations at the surface. "We revealed the volcano's deep subterranean structure in never-before-seen detail, using fewer instruments by an order of magnitude than is typical for detailed seismic imaging at volcanoes," Janiszewski said. Unlike typical seismic imaging experiments that deploy dozens of seismometers, this study used only eight. Six of these stations were deployed as part of the NSF-funded Islands of the Four Mountains experiment between August 2015 and July 2016. Two were permanent Alaska Volcano Observatory stations. "The technique will allow imaging of structures underneath volcanoes where there are only a few stations, or where a lack of deep earthquakes in the vicinity makes other methods difficult," Janiszewski added. This work was supported by the NSF GeoPRISMS program [grant EAR-1456939 to DCR] and the Alaska Volcano Observatory. | Earthquakes | 2,020
January 30, 2020 | https://www.sciencedaily.com/releases/2020/01/200130173603.htm | Pre-eruption seismograms recovered for 1980 Mount St. Helens event | Nearly 40 years ago, analog data tapes faithfully recorded intense seismic activity in the two months before the historic eruption of Mount St. Helens in Washington State in May 1980. It took some lengthy and careful restoration efforts -- including a turn in a kitchen oven for some of the tapes -- to recover their data. | The data provide a near-continuous sequence of seismic activity leading up to the 18 May eruption, but they do not appear to contain any significant change in the seismic signals that would have hinted to researchers "that something big was coming," said Stephen Malone, an emeritus professor at the University of Washington and the former director of the Pacific Northwest Seismic Network. The Mount St. Helens tapes "are a unique data set for the time, in that there aren't very many cases where you have a volcanic earthquake sequence, certainly not one that was as wild and crazy and active as St. Helens, for which you have data other than on paper records," Malone said. Digitally transformed, the tape data have been archived at the Incorporated Research Institutions for Seismology (IRIS) Data Management Center. Malone said few researchers have accessed them so far. The quality of the data is less impressive than today's digital recordings, he noted, but combined with modern software and techniques, they could give researchers new insights into volcanic systems, and potentially into the May 1980 eruption. Even though the initial analysis of the tapes did not reveal anything that would have suggested an imminent major eruption, Malone said, "if someone did a systematic look at these data using much more modern analysis tools, they might see a gradual change or something that did progress in a subtle way." The tapes come from seismic stations installed by the Pacific Northwest Seismic Network after a March 1980 magnitude 4.2 earthquake near the volcano. Every five days, the tapes at the stations had to be serviced by hand. Some of the stations used radio telemetry to transfer their data to a digital recording system at the PNSN labs, but the system was designed so that only the largest earthquakes triggered recordings for later analysis. Malone recalls the frantic activity at the lab at the time, where he and his colleagues were attempting to keep up with the incoming seismic data "but also interpreting it in real time for the benefit of the volcano hazards people," he said. "We kept on top of things the best we could, but retaining data for longevity was a lower priority then." The five-day tapes, along with larger tapes that held some of the telemetered data and were changed irregularly at the lab, were forgotten in storage until Malone's retirement in 2007. He worked with them off and on for years to see what data might still be recovered. Techniques from the audio recording industry helped Malone figure out how to proceed. For instance, he learned that baking the tapes could help stabilize them so that they could be spun and read. "If you bake the tapes, actually cook them at low temperatures for a day, it sets the binder oxide on the tape such that it won't be scraped off as easily."
Malone and his wife experimented with this technique using a home oven, "and we were able to recover data with pretty good fidelity," he said. For the larger tapes, Malone turned to a Canadian audio recording professional who had some experience with analog seismic data tapes used by oil and gas companies, and who had the equipment to spin the tape reels. The recovery was paid for with the help of a U.S. Geological Survey grant that provides funds to recover old seismic data. Malone said the recovery "was a little like gambling. I didn't know how good the data would be until we had processed the whole tape. But I've been modestly pleased by the results." All of the analog tapes were discarded after Malone determined that no more data could be recovered from them. | Earthquakes | 2,020
January 28, 2020 | https://www.sciencedaily.com/releases/2020/01/200128114613.htm | Upper-plate earthquakes caused uplift along New Zealand's Northern Hikurangi Margin | Earthquakes along a complex series of faults in the upper plate of New Zealand's northern Hikurangi Subduction Margin were responsible for coastal uplift in the region, according to a new evaluation of local marine terraces. | The findings are reported in a newly published study. Using radiocarbon and other methods to date the marine terraces at two North Island sites, Puatai Beach and Pakarae River mouth, Nicola Litchfield of GNS Science and her colleagues conclude that the uplift events that created the terraces occurred at different times between the two sites. This suggests that the uplift was not the result of subduction earthquakes or single-fault upper plate earthquakes. The pattern of uplift seen in the marine terraces led the researchers to map new offshore faults in the region, which they think may be one source of these upper-plate earthquakes, said Litchfield. The Hikurangi Subduction Margin lies along the eastern edge of North Island, where the Pacific and Australian tectonic plates collide and the Pacific plate slips under the island. Recent New Zealand earthquakes involving multiple fault ruptures and coastal deformation, such as the magnitude 7.8 Kaikoura earthquake in 2016, have prompted seismologists to evaluate the mechanisms behind these complicated sequences, Litchfield said -- especially along the remote areas of the northern Margin where there have been fewer studies overall. Marine terraces are created when shorelines are raised above sea level by uplift at the coast, and they record the time and amount of uplift. Based on the geological evidence from other sites in New Zealand, "we are confident that each terrace represents an individual earthquake," Litchfield said. Previous radiocarbon dating studies suggested that the youngest marine terraces at Puatai Beach and Pakarae River mouth were created at the same times. But Litchfield and colleagues decided to revisit these dates with a more comprehensive examination of the terraces. At each site, the researchers did extensive trenching "to see what the stratigraphy was, and to carefully sample, and then use more than radiocarbon dating techniques to get high resolution ages," said Litchfield. The researchers were able to use a layer of volcanic ash, along with radiocarbon dating of beach shells, to determine ages for each terrace at each site. At Puatai Beach, the terraces correspond to three earthquakes that occurred between 1710 and 1770 years ago, 910 and 1100 years ago, and 250 and 420 years ago. At Pakarae River mouth, the terraces correspond to earthquakes that took place between 530 and 660 years ago and between 1290 and 1490 years ago. The different terrace ages at each site combined with modeling of uplift from earthquakes on newly mapped offshore faults allowed the researchers to rule out a subduction earthquake or single-fault upper-plate earthquake as the cause of uplift. Researchers will need to learn more about the extent and orientation of the newly mapped offshore faults, and model how they might rupture together, to fully evaluate how they impact overall seismic hazard, Litchfield said. "Simply having more faults offshore, some of them quite close, means that there is more earthquake and tsunami hazard," she noted. "We don't know yet how that might be balanced by the fact that there is less subduction earthquake hazard in the model, though." | Earthquakes | 2,020
January 27, 2020 | https://www.sciencedaily.com/releases/2020/01/200127075257.htm | Seismic biomarkers in Japan Trench fault zone reveal history of large earthquakes | In the aftermath of the devastating Tohoku-Oki earthquake that struck off the coast of Japan in March 2011, seismologists were stunned by the unprecedented 50 meters of shallow displacement along the fault, which ruptured all the way to the surface of the seafloor. This extreme slip at shallow depths exacerbated the massive tsunami that, together with the magnitude 9.1 earthquake, caused extensive damage and loss of life in Japan. | In a new study published January 27, researchers report evidence that large earthquakes have repeatedly ruptured this shallow fault zone in the past. "We found evidence of many large earthquakes that have ruptured to the seafloor and could have generated tsunamis like the one that struck in 2011," said coauthor Pratigya Polissar, associate professor of ocean sciences at UC Santa Cruz. Japanese researchers looking at onshore sediment deposits have found evidence of at least three similar tsunamis having occurred in this region at roughly 1,000-year intervals. The new study suggests there have been even more large earthquakes on this fault zone than those that left behind onshore evidence of big tsunamis, said coauthor Heather Savage, associate professor of Earth and planetary sciences at UC Santa Cruz. Savage and Polissar have developed a technique for assessing the history of earthquake slip on a fault by analyzing organic molecules trapped in sedimentary rocks. Originally synthesized by marine algae and other organisms, these "biomarkers" are altered or destroyed by heat, including the frictional heating that occurs when a fault slips during an earthquake. Through extensive laboratory testing over the past decade, Savage and Polissar have developed methods for quantifying the thermal evolution of these biomarkers and using them to reconstruct the temperature history of a fault. The Japan Trench Fast Drilling Project (JFAST) drilled into the fault zone in 2012, extracting cores and installing a temperature observatory. UCSC seismologist Emily Brodsky helped organize JFAST, which yielded the first direct measurement of the frictional heat produced by the fault slip during an earthquake. This heat dissipates after the earthquake, however, so the signal is small and transient. "The biomarkers give us a way to detect permanent changes in the rock that preserve a record of heating on the fault," Savage said. For the new study, the researchers examined the JFAST cores, which extended through the fault zone into the subducting plate below. "It's a complex fault zone, and there were a lot of faults throughout the core. We were able to say which faults had evidence of large earthquakes in the past," Savage said. One of their goals was to understand whether some rock types in the fault zone were more prone to large slip in an earthquake than other rocks. The cores passed through layers of mudstones and clays with different frictional strengths. But the biomarker analysis showed evidence of large seismic slip on faults in all the different rock types. The researchers concluded that differences in frictional properties do not necessarily determine the likelihood of large shallow slip or seismic hazard. Savage and Polissar began working on the biomarker technique as postdoctoral researchers at UC Santa Cruz, publishing their first paper on it with Brodsky in 2011.
They continued developing it as researchers at the Lamont-Doherty Earth Observatory of Columbia University, before returning to UC Santa Cruz as faculty members in 2019. Hannah Rabinowitz, the first author of the new paper, worked with them as a graduate student at Columbia and is now at the U.S. Department of Energy. "We've tested this technique in different rocks with different ages and heating histories, and we can now say yes, there was an earthquake on this fault, and we can tell if there was a large one or many small ones," Savage said. "We can now take this technique to other faults to learn more about their histories." In addition to Rabinowitz, Savage, and Polissar, the coauthors of the paper include Christie Rowe and James Kirkpatrick at McGill University. This work was funded by the National Science Foundation. The JFAST project was sponsored by the International Ocean Drilling Program (IODP). | Earthquakes | 2,020
January 23, 2020 | https://www.sciencedaily.com/releases/2020/01/200123115911.htm | Evidence to explain behavior of slow earthquakes | A team of researchers at the University of Ottawa has made an important breakthrough that will help better understand the origin and behavior of slow earthquakes, a new type of earthquake discovered by scientists nearly 20 years ago. | These earthquakes produce movement so slow -- a single event can last for days, even months -- that they are virtually imperceptible. Less fearsome and devastating than regular earthquakes, they do not trigger seismic waves or tsunamis. They occur in regions where a tectonic plate slides underneath another one, called "subduction zone faults", adjacent to but deeper than where regular earthquakes occur. They also behave very differently than their regular counterparts. But how? And more importantly: why? Pascal Audet, Associate Professor in the Department of Earth and Environmental Sciences at uOttawa, along with his seismology research group (Jeremy Gosselin, Clément Estève, Morgan McLellan, Stephen G. Mosher and former uOttawa postdoctoral student Andrew J. Schaeffer), was able to find answers to these questions. "Our work presents unprecedented evidence that these slow earthquakes are related to dynamic fluid processes at the boundary between tectonic plates," said first author and uOttawa PhD student, Jeremy Gosselin. "These slow earthquakes are quite complex, and many theoretical models of slow earthquakes require the pressure of these fluids to fluctuate during an earthquake cycle." Using a technique similar to ultrasound imagery and recordings of earthquakes, Audet and his team were able to map the structure of the Earth where these slow earthquakes occur. By analyzing the properties of the rocks where these earthquakes happened, they were able to reach their conclusions. In fact, in 2009, Professor Audet had himself presented evidence that slow earthquakes occurred in regions with unusually high fluid pressures within the Earth. "The rocks at those depths are saturated with fluids, although the quantities are minuscule," explained Professor Pascal Audet. "At a depth of 40 km, the pressure exerted on the rocks is very high, which normally tends to drive the fluids out, like a sponge that someone squeezes. However, these fluids are imprisoned in the rocks and are virtually incompressible; the fluid pressure therefore rises to very high values, which essentially weakens the rocks and generates slow earthquakes." Several studies over the past years had suggested these events are related to dynamic changes in fluid pressure, but until now, no conclusive empirical evidence had been established. "We were keen to repeat Professor Audet's previous work to look for time-varying changes in fluid pressures during slow earthquakes," explained Jeremy Gosselin. "What we discovered confirmed our suspicions and we were able to establish the first direct evidence that fluid pressures do, in fact, fluctuate during slow earthquakes." | Earthquakes | 2,020
January 22, 2020 | https://www.sciencedaily.com/releases/2020/01/200122134917.htm | Complex rupturing during 2019 Ridgecrest, California, sequence | The 2019 Ridgecrest earthquake sequence, which startled nearby California residents over the 4 July holiday with magnitude 6.4 and magnitude 7.1 earthquakes, included 34,091 earthquakes overall, detailed in a high-resolution catalog created for the sequence. | The catalog, developed by David Shelly at the U.S. Geological Survey in Golden, Colorado, was published as a Data Mine column article. "Because of the complexity in this sequence, I think there are still a lot of unanswered questions about what the important aspects of the triggering and evolution of this sequence were, so having this catalog can help people make more progress on answering those questions," said Shelly. Shelly used a technique called template matching, which scanned through seismic signals to find those matching the "fingerprint" of 13,525 known and cataloged earthquakes, as well as precise relative relocation techniques to detect 34,091 earthquakes associated with the event. Most of the earthquakes were magnitude 2.0 or smaller. The catalog covers the time period spanning the foreshock sequence leading up to the 4 July 2019 magnitude 6.4 earthquake through the first 10 days of aftershocks following the magnitude 7.1 earthquake on 5 July. By precisely locating the earthquakes, Shelly was able to discern several crosscutting fault structures in the region, with mostly perpendicular southwest and northwest strikes. The foreshocks of the magnitude 6.4 event aligned on a northwest-striking fault that appears to have ruptured further in the aftershocks of that earthquake, along with a southwest-striking fault where a surface rupture was observed by teams who went out to the site. Shelly said the magnitude 7.1 earthquake appears to have started at the northwestern edge of the magnitude 6.4 rupture, propagating to the northwest and southeast and possibly extending that earlier rupture as well. The magnitude 7.1 event was highly complex, with several southwest-striking alignments and multi-fault branching and high rates of aftershocks, especially at the northwestern end of the rupture. The Ridgecrest earthquakes took place along "a series of immature faults, in the process of developing," Shelly said, noting that this could explain in part why the earthquake sequence was so complex. Compared to the mature San Andreas Fault Zone to the west, which accommodates about half of the relative plate motion as the Pacific and North American tectonic plates collide, the Ridgecrest faults are broadly part of the Eastern California Shear Zone, where multiple faults accommodate up to 25 percent of this tectonic strain. Shelly noted that the catalog benefitted from the long-established, densely instrumented, real-time seismic network that covers the region. "When there's a big earthquake in an area that's not well-covered, people rush out to try to at least cover the aftershocks with great fidelity," he explained. "Here, having this permanent network makes it so you can evaluate the entire earthquake sequence, starting with the foreshock data, to learn more about the earthquake physics and processes." | Earthquakes | 2,020
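The template-matching approach described in the article above can be illustrated with a small, self-contained sketch. This is not the USGS code used to build the Ridgecrest catalog; the synthetic waveform, sampling assumptions and the 0.7 detection threshold are illustrative choices only. The idea is to slide a known event "fingerprint" along a continuous record and flag windows whose normalized cross-correlation exceeds the threshold.

```python
# Minimal sketch of matched-filter (template-matching) earthquake detection.
# Synthetic data only; threshold and waveform shapes are assumptions.
import numpy as np

def normalized_cross_correlation(data, template):
    """Slide `template` along `data` and return the correlation coefficient
    (-1..1) at each lag."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        window = data[i:i + n]
        std = window.std()
        cc[i] = 0.0 if std == 0 else np.sum(t * (window - window.mean())) / std
    return cc

def detect(data, template, threshold=0.7):
    """Return sample indices where the template 'fingerprint' matches."""
    cc = normalized_cross_correlation(data, template)
    return np.where(cc >= threshold)[0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    template = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100))  # toy 5 Hz wavelet
    data = rng.normal(0, 0.2, 5000)
    data[1200:1300] += template      # bury one "repeat" of the event in noise
    print(detect(data, template))    # indices near sample 1200 exceed the threshold
```

In practice, catalogs like this one run thousands of real templates against continuous network data and then refine the detections with relative relocation, but the core matching step follows the same logic as the sketch.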
January 22, 2020 | https://www.sciencedaily.com/releases/2020/01/200122122119.htm | Signals from inside Earth: Borexino experiment releases new data on geoneutrinos | Scientists involved in the Borexino collaboration have presented new results for the measurement of neutrinos originating from the interior of the Earth. The elusive "ghost particles" rarely interact with matter, making their detection difficult. With this update, the researchers have now been able to register 53 events -- almost twice as many as in the previous analysis of the data from the Borexino detector, which is located 1,400 metres below the Earth's surface in the Gran Sasso massif near Rome. The results provide an exclusive insight into processes and conditions in the earth's interior that remain puzzling to this day. | The earth is shining, even if it is not at all visible to the naked eye. The reason for this is geoneutrinos, which are produced in radioactive decay processes in the interior of the Earth. Every second, about one million of these elusive particles penetrate every square centimetre of our planet's surface. The Borexino detector, located in the world's largest underground laboratory, the Laboratori Nazionali del Gran Sasso in Italy, is one of the few detectors in the world capable of observing these ghostly particles. Researchers have been using it to collect data on neutrinos since 2007, i.e. for over ten years. By 2019, they were able to register twice as many events as at the time of the last analysis in 2015 -- and reduce the uncertainty of the measurements from 27 to 18 percent, which is also due to new analysis methods. "Geoneutrinos are the only direct traces of the radioactive decays that occur inside the Earth, and which produce an as yet unknown portion of the energy driving all the dynamics of our planet," explains Livia Ludhova, one of the two current scientific coordinators of Borexino and head of the neutrino group at the Nuclear Physics Institute (IKP) at Forschungszentrum Jülich. The researchers in the Borexino collaboration have extracted, with improved statistical significance, the signal of geoneutrinos coming from the Earth's mantle, which lies below the crust, by exploiting the well-known contribution from the lithosphere -- the Earth's crust and uppermost mantle. The intense magnetic field, the unceasing volcanic activity, the movement of the tectonic plates, and mantle convection: The conditions inside the Earth are in many ways unique in the entire solar system. Scientists have been discussing the question of where the Earth's internal heat comes from for over 200 years. "The hypothesis that there is no longer any radioactivity at depth in the mantle can now be excluded at 99% confidence level for the first time. This makes it possible to establish lower limits for uranium and thorium abundances in the Earth's mantle," says Livia Ludhova. These values are of interest for many different Earth model calculations. For example, it is highly probable (85%) that radioactive decay processes inside the Earth generate more than half of the Earth's internal heat, while the other half is still largely derived from the original formation of the Earth. Radioactive processes in the Earth therefore provide a non-negligible portion of the energy that feeds volcanoes, earthquakes, and the Earth's magnetic field. The latest publication in Phys. Rev.
D not only presents the new results, but also explains the analysis in a comprehensive way from both the physics and geology perspectives, which will be helpful for next generation liquid scintillator detectors that will measure geoneutrinos. The next challenge for research with geoneutrinos is now to be able to measure geoneutrinos from the Earth's mantle with greater precision, perhaps with detectors distributed at different positions on our planet. One such detector will be the JUNO detector in China where the IKP neutrino group is involved. The detector will be 70 times bigger than Borexino which helps in achieving higher statistical significance in a short time span. | Earthquakes | 2,020 |
January 17, 2020 | https://www.sciencedaily.com/releases/2020/01/200117080829.htm | 'Melting rock' models predict mechanical origins of earthquakes | Engineers at Duke University have devised a model that can predict the early mechanical behaviors and origins of an earthquake in multiple types of rock. The model provides new insights into unobservable phenomena that take place miles beneath the Earth's surface under incredible pressures and temperatures, and could help researchers better predict earthquakes -- or even, at least theoretically, attempt to stop them. | The results appeared online on January 17. "Earthquakes originate along fault lines deep underground where extreme conditions can cause chemical reactions and phase transitions that affect the friction between rocks as they move against one another," said Hadrien Rattez, a research scientist in civil and environmental engineering at Duke. "Our model is the first that can accurately reproduce how the amount of friction decreases as the speed of the rock slippage increases and all of these mechanical phenomena are unleashed." For three decades, researchers have built machines to simulate the conditions of a fault by pushing and twisting two discs of rock against one another. These experiments can reach pressures of up to 1450 pounds per square inch and speeds of one meter per second, which is the fastest underground rocks can travel. For a geological reference point, the Pacific tectonic plate moves at about 0.00000000073 meters per second. "In terms of ground movement, these speeds of one meter per second are incredibly fast," said Manolis Veveakis, assistant professor of civil and environmental engineering at Duke. "And remember that friction is synonymous with resistance. So if the resistance drops to zero, the object will move abruptly. This is an earthquake." In these experiments, the surface of the rocks either begins to turn into a sort of gel or to melt, lowering the coefficient of friction between them and making their movement easier. It's been well established that as the speed of these rocks relative to one another increases to one meter per second, the friction between them drops like a rock, you might say, no matter the type. But until now, nobody had created a model that could accurately reproduce these behaviors. In the paper, Rattez and Veveakis describe a computational model that takes into account the energy balance of all the complicated mechanical processes taking place during fault movement. They incorporate weakening mechanisms caused by heat that are common to all types of rock, such as mineral decomposition, nanoparticle lubrication and melting as the rock undergoes a phase change. After running all of their simulations, the researchers found that their new model accurately predicts the drop in friction associated with the entire range of fault speeds from experiments on all available rock types including halite, silicate and quartz. Because the model works well for so many different types of rock, it appears to be a general model that can be applied to most situations, which can reveal new information about the origins of earthquakes. While researchers can't fully recreate the conditions of a fault, models such as this can help them extrapolate to higher pressures and temperatures to get a better understanding of what is happening as a fault builds toward an earthquake. "The model can give physical meaning to observations that we usually cannot understand," Rattez said.
"It provides a lot of information about the physical mechanisms involved, like the energy required for different phase transitions.""We still cannot predict earthquakes, but such studies are necessary steps we need to take in order to get there," said Veveakis. "And in theory, if we could interfere with a fault, we could track its composition and intervene before it becomes unstable. That's what we do with landslides. But, of course, fault lines are 20 miles underground, and we currently don't have the drilling capacity to go there."This work was supported by the Southern California Earthquake Center (118062196) under the National Science Foundation (EAR-1033462) and the United States Geological Survey (G12AC20038). | Earthquakes | 2,020 |
January 15, 2020 | https://www.sciencedaily.com/releases/2020/01/200115164015.htm | Slow-motion interplate slip detected in the Nankai Trough near Japan | Earthquakes are generally thought of as abrupt, violent events that last for only moments. However, movement of the Earth's tectonic plates is often less sudden and more sustained -- slow earthquakes can last for hours, months, or even longer. | Researchers led by The University of Tokyo have discovered signals due to slow slip events with large amounts of slip along the Nankai Trough subduction zone just southeast of Japan, as reported in a newly published study. Slow earthquakes such as these slow slip events are exceedingly difficult to detect using conventional seismological techniques, especially in offshore areas. However, because understanding slow earthquake events is essential for evaluating the risks posed by more violent seismic events like megathrust earthquakes, new methods have been developed to address this problem. For this research, a Global Navigation Satellite System-Acoustic ranging (GNSS-A) combination technique was used to monitor changes in the absolute position of the seafloor. According to study first author Yusuke Yokota, "GNSS-A was first proposed in the 1980s, but has really been developed and applied over the last two decades for detection of slow earthquakes. This method combines satellite detection of movements at the sea surface with undersea data from an acoustic ranging system, and can reliably detect deformation of 5 cm or more in offshore areas." Slip sites were generally detected below the shallow undersea interplate boundary of the trough, adjacent to regions with strong interplate coupling. Recurrent offshore slow slip event signals were detected in the Kii and Bungo Channel areas, and were correlated with strong activity of very low frequency seismic events. Slow slip event signals were not clearly identified in the Tosa Bay or Enshu-nada regions, although this may have been a matter of insufficient resolution. However, their absence may also support the possibility that these are the main slip regions of the megathrust zone of the Nankai Trough. "Differences in the features between these regions may be related to earthquake history and reflect different friction conditions," explains Dr. Yokota. "Detailed understanding of these friction conditions and how they relate spatiotemporally to megathrust earthquake events is essential for accurate earthquake simulation. Therefore, studying these newly discovered slow slip events in the Nankai Trough will contribute to earthquake disaster prevention and preparedness." In addition to informing earthquake disaster research, these findings also shed light on the marine sedimentary environment and plate tectonics of the Nankai Trough. | Earthquakes | 2,020
January 7, 2020 | https://www.sciencedaily.com/releases/2020/01/200107104942.htm | Magnitude of Great Lisbon Earthquake may have been lower than previous estimates | The magnitude of the Great Lisbon Earthquake event, a historic and devastating earthquake and tsunami that struck Portugal on All Saints' Day in 1755, may not be as high as previously estimated. | Fonseca's newly published analysis also locates the epicenter of the 1755 earthquake offshore of the southwestern Iberian Peninsula, and suggests the rupture was a complicated one that may have involved faulting onshore as well. This re-evaluation could have implications for the seismic hazard map of the region, he said. The current maps are based on the assumption that most of the region's crustal deformation is contained in large offshore earthquakes, without a significant onshore component. "While the current official map assigns the highest level of hazard to the south of Portugal, gradually diminishing toward the north, the interpretation now put forward concentrates the hazard in the Greater Lisbon area," said Fonseca. The 1755 Lisbon earthquake and tsunami event, along with the fires it caused that burned for hours in the city, is considered one of the deadliest earthquake events in history, leading to the deaths of about 12,000 people. The devastation had a significant impact on Portugal's economy and its political power within Europe, and its philosophical and theological implications were widely discussed by Enlightenment scholars from Voltaire to Immanuel Kant. The widespread devastation led earlier seismologists to estimate a high magnitude for the earthquake. With modern modeling techniques and a better understanding of the region's tectonics, Fonseca thought it important to revisit the estimate. The 1755 earthquake is unusual in that it produced extreme damage hundreds of kilometers from its epicenter without any of the accompanying geological conditions -- like amplification of seismic waves in a loose sedimentary basin, for instance -- that normally cause such severe site effects. "Explanations put forward for the extreme damage in Lisbon tend to invoke abnormally low attenuation of seismic energy as the waves move away from the epicenter, something that is not to be observed anywhere else in the globe," Fonseca explained. "Current attempts to harmonize seismic hazard assessment across Europe are faced with large discrepancies in this region, which need to be investigated and resolved for a better mitigation and management of the risk through building codes and land use planning." Fonseca used 1206 points of macroseismic data to reassess the 1755 earthquake's magnitude and epicenter. The analysis and modeling also indicate that some of the very high earthquake intensities reported in the region's nearby Lower Tagus Valley and the Algarve may have been due to two separate onshore earthquakes in these locations. These earthquakes, which took place a few minutes after the offshore rupture, may have been triggered by the first earthquake, Fonseca suggests. The new magnitude estimate for the 1755 earthquake is similar to that of another large regional earthquake, the 1969 magnitude 7.8 Gorringe Bank quake. However, the damage from the Gorringe Bank earthquake was much less severe, possibly in part because the onshore faults had not accumulated enough stress to make them "ripe to rupture," Fonseca says. "The Lower Tagus Fault, near Lisbon, ruptured in 1909, in 1531 and likely in 1344.
It is plausible that it was good to go in 1755, but still halfway through the process of accumulating stress in 1969." Fonseca also suggests that the destructive size of the tsunami accompanying the 1755 earthquake might be due more to the presence of a large sedimentary body produced by past subduction, called an accretionary wedge, on the ocean bottom in the Gulf of Cadiz. When a fault rupture moves through this wedge, it can generate a tsunami even without an extreme magnitude rupture, he said. | Earthquakes | 2,020
January 6, 2020 | https://www.sciencedaily.com/releases/2020/01/200106125127.htm | Formation of a huge underwater volcano offshore the Comoros | A new submarine volcano was formed off the island of Mayotte in the Indian Ocean in 2018. This was shown by an oceanographic campaign in May 2019. Now an international team led by the scientist Simone Cesca from the German Research Centre for Geosciences GFZ has illuminated the processes deep inside the Earth before and during the formation of the new volcano. | It is as if the researchers had deciphered a new type of signal from the Earth's interior that indicates a dramatic movement of molten rocks before the eruption. With their specially developed seismological methods, the researchers are reconstructing the partial emptying of one of the deepest and largest active magma reservoirs ever discovered in the upper mantle. The study has now been published. Since May 2018, an unusual sequence of earthquakes has been recorded off the coast of the island of Mayotte in the Comoros archipelago between Africa and Madagascar. Seismic activity began with a swarm of thousands of 'seemingly tectonic' earthquakes, culminating in an earthquake of magnitude 5.9 in May 2018. Mainly since June 2018, however, a completely new form of earthquake signal has emerged that was so strong that it could be recorded up to a thousand kilometres away. These 20- to 30-minute-long signals are characterized by particularly harmonic, low frequencies, almost monochromatic, similar to a large bell or a double bass, and are called Very Long Period (VLP) signals. Although the centre of the seismic activity was located almost 35 kilometres offshore, east of the island, a continuous lowering and eastward motion of the earth's surface at Mayotte had begun at the same time as the massive swarms of VLP events started, accumulating to almost 20 centimetres to date. Although there was no evidence of earlier volcanic activity in the epicentre of the seismic activity, GFZ scientists had suspected magmatic processes from the beginning, as quake swarms in the upper earth crust often arise as a reaction to the rise of magma, and VLPs had in earlier years been associated with the collapse of large caldera volcanoes. The special frequency content of the VLP signals is caused by the resonance oscillation of the buried magma chamber. The deeper the vibrations, the larger the magma reservoir. However, the earthquake swarms under the ocean floor were much deeper than at other volcanoes and the resonance tones of the VLPs were unusually low and strong. An international team led by GFZ scientist Simone Cesca analysed seismological and geodetic data from the region to study these observations and their evolution over time. However, the investigations were complicated by the fact that there was no seismic network on the ocean floor, and measurements were therefore only available at great distances, on Mayotte, in Madagascar and in Africa. "We tried to improve the unfavourable initial situation by developing special new analytical methods such as cluster and directional beam methods," says Cesca. The team identified different activity phases within the sequence of events from May 2018 to today. The initial swarm phase indicated a rapid upward movement of magma from a deep mantle reservoir more than 30 kilometres below the Earth's surface. Once an open channel had formed from the Earth's mantle to the seabed, the magma began to flow unhindered and form a new underwater volcano.
A French oceanographic campaign recently confirmed the formation of the submarine volcano, whose location coincides with the reconstructed magma rise. In this phase, the apparent tectonic earthquake activity decreased again, while the lowering of the ground on the island of Mayotte began. Likewise, long-lasting monofrequency VLP signals started. "We interpret this as a sign of the collapse of the deep magma chamber off the coast of Mayotte," explains Eleonora Rivalta, co-author of the scientific team. "It is the deepest (~30 km) and largest magma reservoir in the upper mantle (more than 3.4 cubic kilometres) to date, which is beginning to empty abruptly." "Since the seabed lies 3 kilometres below the water surface, almost nobody noticed the enormous eruption. However, there are still possible hazards for the island of Mayotte today, as the Earth's crust above the deep reservoir could continue to collapse, triggering stronger earthquakes," says Torsten Dahm, Head of the section Physics of Earthquakes and Volcanoes at the GFZ. | Earthquakes | 2,020
December 18, 2019 | https://www.sciencedaily.com/releases/2019/12/191218090220.htm | Submarine cables to offshore wind farms transformed into a seismic network | An international team of geoscientists led by Caltech has used fiber optic communications cables stationed at the bottom of the North Sea as a giant seismic network, tracking both earthquakes and ocean waves. | The project was, in part, a proof of concept. Oceans cover two-thirds of the earth's surface, but placing permanent seismometers under the sea is prohibitively expensive. The fact that the fiber network was able to detect and record a magnitude-8.2 earthquake near Fiji in August 2018 proves the ability of the technology to fill in some of the massive blind spots in the global seismic network, says Caltech graduate student Ethan Williams (MS '19). Williams is the lead author of a newly published study on the project. "Fiber optic communications cables are growing more and more common on the sea floor. Rather than place a whole new device, we can tap into some of this fiber and start observing seismicity immediately," Williams says. The project relies on a technology called distributed acoustic sensing, or DAS. DAS was developed for energy exploration but has been repurposed for seismology. DAS sensors shoot a beam of light down a fiber optic cable. Tiny imperfections in the cable reflect back minuscule amounts of the light, allowing the imperfections to act as "waypoints." As a seismic wave jostles the fiber cable, the waypoints shift minutely in location, changing the travel time of the reflected light waves and thus allowing scientists to track the progression of the wave. The DAS instrument used in this study was built and operated by a team from Spain's University of Alcalá, led by study co-author Miguel Gonzalez-Herraez. Recently, Caltech's Zhongwen Zhan (MS '08, PhD '13) began deploying DAS for seismology. For example, he and his colleagues tracked aftershocks from California's Ridgecrest earthquake sequence using fiber that stretches along the state's 395 freeway and also have tapped into the City of Pasadena's fiber network to create a citywide earthquake-detecting network. For the North Sea project, Williams, Zhan, and their colleagues employed a 40,000-meter section of fiber optic cable that connects a North Sea wind farm to the shore. There are millions of tiny imperfections in the cable, so they averaged out the imperfections in each 10-meter segment, creating an array of more than 4,000 virtual sensors. "With the flip of a switch, we have an array of 4,000 sensors that would've cost millions to place," Williams says. Because of the network's fine degree of sensitivity, the North Sea array was able to track tiny, non-earthquake-related seismic noise (or "microseisms") and found evidence that supports a longstanding theory that the microseisms result from ocean waves. In 1950, mathematician and oceanographer Michael Selwyn Longuet-Higgins theorized that the complex interaction of ocean waves could exert enough of a rolling pressure on the sea floor to generate so-called Scholte waves -- a type of seismic wave that occurs at the interface of a liquid and a solid.
By tracking both ocean waves and corresponding microseisms, the North Sea array revealed that the microseisms could be the result of ocean-wave interactions. | Earthquakes | 2,019 |
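As a rough illustration of the arithmetic behind the array described in the article above (40,000 meters of fiber averaged over 10-meter segments, giving roughly 4,000 virtual sensors), the sketch below also includes a commonly used approximation for converting an interrogator's optical phase change into fiber strain. The laser wavelength, refractive index and photoelastic factor are assumed textbook values, not parameters taken from the Caltech study.

```python
# Back-of-the-envelope sketch of how a telecom cable becomes a DAS array.
# Constants marked "assumed" are generic values, not from the North Sea study.
import numpy as np

FIBER_LENGTH_M = 40_000.0   # cable from wind farm to shore (from the article)
GAUGE_LENGTH_M = 10.0       # averaging segment (from the article)

WAVELENGTH_M = 1550e-9      # assumed telecom laser wavelength
REFRACTIVE_INDEX = 1.46     # assumed for silica fiber
PHOTOELASTIC_FACTOR = 0.78  # assumed strain-to-phase scaling

def channel_positions(fiber_length=FIBER_LENGTH_M, gauge=GAUGE_LENGTH_M):
    """Distance along the fiber of each virtual sensor (channel center)."""
    n_channels = int(fiber_length // gauge)          # ~4,000 channels
    return (np.arange(n_channels) + 0.5) * gauge

def phase_to_strain(delta_phase_rad, gauge=GAUGE_LENGTH_M):
    """Convert an optical phase change (radians) over one gauge length into
    longitudinal strain, using delta_phi = 4*pi*n*G*xi/lambda * strain."""
    return delta_phase_rad * WAVELENGTH_M / (
        4.0 * np.pi * REFRACTIVE_INDEX * gauge * PHOTOELASTIC_FACTOR
    )

if __name__ == "__main__":
    pos = channel_positions()
    print(f"{len(pos)} virtual sensors, spaced {GAUGE_LENGTH_M:.0f} m apart")
    # A 1-radian phase change over a 10 m gauge corresponds to ~1e-8 strain,
    # i.e. on the order of 100 nanometers of stretch across the segment.
    print(f"strain for 1 rad phase change: {phase_to_strain(1.0):.2e}")
```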
December 11, 2019 | https://www.sciencedaily.com/releases/2019/12/191211115558.htm | Thunderquakes make underground fiber optic telecommunications cables hum | Telecommunications lines designed for carrying internet and phone service can pick up the rumble of thunder underground, potentially providing scientists with a new way of detecting environmental hazards and imaging deep inside the Earth. | The new research is being presented today at AGU's Fall Meeting and has been published in an AGU journal. The new study used The Pennsylvania State University's existing fiber network for internet and phone service as a distributed sensor array to observe the progress of thunderstorms as they crossed the campus. Traditional seismometers have recorded ground motions evoked by thunder, called thunderquakes, vibrating in the infrasound frequency range, below 20 Hertz, which is inaudible to the human ear. The fiber array, which is buried 1 meter (3 feet) underground, picked up a wider range of the frequencies heard in a peal of thunder. The bandwidth detected, from 20 to 130 Hertz, is consistent with microphone recordings of thunder and provides more information about the event, the study found. Penn State geophysicist Tieyuan Zhu and meteorologist David Stensrud gained access to the university's telecommunication fiber optic cable in April 2019. They were listening for subtle vibrations from a variety of environmental effects, including sinkhole formation and flooding. "Once we set up, we found a lot of very strong events in our fiber optic data, so I was very curious, what's the cause of these signals?" said Zhu. The researchers found a match when they synchronized their results with data from the U.S. National Lightning Detection Network. "We thought, yeah, this is exactly the thunderstorm data, actually recorded by our fiber array." The passage of lightning heats the air so fast it creates a shockwave we hear as thunder. Vibrations from loud events like lightning, meteor explosions and aircraft sonic booms pass from the air to Earth's surface, shaking the ground. Fiber optic cables carry telecommunications information in bursts of laser light conducted by strands of transparent glass about as thick as a human hair. Vibrations in the Earth such as those created by thunderstorms, earthquakes or hurricanes stretch or compress the glass fibers, causing a slight change in light intensity and the time the laser pulse takes to travel to its destination. The researchers tracked these aberrations to monitor ground motion, converting the laser pulses back to acoustic signals. "The laser is very sensitive. If there is a subtle underground perturbation, the laser can detect that change," said Zhu. Several kilometers of continuous fiber underlie Penn State's campus, which means the array can act like a network of more than 2,000 seismometers emplaced every two meters along the cable path. With this high density of sensors, the researchers can calculate the location where the thunder originated, potentially distinguishing between cloud-to-ground and cloud-to-cloud lightning. "Compared to the seismometers, the fiber optic array can provide fabulous spatial, and also temporal, resolution," said Zhu. "We can track the thunderstorm source movement." The researchers said the new study demonstrates fiber optic networks under urban areas are an untapped resource for monitoring environmental hazards.
They also hold potential for studying the crust and deep structures of the Earth, which cannot be measured directly. Scientists learn about the inside of the planet by observing the way seismic waves from earthquakes are altered as they pass through it. Ground motions induced by thunderstorms, which are much more frequent than earthquakes on the east coast of North America, could help reveal the hidden shapes of Earth's interior, Zhu said. | Earthquakes | 2,019
December 9, 2019 | https://www.sciencedaily.com/releases/2019/12/191209131952.htm | Formula 1 technology for the construction of skyscrapers | City, University of London draws on Formula 1 technology for the construction of "needle-like" skyscrapers. | Researchers at City, University of London are developing new vibration-control devices based on Formula 1 technology so "needle-like" high-rise skyscrapers which still withstand high winds can be built. Current devices called tuned mass dampers (TMDs) are fitted in the top floors of tall buildings to act like heavyweight pendulums counteracting building movement caused by winds and earthquakes. But they weigh up to 1,000 tons and span five storeys in 100-storey buildings -- adding millions to building costs and using up premium space in tight city centres. Recent research by Dr Agathoklis Giaralis (an expert in structural dynamics at City, University of London) and his colleagues was published in November 2019. Dr Giaralis said: "If we can achieve smaller, lighter TMDs, then we can build taller and thinner buildings without causing seasickness for occupants when it is windy. Such slender structures will require fewer materials and resources, and so will cost less and be more sustainable, while taking up less space and also being aesthetically more pleasing to the eye. In a city like London, where space is at a premium and land is expensive, the only real option is to go up, so this technology can be a game-changer." Tests have shown that up to 30% less steel is needed in the beams and columns of a typical 20-storey steel building thanks to the new devices. Computer model analyses for an existing London building, the 48-storey Newington Butts in Elephant and Castle, Southwark, have shown that "floor acceleration" -- the measure of occupants' comfort against seasickness -- can be reduced by 30% with the newly proposed technology. "This reduction in floor acceleration is significant," added Dr Giaralis. "It means the devices are also more effective in ensuring that buildings can withstand high winds and earthquakes. Even moderate winds can cause seasickness or dizziness to occupants and climate change suggests that stronger winds will become more frequent. The inerter-based vibration control technology we are testing is demonstrating that it can significantly reduce this risk with low up-front cost in new, even very slender, buildings and with small structural modifications in existing buildings." Dr Giaralis said there was a further advantage: "As well as achieving reduced carbon emissions through requiring fewer materials, we can also harvest energy from wind-induced oscillations -- I don't believe that we are able at the moment to have a building that is completely self-sustaining using this technology, but we can definitely harvest enough for powering wireless sensors used for inner building climate control." | Earthquakes | 2,019
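For background on how a conventional pendulum-style TMD like the ones described in the article above is matched to a building, the sketch below applies the classical Den Hartog tuning rules. These rules are textbook theory for a passive TMD on a lightly damped structure; they do not describe the inerter-based devices the City, University of London team is developing, and the example building period and mass ratio are assumptions chosen only for illustration.

```python
# Classical Den Hartog tuning rules for a conventional tuned mass damper (TMD).
# Textbook background only; NOT the inerter-based technology from the article.
import math

def den_hartog_tuning(mass_ratio):
    """Optimal TMD tuning for an undamped structure under harmonic forcing.

    mass_ratio: TMD mass divided by the modal mass of the structure (mu).
    Returns (frequency_ratio, damping_ratio):
      frequency_ratio = f_TMD / f_structure = 1 / (1 + mu)
      damping_ratio   = sqrt(3 * mu / (8 * (1 + mu)**3))
    """
    freq_ratio = 1.0 / (1.0 + mass_ratio)
    damping = math.sqrt(3.0 * mass_ratio / (8.0 * (1.0 + mass_ratio) ** 3))
    return freq_ratio, damping

if __name__ == "__main__":
    building_period_s = 6.0   # assumed sway period of a slender tower
    mu = 0.01                 # assumed TMD mass = 1% of the building's modal mass
    f_ratio, zeta = den_hartog_tuning(mu)
    tmd_period = building_period_s / f_ratio
    print(f"TMD period: {tmd_period:.2f} s, TMD damping ratio: {zeta:.3f}")
    # A lighter damper (smaller mu) gives less added damping -- one reason
    # conventional TMDs end up so heavy, and why lighter alternatives are sought.
```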
December 4, 2019 | https://www.sciencedaily.com/releases/2019/12/191204124543.htm | New, young volcano discovered in the Pacific | Researchers from Tohoku University have discovered a new petit-spot volcano at the oldest section of the Pacific Plate. The research team, led by Associate Professor Naoto Hirano of the Center for Northeast Asian Studies, published their discovery in a scientific journal. | Petit-spot volcanoes are a relatively recently discovered phenomenon. They are young, small volcanoes that come about along fissures from the base of tectonic plates. As the tectonic plates sink deeper into the Earth's upper mantle, fissures occur where the plate begins to bend, causing small volcanoes to erupt. The first discovery of petit-spot volcanoes was made in 2006 near the Japan Trench, located to the northeast of Japan. Rock samples collected from previous studies of petit-spot volcanoes indicate that the magma emitted stems directly from the asthenosphere -- the uppermost part of Earth's mantle, which drives the movement of tectonic plates. Studying petit-spot volcanoes provides a window into the largely unknown asthenosphere, giving scientists a greater understanding of plate tectonics, the kind of rocks existing there, and the melting process undergone below the tectonic plates. The volcano was discovered in the western part of the Pacific Ocean, near Minamitorishima Island, Japan's easternmost point, also known as Marcus Island. The volcano is thought to have erupted less than 3 million years ago due to the subduction of the Pacific Plate deeper into the mantle at the Mariana Trench. Previously, this area was thought to have contained only seamounts and islands formed 70-140 million years ago. The research team initially suspected the presence of a small volcano after observing bathymetric data collected by the Japan Coast Guard. They then analyzed rock samples collected by the Shinkai 6500, a manned submersible that can dive to depths of 6,500 meters, which confirmed the presence of the volcano. "The discovery of this new volcano provides an exciting opportunity for us to explore this area further, and hopefully reveal further petit-spot volcanoes," says Professor Hirano. He adds, "This will tell us more about the true nature of the asthenosphere." Professor Hirano and his team will continue to explore the site for similar volcanoes since mapping data demonstrates that the discovered volcano is part of a cluster. | Earthquakes | 2,019
December 4, 2019 | https://www.sciencedaily.com/releases/2019/12/191204113223.htm | Seismologists see future in fiber optic cables as earthquake sensors | Each hair-thin glass fiber in a buried fiber optic cable contains tiny internal flaws -- and that's a good thing for scientists looking for new ways to collect seismic data in places from a busy urban downtown to a remote glacier. | Distributed acoustic sensing, or DAS, works by using the tiny internal flaws of a long optical fiber as thousands of seismic sensors along tens of kilometers of fiber optic cable. An instrument at one end sends laser pulses down a cable and collects and measures the "echo" of each pulse as it is reflected off the internal fiber flaws. When the fiber is disturbed by changes in temperature, strain or vibrations -- caused by seismic waves, for instance -- there are changes in the size, frequency and phase of laser light scattered back to the DAS instrument. Seismologists can use these changes to determine the kinds of seismic waves that might have nudged the fiber, even if just by a few tens of nanometers. The sensitivity of DAS instruments has improved markedly over the past five years, opening new possibilities for their deployment, Zhan said. "The sensitivity is getting better and better, to the point that a few years ago if you compare the waveforms from a fiber section with a geophone, they look very similar to each other." Their performance makes them suitable for use across a variety of environments, especially in places where it would be too expensive to set up a more sensitive or dense seismic network. Researchers can also tap into the large amounts of unused or "dark" fiber that has been laid down previously by telecommunication companies and others. A few strands off a larger cable, said Zhan, would serve a seismologist's purposes. Zhan said the oil and gas industry has been one of the biggest drivers of the new method, as companies have used cables down boreholes to monitor fluid changes in deep-water oil fields and during hydraulic fracturing and wastewater injection. DAS researchers think the method is especially promising for seismic monitoring in harsh environments, like Antarctica -- or the moon. With a regular network of seismometers, scientists "need to protect and power each node" of instruments in the network, Zhan explained. "Where for DAS, you lay down one long strand of fiber, which is fairly sturdy, and all your sensitive instruments are only at one end of the fiber." "You can imagine that on the moon or some other planet, with a high radiation or high temperature scenario, the electronics might not survive that long in that environment," he added. "But fiber can." Scientists are already using DAS to probe thawing and freezing cycles in permafrost and on glaciers, to better characterize the dynamic motion of ice flows and their sliding on bedrock, which could help researchers learn more about how glacial melt driven by climate change contributes to sea level rise. At the moment, the range of most DAS systems is 10 to 20 kilometers. Researchers hope to extend this in the near future to 100 kilometers, Zhan said, which could be useful for seismic coverage in ocean bottom environments, including offshore subduction zones. DAS is also well-suited for rapid response after earthquakes, especially in areas where dark fiber is numerous and seismologists have made arrangements to use the fiber beforehand.
After the 2019 Ridgecrest earthquakes in southern California, for instance, Zhan and his colleagues moved quickly to monitor the aftershock sequence in the area using DAS. "We turned about 50 kilometers of cable into more than 6,000 sensors in three days," he said. If seismologists have done their legwork in identifying and requesting access to fibers ahead of time, Zhan said, a DAS system can be deployed within a few hours after an earthquake. One challenge in using fiber is knowing exactly how it lies in the ground. With the DAS method, researchers know how far along a fiber a particular sensor lies, but if the fiber optic cable is coiled or bent or sagging, the calculations could be off. To remedy this, seismologists sometimes do a "tap test" -- which maps sledgehammer blows along the ground above the cable with GPS, as the blows reverberate off the fiber to create a sort of sonar image of its twists and turns. DAS sensors also contain more "self-noise" -- background seismic signals that could interfere with an earthquake identification -- than traditional seismic sensors, "but frankly we don't exactly know why," said Zhan. Some of the noise might come from the interrogating laser pulses, which might not be stable, or from the cable itself. Some cables lie loose in their tunnels, and others have multiple fiber connectors, which might produce reflection and loss of the light signal. "While still in its infancy, DAS already has shown itself as the working heart -- or perhaps ear drums -- of a valuable new seismic listening tool," Zhan concluded. | Earthquakes | 2,019
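As a rough sketch of the measurement principle described in the article above, the snippet below converts an optical phase change measured over one DAS gauge length into fiber strain. The conversion is the relation commonly quoted for Rayleigh-backscatter DAS; the wavelength, refractive index, photoelastic factor and gauge length are nominal assumed values, not parameters from any specific interrogator used by these researchers.

    import math

    def phase_to_strain(delta_phi_rad,
                        wavelength_m=1550e-9,   # assumed telecom laser wavelength
                        n_fiber=1.46,           # assumed refractive index of silica fiber
                        xi=0.78,                # assumed photoelastic scaling factor
                        gauge_m=10.0):          # assumed gauge length
        """Convert an optical phase change (radians) over one gauge length into
        longitudinal fiber strain, using the commonly quoted DAS relation
        delta_phi = (4 * pi * n * xi * gauge / wavelength) * strain."""
        return delta_phi_rad * wavelength_m / (4.0 * math.pi * n_fiber * xi * gauge_m)

    # With these nominal values, a 1-radian phase change corresponds to roughly
    # 10 nanostrain, i.e. about ten nanometers of stretch per meter of fiber.
    print(f"{phase_to_strain(1.0):.2e} strain per radian of phase change")

The tiny numbers that come out of this kind of conversion are why the article talks about detecting fiber movements of just a few tens of nanometers.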
December 3, 2019 | https://www.sciencedaily.com/releases/2019/12/191203133845.htm | Southern Arizona once looked like Tibet | A University of Wyoming researcher and his colleagues have shown that much of the southwestern United States was once a vast high-elevation plateau, similar to Tibet today. | This work has implications for the distribution of natural resources, such as copper, and provides insight into the formation of mountains during the subduction of tectonic plates. "We normally think of southern Arizona and the surrounding areas as hot, cactus-laden deserts with relatively low base elevations, below 3,000 feet," says Jay Chapman, an assistant professor in UW's Department of Geology and Geophysics. "However, our recent research suggests that, during the Late Cretaceous to Early Paleogene period (80-50 million years ago), the region may have had elevations in excess of 10,000 feet and looked more like the Tibetan plateau north of the Himalayan Mountains or the Altiplano in the Andes Mountains in South America." Chapman is lead author of a paper, titled "Geochemical evidence for an orogenic plateau in the southern U.S. and northern Mexican Cordillera during the Laramide Orogeny," which was published online Nov. 22. Roy Greig, a Ph.D. student in the Department of Geosciences at the University of Arizona, and Gordon Haxel, a U.S. Geological Survey scientist based in Flagstaff, Ariz., are co-authors of the paper. Chapman and his colleagues analyzed the chemistry of igneous rocks to determine how thick Earth's crust was in the past and then related the thickness to elevation. "Earth's crust floats in the mantle just like an iceberg floats in the water, with a little bit sticking out above the surface," Chapman says. "When the crust is thicker, the height of mountains and the elevation of the land surface are higher, just like the height of an iceberg sticking out of the water is taller if the overall iceberg is larger." The study determined that the crust in southern Arizona was once almost 60 kilometers thick, which is twice as thick as it is today -- and comparable to how thick the crust is in parts of the Himalayas. "While the ancient mountains were forming, magma intruded into the crust and formed rocks like granite," Chapman says. "When the crust was really thick, the magmas experienced extreme pressure from the weight of all the rocks above them, which caused distinctive changes in the types and the chemistry of the minerals that formed those rocks." One of the interesting questions the study raises is how the crust in southern Arizona became so thick in the past. "The most common way to make really thick crust is for tectonic plates to converge or collide, which produces large earthquakes and faults that stack rock masses overtop one another," he says. "The prevailing view of southern Arizona is that there was never enough faulting in the area to make the thickness of crust we observe. It is a bit of a conundrum as to how such thick crust was generated." Adam Trzinski, a first-year Ph.D. student at UW, is now tackling this problem and searching for ancient faults in southern Arizona that could help explain how the crust became so thick. In addition to helping understand plate tectonic processes, the study may help explain why copper is so abundant in southern Arizona. "Several previous studies have noted a correlation between large copper ore deposits and regions of thick crust," Chapman says. "For example, there are many copper mines in the Andes Mountains in Chile.
The results of this study strengthen that correlation and may aid in exploration efforts." UW's College of Arts and Sciences and the National Science Foundation (NSF) funded the research. Chapman was awarded a $270,000 grant from the NSF in August to continue his work. | Earthquakes | 2,019
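Chapman's iceberg analogy is the principle of Airy isostasy, and it can be turned into a back-of-the-envelope number. The sketch below is an illustrative calculation only, with assumed average densities and an assumed reference crustal thickness; it is not the geochemical method the study actually used, but it shows why roughly doubling crustal thickness can support elevations on the order of 10,000 feet.

    RHO_CRUST = 2800.0         # assumed average crustal density, kg/m^3
    RHO_MANTLE = 3300.0        # assumed mantle density, kg/m^3
    REFERENCE_CRUST_KM = 40.0  # assumed crustal thickness for land near sea level

    def airy_elevation_km(crust_thickness_km):
        """Elevation above the reference surface for an isostatically compensated
        crustal column (Airy model): the extra thickness is split between surface
        elevation and a buoyant root in proportion to the density contrast."""
        extra = crust_thickness_km - REFERENCE_CRUST_KM
        return extra * (RHO_MANTLE - RHO_CRUST) / RHO_MANTLE

    elev_km = airy_elevation_km(60.0)  # ~60 km crust inferred for ancient southern Arizona
    print(f"~{elev_km:.1f} km (~{elev_km * 3281:.0f} ft) of elevation")  # 3281 ft per km

With these assumed numbers, a 60-kilometer-thick crust floats about three kilometers, roughly 10,000 feet, higher than the reference column, consistent with the elevations quoted in the article.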
November 28, 2019 | https://www.sciencedaily.com/releases/2019/11/191128172354.htm | Underwater telecom cables make superb seismic network | Fiber-optic cables that constitute a global undersea telecommunications network could one day help scientists study offshore earthquakes and the geologic structures hidden deep beneath the ocean surface. | In a paper appearing this week, the researchers describe their technique, which they had previously tested with fiber-optic cables on land and which could provide much-needed data on quakes that occur under the sea, where few seismic stations exist, leaving 70% of Earth's surface without earthquake detectors. "There is a huge need for seafloor seismology. Any instrumentation you get out into the ocean, even if it is only for the first 50 kilometers from shore, will be very useful," said Nate Lindsey, a UC Berkeley graduate student and lead author of the paper. Lindsey and Jonathan Ajo-Franklin, a geophysics professor at Rice University in Houston and a visiting faculty scientist at Berkeley Lab, led the experiment with the assistance of Craig Dawe of MBARI, which owns the fiber-optic cable. The cable stretches 52 kilometers offshore to the first seismic station ever placed on the floor of the Pacific Ocean, put there 17 years ago by MBARI and Barbara Romanowicz, a UC Berkeley Professor of the Graduate School in the Department of Earth and Planetary Science. A permanent cable to the Monterey Accelerated Research System (MARS) node was laid in 2009, 20 kilometers of which were used in this test while off-line for yearly maintenance in March 2018. "This is really a study on the frontier of seismology, the first time anyone has used offshore fiber-optic cables for looking at these types of oceanographic signals or for imaging fault structures," said Ajo-Franklin. "One of the blank spots in the seismographic network worldwide is in the oceans." The ultimate goal of the researchers' efforts, he said, is to use the dense fiber-optic networks around the world -- probably more than 10 million kilometers in all, on both land and under the sea -- as sensitive measures of Earth's movement, allowing earthquake monitoring in regions that don't have expensive ground stations like those that dot much of earthquake-prone California and the Pacific Coast. "The existing seismic network tends to have high-precision instruments, but is relatively sparse, whereas this gives you access to a much denser array," said Ajo-Franklin. The technique the researchers use is Distributed Acoustic Sensing, which employs a photonic device that sends short pulses of laser light down the cable and detects the backscattering created by strain in the cable that is caused by stretching. With interferometry, they can measure the backscatter every 2 meters (6 feet), effectively turning a 20-kilometer cable into 10,000 individual motion sensors. "These systems are sensitive to changes of nanometers to hundreds of picometers for every meter of length," Ajo-Franklin said. "That is a one-part-in-a-billion change." Earlier this year, they reported the results of a six-month trial on land using 22 kilometers of cable near Sacramento emplaced by the Department of Energy as part of its 13,000-mile ESnet Dark Fiber Testbed. Dark fiber refers to optical cables laid underground, but unused or leased out for short-term use, in contrast to the actively used "lit" internet.
The researchers were able to monitor seismic activity and environmental noise and obtain subsurface images at a higher resolution and larger scale than would have been possible with a traditional sensor network. "The beauty of fiber-optic seismology is that you can use existing telecommunications cables without having to put out 10,000 seismometers," Lindsey said. "You just walk out to the site and connect the instrument to the end of the fiber." During the underwater test, they were able to measure a broad range of frequencies of seismic waves from a magnitude 3.4 earthquake that occurred 45 kilometers inland near Gilroy, California, and map multiple known and previously unmapped submarine fault zones, part of the San Gregorio Fault system. They also were able to detect steady-state ocean waves -- so-called ocean microseisms -- as well as storm waves, all of which matched buoy and land seismic measurements. "We have huge knowledge gaps about processes on the ocean floor and the structure of the oceanic crust because it is challenging to put instruments like seismometers at the bottom of the sea," said Michael Manga, a UC Berkeley professor of earth and planetary science. "This research shows the promise of using existing fiber-optic cables as arrays of sensors to image in new ways. Here, they've identified previously hypothesized waves that had not been detected before." According to Lindsey, there's rising interest among seismologists in recording Earth's ambient noise field caused by interactions between the ocean and the continental land: essentially, waves sloshing around near coastlines. "By using these coastal fiber optic cables, we can basically watch the waves we are used to seeing from shore mapped onto the seafloor, and the way these ocean waves couple into the Earth to create seismic waves," he said. To make use of the world's lit fiber-optic cables, Lindsey and Ajo-Franklin need to show that they can ping laser pulses through one channel without interfering with other channels in the fiber that carry independent data packets. They're conducting experiments now with lit fibers, while also planning fiber-optic monitoring of seismic events in a geothermal area south of Southern California's Salton Sea, in the Brawley seismic zone. | Earthquakes | 2,019
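The channel count quoted in the article follows directly from the cable length and the 2-meter measurement spacing. The small sketch below just checks that arithmetic and maps a channel index back to a distance along the cable; the numbers are taken from the article, and the helper names are hypothetical.

    def das_channels(cable_length_m: float, channel_spacing_m: float):
        """Number of DAS measurement channels along a cable, plus a helper that
        converts a channel index into distance from the interrogator end."""
        n_channels = int(cable_length_m // channel_spacing_m)
        channel_to_distance = lambda idx: idx * channel_spacing_m
        return n_channels, channel_to_distance

    n, to_dist = das_channels(cable_length_m=20_000, channel_spacing_m=2.0)
    print(n)              # 10000 channels, as quoted for the 20-km MARS cable test
    print(to_dist(5000))  # channel 5000 sits about 10,000 m down the cable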
November 26, 2019 | https://www.sciencedaily.com/releases/2019/11/191126091313.htm | Extra-terrestrial impacts may have triggered 'bursts' of plate tectonics | When -- and how -- Earth's surface evolved from a hot, primordial mush into a rocky planet continually resurfaced by plate tectonics remain some of the biggest unanswered questions in earth science research. Now a new study suggests that extra-terrestrial impacts may have played a role. | "We tend to think of the Earth as an isolated system, where only internal processes matter," says Craig O'Neill, director of Macquarie University's Planetary Research Centre. "Increasingly, though, we're seeing the effect of solar system dynamics on how the Earth behaves." Modelling simulations and comparisons with lunar impact studies have revealed that following Earth's accretion about 4.6 billion years ago, Earth-shattering impacts continued to shape the planet for hundreds of millions of years. Although these events appear to have tapered off over time, spherule beds -- distinctive layers of round particles condensed from rock vaporized during an extra-terrestrial impact -- found in South Africa and Australia suggest the Earth experienced a period of intense bombardment about 3.2 billion years ago, roughly the same time the first indications of plate tectonics appear in the rock record. This coincidence caused O'Neill and co-authors Simone Marchi, William Bottke, and Roger Fu to wonder whether these circumstances could be related. "Modelling studies of the earliest Earth suggest that very large impacts -- more than 300 km in diameter -- could generate a significant thermal anomaly in the mantle," says O'Neill. This appears to have altered the mantle's buoyancy enough to create upwellings that, according to O'Neill, "could directly drive tectonics." But the sparse evidence found to date from the Archean -- the period of time spanning 4.0 to 2.5 billion years ago -- suggests that mostly smaller impacts less than 100 km in diameter occurred during this interval. To determine whether these more modest collisions were still large and frequent enough to initiate global tectonics, the researchers used existing techniques to expand the Middle Archean impact record and then developed numerical simulations to model the thermal effects of these impacts on Earth's mantle. The results indicate that during the Middle Archean, 100-kilometer-wide impacts (about 30 km wider than the much younger Chicxulub crater) were capable of weakening Earth's rigid, outermost layer. This, says O'Neill, could have acted as a trigger for tectonic processes, especially if Earth's exterior was already "primed" for subduction. "If the lithosphere were the same thickness everywhere, such impacts would have little effect," states O'Neill. But during the Middle Archean, he says, the planet had cooled enough for the mantle to thicken in some spots and thin in others. The modelling showed that if an impact were to happen in an area where these differences existed, it would create a point of weakness in a system that already had a large contrast in buoyancy -- and ultimately trigger modern tectonic processes. "Our work shows there is a physical link between impact history and tectonic response at around the time when plate tectonics was suggested to have started," says O'Neill. "Processes that are fairly marginal today -- such as impacting, or, to a lesser extent, volcanism -- actively drove tectonic systems on the early Earth," he says.
"By examining the implications of these processes, we can start exploring how the modern habitable Earth came to be." | Earthquakes | 2,019 |
November 22, 2019 | https://www.sciencedaily.com/releases/2019/11/191122113307.htm | New technology developed to improve forecasting of Earthquakes, Tsunamis | University of South Florida geoscientists have successfully developed and tested a new high-tech shallow water buoy that can detect the small movements and changes in the Earth's seafloor that are often a precursor to deadly natural hazards, like earthquakes, volcanoes and tsunamis. | The buoy, created with the assistance of an $822,000 grant from the National Science Foundation's Ocean Technology and Interdisciplinary Coordination program, was installed off Egmont Key in the Gulf of Mexico last year and has been producing data on the three-dimensional motion of the sea floor. Ultimately the system will be able to detect small changes in the stress and strain in the Earth's crust, said USF School of Geosciences Distinguished Professor Tim Dixon. The patent-pending seafloor geodesy system is an anchored spar buoy topped by a high-precision Global Positioning System (GPS). The buoy's orientation is measured using a digital compass that provides heading, pitch, and roll information -- helping to capture the crucial side-to-side motion of the Earth that can be diagnostic of major tsunami-producing earthquakes, Dixon said. He was joined in leading the project by USF Geosciences Ph.D. student Surui Xie, Associate Professor Rocco Malservisi, USF College of Marine Science's Center for Ocean Technology research faculty member Chad Lembke, and a number of USF ocean technology personnel. Their findings were recently published. While there are several techniques for seafloor monitoring currently available, that technology typically works best in the deeper ocean where there is less noise interference. Shallow coastal waters (less than a few hundred meters depth) are a more challenging environment but also an important one for many applications, including certain types of devastating earthquakes, the researchers said. Offshore strain accumulation and release processes are critical for understanding megathrust earthquakes and tsunamis, they noted. The experimental buoy rests on the sea bottom using a heavy concrete ballast and has been able to withstand several storms, including Hurricane Michael's march up the Gulf of Mexico. The system is capable of detecting movements as small as one to two centimeters, said Dixon, an expert on natural hazards and author of the book Curbing Catastrophe. "The technology has several potential applications in the offshore oil and gas industry and volcano monitoring in some places, but the big one is for improved forecasting of earthquakes and tsunamis in subduction zones," Dixon said. "The giant earthquakes and tsunamis in Sumatra in 2004 and in Japan in 2011 are examples of the kind of events we'd like to better understand and forecast in the future." Dixon said the system is designed for subduction zone applications in the Pacific Ocean's "Ring of Fire" where offshore strain accumulation and release processes are currently poorly monitored. One example where the group hopes to deploy the new system is the shallow coastal waters of earthquake-prone Central America. The Egmont Key test location sits in just 23 meters depth. While Florida is not prone to earthquakes, the waters off Egmont Key proved an excellent test location for the system. It experiences strong tidal currents that tested the buoy's stability and orientation correction system.
The next step in the testing is to deploy a similar system in deeper water of the Gulf of Mexico off Florida's west coast. | Earthquakes | 2,019 |
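The article describes combining GPS positions at the top of the spar buoy with compass heading, pitch and roll to recover motion of the anchor on the seafloor. The sketch below shows the generic lever-arm correction such a system implies: rotating the antenna's offset from the anchor into the local frame and subtracting it from the GPS position. It is an illustrative simplification with made-up numbers, not the USF team's actual processing.

    import numpy as np

    def rotation_matrix(heading_deg, pitch_deg, roll_deg):
        """Body-to-local rotation built from yaw (heading), pitch and roll."""
        y, p, r = np.radians([heading_deg, pitch_deg, roll_deg])
        rz = np.array([[np.cos(y), -np.sin(y), 0], [np.sin(y), np.cos(y), 0], [0, 0, 1]])
        ry = np.array([[np.cos(p), 0, np.sin(p)], [0, 1, 0], [-np.sin(p), 0, np.cos(p)]])
        rx = np.array([[1, 0, 0], [0, np.cos(r), -np.sin(r)], [0, np.sin(r), np.cos(r)]])
        return rz @ ry @ rx

    def anchor_position(gps_enu_m, heading_deg, pitch_deg, roll_deg, antenna_height_m):
        """Subtract the tilted antenna lever arm to estimate the seafloor anchor position."""
        lever_body = np.array([0.0, 0.0, antenna_height_m])  # antenna sits above the anchor
        lever_local = rotation_matrix(heading_deg, pitch_deg, roll_deg) @ lever_body
        return np.asarray(gps_enu_m) - lever_local

    # Hypothetical sample: buoy tilted 3 degrees by a tidal current in ~23 m of water.
    print(anchor_position([10.0, 5.0, 23.5], heading_deg=40, pitch_deg=3, roll_deg=0,
                          antenna_height_m=23.5))

Without this tilt correction, a few degrees of lean on a 23-meter spar would masquerade as roughly a meter of horizontal seafloor motion, far larger than the centimeter-level signals the system is after.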
November 21, 2019 | https://www.sciencedaily.com/releases/2019/11/191121121745.htm | Life under extreme conditions at hot springs in the ocean | The volcanic island of Kueishantao in northeastern Taiwan is an extreme habitat for marine organisms. With an active volcano, the coastal area has a unique hydrothermal field with a multitude of hot springs and volcanic gases. The acidity of the study area was among the highest in the world. The easily accessible shallow water around the volcanic island therefore represents an ideal research environment for investigating the adaptability of marine organisms, some of which are highly specialised, such as crabs, to highly acidified and toxic seawater. | For about ten years, marine researchers from the Institute of Geosciences at Kiel University (CAU), together with their Chinese and Taiwanese partners from Zhejiang University in Hangzhou and the National Taiwan Ocean University in Keelung, regularly collected data on geological, chemical and biological processes, until two events disrupted the time series in 2016. First, the island was shaken by an earthquake and hit by the severe tropical typhoon Nepartak only a few weeks later. On the basis of data collected over many years, the researchers from Kiel, China and Taiwan were now able to demonstrate for the first time that biogeochemical processes had changed due to the consequences of the enormous earthquake and typhoon, and how different organisms were able to adapt to the changed seawater biogeochemistry in the course of only one year. The first results of the interdisciplinary study, based on extensive data dating back to the 1960s, were recently published in the international journal Nature. "Our study clearly shows how closely atmospheric, geological, biological and chemical processes interact and how an ecosystem with extreme living conditions such as volcanic sources on the ocean floor reacts to disturbances caused by natural events," says Dr. Mario Lebrato of the Institute of Geosciences at Kiel University. For years, scientists led by Dr. Dieter Garbe-Schönberg and Dr. Mario Lebrato from the Institute of Geosciences at the CAU have been researching the shallow hydrothermal system "Kueishantao." The selected site has a large number of carbon dioxide emissions in the shallow water. In addition, the sources release toxic metals. Sulphur discolours the water over large areas. The volcanic gases -- with a high content of sulphur compounds -- lead to a strong acidification of the sea water. Through methods of airborne drone surveying, modelling, regular sampling and laboratory experiments, research into the hydrothermal field therefore makes an important contribution to understanding the effects of ocean acidification on marine communities. Only a few specialized animal species such as crabs, snails and bacteria live in the immediate vicinity of the sources. A few metres away, on the other hand, is the diverse life of a tropical ocean. "Due to the high acidity, the high content of toxic substances and elevated temperatures of the water, the living conditions prevailing there can serve as a natural laboratory for the investigation of significant environmental pollution by humans. The sources at Kueishantao are therefore ideal for investigating future scenarios," says co-author Dr. Yiming Wang, who recently moved from Kiel University to the Max Planck Institute for the Science of Human History in Jena. After the severe events in 2016, the study area changed completely.
The seabed was buried under a layer of sediment and rubble. In addition, the acidic warm water sources dried up, and the composition of the sea water had significantly and continuously changed over a long period of time. Aerial photos taken with drones, samples taken by research divers from Kiel and Taiwan, as well as biogeochemical investigations, clearly showed the spatial and chemical extent of the disturbances. These were recorded by the biologist and research diver Mario Lebrato and his Taiwanese colleague Li Chun Tseng and compared with the results of earlier samplings. "What initially looked like a catastrophe for our current time series study turned out to be a stroke of luck afterwards. This gave us the rare opportunity to observe how organisms adapt to the severe disturbances. We were able to draw on a comprehensive database to do this," explains project manager Dr. Dieter Garbe-Schönberg from the Institute of Geosciences at Kiel University. | Earthquakes | 2,019
November 19, 2019 | https://www.sciencedaily.com/releases/2019/11/191119132523.htm | Evidence of two earthquakes extends rupture history in Grand Tetons National Park | Hand-dug trenches around Leigh Lake in Grand Teton National Park in Wyoming reveal evidence for a previously unknown surface-faulting earthquake along the Teton Fault -- one occurring about 10,000 years ago. | Together with evidence from the site of a second earthquake that ruptured around 5,900 years ago, the findings, published in the Bulletin of the Seismological Society of America (BSSA), extend the known rupture history of the Teton Fault. The Teton Fault is one of the fastest-moving normal faults in the western United States, separating the eastern edge of the Teton Range from the Jackson Hole basin. The fault is divided into southern, central and northern segments, with the Leigh Lake site falling within the central segment. A previous study identified two Teton Fault earthquakes that occurred 8,000 years ago and 4,700 to 7,900 years ago on the southern segment at Granite Canyon, one of the most famous hiking spots in the Grand Teton National Park. The younger earthquake at Leigh Lake may be the same rupture as the youngest Granite Canyon earthquake, confirming that there were at least three earthquakes in Holocene times, and that the most recent activity along the fault occurred about 6,000 years ago, said Mark Zellman of BGC Engineering, Inc., the lead author of the BSSA study. Although the Leigh Lake study doesn't provide a definite answer to the question of whether multiple segments of the Teton fault have ruptured at once, Zellman said the findings "do give us a clue that multi-section ruptures are possible." The overlap in age between the youngest Leigh Lake earthquake and the youngest Granite Canyon earthquake "leaves open the possibility that at least the southern and central section of the Teton fault ruptured together during the most recent event." Given the Teton fault's high rate of movement in the past, it has been a surprisingly long time since its last earthquake, said Zellman. "The seemingly regular and relatively short intervals of time between these three events makes the long period of quiescence on the Teton fault even more surprising," he said. "I was expecting that we would have found evidence for at least one rupture that post-dates the youngest event known from Granite Canyon." Zellman and colleagues chose Leigh Lake as a study site because no other paleoseismic studies had been conducted previously on this central segment of the fault, and because the site offered several small and easy-to-reach scarps for shovel excavations. The researchers excavated at two of three scarps that represent the fault's movement in postglacial times. The remoteness of the site and its location within a national park prevented the researchers from using heavy equipment to dig and backfill their shallow trenches. In the future, Zellman said, "it would be nice to identify a location or two where we could excavate a deeper trench to expose a longer record." Asked about the older event at Granite Canyon that was not found at Leigh Lake, Zellman said "evidence for that earthquake might be preserved in the third scarp. But we won't know for sure until we excavate that scarp." The researchers examined the coarse exposed sediments in the trenches for signs of past faulting, in some places analyzing the orientation of large rock clasts within the trench walls to reveal the fault's presence.
The earthquake ruptures were dated using radiocarbon and optically stimulated luminescence methods. Based on the length of the fault ruptures, Zellman and colleagues estimate the 10,000-year-old earthquake may have been a magnitude 6.6 to 7.2 quake, while the 5,900-year-old earthquake may have been magnitude 7.0 to 7.2. Zellman said other studies of sites along the fault's northern segment, combined with data from studies that look at landslides and other signs of paleoseismic activity contained in deep lake sediments from the region, will help further fill in the history of the Teton Fault. | Earthquakes | 2,019
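Estimating magnitude from rupture length, as described above, is usually done with empirical scaling relations. The sketch below uses the widely cited Wells and Coppersmith (1994) regression between surface rupture length and moment magnitude as an illustration; the rupture lengths plugged in are hypothetical, and the study's own estimates may rest on different relations or fault-specific assumptions.

    import math

    def magnitude_from_rupture_length(srl_km: float) -> float:
        """Moment magnitude from surface rupture length (km), using the
        Wells & Coppersmith (1994) 'all slip types' regression:
        M = 5.08 + 1.16 * log10(SRL)."""
        return 5.08 + 1.16 * math.log10(srl_km)

    # Hypothetical rupture lengths covering part to most of the roughly 70-km-long fault.
    for srl in (25, 40, 60):
        print(f"{srl:>3} km rupture -> M ~{magnitude_from_rupture_length(srl):.1f}")

These illustrative lengths give magnitudes of about 6.7 to 7.1, the same ballpark as the 6.6 to 7.2 range reported for the Leigh Lake earthquakes.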
November 13, 2019 | https://www.sciencedaily.com/releases/2019/11/191113170312.htm | Fault-slip: AI to simulate tectonic plate deformation | Each year, anywhere from a few hundred to tens of thousands of deaths are attributed to the catastrophic effects of major earthquakes. Apart from ground shaking, earthquake hazards include landslides, dam ruptures, flooding, and worse -- if the sea floor is suddenly displaced during an earthquake, it can trigger a deadly tsunami. | Although earthquakes can't be prevented, processes involving the Earth's tectonic plates that make up its crust and upper mantle can provide scientists with clues about the possible effects of these impending disasters before they arrive. A team led by Professor Tsuyoshi Ichimura at the Earthquake Research Institute (ERI) at the University of Tokyo (UTokyo) is studying the deformation of tectonic plates to aid physics-based forecasting of natural disasters such as earthquakes. Specifically, the team is simulating a tectonic plate boundary spanning from Vancouver, British Columbia, down to Northern California. At this boundary -- called the Cascadia Subduction Zone -- the coastal Explorer, Juan de Fuca, and Gorda plates move east and shift underneath the North American Plate, a process known as subduction that can trigger large-magnitude earthquakes and volcanic activity. The team recently extended and optimized one of its scientific codes for the world's most powerful and smartest supercomputer for open science, the IBM AC922 Summit at the Oak Ridge Leadership Computing Facility (OLCF), a US Department of Energy (DOE) Office of Science User Facility located at DOE's Oak Ridge National Laboratory (ORNL). By transforming the Unstructured fiNite element ImpliCit sOlver with stRuctured grid coarseNing (UNICORN) code into an artificial intelligence (AI)-like algorithm, the team ran UNICORN at 416 petaflops and gained a 75-fold speedup from a previous state-of-the-art solver by fully leveraging the power of the Tensor Cores on Summit's Volta GPUs. Tensor Cores are specialized processing units that rapidly carry out matrix multiplications and additions using mixed precision calculations. "The Tensor Cores aren't available for just any type of calculation," said Kohei Fujita, assistant professor at ERI. "For this reason, we had to align all of our data access patterns and multiplication patterns to suit them." Data access patterns determine how data is accessed in memory by a software program and can be organized more efficiently to exploit a particular computer architecture. Using UNICORN, the UTokyo team simulated a 1,944 km × 2,646 km × 480 km area at the Cascadia Subduction Zone to look at how the tectonic plate is deformed due to a phenomenon called a "fault slip," a sudden shift that occurs at the plate boundary. The team said the new solver can be used as a tool to aid scientists in the arduous task of long-term earthquake forecasting -- a goal that, when realized, could lead to earthquake prediction and disaster mitigation. Previously, the team demonstrated a general approach to introduce AI to scientific applications in the iMplicit sOlver wiTH artificial intelligence and tRAnsprecision computing, or MOTHRA, code -- an achievement that earned them an Association for Computing Machinery Gordon Bell finalist nomination last year. "For UNICORN, we optimized the code specifically for Summit," said ERI doctoral student Takuma Yamaguchi.
"New hardware with some specific features sometimes requires sophisticated implementations to achieve better performance."UNICORN performs denser computations, allowing it to take full advantage of the unique architecture of Summit, which features 9,216 IBM POWER9 CPUs and 27,648 NVIDIA Volta GPUs. The most computationally expensive piece of the code ran at 1.1 exaflops using mixed precision -- a major undertaking for a code that is based on equations rather than deep learning computations. (Codes based on the latter are inherently optimal for systems such as Summit.)For future earthquake problems, the team will need to apply UNICORN to analyze the Earth's crust and mantle responses to a fault slip over time. This will require thousands of simulations then hundreds or thousands of additional iterations to compare the results with real-world earthquake events."To reach our earthquake forecasting goals, we will have to do many simulations of crust deformation and then compare our results with observed records from past earthquakes," Ichimura said. | Earthquakes | 2,019 |
November 12, 2019 | https://www.sciencedaily.com/releases/2019/11/191112114007.htm | New exploration method for geothermal energy | Where to drill? This is the basic question in the exploration of underground energy resources, such as geothermal energy. Water in rocks flows along permeable pathways, which are the main target for geothermal drilling. Borehole, core and micro-earthquake data show that the pathways are spatially connected, permeable structures, such as fractures or faults in the rock. However, the geothermal potential of these structures cannot be fully exploited with the techniques available to date. | A research team led by Maren Brehme, research scientist at the GFZ German Research Centre for Geosciences until August 2019 and now Assistant Professor at the TU Delft, presents a new method for locating potential drilling sites that are covered by water. "In the future, our method will make it possible to map geological structures under water and draw conclusions about the inflow from surrounding layers," says Maren Brehme. Since geothermal fields are often located in volcanic areas, they usually occur near or below crater lakes. "However, these lakes hide structures that are important for geothermal energy," explains Maren Brehme. "In the study, we showed that volcanic lakes such as Lake Linau in Indonesia, which we investigated, have so-called 'sweet spots', deep holes with fluid inflow from the surrounding rock." The method is not limited to volcanic lakes though. It can also be applied to other underwater areas. The new approach combines bathymetry measurements with geochemical profiles. In this study, bathymetry (from Greek bathýs 'deep' and métron 'measure') is used to map fault zones and geyser-like holes in the lake floor. Its most important feature is the echo sounder. The geochemical profiles from data on temperature, salinity, density and pH at different depths show areas in the lake with inflows from the surrounding geothermal reservoir. The combination allows the distinction between permeable and non-permeable structures, which was previously not possible. With this method, promising locations for drilling can be located more precisely. The related field work took place in 2018 during an expedition to Lake Linau led by Maren Brehme. It was part of the long-standing GFZ cooperation with Indonesian partners funded by the German Federal Ministry of Education and Research. Lake Linau is only a few kilometres from the Lahendong site, where the first geothermal low-temperature demonstration power plant in Indonesia, jointly developed by GFZ and Indonesian partners, was successfully commissioned in 2017. | Earthquakes | 2,019
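The geochemical side of the method boils down to spotting depths where a profile deviates from the lake's background, a sign of inflow from the surrounding reservoir. The sketch below flags such depths in a synthetic temperature profile using a simple robust threshold (median plus a few median absolute deviations). The data and the threshold are made up for illustration; the GFZ team's actual criteria for identifying "sweet spots" are not described at this level of detail in the article.

    import numpy as np

    def flag_inflow_depths(depths_m, values, n_mad=4.0):
        """Return depths where a profile value (e.g. temperature) sits far above
        the robust background level, a crude proxy for geothermal inflow."""
        baseline = np.median(values)
        mad = np.median(np.abs(values - baseline)) or 1e-9
        return depths_m[values > baseline + n_mad * mad]

    # Synthetic profile: ~26 C background with a warm anomaly near 28-30 m depth.
    depths = np.arange(0.0, 35.0, 1.0)
    temps = 26.0 + 0.05 * np.random.default_rng(1).standard_normal(depths.size)
    temps[28:31] += 3.0
    print(flag_inflow_depths(depths, temps))   # flags the depths around 28-30 m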
November 12, 2019 | https://www.sciencedaily.com/releases/2019/11/191112073557.htm | Shedding new light on earthquake that killed 9,000 people | A new understanding of a fault that caused a deadly 7.8 magnitude earthquake can help scientists better understand where and when the next big one will hit. | For decades, scientists have debated the structure of the Main Himalayan Thrust -- the fault responsible for a 2015 earthquake that killed nearly 9,000 people, injured 22,000, and destroyed 600,000 homes in Gorkha, Nepal. This fault is a direct result of ongoing collision between two tectonic plates -- the Indian and Eurasian -- that gives rise to the Himalayas. Led by UC Riverside, a team of researchers has determined a new geometric model for the fault that will allow officials to better prepare for future shakers. The team's work is detailed in a newly published paper. "This is the most high-resolution model of this fault structure to date," said Abhijit Ghosh, a UCR associate professor of geophysics. "With this knowledge we can better explain why the quake happened the way it happened, and better estimate the stress points along the fault that may act as birthplaces for future large damaging earthquakes." Following the quake, Ghosh and his collaborators rushed to Nepal to operate a network of 45 seismometers in the ground. Their journey was complicated by the difficulty of traveling in that high-altitude, rocky region as well as the timing of the quake during monsoon season. Despite the difficulties, the team pushed through because the existing network of aftershock-measuring devices, known as seismic stations, was very limited. Without data on the aftershocks, such as their locations and magnitude, it would not have been possible to develop a more detailed understanding of the fault. "The geometry of the fault also matters," Ghosh said. "It's critical to look at smaller earthquakes and aftershocks to determine where the stress points are in a fault. Fault geometry plays a major role in earthquake generation." It is also critical to learn the shape of a fault, as well as earthquake "style," meaning the ways in which a block of rock moves relative to other rocks during an earthquake. The team found that the Himalayan Thrust, which runs more than 1,000 kilometers from Pakistan to Myanmar, is built in a shape known as a duplex in the area where the magnitude 7.8 earthquake occurred in 2015. "It consists of two horizontal planes connected by a complex structure bounded by many not-quite-horizontal faults," Ghosh explained. This study was funded by the National Science Foundation. The first author of the paper is Matt Mendoza, a PhD student in the Ghosh Earthquake Seismology lab at UCR. Partners included the Government of Nepal's Department of Mining and Geology, as well as researchers from Stanford University, the University of Texas, El Paso, and the University of Oregon. The team concluded the fault is still accumulating stress, and that the 2015 event may have increased the likelihood of another big earthquake nearby. This last point may be of interest to Californians. Accumulated stress from the Nepalese earthquake may be adding stress to parts of the underlying fault that haven't ruptured yet, and the same may be true of faults in Southern California. In particular, Ghosh is interested in the aftermath of a magnitude 7.1 earthquake that hit Ridgecrest, Calif., on July 5.
The 150-mile-long Garlock fault lies perpendicular to the one that caused the Ridgecrest quake and could produce an even larger and more damaging earthquake. Those residing anywhere near major fault lines should always have an emergency plan and supplies on hand because earthquakes are inevitable. "The moral of this story is if you live anywhere near a fault, get your earthquake kit ready," Ghosh said. "That is always the moral of the story." | Earthquakes | 2,019
November 7, 2019 | https://www.sciencedaily.com/releases/2019/11/191107111740.htm | Geoscientists hope to make induced earthquakes predictable | University of Oklahoma Mewbourne College of Earth and Energy assistant professor Xiaowei Chen and a group of geoscientists from Arizona State University and the University of California, Berkeley, have created a model to forecast induced earthquake activity from the disposal of wastewater after oil and gas production. | "In this region of the country, for every barrel of oil produced from the ground, usually between eight and nine barrels of water are also extracted from many wells," said Chen. The large amount of water leads to a problem for oil producers -- what to do with it? Also called brine, this wastewater contains salt, minerals and trace amounts of oil, making it unusable for consumption or agricultural purposes and cost-prohibitive to treat. It is disposed of by injecting it back into the earth, deep into porous rock formations. Wastewater injection can cause earthquakes, explained Chen, and while most of the recent earthquakes in Oklahoma have been small, several have been in excess of 3.0 on the Richter scale. Chen and a team of researchers, led by Guang Zhai and Manoochehr Shirzaei from ASU, and Michael Manga from UC Berkeley, set out to find a way to make induced earthquakes in Oklahoma predictable and small. Their method, explained Chen, was to "create a model that correlates injected wastewater volume with stress changes on nearby faults and the number of earthquakes in that area." Forecasting the amount of seismic activity from wastewater injection is difficult because it involves accounting for numerous variables. The team tackled each issue in turn. Chen and her fellow researchers studied subsurface hydrology parameters -- how fast fluid moves within porous rocks and how quickly introduced fluid changes the stress in the subsurface basement. This is important because the subsurface basement is the location of Oklahoma's induced earthquakes. While the ASU team used satellite data to determine subsurface hydrology parameters, Chen focused on space and time distributions of earthquakes, and determined hydrology parameters by looking at how fast earthquakes move away from injection zones. By comparing both sets of data, researchers further increased the accuracy of their model. As it turns out, the Arbuckle Group, a sedimentary layer that sits on top of the subsurface basement deep within the earth, is especially permeable, allowing brine, and therefore earthquakes, to easily spread. "When we inject brine into the Arbuckle Group at a depth of 1-3 kilometers, it can transport through the porous rocks, modifying stresses and causing earthquakes on basement faults," said Chen. Next, researchers can plug the amount of fluid into the model. By adding the volume of fluid injected in a particular area into their model, they can determine the stress it will place on that region as it spreads. With the brine variables accounted for, researchers then added information about pre-existing faults into regional calculations. The more researchers know about a particular area, the more accurate the data will be. "If we are going to operate in an area where we don't have any prior seismicity, it will be a little challenging to forecast accurately," said Chen.
"But by operating in a new area and taking real-time parameters, operators and researchers should be able to forecast future behavior."Chen hopes that by following the results of the models she helped create, oil operators in the state can create new protocols for how much wastewater to inject and where.This could help prevent large induced earthquakes in Oklahoma. Chen does not believe forthcoming protocols will end induced seismicity altogether, but rather will help cap earthquake size and rate with restricted injection control. This method can forecast future induced seismicity.Chen foresees a protocol similar to tornado watches -- a window of time where Oklahomans are warned they may feel minor tremors in a region of the state.According to Chen, this is an area where the close working ties between geoscientists and petroleum engineers will need to be even stronger. So far, her research has garnered interest from both geoscientists and petroleum engineers in industry and academia. | Earthquakes | 2,019 |
November 6, 2019 | https://www.sciencedaily.com/releases/2019/11/191106120432.htm | Earthquake impact can be affected by seasonal factors, historical study shows | The season in which an earthquake occurs could affect the extent of ground failure and destruction that the event brings, according to a new look at two historical earthquakes that occurred about 100 years ago near Almaty, Kazakhstan. | In a newly published paper, the researchers suggest that a frozen layer of near-surface soil may have inhibited the drainage of pore-pressure excess through the surface during the earthquake, causing liquefaction at depth. In effect, the frozen layer extending about one meter below the surface "was a sealant layer that was not allowing the pore pressure to diffuse," explained Stefano Parolai, a co-author of the paper at the Istituto Nazionale di Oceanografia e di Geofisica Sperimentale in Italy. The findings suggest seismologists should incorporate potential seasonal differences in soil characteristics "as they are making probabilistic liquefaction or ground failure assessments," added co-author Denis Sandron, also at the Istituto Nazionale. The effect of frozen ground on ground deformation is already calculated for some types of infrastructure such as oil pipelines in Alaska, but Parolai said the Kazakhstan study shows the importance of considering these effects in urban areas as well, especially when it can become a seasonal effect. The 1887 Verny earthquake (Verny is the former name of Almaty) destroyed nearly all of the town's adobe buildings and killed 300 people. The Kemin earthquake, which took place about 40 kilometers from Verny, caused a surprising amount of widespread ground failure and destruction and 390 deaths. Parolai and his colleagues reviewed historical records of the two earthquakes as part of a larger project studying site effects and seismic risk in central Asia led by the GFZ German Research Centre for Geosciences. The two Almaty earthquakes, including their secondary effects such as landslides, "were very well documented by expeditions of the Mining Department of Russia and the Russian Mining Society at the time," Parolai said. The differences in ground failure between the two earthquakes were perplexing to the researchers. The fact that the two earthquakes had taken place at different times of year prompted Sandron and his colleagues to consider whether frozen ground might have been a factor in ground failure, as previous researchers had noted for the 1964 magnitude 9.2 Great Alaska earthquake. To explore this idea, the research team created computer simulations of the earthquakes using different models of the soil profile, which affect the velocity of seismic waves passing through the ground. They combined these with temperature data (more than 100 years of temperature recordings exist for Almaty) to determine whether a frozen layer of ground at shallow depth would have been likely during January. One of the other challenges, said co-author Rami Alshembari, formerly of the International Centre for Theoretical Physics in Italy and currently at the University of Exeter in the United Kingdom, was finding a way to include appropriate strong motion data recordings in the simulations, since "of course there were no digital recordings of these two earthquakes.
We had to choose the most reasonable and robust studies for input of strong motion," finally settling on data taken from the 1999 magnitude 7.6 Chi-Chi earthquake in Taiwan, appropriately modified, as being most similar to the Almaty earthquakes. Models that included a one-meter deep frozen layer as a seal against pore pressure draining were the best fit for the ground failure seen in the Kemin earthquake, they concluded. Although the researchers suspected that a frozen layer could be the culprit, Parolai said they were surprised by the strength of the effect. The findings suggest that other seismologists should consider seasonality in their site effect studies, he noted. "Even in materials where we would not expect this effect, due to local conditions and temperatures it could happen." "Without this good documentation, probably we would not have noticed this effect," said Parolai. "This is telling us that good data taken in the past can be very precious in 100 years." | Earthquakes | 2,019
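The "sealant layer" idea can be pictured with a one-dimensional pore-pressure diffusion toy model: excess pressure generated by shaking dissipates upward much more slowly when the surface boundary is sealed (frozen) than when it is free-draining. The finite-difference sketch below compares the two boundary conditions; the diffusivity, layer depth and time window are arbitrary illustration values, not calibrated to the Almaty soils or to the simulations in the paper.

    import numpy as np

    def excess_pressure_left(sealed_top: bool, diffusivity=1e-3, depth_m=10.0,
                             dx=0.1, total_time_s=3600.0):
        """Fraction of the initial excess pore pressure remaining in the upper 2 m
        after diffusing for total_time_s, with a drained or sealed surface."""
        n = int(depth_m / dx) + 1
        u = np.ones(n)                        # uniform excess pressure at t = 0
        dt = 0.4 * dx ** 2 / diffusivity      # within the explicit-scheme stability limit
        steps = int(total_time_s / dt)
        for _ in range(steps):
            u[1:-1] += diffusivity * dt / dx ** 2 * (u[2:] - 2 * u[1:-1] + u[:-2])
            u[-1] = u[-2]                       # no-flux base
            u[0] = u[1] if sealed_top else 0.0  # frozen cap vs. free-draining surface
        return u[: int(2.0 / dx)].mean()

    print(f"drained surface, pressure left near surface: {excess_pressure_left(False):.2f}")
    print(f"frozen (sealed), pressure left near surface: {excess_pressure_left(True):.2f}")

The drained run sheds a large share of its near-surface excess pressure while the sealed run keeps almost all of it, which is the qualitative mechanism the researchers invoke for the greater ground failure in the winter earthquake.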
November 4, 2019 | https://www.sciencedaily.com/releases/2019/11/191104121558.htm | Historical data confirms recent increase in West Texas earthquakes | A new analysis of historical seismic data led by The University of Texas at Austin has found that earthquake activity in West Texas near the city of Pecos has increased dramatically since 2009. | The study was published Nov. 4. The researchers were able to extend the seismic record of the area by turning to the older TXAR system near Lajitas, Texas, about 150 miles to the south. TXAR is an array of 10 seismographs installed in the 1990s by scientists at Southern Methodist University (SMU) to help track nuclear testing across the world, said lead author Cliff Frohlich, a senior research scientist emeritus at the University of Texas Institute for Geophysics (UTIG). "Especially for these West Texas earthquakes, we would like to get some information about when they started," Frohlich said. "I really saw this as a way to bridge the gap before TexNet." The TXAR system is some distance from Pecos, but Frohlich said the equipment is highly sensitive and that the area is remote and seismically very quiet, making the system perfect for picking up vibrations from explosions across the world or from earthquakes 150 miles away. Frohlich and SMU scientist Chris Hayward developed a method to derive the earthquake data from the international data TXAR collects and build an earthquake catalog for the Delaware Basin near Pecos from 2000 to 2017. By analyzing the data, scientists were able to document more than 7,000 seismic events near Pecos that were determined by the team to be earthquakes. Multiple events first started occurring in 2009, when 19 earthquakes of at least magnitude 1 were documented. The rate increased over time, with more than 1,600 earthquakes of magnitude 1 or greater in 2017. Most were so small that no one felt them. The study shows a correlation between earthquake activity in the area and an increase in oil and gas activity, but it does not make an effort to directly tie the two together as other studies have done. "West Texas now has the highest seismicity rates in the state," said co-author and SMU Associate Professor Heather DeShon. "What remained uncertain is when the earthquakes actually started. This study addresses that." This study is the latest in a comprehensive effort to determine what is causing an increase in seismic activity in Texas and how oil and gas operations can be managed to minimize the human-induced element. The state approved the TexNet system in 2015. It is operated in tandem with research efforts by the Center for Integrated Seismicity Research (CISR). "The obvious next step is exactly what the University of Texas is doing -- conducting these careful studies on the relationship between earthquakes and their human and natural causes to build an integrated understanding," Hennings said. The bureau and UTIG are units of the UT Jackson School of Geosciences. Scientists from Southern Methodist University, Portland State University, the University of Oklahoma and the French institute IFREMER also worked on the study. | Earthquakes | 2,019
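The study's headline numbers, 19 earthquakes of magnitude 1 or greater in 2009 versus more than 1,600 in 2017, are the kind of summary that falls out of a derived catalog with a simple count per year above a magnitude threshold. The sketch below shows that bookkeeping on a tiny made-up catalog; the event list is purely illustrative, not TXAR data.

    from collections import Counter

    def annual_counts(catalog, min_magnitude=1.0):
        """Count catalog events per year at or above a magnitude threshold.
        `catalog` is an iterable of (year, magnitude) pairs."""
        return Counter(year for year, mag in catalog if mag >= min_magnitude)

    # Hypothetical mini-catalog of (year, magnitude) pairs standing in for a derived catalog.
    toy_catalog = [(2008, 0.8), (2009, 1.2), (2009, 1.7), (2012, 2.1),
                   (2017, 1.1), (2017, 1.4), (2017, 3.0), (2017, 0.9)]
    print(sorted(annual_counts(toy_catalog).items()))
    # -> [(2009, 2), (2012, 1), (2017, 3)]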
October 30, 2019 | https://www.sciencedaily.com/releases/2019/10/191030151432.htm | What makes Earth's surface move? Could the surface drive mantle movement? | Do tectonic plates move because of motion in the Earth's mantle, or is the mantle driven by the movement of the plates? Or could it be that this question is ill-posed? This is the point of view adopted by scientists at the École Normale Supérieure -- PSL, the CNRS and the University of Rome 3, who regard the plates and the mantle as belonging to a single system. According to their newly published simulations, it is the surface that drags the interior over most of the planet, while the roles are reversed elsewhere. | Which forces drive tectonic plates? This has remained an open question ever since the advent of plate tectonic theory 50 years ago. Do the cold edges of plates slowly sinking into the Earth's mantle at subduction zones cause the motion observed at the Earth's surface? Or alternatively, does the mantle, with its convection currents, drive the plates? For geologists, this is rather like the problem of the chicken and the egg: the mantle apparently causes the plates to move, while they in turn drive the mantle... To shed light on the forces at work, scientists from the Geology Laboratory of the École Normale Supérieure (CNRS/ENS -- PSL), the Institute of Earth Sciences (CNRS/Universities Grenoble Alpes and Savoie Mont Blanc/IRD/Ifsttar) and the University of Rome 3 treated the solid Earth as a single indivisible system and carried out the most comprehensive modelling to date of the evolution of a fictional planet very similar to the Earth. The scientists first had to find the appropriate parameters, and then spend some nine months solving a set of equations with a supercomputer, reconstructing the evolution of the planet over a period of 1.5 billion years. Using this model, the team showed that two thirds of the Earth's surface moves faster than the underlying mantle, in other words it is the surface that drags the interior, while the roles are reversed for the remaining third. This balance of forces changes over geological time, especially for the continents. The latter are mainly dragged by deep motion within the mantle during the construction phases of a supercontinent, as in the ongoing collision between India and Asia: in such cases, the motion observed at the surface can provide information about the dynamics of the deep mantle. Conversely, when a supercontinent breaks up, the motion is mainly driven by that of the plates as they sink down into the mantle. The computation contains a wealth of data that remains largely unexploited. The data obtained could help us to understand how mid-ocean ridges form and disappear, how subduction is triggered, or what determines the location of the plumes that cause vast volcanic outpourings. | Earthquakes | 2,019
October 30, 2019 | https://www.sciencedaily.com/releases/2019/10/191030120334.htm | Southern California earthquakes increased stress on major fault line | A University of Iowa-led study has found that a series of Southern California earthquakes last summer increased stress on the Garlock Fault, a major earthquake fault line that has been dormant for at least a century. | The researchers used satellite imagery and seismic instruments to map the effects of the Ridgecrest earthquakes, a sequence that began with a magnitude 6.4 foreshock in the Mojave Desert on July 4 before a magnitude 7.1 earthquake that struck the next day. In all, there were more than 100,000 aftershocks stemming from the twin earthquakes. The analysis by Bill Barnhart, a geodesist at Iowa, and researchers at the U.S. Geological Survey showed the Ridgecrest earthquakes and aftershocks caused "aseismic creep" along a 12- to 16-mile section of the Garlock Fault, which runs east to west for 185 miles from the San Andreas Fault to Death Valley, and perpendicular to the Ridgecrest earthquake region. "The aseismic creep tells us the Garlock Fault is sensitive to stress changes, and that stresses increased across only a limited area of the fault," says Barnhart, assistant professor in the UI Department of Earth and Environmental Sciences and corresponding author on the newly published study. "So, if -- and that's a big if -- this area were to slip in a future earthquake, we are showing where that might happen," he adds. The Ridgecrest earthquakes and aftershocks led to ruptures on the surface and below ground right up to the Garlock Fault. Other than the one stressed section on the Garlock Fault identified by the research team, the remaining 165 or so miles of the fault actually shows decreased stress from the Ridgecrest seismic activity. "This is good news," Barnhart says. Aseismic creep is slip on a fault that does not produce the shaking or seismic waves associated with earthquakes. Creep on the Garlock Fault following the Ridgecrest earthquakes was shallow, occurring from the surface to around 300 feet below ground. The Garlock Fault has not produced large earthquakes since instrumental record keeping began -- at least a century -- but is considered a potential seismic risk to Southern California. "The Garlock Fault has been quiet for a long time," Barnhart says. "But there's geologic evidence that there have been large earthquakes on it. It's a major fault line." Barnhart's team says the stressed section could be capable of producing an earthquake between magnitude 6.7 and 7.0 if it ruptured. "It would be an earthquake of the magnitude of the Ridgecrest sequence," Barnhart says. "That means it would be big. You'd feel some swaying in Los Angeles, but it wouldn't be a magnitude 7.8 that could be more damaging." Aseismic creep triggered on faults by nearby earthquakes in Southern California is relatively common, and does not necessarily lead to earthquakes, Barnhart says. Another analysis of the Ridgecrest earthquakes was published this month. The USGS and the Southern California Earthquake Center funded the research. | Earthquakes | 2,019
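Stress increases transferred to a neighboring fault, like those reported for the Garlock Fault, are conventionally summarized with the Coulomb failure stress change, dCFS = d_tau + mu' * d_sigma_n (shear stress change plus effective friction times the normal stress change, with unclamping taken as positive). The helper below just evaluates that textbook expression for hypothetical values; it is not the Iowa/USGS team's geodetic analysis, and the input stress changes are invented for illustration.

    def coulomb_stress_change(d_shear_mpa: float, d_normal_mpa: float,
                              effective_friction: float = 0.4) -> float:
        """Coulomb failure stress change in MPa.

        d_shear_mpa:  shear stress change resolved in the fault's slip direction.
        d_normal_mpa: normal stress change (positive = unclamping).
        A positive result brings the receiver fault closer to failure."""
        return d_shear_mpa + effective_friction * d_normal_mpa

    # Hypothetical stress changes on two patches of a receiver fault.
    print(coulomb_stress_change(0.15, 0.05))    # loaded patch: +0.17 MPa, closer to failure
    print(coulomb_stress_change(-0.10, -0.02))  # unloaded patch: -0.108 MPa, further from failure

A pattern like this, one loaded patch against a longer unloaded stretch, mirrors the study's finding of a single stressed 12- to 16-mile section amid an otherwise de-stressed fault.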
October 29, 2019 | https://www.sciencedaily.com/releases/2019/10/191029140722.htm | 3-D models of cascadia megathrust events match coastal changes from 1700 earthquake | By combining models of magnitude 9 to 9.2 earthquakes on the Cascadia Subduction Zone with geological evidence of past coastal changes, researchers have a better idea of what kind of megathrust seismic activity was behind the 1700 Cascadia earthquake. | The analysis by Erin Wirth and Arthur Frankel of the U.S. Geological Survey indicates that a rupture extending to just offshore for most of the Pacific Northwest could cause the pattern of coastal subsidence seen in geologic evidence from the 1700 earthquake, with an estimated magnitude between 8.7 and 9.2.An earthquake rupture that also contains smaller patches of high stress drop, strong motion-generating "subevents" matches the along-fault variations in coastal subsidence seen from southern Oregon to British Columbia from the 1700 earthquake, the researchers conclude in their study published in the The seismic hazard associated with Cascadia megathrust earthquakes depends on how far landward the rupture extends, along with differences in slip along the fault. For this reason, the new study could help improve seismic hazard estimates for the region, including estimates of ground shaking intensity in Portland, Oregon, Seattle, Washington and Vancouver, British Columbia.For instance, the 2014 National Seismic Hazard Maps assigned different "weights" to earthquake scenarios that rupture to different extents of the down-dipping plate in the region's subduction zone, as a way to express their potential contribution to overall megathrust earthquake hazard. An earthquake where the rupture extends deep and partially inland is weighted at 30%, a shallow rupture that is entirely offshore is weighted at 20%, and a mid-depth rupture that extends approximately to the coastline is weighted at 50%."We looked at various magnitude 9 rupture scenarios for Cascadia, to see how the coastal land level changes under those scenarios," said Wirth, " and you can't match the paleoseismic estimates for how the land level changed along the Pacific Northwest coast during the 1700 Cascadia earthquake" with rupture scenarios at the shallowest and deepest points."This may mean that these scenarios deserve less weight in assessing the overall seismic hazard for Cascadia," Wirth noted.The researchers used data from other megathrust earthquakes around the world, such as the 2010 magnitude 8.8 Maule, Chile and the 2011 magnitude 9.0 in Tohoku, Japan earthquakes to inform their models. One of the features found in these and other megathrust events around the world are distinct patches of strong motion-generating "subevents" that take place in the deeper portions of the megathrust fault.Wirth and Frankel show that variations in coastal subsidence caused by the 1700 earthquake may be due to the locations of these subevents. But improving the accuracy of paleoseismic estimates for how the land level changed during previous Cascadia earthquakes is critical to ascertain this, said Wirth.It's unclear what causes these subevents, other than that these areas of the fault must generate high stress that can be released in the form of strong ground shaking. 
This might indicate that the subevents have a physical cause like the structure or composition of the rocks along the fault that makes them mechanically strong, or changes in friction or fluid pore pressure related to their depth.In the Tohoku and Maule earthquakes, Wirth noted, "the frequency of ground shaking that is most damaging to buildings and infrastructure seemed to be radiated from these discrete patches on the fault."More research to understand what and where these subevents are, and whether they change over time, could improve seismic hazard estimates in Cascadia, she said. "If we could constrain the location of these subevents ahead of time, then you could anticipate where your strongest ground shaking might be."In 2002, the USGS estimated that there was a 10% to 14% chance of another magnitude 9.0 Cascadia earthquake occurring in the next 50 years. | Earthquakes | 2,019 |
October 24, 2019 | https://www.sciencedaily.com/releases/2019/10/191024131330.htm | GIS-based analysis of fault zone geometry and hazard in an urban environment | Typical geologic investigations of active earthquake fault zones require that the fault can be observed at or near the Earth's surface. However, in urban areas, where faults present a direct hazard to dense populations, the surface expression of a fault is often hidden by development of buildings and infrastructure. This is the case in San Diego, California, where the Rose Canyon fault zone trends through the highly developed downtown. | Due to regulations on development in areas of active faulting, hundreds of individual, city block-sized fault investigations have been conducted by geotechnical consulting firms in downtown San Diego since the late 1970s. The reports produced from these investigations include information on geology and faulting beneath the urban landscape that is valuable to government agencies, the geotechnical community, and earthquake scientists.Luke Weidman, Jillian M. Maloney, and Thomas K. Rockwell compiled data from 268 of these individual reports to create the first centralized geodatabase for study of the Rose Canyon fault zone through downtown San Diego. The geodatabase includes 2020 georeferenced datapoints with links to the original data logs. The team then used the interactive geodatabase to examine the geometry of the Rose Canyon fault zone beneath the city.Fault mapping revealed a complex geometry, likely related to a step in the fault zone towards the west and offshore. More work is needed, however, to assess changes in fault activity through time and how those changes may relate to fault zone evolution. The team also identified several places where fault segments mapped in geotechnical reports do not match with other publicly available fault databases.These contradictions should be resolved for more accurate hazard assessment for the region. Overall, the geodatabase proved to be an effective way to map complex fault zone geometry that is otherwise obscured by development at Earth's surface.The data held within the geodatabase could also be used for future research on patterns of earthquake occurrence and for models of ground shaking caused by potential future earthquakes along the fault zone. The geodatabase was made publicly available to facilitate these types of projects. A similar approach may be useful in other major cities world-wide where fault zones are located beneath developed regions, such as Los Angeles and San Francisco (USA), Izmit (Turkey), Wellington (New Zealand), and Kumamoto (Japan). | Earthquakes | 2,019 |
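A minimal sketch of how such a geodatabase query might look, assuming hypothetical file names, column names, fault-trace coordinates, and a 500 m search distance; this is illustrative only and is not the workflow used by Weidman, Maloney, and Rockwell:

```python
# Illustrative sketch: build a point geodatabase of fault observations from
# geotechnical reports and select the points lying near a mapped fault trace.
# File names, column names, trace coordinates and the 500 m distance are assumptions.
import geopandas as gpd
import pandas as pd
from shapely.geometry import LineString

# Each row: report ID, longitude, latitude, whether faulting was observed,
# and a link to the original data log (hypothetical CSV layout).
records = pd.read_csv("geotech_datapoints.csv")
points = gpd.GeoDataFrame(
    records,
    geometry=gpd.points_from_xy(records["lon"], records["lat"]),
    crs="EPSG:4326",
)

# Simplified fault trace through downtown (placeholder coordinates).
trace = gpd.GeoSeries([LineString([(-117.165, 32.70), (-117.15, 32.73)])],
                      crs="EPSG:4326")

# Work in a projected CRS (UTM zone 11N) so distances are in metres.
points_utm = points.to_crs(epsg=26911)
trace_utm = trace.to_crs(epsg=26911)

# Keep datapoints within 500 m of the trace that report observed faulting.
near_trace = points_utm[points_utm.distance(trace_utm.iloc[0]) < 500]
faulted = near_trace[near_trace["fault_observed"] == True]
print(len(faulted), "fault observations within 500 m of the mapped trace")
```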
October 23, 2019 | https://www.sciencedaily.com/releases/2019/10/191023150335.htm | Earthquakes in slow motion | A new study from Caltech finds that so-called "slow slip" or "silent" earthquakes behave more like regular earthquakes than previously thought. The discovery opens the door for geoscientists to use these frequent and nondestructive events as an easy-to-study analog that will help them find out what makes earthquakes tick. | Slow-slip events were first noted about two decades ago by geoscientists tracking otherwise imperceptible shifts in the earth using GPS technology. They occur when faults grind incredibly slowly against each other, like an earthquake in slow motion. For example, a slow-slip event that occurs over the course of weeks might release the same amount of energy as a minute-long magnitude-7.0 earthquake. Because they occur deep in the earth and release energy so slowly, there is very little deformation at the surface, although the slow events might affect an area of thousands of square kilometers. As such, they were only noted when GPS technology was refined to the point that it could track those very minute shifts. Slow-slip events also do not occur along every fault; so far, they have been spotted in just a handful of locations including the Pacific Northwest, Japan, Mexico, and New Zealand.As they have only just begun to be detected and cataloged, a lot remains unknown about them, says Jean-Philippe Avouac, Caltech's Earle C. Anthony Professor of Geology and Mechanical and Civil Engineering. "There's a lot of uncertainty. You can't study them using traditional seismological techniques because the signal they create is too faint and gets lost in the noise from human activities as well as from natural geological processes like ocean waves, rivers, and winds." Before Avouac's group began this study, there were not enough documented slow-slip events to determine their scaling properties reliably, he says.Avouac's group designed and applied an innovative signal processing technique to detect and image the slow-slip events along Washington state's Cascadia Subduction Zone, where the North American tectonic plate is sliding southwest over the Pacific Ocean plate, using a network of 352 GPS stations. The researchers analyzed data spanning the years 2007 to 2018 and were able to build a catalog of more than 40 slow-slip events of varied sizes. Their findings appear in Compiling data from these events, the researchers were able to characterize the features of slow-slip events more precisely than previously possible. One key finding from the study is that slow-slip events obey the same scaling laws as regular earthquakes.In this context, the scaling law describes the "moment" of a slip event on a fault -- which quantifies the elastic energy released by slip on a fault -- as a function of the duration of slip. In practical terms, that means that a big slip across a broad area yields a long-lasting earthquake. It has long been known that the moment of an earthquake is proportional to the cube of the amount of time the earthquake lasts. 
In 2007, a team from the University of Tokyo and Stanford suggested that slow-slip events appear to be different, with the moment seemingly directly proportional to time.Armed with their new fleshed-out catalog, Avouac's team argues that the magnitudes of slow-slip events also are proportional to the cube of their duration, just like regular earthquakes.Since these events behave similarly to regular earthquakes, studying them could shed light on their more destructive cousins, Avouac says, particularly because slow-slip events occur more frequently. While a traditional magnitude-7.0 earthquake might only occur along a fault every couple of hundred years, a slow-slip event of that magnitude can reoccur along the same fault every year or two."If we study a fault for a dozen years, we might see 10 of these events," Avouac says. "That lets us test models of the seismic cycle, learning how different segments of a fault interact with one another. It gives us a clearer picture of how energy builds up and is released with time along a major fault." Such information could offer more insight into earthquake mechanics and the physics governing their timing and magnitude, he says. | Earthquakes | 2,019 |
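The distinction the study draws can be illustrated by fitting the scaling exponent n in M0 ∝ T^n to a catalog of event durations and moments. The sketch below uses made-up numbers purely to show the calculation; it is not the catalog or method of the Caltech study:

```python
# Illustrative sketch: estimate the moment-duration scaling exponent n in
# M0 ~ T**n by least squares in log-log space. The "catalog" here is
# synthetic and exists only to demonstrate the arithmetic.
import numpy as np

rng = np.random.default_rng(0)
duration_days = np.array([5, 10, 20, 40, 80, 160], dtype=float)

# Fake moments that follow a cubic law with some scatter (units arbitrary).
true_exponent = 3.0
moment = 1e15 * duration_days**true_exponent * rng.lognormal(0.0, 0.3, duration_days.size)

# Fit log10(M0) = n * log10(T) + c; the slope n is the scaling exponent.
n, c = np.polyfit(np.log10(duration_days), np.log10(moment), deg=1)
print(f"estimated exponent n = {n:.2f}")   # ~3 => earthquake-like scaling
# n close to 1 would instead support the older moment-proportional-to-time view.
```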
October 23, 2019 | https://www.sciencedaily.com/releases/2019/10/191023150333.htm | Ground failure study shows deep landslides not reactivated by 2018 Anchorage Quake | Major landslides triggered by the 1964 magnitude 9.2 Great Alaska earthquake responded to, but were not reactivated by, the magnitude 7.1 Anchorage earthquake that took place 30 November 2018, researchers concluded in a new study published in | The shaking that accompanied the 2018 earthquake was of a higher frequency and a shorter duration than shaking during the 1964 quake, both of which probably kept the 1964 landslides from moving downslope again, said Randall Jibson of the U.S. Geological Survey and his colleagues.In places like Government Hill and Turnagain Heights, where devastating landslides had taken place in 1964, there were "cracks in the places where they had moved in 1964, but they just kind of oscillated in place," Jibson said. "I think [the shaking] was well below what it would take for them to really take off and move again."After a major survey of ground failures caused by the 2018 earthquake, Jibson and his colleagues also noted many fewer landslides -- several thousands fewer -- than would be predicted for the area from landslide modeling based on earthquake magnitude.Other differences between the 1964 and 2018 earthquakes might help to explain these findings, said Jibson. The 1964 Alaska quake was a megathrust event, where the rupture took place along the subducting boundary between two tectonic plates. The 2018 Anchorage intraslab quake, on the other hand, originated within a tectonic plate and at a greater depth than the 1964 quake.Megathrust earthquakes like the 1964 event tend to produce longer durations and periods of shaking, said Jibson, which could trigger more and larger landslides. "We feel pretty strongly that one of the reasons the Anchorage earthquake didn't trigger so many landslides is because it was an intraslab event," he said.Data from the Anchorage quake are helpful for scientists like Jibson who study how earthquakes differ in producing landslides across the globe. "For the past 30 years we've made comparisons based on [earthquake] magnitude alone," he said. "But now we have seen enough earthquakes to know that it's not just magnitude that affects landslides, it's also focal mechanisms and tectonic settings and the frequency of earthquake waves."Jibson and his colleagues surveyed the Anchorage region on foot and by helicopter in the days after the 2018 earthquake to catalog a variety of ground failures caused by the earthquake, from landslides to liquefaction to ground cracking. Each of these types of ground failures caused significant damage in the area, the researchers concluded.Landsliding in the River Heights neighborhood of Eagle River, for instance, moved houses off their foundations and severed utility lines, while earth slumps blocked an offramp to International Airport Road in Anchorage. Some Anchorage homeowners and businesses reported ejected sand in crawl spaces and cracked foundations due to liquefaction settlement.One of the most damaging types of ground failure was extensional cracks, particularly at the boundary between natural and artificial slopes and areas graded flat for buildings. Two phenomena are at work in these settings that can lead to damaging cracking, said Jibson. 
Seismic shaking "tends to get amplified near sharp breaks in slope," while differences in seismic properties between intact bedrock and loose fill cause "seismic waves to be trapped in the softer, loose material and they reflect back and forth," he said.Jibson and colleagues shared some of the ground failure analyses with a diverse group of researchers and local policymakers at a September conference in Anchorage, "One Year Later: Symposium on the 2018 M7.1 Anchorage Earthquake." The symposium was organized by the Earthquake Engineering Research Institute and the Alaska Earthquake Center with support from the National Earthquake Hazards Reduction Program through the National Science Foundation and the U.S. Geological Survey. | Earthquakes | 2,019 |
October 17, 2019 | https://www.sciencedaily.com/releases/2019/10/191017141103.htm | Analysis of recent Ridgecrest, California earthquake sequence reveals complex, damaging fault systems | The largest earthquake sequence in Southern California in two decades has taught scientists that large earthquakes can occur in a more complex fashion than commonly assumed. The sequence also loaded up strain on a nearby major fault, according to a new study. | The study, a comprehensive analysis of the Ridgecrest Earthquake Sequence by geophysicists from Caltech and JPL, will be published in an upcoming journal issue. "This was a real test of our modern seismic monitoring system," says Zachary Ross, assistant professor of geophysics at Caltech and lead author of the study. The team drew on data gathered by orbiting radar satellites and ground-based seismometers to piece together a picture of an earthquake rupture that is far more complex than found in models of many previous large seismic events. Major earthquakes are commonly thought to be caused by the rupture of a single long fault, such as the more than 800-mile-long San Andreas fault, with a maximum possible magnitude that is dictated primarily by the length of the fault. After the magnitude-7.3 earthquake that struck Landers, California, in 1992 -- which involved the rupture of several different faults -- seismologists began rethinking that model. As described in the study, the complexity of the rupture is only clear because of the multiple types of scientific instruments that studied the event, Ross says. Satellites observed the ruptures that reached the surface and the associated ground deformation extending out over 100 kilometers in every direction from the rupture, while a dense network of seismometers observed the seismic waves that radiated out from the earthquake. Together, these data allowed scientists to develop a model of subsurface fault slipping and the relationship between the major slipping faults and the significant number of small earthquakes occurring before, between, and after the two largest shocks. "We actually see that the magnitude-6.4 quake simultaneously broke faults at right angles to each other, which is surprising because standard models of rock friction view this as unlikely," Ross says. "It is remarkable that we now can resolve this level of detail." Also noteworthy is that the rupture ended just a few kilometers shy of the nearby Garlock Fault, which stretches more than 300 kilometers across Southern California on the northern boundary of the Mojave Desert. The fault has been relatively quiet for the past 500 years, but the strain placed on the Garlock Fault by July's earthquake activity triggered it to start creeping. Indeed, the fault has slipped two centimeters at the surface since July, the scientists say. The event, Ross says, illustrates just how little we still understand about earthquakes. "It's going to force people to think hard about how we quantify seismic hazard and whether our approach to defining faults needs to change," he says. "We can't just assume that the largest faults dominate the seismic hazard if many smaller faults can link up to create these major quakes. Over the last century, the largest earthquakes in California have probably looked more like Ridgecrest than the 1906 San Francisco earthquake, which was along a single fault. 
It becomes an almost intractable problem to construct every possible scenario of these faults failing together -- especially when you consider that the faults that ruptured during the Ridgecrest Sequence were unmapped in the first place." | Earthquakes | 2,019 |
October 17, 2019 | https://www.sciencedaily.com/releases/2019/10/191017111720.htm | What happens under the Yellowstone Volcano? | The Yellowstone National Park in the USA with its geysers and hot springs is a great attraction for tourists. However, especially in times of little news, the media often focusses on the Yellowstone Supervolcano, which last erupted about 630,000 years ago. Inevitably, then the question of the underlying geological structures will be posed. A recent study by Bernhard Steinberger of the German GeoForschungsZentrum and colleagues in the USA helps to better understand the processes in the Earth's interior. The paper will soon appear in the journal " | According to the model, beneath the Yellowstone volcano lies a so-called mantle plume: a chimney-like structure that reaches thousands of kilometres deep to the border of the Earth's core and mantle. The origin of the plume lies under the "Baja California," more than a thousand kilometers southwest of the national park. Evaluations of earthquake waves had already suggested something like this, but the idea of such a "mantle plume" did not fit in with the movement of the Earth's lithospheric plates.It is clear that Yellowstone is a so-called intraplate volcano. Most volcanoes in the world are located at the borders of continental plates, either where material from the Earth's interior rises, as in Iceland, or where one continental plate submerges under the other and melts, as is the case along the American westcoast. In contrast to plate boundary volcanism, intraplate volcanism goes back to "hotspots" under the Earth's crust. This can be imagined as a welding torch that melts the lithosphere from below -- where a hole is virtually burned through, a volcano grows. This is how Hawaii, for example, came into being.The seismic data for Yellowstone, however, did not provide a clear picture for a long time. This has changed due to new data and refined measurement methods, which have allowed the deeper part of the plume to be imaged in a tomographic image. However, gaps remain in the upper mantle. The data were not so clear here. The study from the GFZ now fills these gaps with a modelling result that maps the mantle plume consistently with the observation data. Accordingly, there are slow movements of the rock in the lower mantle of the Earth, which are directed southwest relative to the surface. Like the plume of smoke of a steam ship, the mantle plume moves from Baja California to the north-northeast to the Yellowstone volcano. Bernhard Steinberger: "Our study contributes to a better understanding of intraplate volcanism and supports the hypothesis of a deep mantle plume. However, this has no impact on the risk assessment of the Yellowstone volcano." | Earthquakes | 2,019 |
October 8, 2019 | https://www.sciencedaily.com/releases/2019/10/191008104652.htm | Rice irrigation worsened landslides in deadliest earthquake of 2018 | Irrigation significantly exacerbated the earthquake-triggered landslides in Palu, on the Indonesian island of Sulawesi, in 2018, according to an international study led by Nanyang Technological University, Singapore (NTU Singapore) scientists. | The 7.5 magnitude earthquake struck the Indonesian city on 28 September 2018, taking the lives of over 4,300 people, making it the deadliest earthquake in the world that year.Writing in Nature Geoscience, researchers from NTU Singapore's Earth Observatory of Singapore (EOS) and the Asian School of the Environment (ASE), together with collaborators from institutions in Indonesia, the United States, the United Kingdom, China and Australia, reveal that the landslides in Indonesia's Palu Valley resulted from widespread liquefaction in areas that were heavily irrigated for rice cultivation.A century-old aqueduct, constructed to bring enough water into the Palu Valley to irrigate rice, artificially raised the water table to almost ground level. This elevation increased the potential for liquefaction -- a situation where buried sediment becomes fluid-like due to strong seismic ground-shaking.The combination of this fluid-like sediment and the slope of the valley floor exacerbated the catastrophe, creating wide lateral spreading of water, landslides, and debris, which swept through the villages.This deadly cocktail marked Indonesia´s deadliest earthquake since Yogyakarta in 2006."This event is a wake-up call for any area where active faults and irrigation coincide," said Dr Kyle Bradley, a principal investigator at NTU's EOS who led the research."We need to improve the awareness and understanding of liquefaction-related landslides and pay closer attention to places where irrigation has artificially raised the water table, said Dr Bradley, who is also a lecturer at NTU's ASE.The research highlights the urgency for Southeast Asian nation-states to review locations with intensive rice farming activities which lie among active faults.Dr Bradley said, "This is of particular concern in Southeast Asia as the pace of development is often faster than the return time of large earthquakes -- the average time period between one earthquake and the next. Most other similarly irrigated areas have not yet been tested by extreme ground shaking, and some of those areas could also pose a major hazard."By analysing satellite images taken before and after the earthquake to identify areas affected by landslides, NTU researchers discovered that irrigated paddies and fields were strongly affected, while areas planted with trees were more stable.This suggested that heavy irrigation and a raised water table were responsible for creating a new liquefaction hazard."Hazards that are created by humans can often be more readily moderated than other natural hazards. 
Based on the relative resiliency of areas planted with mixed tree crops and irrigated fields, we propose that more intermixed planting could decrease the hazard of large landslides in the future," said Dr Bradley.The satellite image mapping was complemented by field observations of the landslides and of the local irrigation system and practices, produced by an international team of scientists led by Dr Ella Meilianda of the Tsunami and Disaster Mitigation Research Center at Syiah Kuala University in Banda Aceh.Professor Thomas Dunne of the Bren School of Environmental Science and Management at the University of California, Santa Barbara, who was not affiliated with the study, said "The study has demonstrated how Earth scientists with strong field-based understanding of land surface mechanics can use the rapidly growing toolbox of remote sensing to analyse dangerous processes. The landscape-scale survey approach could be applied elsewhere for systematic assessment and avoidance of dangers that are often overlooked when large infrastructure is first proposed in rapidly developing, but potentially unstable terrains."The research team plans to continue their study by assessing the effects of local land use on outcomes during the Palu earthquake. | Earthquakes | 2,019 |
October 2, 2019 | https://www.sciencedaily.com/releases/2019/10/191002112618.htm | North American seismic networks can contribute to nuclear security | The International Monitoring System is the top global seismic network for monitoring nuclear weapon tests around the world. To expand the system's detection capabilities, however, international monitors should seek out the data, methods and expertise of smaller regional seismic networks. | In a paper published as part of an upcoming focus section on regional seismic networks in The need to detect low-yield explosions -- those with a yield less than 0.5 kilotons and causing seismic events between magnitude 1.0 and 3.0 -- has become more pressing, given conflicting reports on Russian low-yield tests at its Arctic island base of Novaya Zemlya and in North Korea, where researchers have debated whether a 2010 event was an explosion or a natural earthquake."Whether with an experienced tester like Russia or an inexperienced tester like North Korea, the U.S. government has a very strong interest in these small events," said Koper.These small seismic events would likely only be well-recorded at local distances of 150 to 200 kilometers, and are difficult to distinguish from earthquakes and other kinds of industrial explosions. Regional seismic networks, which operate over local distances and routinely create catalogs of small earthquakes that must be uncontaminated by non-earthquake seismic noise, are well-equipped to handle these challenges."That's sort of our bread and butter -- the detection, the location and estimation of the size of these small events," Koper noted. "It's a way that regional networks in North America can contribute to this important global security issue."In the SRL paper, Koper offers several examples of how regional seismic networks have provided data that can be useful to international nuclear monitoring. Most regional seismic networks in North America center around a particular "seismo-tectonic feature that warrants extra monitoring versus what you might get from a global or national-scale network," said Koper, such as the New Madrid Fault Zone in the central United States or the Cascadia subduction zone in the Pacific Northwest.Data from several North American regional networks are helping seismologists refine some of the usual techniques that they use to distinguish explosions from natural earthquakes, such as comparing surface and body seismic waves and calculating the ratio between P-waves and S-waves. (P-waves compress rock in the same direction as the wave's movement, while S-waves move rock perpendicular to the direction of the wave.) While these techniques can identify earthquakes at a regional and global distance, data collected by regional networks suggests that these techniques may not perform in a similar way over local distances.Specific regions may provide other important information for monitoring. For instance, the geology in the northeastern United States is more similar to Asian test sites than the geology of western North America. "If you're really interested in detecting things in, let's say, North Korea, the Northeastern U.S is a better geological analog," Koper said.Beyond the IMS, low-yield nuclear test discrimination is also a topic of interest for agencies like the U.S. Department of Defense and national laboratories. In Utah, for instance, the Air Force Research Laboratory has provided funds for the regional network operators to create a catalog of times, locations and sizes of Utah mining blasts. 
"For the people who are testing new methods of discrimination, maybe based on machine learning or something like that, we can provide this nice, ground-truth catalog," Koper said. | Earthquakes | 2,019 |
October 1, 2019 | https://www.sciencedaily.com/releases/2019/10/191001132703.htm | Early warning signals heralded fatal collapse of Krakatau volcano | On 22 December 2018, a flank of the Anak Krakatau volcano plunged into the Sunda strait between the Indonesian islands of Sumatra and Java, triggering a tsunami that killed 430 people. An international research team led by Thomas Walter of the German Research Centre for Geosciences GFZ in Potsdam has now shown that the volcano produced clear warning signals before its collapse. This was the result of the analysis of a large amount of data from very different sources collected during ground-based measurements as well as by drones and satellites. Satellite data, for example, showed increased temperatures and ground movement on the southwestern flank months before the catastrophe. Seismic data and low-frequency sound waves from a smaller earthquake two minutes before the sudden collapse of a large part of the volcano had heralded the fatal event. This collapse finally triggered the deadly tsunami. The researchers want to use the analysis of this complex event cascade to improve monitoring and early detection of other volcanoes. Their study was published in the journal | Volcanic islands like Anak Krakatau often consist of unstable material. Therefore, every now and then a collapse of volcanic flanks occurs on these islands. Yet, this had not been precisely measured until now. "At Krakatau, we were able to observe for the first time how the erosion of such a volcanic flank took place and which signals announced it," Thomas Walter, a volcanologist at the GFZ explains. In their study at Anak Krakatau the researchers were able to show that over months the movement of the southeast flank towards the sea formed a kind of slide. The sudden accelerated slide of the flank into the sea, the so-called flank collapse, lasted only two minutes and was measured by seismographs and infrasound networks before the first impacts of the tsunami had reached the coasts."We used an exceptionally broad range of methods: From satellite observation to ground-based seismic data, from infrasound to drone data, from temperature measurements to chemical analysis of eruption products," says Thomas Walter. "Today's almost unrestricted access to worldwide data was critical in this. In the days following the tsunami, it allowed us to analyse this event at different locations in different countries at the same time."Similar to Anak Krakatau such events could also herald themselves on other volcanic islands in the Atlantic, Pacific or even in the Mediterranean, to which the results of the study could then presumably be transferred, according to Walter. "We assume that tsunami early warning systems must also take into account events caused by landslides. Those volcanoes that are at risk of slipping should be integrated into the monitoring systems."Seismologist Frederik Tilmann from GFZ and Freie Universität Berlin was also involved in the study. He says that the unusual seismic pattern of the flank collapse was a particular challenge when analysing the data. In contrast to tectonic earthquakes, only a small part of this pattern consisted of high frequencies around 1 Hertz (1 oscillation per second). Instead, the earthquake waves contained stronger components in the range of low frequencies up to about 0.03 Hertz (1 oscillation per 35 seconds). 
"This property was the reason why the event was not detected in any routine evaluation," says Tilmann.The effort of monitoring systems will pay back, since a large part of the victims of volcanoes in the past two centuries have not been killed by the eruptions themselves, but by landslides and tsunamis, according to Walter. The new results show that the danger of collapsing volcanoes has so far been underestimated. The first step now is to identify the volcanoes at particular risk and to supplement existing measurement methods with additional sensors and new algorithms for evaluation. "We are confident that our findings will lead to the development of improved monitoring systems," said Walter. | Earthquakes | 2,019 |
October 1, 2019 | https://www.sciencedaily.com/releases/2019/10/191001083957.htm | Did long-ago tsunamis lead to mysterious, tropical fungal outbreak in Pacific Northwest? | The Great Alaskan Earthquake of 1964 and the tsunamis it spawned may have washed a tropical fungus ashore, leading to a subsequent outbreak of often-fatal infections among people in coastal regions of the Pacific Northwest, according to a paper co-authored by researchers at the Johns Hopkins Bloomberg School of Public Health and the nonprofit Translational Genomics Research Institute (TGen), an affiliate of City of Hope. | In the paper, to be published Oct. 1 in the journal mBio, the co-authors confront the mystery of the outbreak. The co-authors, microbiologist Arturo Casadevall, MD, PhD, Alfred and Jill Sommer Professor and Chair of Molecular Microbiology and Immunology at the Bloomberg School, and epidemiologist David Engelthaler, PhD, associate professor at the Translational Genomics Research Institute, posit that increased shipping after the 1914 opening of the Panama Canal brought the fungus to the region's coastal waters. "The big new idea here is that tsunamis may be a significant mechanism by which pathogens spread from oceans and estuarial rivers onto land and then eventually to wildlife and humans," says Casadevall. "If this hypothesis is correct, then we may eventually see similar outbreaks of the fungus elsewhere." According to the Centers for Disease Control and Prevention (CDC), more than 300 infections have been reported in the region. Epidemiologists have found evidence of the fungus in the local environment. About ten years ago, the CDC asked Engelthaler to investigate. He and his colleagues applied a "molecular clock" analysis to the DNA sequences of isolated samples of the fungus. That left the question of how it arrived in the Pacific Northwest. After further study, Engelthaler deduced that the Great Alaska Earthquake of March 1964 might have been the key factor. The quake had its epicenter in southeastern Alaska but spawned tsunamis throughout the North Pacific. These tsunamis inundated coastal areas of British Columbia, Washington, Oregon, and California. And, the affected regions correspond broadly to the locations where the outbreak later emerged. There were multiple pieces to this puzzle. It appeared that a singular event, like a natural disaster, could have been the missing piece that brought the whole picture together, notes Engelthaler. The tsunami idea seemed to fit the "when, where, and why" of this disease emergence. Then, more than 30 years passed before the fungus began infecting humans in the region, which was another unresolved issue. Relevant to this aspect of the puzzle is Casadevall's previous research, which suggested, for example, that another human-infecting fungus may have reached land in a similar way. The researchers now hope to continue testing their hypothesis with further detailed analyses. | Earthquakes | 2,019
September 24, 2019 | https://www.sciencedaily.com/releases/2019/09/190924125027.htm | 'Treasure trove' of earthquake clues could be unearthed by wavy new technique | Their technique combines traditional 'acoustic mapping' with a newer method called 'full waveform inversion'. They found their new method enhanced their view of rocks along a fault line -- a break in the Earth's crust -- off the east coast of New Zealand's North Island. | The researchers hope that their clearer view of the rocks around these fault lines -- whose movements can trigger earthquakes and subsequent tsunamis -- will help them better understand why such events happen.Lead author Melissa Gray, from Imperial College London's Department of Earth Science and Engineering, said: "We can now scan underwater rocks to see their properties in greater detail. Hopefully this will help us to better work out how earthquakes and tsunamis happen."Just off the North Island coast of New Zealand, the edge of the Pacific tectonic plate ducks underneath the edge of the Australian plate -- an area known as the Hikurangi subduction zone.Subduction refers to when two plates move against each other, building pressure that eventually triggers one plate suddenly 'slipping' beneath the other. This sudden slipping can cause earthquakes, which in turn trigger tsunamis if they happen underwater.However, subduction can also cause silent quakes known as 'slow slip' events, which release the same amount of energy as a typical earthquake, but over a much longer amount of time.Slow slip events often go unnoticed and cause no damage, but the authors of this new report say studying them could constitute a "treasure trove" of information. Melissa said: "Our new way of studying slow slip events could unveil a treasure trove of clues about how larger, more devastating quakes happen."Ultrasound images of the subduction zone, before (L) and after (middle & R) 2D waveform inversion was used. The 'after' photos show the zone in much finer, higher resolution detail.Current rock mapping techniques use sound waves to build pictures of what rocks look like many kilometres below ground, as well as revealing how porous and hard they are and how much fluid and gas they are likely to contain. This information helps scientists assess how rocks might behave when stress builds up, and how much shaking there would be in an earthquake.Now Melissa, together with Imperial's Dr Rebecca Bell and Professor Joanna Morgan, have plugged current sound wave information into an imaging technique called full waveform inversion.This method helped them paint a picture of the Hikurangi fault zone in unprecedented detail. They also captured the shallow faults which were responsible for the large Gisborne tsunami in 1947 -- an example of a large tsunami caused by a relatively small slow slip earthquake.The method builds on the concept of 'acoustic mapping', where sound waves are sent from a boat on the ocean surface down to the seabed and kilometres into the Earth's crust. The amount of time taken for the waves to bounce off different rock layers and back up to the boat -- as recorded by underwater microphones being towed behind the boat -- tells scientists the distance to the seabed and rock layers, as well as the likely composition of the rocks.The researchers combined data from acoustic mapping with the full waveform inversion technique. 
This converted the sound waves into higher resolution, more intricately detailed maps of the seabed and rock beneath.To check their data were accurate, the authors compared their models of the rock properties mapped by inversion with samples collected from drilling by the International Ocean Discovery Program. They found that the models and real data matched, indicating the technique is accurate and reliable, and can provide more information than current drilling methods.Study co-author Dr Bell said: "We can use this to study earthquake and tsunami-prone areas around New Zealand and the rest of the world."Next, they will work to map the very point at which two edges of tectonic plates touch down to depths of 10-15 kilometres.Dr Bell added: "Although nobody's seen fault lines like this at such scale before, we still don't know the properties of the Hikurangi plate boundary at the depth where slow slips occur."Ultimately, we want to understand why some slips cause devastating earthquakes, while others do not." | Earthquakes | 2,019 |
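The basic acoustic-mapping step described above — converting two-way travel times into depths — can be sketched with a simple layer-cake calculation; the velocities and times below are invented for illustration, and full waveform inversion itself (an iterative optimization over the whole recorded wavefield) is well beyond this snippet:

```python
# Illustrative sketch: convert two-way travel times (TWT) picked at layer
# boundaries into depths, given an assumed interval velocity for each layer.
# Velocities and travel times here are invented placeholder values.
import numpy as np

# Interval velocities (m/s): water column, soft sediments, compacted sediments.
interval_velocity = np.array([1500.0, 1800.0, 2500.0])

# Two-way travel time picked at the base of each layer (seconds).
twt_base = np.array([2.0, 2.6, 3.4])

# Time spent inside each layer (two-way), then thickness = v * t / 2.
twt_in_layer = np.diff(twt_base, prepend=0.0)
thickness = interval_velocity * twt_in_layer / 2.0
depth_to_base = np.cumsum(thickness)

for i, z in enumerate(depth_to_base, start=1):
    print(f"base of layer {i}: {z:,.0f} m below sea surface")
# e.g. the first reflector at 2.0 s TWT in 1500 m/s water lies ~1500 m deep.
```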
September 23, 2019 | https://www.sciencedaily.com/releases/2019/09/190923090552.htm | Faults' hot streaks and slumps could change earthquake hazard assessments | For more than a century, a guiding principle in seismology has been that earthquakes recur at semi-regular intervals according to a "seismic cycle." In this model, strain that gradually accumulates along a locked fault is completely released in a large earthquake. Recently, however, seismologists have realized that earthquakes often occur in clusters separated by gaps, and one research group now argues that the probability of a tremor's recurrence depends upon whether a cluster is ongoing -- or over. | On Monday, 23 Sept. 2019, at the GSA Annual Meeting in Phoenix, Seth Stein, the Deering Professor of Geological Sciences at Northwestern University, will present a new model that he and his co-authors believe better explains the complexity of the "supercycles" that have been observed in long-term earthquake records. "One way to think about this is that faults have hot streaks -- earthquake clusters -- as well as slumps -- earthquake gaps -- just like sports teams," says Stein.In the traditional concept of the seismic cycle, Stein explains, the likelihood of a large earthquake depends solely upon the amount of time that has elapsed since the most recent large tremor reset the system. In this simple case, he says, the fault has only a "short-term memory.""The only thing that matters," says Stein, "is when the last big earthquake was. The clock is reset every time there's a big event."But this model is not realistic, he argues. "We would never predict the performance of a sports team based on how they performed during their previous game," says Stein. "The rest of the season is likely to be much more useful."Geologists sometimes see long-term patterns in paleoseismic records that the seismic-cycle model can't explain. In these cases, says Stein, "Not all the accumulated strain has been released after one big earthquake, so these systems have what we call "long-term memories.'"To get a sense of how a system with Long-Term Fault Memory would function, the researchers sampled windows of 1,300-years -- a period of time for which geologists might reasonably have a record available -- from simulated 50,000-year paleoseismic records. The results indicate that earthquake recurrence intervals looked very different depending upon which 1,300-year window the scientists examined.Because there are random elements involved, says Stein, there are windows when the recurrence intervals appear to be periodic, and other times when they look clustered. "But the fault hasn't changed its properties," he says. Eventually, the model predicts that the earthquakes will release much of the accumulated strain, at which point the system will reset and the fault's "streak" will end.According to this Long-Term Fault Memory model, the probability of an earthquake's occurrence is controlled by the strain stored on the fault. This depends on two parameters: the rate at which strain accumulates along the fault, and how much strain is released after each big earthquake. "The usual earthquake-cycle model assumes that only the last quake matters," says Stein, "whereas in the new model, earlier quakes have an effect, and this history influences the probability of an earthquake in the future." After a big quake, he says, there can still be lots of strain left, so the fault will be on a hot streak. 
Eventually, however, most of the strain is released and the fault goes into a slump.Ultimately, says Stein, the earthquake hazard depends upon whether or not a fault is in a slump or a streak. "Depending on which of those assumptions you make," he says, "you can get the earthquake probability much higher or much lower."Seismologists have not yet come up with a compelling way to determine whether a fault is -- or is not -- in a cluster. As a result, says Stein, "There's a much larger uncertainty in estimates of the probability of an earthquake than people have been wanting to admit." | Earthquakes | 2,019 |
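A toy simulation in the spirit of the model described above — strain accumulates steadily, each event releases only part of it, and event probability grows with stored strain — might look like the following. The functional form, rates, and release fractions are invented for illustration and are not Stein and colleagues' actual parameterization:

```python
# Toy sketch of a "long-term fault memory" simulation: strain accumulates at
# a constant rate, the yearly probability of an earthquake grows with stored
# strain, and each event releases only a random fraction of that strain, so
# clusters ("hot streaks") and gaps ("slumps") emerge. All numbers invented.
import numpy as np

rng = np.random.default_rng(42)
years = 50_000
loading_rate = 1.0          # strain units added per year
scale = 2_000.0             # controls how fast probability rises with strain

strain = 0.0
event_years = []
for year in range(years):
    strain += loading_rate
    p_quake = 1.0 - np.exp(-strain / scale)   # grows toward 1 as strain builds
    if rng.random() < p_quake * 1e-2:         # keep events rare on average
        release_fraction = rng.uniform(0.3, 0.9)   # partial release => memory
        strain *= 1.0 - release_fraction
        event_years.append(year)

intervals = np.diff(event_years)
print(f"{len(event_years)} events; recurrence interval "
      f"mean={intervals.mean():.0f} yr, min={intervals.min()} yr, max={intervals.max()} yr")
# Sampling 1,300-year windows from such a record can look periodic in some
# windows and clustered in others, even though the fault's rules never change.
```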
September 10, 2019 | https://www.sciencedaily.com/releases/2019/09/190910095411.htm | New volcanic eruption forecasting technique | Volcanic eruptions and their ash clouds pose a significant hazard to population centers and air travel, especially those that show few to no signs of unrest beforehand. Geologists are now using a technique traditionally used in weather and climate forecasting to develop new eruption forecasting models. By testing if the models are able to capture the likelihood of past eruptions, the researchers are making strides in the science of volcanic forecasting. | The study, published in the journal "The 2008 eruption of Okmok came as a bit of surprise," said University of Illinois graduate student and lead author Jack Albright. "After an eruption that occurred in 1997, there were periods of slight unrest, but very little seismicity or other eruption precursors. In order to develop better forecasting, it is crucial to understand volcanic eruptions that deviate from the norm."Geologists typically forecast eruptions by looking for established patterns of preeruption unrest such as earthquake activity, groundswell and gas release, the researchers said. Volcanoes like Okmok, however, don't seem to follow these established patterns.To build and test new models, the team utilized a statistical data analysis technique developed after World War II called Kalman filtering."The version of Kalman filtering that we used for our study was updated in 1996 and has continued to be used in weather and climate forecasting, as well as physical oceanography," said geology professor Patricia Gregg, a co-author of the study that included collaborators from Southern Methodist University and Michigan State University. "We are the first group to use the updated method in volcanology, however, and it turns out that this technique works well for the unique unrest that led up to Okmok's 2008 eruption."One of those unique attributes is the lack of increased seismicity before the eruption, the researchers said. In a typical preeruption sequence, it is hypothesized that the reservoir under the volcano stays the same size as it fills with magma and hot gases. That filling causes pressure in the chamber to increase and the surrounding rocks fracture and move, causing earthquakes."In the 2008 eruption, it appears that the magma chamber grew larger to accommodate the increasing pressure, so we did not see the precursor seismic activity we would expect," Albright said. "By looking back in time with our models, or hindcasting, we can now observe is that stress had been building up in the rocks around the chamber for weeks, and the growth of the magma system ultimately led to its failure and eruption."This type of backward and forward modeling allows researchers to watch a volcanic system evolve over time. "While we stopped our analysis after the 2008 eruption, we are now able to propagate this new model forward in time, bring it to present day, and forecast where Okmok volcano is heading next," Gregg said.The researchers posit that these models will continue to find other less-recognized eruption precursors, but acknowledge that every volcano is different and that the models must be tailored to fit each unique system. | Earthquakes | 2,019 |
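As a rough illustration of the filtering idea — repeatedly predicting a state and then correcting it with each new observation — here is a minimal textbook linear Kalman filter tracking ground uplift and its rate from noisy yearly measurements. This is a generic sketch with made-up numbers, not the updated filtering scheme or the volcano model used in the Okmok study:

```python
# Minimal linear Kalman filter: track ground uplift and uplift rate at a
# volcano from noisy yearly measurements, alternating predict and update
# steps. Matrices, noise levels and the synthetic data are all assumptions.
import numpy as np

dt = 1.0                                   # one observation per year
F = np.array([[1.0, dt], [0.0, 1.0]])      # uplift grows by rate each step
H = np.array([[1.0, 0.0]])                 # we only observe uplift, not rate
Q = np.diag([1e-4, 1e-4])                  # process noise (model uncertainty)
R = np.array([[0.25]])                     # measurement noise variance

x = np.array([0.0, 0.0])                   # initial state: [uplift, rate]
P = np.eye(2)                              # initial state uncertainty

rng = np.random.default_rng(3)
true_rate = 0.8                            # cm/yr, unknown to the filter
observations = true_rate * np.arange(1, 16) + rng.normal(0, 0.5, 15)

for z in observations:
    # Predict: propagate state and uncertainty one step forward.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: blend prediction with the new measurement via the Kalman gain.
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

print(f"estimated uplift rate: {x[1]:.2f} cm/yr (true value 0.8)")
```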
September 5, 2019 | https://www.sciencedaily.com/releases/2019/09/190905090931.htm | Role of earthquake motions in triggering a 'surprise' tsunami | In newly published research, an international team of geologists, geophysicists, and mathematicians show how coupled computer models can accurately recreate the conditions leading to the world's deadliest natural disasters of 2018, the Palu earthquake and tsunami, which struck western Sulawesi, Indonesia in September last year. The team's work was published in | The tsunami was as surprising to scientists as it was devastating to communities in Sulawesi. It occurred near an active plate boundary, where earthquakes are common. Surprisingly, the earthquake caused a major tsunami, although it primarily offset the ground horizontally -- normally, large-scale tsunamis are typically caused by vertical motions.Researchers were at a loss -- what happened? How was the water displaced to create this tsunami: by landslides, faulting, or both? Satellite data of the surface rupture suggests relatively straight, smooth faults, but do not cover areas offshore, such as the critical Palu Bay. Researchers wondered -- what is the shape of the faults beneath Palu Bay and is this important for generating the tsunami? This earthquake was extremely fast. Could rupture speed have amplified the tsunami?Using a supercomputer operated by the Leibniz Supercomputing Centre, a member of the Gauss Centre for Supercomputing, the team showed that the earthquake-induced movement of the seafloor beneath Palu Bay itself could have generated the tsunami, meaning the contribution of landslides is not required to explain the tsunami's main features. The team suggests an extremely fast rupture on a straight, tilted fault within the bay. In their model, slip is mostly lateral, but also downward along the fault, resulting in anywhere from 0.8 metres to 2.8 metres vertical seafloor change that averaged 1.5 metres across the area studied. Critical to generating this tsunami source are the tilted fault geometry and the combination of lateral and extensional strains exerted on the region by complex tectonics.The scientists come to this conclusion using a cutting-edge, physics-based earthquake-tsunami model. The earthquake model, based on earthquake physics, differs from conventional data-driven earthquake models, which fit observations with high accuracy at the cost of potential incompatibility with real-world physics. It instead incorporates models of the complex physical processes occurring at and off of the fault, allowing researchers to produce a realistic scenario compatible both with earthquake physics and regional tectonics.The researchers evaluated the earthquake-tsunami scenario against multiple available datasets. Sustained supershear rupture velocity, or when the earthquake front moves faster than the seismic waves near the slipping faults, is required to match simulation to observations. The modeled tsunami wave amplitudes match the available wave measurements and the modeled inundation elevation (defined as the sum of the ground elevation and the maximum water height) qualitatively match field observations. 
This approach offers a rapid, physics-based evaluation of the earthquake-tsunami interactions during this puzzling sequence of events."Finding that earthquake displacements probably played a critical role generating the Palu tsunami is as surprising as the very fast movements during the earthquake itself," said Thomas Ulrich, PhD student at Ludwig Maximilian University of Munich and lead author of the paper. "We hope that our study will launch a much closer look on the tectonic settings and earthquake physics potentially favouring localized tsunamis in similar fault systems worldwide." | Earthquakes | 2,019 |
September 4, 2019 | https://www.sciencedaily.com/releases/2019/09/190904130614.htm | Earthquake study casts doubt on early warnings but hints at improved forecasting | A recent study investigated around 100,000 localized seismic events to search for patterns in the data. University of Tokyo Professor Satoshi Ide discovered that earthquakes of differing magnitudes have more in common than was previously thought. This suggests development of early warning systems may be more difficult than hoped. But conversely, similarities between some events indicate that predictable characteristics may aid researchers attempting to forecast seismic events. | Since the 1980s seismologists -- earthquake researchers -- have wondered how feasible it might be to predict how an earthquake will behave given some information about its initial conditions. In particular whether you can tell the eventual magnitude based on seismic measurements near the point of origin, or epicenter. Most researchers consider this idea too improbable given the randomness of earthquake behavior, but Ide thinks there's more to it than that."Taking inspiration from a study comparing different-sized earthquakes, I decided to analyze a seismic data set from a region known as the Tohoku-Hokkaido subduction zone in eastern Japan," said Ide. "A systematic comparison of around 100,000 seismic events over 15 years leads me to believe earthquakes are not different in random ways but share many similarities."To draw comparisons between different earthquakes, Ide first selected the larger examples from the data set with magnitudes greater than 4.5. He also selected smaller earthquakes in the same regions as these larger ones. Ide then ascertained mathematically how similar seismic signals were between pairs of large and small earthquakes. He used a statistical function for the comparison of signals called a cross-correlation on data from 10 seismic stations close to the pairs of earthquakes in each case."Some pairs of large and small earthquakes start with exactly the same shaking characteristics, so we cannot tell the magnitude of an earthquake from initial seismic observations," explained Ide. "This is bad news for earthquake early warning. However, for future forecasting attempts, given this symmetry between earthquakes of different magnitudes, it is good to know they are not completely random." | Earthquakes | 2,019 |
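The waveform comparison Ide describes rests on normalized cross-correlation; a minimal version of that measurement is sketched below with synthetic traces (the signals, sampling rate, and noise levels are placeholders, not the Tohoku-Hokkaido data):

```python
# Illustrative sketch: normalized cross-correlation between the opening
# seconds of two seismograms. A coefficient near 1 means the two events
# started with nearly identical shaking; traces here are synthetic.
import numpy as np

def normalized_xcorr(a, b):
    """Return the peak normalized cross-correlation coefficient and its lag."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    cc = np.correlate(a, b, mode="full")
    lag = cc.argmax() - (len(b) - 1)
    return cc.max(), lag

rng = np.random.default_rng(7)
fs = 100.0
t = np.arange(0, 5, 1 / fs)                        # first 5 s after onset

# A "large" and a "small" event that share the same onset waveform shape,
# differing only in amplitude and noise level.
onset = np.sin(2 * np.pi * 2 * t) * np.exp(-t)
large = 10.0 * onset + 0.2 * rng.standard_normal(t.size)
small = 1.0 * onset + 0.2 * rng.standard_normal(t.size)

coeff, lag = normalized_xcorr(large, small)
print(f"peak correlation {coeff:.2f} at lag {lag / fs:.2f} s")
# A high coefficient at zero lag means the small and large events are
# indistinguishable from their first few seconds alone.
```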
August 27, 2019 | https://www.sciencedaily.com/releases/2019/08/190827123527.htm | 'Surrey swarm' earthquakes not caused by nearby oil extraction, study suggests | The series of 34 small earthquakes between April 2018 and May 2019 occurred within 10 km of two active oil extraction sites at Brockham and Horse Hill in Surrey. | Many residents of Newdigate, Dorking, Horley and Charlwood in Surrey, and Crawley and Horsham in West Sussex, felt the largest quake, which reached a magnitude of 3.2.[4]As the British Isles don't lie along boundaries separating two tectonic plates, earthquakes that are felt by people are relatively rare -- so there was concern that the swarm was triggered by nearby drilling and extraction.Now, the first in-depth study of the quakes by Imperial, the University of Bristol, and the British Geological Survey (BGS), has shown no direct link between oil extraction and earthquakes in the region.The authors therefore believe natural causes were behind the earthquakes, which occurred close to Gatwick Airport in West Sussex.Lead author Dr Stephen Hicks, of Imperial's Department of Earth Science and Engineering, said: "The quakes seem to have occurred naturally, and our findings suggest their closeness to oil extraction sites is probably a coincidence."The paper is published today in During the early stages of the swarm, the researchers installed seismometers -- instruments that measure ground vibrations -- around the affected areas[1]. The highly sensitive devices tracked the timings, strengths, and distribution of earthquakes.The researchers also used earthquake data from existing sensors in citizens' homes, known as 'RaspberryShakes', that had been 'listening' since late 2017 for seismic activity in the area.Based on data from the seismometers, the study team examined a variety of properties of the Surrey quakes and compared them to previous ones that were caused by both human activities and by natural causes in the UK and elsewhere.Most natural earthquakes in the UK cause rocks on either side of weaknesses in the ground, known as faults, to move horizontally. In contrast, earthquakes caused by oil extraction cause rocks either side of faults to move vertically.The researchers found that the Surrey swarm quakes moved ancient faults horizontally, indicating that the quakes would probably have happened regardless of nearby oil extraction.From the BGS seismometers, the researchers detected 168 small magnitude earthquakes between 2018 and 2019. The first cluster of earthquakes happened in April 2018, long after oil extraction tests in 2016, but well before further extended tests starting in July 2018, adding to the evidence that they were naturally caused.Dr Hicks said: "The ground vibrations recorded from earthquakes provide clues that hint at their cause. There are increasing examples worldwide of human activity causing earthquakes, but it can be difficult to work out which newer cases are natural, and which are human-caused."The researchers also looked at the distance between the earthquakes and extraction sites. Rather than cluster round the extraction sites, the quakes were distributed in a tight cluster more than 3 km away from the extraction sites.This area, said Dr Hicks, is too far away to link the quakes to oil extraction. He said: "It would be unprecedented for this type and scale of oil extraction to affect sites more than a kilometre away."The team also examined the depth at which the quakes occurred. 
To do this, they compared the locations of the earthquakes with images of rock layers beneath the area. The images were created by measuring the reflection of sound waves off each layer. They found that although the Surrey earthquakes were shallow (around 2.5 km deep), they occurred deeper than rock formations from which oil is extracted (less than 1 km deep). The paper is the first piece of research that uses high-precision data and modelling to look at the cause of the Surrey swarm. The researchers are unsure why the swarm came about suddenly in one of the UK's least seismically active areas -- and it's not currently possible to predict natural earthquakes. The authors say the swarm, like most natural earthquakes in the UK, could have been caused by ongoing collision of the African and Eurasian tectonic plates in the Mediterranean Sea -- the UK's nearest plate boundary -- which stresses the crust and causes earthquakes across Europe. Dr Hicks said: "This is not the first time earthquakes have come seemingly from nowhere and without human input. Decades of instrumental recordings and hundreds of years of historical accounts of earthquakes show that similar seismic swarms have happened in the UK before due to long-term tectonic stresses and without any clear link to human activities." Industrial activities have been known to cause earthquakes in the past, a phenomenon known as 'induced seismicity'. In most of these cases, quakes are caused by injecting fluids for hydraulic fracturing (fracking) or disposal of waste fluids. Since fracking does not currently take place in the Surrey or Sussex area, this study focused on conventional oil extraction, in which there is no such large-scale injection of fluid. Dr Hicks added: "If oil extraction caused the earthquakes, then it did so by a mechanism that hasn't yet been reported anywhere else in the world." The researchers are continuing to monitor quakes in the area for the foreseeable future. Dr Hicks said: "The more data we have, the more we'll know about the causes and effects of these earthquakes. Who knows which clues from the ground we'll pick up in the future." | Earthquakes | 2,019
August 26, 2019 | https://www.sciencedaily.com/releases/2019/08/190826121956.htm | Crack in Pacific seafloor caused volcanic chain to go dormant | From his geology lab at the University of Houston, Jonny Wu has discovered that a chain of volcanoes stretching between Northeast Asia and Russia began a period of silence 50 million years ago, which lasted for 10 million years. In the journal Geology, Wu, assistant professor of structural geology, tectonics and mantle structure, is reporting that one of the most significant plate tectonic shifts in the Pacific Ocean forced the volcanoes into dormancy. | At the end of the Cretaceous Period, shortly after dinosaurs disappeared, the Pacific Plate, the largest tectonic plate on Earth, mysteriously changed direction. One possible result was the formation of a prominent bend in the Hawaiian Islands chain, and another, just discovered by Wu, was the volcanic dormancy along a 900-mile stretch between Japan and the remote Sikhote-Alin mountain range in Russia in what is known as the Pacific Ring of Fire, where many volcanoes form. "Around the time of the volcano dormancy, a crack in the Pacific Ocean Plate subducted, or went below, the volcanic margin. The thin, jagged crack in the seafloor was formed by plates moving in opposing directions and when they subduct, they tend to affect volcanic chains," reports Wu. When the volcanoes revived 10 million years later, the radiogenic isotopes within the magma were noticeably different. "The productivity of magma within the once-violent chain of volcanoes was only one-third its previous level," said Wu, who has linked this phenomenon to the subduction of the Pacific-Izanagi mid-ocean ridge, an underwater mountain range. Scientists have long understood that volcanic activity above subduction zones, where one tectonic plate converges towards and dives beneath another, is driven by water brought deep within the Earth by the subducting plate. When the water reaches depths of around 65 miles, it causes the solid mantle to partially melt and produces magma that may rise and feed volcanoes. "However, in the case of the East Asian volcanoes, subduction of the immense seafloor crack interrupted its water-laden conveyor belt into the deep Earth. As a result, the volcanoes turned off," said Wu. Wu and UH doctoral student Jeremy Tsung-Jui Wu, who is not related to Jonny Wu, discovered the dormancy -- and the reason for it -- after examining a magmatic catalog of 900 igneous rock radio-isotopic values from the Cretaceous to Miocene eras. They also found evidence that the crack in the Pacific Plate was about 50% shorter than originally believed. | Earthquakes | 2,019
August 25, 2019 | https://www.sciencedaily.com/releases/2019/08/190825075934.htm | Deducing the scale of tsunamis from the 'roundness' of deposited gravel | Scientists from Tokyo Metropolitan University and Ritsumeikan University have found a link between the "roundness" distribution of tsunami deposits and how far tsunamis reach inland. They sampled the "roundness" of gravel from different tsunamis in Koyadori, Japan, and found a common, abrupt change in composition at approximately 40% of the "inundation distance" from the shoreline, regardless of tsunami magnitude. Estimates of ancient tsunami size from geological deposits may help inform effective disaster mitigation. | Tsunamis are one of nature's most devastating hazards; understanding their scale and mechanism is of paramount scientific and socio-economic importance. Nevertheless, despite our best efforts to study and understand them, their infrequent occurrence can make quantitative studies difficult; tsunami-causing seismic events around subduction zones (where one tectonic plate dips underneath another plate) recur once every 100 to 1,000 years, significantly reducing the number of accurately documented events. It is highly desirable that we gain some understanding by looking at geological deposits instead. However, despite some success in finding the number and age of past events, it is not yet possible to estimate the magnitude of ancient tsunamis, particularly in narrow coastal lowlands like the Sanriku Coast in Japan, struck by the 2011 Tohoku earthquake and tsunami. Therefore, Assistant Professor Daisuke Ishimura from Tokyo Metropolitan University and Postdoctoral Fellow Keitaro Yamada from Ritsumeikan University carried out studies of gravel samples collected from boreholes and a trench in Koyadori, situated in the middle of the Sanriku coastline. Geological samples were taken corresponding to three tsunami events (AD 1611, 1896 and 2011) whose magnitudes are known, specifically their "inundation distance," or how far they reached inland. They used automated image analysis to study how "round" each gravel particle was in their samples, giving 10 to 100 times more data than existing, manual methods. Comparing distributions with measurements of modern beach and fluvial (river) gravels, they found that they could map the number ratio between beach and fluvial gravel. They discovered that this ratio suddenly changed at a certain distance away from the sea. This point was named the "Tsunami Gravel Inflection Point" (TGIP); it is thought to arise from "run-up" (incoming) waves bringing beach material inland and "return" waves drawing inland material towards the sea. Although the TGIP occurred at different locations for each event, they found that it was always at approximately 40% of the inundation distance. They applied this finding to samples corresponding to even older tsunamis, providing estimates for the size of events along the Sanriku Coast going back approximately 4,000 years for the first time. Although the researchers believe this ratio is specific to the local topography, the same analysis may be applied to characterize other tsunami-prone locations. An accurate estimate of the extent of ancient tsunamis will expand the number of events available for future research to study the mechanisms behind tsunamis, helping to inform effective disaster mitigation and the planning of coastal communities. | Earthquakes | 2,019
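The two quantitative ingredients in the record above, an image-based "roundness" measure for each grain and the roughly 40% TGIP-to-inundation-distance ratio, can be sketched as follows. The circularity formula is a common image-analysis stand-in for grain roundness and the grain measurements are invented, so treat this only as an illustration of the idea, not the study's actual workflow.

```python
import math

def circularity(area_mm2: float, perimeter_mm: float) -> float:
    """4*pi*A / P**2: equals 1 for a perfect circle, smaller for angular grains.

    A common image-analysis proxy for grain 'roundness'; the study's exact
    metric may differ, so this is an illustrative stand-in only.
    """
    return 4 * math.pi * area_mm2 / perimeter_mm ** 2

def estimate_inundation_distance(tgip_distance_m: float, tgip_fraction: float = 0.4) -> float:
    """Invert the reported ~40% relation: inundation distance ~ TGIP distance / 0.4."""
    return tgip_distance_m / tgip_fraction

# Example: a rounded beach-like grain vs. a more angular fluvial-like grain
# (made-up numbers), and a TGIP mapped 320 m inland implying ~800 m of inundation.
print(f"beach-like grain:   {circularity(78.5, 31.4):.2f}")
print(f"fluvial-like grain: {circularity(60.0, 36.0):.2f}")
print(f"estimated inundation distance: {estimate_inundation_distance(320):.0f} m")
```

Reading the 40% relation backwards is what lets a TGIP mapped in an old deposit yield a rough inundation distance for the tsunami that left it.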
August 22, 2019 | https://www.sciencedaily.com/releases/2019/08/190822081159.htm | Underground links between quakes and eruptions of Japan's biggest active volcano | The threat of explosive volcanic eruptions looms over many cities around the world. Earthquakes, another major geological hazard, are known to have some relationships with the occurrence of volcanic eruptions. Although they often precede volcanic events, the mechanisms of these relationships are not yet well understood. | Mount Aso in Kyushu, Japan, is one of the largest active volcanoes in the world and has experienced major earthquakes and eruptions as recently as 2016. Researchers at Kyushu University's International Institute for Carbon-Neutral Energy Research (I2CNER) have been investigating the relationships among these events to better understand what is happening under the surface and to help predict future disasters. In particular, for a new study, they analyzed very-long-period (VLP) seismic signals recorded at the volcano. "We analyzed continuous VLP seismicity data recorded from January 2015 to December 2016, a period that includes both large earthquakes and eruptions of Mount Aso," explains lead author of the study Andri Hendriyana. "Using this dataset, we developed a differential-time back-projection method to accurately locate VLP events, and detected over 18,000 reliable VLP events." Using this method, two distinct clusters of these seismic events were identified in the subsurface below the caldera of Mount Aso. For most of the observation period, VLP activity was almost entirely confined to the eastern cluster. However, after the Kumamoto earthquakes in April 2016, VLP activity abruptly shifted to the western cluster for about five months. Then, in September 2016, one month before the largest eruption of Mount Aso during the study period, VLP events migrated back to the eastern cluster. After the large eruption on October 8, 2016, VLP seismicity stopped temporarily. Together, these observations show that VLP events are affected by the occurrence of earthquakes and are related to volcanic eruptions. VLP seismicity is considered to be directly related to pressure variations associated with magmatic activity. "We interpret the migration of VLP activity after the earthquakes as a response to permeability enhancement or to fractures opening because of extension associated with the Kumamoto earthquakes," says senior author Takeshi Tsuji. He expects this method to be applied in further studies of Mount Aso as well as other volcanoes worldwide. "The information obtained from this new monitoring approach could reveal new details about the dynamic behavior within Aso and other volcanoes after earthquakes, and could provide important information for prevention and mitigation of future disasters." | Earthquakes | 2,019
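Back-projection, the core of the locating method quoted above, can be illustrated with a deliberately simplified sketch: shift each station's signal envelope by a predicted travel time, stack, and pick the grid node where the stack is most coherent. The constant velocity, station layout, and synthetic pulses below are assumptions; the study's differential-time formulation and its handling of real VLP waveforms are considerably more involved.

```python
import numpy as np

fs = 50.0                     # samples per second
v = 2000.0                    # assumed uniform seismic velocity, m/s
stations = np.array([[0.0, 0.0], [4000.0, 0.0], [0.0, 4000.0], [4000.0, 4000.0]])
true_src = np.array([2600.0, 1200.0])

# Synthetic envelopes: a Gaussian pulse arriving at each station after its travel time.
t = np.arange(0, 10, 1 / fs)
dists = np.linalg.norm(stations - true_src, axis=1)
envelopes = np.exp(-0.5 * ((t[None, :] - dists[:, None] / v - 2.0) / 0.15) ** 2)

# Back-project: for each grid node, undo the predicted move-out and stack.
xs = ys = np.arange(0, 4001, 200.0)
best, best_xy = -np.inf, None
for x in xs:
    for y in ys:
        tt = np.linalg.norm(stations - np.array([x, y]), axis=1) / v
        shifts = np.round((tt - tt.min()) * fs).astype(int)
        stack = sum(np.roll(env, -s) for env, s in zip(envelopes, shifts))
        score = float(stack.max())
        if score > best:
            best, best_xy = score, (float(x), float(y))

print("true source (m):", true_src, " recovered grid node (m):", best_xy)
```

The stack is largest where the predicted move-out matches the real one, so the winning grid node approximates the source; repeating this for many events is what produces the clusters described in the article.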
August 22, 2019 | https://www.sciencedaily.com/releases/2019/08/190822094022.htm | Explaining earthquakes we can't feel | The Earth's subsurface is an extremely active place, where the movements and friction of plates deep underground shape our landscape and govern the intensity of hazards above. While the Earth's movements during earthquakes and volcanic eruptions have been recorded by delicate instruments, analyzed by researchers and constrained by mathematical equations, they don't tell the whole story of the shifting plates beneath our feet. | Over the past two decades, the advent of the global positioning system -- including receivers with extremely sensitive sensors that capture millimeters of movement -- has made scientists aware of earthquake-like phenomena that have been challenging to untangle. Among them are so-called slow slip events, or slow-moving earthquakes -- sliding that occurs over weeks at a time unbeknownst to humans on the surface. These slow slip events occur all over the world and possibly help trigger larger earthquakes. The largest slow slip events occur in subduction zones, where one tectonic plate dives beneath another, eventually forming mountains and volcanoes over millions of years. New computer simulations produced by researchers at Stanford University, published online June 15, offer an explanation for how these slow-moving earthquakes can arise. "Slow slip is such an intriguing phenomenon. Slow slip events are both so widespread and really so unexplained that they're a puzzle that dangles before us as scientists that we all want to solve," said study co-author Eric Dunham, an associate professor of geophysics in Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth). "We've known about slow slip for almost 20 years and there's still not a great understanding of why it happens." These events are especially challenging to explain because of their unstable but sluggish nature. The fault does not slide steadily; instead, the sliding periodically accelerates, yet never reaches the point where it sends out seismic waves large enough for humans to detect. Despite their stealthy nature, slow slip events can add up. In an ice stream in Antarctica, slow slip events occur twice daily, last 30 minutes and are equivalent to magnitude 7.0 earthquakes, Dunham said. Researchers think changes in friction explain how quickly rock on either side of the fault slips. With that in mind, they assumed slow slip events started as earthquakes, with a type of friction known as rate-weakening that makes sliding fundamentally unstable. But many laboratory friction experiments contradicted that idea. Instead, they had found that rocks from slow slip regions display a more stable kind of friction known as rate-strengthening, widely thought to produce stable sliding. The new computer simulations resolved this inconsistency by showing how slow slip can arise with contrary-seeming rate-strengthening friction. "A handful of studies had shown that there are ways to destabilize rate-strengthening friction. However, until our paper, no one had realized that if you simulated these instabilities, they actually turn into slow slip, they don't turn into earthquakes," according to lead author Elias Heimisson, a doctoral candidate at Stanford Earth. "We also identified a new mechanism for generating slow slip instabilities." Dunham's research group approaches unanswered questions about the Earth by considering all the possible physical processes that might be at play.
In this case, faults occur in rocks that are saturated in fluid, giving them what's known as a poroelastic nature in which the pores allow the rock to expand and contract, which changes the fluid pressure. The group was curious about how those changes in pressure can change the frictional resistance on faults. "In this case, we did not start on this project to explain slow slip events -- we started on it because we knew that rocks have this poroelastic nature and we wanted to see what consequences it had," Dunham said. "We never thought it would give rise to slow slip events and we never thought it would destabilize faults with this type of friction." With these new simulations that account for the rock's porous nature, the group found that as rocks get squeezed and fluids cannot escape, the pressure increases. That pressure increase reduces friction, leading to a slow slip event. "The theory is high-level," Heimisson said. "We see these interesting things when you account for poroelasticity and people might want to use it more broadly in models of seismic cycles or specific earthquakes." Heimisson will be creating a 3D simulation based on this theory as a postdoctoral researcher at the California Institute of Technology. Martin Almquist, a postdoctoral research fellow in the Department of Geophysics, is a co-author on the study. The research was supported by the Stanford Consortium for Induced and Triggered Seismicity, the Southern California Earthquake Center, NASA Headquarters under the NASA Earth and Space Science Fellowship Program and the Knut and Alice Wallenberg Foundation. | Earthquakes | 2,019
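The friction and pore-pressure ideas in the slow-slip record above map onto two textbook relations, written here with conventional symbols rather than the study's own notation: a steady-state rate-and-state friction law and the effective-stress law for shear resistance.

```latex
\[
\mu_{\mathrm{ss}}(V) \;=\; \mu_0 + (a - b)\,\ln\!\left(\frac{V}{V_0}\right),
\qquad
\tau \;=\; \mu_{\mathrm{ss}}(V)\,\bigl(\sigma_n - p\bigr)
\]
```

Here V is the slip rate, mu_0 a reference friction coefficient at rate V_0, sigma_n the normal stress and p the pore fluid pressure. A positive (a - b) is the rate-strengthening case favored by the laboratory experiments, which by itself promotes stable sliding. The poroelastic effect described in the article enters through p: squeezing fluid-saturated rock that cannot drain raises p, lowers the effective normal stress (sigma_n - p), and so reduces the shear resistance tau even when the friction itself is rate-strengthening.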
August 7, 2019 | https://www.sciencedaily.com/releases/2019/08/190807131926.htm | A rocky relationship: A history of Earth's continents breaking up and getting back together | A new study of rocks that formed billions of years ago lends fresh insight into how Earth's plate tectonics, or the movement of large pieces of Earth's outer shell, evolved over the planet's 4.56-billion-year history. | A report of the findings was published August 7. "One of the key ways to understand how Earth has evolved to become the planet that we know is plate tectonics," says Robert Holder, a Postdoctoral Fellow in Earth and Planetary Sciences at Johns Hopkins University and the paper's first author. Plate tectonics dictates how continents drift apart and come back together, helps explain where volcanoes and earthquakes occur, predicts cycles of erosion and ocean circulation, and helps explain how life on Earth has evolved. In a bid to resolve the mystery of how and when plate tectonics emerged on Earth, Holder and the research team examined a global compilation of metamorphic rocks that formed over the past 3 billion years at 564 sites. Metamorphic rocks are rocks that, through the process of being buried and heated deep in the Earth's crust, have transformed into a new type of rock. Scientists can measure the depth and temperatures at which metamorphic rocks form, and thereby constrain heat flow at different places in Earth's crust. Because plate tectonics strongly influences heat flow, ancient metamorphic rocks can be used to study plate tectonics in Earth's past. The research team compiled data on the temperatures and depths at which the metamorphic rocks formed and then evaluated how these conditions have changed systematically through geological time. From this, the team found that plate tectonics, as we see it today, developed gradually over the last 2.5 billion years. "The framework for much of our understanding of the world and its geological processes relies on plate tectonics," says Holder. "Knowing when plate tectonics began and how it changed impacts that framework." Clarity on when plate tectonics began and whether it was different in Earth's past can help scientists better understand why we find certain rocks and minerals where we do and how they formed, says Holder. Other authors on this paper include Daniel Viete of Johns Hopkins University; Michael Brown of the University of Maryland, College Park; and Tim Johnson of Curtin University. | Earthquakes | 2,019
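The comparison described in the record above, turning each rock's peak temperature and burial depth into a constraint on heat flow and tracking how such values change with age, can be sketched in a few lines. The numbers below are invented placeholders, not entries from the study's 564-site compilation, and the simple temperature-over-depth ratio is only a rough proxy for the thermal gradients geologists actually derive.

```python
import statistics

samples = [
    # (age in billions of years, peak temperature in deg C, burial depth in km)
    (2.9, 850, 30),
    (2.7, 900, 28),
    (1.8, 750, 35),
    (0.9, 650, 40),
    (0.3, 550, 45),
]

def apparent_gradient(temp_c: float, depth_km: float) -> float:
    """Apparent thermal gradient in deg C per km."""
    return temp_c / depth_km

old = [apparent_gradient(t, d) for age, t, d in samples if age >= 2.5]
young = [apparent_gradient(t, d) for age, t, d in samples if age < 2.5]
print(f"median gradient before 2.5 Ga: {statistics.median(old):.1f} C/km")
print(f"median gradient after 2.5 Ga:  {statistics.median(young):.1f} C/km")
```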
August 1, 2019 | https://www.sciencedaily.com/releases/2019/08/190801104108.htm | Drop of ancient seawater rewrites Earth's history | The remains of a microscopic drop of ancient seawater have assisted in rewriting the history of Earth's evolution after they were used to re-establish the time that plate tectonics started on the planet. | Plate tectonics is Earth's vital -- and unique -- continuous recycling process that directly or indirectly controls almost every function of the planet, including atmospheric conditions, mountain building (forming of continents), natural hazards such as volcanoes and earthquakes, formation of mineral deposits and the maintenance of our oceans. It is the process where the large continental plates of the planet continuously move, and the top layers of the Earth (crust) are recycled into the mantle and replaced by new layers through processes such as volcanic activity. While it was previously thought that plate tectonics started about 2.7 billion years ago, a team of international scientists used the microscopic leftovers of a drop of water that was transported into the Earth's deep mantle -- through plate tectonics -- to show that this process started 600 million years before that. An article on their research that shows plate tectonics started on Earth 3.3 billion years ago was published in a high-impact academic journal. "Plate tectonics constantly recycles the planet's matter, and without it the planet would look like Mars," says Professor Allan Wilson from the Wits School of Geosciences, who was part of the research team. "Our research showing that plate tectonics started 3.3 billion years ago now coincides with the period that life started on Earth. It tells us where the planet came from and how it evolved." Earth is the only planet in our solar system that is shaped by plate tectonics and without it the planet would be uninhabitable. For their research, the team analysed a piece of rock melt, called komatiite -- named after the type occurrence in the Komati River near Barberton in Mpumalanga -- which is left over from the hottest magma ever produced in the first quarter of Earth's existence (the Archaean). While most of the komatiites were obscured by later alteration and exposure to the atmosphere, small droplets of the molten rock were preserved in a mineral called olivine. This allowed the team to study a perfectly preserved piece of ancient lava. The team examined a piece of melt that was 10 microns (0.01 mm) in diameter and analysed its chemical indicators, such as its H2O content. The research allows insight into the first stages of plate tectonics and the start of stable continental crust. "What is exciting is that this discovery comes at the 50th anniversary of the discovery of komatiites in the Barberton Mountain Land by Wits Professors, the brothers Morris and Richard Viljoen," says Wilson. | Earthquakes | 2,019
July 31, 2019 | https://www.sciencedaily.com/releases/2019/07/190731125432.htm | Faint foreshocks foretell California quakes | New research mining data from a catalog of more than 1.8 million southern California earthquakes found that nearly three-fourths of the time, foreshocks signalled a quake's readiness to strike from days to weeks before the mainshock hit, a revelation that could advance earthquake forecasting. | "We are progressing toward statistical forecasts, though not actual yes or no predictions, of earthquakes," said Daniel Trugman, a seismologist at Los Alamos National Laboratory and coauthor of a paper out today. The paper, titled "Pervasive foreshock activity across southern California," notes foreshocks preceded nearly 72 percent of the "mainshocks" studied (the largest quakes in a particular sequence), a percentage that is significantly higher than was previously understood. Many of these foreshocks are so small, with magnitudes less than 1, that they are difficult to spot through visual analysis of seismic waveforms. Detecting such small events requires advanced signal processing techniques and is a huge, data-intensive problem. Significant computing capabilities were key to extracting these new insights from the southern California Quake Template Matching Catalog, recently produced by Trugman and coauthor Zachary Ross, an assistant professor in seismology at Caltech. The template matching took approximately 300,000 GPU-hours on an array of 200 NVIDIA P100 GPUs, involving 3-4 weeks of computing time for the final run. GPUs are specialized processors, optimal for massively parallel problems, as each GPU has thousands of cores, and each core is capable of handling its own computational thread. For perspective, a standard laptop has either 2 or 4 cores. The earthquake catalog is archived by the Southern California Earthquake Data Center. The small foreshocks may be too difficult to discern in real time to be of use in earthquake forecasting. Another important issue is that quakes run in packs: they cluster in both space and time, so sorting the foreshocks of a particular quake out from the family of preliminary, main and aftershock rumbles of its fellow earth adjustments is no simple task. An earthquake prediction tool is still far off, Trugman explains, and for humans who like a yes or no answer, a statistical analysis that suggests a quake's probability is frustrating. But the potential insights and early warnings are improving, quake by quake. | Earthquakes | 2,019
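Template matching, the technique behind the catalog described above, slides a known event waveform along continuous data and flags windows whose normalized correlation exceeds a threshold. The sketch below is a single-station, single-channel toy with synthetic data and an arbitrary 0.8 threshold; the real catalog stacks correlations over many stations and channels and runs on GPUs, so treat this purely as an illustration of the principle.

```python
import numpy as np

def template_scan(continuous: np.ndarray, template: np.ndarray, threshold: float = 0.8):
    """Return (sample index, correlation) wherever the normalized correlation
    between the template and a sliding window of continuous data exceeds threshold."""
    n = len(template)
    tem = (template - template.mean()) / (template.std() * n)
    detections = []
    for i in range(len(continuous) - n + 1):
        win = continuous[i:i + n]
        std = win.std()
        if std == 0:
            continue
        cc = float(np.dot(tem, (win - win.mean()) / std))
        if cc >= threshold:
            detections.append((i, cc))
    return detections

# Synthetic demo: hide two scaled copies of a template pulse in low-level noise.
rng = np.random.default_rng(1)
template = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100)) * np.hanning(100)
data = 0.05 * rng.standard_normal(5000)
data[1200:1300] += 0.3 * template   # a small, easily missed event
data[3700:3800] += 1.5 * template   # a larger event
print(template_scan(data, template)[:5])
```

Because the correlation coefficient is insensitive to amplitude, the same template recovers events far smaller than itself, which is how magnitude-below-1 foreshocks can be pulled out of the continuous record.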