Dataset columns:
Date -- string, 11 to 18 characters
Link -- string, 62 characters
Title -- string, 16 to 148 characters
Summary -- string, 1 to 2,680 characters
Body -- string, 22 to 13,000 characters
Category -- string, 20 classes
Year -- int64, 2000 to 2020
February 8, 2016
https://www.sciencedaily.com/releases/2016/02/160208135433.htm
Long jumping earthquakes: Double dose of bad earthquake news
A team of researchers, including one from the University of California, Riverside, has discovered that earthquake ruptures can jump much farther than previously thought, a finding that could have severe implications for the Los Angeles area and other regions in the world.
The scientists found that an earthquake that initiates on one thrust fault can spread 10 times farther than previously thought to a second nearby thrust fault, vastly expanding the possible range of "earthquake doublets," or double earthquakes. That could mean that in areas such as Los Angeles, where there are multiple thrust faults close to each other, an earthquake on one thrust fault could spread to another fault, creating twice as much devastation.

One potential bad scenario involves a single earthquake spreading between the Puente Hills thrust fault, which runs under downtown Los Angeles, and the Sierra Madre thrust fault, located close to Pasadena, said Gareth Funning, an associate professor of earth sciences at UC Riverside and a co-author of a paper about the research published online today (Feb. 8, 2016) in the journal. Other susceptible areas with multiple thrust faults in close proximity include the Ventura, Calif., area; the Middle East, particularly Tehran, Iran; and the front of the Himalayas, in countries such as Afghanistan, Pakistan, India and Nepal.

The researchers studied a 1997 earthquake in Pakistan, originally reported as a magnitude 7.1 event, showing that it was in fact composed of two 'subevents': a magnitude 7.0 earthquake that was followed 19 seconds later by a magnitude 6.8 event located 50 kilometers (30 miles) to the southeast. Funning considers the two earthquakes subevents of one 'mainshock,' as opposed to the second earthquake being an aftershock, because they happened so close together in time and were so similar in size. There were many aftershocks in the following minutes and hours, but most of them were much smaller.

The scientists used satellite radar images, precise earthquake locations, modeling and back projection of seismic radiation to show that the seismic waves from the first subevent caused the second to initiate, effectively 'jumping' the 50-kilometer distance between the two.
Scientists previously thought an earthquake could only leap up to five kilometers.

The finding has implications for seismic hazard forecasts developed by the United States Geological Survey. The current forecast model does not include the possibility of a similar double earthquake on the thrust faults in the Los Angeles area. "This is another thing to worry about," Funning said. "The probability of this happening in Los Angeles is probably pretty low, but it doesn't mean it can't happen."

Funning started work on the paper about 12 years ago as a graduate student at the University of Oxford. He was the first to find the satellite data for the earthquakes in Pakistan, which occurred in a largely unpopulated area, and to notice that they occurred close together in space and time. After setting the work aside for several years, he and lead author Ed Nissen of the Colorado School of Mines picked it up about three to four years ago, in part because of the possible implications for the Los Angeles area, which has a similar plate boundary, with similar faults spaced at similar distances, as the region in Pakistan where the 1997 earthquake doublet occurred.

Thrust faults happen when one layer of rock is pushed up over another, often older, layer of rock by compressional forces. Thrust faults came to the attention of Californians after the 1994 Northridge earthquake, about 20 miles northwest of Los Angeles, which occurred on a thrust fault. Thrust faults are not as well understood by scientists as strike-slip faults, such as the San Andreas, in part because they are not as visible in the landscape and do not preserve evidence of past earthquakes as well.
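As a back-of-the-envelope check on why the Pakistan doublet was originally catalogued as a single magnitude 7.1 event, one can sum the seismic moments of the two subevents using the standard Hanks-Kanamori moment-magnitude relation. This calculation is illustrative and not taken from the paper:

```python
import math

def moment_dyne_cm(mw):
    """Seismic moment M0 (dyne-cm) from moment magnitude Mw,
    via the standard Hanks-Kanamori relation Mw = (log10(M0) - 16.1) / 1.5."""
    return 10 ** (1.5 * mw + 16.1)

def magnitude(m0):
    """Moment magnitude Mw from seismic moment M0 (dyne-cm)."""
    return (math.log10(m0) - 16.1) / 1.5

# Sum the moments of the Mw 7.0 and Mw 6.8 subevents
combined = moment_dyne_cm(7.0) + moment_dyne_cm(6.8)
print(f"Equivalent single-event magnitude: {magnitude(combined):.2f}")
```

Because moment grows by a factor of about 32 per magnitude unit, the two subevents together carry roughly the moment of one Mw 7.1 earthquake, consistent with the original single-event report.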
Earthquakes
2016
February 8, 2016
https://www.sciencedaily.com/releases/2016/02/160208124248.htm
New cause of strong earthquakes discovered
A geologic event known as diking can cause strong earthquakes -- with a magnitude between 6 and 7, according to an international research team.
Diking can occur all over the world but most often occurs in areas where the Earth's tectonic plates are moving apart, such as Iceland, Hawaii and parts of Africa in the East African Rift System. As plates spread apart, magma from beneath the Earth's surface rises into the space, forming vertical magma intrusions, known as dikes. The dike pushes on the surrounding rocks, creating strain.

"Diking is a known phenomenon, but it has not been observed by geophysical techniques often," said Christelle Wauthier, assistant professor of geosciences at Penn State, who led the study. "We know it's linked with rift opening and it has implications on plate tectonics. Here, we see that it also could pose hazards to nearby communities."

The team investigated ties between two natural disasters from 2002 in the Democratic Republic of the Congo, in the East African Rift System. On Jan. 17, the Nyiragongo volcano erupted, killing more than 100 people and leaving more than 100,000 people homeless. Eight months later a magnitude 6.2 earthquake struck the town of Kalehe, which is 12 miles from the Nyiragongo volcano. Several people died during the Oct. 24 earthquake, and Kalehe was inundated with water from nearby Lake Kivu.

"The Kalehe earthquake was the largest recorded in the Lake Kivu area, and we wanted to find out whether it was coincidence that, eight months before the earthquake, Nyiragongo erupted," said Wauthier.

The researchers used a remote sensing technique, Interferometric Synthetic Aperture Radar, to measure changes to the Earth's surface before and after both natural disasters. "This technique produces ground surface deformation maps. Then, you can invert those deformation maps to find a source that could explain the observed deformation.
For the deformation observed in January 2002, we found that the most likely explanation, or best-fitting model, was a 12-mile diking intrusion in between Nyiragongo and Kalehe," said Wauthier.

The researchers used the same technique for the October 2002 magnitude 6.2 earthquake, analyzing seismicity in addition to ground-deformation changes. They found that a fault on the border of the East African Rift System slipped, triggering the earthquake.

"We were able to identify the type of fault that slipped, and we also had the best-fitting model for the dike intrusion," said Wauthier. "Knowing both of those, we performed a Coulomb stress-change analysis and found that the January 2002 dike could have induced the October 2002 earthquake."

Coulomb stress-change analysis is a modeling technique that calculates the stress changes induced by a deformation source at potential receiver faults throughout a region. If the Coulomb stress changes are positive, it means that the source is bringing the receiver fault closer to failure -- closer to slipping and generating an earthquake. This type of analysis is regularly applied to assess whether an earthquake in one region could trigger a secondary earthquake nearby.

The researchers hypothesized that the dike opening pushed outward against the adjacent rocks. These rocks became strained and passed stress to rocks adjacent to them, accumulating stress on a fault in the Kalehe area. The dike brought this fault closer to failure and, eight months later, a small stress perturbation could have triggered the start of the magnitude 6.2 earthquake.

"We've known that every time magma flows through the Earth's crust, you create stress and generate seismicity," said Wauthier. "But these are normally very low magnitude earthquakes.
This study suggests that a diking event has the potential to lead to a large earthquake," said Wauthier.

The researchers report their findings in the current issue of the journal. Collaborators include Benoit Smets of the European Center for Geodynamics and Seismology, the Vrije Universiteit Brussel and the Royal Museum for Central Africa; and Derek Keir of the University of Southampton. The National Research Fund of Luxembourg, the Belgian Science Policy Office and the U.K. Natural Environment Research Council supported this research.
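The Coulomb stress-change test described above reduces to a simple formula: the change in Coulomb failure stress on a receiver fault is dCFS = d_tau + mu' * d_sigma_n, where d_tau is the shear stress change resolved in the fault's slip direction, d_sigma_n is the normal stress change (positive for unclamping) and mu' is an effective friction coefficient. A minimal sketch, with purely illustrative numbers that are not values from Wauthier's analysis:

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """Change in Coulomb failure stress (MPa) on a receiver fault.

    d_shear  : shear stress change resolved in the fault's slip direction
    d_normal : normal stress change (positive = unclamping the fault)
    mu_eff   : effective friction coefficient (0.4 is a common generic choice)
    """
    return d_shear + mu_eff * d_normal

# Illustrative only: a dike opening that adds 0.05 MPa of shear stress
# and unclamps a nearby fault by 0.02 MPa.
dcfs = coulomb_stress_change(0.05, 0.02)
print(f"dCFS = {dcfs:+.3f} MPa ({'promotes' if dcfs > 0 else 'inhibits'} failure)")
```

A positive result means the source has brought the receiver fault closer to failure, which is the criterion the researchers applied when linking the January dike to the October earthquake.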
Earthquakes
2016
February 4, 2016
https://www.sciencedaily.com/releases/2016/02/160204151626.htm
Can slow creep along thrust faults help forecast megaquakes?
In Japan and areas like the Pacific Northwest where megathrust earthquakes are common, scientists may be able to better forecast large quakes based on periodic increases and decreases in the rate of slow, quiet slipping along the fault.
This hope comes from a new study by Japanese and UC Berkeley seismologists looking at the more than 1,000-kilometer-long fault off northeast Japan where the devastating 2011 Tohoku-oki earthquake originated, generating a tsunami that killed thousands. There, the Pacific Plate is trundling under the Japan plate, not only causing megaquakes like the magnitude 9 in 2011, but also giving rise to a chain of Japanese volcanoes.

The scientists studied 28 years of earthquake measurements, looking at quakes of magnitude 2.5 or greater between 1984 and 2011. They discovered 1,515 locations off the coast of Japan where small repeating earthquakes happen -- 6,126 quakes in all.

According to co-author Robert Nadeau, a UC Berkeley seismologist and a fellow with the Berkeley Institute for Data Science (BIDS), an analysis of these quakes found that larger, more destructive earthquakes -- those of magnitude 5 or greater -- occurred much more frequently when the periodic slow slip was fastest. This included the great Tohoku-oki earthquake, which also devastated a nuclear power plant and led to widespread radioactive contamination.

"The persistence of the periodic pattern over time may help us refine earthquake probabilities in the future by taking into account the times of expected slow-slip pulses," he said. "Right now, seismologists give forecasts on a 30-year time frame and assume nothing is changing on a shorter time scale. Our study points out that things are changing, and in a periodic way. So it may be possible for scientists to give shorter time ranges of greater and lower probability for larger events to happen."

The research was led by Naoki Uchida, a seismologist at Tohoku University, and included UC Berkeley seismologist Roland Burgmann, professor of earth and planetary science. They published their findings in the Jan. 29 issue of Science.

Slip, Nadeau said, is the relative motion between two sides of a fault, sometimes but not always resulting in ground shaking.
So-called slow slip, or creep, is fault slip that happens quietly, without generating shaking -- not even microquakes or faint tremors. Regions of a fault that slip quietly are considered to be weak, or uncoupled. But within these uncoupled regions of rock underground there are variously sized patches of fault that are much stronger, or coupled. These patches resist the quiet slip happening around them, only slipping when the pushing and pulling from the surrounding quiet slip stresses them to their breaking point and they "snap" in an earthquake.

"There is a relationship, which we showed here in California, between the time between 'snaps' on the small, strong patches where earthquakes happen and how much slip took place on the quiet fault surrounding them," Nadeau said. "Using this relationship for thousands of repeating earthquakes in Japan, we were able to map out the evolution of slow slip on the megathrust. Then, by studying the pattern of this evolution, we discovered the periodic nature of the megathrust slow slip and its relationship to larger earthquakes."

Nadeau and the late UC Berkeley seismologist Thomas McEvilly showed 12 years ago that periodic slow slip occurred all along the San Andreas Fault, from Parkfield to Loma Prieta, Calif. In 2009 the group also observed deeper, transient and periodic slow slip on the San Andreas -- this time associated with faint shaking called tremor -- and found that it was linked with two larger quakes occurring in 2003 and 2004 at San Simeon and Parkfield, Calif., respectively.

"The phenomenon we found in Japan may not be limited to megathrust zones," he said. "Our 2004 study, more limited in scope than this one, showed a similar periodic slip process and an association between larger quakes -- those of magnitude 3.5 or greater -- and repeating earthquakes along the 170 km stretch of San Andreas Fault that we studied."
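The relationship Nadeau describes, between repeating-earthquake recurrence and the slow slip surrounding each patch, rests on an empirical scaling of slip with seismic moment. The sketch below uses the Nadeau-Johnson (1998) Parkfield calibration, d = 10^-2.36 * M0^0.17 (slip d in cm, moment M0 in dyne-cm); the article does not state which calibration the Japan study used, so treat the numbers as illustrative:

```python
def moment_dyne_cm(mw):
    """Seismic moment (dyne-cm) from moment magnitude (Hanks-Kanamori)."""
    return 10 ** (1.5 * mw + 16.1)

def slip_cm(m0):
    """Fault slip (cm) implied by a repeating earthquake of moment m0,
    using the Nadeau-Johnson (1998) empirical scaling d = 10**-2.36 * m0**0.17."""
    return 10 ** -2.36 * m0 ** 0.17

# Each repeat of a small magnitude-2.5 event implies roughly this much
# quiet slip on the surrounding fault since the previous repeat:
m0 = moment_dyne_cm(2.5)
print(f"M0 = {m0:.2e} dyne-cm -> implied slip ~ {slip_cm(m0):.1f} cm")
```

Dividing the implied slip by the time between repeats gives a local slip rate, which is in essence how the team mapped the evolution of slow slip across the 1,515 repeating-earthquake sites.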
Earthquakes
2016
February 3, 2016
https://www.sciencedaily.com/releases/2016/02/160203145727.htm
Research may explain mysterious deep earthquakes in subduction zones
Geologists from Brown University may have finally explained what triggers certain earthquakes that occur deep beneath the Earth's surface in subduction zones, regions where one tectonic plate slides beneath another.
Subduction zones are some of the most seismically active areas on Earth. Earthquakes in these spots that occur close to the surface can be devastating, like the one that struck Japan in 2011, triggering the Fukushima nuclear disaster. But quakes also occur commonly in the subducting crust as it pushes deep below the surface -- at depths between 70 and 300 kilometers. These quakes, known as intermediate depth earthquakes, tend to be less damaging, but can still rattle buildings.

Intermediate depth quakes have long been something of a mystery to geologists. "They're enigmatic because the pressures are so high at that depth that the normal process of frictional sliding associated with earthquakes is inhibited," said Greg Hirth, professor of earth, environmental, and planetary sciences at Brown. "The forces required to get things to slip just aren't there."

But through a series of lab experiments, Hirth and postdoctoral researcher Keishi Okazaki have shown that as water escapes from a mineral called lawsonite at high temperatures and pressures, the mineral becomes prone to the kind of brittle failure required to trigger an earthquake. "Keishi's experiments were basically the first tests at conditions appropriate for where these earthquakes actually happen in the earth," Hirth said. "They're really the first to show strong evidence for this dehydration embrittlement." The work will be published on February 4, 2016 in the journal.

The experiments were done in what's known as a Griggs apparatus. Okazaki placed samples of lawsonite in a cylinder and heated them through the range of temperatures at which water becomes unstable in lawsonite at high pressures. A piston then increased the pressure until the mineral began to deform.
A tiny seismometer fixed to the apparatus detected sudden cracking in the lawsonite, a signal consistent with brittle failure. Okazaki performed similar experiments using a different mineral, antigorite, which had previously been implicated as contributing to intermediate depth seismicity. In contrast to lawsonite, the antigorite failed more gradually -- squishing rather than cracking -- suggesting that antigorite does not play a role in these quakes.

"That's one of the cool things about this," Hirth said. "For 50 years everyone has assumed this is a process related to antigorite, despite the fact that there wasn't much evidence for it. Now we have good experimental evidence of this dehydration process involving lawsonite."

If lawsonite is indeed responsible for intermediate depth earthquakes, it would explain why such quakes are common in some subduction zones and not others. The formation of lawsonite requires high pressures and low temperatures. It is found in so-called "cold" subduction zones, in which the subducting crust is older and therefore cooler in temperature. One such cold zone is found in northwest Japan. But conditions in "hot" subduction zones, like the Cascadia subduction zone off the coast of Washington state, aren't conducive to the formation of lawsonite.

"In hot subduction zones, we have very few earthquakes in the subducting crust because we have no lawsonite," Okazaki said. "But in cold subduction zones, we have lawsonite and we get these earthquakes."

Ultimately, Hirth says, research like this might help scientists better understand why earthquakes happen at different places under different conditions. "Trying to put into the context of all earthquakes how these processes are working might be important not just for understanding these strange types of earthquakes, but all earthquakes," he said. "We don't really understand a lot of the earthquake cycle.
Predictability is the ultimate goal, but we're still at the stage of thinking about what's the recipe for different kinds of earthquakes. This appears to be one of those recipes."
Earthquakes
2016
February 1, 2016
https://www.sciencedaily.com/releases/2016/02/160201123049.htm
Rapid formation of bubbles in magma may trigger sudden volcanic eruptions
It has long been observed that some volcanoes erupt with little prior warning. Now, scientists have come up with an explanation behind these sudden eruptions that could change the way observers monitor active or dormant volcanoes.
Previously, it was thought that eruptions were triggered by a build-up of pressure caused by the slow accumulation of bubbly, gas-saturated magma beneath volcanoes over tens to hundreds of years. But new research has shown that some eruptions may be triggered within days to months by the rapid formation of gas bubbles in magma chambers very late in their lifetime.

Using the Campi Flegrei volcano near Naples, southern Italy, as a case study, the team of scientists, from the universities of Oxford and Durham in the UK and the Vesuvius Volcano Observatory in Italy, demonstrate this phenomenon for the first time and provide a mechanism to explain the increasing number of reported eruptions that occur with little or no warning. The study is published in the journal.

Lead author Mike Stock, from the Department of Earth Sciences at the University of Oxford, said: 'We have shown for the first time that processes that occur very late in magma chamber development can trigger explosive eruptions, perhaps in only a few days to months. This has significant implications for the way we monitor active and dormant volcanoes, suggesting that the signals we previously thought indicative of pre-eruptive activity -- such as seismic activity or ground deformation -- may in fact show the extension of a dormant period between eruptions.

'Our findings suggest that, rather than seismic activity and ground deformation, a better sign of an impending eruption might be a change in the composition of gases emitted at the Earth's surface. When the magma forms bubbles, the composition of gas at the surface should change, potentially providing an early warning sign.'

The researchers analysed tiny crystals of a mineral called apatite thrown out during an ancient explosive eruption of Campi Flegrei.
This volcano last erupted in 1538 but has recently shown signs of unrest. By looking at the composition of crystals trapped at different times during the evolution of the magma body -- with the apatite crystals in effect acting as 'time capsules' -- the team was able to show that the magma that eventually erupted had spent most of its lifetime in a bubble-free state, becoming gas-saturated only very shortly before eruption. Under these conditions of slow magma chamber growth, earthquakes and ground deformation observed at the surface may not be signs of impending eruption, instead simply tracking the arrival of new batches of magma at depth.

Professor David Pyle, from the Department of Earth Sciences at the University of Oxford and a co-author of the paper, said: 'Now that we have demonstrated that this approach can work on a particular volcano, and given that apatite is a mineral found in many volcanic systems, it is likely to stimulate interest in other volcanoes to see whether there is a similar pattern.

'This research will also help us refine our ideas of what we want to measure in our volcanoes and how we interpret the long-term monitoring signals traditionally used by observers.'

The Campi Flegrei volcano system has had a colourful history. The Romans thought an area called Solfatara (where gas is emitted from the ground) was the home of Vulcan, the god of fire. Meanwhile, one of the craters in the system, Lake Avernus, was referred to as the entrance to Hades in ancient mythology. Additionally, Campi Flegrei has long been a site of geological interest. In his 1830 Principles of Geology, Charles Lyell identified the burrows of marine fossils at the top of the Macellum of Pozzuoli (an ancient Roman market building), concluding that the ground around Naples rises and falls over geological time.
Earthquakes
2016
January 29, 2016
https://www.sciencedaily.com/releases/2016/01/160129170502.htm
Ancient rocks of Tetons formed by continental collisions
University of Wyoming scientists have found evidence of continental collisions in Wyoming's Teton Range, similar to those in the Himalayas, dating to as early as 2.68 billion years ago.
The research was published Jan. 22 in the journal. In fact, the remnants of tectonic activity in old rocks exposed in the Tetons point to the world's earliest known continent-continent collision, says Professor Carol Frost of UW's Department of Geology and Geophysics, lead author of the paper.

"While the Himalayas are the prime example of continent-continent collisions that take place due to plate tectonic motion today, our work suggests plate tectonics operated far, far back into the geologic past," Frost says. The paper's co-authors include fellow UW Department of Geology and Geophysics faculty members Susan Swapp and Ron Frost.

The researchers reached their conclusions by analyzing ancient, exposed granite in the northern Teton Range and comparing it to similar rock in the Himalayas. The rocks were formed from magma produced by what is known as decompression melting, a process that commonly occurs when two continental tectonic plates collide. The dramatically thickened crust extends under gravitational forces, and melting results when deeper crust rises closer to the surface.

While the Tetons are a relatively young mountain range, formed by uplift along the Teton Fault less than 9 million years ago, the rocks exposed there are some of the oldest found in North America. The UW scientists found that the mechanisms that formed the granites of the Tetons and the Himalayas are comparable, but that there are significant differences between the rocks of the two regions. That is due to differences in the composition of the continental crust in Wyoming 2.68 billion years ago compared to crustal plates observed today. Specifically, the ancient crust that melted in the Tetons contained less potassium than the more recently melted crust found in the Himalayas.
Earthquakes
2016
January 25, 2016
https://www.sciencedaily.com/releases/2016/01/160125125503.htm
Shallow earthquakes, deeper tremors along southern San Andreas fault compared by researchers
Seismologists working along California's San Andreas Fault near Cholame and Parkfield now have a better idea of how and where friction changes along the fault to produce both shallow earthquakes and the deeper earth tremors called low-frequency earthquakes (LFEs).
Their report was published online 26 January. There is also a somewhat puzzling five-kilometer-wide gap of seismic quiet between the deepest earthquakes and the shallowest LFEs analyzed in the study, the researchers noted. If this gap holds up under further scrutiny, seismologists will study its possible role in transferring seismic stress during a large earthquake, said Harrington.

Slip along the San Andreas occurs through either aseismic creep or earthquakes in the shallow parts of the fault (less than 15 kilometers deep), while slip takes the form of stable sliding in the deeper part of the fault, lying 35 kilometers below the surface. The zone between these two regions -- where LFEs occur -- is called the brittle-ductile transition zone, where increasing heat and pressure cause rocks to become less prone to fracture and more prone to bending and flowing deformation.

"Our study illuminates a possible gap in activity at the top of the transition zone, between the deeper LFEs and shallower earthquakes, which may be important to the transfer of stress into the seismogenic part of the fault," Harrington noted. "We need to better understand how slip evolves across this boundary throughout the seismic cycle."

Cholame may be best known as the site where iconic actor James Dean died in a 1955 car crash, but the Cholame section of the San Andreas Fault is also where the last major Southern California earthquake on the fault may have originated. The 1857 Fort Tejon earthquake measured magnitude 7.8, rupturing the fault continuously for over 350 kilometers (225 miles).

Harrington and her colleagues deployed a temporary network of 13 seismic stations near Cholame to look more closely at the relationship between LFEs and earthquakes, since the area has been the site of vigorous LFE and tremor activity.
Analyzing data collected by the temporary network and other permanent seismic stations, the scientists were able to precisely locate 34 earthquakes and 34 LFEs that occurred in the area between May 2010 and July 2011. The depth at which fault slip changes from earthquakes to LFEs varies along the strike of the fault, they concluded. These variations could be caused by differences in the type of rock, the presence of fluids or other factors that affect the frictional properties of the fault along its strike.

In their analysis, the seismologists also identified two clusters of small earthquakes, one near the 2004 magnitude 6.0 Parkfield earthquake and one near the northern boundary of the Fort Tejon rupture, just south of Cholame. These clusters may represent areas of mixed frictional properties along the fault, adding support to the idea that earthquakes may originate in these types of frictional environments.

As for the mysterious gap separating earthquakes and LFEs, Harrington said it is puzzling why there wouldn't be a more gradual transition between the two types of events as the fault deepens. One possibility, she says, is that slip is aseismic in this zone. It might also be that the gap is accumulating strain and won't show signs of seismic slip until the strain rates are higher. Harrington and her colleagues are working on further studies that could help detect more types of LFEs and determine whether the gap is real, she said.
Earthquakes
2016
January 21, 2016
https://www.sciencedaily.com/releases/2016/01/160121150103.htm
New study zeros in on plate tectonics' start date
Earth has some special features that set it apart from its close cousins in the solar system, including large oceans of liquid water and a rich atmosphere with just the right ingredients to support life as we know it. Earth is also the only planet that has an active outer layer made of large tectonic plates that grind together and dip beneath each other, giving rise to mountains, volcanoes, earthquakes and large continents of land.
Geologists have long debated when these processes, collectively known as plate tectonics, first got underway. Some scientists propose that the process began as early as 4.5 billion years ago, shortly after Earth's formation. Others suggest a much more recent start, within the last 800 million years. A study from the University of Maryland provides new geochemical evidence for a middle ground between these two extremes: an analysis of trace element ratios that correlate with magnesium content suggests that plate tectonics began about 3 billion years ago. The results appear in the January 22, 2016 issue of the journal.

"By linking crustal composition and plate tectonics, we have provided first-order geochemical evidence for the onset of plate tectonics, which is a fundamental Earth science question," said Ming Tang, a graduate student in geology at UMD and lead author of the study. "Because plate tectonics is necessary for the building of continents, this work also represents a further step in understanding when and how Earth's continents formed."

The study zeros in on one key characteristic of Earth's crust that sets it apart geochemically from other terrestrial planets in the solar system. Compared with Mars, Mercury, Venus and even our own moon, Earth's continental crust contains less magnesium. Early in its history, however, Earth's crust more closely resembled its cousins, with a higher proportion of magnesium. At some point, Earth's crust evolved to contain more granite, a magnesium-poor rock that forms the basis of Earth's continents. Many geoscientists agree that the start of plate tectonics drove this transition by dragging water underneath the crust, which is a necessary step to make granite.

"You can't have continents without granite, and you can't have granite without taking water deep into the Earth," said Roberta Rudnick, former chair of the Department of Geology at UMD and senior author on the study.
Rudnick, who is now a professor of earth sciences at the University of California, Santa Barbara, conducted this research while at UMD. "So at some point plate tectonics began and started bringing lots of water down into the mantle. The big question is when did that happen?"

A logical approach would be to look at the magnesium content in ancient rocks formed across a wide span of time, to determine when this transition toward low-magnesium crustal rocks began. However, this has proven difficult because the direct evidence -- magnesium -- has a pesky habit of washing away into the ocean once rocks are exposed to the surface.

Tang, Rudnick and Kang Chen, a graduate student at China University of Geosciences on a one-and-a-half-year research visit to UMD, sidestepped this problem by looking at trace elements that are not soluble in water. These elements -- nickel, cobalt, chromium and zinc -- stay behind long after most of the magnesium has washed away. The researchers found that the ratios of these elements hold the key: higher ratios of nickel to cobalt and of chromium to zinc both correlate with higher magnesium content in the original rock.

"To our knowledge, we are the first to discover this correlation and use this approach," Tang said. "Because the ratios of these trace elements correlate to magnesium, they serve as a very reliable 'fingerprint' of past magnesium content."

Tang and his coauthors compiled trace element data from a variety of ancient rocks that formed in the Archean eon, the time period between 4 and 2.5 billion years ago, and used it to determine the magnesium content of the rocks when they first formed. They used these data to construct a computer model of the early Earth's geochemical composition. The model accounted for how magnesium (specifically, magnesium oxide) content in the crust changed over time. The results suggest that 3 billion years ago, the Earth's crust had roughly 11 percent magnesium oxide by weight.
Within a half billion years, that number had dropped to about 4 percent, which is very close to the 2 or 3 percent magnesium oxide seen in today's crust. This suggested that plate tectonics began about 3 billion years ago, giving rise to the continents we see today.

"It's really kind of a radical idea, to suggest that continental crust in the Archean had that much magnesium," said Rudnick, pointing out that Tang was the first to work out the correlation between trace element ratios and magnesium. "Ming's discovery is powerful because he found that trace insoluble elements correlate with a major element, allowing us to address a long-standing question in Earth history."

"Because the evolution of continental crust is linked to many major geological processes on Earth, this work may provide a basis for a variety of future studies of Earth history," Tang said. "For example, weathering of this magnesium-rich crust may have affected the chemistry of the ancient ocean, where life on Earth evolved. As for the onset of plate tectonics, I don't think this study will close the argument, but it certainly adds a compelling new dimension to the discussion."
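The "fingerprint" logic can be illustrated with a toy version of the proxy: fit a linear model of MgO content against the water-insoluble ratios Ni/Co and Cr/Zn on reference rocks whose MgO is known, then apply it to an ancient sample whose magnesium has long since washed away. All numbers below are invented for illustration; they are not the study's calibration data.

```python
import numpy as np

# Hypothetical reference rocks: (Ni/Co ratio, Cr/Zn ratio, MgO wt%)
ref = np.array([
    [1.0, 1.0,  4.0],
    [2.0, 2.5,  7.5],
    [3.0, 3.0, 10.0],
])

# Least-squares fit of MgO against the two ratios (with an intercept term)
X = np.column_stack([np.ones(len(ref)), ref[:, 0], ref[:, 1]])
coef, *_ = np.linalg.lstsq(X, ref[:, 2], rcond=None)

def mgo_estimate(ni_co, cr_zn):
    """Estimate original MgO (wt%) from the fitted trace-element proxy."""
    return float(coef @ np.array([1.0, ni_co, cr_zn]))

# Apply the proxy to a hypothetical Archean sample
print(f"Estimated MgO: {mgo_estimate(2.8, 3.7):.1f} wt%")
```

The appeal of the approach is that the predictors survive weathering: the fit is trained where MgO is directly measurable and then extrapolated to rocks where it no longer is.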
Earthquakes
2016
January 19, 2016
https://www.sciencedaily.com/releases/2016/01/160119103336.htm
Diamonds used to 'probe' ancient Earth
Diamonds dug up from ancient rock formations in the Johannesburg area, between 1890 and 1930 -- before the industrialisation of gold mining -- have revealed secrets of how Earth worked more than 3.5 billion years ago.
The three diamonds, which were extracted from the 3 billion-year-old Witwatersrand Supergroup -- the rock formation that is host to the famous Johannesburg gold mines -- were investigated by Dr. Katie Smart, Prof. Susan Webb and Prof. Lewis Ashwal from Wits University, Prof. Sebastian Tappe from the University of Johannesburg, and Dr. Richard Stern from the University of Alberta (Edmonton, Canada), to study when modern-style plate tectonics began to operate on planet Earth. The diamonds were generously provided by Museum Africa, located in Johannesburg, with the assistance of curator Katherine James. "Because diamonds are some of the hardest, most robust material on Earth, they are perfect little time capsules and have the capacity to tell us what processes were occurring extremely early in Earth's history," says Dr. Katie Smart, a lecturer at the Wits School of Geoscience and the lead researcher on the paper, "Early Archaean tectonics and mantle redox recorded in Witwatersrand diamonds." Earth is approximately 4.5 billion years old, and while a rock record exists from about 4 billion years ago, the complex preservational history of the most ancient rocks exposed on Earth's surface has led to a heated debate among geoscientists on when plate tectonics began operating on Earth. Many researchers believe plate tectonics began in the Archaean (the eon that took place from 4 to 2.5 billion years ago), although the exact timing is highly contested. While the diamonds of this study were found in 3 billion-year-old sedimentary rocks, diamond formation occurred much deeper, within Earth's mantle. Additionally, based on the nitrogen characteristics of the diamonds, they also formed much earlier, around 3.5 billion years ago.
Transport of the diamonds to the surface of Earth by kimberlite-like volcanism, followed by their voyage across the ancient Earth surface and into the Witwatersrand basin, occurred between 3.5 and 3 billion years ago. By using an ion probe to analyse the carbon and nitrogen isotope compositions of the Witwatersrand diamonds, which have been pristinely preserved for more than three billion years, Smart and her team found that plate tectonics was likely in operation on Earth as early as 3.5 billion years ago. "We can use the carbon and nitrogen isotope compositions of the diamonds to tell us where the source material involved in the formation of the Witwatersrand diamonds over 3 billion years ago came from," says Smart. "The nitrogen isotope composition of the Witwatersrand diamonds indicated a sedimentary source (nitrogen derived from Earth's surface) and this tells us that the nitrogen incorporated in the Witwatersrand diamonds did not come from Earth's mantle, but that it was rather transported from Earth's surface into the upper mantle through plate tectonics. This is important because the nitrogen trapped in the Witwatersrand diamonds indicates that plate tectonics, as we recognise it today, was operating on ancient Archaean Earth, and actively transported material at Earth's surface deep into the mantle." Earth as a planet is unique because of the dynamic process of plate tectonics that constantly transports surface material into Earth's mantle, which extends from about 7 km to more than 2,800 km below Earth's surface. The process is driven by both convection cells within Earth's mantle and the character of crustal plates at Earth's surface: new oceanic crustal plates form at spreading centres at mid-ocean ridges and are pushed apart, while older, cooler and denser crust at convergent plate margins is pulled, or sinks, into the mantle at subduction zones.
The subduction of crustal plates into the mantle can also carry sediments and organic material deep into Earth's interior. The plate tectonic process is vital for shaping Earth as we know it: the activity of plate tectonics causes earthquakes and volcanic eruptions, and is responsible for constructing Earth's landscapes, such as deep-sea trenches and mountain ranges on the continents. "Various researchers have tried to establish when exactly plate tectonics started on Earth, but while there are many investigations of ancient rocks on Earth's surface -- like the 3.5 billion year old Barberton Greenstone Belt here in South Africa, or the 4 billion year old Acasta Gneiss in northwest Canada -- we are looking at the problem from a different viewpoint -- by investigating minerals derived from Earth's mantle," says Smart. "We are not the first research group to study diamonds in order to tell when plate tectonics began, but our study of confirmed Archaean diamonds has suggested that plate tectonics was in operation by at least 3.5 billion years ago." The green Witwatersrand diamonds were found in the Witwatersrand conglomerate, where the gold was found that led to the establishment of the city of Johannesburg. A number of these diamonds were found between 1890 and 1930, when men were still mining by hand with pickaxes. After the industrialisation of the mines in the 1930s, most of the diamonds in the conglomerate were crushed to dust. For this reason, the Witwatersrand diamonds are extremely rare. The Witwatersrand conglomerate is known to be at least three billion years old. The diamonds that are found in the conglomerate are known as "placer" diamonds.
These diamonds did not originate in the conglomerate, but were transported from their original kimberlite sources by secondary means, such as rivers. Most diamonds are believed to be younger than three billion years old, but as the Witwatersrand conglomerate is known to be three billion years old, the diamonds found in it must have formed more than 3 billion years ago. Thus, they can be referred to as "confirmed ancient diamonds."
Earthquakes
2016
January 19, 2016
https://www.sciencedaily.com/releases/2016/01/160119103333.htm
Scientists detect deep carbon emissions associated with continental rifting
Scientists at the University of New Mexico conducted research to study carbon emissions through fault systems in the East African Rift (EAR) in an effort to understand carbon emissions from Earth's interior and how they affect the atmosphere.
The research, funded by the National Science Foundation (NSF) Tectonics Program, is directed by UNM Professor Tobias Fischer and is part of a continued effort to better quantify global emissions of CO2. It was led by UNM Ph.D. student Hyunwoo Lee, the lead author of the paper titled "Massive and prolonged deep carbon emissions associated with continental rifting." The EAR is the world's largest active continental rift and comprises distinct western and eastern sectors; several of its active volcanoes emit large volumes of CO2. Gas samples collected along fault zones in the Magadi-Natron basin showed elevated CO2. The data from all samples were then compared to gas data from the active volcano Oldoinyo Lengai and found to have carbon isotope compositions that indicated a strong magmatic contribution to the observed CO2. James Muirhead, a doctoral student at the University of Idaho, focused on the relationship between the structure of the faults and the gas they released, including what controls carbon dioxide flow from depth and what volumes of gas the faults release. Combining the CO2 measurements, the team estimated that about 4 megatonnes per year of mantle-derived CO2 are released in the study area. The authors note that while large volcanic eruptions are often argued to instantly transfer significant amounts of CO2 to the atmosphere, widespread continental rifting and super-continent breakup could produce massive, long-term CO2 release. Large-scale rifting events could play a previously unrecognized role in heating up the atmosphere and perhaps ending global ice ages. Cindy Ebinger, a professor of earth and environmental sciences at the University of Rochester, coordinated field activities near the Kenya-Tanzania border and analyzed earthquake patterns within the rift zone. "The unique coupling of gas chemistry and earthquake studies made it possible to discover the escape of gas along permeable fault zones that serve as conduits to the surface," said Ebinger. "The work also allowed us to document the process of crustal growth through the formation of igneous rocks from magma in early-stage continental rift zones." Lee says the scientists plan further diffuse CO2 measurements. Because some geological settings, such as fault zones, have received little attention, he suggests, global CO2 emissions from Earth's interior may be underestimated.
Earthquakes
2016
January 12, 2016
https://www.sciencedaily.com/releases/2016/01/160112144424.htm
New geological evidence aids tsunami hazard assessments from Alaska to Hawaii
New data on frequent large tsunamis at a remote island near Dutch Harbor, Alaska, provide geological evidence to aid tsunami hazard preparedness efforts around the Pacific Rim. Recent fieldwork in Alaska's Aleutian Islands suggests that a presently "creeping" section of the Aleutian Subduction Zone fault could potentially generate an earthquake great enough to send a large tsunami across the Pacific to Hawaii.
These findings were published by a team of scientists led by U.S. Geological Survey geologist Rob Witter. Creeping faults exhibit slow, continuous motion. Because the two tectonic plates are not fully locked by friction, it is unclear whether or not creeping faults can host large earthquakes. Geological observations at Stardust Bay, Alaska, point toward previously unrecognized tsunami sources along a presently creeping part of the Aleutian Subduction Zone. Prevailing scientific models about earthquake generation are challenged when it comes to forecasting earthquake probabilities where observations indicate a creeping megathrust (the gently dipping fault between converging tectonic plates, where one plate is thrust below the other). Usually, scientific models forecast the highest seismic hazard where the tectonic plates are locked together. The study site, Stardust Bay, faces a creeping part of the eastern Aleutian Subduction Zone, which is sandwiched between the rupture areas of historical earthquakes in 1946 and 1957 that generated tsunamis with devastating consequences for coastal communities around the Pacific Ocean. This study is the first to identify geological evidence for repeated prehistoric tsunamis along a creeping part of the eastern Aleutian subduction zone located between the 1946 and 1957 earthquakes. The new evidence includes six sand sheets deposited up to 15 meters (or 50 feet) above sea level by past large tsunamis that probably were generated by great Aleutian earthquakes, and indicates a previously unknown tsunami source that poses a new hazard to the Pacific basin. Using hand-driven cores, augers, and shovels to reveal the sediments blanketing a lowland facing the Pacific Ocean, and using radiocarbon dating to estimate the times of sand sheet deposition, scientists established a geologic history of past large tsunamis.
The youngest sand sheet, along with modern drift logs stranded as far as 805 meters (or half a mile) inland and 18 meters (or 60 feet) above sea level, records a large tsunami triggered by the magnitude 8.6 Andreanof Islands earthquake in 1957. Older sand sheets resulted from tsunamis that may have been even larger than the 1957 tsunami. The oldest tsunami sand layer was deposited approximately 1,700 years ago, and the average interval between tsunami deposits is 300-340 years. These geological observations indicate large tsunamis in the eastern Aleutians have recurred every 300-340 years on average, and provide additional field-based information that is relevant to new tsunami evacuation zone maps for Hawaii.
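The quoted recurrence interval follows from simple arithmetic on the dated deposits: six sand sheets spanning roughly 1,700 years give five inter-event intervals. The deposit ages below are illustrative placeholders consistent with the article, not the study's actual radiocarbon dates.

```python
# Back-of-envelope recurrence interval for the six tsunami sand sheets.
# Ages are hypothetical stand-ins (years before present); only the
# oldest age (~1,700 yr) and the 300-340 yr average come from the article.
deposit_ages = [60, 400, 740, 1080, 1400, 1700]

intervals = [later - earlier for earlier, later in zip(deposit_ages, deposit_ages[1:])]
mean_interval = sum(intervals) / len(intervals)
print(round(mean_interval))  # 328, inside the reported 300-340 year range
```

Six deposits bracket five intervals, which is why the oldest age divided by five, not six, gives the ~340-year upper bound.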
Earthquakes
2016
January 12, 2016
https://www.sciencedaily.com/releases/2016/01/160112093417.htm
Researchers to participate in flood reconnaissance mission
University of Arkansas engineering researchers -- experts in the study of how soil reacts to stress caused by earthquakes or floods -- are participating in a multi-institutional research mission to document the effects of recent, severe flooding in the Midwest.
Michelle Bernhardt, assistant professor of civil engineering, and graduate student Behdad Mofarrai will travel to parts of Arkansas and Missouri this week to photograph and assess flood damage and the performance of levee systems. They will work alongside the U.S. Army Corps of Engineers, local levee districts and researchers at other universities as part of a Geotechnical Extreme Events Reconnaissance Association team.Sponsored by the National Science Foundation, the extreme event team gathers perishable data immediately following extreme natural events -- earthquakes, tsunamis, hurricanes, landslides and floods -- to better understand the response of infrastructure and complex transportation systems. Researchers map and survey both damaged and undamaged areas to provide data that will be used to improve engineering and design of these systems."Our goal is to gather as much time-sensitive data as possible," Bernhardt said. "Flooding events are interesting because data can sometimes be washed away or destroyed during repair activities, so it is critical that thorough documentation be executed during the event and shortly thereafter. Gathering this data from the field is critical to further our understanding of how levees perform in a given event and under what mechanisms failures occur." For this mission, scientists and engineers will work with the Corps and state and local organizations to document geotechnical, hydraulic and climatic impact of the flooding and how it affects public policy. Their work will augment a systems-based approach to understanding the overall performance of the central-Midwest flood-protection system.Bernhardt uses computer modeling to digitally simulate small particles, such as soil, and show how they react to displacements and stress, such as those caused by an earthquake or flood. These particle-to-particle interactions improve understanding of large-scale interaction and how materials react to different loading mechanisms.
Her research contributes to furthering our knowledge of fundamental soil behavior.
Earthquakes
2016
January 11, 2016
https://www.sciencedaily.com/releases/2016/01/160111161055.htm
Expedition probes undersea magma system
A team of University of Oregon scientists is home after a month-long cruise in the eastern Mediterranean, but this was no vacation. The focus was the plumbing system of magma underneath the island of Santorini, formed by the largest supervolcanic eruption in the past 10,000 years.
The expedition -- led by UO geologists Emilie Hooft and Doug Toomey under a National Science Foundation grant -- included British, Greek and U.S. researchers on board the U.S. Research Vessel Marcus G. Langseth. Five UO graduate students and one undergraduate student were on board, and another UO graduate student helped install seismometers on the nearby island of Anafi. Data collected with seismometers will now be analyzed using large parallel computers to build maps and understand the structure of the magma plumbing system that lies 10 to 20 kilometers, or six to 12 miles, under the seafloor. Little is known about magmatic systems at such depths. "The goal is to understand the deep roots, or magma plumbing system, of an arc volcano," Hooft said. "We have some idea of how shallow magma bodies are shaped, but the magmatic system that lies in the deep crust remains poorly understood and difficult to study. It is in this region that magmas from the mantle undergo chemical processes to form the rock compositions that presumably dominate the continental lower crust." Santorini, besides being an idyllic vacation spot, is perfect for tackling the problem of imaging the deeper roots of a volcano, Hooft said. The island recently has experienced significant unrest linked to magma recharge, including inflation of the ground and intense earthquake swarms. Since Santorini is a semi-submerged volcanic system, the scientific team was able to collect a very large 3D marine-land seismic dataset using seismic-sound detecting equipment towed behind the R/V Marcus G. Langseth, which is the most sophisticated seismic vessel in the world's academic fleet. The equipment generates bubbles of compressed air to produce pressure waves.
At the seafloor, the waves enter the rocks and return a clean seismic signal as they travel under the seabed to the roots of the volcanic system. If successful, the data will reveal the structure of the deep magma system and its surroundings in 10 times more detail than for any other volcano on Earth. Researchers dropped 91 specially designed seismometers to the seafloor and installed another 65 land seismometers on Santorini and nearby islands. The entire onshore-offshore network recorded the seismic signals more than 14,000 times. The scientists mapped new regions of the seafloor, revealing the structure of faults and landslides between the islands of Santorini and Amorgos. These measurements may help resolve questions related to the largest 20th century earthquake in Greece, which occurred in 1956, and its accompanying tsunami.
Earthquakes
2016
January 11, 2016
https://www.sciencedaily.com/releases/2016/01/160111135439.htm
Thousands of landslides in Nepal earthquake raise parallels for Pacific Northwest
Research teams have evaluated the major magnitude 7.8 subduction zone earthquake in Gorkha, Nepal, in April 2015, and identified characteristics that may be of special relevance to the future of the Pacific Northwest.
Most striking was the enormous number and severity of landslides. Many people understand the damage that can be caused to structures, roads, bridges and utilities by ground shaking in these long-lasting types of earthquakes, such as the one that's anticipated on the Cascadia Subduction Zone between northern California and British Columbia. But following the Nepal earthquake -- even during the dry season, when soils were the most stable -- there were also tens of thousands of landslides in the region, according to estimates in the reconnaissance team's recent report. Other estimates, based on the broader relationship between landslides and earthquake magnitude, suggest the Nepal earthquake might have caused between 25,000 and 60,000 landslides. The subduction zone earthquake anticipated in the Pacific Northwest is expected to be larger than the event in Nepal. Ben Mason, a geotechnical engineer and assistant professor in the College of Engineering at Oregon State University, was a member of the Geotechnical Extreme Event Reconnaissance team that explored the Nepal terrain. He said that event made clear that structural damage is only one of the serious threats raised by subduction zone earthquakes. "In the Coast Range and other hilly areas of Oregon and Washington, we should expect a huge number of landslides associated with the earthquake we face," Mason said. "And in this region our soils are wet almost all year long, sometimes more than others.
Each situation is different, but soils that are heavily saturated can have their strength cut in half." Wet soils will also increase the risk of soil liquefaction, Mason said, which could be pervasive in the Willamette Valley and many areas of Puget Sound, Seattle, Tacoma, and Portland, especially along the Columbia River. Scientists have discovered that the last subduction zone earthquake to hit the Pacific Northwest was in January 1700, when -- like now -- soils probably would have been soggy from winter rains and most vulnerable to landslides. The scientific study of slope stability is still a work in progress, Mason said, and landslides are often easier to explain after they have occurred than to predict beforehand. But continued research on earthquake events such as those in Nepal may help improve the ability to identify areas most vulnerable to landslides, he said. Models can be improved and projections made more accurate. "If you look just at the terrain in some parts of Nepal and remove the buildings and people, you could think you were looking at the Willamette Valley," Mason said. "There's a lot we can learn there." In Nepal, the damage was devastating. Landslides triggered by ground shaking were the dominant geotechnical effect of the April earthquake, the researchers wrote in their report, as slopes weakened and finally gave way. Landslides caused by the main shock or aftershocks blocked roads, dammed rivers, damaged or destroyed villages, and caused hundreds of fatalities. The largest and most destructive event, the Langtang debris avalanche, began as a snow and ice avalanche and gathered debris that became an airborne landslide surging off a 500-meter-tall cliff.
An air blast from the event flattened the forest in the valley below, moved 2 million cubic meters of material and killed about 200 people. Surveying the damage after the event, Mason said one of his most compelling impressions was the way people helped each other. "Nepal is one of the poorest places, in terms of gross domestic product, that I've ever visited," he said. "People are used to adversity, but they are culturally rich. After this event it was amazing how their communities bounced back, people helped treat each other's injuries and saved lives. As we make our disaster plans in the Pacific Northwest, there are things we could learn from them, both about the needs for individual initiative and community response." Aside from landslides, many lives were lost in collapsing structures in Nepal, often in homes constructed of rock, brick or concrete, and frequently built without adequate enforcement of building codes, the report suggested. Overall, thousands of structures were destroyed. There are estimates that about 9,000 people died, and more than 23,000 were injured. The earthquake even triggered an avalanche on Mount Everest that killed at least 19 people. The reconnaissance effort in Nepal was made possible by support from the National Science Foundation, the U.S. Geological Survey, the U.S. Agency for International Development, the OSU College of Engineering, and other agencies and universities around the world.
Earthquakes
2016
January 11, 2016
https://www.sciencedaily.com/releases/2016/01/160111135046.htm
Scientists pinpoint unbroken section of Nepal fault line and show why Himalayas grow
An international team of scientists has shed new light on the earthquake that devastated Nepal in April 2015, killing more than 8,000 people.
In a newly published study, researchers from the UK's Centre for the Observation and Modelling of Earthquakes, Volcanoes and Tectonics (COMET), along with academics from the USA and France, demonstrate that the rupture on the fault stopped 11km below Kathmandu. This indicates that another major earthquake could take place within a shorter timeframe than the centuries that might be expected for the area. Lead author Dr John Elliott of Oxford University, a member of the COMET team, said: 'Nepal has some of the highest mountain ranges in the world that have been built up over millions of years because of the collision of India with Asia. But the way in which mountains grow and when this occurs is still debated. 'We have shown that the fault beneath Nepal has a kink in it, creating a ramp 20km underground. Material is continually being pushed up this ramp, which explains why the mountains were seen to be growing in the decades before the earthquake. 'The earthquake itself then reversed this, dropping the mountains back down again when the pressure was released as the crust suddenly snapped in April 2015. 'Using the latest satellite technology, we have been able to precisely measure the land height changes across the entire eastern half of Nepal. The highest peaks dropped by up to 60cm in the first seconds of the earthquake.' Mount Everest, at more than 50km east of the earthquake zone, was too far away to be affected by the subsidence seen in this event. Dr Pablo Gonzalez of the University of Leeds, a member of the COMET team, said: 'We successfully mapped the earthquake motion using satellite technology on very difficult mountainous terrain. We developed new processing algorithms to obtain clearer displacement maps, which revealed the most likely fault geometry at depth.
Such geometry makes sense of the puzzling geological observations.' Another key finding of the study is that the rupture in the fault stopped 11km below Kathmandu, leaving an upper portion that remains unbroken. Dr Elliott said: 'Using the high-resolution satellite images, we have shown that only a small amount of the earthquake reached the surface. This is surprising for such a big earthquake, which we would normally expect to leave a major fault trace in the landscape. This makes it a challenge when trying to find past earthquake ruptures, as they could be hidden. 'We found that the rupture from April's earthquake stopped 11km beneath Kathmandu, and that this sudden break is because of damage to the fault from interactions with older faults in the region. This is important because the upper half of the fault has not yet ruptured, but is continuously building up more pressure over time as India continues to collide into Nepal. 'As this part of the fault is nearer the surface, the future rupture of this upper portion has the potential for a much greater impact on Kathmandu if it were to break in one go in a similar-sized event to that of April 2015. 'Work on other earthquakes has suggested that when a rupture stops like this, it can be years or decades before it resumes, rather than the centuries that might usually be expected. 'Unfortunately, there is no way of predicting precisely when another earthquake will take place. It's simply a case of countries and cities making sure they are well prepared for when it does happen.' The research was a collaboration between scientists from the University of Oxford, the University of Leeds, the University of Cambridge, the California Institute of Technology, PSL Research University (France), and engineering consultancy Arup. The majority of the work was funded by the Natural Environment Research Council (NERC).
Earthquakes
2016
January 5, 2016
https://www.sciencedaily.com/releases/2016/01/160105132720.htm
First ever digital geologic map of Alaska
A new digital geologic map of Alaska is being released today, providing land users, managers and scientists with geologic information for the evaluation of land use in relation to recreation, resource extraction, conservation, and natural hazards.
For the first time, this new geologic map of Alaska incorporates one of geology's most significant paradigm shifts, the theory of plate tectonics, into a fully digital product. The map gives visual context to the abundant mineral and energy resources found throughout the state in a beautifully detailed and accessible format. "I am pleased that Alaska will now have a state-wide digital map detailing both surface and subsurface geologic resources and conditions," said newly confirmed USGS director Suzette Kimball. "This geologic map provides important information for the mineral and energy industries for exploration and remediation strategies. It will enable resource managers and land management agencies to evaluate resources and land use, and to prepare for natural hazards, such as earthquakes." "The data contained in this digital map will be invaluable," said National Park Service Director Jonathan B. Jarvis. "It is a great resource and especially enhances the capacity for science-informed decision making for natural and cultural resources, interpretive programs, and visitor safety." "A better understanding of Alaska's geology is vital to our state's future. This new map makes a real contribution to our state, from the scientific work it embodies to the responsible resource production it may facilitate. Projects like this one underscore the important mission of the U.S. Geological Survey, and I'm thankful to them for completing it," said Sen. Lisa Murkowski, R-Alaska. This map is a completely new compilation, carrying the distinction of being the first 100 percent digital statewide map. Alaska's new map reflects the changes in our modern understanding of geology as it builds on the past. More than 750 references were used in creating the map, some as old as 1908 and as new as 2015. It shows an uncommon level of detail for state geologic maps.
Being 100 percent digital, this map has multiple associated databases that allow the creation of a variety of derivative maps and other products. "This work is an important synthesis that will both increase public access to critical information and enhance the fundamental understanding of Alaska's history, natural resources and environment," said Mark Myers, Commissioner of Alaska's Department of Natural Resources. "I applaud the collaborative nature of this effort, including the input provided by the Alaska Division of Geological and Geophysical Surveys, which will be useful for natural disaster preparation, resource development, land use planning and management, infrastructure and urban planning and management, education, and scientific research." Geologists and resource managers alike can utilize this latest geologic map of Alaska, but even the lay person will enjoy the visual feast of attractive color patterns on the map, which shed light on the state's geologic past and present. Nearly 20 years in the making, this map marks the 200th anniversary of the release of the world's first geologic map, by William Smith of England in 1815. More than any other area of the United States, Alaska reflects a wide range of past and current geologic environments and processes. Geologic processes remain very active in Alaska today, with many active volcanoes, frequent earthquakes, receding and advancing glaciers, and visible climate impacts. "This map is the continuation of a long line of USGS maps of Alaska, reflecting ever increasing knowledge of the geology of the state," said Frederic Wilson, research geologist with the USGS and lead author of the new map.
"In the past, starting in 1904, the maps came out about every 20 years; the 35-year gap between this edition and the last has been a time of major new mapping efforts by the USGS, as well as a revolution in the science of geology through the paradigm shift to plate tectonics, and the development of digital methods."
Earthquakes
2016
December 23, 2015
https://www.sciencedaily.com/releases/2015/12/151223134119.htm
Geologic formation could hold clues to melting glacier floodwaters
Geologists investigating an unusual landform in the Wabash River Valley in southern Illinois expected to find seismic origins, but instead found the aftermath of rushing floodwaters from melting Midwestern glaciers after the last ice age. The finding could give clues to how floodwaters may behave as glacier melt increases in places like Greenland and Iceland.
Illinois State Geological Survey researchers Timothy Larson, Andrew Phillips and Scott Elrick recently published their findings. Along the western edge of the Wabash River Valley lies a scarp, or short cliff, about 10 to 20 feet high, running in a nearly straight line for about 6 miles. The Meadow Bank scarp runs nearly perfectly parallel to a fault zone 1 mile to the west. Geologists suspected the Meadow Bank was formed by some past seismic activity along the fault, perhaps an earthquake that caused the scarp to shear upwards. In an effort to assess earthquake hazard, the ISGS researchers set out to probe the relationship between the fault and the scarp and instead found a deeper mystery: There was no relationship at all. "This was very surprising to us," said Larson. "You look at it, you see how parallel it is to the fault. We know that historically there were earthquakes in the area. It just begs to be related. But it turns out it's not possible." The researchers talked to miners and studied records from the White County Coal Company, which mined coal throughout the area around and under the Meadow Bank. They confirmed that there was no fault or seismic activity hidden under the bank. Thus, the researchers set out to answer a new question: How was such a long, straight scarp formed, if not tectonically? The researchers bored into the ground to take samples along and around the Meadow Bank. They found evidence that the scarp was formed by erosion, and concluded that the type of erosion that could produce such a striking, straight feature had to come from a quick, strong force -- such as a flood surge from a melting glacier. "Looking at the layers in the sediment, you can trace back to big floods at the end of the last glacial time," said Larson. "As the glaciers melt, the water may build up in lakes along the glacier edge until it gets so deep that it overflows and rushes out in a big discharge. We see this happening today in places like Iceland.
So at some point, a glacier flushed a huge slug of water, full of sediment debris, through the Wabash River Valley. It may have happened several times. It washed everything out and formed this straight shot down through the valley."

The researchers hope that further study of the Meadow Bank could provide clues about the glacial floods -- where they originated, what kind of debris they carried and how their course was set. This could provide insight to geologists studying glaciers melting today, such as those in Greenland, Iceland, Canada and New Zealand, as they try to project how the meltwater could behave and the possible effects on the surrounding landscape.
Earthquakes
2015
December 23, 2015
https://www.sciencedaily.com/releases/2015/12/151223134106.htm
Dating historic activity at Oso site shows recurring major landslides
The large, fast-moving mudslide that buried much of Oso, Washington in March 2014 was the deadliest landslide in U.S. history. Since then, it's been revealed that this area has experienced major slides before, but it's not known how long ago they occurred.
University of Washington geologists analyzed woody debris buried in earlier slides and used radiocarbon dating to map the history of activity at the site. The findings were published online in the journal

"The soil in this area is all glacial material, so one hypothesis is the material could have fallen apart in a series of large landslides soon after the ice retreated, thousands of years ago," said corresponding author Sean LaHusen, a UW doctoral student in Earth and space sciences. "We found that that's not the case -- in fact, landslides have been continuing in recent history."

The study establishes a new method to date all the previous landslides at a particular location. The method shows that the slopes in the area around Oso have collapsed on average once every 500 years, and at a higher rate of about once every 140 years over the past 2,000 years.

"This was well known as an area of hillslope instability, but the question was: 'Were the larger slides thousands of years old or hundreds of years old?' Now we can say that many of them are hundreds of years old," said co-author Alison Duvall, a UW assistant professor of Earth and space sciences.

LaHusen had not yet begun his graduate studies when he asked about studying the history of geologic activity at the Oso site. In late summer of 2014, the researchers began their work wading along riverbanks to look for preserved branches or trees that could be used to date previous landslides.

"When you have a large, catastrophic landslide, it can often uproot living trees, which kills them and also encapsulates them in the landslide mass," Duvall said. "If you can find them in the landslide mass, you can assume that they were killed by the landslide, and thus you can date when the landslide occurred."

The team managed to unearth samples of wood buried in the Rowan landslide, just downstream of the Oso site, and the Headache Creek landslide, just upriver of the 2014 slide.
Results from several debris samples show that the Rowan landslide, approximately five times the size of the Oso slide, took place just 300 to 694 years ago. The Headache Creek landslide is within a couple hundred years of 6,000 years old.

Previous UW research had shown a history of geologic activity at the Oso site, including previous major landslides and a recent small slide at the same slope that collapsed in 2014. But while the position of past slides and the degree of surface erosion can show the order in which the older slides happened, it had not been possible to give a date for the past events.

The new study uses the radiocarbon dates for two slides to establish a roughness curve to date other events along a 3.7-mile (6-kilometer) stretch of the north fork of the Stillaguamish River. A roughness curve uses the amount of surface erosion to establish each slide's age. The two dates put firm limits on the curve, so that other nearby slides can be dated from their roughness characteristics without having to find material buried inside each mass of soil.

"This is the first time this calibrated surface dating method has been used for landslide chronologies, and it seems to work really well," LaHusen said. "It can provide some information about how often these events recur, which is the first step toward a regional risk analysis."

Applying the new method to other locations would require gathering samples for each area, they cautioned, because each site has its own soil composition and erosion characteristics.

It's not known whether the findings for the Oso site's history would apply to other parts of the Stillaguamish River, Duvall said, or to other places in Washington state. The researchers are still studying debris from other locations.
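The two-point calibration idea behind the roughness curve can be sketched numerically. This is a minimal illustration only: it assumes, purely for demonstration, that a dimensionless roughness index decays exponentially with age, and the roughness values below are invented; only the two radiocarbon ages echo the article.

```python
import math

# Calibrate a roughness-vs-age curve r(t) = r0 * exp(-k * t) from the
# two radiocarbon-dated slides, then invert it to date undated slides.
# Roughness values are hypothetical placeholders, not measured data.
calibration = [
    (500.0, 0.80),   # Rowan slide: ~500 yr old, assumed roughness 0.80
    (6000.0, 0.25),  # Headache Creek: ~6,000 yr old, assumed roughness 0.25
]

(t1, r1), (t2, r2) = calibration
k = math.log(r1 / r2) / (t2 - t1)   # decay constant fixed by the two anchors
r0 = r1 * math.exp(k * t1)          # extrapolated roughness at age zero

def age_from_roughness(r):
    """Invert the calibrated curve to estimate an undated slide's age."""
    return math.log(r0 / r) / k

# An undated slide with roughness between the anchors dates between them.
age = age_from_roughness(0.5)
print(f"estimated age: {age:.0f} years")
```

The design point is the one the article makes: two firmly dated events pin down the whole curve, so every other slide along the river can be dated from its surface character alone.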
But the results do have implications for the immediate area. "It suggests that the Oso landslide was not so much of an anomaly," Duvall said.

She and LaHusen are also working with the UW's M-9 Project, which is studying hazards from magnitude-9 earthquakes along the Cascadia subduction zone. They would like to learn whether landslides across Washington state coincided with past earthquakes, and use simulations of future shaking to predict which places in the state are most vulnerable to earthquake-triggered landslides.
Earthquakes
2015
December 22, 2015
https://www.sciencedaily.com/releases/2015/12/151222084733.htm
Forensic seismology tested on 2006 munitions depot 'cook-off' in Baghdad
On Oct. 10, 2006, a mortar round hit the ammunition supply depot at the U.S. Forward Operating Base Falcon south of Baghdad. The round started a smoldering fire punctuated by whizzing skyrockets, a rain of incandescent fragments, and massive explosions that bloomed into mushroom clouds. Soldiers who videotaped the "cook-off" can be heard wondering what exactly was in the dump and how much longer the explosions would continue.
But the soldiers weren't the only ones recording the cook-off: a seismometer just four miles away was also registering every boom and shock. The seismometer was one of 10 installed in 2005 and 2006 in northern and northeastern Iraq to study the seismic properties of Earth's crust in that area, so that it would be possible to quantify the yield of nearby earthquakes or nuclear tests.

The principal investigator on the team that deployed the seismometers was Ghassan I. Aleqabi, PhD, a seismic deployment coordinator in the Department of Earth and Planetary Sciences in Arts & Sciences at Washington University in St. Louis. Iraqi in origin, Aleqabi had obtained his PhD in seismology at Saint Louis University and settled in Saint Louis.

Installing and maintaining instruments in war-torn Iraq was sometimes a hair-raising business. Installation of the seismometer that recorded the cook-off had to be delayed until April 2006 because it was dangerous even to enter the city. And, once deployed, the seismometers, which recorded 100 samples per second, filled their hard drives in a few months, so someone had to return to the sites to bring out the data.

Then the ammunition dump went up. Aleqabi and his colleague Michael Wysession, PhD, professor of earth and planetary sciences in Arts & Sciences, were curious and decided to see if the seismometer had recorded the cook-off. "Sure enough, you could see a whole sequence of explosions," Wysession said.

They report what they found online in the Dec. 22 issue of the journal

Analyzing the record in various ways, they found that some types of weapons jumped right out at them. "Mortar fire has a very specific signature that is always the same," Wysession said. "If you make a spectrogram, which breaks out the signal into different frequencies, you see that the firing of the mortar produces one set of frequencies and the case splintering around the explosive produces another. When you see those signals you know that's a mortar firing.
You can begin to pick out what's going on."

Passing helicopters produced lovely swooping S-curves in the seismograms as they moved toward and away from the seismometer and their dominant frequency dropped (the same effect that makes the siren on an emergency vehicle drop in pitch). "You can look at how much the frequency drops and over what length of time and determine how far away the helicopter is, and how fast it's going, which is really fascinating," Wysession said.

But they also discovered some limitations. It was not possible to read every little rumble and report. The seismometer, for example, picked up two different car bombs, but their seismic records looked very different. By checking with counterterrorism intelligence sources, Aleqabi learned more about the bombs. One had detonated in a fairly open space at a university and the other had gone up at a checkpoint in a narrow street lined with tall buildings. The checkpoint explosion reverberated in the small spaces, creating a more complex sound pattern that made it harder to figure out the explosive type and yield.

But looking at the seismic recording before and during the cook-off, the seismologists could reconstruct the sequence of events that led to the catastrophe. At about 7:22 local time they could see the signatures of mortar fire in the record. At 7:31, a helicopter flew by. An explosion at 7:36 was the one that probably ignited the cook-off, but it was followed by a series of small explosions that gave the soldiers on the base time to take cover. Then, at 7:40, there was a huge explosion and all hell broke loose.

Seismometers were developed to record earthquakes, Wysession said, but they then turned out to be useful for monitoring nuclear tests, and now people are using them in all kinds of creative ways.
"We can independently verify with seismometers the occurrence of global warming because we can track the decadal increase in storm intensity globally," he said.

Seismology is also used as a forensic tool, he said, to help investigative agencies and insurance companies piece together what happened during terrorist attacks or industrial accidents. One of the leaders in this field is Keith Koper, a Washington University alumnus who is now a professor of seismology at the University of Utah.

Unfortunately, given recent terrorist attacks in Paris and elsewhere around the world, this paper may be more timely than the authors ever expected it to be, Wysession said. "A network of seismometers in an urban area could tell you a lot about a terror attack." A real-time array, he points out, might have prevented the ammo "cook-off." Because the cook-off was preceded by a volley of mortar fire -- and mortar firings have a unique seismic signature -- it might have been possible to pinpoint the source of the rounds before the round that destroyed the ammo depot was fired.

"I think we'll hear more about forensic seismology as time goes on," Wysession said.
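The helicopter observation above is the classical Doppler calculation: from the tone's frequency on approach and on recession alone, the source speed falls out. A hedged sketch with illustrative frequencies (not values read from the Iraq records):

```python
# Classical Doppler pair for a moving source and a stationary receiver:
#   f_approach = f0 * C / (C - v),   f_recede = f0 * C / (C + v)
# Their ratio eliminates the unknown rest frequency f0, leaving v.
C = 343.0  # speed of sound in air, m/s

def source_speed(f_approach, f_recede):
    """Recover the source speed v (m/s) from the two observed tones."""
    ratio = f_approach / f_recede          # = (C + v) / (C - v)
    return C * (ratio - 1) / (ratio + 1)

# A rotor tone heard at 21.2 Hz while approaching and 18.9 Hz while
# receding (hypothetical numbers) implies a slow-moving helicopter:
v = source_speed(21.2, 18.9)
print(f"helicopter speed ~ {v:.1f} m/s")
```

Recovering the distance of closest approach, as Wysession describes, additionally needs the rate at which the frequency sweeps through its midpoint, i.e. the full time series rather than just the two endpoint tones used here.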
Earthquakes
2015
December 19, 2015
https://www.sciencedaily.com/releases/2015/12/151219144742.htm
10,000-year record shows dramatic uplift at Andean volcano
Ongoing studies of a massive volcanic field in the Andes mountains show that the rapid uplift, which has raised the surface more than six feet in eight years, has occurred many times during the past 10,000 years.
A clearly defined ancient lakeshore that is about 600 feet above the current lake level must have been horizontal when it formed about 100 centuries ago. Since then, the southern end of the shoreline has risen 220 feet, or about 20 stories, says Brad Singer, a professor of geoscience at the University of Wisconsin-Madison. The finding, he says, "extends the current deformation behavior well into the geologic past. The shoreline appears to record a similar behavior to what we are seeing today, but over 10,000 years."

The volcanic field is known as Laguna del Maule. The dramatic finding rested on a simple, painstaking study of the ancient lakeshore, which resembles a bathtub ring. Singer and colleagues traveled along the shoreline on foot, and precisely recorded its altitude with a GPS receiver.

The most likely cause of the sustained rise is the long-term intrusion of molten rock beneath the lake, says Singer, who has spent more than 20 years studying volcanoes in Chile. "I was shocked that we measured this much rise. This requires the intrusion of a Half Dome's worth of magma in 10,000 years."

Half Dome, an iconic granite massif at Yosemite National Park in California, has a volume of about 1.5 cubic miles. Half Dome and similar structures form when molten rock -- magma -- cools and solidifies underground, and then the rock body is pushed upward over the eons.

The modern uplift at Maule is what convinced Singer to organize a large-scale scientific campaign to explore a dangerous, highly eruptive region. "I am not aware of magma-driven uplift at these rates, anywhere, over either of these time periods," he says.

Singer is leading a five-year National Science Foundation-funded investigation of Laguna del Maule that involves 30 scientists from the United States, Chile, Canada, Argentina and Singapore. At least 36 eruptions have occurred there during the past 20,000 years.

The researchers presented the new data on the uplift during the last 10,000 years on Dec.
16, at the annual American Geophysical Union meeting in San Francisco.

Laguna del Maule may cast light on the current -- but much slower -- uplift at the Yellowstone caldera in Wyoming and at Long Valley in California. "These volcanoes have produced super-eruptions spewing hundreds of cubic kilometers of volcanic ash, but the uplift and deformation today are far slower than what we see at the much younger Laguna del Maule volcanic field," Singer says.

The lake basin at Maule, measuring roughly 14 by 17 miles, is dominated by massive, repeated lava flows. But the full influence of Maule's volcanoes extends much farther, Singer says. "The impressive lava flows we see in the lake basin are only a fraction of the record of eruptions. Downwind, in Argentina, deposits of volcanic ash and pumice show that the system's footprint is many times larger than what appears at the lake." Understanding the real hazards of Laguna del Maule must take into account the downwind impacts of the explosive eruptions, he adds.

Chile has seen remarkable geologic activity in recent years. In 2010, the fifth-largest earthquake ever recorded on a seismometer occurred 120 miles west of Laguna del Maule. In the past 12 months alone, the Calbuco, Villarrica and Copahue volcanoes have erupted.

In the United States, eruptions are often compared to the one at Mount St. Helens in 1980, which released about 1 cubic kilometer of rock. One of the 36 Laguna del Maule eruptions nearly 20,000 years ago spewed 20 times that much ash. Other nearby volcanoes have surpassed 100 cubic kilometers, entering the realm of the "super-volcano."

The new results shed light on the force that has been "jacking up" this piece of Earth's crust, Singer says. "Some people have argued that the dramatic deformations like we are seeing today could be driven by the expansion of steam above the magma."
However, gravity measurements around the lake basin by Basil Tikoff of UW-Madison and Craig Miller and Glyn Williams-Jones of Simon Fraser University in Canada suggest that steam is unlikely to be the major cause of uplift. Only solidified magma can support 67 meters of uplift, Singer says: "Steam would leak out."

The average interval between eruptions at Laguna del Maule over 20,000 years is 400 to 500 years, and the last eruption was more than 450 years ago, prior to Spanish colonization.

These findings "mean the current state of unrest is not the first," Singer says. "The crust has gone up by more than my own height in less than 10 years, but it has done similar things throughout the last 10,000 years, and likely even longer. This uplift coincides with a flare-up of large eruptions around the southern end of the lake. Thus the most likely explanation is the sustained input of new magma underground, although some of it could be due to geologic faults."

"We are trying to determine the dimensions of the active system at depth, which will help us understand the hazard," Singer adds, "but there is no way of knowing if the next eruption will be business as usual, or something outside of human experience."
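The two uplift rates quoted in this article are easy to compare directly. A back-of-envelope sketch using only the figures given above (about 6 feet, or 1.8 meters, of modern uplift in 8 years, versus 220 feet, or 67 meters, of shoreline tilt over roughly 10,000 years):

```python
# Compare the modern uplift episode with the 10,000-year average rate.
modern_rate = 1.8 / 8            # m/yr during the current episode
long_term_rate = 67.0 / 10_000   # m/yr averaged over ~10,000 years

ratio = modern_rate / long_term_rate
print(f"modern rate:    {modern_rate * 1000:.0f} mm/yr")
print(f"long-term rate: {long_term_rate * 1000:.1f} mm/yr")
print(f"the modern episode runs ~{ratio:.0f}x the long-term average")
```

The modern episode is roughly thirty times faster than the millennial average, which is consistent with the article's picture of uplift arriving in intense pulses of magma intrusion rather than at a steady rate.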
Earthquakes
2015
December 17, 2015
https://www.sciencedaily.com/releases/2015/12/151217081432.htm
Patchy weather in the center of Earth
The temperature 3,000 kilometres below the surface of Earth is much more varied than previously thought, scientists have found.
The discovery of the regional variations in the lower mantle where it meets the core, which are up to three times greater than expected, will help scientists explain the structure of Earth and how it formed.

"Where the mantle meets the core is a more dramatic boundary than the surface of Earth," said the lead researcher, Associate Professor Hrvoje Tkalčić, from The Australian National University (ANU). "The contrast between the solid mantle and the liquid core is greater than the contrast between the ground and the air. The core is like a planet within a planet," said Associate Professor Tkalčić, a geophysicist in the ANU Research School of Earth Sciences. "The center of Earth is harder to study than the center of the sun."

Temperatures in the lower mantle reach around 3,000 to 3,500 degrees Celsius, and the pressure is about 125 gigapascals, roughly one and a quarter million times atmospheric pressure. Variations in these temperatures, and in other material properties such as density and chemical composition, affect the speed at which waves travel through Earth.

The team examined more than 4,000 seismometer measurements of earthquakes from around the world. In a process similar to a CT scan, the team then ran a complex mathematical process to unravel the data and build the most detailed global map of the lower mantle, showing features ranging from as large as an entire hemisphere down to 400 kilometres across.

The map showed the seismic speeds varied more than expected over these distances and were probably driven by heat transfer across the core-mantle boundary and by radioactivity.

"These images will help us understand how convection connects Earth's surface with the bottom of the mantle," said Associate Professor Tkalčić. "These thermal variations also have profound implications for the geodynamo in the core, which creates Earth's magnetic field."
Earthquakes
2015
December 16, 2015
https://www.sciencedaily.com/releases/2015/12/151216162237.htm
Fewer landslides than expected after 2015 Nepal earthquake
Fewer landslides resulted from the devastating April 2015 Nepal earthquake than expected, reports a University of Arizona-led international team of scientists in the journal
In addition, no large floods from overflowing glacial lakes occurred after the magnitude 7.8 quake, which struck near the town of Gorkha, Nepal on April 25, 2015.

"It was a really bad earthquake -- over 9,000 fatalities in four countries, primarily Nepal," said lead author Jeffrey Kargel, senior associate research scientist in the University of Arizona department of hydrology and water resources. "As horrific as this was, the situation could have been far worse for an earthquake of this magnitude."

When the earthquake struck, glaciologist Kargel considered how he could help from more than 8,000 miles away. "For the first 24 hours after the quake I was beside myself suffering for my friends and the country of Nepal that I so love," he said. "I thought, what can I do? I'm sitting here in Tucson -- how can I help Nepal?"

He realized his expertise in satellite imaging could help find out where landslides had happened, especially in remote mountain villages far from population centers. He and UA geologist Gregory Leonard called on colleagues in the Global Land Ice Measurements from Space (GLIMS) network, which Kargel led, to help identify affected areas by using satellite imagery. An international consortium of glaciologists, GLIMS monitors glaciers all over the world. The GLIMS team's initial efforts focused on possible earthquake effects on Himalayan glaciers, but quickly expanded to searching for post-earthquake landslides.

Within a day or two, Kargel, GLIMS scientists and others joined with the NASA Applied Sciences Disasters group to use remote sensing to help document damage and identify areas of need. The international group of scientists requested that several satellites take images of the region to enable the systematic mapping of landslides. As a result of that request, both government space agencies and commercial enterprises provided thousands of images.
Kargel's group selected which images to analyze and organized into six teams to scrutinize the vast earthquake-affected region for landslides. The scientists volunteered their time and worked long hours to analyze the images. Kargel said producing the landslide inventory was possible only because the network of volunteer analysts, spanning nine nations, had free access to such data.

More than 10 satellites from four countries provided images and other data so the volunteer analysts could map and report the various geological hazards, including landslides, that resulted from the earthquake. Computer models were used to evaluate the likelihood that the downstream edges of glacial lakes would collapse and flood villages and valleys below.

A range of groups, including international emergency response teams, received timely and relevant information about the post-earthquake geological hazards because of the rapid and open sharing of information among many different organizations. About a month after the disaster, the International Centre for Integrated Mountain Development (ICIMOD) used the scientists' information to prepare a report and briefing for the Nepalese cabinet. As a result, the Nepal government increased support for a geohazard task force, which mobilized additional geologists to further assess current and future vulnerabilities.

The 4,312 landslides that happened within six weeks after the quake were far fewer than occurred after similar-magnitude quakes in other mountainous areas. The team also surveyed 491 glacial lakes and saw only nine that were affected by landslides.
Satellite images did not reveal any flooding from those lakes.

The team's paper, "Geomorphic and Geologic Controls of Geohazards Induced by Nepal's 2015 Gorkha Earthquake," was published online by the journal

Kargel, Leonard, Dan Shugar of the University of Washington Tacoma, Umesh Haritashya of the University of Dayton in Ohio, Eric Fielding of NASA's Jet Propulsion Laboratory, UA student Pratima KC, and 58 other scientists from more than 35 institutions in 12 countries are co-authors on the research report. NASA, the Hakai Institute, the Japan Aerospace Exploration Agency, DigitalGlobe, the Chinese Academy of Sciences and the International Centre for Integrated Mountain Development (ICIMOD) supported the research.

Although the initial research effort was purely humanitarian, the scientists eventually realized they had a huge database that could be analyzed to learn more about geohazards from this and other quakes. In previous earthquakes in mountainous terrain, many earthquake-initiated landslides occurred from minutes to years after the initial quake. However, landslide susceptibility varies from quake to quake, the scientists wrote in their research paper.

To study the Gorkha quake landslides, the scientists used their satellite-based findings plus media reports, eyewitness photography and field assessments from helicopters. The researchers limited their analysis to the period from the day of the earthquake to June 10, 2015, the onset of the monsoon.

In addition to identifying the locations and severity of landslides, the researchers found an unexpected pattern in where the landslides happened. Co-author Fielding used satellite radar imagery to create a map of the terrain that dropped during the earthquake and where the land surface had risen.
The Earth's surface dropped almost five feet (1.4 meters) in some places and rose as much as five feet (1.5 meters) in others. By overlaying Fielding's map with the landslide map, the scientists could see whether there was any correspondence between the number of landslides and the Earth's displacement. Most of the documented landslides occurred in areas where the ground surface dropped down, rather than in the areas where the ground was uplifted. That pattern was unexpected and hadn't been observed before, Kargel said.

The research team is currently investigating why there were fewer landslides than expected and why they occurred where they did. One possible explanation is that the Gorkha earthquake caused much less shaking at the surface than other earthquakes of similar magnitude. Fielding said, "Seismologists recorded relatively less shaking with seismometers in Kathmandu and other locations, and the smaller number of landslides suggests the shaking may have been reduced in the whole area."

"All kinds of Earth processes can cause a landslide," Kargel said. "The Gorkha earthquake observations add to our understanding of landslides around the world."

The main satellites used included WorldView 1, 2 and 3 from DigitalGlobe; Landsat 7 and 8 from NASA and USGS; Earth Observer-1 from NASA; ASTER onboard the Terra satellite from NASA and JAXA; Gaofen-1 from the China National Space Administration; RADARSAT-2 from the Canadian Space Agency and MDA; and ALOS-2 from JAXA.
Earthquakes
2015
December 16, 2015
https://www.sciencedaily.com/releases/2015/12/151216151740.htm
Catastrophic medieval earthquakes in Nepal
Pokhara, the second largest town of Nepal, is built on massive debris deposits associated with strong medieval earthquakes. Three quakes, in 1100, 1255 and 1344, with magnitudes of around Mw 8, triggered large-scale collapse and mass wasting and initiated the redistribution of material by catastrophic debris flows across the mountain range. An international team of scientists led by the University of Potsdam has discovered that these flows of gravel, rocks and sand poured more than 60 kilometers downstream from the high mountain peaks of the Annapurna massif.
Christoff Andermann from the GFZ German Research Centre for Geosciences in Potsdam participated in the study, published now in the journal

One big boulder sitting on top of the sediment deposits caught the interest of the scientists: "The boulder has a diameter of almost ten meters and weighs around 300 tons. At the top of the boulder we measured the concentration of a beryllium isotope which is produced by cosmic radiation." The 10Be chemical extraction was carried out in the isotope laboratory at the GFZ in Potsdam, and the concentration was measured with the accelerator mass spectrometer at the Helmholtz-Zentrum Dresden-Rossendorf, Germany. The results show that the deposition of the big boulder matches the timing of another large earthquake, in 1681. Pokhara lies at the foot of the more than 8,000-meter-high Annapurna massif; whether the big boulder was transported with the debris during the last dated earthquake, or was merely toppled by the strong shaking, needs to be investigated further. Nevertheless, the movement of the big boulder can be connected to this strong earthquake.

This research has several important implications reaching beyond fundamental earth sciences. The study provides new insights into the mobilization and volumes of material transported in association with strong earthquakes. Dating such sediment bodies provides information about the recurrence intervals of earthquakes in the Himalayas, and ultimately demonstrates the role of earthquakes in shaping high mountain landscapes. This knowledge is crucial for better evaluating the risks in tectonically active mountain belts.
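Cosmogenic surface-exposure dating of the kind used on the boulder rests on a standard nuclide growth equation. A minimal sketch, assuming no erosion and no inherited nuclides; the production rate and concentration below are illustrative values, not the study's measurements:

```python
import math

# 10Be accumulates in an exposed rock surface as
#   N(t) = (P / lambda) * (1 - exp(-lambda * t)),
# where N is concentration (atoms/g), P the local production rate
# (atoms/g/yr), and lambda the radioactive decay constant.
HALF_LIFE = 1.387e6                  # 10Be half-life, years
LAMBDA = math.log(2) / HALF_LIFE     # decay constant, 1/yr

def exposure_age(concentration, production_rate):
    """Invert the growth equation for the exposure age t (years)."""
    return -math.log(1 - concentration * LAMBDA / production_rate) / LAMBDA

# A geologically young surface, far below saturation (hypothetical values):
age = exposure_age(concentration=2.0e3, production_rate=6.0)
print(f"exposure age ~ {age:.0f} years")
```

For ages this short relative to the 10Be half-life the exponential is nearly linear, so the age is essentially N/P; the decay correction only matters for surfaces exposed for hundreds of thousands of years.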
Earthquakes
2015
December 16, 2015
https://www.sciencedaily.com/releases/2015/12/151216140535.htm
Natural or humanmade earthquakes? New technique can tell the difference
A new study by Stanford researchers suggests that earthquakes triggered by human activity follow several indicative patterns that could help scientists distinguish them from naturally occurring temblors.
The findings were presented this week at the American Geophysical Union's fall meeting in San Francisco.

Jenny Suckale, an assistant professor of geophysics at Stanford's School of Earth, Energy & Environmental Sciences, and her postdoctoral researcher David Dempsey analyzed a sequence of earthquakes on an unmapped basement fault near the town of Guy, Arkansas, from 2010 to 2011. In geology, "basement" refers to rock located beneath a sedimentary cover that may contain oil and other gas reserves that can be exploited through drilling or hydraulic fracturing, also known as "fracking." Scientists suspected that the Arkansas quakes were triggered by the injection, over a nine-month span, of roughly 94.5 million gallons of wastewater into two nearby wells that extend into the basement layer. The injected water increases the pore pressure in the basement layer, adding more stress to already stressed faults until one slips and releases seismic waves, triggering an earthquake.

One of the study's main conclusions is that the likelihood of large-magnitude humanmade, or "induced," earthquakes increases over time, independent of the previous seismicity rate. A reservoir simulation model that Suckale and Dempsey developed found a linear relationship between frequency and magnitude for induced quakes, with magnitude increasing the longer wastewater is pumped into a well. "It's an indication that even if the number of earthquakes you experience each month is not changing, as you go further along in time you should expect to see larger magnitude events," said Dempsey, who is now at the University of Auckland in New Zealand.

This trend doesn't continue indefinitely, however.
The research shows that induced quakes begin to fall off after reaching some maximum magnitude, as the triggered faults release more of their stress as seismic waves. While energy companies might welcome the notion that there are upper limits to how strong an induced quake on a particular fault can be, it's difficult to know what that ceiling will be. "The question becomes: does it taper off at magnitude 3 or at a more dangerous magnitude 6.5?" Suckale said.

Other studies have found that the rate of wastewater injection into a well is more important than the total volume injected for triggering earthquakes. But the Stanford study found that, given similar rates of wastewater injection, there is a direct correlation between the volume injected and the incidence of earthquakes. Of the two wells studied near Guy, Well 1 received four times the wastewater volume of Well 5, and induced four times as many earthquakes. "There's a scaling there in terms of the volume injected," Dempsey said.

The study's findings could have implications both for the oil and natural gas industry and for government regulators. Under current practices, extraction activities typically shut down in an area if a high-magnitude earthquake occurs. But according to Suckale, a better approach might be to limit production before a large quake occurs. "Very often with these faults, once you have a big earthquake, you might not have one for a while because you just released all the stress," Suckale said.
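The idea of an injection-dependent magnitude ceiling can be illustrated with a published rule of thumb (McGarr's 2014 bound, which ties the maximum seismic moment of an induced event to the net injected volume). This is a generic bound, not the Suckale-Dempsey reservoir simulation itself, and the shear-modulus value is a typical assumed figure:

```python
import math

# McGarr (2014) bound: M0_max ~ G * dV, with G the shear modulus (Pa)
# and dV the net injected volume (m^3). Moment is then converted to
# moment magnitude via the standard relation Mw = (log10(M0) - 9.1) / 1.5.
G = 3.0e10  # assumed shear modulus of crustal rock, Pa

def max_magnitude(injected_volume_m3):
    """Upper-bound moment magnitude for a given net injected volume."""
    m0 = G * injected_volume_m3              # maximum seismic moment, N*m
    return (math.log10(m0) - 9.1) / 1.5      # moment magnitude Mw

# ~94.5 million gallons (the volume injected near Guy, Arkansas) is
# roughly 3.6e5 cubic meters:
mw = max_magnitude(3.6e5)
print(f"upper-bound magnitude ~ Mw {mw:.1f}")
```

Because the bound grows only logarithmically with injected volume, even a tenfold increase in volume raises the ceiling by less than one magnitude unit, which is one way to see why the "magnitude 3 or magnitude 6.5" question hinges so strongly on the properties of the individual fault rather than on volume alone.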
Earthquakes
2015
December 14, 2015
https://www.sciencedaily.com/releases/2015/12/151214165724.htm
Small fish species evolved rapidly following 1964 Alaska earthquake
Evolution is usually thought of as occurring over long time periods, but it also can happen quickly. Consider a tiny fish whose transformation after the 1964 Alaskan earthquake was uncovered by University of Oregon scientists and their University of Alaska collaborators.
The fish, the seawater-native threespine stickleback, in just decades experienced changes in both their genes and in visible external traits such as eyes, shape, color, bone size and body armor as they adapted to survive in fresh water. The earthquake -- 9.2 on the Richter scale and the second largest ever recorded -- caused geological uplift that captured marine fish in newly formed freshwater ponds on islands in Prince William Sound and the Gulf of Alaska south of Anchorage.

The findings are detailed in a paper available online in the journal

"We've now moved the timescale of the evolution of stickleback fish to decades, and it may even be sooner than that," said Cresko, who also is the UO's associate vice president for research and a member of the UO Institute of Ecology and Evolution. "In some of the populations that we studied we found evidence of changes in fewer than even 10 years. For the field, it indicates that evolutionary change can happen quickly, and this likely has been happening with other organisms as well."

Survival in a new environment is not new for stickleback, a small silver-colored fish found throughout the Northern Hemisphere. A Cresko-led team, using a rapid genome-sequencing technology (RAD-seq) created at the UO with collaborator Eric Johnson, showed in 2010 how stickleback had evolved genetically to survive in fresh water after glaciers receded 13,000 years ago. For the new study, researchers asked how rapidly such adaptation could happen.

The newly published research involved stickleback collected by University of Alaska researchers from freshwater ponds on hard-to-reach marine islands that were seismically thrust up several meters in the 1964 quake. RAD-seq technology again was used to study the new samples. Genetic changes were similar to those found in the earlier study, but they had occurred in less than 50 years in multiple, separate stickleback populations.
Stickleback, the researchers concluded, have evolved as a species over the long haul, with regions of their genomes alternately honed for either freshwater or marine life. "This research perhaps opens a window on how climate change could affect all kinds of species," said Susan L. Bassham, a Cresko lab senior research associate who also was co-author of the 2010 paper. "What we've shown here is that organisms -- even vertebrates, with long generation times -- can respond very fast to environmental change." "And this is not just a plastic change, like becoming tan in the sun; the genome itself is being rapidly reshaped," she said. "Stickleback fish can adapt on this time scale because the species as a whole has evolved, over millions of years, a genetic bag of tricks for invading and surviving in new freshwater habitats. This hidden genetic diversity is always waiting for its chance, in the sea." Co-authors with Bassham and Cresko on the PNAS paper were Emily A. Lescak of UA-Anchorage and Fairbanks; Julian Catchen of the University of Illinois at Urbana-Champaign; and Ofer Gelmond, Frank A. von Hippel and Mary L. Sherbick of UA-Anchorage. NSF grants DEB0949053 and IOS102728 to Cresko and DEB 0919234 to von Hippel provided the primary funding for the project. National Institutes of Health grant 1R24GM079486-01A1 and the M. J. Murdock Charitable Trust also supported Cresko. The 2010 study appeared in the
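The claim that decades suffice for genetic change has a simple back-of-the-envelope check in population genetics. The sketch below is illustrative only: it uses a standard haploid selection model, and the starting frequency (1%) and selection coefficient (0.5) are hypothetical values, not numbers from the study.

```python
# Illustrative only: how quickly strong natural selection can shift an
# allele's frequency, in the spirit of the stickleback result. The
# starting frequency and selection coefficient are hypothetical.

def generations_to_fix(p0, s, threshold=0.95):
    """Generations for a favored allele (relative fitness 1+s) to rise
    from frequency p0 to `threshold` under simple haploid selection."""
    p, gens = p0, 0
    while p < threshold:
        p = p * (1 + s) / (p * (1 + s) + (1 - p))  # standard selection update
        gens += 1
    return gens

# A rare freshwater-adapted variant (1%) under strong selection (s = 0.5)
# approaches fixation within a couple dozen generations.
print(generations_to_fix(0.01, 0.5))
```

Even with a generation time of a year or two, selection this strong moves a variant from rare to nearly fixed well inside the roughly 50-year window the study reports.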
Earthquakes
2015
December 14, 2015
https://www.sciencedaily.com/releases/2015/12/151214150237.htm
Age of blueschist is not an indicator of the date of emergence of plate tectonics
One of the big mysteries in the history of the Earth is the emergence of plate tectonics. When exactly did the processes of plate tectonics begin that today involve the subduction of oceanic plates? Scientific opinion on this question varies widely. The dominant view is that oceanic plates have been pushing under other plates and sinking into the Earth's mantle -- a process known as subduction -- since the beginning of the Hadean eon, more than four billion years ago. Others date the onset of plate tectonic movements to the Neoproterozoic era of 500 to 1,000 million years ago. This hypothesis is based on the fact that the rock called blueschist began to appear only 700 to 800 million years ago. Geoscientists at Johannes Gutenberg University Mainz (JGU) in Germany have now shown that the appearance of blueschist is connected to long-term changes in the composition of the oceanic crust and therefore does not provide evidence of when plate tectonics began. The study has been published in the journal
Blueschist is a blue-violet colored rock that is relatively rare and is found, among other places, in the Alps, in Japan, and on the west coast of the USA. The oldest blueschist found originated in the Neoproterozoic era and is 700 to 800 million years old. This metavolcanic rock is created during the subduction of oceanic crust. Required for its formation are high pressure and relatively low temperatures of 200 to 500 degrees Celsius. As such conditions have prevailed in subduction zones only in the geologically recent past, blueschist provides evidence of when subduction-driven plate tectonics occurred. The reason why there was no blueschist present on Earth during its first 3.8 billion years is a hotly contested topic among geologists. "We know that the formation of blueschist is definitely linked to subduction," explained Professor Richard White of the Institute of Geosciences at Mainz University. "The fact that the oldest blueschist is only 700 to 800 million years old does not mean, however, that there were no subduction processes before then, as is sometimes claimed," added Dr. Richard Palin. In their study, the two researchers have now managed to demonstrate for the first time that the absence of blueschist in the earliest geological periods goes back to a change in the chemical composition of the oceanic crust over the course of Earth's history, which in turn is a result of the gradual cooling of the Earth's mantle since the Archean eon. The oceanic crust that formed on the early, hot Earth was rich in magnesium oxide. Using computer models, Palin and White have been able to show that it was not possible for blueschist to form from this magnesium oxide-rich rock during subduction. Instead, the subduction of the magnesium oxide-rich oceanic crust led to the formation of rock similar to greenschist, which is a metamorphic rock that is formed today at low temperatures and low pressure.
Since these greenschist rocks can hold more water than most blueschist, more fluid was able to enter the early Earth's mantle than is the case today -- a factor that affects the formation of magmas, one of the topics being studied by the Volcanoes and Atmosphere in Magmatic Open Systems (VAMOS) research unit at Johannes Gutenberg University Mainz.
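The argument above amounts to a simple rule over pressure-temperature conditions. The toy classifier below encodes it: the 200 to 500 degrees Celsius window comes from the text, while the 0.8 GPa cutoff is an illustrative stand-in for "high pressure," not a value from the study.

```python
# Toy classifier for the pressure-temperature argument above. The
# temperature window (200-500 C) is from the text; the 0.8 GPa pressure
# threshold is an illustrative stand-in for "high pressure."

def metamorphic_guess(pressure_gpa, temp_c):
    """Very rough facies guess for subducted oceanic crust."""
    if 200 <= temp_c <= 500 and pressure_gpa >= 0.8:
        return "blueschist"          # high P, low T: modern cold subduction
    if temp_c <= 500:
        return "greenschist-like"    # lower pressure path, as on early Earth
    return "other facies"            # hotter conditions

print(metamorphic_guess(1.2, 350))   # deep, cold modern subduction zone
print(metamorphic_guess(0.4, 350))   # shallower, lower-pressure path
```

The study's point maps onto the first branch: magnesium oxide-rich early crust never satisfied the mineralogical conditions for that branch, regardless of the pressure it reached.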
Earthquakes
2015
December 11, 2015
https://www.sciencedaily.com/releases/2015/12/151211130111.htm
Mapping downgoing plate topography: The 2005 Sumatra earthquake
New geophysical data show that fault slip during the March 2005 magnitude 8.7 (Mw) earthquake off the west coast of northern Sumatra, Indonesia (also referred to as the Simeulue-Nias earthquake), was stopped by the topography on the downgoing plate.
Earthquakes in subduction zones, where one tectonic plate is forced beneath another, usually break only a part of the plate boundary fault. The pieces that break independently are known as segments. Topography on the top of the downgoing plate has often been suggested as a cause of this segmentation, but there are few examples where both this topography and the details of earthquake rupture are well known. Data collected over the subduction zone offshore of Sumatra, Indonesia, have enabled the top of the downgoing plate to be mapped across a long-lived segment boundary at one end of the rupture zone. Seismic reflection data, similar to those used to find oil reserves, give a detailed image of the shape of the downgoing plate. A 3-km high on the top of the plate over a 15-km by 30-km region matches where the 2005 earthquake rupture stopped. The topographic high appears to strengthen the plate boundary, and only very large earthquakes would break through this barrier. The survey by Timothy Henstock and colleagues spans a complex segment boundary zone between the southern termination of the Mw 8.7 earthquake and the northern termination of a major 1797 earthquake, a gap that was partly filled by a Mw 7.7 event in 1935. They have identified an isolated 3 km basement high at the northern edge of this zone, close to the 2005 slip termination. They note that the high probably originated at the Wharton fossil ridge and is almost aseismic in both local and global data sets, suggesting that while the region around it may be weakened by fracturing and fluids, the basement high locally strengthens the plate boundary, stopping rupture propagation.
Earthquakes
2015
December 10, 2015
https://www.sciencedaily.com/releases/2015/12/151210144707.htm
First explanations for boundary within Earth's mantle
Earth's mantle, the large zone of slow-flowing rock that lies between the crust and the planet's core, powers every earthquake and volcanic eruption on the planet's surface. Evidence suggests that the mantle behaves differently below 1 megameter (1,000 kilometers, or 621 miles) in depth, but so far seismologists have not been able to explain why this boundary exists.
Two new studies co-authored by University of Maryland geologists provide different, though not necessarily incompatible, explanations. One study suggests that the mantle below 1 megameter is more viscous -- meaning it flows more slowly -- than the section above the boundary. The other study proposes that the section below the boundary is denser -- meaning its molecules are more tightly packed -- than the section above it, due to a shift in rock composition. Taken together, the studies provide the first detailed look at why large-scale geologic features within the mantle behave differently on either side of the megameter divide. The papers were published on December 11, 2015, in two separate journals. "The existence of the megameter boundary has been suspected and inferred for a while," said Vedran Lekic, an assistant professor of geology at UMD and co-author of one of the studies. Although the mantle is mostly solid, it flows very slowly in the context of geologic time. Two main sources of evidence suggest the existence of the megameter boundary and thus inspired the current studies. First, many huge slabs of ocean crust that have been dragged down, or subducted, into the mantle can still be seen in the deep Earth. These slabs slowly sink downward toward the bottom of the mantle. A large number of these slabs have stalled out and appear to float just above the megameter boundary, indicating a notable change in physical properties below the boundary. Second, large plumes of hot rock rise from the deepest reaches of the mantle, and the outlines of these structures can be seen in the deep Earth as well. As the rock in these mantle plumes flows upward, many of the plumes are deflected sideways as they pass the megameter boundary.
This, too, indicates a fundamental difference in physical properties on either side of the boundary. "Learning about the anatomy of the mantle tells us more about how the deep interior of Earth works and what mechanisms are behind mantle convection," said Nicholas Schmerr, an assistant professor of geology at UMD and co-author of one of the studies. The physics of the deep Earth are complicated, so establishing the mantle's basic physical properties, such as density and viscosity, is an important step. Density refers to the packing of molecules within any substance (gas, liquid or solid), while viscosity is commonly described as the thickness of a fluid or semi-solid. Sometimes density and viscosity correlate with each other, while sometimes they are at odds. For example, honey is both more viscous and denser than water. Oil, on the other hand, is more viscous than water but less dense. In their study, Schmerr, lead author Maxim Ballmer (Tokyo Institute of Technology and the University of Hawaii at Manoa) and two colleagues used a computer model of a simplified Earth. Each run of the model began with a slightly different chemical composition -- and thus a different range of densities -- in the mantle at various depths. The researchers then used the model to investigate how slabs of ocean crust would behave as they travel down toward the lower mantle. In the real world, slabs are observed to behave in one of three ways: they either stall at around 600 kilometers, stall at the megameter boundary, or continue sinking all the way to the lower mantle. Of the many scenarios for mantle chemical composition the researchers tested, the one that most closely resembled the real world included the possibility that slabs can stall at the megameter boundary.
This scenario included an increased amount of dense, silicon-rich basalt rock in the lower mantle, below the megameter boundary. Lekic, lead author Max Rudolph (Portland State University) and another colleague took a different approach, starting instead with whole-Earth satellite measurements. The team then subtracted surface features -- such as mountain ranges and valleys -- to better see slight differences in Earth's basic shape caused by local differences in gravity. (Imagine a slightly misshapen basketball with its outer cover removed.) The team mapped these slight differences in Earth's idealized shape onto known shapes and locations of mantle plumes and integrated the data into a model that helped them relate the idealized shape to differences in viscosity between the layers of the mantle. Their results pointed to less viscous, more free-flowing mantle rock above the megameter boundary, transitioning to highly viscous rock below the boundary. Their results help to explain why mantle plumes are frequently deflected sideways as they extend upward beyond the megameter boundary. "While explaining one mystery -- the behavior of rising plumes and sinking slabs -- our results lead to a new conundrum," Lekic said. "What causes the rocks below the megameter boundary to become more resistant to flow? There are no obvious candidates for what is causing this change, so there is a potential for learning something fundamentally new about the materials that make up Earth." Lekic and Schmerr plan to collaborate to see if the results of both studies are consistent with one another -- in effect, whether the lower mantle is both dense and viscous, like honey, when compared with the mantle above the megameter boundary. "This work can tell us a lot about where Earth has been and where it is going, in terms of heat and tectonics," Schmerr said. "When we look around our solar system, we see lots of planets at various stages of evolution.
But Earth is unique, so learning what is going on deep inside its mantle is very important."
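The stalled-slab observation has an intuitive fluid-dynamics reading. As a rough sketch (not taken from either study), Stokes' law for a sphere sinking through a viscous fluid shows why a viscosity jump would make slabs appear to float: sinking speed scales inversely with viscosity. The radius, density contrast and viscosity values below are illustrative only.

```python
# Stokes' law sketch: a dense body sinks at a speed inversely
# proportional to the surrounding viscosity. Treating a slab fragment
# as an idealized sphere is a big simplification, and all numerical
# values here are illustrative, not from the studies.

def stokes_velocity(delta_rho, radius_m, viscosity_pa_s, g=9.81):
    """Terminal sinking speed of a sphere in a viscous fluid (m/s)."""
    return 2.0 * delta_rho * g * radius_m ** 2 / (9.0 * viscosity_pa_s)

v_upper = stokes_velocity(50.0, 1.0e5, 1.0e21)   # above the boundary
v_lower = stokes_velocity(50.0, 1.0e5, 1.0e23)   # 100x more viscous below

# A 100-fold jump in viscosity cuts the sinking speed 100-fold, which
# is one way slabs could appear to stall, and plumes deflect, at the
# megameter boundary.
print(v_lower / v_upper)
```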
Earthquakes
2015
December 8, 2015
https://www.sciencedaily.com/releases/2015/12/151208150520.htm
Death Valley study helps determine evolution of western US landscapes
The faulted alluvial fans near Badwater in Death Valley are amongst the most visited and classic landforms in the U.S. New mapping and dating of these landforms, presented in this open-access study by Kurt Frankel and colleagues, help to determine the timing of past earthquakes and how tectonic deformation is distributed across the western U.S.
This in turn provides important data for seismic hazard mitigation and for understanding how the great landscapes of the western U.S. have evolved over recent geologic time. Death Valley constitutes one of the most dramatic landscapes in North America, and it is famous for its faulted mountain fronts, spectacular alluvial fans, and extensive saline playa. Moreover, the valley is the type example of a pull-apart basin, which is controlled by the northern Death Valley-Fish Lake Valley fault zone, the Black Mountains fault zone, and the southern Death Valley fault zone. These three fault zones make up the Holocene-active fault zones of the Death Valley fault system. The Death Valley pull-apart basin often serves as an analog for the evolution -- including stress transfer and depositional systems -- of other tectonically active transtensional regimes, such as the Dead Sea, East Africa, and the Alpine fault of New Zealand.
Earthquakes
2015
December 8, 2015
https://www.sciencedaily.com/releases/2015/12/151208134632.htm
Hot rock and ice: Volcanic chain underlies Antarctica
Planetary scientists would be thrilled if they could peel Earth like an orange and look at what lies beneath the thin crust. We live on the planet's cold surface, but Earth is a solid body and the surface is continually deformed, split, wrinkled and ruptured by the roiling of warmer layers beneath it.
The contrast between the surface and the depths is nowhere starker -- or more important -- than in Antarctica. What is causing the mysterious line of volcanoes that emerge from the ice sheet there, and what does it mean for the future of the ice? "Our understanding of what's going on is really hampered because we can't see the geology," said Andrew Lloyd, a graduate student in earth and planetary sciences in Arts & Sciences at Washington University in St. Louis. "We have to turn to geophysical methods, such as seismology, to learn more," he said. Lloyd helped deploy research seismometers across the West Antarctic Rift System and Marie Byrd Land in the austral summer of 2009-10. He then returned in late 2011 and snowmobiled more than 1,000 miles, living in a Scott tent, to recover the precious data. The recordings the instruments made of the reverberations of distant earthquakes from January 2010 to January 2012 were used to create maps of seismic velocities beneath the rift valley. An analysis of the maps was published online. This is the first time seismologists have been able to deploy instruments rugged enough to survive a winter in this part of the frozen continent, and so this is the first detailed look at Earth beneath this region. Not surprisingly, the maps show a giant blob of superheated rock about 60 miles beneath Mount Sidley, the last of a chain of volcanic mountains in Marie Byrd Land at one end of the transect. More surprisingly, they reveal hot rock beneath the Bentley Subglacial Trench, a deep basin at the other end of the transect. The Bentley Subglacial Trench is part of the West Antarctic Rift System, and hot rock beneath the region indicates that this part of the rift system was active quite recently. Mount Sidley, the highest volcano in Antarctica, sits directly above a hot region in the mantle, Lloyd said.
Mount Sidley is the southernmost mountain in a volcanic mountain range in Marie Byrd Land, a mountainous region dotted with volcanoes near the coast of West Antarctica. "A line of volcanoes hints there might be a hidden mantle plume, like a blowtorch, beneath the plate," said Doug Wiens, PhD, professor of earth and planetary sciences and a co-author on the paper. "The volcanoes would pop up in a row as the plate moved over it. But it's a bit unclear if this is happening here," he said. "We think we know which direction the plate is moving, but the volcanic chain is going in a different direction and two additional nearby volcanic chains are oriented in yet other directions. If this was just a plate moving over a couple of mantle plumes, you'd expect them to line up, as they do in the Hawaiian Islands," he said. Although the hot zone's shape is ill-defined, it is clear there is higher heat flow into the base of the ice sheet in this area, Wiens said. The most interesting finding, Lloyd said, is the discovery of a hot zone beneath the Bentley Subglacial Trench. The basin is part of the West Antarctic Rift System, a series of rifts, adjacent to the Transantarctic Mountains, along which the continent was stretched and thinned. The old rock of East Antarctica rises well above sea level, but west of the Transantarctic Mountains, extension has pulled the crust into a broad saddle, or rift valley, much of which lies a kilometer below sea level. "If you removed the ice, West Antarctica would rebound, and most of it would be near sea level. But the narrower and deeper basins might remain below it," Lloyd said.
"The Bentley Subglacial Trench, which is the lowest point on Earth not covered by an ocean, would still be a kilometer and a half below sea level if the ice were removed." Because the West Antarctic Rift is hidden, less is known about it than about other famous rift systems such as the East African Rift or, in the United States, the Rio Grande Rift. "We didn't know what we'd find beneath the basin," Wiens said. "For all we knew it would be old and cold. We didn't detect any earthquakes, so we don't think the rift is currently active, but the heat suggests rifting stopped quite recently." In this way, it resembles the Rio Grande Rift, which is also no longer active but has yet to cool completely. A period of diffuse extension created the rift valley in the late Cretaceous, roughly 100 million years ago, Lloyd said, and more focused extension then created deep basins like the Bentley Subglacial Basin and the Terror Rift in the Ross Sea. "This period of more focused extension likely occurred in the Neogene," Lloyd said. "If it's still hot there, it might also be hot under other basins in the rift system." The rift system is thought to have a major influence on ice streams in West Antarctica. "Rifting and ice flow occur on completely different time scales," Lloyd said, "so rifting is not going to suddenly make the ice sheet unstable. But to accurately model how quickly the ice is going to flow or the rock to rebound, we need to understand the 'boundary conditions' for ice models, such as heat flow from the mantle," he said. "Seismic surveys like this one will help inform models of the ice sheet," Wiens said. "Modelers need an estimate of the heat flow, and they need to know something about the geological conditions at the bottom of the ice sheet in order to estimate drag.
Right now, both of these factors are very poorly constrained." While heat flow through Earth's crust has been measured at at least 34,000 different spots around the globe, in Antarctica it has been measured in fewer than a dozen places. In July 2015, scientists reported that the heat flow at one of these spots was four times higher than the global average. Ever since then, scientists have been wondering why the reading was so high. "Recent extension in the Bentley Subglacial Trench might explain these readings," Wiens said. The next big problem, he said, is to understand the structure under the Thwaites and Pine Island glaciers, which lie closer to the coastline than the Bentley Subglacial Trench. These two glaciers have been described as the 'weak underbelly' of the ice sheet because surges in the ice flow there could theoretically cause the rapid disintegration of the entire West Antarctic ice sheet. During the 2014-2015 Antarctic field season, Lloyd helped deploy another 10 seismic stations that, together with seismometers deployed by the British, will map the underside of this key area.
Earthquakes
2015
December 6, 2015
https://www.sciencedaily.com/releases/2015/12/151206062852.htm
Maximum observed earthquake magnitudes along continental transform faults
Continental transform faults evolve where two plates slide past each other. The most prominent examples are the San Andreas Fault in California and the North Anatolian Fault in Turkey. Earthquakes along those faults typically do not exceed magnitudes of around M8, but they occur at shallow depth and thus pose a major threat to nearby metropolitan regions such as San Francisco or Istanbul.
To estimate the seismic hazard and the resulting risk it is essential to know the maximum earthquake magnitude to be expected along particular faults. This, however, is not trivial, since instrumental recordings date back only 150 years while the recurrence period for the largest earthquakes can be much longer. A team of scientists from the GFZ German Research Centre for Geosciences, in collaboration with the University of Southern California, has now presented a global evaluation of observed maximum earthquakes along all major transform faults, allowing better estimates of the maximum earthquake strengths. The major findings of the study are that for 75% of the data the observed maximum magnitude generally scales with the offset across the fault, provided the offset exceeds 10 km. The offset across a fault results from continuous slip of several millimeters to a few centimeters per year, which adds up to offsets of kilometers after millions of years. Furthermore, the length of the rupture of individual earthquakes was found to scale with mapped fault length. For the remaining 25% of the earthquakes, a larger coseismic stress drop was found. "This means that those earthquakes release more seismic energy during the rupture process, and they all occur along faults with low slip rates, allowing them to be distinguished from the majority of events that show a direct relation to cumulative offset," says GFZ scientist Patricia Martínez-Garzón, lead author of the study. The results contribute toward developing refined building codes, risk mitigation concepts and early-warning systems and are thus of great relevance for the millions of people living in population centers near transform faults.
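The magnitudes discussed here sit on a logarithmic scale, which is worth making concrete. A standard empirical relation (Gutenberg-Richter) for radiated seismic energy is log10(E) = 1.5*M + 4.8, with E in joules; the short sketch below is an illustration, not part of the study.

```python
# Gutenberg-Richter energy estimate: log10(E) = 1.5*M + 4.8 (E in joules).
# Illustrative sketch; not part of the GFZ study itself.

def radiated_energy_joules(magnitude):
    return 10 ** (1.5 * magnitude + 4.8)

# One whole magnitude step is a factor of 10**1.5 (roughly 32x) in
# radiated energy -- which is why knowing whether a fault can host an
# M8 rather than an M7 matters so much for hazard estimates.
ratio = radiated_energy_joules(8.0) / radiated_energy_joules(7.0)
print(round(ratio, 1))
```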
Earthquakes
2015
December 4, 2015
https://www.sciencedaily.com/releases/2015/12/151204145912.htm
Scientists develop 'Shazam for earthquakes'
An algorithm inspired by a popular song-matching app is helping Stanford scientists find previously overlooked earthquakes in large databases of ground motion measurements.
They call their algorithm Fingerprint And Similarity Thresholding, or FAST, and it could transform how seismologists detect microquakes -- temblors that don't pack enough punch to register as earthquakes when analyzed by conventional methods. While microquakes don't threaten buildings or people, monitoring them could help scientists predict how frequently, and where, larger quakes are likely to occur. "In the past decade or so, one of the major trends in seismology has been the use of waveform similarity to find weakly recorded earthquakes," said Greg Beroza, a professor of geophysics at Stanford School of Earth, Energy & Environmental Sciences. The technique most commonly employed to do this, called template matching, functions by comparing an earthquake's seismic wave pattern against previously recorded wave signatures in a database. The downsides of template matching are that it can be time-consuming and that it requires seismologists to have a clear idea of the signal they are looking for ahead of time. The FAST technique, which is detailed in the current issue of the journal, instead condenses short windows of continuous seismic data into compact "fingerprints." The fingerprints are then sorted into separate bins, or groups, based on their similarities. "We then search for pairs of fingerprints that are similar, and then map those back to the time windows that they came from," said study co-author Clara Yoon, a graduate student in Beroza's research group. "That's how we identify the earthquakes." Earthquakes occurring on the same section of a fault have similar fingerprints, regardless of their magnitudes, because the seismic waves they generate travel through the same underground structures to reach the surface. "It doesn't matter if one earthquake happened 10 years ago and the other one happened yesterday.
They're actually going to have waveforms that look very similar," Yoon said. This sorting step, which the Stanford scientists compare to grouping similar documents in a filing cabinet, is why FAST is so efficient. "Instead of comparing a signal to every other signal in the database, most of which are noise and not associated with any earthquakes at all, FAST compares like with like," said Beroza, who is the Wayne Loel Professor at Stanford. "Tests we have done on a 6-month data set show that FAST finds matches about 3,000 times faster than conventional techniques. Larger data sets should show an even greater advantage." The idea for FAST occurred to Beroza several years ago. While perusing an electronics store, he heard a catchy song he didn't know playing over the speakers. Beroza pulled out his smartphone and opened Shazam, an app that could listen to and identify the song by name. "Shazam did its thing and within 10 seconds it was trying to sell me the song," Beroza said. In a moment of insight, Beroza realized that Shazam wasn't simply comparing the digital file of the song against other files in a database. It was doing something more sophisticated, namely capturing the audio waveform of a short section of the song and comparing that snippet to other waveforms housed on an online server. Not only that, the app had to be able to quickly filter out irrelevant noise from the environment such as people's conversations. "I thought, 'That's cool,' and then a moment later, 'That's what I want to do with seismology,'" Beroza said. It took several years, but Beroza eventually assembled a team of computer-savvy researchers to help build upon his eureka moment.
Drawing heavily upon recent advances in computer science, the group created a search algorithm capable of quickly scanning continuous ground motion data for similar matches. "In the early stages, we thought that we were going to need high-performance supercomputers to tackle the problem in a brute force fashion by running thousands of comparisons at once," said study co-author Ossian O'Reilly, a graduate student in the Department of Geophysics. "But we soon realized that even they wouldn't be able to handle the amount of data we wanted to process. So we started learning about the ingenious algorithms devised by the computer science community for solving related problems." In particular, the team borrowed techniques from data mining and machine learning to create FAST, said study co-author Karianne Bergen, a graduate student at Stanford's Institute for Computational and Mathematical Engineering (ICME). "The scalability of FAST comes from the use of a data mining technique called locality-sensitive hashing, or LSH," Bergen said. "LSH is a widely used technique for identifying similar items in large data sets. FAST is the first use of LSH in earthquake detection." In the new study, the Stanford scientists used FAST to analyze a week's worth of data collected in 2011 by a seismic station on the Calaveras Fault in California's Bay Area. This same fault recently ruptured and set off a sequence of hundreds of small quakes. Not only did FAST detect the known earthquakes, it also discovered several dozen weak quakes that had previously been overlooked. "A lot of the newer earthquakes that we found were magnitude 1 or below, so that tells us our technique is really sensitive," Yoon said.
"FAST was able to spot the missed quakes because it looks for similar wave patterns across the seismic data, regardless of their energy level." The team thinks FAST could prove useful in places like Oklahoma and Arkansas, which have recently experienced spikes in suspected induced earthquakes due to the increased injection of wastewater from oil and gas development into the subsurface. "If you can detect the smaller quakes, you could identify the risk of a larger quake occurring from continued injection," Yoon said. Similarly, an improved understanding of how often different magnitude earthquakes happen could help seismologists better predict how frequently large, natural quakes will occur. The team is currently working on scaling up their FAST algorithm to analyze data collected across longer periods of time, from multiple seismic stations and in more challenging scenarios. "That is very important if you want to be able to determine the location of these earthquakes," Yoon said. "If you have data from only one station, you can't accurately pinpoint the epicenter of a quake."
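The Shazam analogy can be made concrete with a toy version of the locality-sensitive hashing idea Bergen describes. The sketch below is a simhash-style LSH demo on synthetic waveforms, not the actual FAST implementation; every signal, name and parameter in it is invented for the illustration.

```python
import math
import random

# Simhash-style LSH demo, NOT the real FAST code: hash each waveform
# window to a short binary "fingerprint" so that similar windows end up
# a few bits apart, while unrelated windows end up far apart.

N, BITS = 64, 16
rng = random.Random(0)
# Random hyperplanes; each contributes one bit of the fingerprint.
PLANES = [[rng.uniform(-1, 1) for _ in range(N)] for _ in range(BITS)]

def waveform(polarity, noise=0.001, seed=0):
    """Toy seismogram window: a damped sinusoid plus a little noise."""
    r = random.Random(seed)
    return [polarity * math.sin(0.4 * i) * math.exp(-0.02 * i)
            + r.uniform(-noise, noise) for i in range(N)]

def fingerprint(window):
    """One bit per hyperplane: which side of it the window falls on."""
    return tuple(int(sum(w * p for w, p in zip(window, plane)) > 0)
                 for plane in PLANES)

def hamming(f, g):
    return sum(a != b for a, b in zip(f, g))

# Two noisy recordings of the "same" event, plus one unrelated signal.
fps = {name: fingerprint(w) for name, w in [
    ("event_a1", waveform(+1, seed=1)),
    ("event_a2", waveform(+1, seed=2)),
    ("event_b",  waveform(-1, seed=3)),
]}

# Candidate repeating earthquakes: fingerprint pairs within a few bits.
pairs = [(x, y) for x in fps for y in fps
         if x < y and hamming(fps[x], fps[y]) <= 4]
print(pairs)
```

Because comparisons only need the short fingerprints (and, in real LSH, hash-bucket lookups rather than all-pairs scans), this style of search scales far better than comparing every raw waveform against every other, which is the efficiency gain the article attributes to FAST.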
Earthquakes
2015
December 4, 2015
https://www.sciencedaily.com/releases/2015/12/151204094343.htm
Laser scanning shows rates and patterns of surface deformation from the South Napa earthquake, California, USA
U.S. Geological Survey scientists used 3D laser scanning to make repeat measurements of an area affected by the 2014 magnitude 6.0 South Napa earthquake in order to define in great detail the surface deformation that occurred both during and after the earthquake. The recent revolution in 3D laser measurement technology (LiDAR) allows scientists to collect detailed information about the shape of the land surface and the objects that sit upon it with unprecedented accuracy.
These spatially extensive measurement techniques provide new understanding of how earthquakes and other phenomena deform the shape of Earth's surface, reinforce the notion that not all surface deformation occurs during an earthquake itself, and provide insight into what can be expected following future earthquakes. When earthquakes strike, damage is expected to occur along the fault trace over a few seconds or perhaps minutes as Earth's tectonic plates shift, shake, and tear the ground. However, in some cases, the damage to Earth's crust and what sits on top of it can unfold slowly over hours, days, weeks, and even years following an earthquake. This is often termed "afterslip," and it has been observed following many moderate earthquakes. Surface deformation following the South Napa quake occurred variously as discrete fault slip, rotation of a block of earth adjacent to the fault, and vertical elevation changes. Comparison of the new 2014 terrestrial laser scanner data with 2003 airborne laser scanner data also indicates that the earthquake caused vertical warping across the fault zone rather than forming a distinct vertical scarp, challenging notions of how topography is created in moderate earthquakes.
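The before-and-after comparison described here is, at its core, a grid subtraction. The sketch below differences two tiny made-up elevation grids to recover a vertical offset across a fault trace; the numbers and fault position are invented for the demo, and real LiDAR change detection also requires careful point-cloud alignment first.

```python
# Toy DEM differencing: subtract a pre-earthquake elevation grid from a
# post-earthquake one, then summarize the offset on each side of a
# fault. Grids and fault position are invented for this illustration.

PRE = [[10.0, 10.1, 10.2, 10.3],
       [10.0, 10.1, 10.2, 10.3],
       [10.0, 10.1, 10.2, 10.3]]

# After the quake: columns east of the fault (index >= 2) rose ~0.3 m.
POST = [[10.0, 10.1, 10.5, 10.6],
        [10.0, 10.1, 10.5, 10.6],
        [10.0, 10.1, 10.5, 10.6]]

def difference(post, pre):
    """Cell-by-cell vertical change between two gridded surveys."""
    return [[b - a for a, b in zip(row_pre, row_post)]
            for row_pre, row_post in zip(pre, post)]

def mean_offset(diff, cols):
    """Average vertical change over the given grid columns."""
    vals = [row[c] for row in diff for c in cols]
    return sum(vals) / len(vals)

d = difference(POST, PRE)
west = mean_offset(d, [0, 1])   # side that stayed put
east = mean_offset(d, [2, 3])   # uplifted side
print(round(east - west, 2))    # vertical separation across the fault
```

Whether that separation shows up as a sharp step between adjacent cells (a scarp) or as a gradual ramp over many cells (warping, as observed at South Napa) is exactly the distinction the repeat surveys were able to resolve.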
Earthquakes
2015
December 2, 2015
https://www.sciencedaily.com/releases/2015/12/151202124238.htm
Determinant factors for energy consumption, perception of energy conservation clarified
A research group led by Keishiro Hara, Specially Appointed Associate Professor, Center for Environmental Innovation Design for Sustainability, Osaka University performed large-scale questionnaire surveys in Suita City, Osaka in 2009 and 2013: before and after the required electricity conservation practice following the Great East Japan earthquake in 2011.
Through time-series analysis of factors associated with household energy use (electricity and gas) and the perception of savings, "household income," "actual amount of energy consumption," and "perception of energy savings" were identified as three closely related elements. In addition, compared with 2009, there was a large change in consumption behavior and in the perception of energy savings among the people of the city in 2013. These results provide new knowledge about the mechanisms of energy consumption behavior and suggest information that will contribute to the design of effective policies aimed at promoting household energy conservation. The research group, in collaboration with Suita City, carried out a comparative investigation into energy consumption behavior and the perception of energy saving in 2009 and 2013, and used the survey results to compare the factors determining household energy use and the perception of savings over that period. The analysis provided essential knowledge on the determinant factors associated with residential consumption and the perception of savings of electricity and gas.
Detailed analysis also revealed that households with high energy consumption and those with moderate consumption are becoming polarized within the city, and that there was a growing gap between consumption behavior and the perception of conservation in 2013. The results provide essential insight into energy consumption patterns and the perception of savings at the household level. Future work will need to validate thoroughly and objectively the reasons and specific mechanisms behind these changes, including the possible influence of the required electricity conservation practice that followed the Great East Japan earthquake in 2011, by carrying out continuing surveys not only in Suita City but also in other parts of Japan.
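The analysis links three variables: income, consumption, and the perception of savings. A minimal sketch of that kind of association check, using invented records and a hand-rolled Pearson correlation (the study's actual data and methods are not reproduced here):

```python
# Invented household records: (income index, annual energy use,
# self-reported conservation score). Not the study's data.
rows = [(3, 420, 2), (5, 610, 1), (2, 300, 4), (6, 700, 1), (4, 500, 3)]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, -1..1."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

income = [r[0] for r in rows]
energy = [r[1] for r in rows]
saving = [r[2] for r in rows]
print("income vs consumption:         ", round(pearson(income, energy), 2))
print("consumption vs saving attitude:", round(pearson(energy, saving), 2))
```

With these toy numbers, income and consumption correlate strongly and positively, while consumption and the conservation attitude correlate negatively, mirroring the "three closely related elements" in the text.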
Earthquakes
2015
November 30, 2015
https://www.sciencedaily.com/releases/2015/11/151130163422.htm
Climate can grind mountains faster than they can be rebuilt
Researchers for the first time have attempted to measure all the material leaving and entering a mountain range over more than a million years and discovered that erosion caused by glaciation during ice ages can, in the right circumstances, wear down mountains faster than plate tectonics can build them.
The international study, conducted by the Integrated Ocean Drilling Program and led by scientists from the University of Florida, The University of Texas at Austin and Oregon State University, adds insight into a longstanding debate about the balance of climate and tectonic forces that influence mountain building. Researchers studied the St. Elias Mountains on the Alaskan coast and found that erosion accelerated sharply about 1 million years ago, when global climate cooling triggered stronger and more persistent ice ages than in times past. "Humans often see mountain ranges as static, unyielding parts of the landscape," said co-chief scientist John Jaeger, an associate professor of geology at the University of Florida. "But our work has shown that they are actively evolving along with, and responding to, Earth's climate, which just shows how truly dynamic and coupled this planet is." The study, conducted by a team of scientists from 10 countries, marked the culmination of more than a decade of field work. Researchers first used seismic equipment to image and map a huge fan of sediment in the deep sea in the Gulf of Alaska, produced by erosion of the nearby mountains, and took short sediment cores to understand the modern system. They then collected and dated almost 4 kilometers of sediment from the floor of the gulf and the Alaskan continental shelf, revealing millions of years of geologic history. "It turned out most [sediments] were younger than we anticipated, and most rates (of sediment production and thus erosion) were higher than we anticipated," said lead author and co-chief scientist Sean Gulick of the University of Texas Institute for Geophysics, a unit of the Jackson School of Geosciences. "Since the big climate change during the mid-Pleistocene transition when we switched from short (about 40,000-year) ice ages to super-long (about 100,000-year) ice ages, erosion became much greater... 
In fact, there was more erosion than tectonics has replaced." "We were pleasantly surprised by how well we could establish ages of the sediment sequences as we were drilling, and the composition of the sediment gave clear evidence of when the glaciation started and then expanded, in sync with global climate trends over the past several million years," said co-author Alan Mix of Oregon State University. "Only by drilling the sea floor where the sediment accumulates could we see these details." Mountain ranges form when tectonic plates thrust into one another over millions of years and scrunch up the Earth's outer crust. But even as mountains are built by these titanic forces, other agents -- some combination of tectonic and climate processes -- work to remove the accumulating crust. Since the mid-Pleistocene, erosion rates have continued to beat tectonic inputs by 50 to 80 percent, demonstrating that climatic processes, such as the movement of glaciers, can outstrip mountain building over a span of a million years. The findings highlight the pivotal role climate fluctuations play in shaping Earth's landforms.
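The headline comparison is a mass balance: sediment leaving the range versus crust arriving by plate convergence. A toy version of that balance, with illustrative fluxes (not the study's measurements) chosen so the excess lands inside the reported 50-to-80-percent range:

```python
# Toy mass balance for a mountain range. Both fluxes are invented,
# illustrative numbers, not values from the study.
tectonic_influx = 100.0   # rock added by plate convergence, km^3 per Myr
sediment_efflux = 170.0   # eroded rock recovered offshore, km^3 per Myr

# If efflux exceeds influx, the range is being worn down faster than
# tectonics rebuilds it.
excess_pct = (sediment_efflux / tectonic_influx - 1) * 100
print(f"erosion exceeds tectonic input by {excess_pct:.0f}%")
```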
Earthquakes
2015
November 25, 2015
https://www.sciencedaily.com/releases/2015/11/151125083932.htm
Anti-seismic bricks to improve buildings' response to earthquakes
Sisbrick is a new class of earthquake-resistant building materials that seismically isolates partition walls from the main building structure, significantly reducing the tension between these two elements and, therefore, the damage incurred.
Researchers from the Universitat Politècnica de València (Polytechnic University of Valencia, UPV) have designed a new seismic isolator that improves the way buildings respond to earthquakes. The key to the Sisbrick, as the invention is known, lies in the way different materials have been combined to achieve two main effects: it is able to absorb horizontal seismic movements, while also supporting vertical loads (for instance, partition walls) that act on the integrity of the building frame. Designed specifically for use in partition walls, its brick form means it can be readily incorporated into traditional construction techniques, without the need for additional measures or equipment. Techniques and special bricks to improve the way buildings respond to earthquakes are already available on the market. However, what sets Sisbrick apart is its approach to partition walls. As researcher Luis Pallarés at the UPV's Concrete Science and Technology Institute (ICITECH) explains, these structures greatly condition a building's response to a seismic event. Merely making partition walls more resistant does not address the more widespread damage caused by earthquakes. The Sisbrick's large capacity to absorb the horizontal movements caused by earthquakes seismically isolates the partition walls from the main building frame: "They effectively serve as an insulating barrier, avoiding the transfer of loads from these partition elements to the main structure. By doing so, their impact on overall structural integrity in the face of an earthquake is greatly reduced" (Pallarés). This also brings real seismic response into line with projected seismic response as calculated at the building design stage. 
Francisco Javier Pallerés, also of ICITECH, tells us that "today, seismic calculations only take into consideration the structure of the building frame and do not consider the partition walls, despite the clear and widely-reported influence they have on a building's response to earthquakes." By isolating the partition walls from the main frame, these calculations become more reliable. On top of the convenient brick form, only a relatively small number of these bricks is needed to achieve this seismic isolation. Laboratory testing shows that, if the bricks are arranged in a specific way, just a small number can afford significant gains in seismic wave absorption. Specifically, partition walls that incorporate Sisbricks can absorb horizontal movements on the order of three times greater than those that do not. This translates into considerably less tension in the partition walls, meaning correspondingly less tension is transferred to the building frame during earthquakes. The Sisbrick has been patented by the UPV. Testing is currently being carried out into the thermal and acoustic isolation afforded by this material, in order to comply with the specifications of the Building Code. The team at ICITECH is currently looking for collaborators for the implementation, manufacture and commercialization of this product.
Earthquakes
2015
November 24, 2015
https://www.sciencedaily.com/releases/2015/11/151124122045.htm
How Earth's Pacific plates collapsed
Scientists drilling into the ocean floor have for the first time found out what happens when one tectonic plate first gets pushed under another.
The international expedition drilled into the Pacific Ocean floor and found distinctive rocks formed when the Pacific tectonic plate changed direction and began to plunge under the Philippine Sea Plate about 50 million years ago. "It's a bit like a rugby scrum, with two rows of forwards pushing on each other. Then one side goes down and the other side goes over the top," said study leader Professor Richard Arculus, from The Australian National University (ANU). "But we never knew what started the scrum collapsing," said Professor Arculus, a petrologist in the ANU Research School of Earth Sciences. The new knowledge will help scientists understand the huge earthquakes and volcanoes that form where the Earth's plates collide and one plate gets pushed under the other. As part of the International Ocean Discovery Program, the team studied the sea floor in 4,700 metres of water in the Amami Sankaku Basin of the north-western Pacific Ocean, near the Izu-Bonin-Mariana Trench, which forms the deepest parts of the Earth's oceans. Drilling 1,600 metres into the sea floor, the team recovered rock types from the extensive rifts and big volcanoes that were initiated as one plate bored under the other in a process known as subduction. "We found rocks low in titanium, but high in scandium and vanadium, so the Earth's mantle overlying the subducting plate must have been around 1,300 degrees Celsius and perhaps 150 degrees hotter than we expected to find," Professor Arculus said. The team found the tectonic scrum collapsed at the south end first and then the Pacific Plate rapidly collapsed 1,000 kilometres northwards in about one million years. "It's quite complex. There's a scissoring motion going on. You'd need skycam to see the 3D nature of it," Professor Arculus said. Professor Arculus said that the new knowledge could give insights into the formation of copper and gold deposits that are often formed where plates collide.
Earthquakes
2015
November 23, 2015
https://www.sciencedaily.com/releases/2015/11/151123202627.htm
Stretchy slabs found in the deep Earth
A new study suggests that the common belief that Earth's rigid tectonic plates stay strong when they slide under another plate, a process known as subduction, may not hold universally.
Typically during subduction, plates slide down at a constant rate into the warmer, less-dense mantle at a fairly steep angle. However, in a process called flat-slab subduction, the lower plate moves almost horizontally underneath the upper plate. By studying the speed at which seismic waves travel in different directions through the same material, a phenomenon called seismic anisotropy, the researchers found that the interior of the Nazca plate had been deformed during subduction. Lead author of the study, Dr Caroline Eakin, Research Fellow in Ocean and Earth Science at the University of Southampton, said: "The process of consuming old seafloor at subduction zones, where great slabs of oceanic material are swallowed up, drives circulation in the Earth's interior and keeps the planet going strong. One of the most crucial but least known aspects of this process is the strength and behavior of oceanic slabs once they sink below the Earth's surface. Our findings provide some of the first direct evidence that subducted slabs are not only weaker and softer than conventionally envisioned, but also that we can peer inside the slab and directly witness their behavior as they sink." When oceanic plates form at mid-ocean ridges, their movement away from the ridge causes olivine (the most abundant mineral in the Earth's interior) to align with the direction of plate growth. This olivine structure is then 'frozen' into the oceanic plate as it travels across the Earth's surface. The olivine fabric causes the seismic waves to travel at different speeds in different directions, depending on whether or not they are going 'with the grain' or 'against the grain'. The scientists measured seismic waves at 15 local seismic stations over two and a half years, from 2010 to 2013, and at seven further stations located on different continents. 
They found that the original olivine structure within the slab had vanished and been replaced by a new olivine alignment in an opposing orientation to before. Dr Eakin said: "The best way to explain this observation is that the slab's interior must have been stretched or deformed during subduction. This means that slabs are weak enough to deform internally in the upper mantle over time." The researchers believe that deformation associated with stretching of the slab, as it bends to take on its flat-slab shape, was enough to erase the frozen olivine structure and create a new alignment, which closely follows the contours of the slab bends. "Imaging Earth's plates once they have sunk back into the Earth is very difficult," said Lara Wagner, from the Carnegie Institution for Science and a principal investigator of the PULSE Peruvian project. "It's very exciting to see results that tell us more about their ultimate fate, and how the materials within them are slowly reworked by the planet's hot interior. The original fabric in these plates stays stable for so long at the Earth's surface that it is eye-opening to see how dramatically and quickly that can change," Wagner added.
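Seismic anisotropy is measured by how much faster shear waves travel 'with the grain' of the olivine fabric than against it, which splits a wave into a fast and a slow pulse. A back-of-envelope sketch of the splitting delay such a fabric would produce, where the path length, wave speed, and anisotropy strength are all assumed values, not measurements from the study:

```python
# Back-of-envelope shear-wave-splitting delay through an anisotropic layer.
# All three input numbers are assumptions chosen for illustration.
path_km = 50.0      # ray path length inside the deformed slab
v_mean = 4.6        # mean shear-wave speed, km/s
anisotropy = 0.04   # 4% fast/slow velocity contrast from aligned olivine

v_fast = v_mean * (1 + anisotropy / 2)
v_slow = v_mean * (1 - anisotropy / 2)
# Extra travel time accumulated by the wave polarised 'against the grain':
delay_s = path_km / v_slow - path_km / v_fast
print(f"predicted splitting delay: {delay_s:.2f} s")
```

Delays of a few tenths of a second are the kind of signal station networks like the one described can resolve, and the orientation of the fast direction is what reveals how the fabric, and hence the slab interior, has been reworked.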
Earthquakes
2015
November 23, 2015
https://www.sciencedaily.com/releases/2015/11/151123201945.htm
Mountain ranges evolve, respond to Earth's climate, study shows
Ground-breaking new research has shown that erosion caused by glaciation during ice ages can, in the right circumstances, wear down mountains faster than plate tectonics can build them.
The international study, including Dr Ian Bailey from the University of Exeter, has given a fascinating insight into how climate and tectonic forces influence mountain building over a prolonged period of time. The research team attempted to measure all the material that left and entered the St Elias Mountain range, on the Alaskan coast, over the past five million years, using state-of-the-art seismic imaging equipment and marine coring. They found that erosion accelerated sharply when global climate cooling triggered stronger and more persistent ice ages about one million years ago. The pioneering new research, the culmination of more than a decade of field work, has shown that mountain ranges actively evolve with, and respond to, Earth's climate, rather than being static, unyielding parts of the landscape. The international study was conducted by the Integrated Ocean Drilling Program and led by scientists from The University of Texas at Austin, the University of Florida and Oregon State University. Dr Bailey, a Geology Lecturer from the Camborne School of Mines, based at the University of Exeter's Penryn Campus in Cornwall, said: "Understanding precisely how the balance of climate and tectonic forces influences mountain building remains an outstanding unknown in Earth Sciences." "A tremendous amount of important information has been gained by studying the onshore geology associated with the St Elias Mountain range." "Our exciting findings, which add new insight to this important debate, could only be made, however, by examining the adjacent offshore marine sediment record." The study, conducted by a team of scientists from 10 countries, used seismic equipment to image and map a huge fan of sediment in the deep sea in the Gulf of Alaska caused by erosion of the St Elias Mountain range and took short sediment cores to understand the modern system. They then determined when and how fast the fan accumulated by dating nearly four kilometres 
of marine cores collected from the gulf and the Alaskan continental shelf. Sean Gulick, lead author and co-chief scientist from the University of Texas Institute for Geophysics (UTIG), a unit of the Jackson School of Geosciences, said: "It turned out most sediments were younger than we anticipated, and most rates of sediment production and thus erosion were higher than we anticipated." "Since the big climate change during the mid-Pleistocene transition when we switched from short (about 40,000-year) ice ages to super long (about 100,000-year) ice ages, erosion became much greater. In fact, there was more erosion than tectonics has replaced." "We were pleasantly surprised by how well we could establish ages of the sediment sequences as we were drilling, and the composition of the sediment gave clear evidence of when the glaciation started and then expanded, in sync with global climate trends over the past several million years," said co-author Alan Mix of Oregon State University. "Only by drilling the sea floor where the sediment accumulates could we see these details." Mountain ranges form when tectonic plates thrust into one another over millions of years and scrunch up Earth's outer crust. But even as mountains are built by these titanic forces, other agents -- some combination of tectonic and climate processes -- work to remove the accumulating crust. Since the mid-Pleistocene, erosion rates have continued to beat tectonic inputs by 50 to 80 percent, demonstrating that climatic processes, such as the movement of glaciers, can outstrip mountain building over a span of a million years. The findings highlight the pivotal role climate fluctuations play in shaping Earth's landforms.
Earthquakes
2015
November 17, 2015
https://www.sciencedaily.com/releases/2015/11/151117130140.htm
From nanocrystals to earthquakes, solid materials share similar failure characteristics
Apparently, size doesn't always matter. An extensive study by an interdisciplinary research group suggests that the deformation properties of nanocrystals are not much different from those of Earth's crust.
"When solid materials such as nanocrystals, bulk metallic glasses, rocks, or granular materials are slowly deformed by compression or shear, they slip intermittently with slip-avalanches similar to earthquakes," explained Karin Dahmen, a professor of physics at the University of Illinois at Urbana-Champaign. "Typically these systems are studied separately. But we found that the scaling behavior of their slip statistics agree across a surprisingly wide range of different length scales and material structures." "Identifying agreement in aspects of the slip statistics is important, because it enables us to transfer results from one scale to another, from one material to another, from one stress to another, or from one strain rate to another," stated Shivesh Pathak, a physics undergraduate at Illinois and a co-author of the paper "Universal Quake Statistics: From Compressed Nanocrystals to Earthquakes." "The results provide new tools and methods to use the slip statistics to predict future materials deformation," added Michael LeBlanc, a physics graduate student and co-author of the paper. "They also clarify which system parameters significantly affect the deformation behavior on long length scales. We expect the results to be useful for applications in materials testing, failure prediction, and hazard prevention." Researchers representing a broad range of disciplines -- including physics, geosciences, mechanical engineering, chemical engineering, and materials science -- from the United States, Germany, and the Netherlands contributed to the study, comparing five different experimental systems, on several different scales, with model predictions. As a solid is sheared, each weak spot is stuck until the local shear stress exceeds a random failure threshold. It then slips by a random amount until it re-sticks. The released stress is redistributed to all other weak spots. 
Thus, a slipping weak spot can trigger other spots to fail in a slip avalanche. Using tools from the theory of phase transitions, such as the renormalization group, one can show that the slip statistics of the model do not depend on the details of the system. "Although these systems span 13 decades in length scale, they all show the same scaling behavior for their slip size distributions and other statistical properties," stated Pathak. "Their size distributions follow the same simple (power law) function, multiplied with the same exponential cutoff." The cutoff, which is the largest slip or earthquake size, grows with applied force for materials spanning length scales from nanometers to kilometers. The dependence of the size of the largest slip or quake on stress reflects "tuned critical" behavior, rather than so-called self-organized criticality, which would imply stress-independence. "The agreement of the scaling properties of the slip statistics across scales does not imply the predictability of individual slips or earthquakes," LeBlanc said. "Rather, it implies that we can predict the scaling behavior of average properties of the slip statistics and the probability of slips of a certain size, including their dependence on stress and strain-rate."
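The quoted distribution, a power law multiplied by an exponential cutoff, can be written as P(S) ∝ S^(−τ) · exp(−S/S_max), with the cutoff S_max growing with applied stress ("tuned critical" behavior). A small sketch with illustrative parameter values, where τ and the cutoff's stress scaling are assumptions, not the paper's fitted numbers:

```python
import math

# Slip-size statistics of the form P(S) ~ S**(-tau) * exp(-S / S_max):
# a power law with an exponential cutoff. The exponent tau and the
# cutoff's stress dependence below are illustrative assumptions.
tau = 1.5

def largest_slip(stress, s0=1.0, gamma=2.0):
    # "Tuned critical" behaviour: the cutoff grows with applied stress.
    return s0 * stress ** gamma

def slip_weight(s, stress):
    # Unnormalised probability weight for a slip of size s.
    return s ** (-tau) * math.exp(-s / largest_slip(stress))

# Doubling the stress stretches the cutoff, making large slips far likelier.
for stress in (1.0, 2.0):
    print(f"stress={stress}: weight of a size-5 slip ~ {slip_weight(5.0, stress):.2e}")
```

This is the stress dependence the article contrasts with self-organized criticality: under a stress-independent cutoff the size-5 weight would not change between the two lines.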
Earthquakes
2015
November 16, 2015
https://www.sciencedaily.com/releases/2015/11/151116120806.htm
Discovery of hidden earthquake presents challenge to earthquake early-warning systems
Seismologists at the University of Liverpool studying the 2011 Chile earthquake have discovered a previously undetected earthquake which took place seconds after the initial rupture.
This newly discovered phenomenon, which they called a 'closely-spaced doublet', presents a challenge to earthquake and tsunami early warning systems, as it increases the risk of larger-than-expected tsunamis in the aftermath of a typical subduction earthquake. The researchers discovered that just 12 seconds later and 30 km further offshore, a second rupture of a similar size, which went undetected by national and global earthquake monitoring centres, occurred along an extensional (pull-apart) fault in the middle of the South American plate beneath the Pacific Ocean. Liverpool seismologist Professor Andreas Rietbrock said: "Real-time global seismic monitoring and early warning events have come a long way and it is possible for a magnitude 5 or greater earthquake to be detected within a matter of minutes. Therefore, it is striking that an earthquake with magnitude close to 7 was effectively hidden from our standard monitoring systems." "Previous doublet events have been documented in subduction zones before, but such instantaneous triggering of large ruptures at close distances has no known precedent. Such triggered events dramatically complicate potential earthquake impact assessments and tsunami early warning systems as the risk of a larger than expected tsunami is higher following a typical subduction earthquake." Dr Stephen Hicks, who was part of the research team, said: "We believe that seismic waves travelling outward from the first rupture immediately shook up and weakened the shallower second fault, causing the hidden rupture. Scientists believe that the overlying plate at collisional plate boundaries is broken up on a large scale and contains networks of faults. It is plausible that similar closely-spaced doublets may occur elsewhere around the Pacific Ring of Fire." Professor Rietbrock added: "This work challenges the commonly-held notion that slip during large earthquakes may only occur along a single fault. 
The result was surprising as there was no indication of such a complicated rupture from global earthquake monitoring systems." "Our findings present a concern for tsunami early warning systems. Without real-time monitoring of seismometers located close to the fault, it is possible that tsunami and shaking hazard from future subduction earthquakes may be underestimated." As part of the University's Liverpool Earth Observatory, seismologists are installing a seismic network in Southern Peru in close collaboration with the Geophysical Institute of Peru. This area along the South American continental margin has the potential for a large magnitude 8+ earthquake, and it is important to understand the associated seismic and tsunami hazard.
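A standard way monitoring systems flag events is the STA/LTA ratio: the short-term average of signal amplitude divided by the long-term average. The toy trace below, with entirely made-up numbers, shows why a second rupture arriving 12 seconds into the first event's coda barely raises the trigger:

```python
import math

# Synthetic amplitude trace (one sample per second): a large burst at
# t=10 s with a decaying coda, and a smaller burst 12 s later.
# All numbers are invented for illustration.
trace = [1.0] * 60
for t in range(10, 60):
    trace[t] += 50 * math.exp(-(t - 10) / 8)   # first rupture + coda
for t in range(22, 60):
    trace[t] += 20 * math.exp(-(t - 22) / 8)   # second, 'hidden' rupture

def sta_lta(x, i, sta=3, lta=15):
    # Short-term average over long-term average, both ending at sample i.
    if i < lta:
        return 0.0
    return (sum(x[i - sta:i]) / sta) / (sum(x[i - lta:i]) / lta)

ratios = [sta_lta(trace, i) for i in range(len(trace))]
print("peak trigger ratio, first event :", round(max(ratios[:20]), 1))
print("peak trigger ratio, second event:", round(max(ratios[20:]), 1))
```

Because the long-term average is already inflated by the first event's coda, the second burst hardly moves the ratio, which is one intuition for how a near-magnitude-7 rupture can hide from automated detection.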
Earthquakes
2015
November 11, 2015
https://www.sciencedaily.com/releases/2015/11/151111143225.htm
Plate tectonics thanks to plumes?
"Knowing what a chicken looks like and what all the chickens before it looked like doesn't help us to understand the egg," says Taras Gerya. The ETH Professor of Geophysics uses this metaphor to address plate tectonics and the early history of the Earth. The Earth's lithosphere is divided into several plates that are in constant motion, and today's geologists have a good understanding of what drives these plate movements: heavier ocean plates are submerged beneath lighter continental plates along what are known as subduction zones. Once the movement has begun, it is perpetuated due to the weight of the dense subducting plate.
But just as in the past, earth scientists still do not understand what triggered plate tectonics in the first place, nor how the first subduction zone was formed. A weak spot in the Earth's lithosphere was necessary in order for parts of the Earth's crust to begin their descent into the Earth's mantle. Was this weak spot caused by a gigantic meteorite that effectively smashed a hole in the Earth's lithosphere? Or did mantle convection forces shatter the lithosphere into moving parts? Gerya is not satisfied with any of these potential explanations. "It's not trivial to draw conclusions about what set the tectonic movements in motion," he says. The ETH professor therefore set out to find a new, plausible explanation. Among other things, he found inspiration in studies about the surface of the planet Venus, which has never had plate tectonics. Gerya observed (and modelled) huge, crater-like circles (coronae) on Venus that may also have existed on the Earth's surface in the early period (Precambrian) of the Earth's history before plate tectonics even began. These structures could indicate that mantle plumes once rose from Venus' iron core to the outer layer, thus softening and weakening the planet's surface. Plumes form in the deep interior of the planet. They rise up to the lithosphere, bringing with them hot partially molten mantle material that causes the lithosphere to weaken and deform. Halted by the resistance of the hard lithosphere, the material begins to spread, taking on a mushroom-like shape. Such plumes also likely existed in the Earth's interior and could have created the weaknesses in the Earth's lithosphere needed to initiate plate tectonics on Earth. The ETH geophysicist worked with his team to develop new computer models that he then used to investigate this idea for the first time in high resolution and in 3D. 
The corresponding study has recently been published. The simulations show that mantle plumes and the weaknesses they create could indeed have initiated the first subduction zones. In the simulations, the plume weakens the overlying lithosphere and forms a circular, thinning weak point with a diameter of several dozen to hundreds of kilometres. This is stretched over time by the supply of hot material from the deep mantle. "In order to make a ring larger, you have to break it," explains the researcher. This also applies to the Earth's surface: the ring-shaped weaknesses can (in the model) only be enlarged and subducted if the margins are torn. The tears spread throughout the lithosphere, large slabs of the heavier rigid lithosphere plunge into the soft mantle, and the first plate margins emerge. The tension created by the plunging slabs ultimately sets the plates in motion. They plunge, well lubricated by the buried seawater of the ocean above. Subduction has begun -- and with it, plate tectonics. "Water acts as a lubricant and is an absolute necessity in the initiation of a self-sustaining subduction," says Gerya. In their simulations, the researchers compare different temperature conditions and lithosphere states. They came to the conclusion that plume-induced plate tectonics could plausibly develop under the conditions that prevailed in the Precambrian around three billion years ago. Back then the Earth's lithosphere was already thick and cool, but the mantle was still very hot, providing enough energy to significantly weaken the lithosphere above the plumes. Had the lithosphere instead been thin and warm, and therefore soft, the simulations show that a ring-shaped, rapidly descending structure called a 'drip' would simply have formed around the plume head. While this would have steadily sunk into the mantle, it would not have caused the soft lithosphere to subduct and tear and therefore would not have produced plate margins. 
Likewise, the computer simulations showed that under today's conditions, where there is less temperature difference between lithosphere and plume material, plume-induced subduction is hard to initiate because the lithosphere is already too rigid and the plumes are barely able to weaken it sufficiently. "Our new models explain how plate tectonics came about," says the geophysicist. Plume activity was enough to give rise to today's plate mosaic. He calls the power of the plumes the dominant trigger for global plate tectonics. The simulations can also explain how so-called triple junctions, i.e. zones in which three plates come together, are nucleated by multi-directional stretching of the lithosphere induced by plumes. One such example of a triple junction can be found in the Horn of Africa where Ethiopia, Eritrea and Djibouti meet. A possible plume-weakened zone analogous to a starting point for global plate tectonics likely exists in the modern world: the researchers see such a zone in the Caribbean plate. Its shape, location and spread correspond largely to the new model simulations. Indeed, it is arguably impossible to prove how global plate tectonics started on Earth based solely on observations: there is no geophysical and only a small amount of geological data from the Earth's early years, and laboratory experiments are not possible for extremely large-scale and very long-term tectonic processes, says the ETH researcher. "Computer models are therefore the only way we can reproduce and understand the events of the Earth's early history."
Earthquakes
2015
November 11, 2015
https://www.sciencedaily.com/releases/2015/11/151111115334.htm
Predicting earthquakes: Titan takes on the big one
The San Andreas Fault system, which runs almost the entire length of California, is prone to shaking, causing about 10,000 minor earthquakes each year just in the southern California area.
However, cities that line the fault, like Los Angeles and San Francisco, have not experienced a major destructive earthquake -- of magnitude 7.5 or more -- since their intensive urbanizations in the early twentieth century. With knowledge that large earthquakes occur at about 150-year intervals on the San Andreas, seismologists are certain that the next "big one" is near.The last massive earthquake to hit San Francisco, having a 7.8 magnitude, occurred in 1906, taking 700 lives and causing $400 million worth of damage. Since then, researchers have collected data from smaller quakes throughout California, but such data doesn't give emergency officials and structural engineers the information they need to prepare for a quake of magnitude 7.5 or bigger.With this in mind, a team led by Thomas Jordan of the Southern California Earthquake Center (SCEC), headquartered at the University of Southern California (USC) in Los Angeles, is using the Titan supercomputer at the US Department of Energy's (DOE's) Oak Ridge National Laboratory (ORNL) to develop physics-based earthquake simulations to better understand earthquake systems, including the potential seismic hazards from known faults and the impact of strong ground motions on urban areas."We're trying to solve a problem, and the problem is predicting ground shaking in large earthquakes at specific sites during a particular period of time," Jordan said.Ground shaking depends upon the type of earthquake, the way a fault ruptures, and how the waves propagate, or spread, through all 3-D structures on Earth.Clearly, understanding what might happen in a particular area is no simple task. In fact, the prediction involves a laundry list of complex inputs that could not be calculated all together without the help of Titan, a 27-petaflop Cray XK7 machine with a hybrid CPU-GPU architecture. 
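The roughly 150-year recurrence interval cited above is often converted into a time-window probability with a memoryless Poisson model. A minimal sketch of that standard calculation; only the 150-year figure comes from the article, and the function name and 30-year window are illustrative:

```python
import math

def poisson_quake_probability(mean_interval_years: float, window_years: float) -> float:
    """Probability of at least one event in a window, assuming a
    memoryless Poisson process with the given mean recurrence interval."""
    rate = 1.0 / mean_interval_years          # events per year
    return 1.0 - math.exp(-rate * window_years)

# Rough chance of a San Andreas rupture in the next 30 years,
# given the ~150-year recurrence interval mentioned above.
p30 = poisson_quake_probability(150.0, 30.0)
print(f"{p30:.0%}")  # about 18%
```

The memoryless assumption is a simplification; real forecasts such as UCERF condition on time since the last rupture.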
Titan is managed by the Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at ORNL. Running on Titan, the team uses SCEC's CyberShake -- a physics-based computational approach that integrates many features of an earthquake event -- to calculate a probabilistic seismic hazard map for California. In May, Jordan's team completed its highest-resolution CyberShake map for Southern California using the OLCF's Titan. One of the most important variables that affect earthquake damage to buildings is seismic wave frequency, or the rate at which an earthquake wave repeats each second. With greater detail and increases in the simulated frequency -- from 0.5 hertz to 1 hertz -- the latest CyberShake map is the most useful one to date and serves as an important tool for engineers who use its results to design and build critical infrastructure and buildings. Building structures respond differently to certain frequencies. Large structures like skyscrapers, bridges, and highway overpasses are sensitive to low-frequency shaking, whereas smaller structures like homes are more likely to be damaged by high-frequency shaking, which ranges from 2 to 10 hertz and above. High-frequency simulations are more computationally complex, however, limiting the information that engineers have for building safer structures that are sensitive to these waves. Jordan's team is attempting to bridge this gap. "We're in the process of trying to bootstrap our way to higher frequencies," Jordan said. Let's Get Physical: The process that Jordan's team follows begins with historical earthquakes. "Seismology has this significant advantage of having well-recorded earthquake events that we can compare our simulations against," said Philip Maechling, team member and computer scientist at USC. "We develop the physics codes and the 3-D models, then we test them by running a simulation of a well-observed historic earthquake. 
We compare the simulated ground motions that we calculate against what was actually recorded. If they match, we can conclude that the simulations are behaving correctly."The team then simulates scenario earthquakes, individual quakes that have not occurred but that are cause for concern. Because seismologists cannot get enough information from scenario earthquakes for long-term statements, they then simulate all possible earthquakes by running ensembles, a suite of simulations that differ slightly from one another."They're the same earthquake with the same magnitude, but the rupture characteristics -- where it started and how it propagated, for example -- will change the areas at Earth's surface that are affected by this strong ground motion," Maechling said.As the team increased the maximum frequency in historic earthquake simulations, however, they identified a threshold right around 1 hertz, at which their simulations diverged from observations. The team determined it needed to integrate more advanced physics into its code for more realistic results."One of the simplifications we use in low-frequency simulations is a flat simulation region," Maechling said. "We assume that Earth is like a rectangular box. I don't know if you've been to California, but it's not flat. There are a lot of hills. This kind of simplifying assumption worked well at low frequencies, but to improve these simulations and their results, we had to add new complexities, like topography. We had to add mountains into our simulation."Including topography -- the roughness of Earth's surface -- the team's simulations now include additional geometrical and attenuation (gradual dampening of the shaking due to loss of energy) effects -- near-fault plasticity, frequency-dependent attenuation, small-scale near-surface heterogeneities, near-surface nonlinearity, and fault roughness.On Titan, the team introduced and tested the new physics calculations individually to isolate their effects. 
By the end of 2014, the team updated the physics in its code to get a complete, realistic simulation capability that is now able to perform simulations using Earth models near 4 hertz. "The kind of analysis we're doing has been done in the past, but it was using completely empirical techniques -- looking at data and trying to map observations onto new situations," Jordan said. "What we're doing is developing a physics-based seismic hazard analysis, where we get tremendous gains by incorporating the laws of physics, to predict what will be in the future. This was impossible without high-performance computing. We are at a point now where computers can do these calculations using physics and improve our ability to do the type of analysis necessary to create a safe environment for society." Movers and Shakers: With the new physics included in SCEC's earthquake code -- the Anelastic Wave Propagation by Olsen, Day, and Cui (AWP-ODC) -- Jordan's team was able to run its first CyberShake hazard curve on Titan for one site at 1 hertz, establishing the computational technique in preparation for a full-fledged CyberShake map. A seismic hazard curve gives the probabilities that an earthquake will occur at a specific site, within a given time frame, and with ground shaking exceeding a given threshold. The team used the US Geological Survey's (USGS's) Uniform California Earthquake Forecast -- which identifies all possible earthquake ruptures for a particular site -- for generating CyberShake hazard curves for 336 sites across southern California. This May, the team calculated hazard curves for all 336 sites needed to complete the first 1 hertz urban seismic hazard map for Los Angeles. 
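The hazard-curve idea described above -- combining the rates of many possible ruptures with each one's chance of producing shaking beyond a threshold -- can be sketched in a few lines. This is a generic probabilistic-seismic-hazard toy, not CyberShake's actual method; the rupture rates and exceedance probabilities below are invented for illustration:

```python
import math

# Toy rupture catalogue: (annual rate of the rupture, probability that
# shaking at the site exceeds the chosen threshold if it occurs).
ruptures = [
    (1 / 150, 0.40),   # large, infrequent event
    (1 / 50,  0.10),   # moderate event
    (1 / 10,  0.01),   # small, frequent event
]

def annual_exceedance_rate(ruptures):
    """Sum each rupture's contribution to the rate of exceeding the
    ground-motion threshold -- one point on a hazard curve."""
    return sum(rate * p_exceed for rate, p_exceed in ruptures)

def prob_exceedance(ruptures, years: float) -> float:
    """Probability of at least one exceedance in `years` (Poisson model)."""
    return 1.0 - math.exp(-annual_exceedance_rate(ruptures) * years)

print(f"{prob_exceedance(ruptures, 50):.1%}")
```

Sweeping the threshold (and thus the per-rupture exceedance probabilities) over a range of ground-motion levels traces out the full hazard curve for the site.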
With double the maximum simulated frequency from last year's 0.5 hertz map, this map proves to be twice as accurate.The map will be registered into the USGS Urban Seismic Hazard Map project, and when it passes the appropriate scientific and technical review, its results will be submitted for use in the 2020 update of the Recommended Seismic Provisions of the National Earthquake Hazards Reduction Program.This major milestone in seismic hazard analysis was possible only with the help of Titan and its GPUs."Titan gives us the ability to submit jobs onto many GPU-accelerated nodes at once," Jordan said. "There's nothing comparable. Even with other GPU systems, we can't get our jobs through the GPU queue fast enough to keep our research group busy. Titan is absolutely the best choice for running our GPU jobs."Yifeng Cui, team member and computational scientist at the San Diego Supercomputer Center, modified AWP-ODC to take advantage of Titan's hybrid architecture, thereby improving performance and speed-up. He was awarded NVIDIA's 2015 Global Impact Award for his work."It's fantastic computer science," Jordan said. "What Yifeng has done is get in and really use the structure of Titan in an appropriate way to speed up what are very complicated codes. We have to manipulate a lot of variables at each point within these very large grids and there's a lot of internal communication that's required to do the calculations."Using Cui's GPU-accelerated code on Titan, the team ran simulations 6.3 times more efficiently than the CPU-only implementation, saving them 2 million core hours for the project. Completion of the project required about 9.6 million core hours on Titan."The computational time required to do high-frequency simulations takes many node hours," Maechling said. "It could easily take hundreds of thousands of node hours. That's a huge computational amount that well exceeds what SCEC has available at our university. 
These pushing-to-higher-frequency earthquake simulations require very large computers because the simulations are computationally expensive. We really wouldn't be able to do these high-frequency simulations without a computer like Titan."With Titan, Jordan's team plans to push the maximum simulated frequency above 10 hertz to better inform engineers and emergency officials about potential seismic events, including the inevitable "big one.""We have the potential to have a positive impact and to help reduce the risks from earthquakes," Maechling said. "We can help society better understand earthquakes and what hazards they present. We have the potential to make a broad social impact through safer environment."
Earthquakes
2015
November 9, 2015
https://www.sciencedaily.com/releases/2015/11/151109103908.htm
Microplate discovery dates birth of Himalayas
An international team of scientists has discovered the first oceanic microplate in the Indian Ocean--helping identify when the initial collision between India and Eurasia occurred, leading to the birth of the Himalayas.
Although there are at least seven microplates known in the Pacific Ocean, this is the first ancient Indian Ocean microplate to be discovered. Radar beam images from an orbiting satellite have helped put together pieces of this plate tectonic jigsaw and pinpointed the age for the collision, whose precise date has divided scientists for decades.Reported in Researchers led by the University of Sydney School of Geosciences discovered that crustal stresses caused by the initial collision cracked the Antarctic Plate far away from the collisional zone and broke off a fragment the size of Australia's Tasmania in a remote patch of the central Indian Ocean.The authors, comprising Professor Dietmar Müller and Dr Kara Matthews from the University of Sydney and Professor David Sandwell from the Scripps Institution of Oceanography, have named the ancient Indian microplate the Mammerickx Microplate, after Dr Jacqueline Mammerickx, a pioneer in seafloor mapping.The Mammerickx Microplate rotation is revealed by a rotating pattern of grooves and hills that turn the topography of the ocean floor into a jagged landscape. These so-called "abyssal hills" record a sudden increase in crustal stress, dating the birth of the Himalayan Mountain Range to 47 million years ago.The ongoing tectonic collision between the two continents produces geological stresses that build up along the Himalayas and leads to numerous earthquakes every year--but this latest finding indicates how stressed the Indian Plate became when its northern edge first collided with Eurasia.The new research shows that 50 million years ago, India was travelling northwards at speeds of some 15 centimetres a year--close to the plate tectonic speed limit. Soon after it slammed into Eurasia crustal stresses along the mid-ocean ridge between India and Antarctica intensified to breaking point. 
A chunk of Antarctica's crust broke off and started rotating like a ball bearing, creating the newly discovered tectonic plate.The discovery was made using satellite radar beam mapping from space, which measures the bumps and dips of the sea surface caused by water being attracted by submarine mountains and valleys, combined with conventional marine geophysical data.Lead author Dr Matthews explains: "The age of the largest continental collision on Earth has long been controversial, with age-estimates ranging from at least 59 to 34 million years ago."Knowing this age is particularly important for understanding the link between the growth of mountain belts and major climate change."Co-author Professor Müller said: "Dating this collision requires looking at a complex set of geological and geophysical data, and no doubt discussion about when this major collision first started will continue, but we have added a completely new, independent observation, which has not been previously used to unravel the birth of this collision."It is beyond doubt that the collision must have led to a major change in India's crustal stress field--that's why the plate fragmentation we mapped is a bit like a smoking gun for pinning down the collision age."Co-author Professor Sandwell from the Scripps Institution of Oceanography said humans had explored and mapped remote lands extensively but the same was not true for our ocean basins."We have more detailed maps of Pluto than we do of most of our own planet because about 71 per cent of the Earth's surface is covered with water," Professor Sandwell said."Roughly 90% of the seafloor is uncharted by ships and it would take 200 ship-years of time to make a complete survey of the deep ocean outside continental shelves, at a cost of between two- to three billion US dollars."That's why advances in comparatively low-cost satellite technology are the key to charting the deep, relatively unknown abyssal plains, at the bottom of the ocean."The paper 
'Oceanic microplate formation records the onset of India-Eurasia collision' was published in
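The plate speed quoted above -- some 15 centimetres a year, "close to the plate tectonic speed limit" -- converts to geological distances with simple unit arithmetic. A small illustrative check (the 3-million-year span is just the gap between the 50 Ma speed estimate and the 47 Ma collision age):

```python
def plate_travel_km(speed_cm_per_year: float, million_years: float) -> float:
    """Distance travelled at a constant plate speed."""
    cm = speed_cm_per_year * million_years * 1e6
    return cm / 1e5   # 1 km = 1e5 cm

# India at ~15 cm/yr over ~3 million years:
print(plate_travel_km(15.0, 3.0))  # 450.0 km
```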
Earthquakes
2015
November 5, 2015
https://www.sciencedaily.com/releases/2015/11/151105121523.htm
Climate change is moving mountains
For millions of years global climate change has altered the structure and internal movement of mountain ranges, but the resulting glacial development and erosion can in turn change a mountain's local climate. The degree of this cause-and-effect relationship has never been clearly observed, until now.
Based on research led by University of Cincinnati geologist Eva Enkelmann in the St. Elias Mountain Range -- located along the Pacific coastal region of North America -- the way a mountain range moves and behaves topographically can also change and create its local climate by redirecting wind and precipitation. The repercussions of these changes can, in turn, accelerate the erosion and tectonic seismic activity of that mountain range. Based on her findings, Enkelmann shows clear evidence for a strong relationship between global and local climate change and a mountain's internal tectonic plate shifts and topographic changes. Enkelmann, an assistant professor in the University of Cincinnati Department of Geology, was among several UC researchers and thousands of geoscientists from around the globe presenting their findings at the 2015 Annual Geological Society of America Meeting, Nov. 1-4, in Baltimore. This research also was published in July in the journal "Understanding how mountain structures evolve through geologic time is no quick task because we are talking millions of years," says Enkelmann. "There are two primary processes that result in the building and eroding of mountains and those processes are interacting." Looking at the St. Elias Mountains in particular, Enkelmann notes how dry it is in the northern part of the mountain range. But precipitation is very high in the southern area, resulting in more erosion and material coming off the southern flanks. So as climate change influences erosion, it can produce a shift in the tectonics. This has been suggested in earlier studies based on numerical and analytical models; however, it had not yet been shown to have occurred over geologic time in the real world. Enkelmann synthesized several different data sets to show how a rapid exhumation occurred in the central part of the mountain range between four and two million years ago. 
This feedback process between erosion and internal tectonic shifting resulted in a mass of material moving up toward the surface very rapidly. Enkelmann's model suggests that global climate shifts triggered a change in the rheology -- the way material behaves. While Earth was much warmer millions of years ago, glaciers still existed at high altitudes. However, 2.6 million years ago Earth experienced a shift to a colder climate and glaciation intensified. Existing glaciers grew larger, froze solid, covered the area and did not move. Enkelmann says the glaciers today are wet-based and are moving, very aggressively eroding material around and out -- in the case of her observation, into the Gulf of Alaska. The tectonic forces (internal plates moving toward one another) continue to move toward Alaska, get pushed underneath, and the sediment on top piles up above the Yakutat plate. Adding to the already complex effects of climate change, these processes essentially work against each other. The movement of glaciers can compete with the internal buildup and develop a feedback process that is very rapid and ferocious. Scientists have suggested that the Himalayas, European Alps and mountains in Taiwan were shaped by the same competing processes as those Enkelmann has observed in southeastern Alaska. In Enkelmann's observation, climate-driven erosion can influence the tectonics and change the motion of the rocks in that area. This makes the St. Elias Mountain Range particularly ideal to study, because the area is very active tectonically, with strong glacial erosion. As an example, she cites the Great Alaskan Earthquake of 1964 -- the world's second-largest earthquake recorded to date -- which also resulted in a tsunami. "In 1899, there were two big earthquakes in a row, an 8.1 and an 8.2 magnitude," says Enkelmann, pointing to a photo of the resulting shoreline lift that still stands today. 
"These earthquakes resulted in up to 14 meters of co-seismic uplift on the shore, so the shoreline basically popped up 14 meters (45 feet) and it happened immediately."Our biggest concern today is the continued potential for earthquakes that can also result in tsunamis," says Enkelmann.Enkelmann appreciates the challenge of collecting samples here because this range has the highest peaks of any coastal mountain range and is only 20 kilometers from the Pacific Ocean, but she points out that it is a tough area to study because of the big ice sheets."So as geologists, we go to the area and take samples and do measurements in the field on the mountain ranges that stick out," says Enkelmann. "One approach is to sample the material that comes out of the glaciers that has transported the eroded sediment and analyze that sediment."By going to all of these individual glaciers, we can get a much better understanding of what has happened and what was moved on the entire mountain range."
Earthquakes
2015
November 3, 2015
https://www.sciencedaily.com/releases/2015/11/151103064728.htm
Past earthquakes play a role in future landslides, research suggests
The likelihood of an area experiencing a potentially devastating landslide could be influenced by its previous exposure to earthquakes many decades earlier.
This is according to new research led by Cardiff University showing that areas which have experienced strong earthquakes in the past were more likely to produce landslides when a second earthquake hit later on. Researchers speculate that this is because damage can reside in the sides of mountains after an initial earthquake, and that the consequences of this damage may only be felt when a second earthquake hits. These new insights could have important implications for disaster management and prevention by helping researchers better predict areas that may be susceptible to future landslides. The landslides that occurred after two large earthquakes hit Nepal earlier this year, killing more than 9,000 people and inflicting widespread damage, serve to show how valuable a prediction tool would be. Predictive models that are currently used to assess the likelihood of landslides do not consider historical occurrences of previous earthquakes, and instead focus on the strength of the earthquake and the characteristics of the particular area, including the make-up of rock and the steepness of slopes. "This could potentially be a significant gap in our understanding of the factors that lead to landsliding," said Dr Robert Parker, lead author of the paper, from Cardiff University's School of Earth and Ocean Sciences. After the Nepal earthquakes, a program called ShakeSlide, developed by Dr Parker, was used to predict areas affected by landslides and assist in post-disaster efforts. 
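The paper's actual model is not given here, but the idea of adding a legacy-shaking term to the standard factors (shaking strength, slope steepness) can be sketched as a logistic susceptibility model. Everything below -- the function name, the coefficients, the inputs -- is hypothetical, chosen only to show how a past-earthquake covariate would shift the predicted probability:

```python
import math

def landslide_probability(slope_deg, pga_g, past_pga_g,
                          b0=-6.0, b_slope=0.12, b_pga=4.0, b_past=1.5):
    """Logistic susceptibility model: standard factors (slope, current
    peak ground acceleration) plus a legacy term for shaking experienced
    in a past earthquake. Coefficients are illustrative, not fitted."""
    z = b0 + b_slope * slope_deg + b_pga * pga_g + b_past * past_pga_g
    return 1.0 / (1.0 + math.exp(-z))

# Same slope and current shaking, with and without strong past shaking:
fresh = landslide_probability(slope_deg=30, pga_g=0.4, past_pga_g=0.0)
damaged = landslide_probability(slope_deg=30, pga_g=0.4, past_pga_g=0.5)
print(damaged > fresh)  # True: legacy shaking raises the prediction
```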
These new findings may lead to improved predictions, through models that consider the legacy of past earthquakes. To reach their conclusions, the research team analysed data from two individual earthquakes that occurred in close proximity to each other, in 1929 and 1968, on the South Island of New Zealand. The epicentres of the two earthquakes were around 21 km apart and both triggered landslides over a large area. The researchers first analysed the influence that standard factors, such as the strength of the earthquake and the gradient of hillslopes, had on the distribution of landslides. Where the results were unexplained by these standard factors, the researchers investigated whether they could be attributed to the legacy of previous events. Their results suggested that hillslopes in regions that experienced strong ground motions in the 1929 earthquake were more likely to fail during the 1968 earthquake than would be expected on the basis of the standard factors alone. "Our results suggest that areas that experienced strong shaking in the first earthquake were more likely to produce landslides in the second earthquake than would be expected based on the strength of shaking and hillslope characteristics alone," said Dr Parker. Dr Parker and his team have speculated that the increased likelihood of occurrence may be down to the fact that damage persists in the landscape after an initial earthquake, making it sufficiently weaker and thus more prone to a landslide if another earthquake hits in the future. Dr Parker continued: "Strong shaking in a past earthquake may actually cause mountains to be more hazardous, in terms of landslides, in a future earthquake many years or decades later. 
You could think of it as mountains remembering past earthquakes, which affects how they respond to future earthquakes."Dr Parker and his team are now investigating whether this 'memory effect' is seen in other areas, and have begun investigating the earthquakes that occurred in Nepal.The new study has been published in the journal
Earthquakes
2015
November 2, 2015
https://www.sciencedaily.com/releases/2015/11/151102130134.htm
Northwest's next big earthquake: Source mapped
A large team of scientists has nearly completed the first map of the mantle under the tectonic plate that is colliding with the Pacific Northwest and putting Seattle, Portland and Vancouver at risk of the largest earthquakes and tsunamis in the world.
A new report from five members of the mapping team describes how the movement of the ocean-bottom Juan de Fuca plate is connected to the flow of the mantle 150 kilometers (100 miles) underground, which could help seismologists understand the forces generating quakes as large as the destructive Tohoku quake that struck Japan in 2011."This is the first time we've been able to map out the flow of mantle across an entire plate, so as to understand plate tectonics on a grand scale," said Richard Allen, a professor and chair of earth and planetary science at the University of California, Berkeley, and the senior author of a paper published online Nov. 2 in the journal The major surprise, Allen said, is that the mantle beneath a small piece of the Juan de Fuca plate is moving differently from the rest of the plate, resulting in segmentation of the subduction zone. Similar segmentation is seen in Pacific Northwest megaquakes, which don't always break along the entire 1,000-kilometer (600-mile) length, producing magnitude 9 or greater events. Instead, it often breaks along shorter segments, generating quakes of magnitude 7 or 8.The Juan de Fuca plate offshore of Oregon, Washington and British Columbia is small -- about the size of California and 50-70 kilometers thick -- but "big enough to generate magnitude 9 earthquakes" as it's shoved under the continental North American plate, Allen said. 
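The difference between the magnitude 7-8 segment ruptures and a full-length magnitude 9 event mentioned above is larger than the numbers suggest: by the definition of moment magnitude, seismic moment grows by a factor of 10^1.5 (about 32) per magnitude unit. A small sketch of that standard scaling:

```python
def moment_ratio(m_big: float, m_small: float) -> float:
    """Ratio of seismic moments between two moment magnitudes:
    moment scales as 10**(1.5 * M), i.e. ~32x per magnitude unit."""
    return 10 ** (1.5 * (m_big - m_small))

# A full-length M9 rupture versus the shorter segment breaks:
print(round(moment_ratio(9.0, 8.0)))   # 32
print(round(moment_ratio(9.0, 7.0)))   # 1000
```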
Because of the hazard from this so-called Cascadia Subduction Zone, a recent New Yorker article portrayed the area as a disaster waiting to happen, predicting that "an earthquake will destroy a sizable portion of the coastal Northwest." But little is known about the tectonic plates submerged under the oceans, how they are linked to processes inside Earth, such as the melted mantle rock underlying them, or how the crust and mantle interact to cause megathrust earthquakes at subduction zones. The Juan de Fuca plate is one of seven major and dozens of minor plates that cover Earth like a jigsaw puzzle, pushed around by molten rock rising at mid-ocean ridges and, at their margins, diving under other plates or ramming into them to generate mountain ranges like the Himalayas. Subduction around the rim of the Pacific -- where oceanic plates, including the Pacific Plate, the largest of all, plunge beneath their neighbors -- creates a "ring of fire" dotted with volcanoes and mountain ranges and imperiled by earthquakes. Until now, however, scientists have deployed only a handful of seismometers on the seabed worldwide to explore the mantle underlying these plates, said Allen, who also is director of the Berkeley Seismological Laboratory and one of the co-principal investigators for the $20 million Cascadia Initiative. Led by the University of Oregon, the initiative is funded by the National Science Foundation to develop new underwater and on-shore seismic instruments to measure the plate's interaction with the mantle or asthenosphere, and monitor quake and volcanic activity at the trench off the coast where the Juan de Fuca plate subducts under the North American plate. 
"We've learned a lot from the deployment of these new instruments, and now have a giant array that we know works well on the seafloor and which we can move somewhere else in the future for a similar experiment."While the deployment of seismometers at 120 sites on the ocean floor was a technical challenge, Allen said, "the offshore environment is much simpler, the plates are thinner and more uniform than continental plates and we can see through them to get a better sense of what is going on beneath."Since 2012, the team has made 24 two-week ocean voyages to place and retrieve the seabed seismometers, providing dozens of students -- undergraduates and graduate students from UC Berkeley, Columbia University, the universities of Oregon and Washington, and Imperial College in the UK -- an opportunity to participate in field research. The last of the seabed seismometers were pulled up this month and the data is being prepared for analysis.Based on the first three years of data, Allen and his team confirmed what geophysicists suspected. At the mid-ocean Juan de Fuca ridge about 500 kilometers (300 miles) offshore of Seattle -- the western edge of the Juan de Fuca plate -- the flow of the mantle below the plate is perpendicular to the ridge, presumably because the newly formed plate drags the underlying mantle eastward with it.As the plate moves away from the ridge, the mantle flow rotates slightly northward toward the trench. At its eastern margin, the plate and underlying mantle move in alignment, perpendicular to the subduction zone, as expected. 
Presumably, the subducted portion of the plate deep under the trench is pulling the massive plate downward at the same time that the emerging lava at the mid-ocean spreading ridge is elevating the plate and pushing it eastward. Allen and his colleagues found, however, that a part of the Juan de Fuca plate called the Gorda Plate, located off the northern California coast, is not coupled to the mantle, leaving the mantle beneath Gorda to move independently of the plate above. Instead, the Gorda mantle seems to be aligned with the mantle moving under the Pacific plate. "The Juan de Fuca plate is clearly influencing the flow of the mantle beneath it, but the Gorda Plate is apparently too small to affect the underlying mantle," he said. This change in mantle flow produces a break or discontinuity in the forces on the plate, possibly explaining segmentation along the subduction zone. "When you look at earthquakes in Cascadia, they sometimes break just along the southern segment, sometimes on the southern two-thirds, and sometimes along the entire length of the plate," Allen said. "The change in the mantle flow could be linked to that segmentation." The Cascadia Initiative is a community experiment designed by the research community, with all data immediately available to the public. NSF funded the project with money it received through the 2009 stimulus, the American Recovery and Reinvestment Act (ARRA). Eleven scientists, including Allen, from across the U.S. formed the Cascadia Initiative Expedition Team responsible for the offshore seismic deployment. Allen and Martin-Short's co-authors on the Nature Geoscience paper are Ian Bastow and Eoghan Totten of Imperial College and UC Berkeley geophysicist Mark Richards, a professor of earth and planetary science. Richards helped develop the geodynamic model of the interaction between the plate and the mantle that explains how the faster-moving Pacific Plate could override the influence that the Gorda Plate has on the mantle below.
Earthquakes
2015
November 2, 2015
https://www.sciencedaily.com/releases/2015/11/151102083225.htm
Earthquakes recorded through fossils
The Cascadia subduction zone (CSZ) has captured major attention from paleoseismologists due to evidence from several large (magnitude 8-9) earthquakes preserved in coastal salt marshes. Stratigraphic records are proving to be useful for learning about the CSZ's past, and microfossils may provide more answers about large ancient earthquakes. They may also allow modelers to learn more about potential major hazards related to earthquakes in the area, which would contribute to public preparedness for such events.
Over the past three decades, researchers have found stratigraphic evidence of subsidence occurring during earthquakes beneath the salt marshes of Humboldt Bay, California, USA, at the southern end of the CSZ. J. Scott Padgett of the University of Rhode Island uses analysis of fossil foraminifera to estimate this subsidence at Arcata Bay, just north of Humboldt Bay. He will report on his research on 2 November at the Geological Society of America's Annual Meeting in Baltimore, Maryland, USA.Padgett notes, "Previous investigations were able to provide estimates of subsidence with large errors, which are only so helpful to the modelers." More recently, researchers started using an improved analysis on the microfossil data in Oregon, and were able to generate subsidence estimates with smaller errors. Their refined results enabled modelers to produce earthquake models that are more consistent with observed subsidence measurements seen in today's instrumented earthquakes.Similar work is being done at Jacoby Creek, a small coastal drainage that flows into northern Arcata Bay. There, researchers have found three sharp contacts between salt marsh peat and intertidal mud dating back over the past 2,000 years. Radiocarbon ages of plant macrofossils at the top of the buried peats are 195, 1280, and 1710 years old. These new ages provide tighter constraints on the timing of past earthquakes and subsidence at the southern end of the CSZ.Padgett says there are several lines of evidence that support their results and interpretation at Jacoby Creek. "These include the sharp mud-over-peat contacts that are laterally continuous over 5 kilometers, changes in fossil foraminifera assemblages across the buried peat contacts, long-lasting submergence also derived from fossil foraminifera records, and radiocarbon ages of plant macrofossils taken from buried peat deposits that are consistent with other southern Cascadia earthquake chronologies derived from buried peat and tsunami deposits."
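The radiocarbon ages quoted above rest on the exponential decay of carbon-14. A minimal sketch of the decay law only; real radiocarbon dating also applies calibration curves and lab corrections, which are omitted here:

```python
import math

HALF_LIFE_C14 = 5730.0  # years (Cambridge half-life)

def c14_fraction(age_years: float) -> float:
    """Fraction of the original carbon-14 remaining after a given age."""
    return 0.5 ** (age_years / HALF_LIFE_C14)

def c14_age(fraction: float) -> float:
    """Invert the decay law to recover an age from a measured fraction."""
    return -HALF_LIFE_C14 * math.log(fraction) / math.log(2.0)

# The youngest buried-peat age reported above, ~195 years, corresponds
# to only a few percent of C-14 decay:
print(f"{c14_fraction(195.0):.3f}")  # 0.977
```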
Earthquakes
2,015
October 30, 2015
https://www.sciencedaily.com/releases/2015/10/151030111111.htm
Technique for analyzing bedrock could help builders, planners identify safe building zones
Research by a UCLA geologist and colleagues could give builders and urban planners more detailed information about how susceptible areas are to landslides and earthquakes.
The study, by Seulgi Moon, a UCLA assistant professor of geology, and colleagues at MIT and the University of Wyoming, is published Oct. 30. Their research focused on bedrock, just beneath the soil and roots and the Earth's surface. Bedrock is the layer at the bottom of what geologists refer to as the "critical zone" because its cracks and fractures provide pathways for air and water, which break down rock and form the soil that is an essential ingredient for all living organisms. The chemical and physical breakdown of rocks in the bedrock layer -- which scientists refer to as "weathering" -- can influence how Earth's landscapes evolve over time, and the chemical reactions help regulate Earth's climate by consuming carbon dioxide. But until now, scientists have been unable to accurately predict how wide or deep the weathered part of the bedrock extends, or how extensive the weathering is in any given location. Moon and her colleagues devised a mathematical model that estimates the amount of stress the bedrock is under -- from the weight of rocks in the layers above and from the forces of tectonic plates below -- which will enable them and other scientists to predict where fractures may occur. The study is the first to use real data from geophysical imaging of bedrock at depth to demonstrate that the shape of the landscape, or topography, can influence the fracturing of the bedrock. Moon conducted the research from 2013 through earlier this year, when she was an MIT postdoctoral scholar working with Taylor Perron, an MIT associate professor. They found that if a landscape is undergoing only a small amount of compression from the movement of tectonic plates, fractured zones in the bedrock mimic the topography. 
On the other hand, if a region is undergoing a high degree of compression from tectonic plates, the bottom of fractured zones will essentially be the inverse of the surface topography. To test the model, the group collaborated with University of Wyoming researchers who specialize in measuring seismic waves in bedrock as well as electrical resistivity and borehole imaging, which can detect the amount of fracturing present within the bedrock. The team analyzed seismic surveys of sites with different amounts of tectonic compression in Colorado, South Carolina and Maryland, and they found that the measured shapes of fractured zones of bedrock in all three sites matched the profiles predicted by their model. The research was funded in part by the U.S. Army Research Office.
Earthquakes
2,015
October 29, 2015
https://www.sciencedaily.com/releases/2015/10/151029190848.htm
Researchers advance understanding of mountain watersheds
University of Wyoming geoscientists have discovered that the underground water-holding capacity of mountain watersheds may be controlled by stresses in Earth's crust. The results, which may have important ramifications for understanding streamflow and aquifer systems in upland watersheds, appear Oct. 30.
The scientists conducted geophysical surveys to estimate the volume of open pore space in the subsurface at three sites around the country. Computer models of the state of stress at those sites showed remarkable agreement with the geophysical images. The surprising implication, says Steve Holbrook, a UW professor in the Department of Geology and Geophysics, is that scientists may be able to predict the distribution of pore space in the subsurface of mountain watersheds by looking at the state of stress in Earth's crust. That state of stress controls where subsurface fractures are opening up -- which, in turn, creates the space for water to reside in the subsurface, he says. "I think this paper is important because it proposes a new theoretical framework for understanding the large-scale porosity structure of watersheds, especially in areas with crystalline bedrock (such as granite or gneiss)," Holbrook says. "This has important implications for understanding runoff in streams, aquifer recharge and the long-term evolution of landscapes." James St. Clair, a UW doctoral student, is lead author of the paper, titled "Geophysical Imaging Reveals Topographic Stress Control of Bedrock Weathering." Holbrook; Cliff Riebe, a UW associate professor of geology and geophysics; and Brad Carr, a research scientist in geology and geophysics, are co-authors of the paper. Researchers from MIT, UCLA, the University of Hawaii, Johns Hopkins University, Duke University and the Colorado School of Mines also contributed. Weathered bedrock and soil together make up the life-sustaining layer at Earth's surface commonly referred to as the "critical zone." Two of the three study sites were part of the national Critical Zone Observatory (CZO) network -- Gordon Gulch in Boulder Creek, Colo., and Calhoun Experimental Forest, S.C. 
The third study site was Pond Branch, Md., near Baltimore. "The paper provides a new framework for understanding the distribution of permeable fractures in the critical zone (CZ). This is important because it provides a means for predicting where in the subsurface there are likely to be fractures capable of storing water and/or supporting groundwater flow," St. Clair says. "Since we cannot see into the subsurface without drilling holes or performing geophysical surveys, our results provide the means for making first order predictions about CZ structure as a function of the local topography and knowledge (or an estimate) of the regional tectonic stress conditions." The research included a combination of geophysical imaging of the subsurface -- conducted by UW's Wyoming Center for Environmental Hydrology and Geophysics (WyCEHG) -- and numerical models of the stress distribution in the subsurface, work that was done at MIT and the University of Hawaii, Holbrook says. The team performed seismic refraction and electrical resistivity surveys to determine the depth of bedrock at the three sites, which were chosen due to varying topography and ambient tectonic stress. At the two East Coast sites, the bedrock showed a surprising mirror-image relationship to topography; at the Rocky Mountain site, the bedrock was parallel to topography. In each case, the stress models successfully predicted the bedrock pattern. "We found a remarkable agreement between the predictions of those stress models and the images of the porosity in the subsurface with geophysics at a large scale, at the landscape scale," Holbrook says. "It's the first time anyone's really looked at this at the landscape scale." St. Clair says he was fortunate to work with a talented group of scientists with an extensive amount of research experience. 
He adds the experience improved his ability to work with a group of people with diverse backgrounds and improved his writing. "Our results may be important to hydrologists, geomorphologists and geophysicists," St. Clair says. "Hydrologists, because it provides a means for identifying where water may be stored or where the flow rates are likely to be high; geomorphologists, because our results predict where chemical weathering rates are likely to be accelerated due to increased fluid flow along permeable fractures; and geophysicists, because it points out the potential influence of shallow stress fields on the seismic response of the CZ." Despite the discovery, Holbrook says there is still much work to be done to test this model in different environments. "But, now we have a theoretical framework to guide that work, as well as unique geophysical data to suggest that the hypothesis has merit," he says.
Earthquakes
2,015
October 29, 2015
https://www.sciencedaily.com/releases/2015/10/151029190842.htm
Babe Ruth and earthquake hazard maps
Northwestern University researchers have turned to an unusual source -- Major League Baseball -- to help learn why maps used to predict shaking in future earthquakes often do poorly.
Earthquake hazard maps use assumptions about where, when, and how big future earthquakes will be to predict the level of shaking. The results are used in designing earthquake-resistant buildings. However, as the study's lead author, earth science and statistics graduate student Edward Brooks, explains, "Sometimes the maps do well, and sometimes they do poorly. In particular, the shaking and thus damage in some recent large earthquakes was much larger than expected." Part of the problem is that seismologists have not developed ways to describe how well these maps perform. As Seth Stein, William Deering Professor of Geological Sciences, explains, "We need the kind of information the weather service has, where they can tell you how much confidence to have in their forecasts." The question is how to measure performance. Bruce Spencer, professor of statistics, explains that "it's like asking how good a baseball player Babe Ruth was. The answer depends on how one measures performance. In many seasons Ruth led the league in both home runs and in the number of times he struck out. By one measure he did very well, and by another, very poorly. In the same way, we are using several measures to describe how hazard maps perform." Another problem is that the hazard maps try to forecast shaking over hundreds of years, because buildings have long lifetimes. As a result, it takes a long time to tell how well a map is working. To get around this, the team looked backwards in time, using records of earthquake shaking in Japan that go back 500 years. They compared the shaking to the forecasts of the published hazard maps. They also compared the shaking to maps in which the expected shaking was the same everywhere in Japan, and maps in which the expected shaking at places was assigned at random from the published maps. The results were surprising. In Brooks' words, "it turns out that by the most commonly used measure the uniform and randomized maps work better than the published maps. 
By another measure, the published maps work better." The message, in Stein's view, is that seismologists need to know a lot more about how these maps work. "Some of the problem is likely to be that how earthquakes occur in space and time is more complicated than the maps assume. Until we get a better handle on this, people using earthquake hazard maps should recognize that they have large uncertainties. Brightly colored maps look good, but the earth doesn't have to obey them and sometimes won't." This research will be presented at the 2015 Annual Meeting of the Geological Society of America in Baltimore, MD, as part of the Bridging Two Continents joint "meeting-within-a meeting" with the Geological Society of China.
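The map-versus-history comparison described above can be sketched as a simple skill metric: count how often observed shaking exceeded the map's predicted level and compare that to the exceedance fraction the map promised. Everything below is illustrative -- the function name, the toy shaking values, and the 10% target are my assumptions, not data or methods from the Northwestern study.

```python
# Sketch of a "fractional exceedance" style comparison between a hazard map
# and an observed shaking record. All numbers are invented for illustration.

def fractional_exceedance_skill(observed, predicted, target_fraction):
    """Return |actual exceedance fraction - target fraction|.

    observed, predicted: per-site shaking values (same units);
    target_fraction: the fraction of sites the map expects to be exceeded
    over the observation window (e.g. 0.10 for a 10%-in-window map).
    Lower scores mean the map performed closer to its stated goal.
    """
    exceeded = sum(1 for o, p in zip(observed, predicted) if o > p)
    actual_fraction = exceeded / len(observed)
    return abs(actual_fraction - target_fraction)

# Toy example: 10 sites, map designed for 10% exceedance.
obs  = [0.12, 0.30, 0.05, 0.22, 0.08, 0.40, 0.15, 0.09, 0.11, 0.07]
pred = [0.20, 0.25, 0.10, 0.30, 0.10, 0.35, 0.20, 0.15, 0.10, 0.10]
score = fractional_exceedance_skill(obs, pred, 0.10)  # 3 of 10 sites exceeded
```

A uniform or randomized map can be scored the same way, which is the kind of head-to-head test the article describes.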
Earthquakes
2,015
October 27, 2015
https://www.sciencedaily.com/releases/2015/10/151027123133.htm
Physics of booming and burping sand dunes revealed
Avalanching sand from dune faces in Death Valley National Park and the Mojave Desert can trigger loud, rumbling "booming" or short bursts of "burping" sounds -- behaving as a perfectly tuned musical instrument.
This sound is persistent and the dunes "sing" in frequencies ranging from 70 to 105 Hertz, with higher harmonics. Prior to the onset of a nearly monotone booming, burps of sound of smaller amplitude occur over a significantly broader span of frequencies, as a group of researchers from California Institute of Technology and the University of Cambridge report in AIP's journal. Intrigued by these odd sounds emanating from the dunes, Nathalie Vriend researched this phenomenon as a Ph.D. student at Caltech with Melany Hunt, a professor of mechanical engineering. They collaborated with Rob Clayton, a professor of geophysics, and borrowed a variety of geophysical scientific instruments to go out and "probe" the dunes' acoustical mystery. Vriend has since moved to the University of Cambridge. "During approximately 25 individual summer field days, on very hot and sandy dunes in California, we probed booming dunes," Vriend said, "and they slowly revealed their underlying physics to us." The group focused on discovering how, specifically, the booming and burping sounds travel through sand. "We measured the wave propagation characteristics, which include the motion of grains and frequency and energy of the emitted sound. This, in turn, revealed that booming and burping are two different, but related, phenomena," she said. To do this, they used geophones to measure seismic vibrations within the ground, which are similar to microphones that pick up acoustical vibrations -- sound pressure -- in the air. "The waves travelling through the dune move individual grains of sand, which exert a force on the geophone that we use for measurements," Vriend added. It turns out that "burping sounds correspond to a surface Rayleigh wave, travelling radially along the surface of the dune in a nonlinear manner," noted Vriend. 
"This means that relations between these properties are complicated because of the influence of individual grains." The loud booming sounds, she pointed out, originate from "linear P-waves that travel volumetrically and are reflected from internal layers inside the actual dune." The group was somewhat surprised to learn that for both booming and burping, the surface and volumetric signals are present with their own characteristic features and properties -- but the dominant signals are different. Another revelation was being able to excite the natural dune resonance on one occasion by simply providing an "impulse" on the dune surface. "A blow of a hammer on a plate triggered a natural resonance -- around the booming frequency -- inside the dune, which is something we've never seen described in literature," Vriend said. Since the group's study revealed that burping and booming emissions are different acoustic phenomena, governed by different physical principles, it may also help explain some differences in measurements and interpretations regarding singing sand dunes made during the past decade. "More broadly, seismic surveys for oilfield exploration or earthquake investigations tend to rely on length scales that are usually much larger than those used by our study," she added. "Even if the study is done on a sandy substrate, the 'effective medium' response is recorded and individual grain interactions aren't usually relevant. Our work illustrates the dual behavior of wave propagation when scales are reduced to a length where small- and larger-scale wave propagation converge." Vriend is now a Royal Society Research Fellow within the Department of Applied Mathematics and Theoretical Physics at the University of Cambridge. 
Her research group is working on a variety of projects to probe and solve other mysteries of granular dynamics. One of these projects involves exploring "the granular dynamics during avalanching and its influence on the origin of structure in sand dunes in greater detail," she said. "Our recent work involves using field and laboratory techniques to probe natural avalanching and sorting on large desert dunes in Qatar."
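The reported 70-105 Hz booming band, together with the idea of waves reflecting off internal layers, invites a rough sanity check: if the dry surface layer of a dune behaves like a resonator closed at one end, a quarter-wavelength estimate gives its fundamental frequency. The wave speed and layer depth below are assumed illustrative values, not measurements from the study.

```python
# Back-of-envelope check on the 70-105 Hz booming band using a
# quarter-wavelength resonator approximation, f = c / (4 * d).
# c (wave speed in dry sand) and d (surface layer depth) are guesses.

def quarter_wave_frequency(wave_speed_m_s, layer_depth_m):
    """Fundamental resonance (Hz) of a layer closed at one end."""
    return wave_speed_m_s / (4.0 * layer_depth_m)

f = quarter_wave_frequency(220.0, 0.7)  # ~79 Hz, inside the reported band
```

The point is only that plausible sand wave speeds and layer depths land naturally in the tens-of-hertz range the dunes actually produce.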
Earthquakes
2,015
October 23, 2015
https://www.sciencedaily.com/releases/2015/10/151023084140.htm
Japanese sea defense guidelines could assist other tsunami-prone nations, study suggests
Japan's lead in implementing sea defence improvements to guard against future disasters is an important reference point for other tsunami-prone nations, a study led by Plymouth University has suggested.
Before 2011, Japan was considered to be the best prepared nation on earth to withstand a large tsunami on its coasts, with structures specifically designed to afford sufficient protection to coastal settlements and critical infrastructure. However, the size of the waves generated by the Great East Japan Earthquake in March that year led to sea defences and other coastal structures being overwhelmed, and in many cases completely or partially destroyed. These included the brand new tsunami defence breakwater at Kamaishi, the sea dikes defending the international airport at Sendai, the 10m seawall in Tarō and, most critically, the seawall that protected the Fukushima Daiichi nuclear power station. Since then, new engineering guidelines have been drawn up transforming Japan's coastal defences, and devising new ways to keep its coastlines safe in the future. The study, led by Dr Alison Raby, Associate Professor in the School of Marine Science and Engineering at Plymouth University, includes a full analysis of Japan's history of coastal defence design and measures taken since 2011. Dr Raby says: "After the 2004 Boxing Day tsunami in the Indian Ocean, much of the world's efforts concentrated on tsunami early warning and evacuation. Such non-structural measures already in place in Japan were quite effective and meant that tsunami casualty figures -- although exceeding 18,000 -- were relatively low in comparison to the levels of devastation caused and the population living in the inundated areas. What the 2011 event did result in was Japan rethinking and revising its design codes for sea defence structures in an effort to limit inundation extent and devastation from future events. 
It is essential that Japan's new sea defence plans are disseminated as widely as possible, both to inform industrialised nations and those that rely on international codes." The research, funded by the Engineering and Physical Sciences Research Council, enabled Dr Raby and other UK scientists and engineers to join an international team of experts on field trips to Japan. These two trips were conducted by the Earthquake Engineering Field Investigation Team and are part of a wider international effort to reduce the impacts of earthquakes globally. During the initial visit in mid-2011, they were able to observe the levels of destruction caused, while a follow-up in 2013 enabled them to see the recovery, newly-completed sea defences and the design guidelines being implemented to mitigate future catastrophes. Their analysis involved translating the disaster scenario manual prepared by Japan's National Institute for Land and Infrastructure Management, which features comprehensive material enabling designers to appreciate possible failure mechanisms. They also compared it with its European and US equivalents, and highlighted potential deficiencies. "This is understandable in some regions where less developed countries face competing pressures for limited financial resources, but it is notable that this threat is not addressed in design codes for at-risk European countries," the final paper says. "There needs to be more joined-up thinking between those who understand the tsunami sources and the implications for populations and infrastructure."
Earthquakes
2,015
October 20, 2015
https://www.sciencedaily.com/releases/2015/10/151020140940.htm
Triggered earthquakes give insight into changes below Earth's surface
It is well known that an earthquake in one part of the world can trigger others thousands of kilometers away.
But a newly published paper shows that the influence extends beyond triggering alone. Earthquakes can fundamentally change the elastic properties of Earth's crust in regions up to 6,000 kilometers away, altering its ability to withstand stresses for a period of up to a few weeks, according to Kevin Chao, a postdoc in MIT's Department of Earth, Atmospheric and Planetary Sciences and a member of a research team led by Andrew Delorey at Los Alamos National Laboratory. The research demonstrates that Earth is a dynamic and interconnected system, where one large earthquake can create a cascading sequence of events thousands of kilometers away, Chao says. Earthquakes occur when stress builds up along a tectonic fault. This stress causes the two surfaces of the fault, which had previously been stuck together due to friction, to suddenly move, or slide, releasing energy in the form of seismic waves. These waves take the form of both body waves, which cause the shaking movement that does so much damage during a quake, and surface waves. Surface waves can travel thousands of kilometers beneath the ground. When a surface wave from an earthquake some way off passes through another fault region, it changes the balance between the frictional properties that keep the surfaces locked together, the elasticity that allows the crust to withstand strain, and the stress state that can cause it to fail, Chao says. "When surface waves pass through, all of these properties rearrange and change," he says. "If a fault with high stress is ready to fail, it will accumulate more stresses in the fault, meaning an earthquake could occur at any time." To demonstrate these changes, the researchers studied the 2012 earthquake off the coast of North Sumatra in the Indian Ocean. 
The earthquake, which had a magnitude of 8.6, is known to have been followed by two earthquakes in Japan with a magnitude greater than 5.5. When the researchers studied data from strain meter readings, GPS equipment, and information on seismicity -- or the number of small-magnitude earthquakes -- in the region, as well as the migration of the earthquakes, they found that the two triggered quakes with a magnitude of greater than 5.5 were part of a cluster of activity in the area in the days after the Indian Ocean event. "When the Indian Ocean earthquake occurred, the surface wave passed through the northeast of Japan, and the seismicity in the region was suddenly triggered," Chao says. "During that time of increased seismicity, there were three triggered earthquakes in the region with a magnitude of greater than 5.5," he says. This region of Earth's crust was already critically stressed following the major Japanese earthquake of 2011, so the additional stress, albeit temporary, caused by the surface wave passing through, was enough to trigger another cluster of quakes. When a fault fails and an earthquake occurs, it also pushes into the neighboring region, reducing the available space and compressing the crust in this area. So the researchers also looked for signs of compressive stress in this region of Japan following the Indian Ocean earthquake. 
They found signs that cracks in the rock under the Japanese mainland were closing as a result of compressive stress, increasing the shear strength of the crust. While the research will not in itself allow us to predict earthquakes, it does help to increase our understanding of how they are triggered, as well as how Earth's crust behaves, Chao says. "We still cannot say that there will definitely be another earthquake after the first one has struck, because although we know there will be changes, we do not know the existing stress conditions in every region, so we cannot predict anything with certainty," Chao says. "But one important thing we can say is that we know earthquakes do interact with each other, because surface waves can travel thousands of kilometers, and change the elasticity in another region," he adds.
Earthquakes
2,015
October 7, 2015
https://www.sciencedaily.com/releases/2015/10/151007185217.htm
New emergency alert technology could fine-tune warnings for smartphones
Smartphones -- owned by a record 64 percent of American adults -- are increasingly used as effective means to deliver emergency alerts issued by state and federal agencies. In support of the U.S. Department of Homeland Security (DHS) Science and Technology Directorate (S&T), researchers at the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland, have developed a new concept called Arbitrary-Size Location-Aware Targeting (ASLAT) -- a more accurate method of delivering certain types of messages that could even warn users to avoid particular nearby locations. The findings are detailed in a report to DHS S&T, published in June.
"Currently, under the Wireless Emergency Alerts (WEA) infrastructure, messages often go out very broadly, generating a large number of false alarms, while other people who are in the warning area do not receive those warnings due to poor cell coverage or a few other factors," said Richard "D.J." Waddell of APL's Asymmetric Operations Sector, the ASLAT program manager. "ASLAT dramatically reduces both false negatives and false positives." For location-specific emergencies, such as large building fires, natural gas leaks, or small-scale natural disasters such as flash floods and tornadoes, ASLAT could allow more accurate delivery of the warning to the correct populations. For events requiring very rapid notification such as an earthquake, the ASLAT algorithm would skip any steps that cause even a very minor delay. "ASLAT uses the location awareness of wireless devices -- their internal knowledge of where they are on Earth -- to eliminate false negatives and positives when sending an emergency alert across multiple cellular network sites," said Emre Gunduzhan of APL, technical lead for ASLAT. "Another interesting feature of ASLAT is that it can warn not only people in the immediate vicinity of a hazard, but also people who may have selected that hazard as their destination." Many geolocation technologies were studied for suitability with ASLAT, Gunduzhan explained. "The team looked at Global Positioning System (GPS), mobile-device-based Time Of Arrivals (TOA) and Time Difference Of Arrivals (TDOA) techniques, as well as proximity to Wi-Fi," he said. 
"These were all suitable since they don't introduce new loads onto the cellular system -- important during an emergency -- and they maintain the privacy of the user." While these existing technologies can work effectively, some changes in WEA standards and implementations would be required to maximize the effectiveness of ASLAT. "DHS S&T is looking for methods that can improve how government agencies warn Americans about danger and threats," Waddell said, "and we brought together APL technical experts to examine the systems in use and formulate some very promising solutions. The ASLAT team at APL is proud to have delivered this report on behalf of our sponsor, and we think the technologies recommended could have benefits to other communications challenges facing the Lab's sponsors."
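The core idea the article describes -- a device that knows its own position deciding locally whether an alert applies to it, including when its chosen destination is inside the hazard zone -- can be illustrated with a simple circular geofence check. This is a hypothetical sketch: ASLAT's actual algorithm is specified in the APL report, and the function names, the haversine-distance geofence, and the coordinates below are my assumptions for illustration only.

```python
# Hypothetical sketch of location-aware alert filtering: the device compares
# its position (and optional destination) against a circular hazard zone.
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in km."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def should_display_alert(device_pos, hazard_pos, radius_km, destination=None):
    """Show the alert if the device, or its planned destination, lies
    inside the hazard circle -- mirroring the idea of warning people
    headed toward a hazard, not just those already near it."""
    if haversine_km(*device_pos, *hazard_pos) <= radius_km:
        return True
    if destination is not None and haversine_km(*destination, *hazard_pos) <= radius_km:
        return True
    return False

hazard = (39.29, -76.61)  # example coordinates (downtown Baltimore area)
inside = should_display_alert((39.30, -76.60), hazard, 5.0)
outside = should_display_alert((40.00, -75.00), hazard, 5.0)
headed_there = should_display_alert((40.00, -75.00), hazard, 5.0,
                                    destination=(39.29, -76.61))
```

Because the filtering runs on the device, no position data leaves the phone, which matches the privacy point quoted above.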
Earthquakes
2,015
October 2, 2015
https://www.sciencedaily.com/releases/2015/10/151002144903.htm
Signs of ancient mega-tsunami could portend modern hazard
Scientists working off west Africa in the Cape Verde Islands have found evidence that the sudden collapse of a volcano there tens of thousands of years ago generated an ocean tsunami that dwarfed anything ever seen by humans. The researchers say an 800-foot wave engulfed an island more than 30 miles away. The study could revive a simmering controversy over whether sudden giant collapses present a realistic hazard today around volcanic islands, or even along more distant continental coasts. The study appears today.
"Our point is that flank collapses can happen extremely fast and catastrophically, and therefore are capable of triggering giant tsunamis," said lead author Ricardo Ramalho, who did the research as a postdoctoral associate at Columbia University's Lamont-Doherty Earth Observatory, where he is now an adjunct scientist. "They probably don't happen very often. But we need to take this into account when we think about the hazard potential of these kinds of volcanic features." The apparent collapse occurred some 73,000 years ago at the Fogo volcano, one of the world's largest and most active island volcanoes. Nowadays, it towers 2,829 meters (9,300 feet) above sea level, and erupts about every 20 years, most recently last fall. Santiago Island, where the wave apparently hit, is now home to some 250,000 people. There is no dispute that volcanic flanks present a hazard; at least eight smaller collapses have occurred in Alaska, Japan and elsewhere in the last several hundred years, and some have generated deadly tsunamis. But many scientists doubt whether big volcanoes can collapse with the suddenness that the new study suggests. Rather, they envision landslides coming in gradual stages, generating multiple, smaller tsunamis. A 2011 French study also looked at the Fogo collapse, suggesting that it took place somewhere between 124,000-65,000 years ago; but that study says it involved more than one landslide. The French researchers estimate that the resulting multiple waves would have reached only 45 feet -- even at that, enough to do plenty of harm today. A handful of previous other studies have proposed much larger prehistoric collapses and resulting megatsunamis, in the Hawaiian islands, at Italy's Mt. Etna, and the Indian Ocean's Reunion Island. But critics have said these examples are too few and the evidence too thin. 
The new study adds a new possible example; it says the estimated 160 cubic kilometers (40 cubic miles) of rock that Fogo lost during the collapse was dropped all at once, resulting in the 800-foot wave. By comparison, the biggest known recent tsunamis, which devastated the Indian Ocean's coasts in 2004 and eastern Japan in 2011, reached only about 100 feet. (Like most other well documented tsunamis, these were generated by movements of undersea earthquake faults -- not volcanic collapses.) Santiago Island lies 55 kilometers (34 miles) from Fogo. Several years ago, Ramalho and colleagues were working on Santiago when they spotted unusual boulders lying as far as 2,000 feet inland and nearly 650 feet above sea level. Some are as big as delivery vans, and they are utterly unlike the young volcanic terrain on which they lie. Rather, they match marine-type rocks that ring the island's shoreline: limestones, conglomerates and submarine basalts. Some weigh up to 770 tons. The only realistic explanation the scientists could come up with: A gigantic wave must have ripped them from the shoreline and lofted them up. They derived the size of the wave by calculating the energy it would have taken to accomplish this feat. To date the event, in the lab Ramalho and Lamont-Doherty geochemist Gisela Winckler measured isotopes of the element helium embedded near the boulders' surfaces. Such isotopes change depending on how long a rock has been lying in the open, exposed to cosmic rays. The analyses centered around 73,000 years -- well within the earlier French estimate of a smaller event. The analysis "provides the link between the collapse and impact, which you can make only if you have both dates," said Winckler. Tsunami expert Bill McGuire, a professor emeritus at University College London who was not involved in the research, said the study "provides robust evidence of megatsunami formation [and] confirms that when volcanoes collapse, they can do so extremely rapidly." 
Based on his own work, McGuire says that such megatsunamis probably come only once every 10,000 years. "Nonetheless," he said, "the scale of such events, as the Fogo study testifies, and their potentially devastating impact, makes them a clear and serious hazard in ocean basins that host active volcanoes." Ramalho cautions that the study should not be taken as a red flag that another big collapse is imminent here or elsewhere. "It doesn't mean every collapse happens catastrophically," he said. "But it's maybe not as rare as we thought." In the early 2000s, other researchers started publishing evidence that the Cape Verdes could generate large tsunamis. Others have argued that Spain's Canary Islands have already done so. Simon Day, a senior researcher at University College London, has sparked repeated controversy by warning that any future eruption of the Canary Islands' active Cumbre Vieja volcano could set off a flank collapse that might form an initial wave 3,000 feet high. This, he says, could erase more than nearby islands. Such a wave might still be 300 feet high when it reached west Africa an hour or so later, he says, and would still be 150 feet high along the coasts of North and South America. So far, such studies have raised mainly tsunamis of publicity, and vigorous objections from other scientists that such events are improbable. A 2013 study of deep-sea sediments by the United Kingdom's National Oceanography Centre suggests that the Canaries have probably mostly seen gradual collapses. Part of the controversy hangs not only on the physics of the collapses themselves, but on how efficiently resulting waves could travel. In 1792, part of Japan's Mount Unzen collapsed, hitting a series of nearby bays with waves as high as 300 feet, and killing some 15,000 people. On July 9, 1958, an earthquake shook 90 million tons of rock into Alaska's isolated Lituya Bay; this created an astounding 1,724-foot-high wave, the largest ever recorded. 
Two fishermen who happened to be in their boat that day were carried clear over a nearby forest; miraculously, they survived. These events, however, occurred in confined spaces. In the open ocean, waves created by landslides are generally thought to lose energy quickly, and thus to pose mainly a regional hazard. However, this is based largely on modeling, not real-world experience, so no one really knows how fast a killer wave might decay into a harmless ripple. In any case, most scientists are more concerned with tsunamis generated by undersea earthquakes, which are more common. When seabed faults slip, as they did in 2004 and 2011, they shove massive amounts of water upward. In deep water, this shows up as a mere swell at the surface; but when the swell reaches shallower coastal areas, its energy concentrates into a smaller volume of water, and it rears up dramatically. The 2004 Indian Ocean earthquake and tsunami killed 230,000 people in 14 countries; the 2011 Tohoku event killed nearly 20,000 in Japan, and has caused a long-term nuclear disaster. James Hunt, a tsunami expert at the United Kingdom's National Oceanography Centre who was not involved in the study, said the research makes it clear that "even modest landslides could produce high-amplitude anomalous tsunami waves on opposing island coastlines." The question, he said, "is whether these translate into hazardous events in the far field, which is debatable." When Fogo erupted last year, Ramalho and other geologists rushed in to observe. Lava flows (since calmed down) displaced some 1,200 people, and destroyed buildings including a new volcano visitors' center. "Right now, people in Cape Verde have a lot more to worry about, like rebuilding their livelihoods after the last eruption," said Ramalho. "But Fogo may collapse again one day, so we need to be vigilant."
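The shoaling effect described above -- a deep-water swell's energy concentrating into a shallower water column so the wave rears up -- is often approximated by Green's law, under which wave height grows as water depth to the -1/4 power. A minimal sketch of the idealized linear theory, which ignores breaking, reflection and bottom friction:

```python
def shoaled_height(h0: float, d0: float, d1: float) -> float:
    """Green's law for linear shallow-water waves: wave height scales as
    depth**(-1/4), so H1 = H0 * (d0/d1)**0.25 as depth shrinks from d0 to d1.
    Ignores wave breaking, reflection and bottom friction."""
    return h0 * (d0 / d1) ** 0.25

# A 1 m open-ocean swell in 4,000 m of water steepens to roughly 4.5 m
# by the time it reaches 10 m of water:
h_coast = shoaled_height(1.0, 4000.0, 10.0)
```

Real tsunami run-up depends strongly on local bathymetry and coastline shape, which is why operational forecasts use full numerical models rather than this scaling alone.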
Earthquakes
2015
October 1, 2015
https://www.sciencedaily.com/releases/2015/10/151001153038.htm
Asteroid impact, volcanism were one-two punch for dinosaurs
Berkeley geologists have uncovered compelling evidence that an asteroid impact on Earth 66 million years ago accelerated the eruptions of volcanoes in India for hundreds of thousands of years, and that together these planet-wide catastrophes caused the extinction of many land and marine animals, including the dinosaurs.
For 35 years, paleontologists and geologists have debated the role these two global events played in the last mass extinction, with one side claiming the eruptions were irrelevant, and the other side claiming the impact was a blip in a long-term die-off. The new evidence includes the most accurate dates yet for the volcanic eruptions before and after the impact. The new dates show that the Deccan Traps lava flows, which at the time were erupting at a slower pace, doubled in output within 50,000 years of the asteroid or comet impact that is thought to have initiated the last mass extinction on Earth. Both the impact and the volcanism would have blanketed the planet with dust and noxious fumes, drastically changing the climate and sending many species to an early grave. "Based on our dating of the lavas, we can be pretty certain that the volcanism and the impact occurred within 50,000 years of the extinction, so it becomes somewhat artificial to distinguish between them as killing mechanisms: both phenomena were clearly at work at the same time," said lead researcher Paul Renne, a UC Berkeley professor-in-residence of earth and planetary science and director of the Berkeley Geochronology Center. "It is going to be basically impossible to ascribe actual atmospheric effects to one or the other. They both happened at the same time." The geologists argue that the impact abruptly changed the volcanoes' plumbing system, which produced major changes in the chemistry and frequency of the eruptions. After this change, long-term volcanic eruptions likely delayed recovery of life for 500,000 years after the KT boundary, the term for the end of the Cretaceous and the beginning of the Tertiary period, when large land animals and many small sea creatures disappeared from the fossil record. "The biodiversity and chemical signature of the ocean took about half a million years to really recover after the KT boundary, which is about how long the accelerated volcanism lasted," Renne said.
"We are proposing that the volcanism unleashed and accelerated right at the KT boundary suppressed the recovery until the volcanoes waned." Co-author Mark Richards, a UC Berkeley professor of earth and planetary science and the one who originally proposed that the comet or asteroid impact reignited the Deccan Traps lava flows, is agnostic about which event was the real death knell for much of life on Earth. But the link between the impact and the flood basalts is becoming harder to deny. "If our high-precision dates continue to pin these three events -- the impact, the extinction and the major pulse of volcanism -- closer and closer together, people are going to have to accept the likelihood of a connection among them. The scenario we are suggesting -- that the impact triggered the volcanism -- does in fact reconcile what had previously appeared to be an unimaginable coincidence," he said. Renne, Richards and their colleagues will publish the new dates for the Deccan Traps eruptions on Oct. 2. Since 1980, when UC Berkeley geologist Walter Alvarez and his father, the late UC Berkeley physicist Luis Alvarez, discovered evidence of a comet or asteroid impact on Earth 66 million years ago, scientists have argued about whether the impact was the cause of the mass extinction that occurred at the same time, at the end of the Cretaceous period, or the KT boundary. Some argued that the huge volcanic eruptions in India known as the Deccan Traps, which occurred around the same time, were the main culprit in the extinctions.
Others insisted the death knell had been the impact, which left behind a large crater dubbed Chicxulub off Mexico's Yucatan peninsula, and viewed the Deccan Traps eruptions as a minor sideshow. Earlier this year, Richards, Renne and eight other geoscientists proposed a new scenario: that the impact ignited volcanoes around the globe, most catastrophically in India, and that the two events combined to cause the KT extinction. In attempts to test this hypothesis, the team last year collected lava samples from throughout the Deccan Traps east of Mumbai, sampling flows from near the beginning, several hundred thousand years before the extinction, and near the end, some half a million years after the extinction. High-precision argon-40/argon-39 isotope dating allowed them to establish the chronology of the flows and the rate of flow over time. "At the KT boundary, we see major changes in the volcanic system of the Deccan Traps, in terms of the rate at which eruptions were happening, the size of the eruptions, the volume of the eruptions and some aspects of the chemistry of the eruptions, which speaks to the actual processes by which the magmas were generated," Renne said. "All these things changed in a fundamental way, and increasingly it seems they happened right at the KT boundary. Our data don't conclusively prove that the impact caused these changes, but the connection looks increasingly clear." Richards said that a large nearby earthquake of a magnitude 8, 9 or 10 -- as large or larger than the quake that struck Japan in 2011 -- could also have reignited the Deccan Traps flows. In fact, large quakes may have rattled underground magma chambers and ignited eruptions throughout Earth's history.
But the simultaneous changes in the lava flows and the impact at the KT boundary seem more than mere coincidence. "These changes are consistent with an accelerated rate of magma production and eruption that you could get from a large earthquake such as would be created by the Chicxulub impact," he said. In 2013, Renne and his team at the Berkeley Geochronology Center and elsewhere also dated the KT boundary extinction and dust from the impact and found they occurred within less than 32,000 years of one another -- the blink of an eye in geologic terms, he said. Renne's team plans to obtain isotope dates for more basalt samples from the Deccan Traps to detail the history of the lava flows that cover much of western India, in order to better understand how they changed with time and correlate to the impact and extinctions. Meanwhile, Richards is working with volcano experts to understand how large ground shaking caused by earthquakes or asteroid impacts affects volcanic eruptions.
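The argon-40/argon-39 method behind these high-precision dates rests on a standard age equation, t = (1/λ) ln(1 + J·R), where λ is the total decay constant of potassium-40, J is a neutron-fluence parameter from a co-irradiated standard, and R is the sample's radiogenic 40Ar/39Ar ratio. A hedged sketch with illustrative parameter values (not the study's actual data):

```python
import math

LAMBDA_K40 = 5.543e-10  # total decay constant of 40K, 1/yr (commonly used value)

def ar_ar_age_yr(j: float, r: float) -> float:
    """Standard 40Ar/39Ar age equation: t = (1/lambda) * ln(1 + J*R),
    where J is the neutron-fluence parameter measured on a co-irradiated
    standard and R is the radiogenic 40Ar/39Ar ratio of the sample."""
    return math.log(1.0 + j * r) / LAMBDA_K40

# Illustrative J and R, chosen here only so the age comes out near the
# KT boundary (~66 million years):
age = ar_ar_age_yr(0.002, 18.63)
```

In practice the J parameter and the decay constant are themselves sources of uncertainty, which is why intercalibration against standards is central to "high-precision" Ar-Ar work.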
Earthquakes
2015
October 1, 2015
https://www.sciencedaily.com/releases/2015/10/151001142225.htm
Simulating path of 'magma mush' inside an active volcano
Months of warning signs from Mauna Loa, on Hawaii's Big Island, prompted the U.S. Geological Survey to recently start releasing weekly updates on activity at the world's largest active volcano.
For now, such warning signs can only rely on external clues, like earthquakes and venting gases. But a University of Washington simulation has managed to demonstrate what's happening deep inside the volcano. The study was published Sept. 7. "The thing about studying volcanoes is we can't really see inside of them to know what's going on," said co-author Jillian Schleicher, a UW doctoral student in Earth and space sciences. "Whenever there's unrest, like earthquakes, gas emissions or surface deformation, it's really difficult to know what processes are taking place inside the volcano." Each volcano has a unique personality. Volcanologists use the remains of past eruptions and previous warning signs to predict when it might blow. But those predictions are based on only a vague understanding of the system's inner workings. The idealized UW computer simulation could help volcanologists better understand how energy builds up inside a system like Mauna Loa, which is a focus of the UW group's research, to predict when it will erupt. "This tool is novel because it lets us explore the mechanics," said first author George Bergantz, a UW professor of Earth and space sciences. "It creates an interpretive framework for what controls the movement, and what might produce the signals we see on the outside." The team used a computer model originally developed by the U.S. Department of Energy to model fuel combustion. The UW group previously adapted the code to simulate volcanic eruptions and ash plumes; this paper is the first time it's been used to go down inside the volcano and examine the movement of each individual crystal. A volcano is filled with "magma mush," a slushy material that is part magma, or liquid rock, and part solid crystal. Previous studies approximated it as a thick fluid.
But capturing its true dual nature makes a difference, since the crystals interact in ways that matter for its motion. "If we see earthquakes deep inside Mauna Loa, that tells us that there's magma moving up through the volcano," Bergantz said. "But how can we better understand its progress up through that plumbing system?" The simulation shows the magma has three circulation states: slow, medium and fast. In the slow state, new magma just percolates up through the crystal pores. As the rate of injected magma increases, however, it creates a "mixing bowl" region where older crystals get mixed in with the new material. "In these crystal-rich mushes, we know that we have magma going in and sometimes it might punch through [the layer of crystals at the bottom]," Schleicher said. "But we don't know how the mixing is happening or the timescales involved." The current model is an idealized magma chamber, but with more computing power it could be expanded to reproduce a particular volcano's internal structure. Now that researchers can simulate what happens inside a magma chamber, Schleicher will look at rock samples from Mauna Loa and analyze the layers in the crystals. Crystals preserve chemical clues as they grow, similar to tree rings. Matching the model with the crystals' composition will help recreate the rock's history and track how magma has moved inside Mauna Loa. "Mauna Loa is a terrific place to study because it's very active, and the rocks contain a single type of crystal," Bergantz said. "What we learn at Mauna Loa will allow us to make headway on other places, like Mount St. Helens, that are intrinsically more difficult."
Earthquakes
2015
October 1, 2015
https://www.sciencedaily.com/releases/2015/10/151001130052.htm
New science redefines remote: Even pandas are global
This just in from the pandas nestled in a remote corner of China: Their influence spans the globe.
The results, published this week in an international journal, are a bit mind-boggling, with implications far beyond panda conservation. From economic impacts in cities like San Diego and London to crops in Memphis to global greenhouse gases, the framework of telecoupling (socioeconomic and environmental interactions over distances) lays out a dynamic, complex view of how issues of sustainability reach across the world -- and then impacts rush back. And effects spring up along the way. Understanding the minutiae is important to sustainability, says lead author Jianguo "Jack" Liu, the Rachel Carson Chair in Sustainability at Michigan State University (MSU), because policies intended to improve the world's environment or save rare animals aren't supposed to cause damage in other places. "In this new world of hyper connectivity, even remote areas like the Wolong Nature Reserve are connected in so many ways. If we're going to understand the world fully, and advise policy makers well, science needs to make the big picture as specific and detailed as possible." So Liu, the director of MSU's Center for Systems Integration and Sustainability, and his colleagues across the United States and China, hunkered into the task of showing how pandas in the mountains of southwestern China affect the globe, and vice versa. Their paper "Multiple Telecouplings and their Complex Interrelationships" does a deep dive into the two-way superhighway of trade, economics, physical sciences and environmental ups and downs between Wolong and the rest of the world. For example: panda loans. The cuddly iconic bears from Wolong have gone to seven cities in five countries across the world, usually for fees and often carrying diplomatic and public relations good will in both directions.
The scientists identify not just those points, but zero in on the juggernaut of people involved in negotiating and facilitating the arrangements and the economic benefits and costs -- from increased donations to zoo attendance to operational and training expenses. Then the paper dives deeper. The scientific collaborations and conservation efforts expand as the pandas span out. The carbon emissions soar as pandas get new homes. For instance, a panda pair that was resettled on loan in Edinburgh, Scotland, spewed out some 232,000 kg of CO2. It makes any discussions of panda loans -- or the future of panda loans -- much richer than a simple yes/no consideration. Liu also noted that the telecoupling framework has the potential to extract important understandings from the 2008 Wenchuan earthquake that devastated much of the area in and around Wolong. "The earthquake became almost like a natural laboratory experiment," Liu said. "Causing tourism to instantly stop for a long time has given us a good way to understand the many aspects of conservation and tourism. Wolong has been, in good times and hard times, a wonderful place to understand many important issues in conservation, ecology and sustainability that are shared around the world."
Earthquakes
2015
September 30, 2015
https://www.sciencedaily.com/releases/2015/09/150930073124.htm
Earthquake rupture halted by seamounts
Chile is one of the countries that is most at risk from damaging earthquakes. Therefore, no one was caught by surprise when a series of tremors struck the area around the northern Chilean city of Iquique in spring 2014. The main quake on 1 April reached a magnitude of 8.1 and triggered a tsunami. But experts were surprised that the quake was not as large and damaging as expected, and that it affected only a limited region. Geologists from the GEOMAR Helmholtz Centre for Ocean Research Kiel and the Cluster of Excellence "The Future Ocean," the German Institute for Geosciences and Natural Resources (BGR) and the Institute of Marine Sciences (CSIC) in Barcelona (Spain) have now presented a possible explanation for the smaller than expected tremor. They published their findings in an international journal.
The reason for the high earthquake frequency in Chile lies just off the coast, where the oceanic Nazca plate, one of several tectonic plates in the Pacific region, subducts underneath the South American plate. This leads to the accumulation of stress that will, sooner or later, be released during an earthquake. "In northern Chile, however, there is a 550 kilometer wide gap that has not experienced a major earthquake since 1877," says the lead author of the current study, Dr. Jacob Geersen (GEOMAR/The Future Ocean). "Within this seismic gap, experts expected the next mega earthquake. And at first, many scientists believed the earthquake on 1 April 2014 was this mega quake. But it affected only the central part of the gap and remained well below the expected magnitude of up to 9.0," says Dr. Geersen. To understand the reason for the low intensity of the 2014 Iquique earthquake, Dr. Geersen and his colleagues studied the seafloor topography off northern Chile combined with seismic images that resolve the deep structure under the seafloor. The seismic data had already been collected in 1995 by the BGR in the framework of the "Crustal Investigations off- and on-shore Nazca / Central Andes (CINCA)" research project. "It turned out that the seafloor of the Nazca plate in the affected region is not entirely flat. Instead, there are numerous extinct volcanoes, so-called seamounts, some of them thousands of meters high," describes co-author César R. Ranero, ICREA Research Professor at the Institute of Marine Sciences (CSIC) in Barcelona. These seamounts are, together with the Nazca plate, pushed beneath the South American Plate. "Using the seismic data, we clearly identified several former seamounts, which are now located at the interface between the two plates, thereby actively deforming this interface and the overlying South American Plate," says Dr. Geersen.
Because of this roughness and the associated fractures, less stress is built up in the area around the subducting seamounts, and the resulting earthquake is smaller. "In addition, the subducted seamounts probably physically stopped the spatial propagation of seismic rupture during the Iquique earthquake," says Dr. Geersen. The risk of a future mega earthquake in the seismic gap of northern Chile is not yet reduced. "A portion of the accumulated stress has now been released by the 2014 earthquake. However, in the unbroken northern and southern portions of the seismic gap there is still enough energy that remains to be released during an earthquake with a magnitude larger than 8.5," says Dr. Geersen. Therefore, scientists from around the world continue to monitor the region. In autumn 2015, a team of GEOMAR scientists will also visit the area off northern Chile onboard the German research vessel SONNE in order to install high-precision instrumentation on the seabed that is capable of detecting even small centimeter-scale movements of the subsurface. "We cannot yet forecast earthquakes precisely. But the more we learn about them, the better we can assess the associated risks and take mitigation efforts," concludes Dr. Geersen.
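The gap between the magnitude 8.1 quake that occurred and the magnitude 9.0 that was feared is larger than the numbers suggest, because moment magnitude is logarithmic: Mw = (2/3)(log10 M0 - 9.1), with M0 the seismic moment in newton-meters. A quick comparison using that standard definition:

```python
def seismic_moment_nm(mw: float) -> float:
    """Invert the moment-magnitude definition Mw = (2/3)*(log10(M0) - 9.1)
    to get the seismic moment M0 in N*m."""
    return 10 ** (1.5 * mw + 9.1)

# How much bigger would the feared Mw 9.0 event have been than the
# Mw 8.1 Iquique quake that actually occurred?
ratio = seismic_moment_nm(9.0) / seismic_moment_nm(8.1)  # roughly 22x the moment
```

Each 0.2-unit step in Mw roughly doubles the moment, which is why "remained well below the expected magnitude" translates into a very large fraction of the accumulated strain still being unreleased.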
Earthquakes
2015
September 29, 2015
https://www.sciencedaily.com/releases/2015/09/150929092855.htm
Scientists simulate Earth's middle crust to understand earthquakes
Researchers have for the first time been able to measure a material's resistance to fracturing from various types of tectonic motions in Earth's middle crust, a discovery that may lead to better understanding of how large earthquakes and slower moving events interact.
The University of Texas Institute for Geophysics (UTIG), a research unit of the Jackson School of Geosciences, spearheaded the discovery. The study was published in September. Scientists conducted the research using Carbopol, a gel-like substance that can simulate the characteristics of rock formations in Earth's middle crust because it is simultaneously brittle and malleable. Researchers performed shear tests on the Carbopol, in which a portion of the material is pulled in one direction and a portion is pulled in the opposite direction. This is similar to what happens to rock formations in the middle crust during earthquakes or slow-slip events, a type of tectonic movement that resembles an earthquake but happens over a much longer period of time. Previously, nearly all research into such movements of Earth's crust was done by measuring tectonic movement using GPS readings and linking these findings with friction laws. Those observations did not address how rock behaves when it softens under heat and pressure. "It is not really clear how slow-slip events interact with earthquakes, whether they can trigger earthquakes or it's the other way around -- that earthquakes trigger slow-slip events," said Jacqueline Reber, the study's lead author, who performed this research as a postdoctoral fellow at UTIG and who is now an assistant professor at Iowa State University. The research also adds insight into middle crust strain transients, temporary stress on surrounding rock that's caused by tectonic motion. "By understanding the mechanics of strain transients a little bit better, we eventually hope to get better insight into how they relate to big, catastrophic earthquakes." Unlike slow-slip events, earthquakes -- or stick-slip events -- occur when surfaces quickly alternate between sticking to each other and sliding over each other. "While earlier studies focused mostly on frictional behavior as an explanation for strain transients, we focus in our work on the impact of
rheology (how a material flows under stress), especially when it is semi-brittle," said Reber. The semi-brittle middle crust can be compared to a candy bar made of nuts and caramel. The nuts represent the brittle rock. The caramel represents the ductile rock. Researchers exposed Carbopol, in which the ratio between brittle and ductile parts determines how much stress it can take before being permanently deformed or breaking, to forces created by a simple spring-powered shearing apparatus. Lower yield stress induced the Carbopol to imitate hotter, more viscous rock from deeper in Earth's crust by making it more ductile; at higher yield stress it imitated cooler, more brittle rock. The tests showed viscous deformation and constant creep movement at lower yield stress, and stick-slip behavior at higher yield stress. This highlights the importance of a material's often complex properties for determining the manner and speed with which it will respond to stress.
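The contrast between steady creep and stick-slip seen in these experiments can be illustrated with a classic spring-slider toy model (a deliberate simplification, not the authors' Carbopol apparatus): a block loaded through a spring creeps in tiny increments when static and dynamic friction are equal, and produces fewer, larger slip events when static friction exceeds dynamic friction.

```python
def spring_block(mu_s, mu_d, k=1.0, v_load=1.0, n=1.0, dt=0.01, steps=4000):
    """Toy 1-D spring-slider: a block is loaded through a spring whose far
    end moves at constant velocity. The block sticks until the spring force
    exceeds static friction mu_s*n, then jumps forward until the force drops
    to the dynamic level mu_d*n. Returns the slip distance of each event."""
    x_load, x_block = 0.0, 0.0
    events = []
    for _ in range(steps):
        x_load += v_load * dt
        force = k * (x_load - x_block)
        if force >= mu_s * n:                 # static friction exceeded: slip
            slip = (force - mu_d * n) / k     # slide until force = mu_d*n
            x_block += slip
            events.append(slip)
    return events

creep = spring_block(mu_s=0.3, mu_d=0.3)   # no friction drop: many tiny slips (creep-like)
quakes = spring_block(mu_s=0.6, mu_d=0.3)  # friction drop: rarer, much larger events
```

With no difference between static and dynamic friction the block advances almost continuously; a friction drop stores elastic energy and releases it in discrete "earthquakes," which is the basic intuition behind stick-slip.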
Earthquakes
2015
September 22, 2015
https://www.sciencedaily.com/releases/2015/09/150922115456.htm
Are we wiser about tsunamis? Expert says yes and no
The world may not be well prepared for the next significant tsunami, reports Northwestern University tsunami expert Emile A. Okal in a new study that includes a "wisdom index" for 17 tsunamis since 2004.
The 2004 Sumatra-Andaman tsunami was the most devastating in recorded history, killing more than 225,000 people, including thousands of tourists. In his review of that event and 16 other significant tsunamis since then, Okal used the concept of a "wisdom index" to grade the performance of scientists, decision-makers and populations at risk. The index was based on the warning issued (or not) during the event and on the response of the population. Okal found mixed results as to how much wiser people have become about these natural events and how to reduce their impact. "We cannot foresee how well we will be doing in the next tsunami," said Okal, a seismologist and professor of Earth and planetary sciences in the Weinberg College of Arts and Sciences. "I found that mitigation of these 17 tsunamis was rather erratic -- there is not sustained improvement with time, nor a clear correlation of the wisdom index with the geographic location of the tsunami source." In his paper, Okal reflects on the progress made since the catastrophic event of 2004 in various aspects of tsunami science, warning and mitigation, and more generally in tsunami resilience, i.e., the preventive adaptation of communities to this form of natural hazard. "The Quest for Wisdom: Lessons From Seventeen Tsunamis, 2004-2014" was published Sept. 21. Beyond the mixed "wisdom indices," Okal stresses the importance of incorporating any new knowledge into tsunami warning procedures and public awareness. "In this day and age of professional and leisure travel, the general public worldwide should be aware of tsunami risk," Okal said. "The 2004 Sumatra event was the most lethal disaster in the history of Sweden.
The country lost about 500 tourists on the beaches of Thailand." Okal said his research was strongly influenced by his 20-year collaboration with Costas Synolakis, director of the Tsunami Research Center at the University of Southern California. In a separate article in the same issue of the journal, Synolakis critically assesses the 2011 Fukushima Nuclear Power Plant accident in Japan and concludes it was due to the accumulation of a number of scientific, engineering and management blunders that could and should have been prevented.
Earthquakes
2015
September 21, 2015
https://www.sciencedaily.com/releases/2015/09/150921095152.htm
Fukushima disaster was preventable, study suggests
The worst nuclear disaster since the 1986 Chernobyl meltdown never should have happened, according to a new study.
In a peer-reviewed study, Synolakis and co-author Utku Kânoğlu argue the disaster was preventable. "While most studies have focused on the response to the accident, we've found that there were design problems that led to the disaster that should have been dealt with long before the earthquake hit," said Synolakis, professor of civil and environmental engineering at USC Viterbi. "Earlier government and industry studies focused on the mechanical failures and 'buried the lead.' The pre-event tsunami hazards study, if done properly, would have identified the diesel generators as the lynchpin of a future disaster. Fukushima Dai-ichi was a sitting duck waiting to be flooded." The authors describe the disaster as a "cascade of industrial, regulatory and engineering failures," leading to a situation where critical infrastructure -- in this case, backup generators to keep cooling the plant in the event of main power loss -- was built in harm's way. At the four damaged nuclear power plants (Onagawa, Fukushima Dai-ichi, Fukushima Dai-ni and Tokai Dai-ni), 22 of the 33 total backup diesel generators were washed away, including 12 of 13 at Fukushima Dai-ichi. Of the 33 total backup power lines to off-site generators, all but two were obliterated by the tsunami. Unable to cool itself, Fukushima Dai-ichi's reactors melted down one by one. "What doomed Fukushima Dai-ichi was the elevation of the EDGs (emergency diesel generators)," the authors wrote.
One set was located in a basement, and the others at 10 and 13 meters above sea level; inexplicably and fatally low, Synolakis said. Synolakis and Kânoğlu report that the Tokyo Electric Power Company (TEPCO), which ran the plant, first reduced the height of the coastal cliffs where the plant was built, underestimated potential tsunami heights, relied on its own internal faulty data and incomplete modeling -- and ignored warnings from Japanese scientists that larger tsunamis were possible. Prior to the disaster, TEPCO estimated that the maximum possible rise in water level at Fukushima Dai-ichi was 6.1 meters -- a number that appears to have been based on low-resolution studies of earthquakes of magnitude 7.5, even though quakes of up to magnitude 8.6 have been recorded along the same coast where the plant is located. This is also despite the fact that TEPCO did two sets of calculations in 2008 based on datasets from different sources, each of which suggested that tsunami heights could top 8.4 meters -- possibly reaching above 10 meters. During the 2011 disaster, tsunami heights reached an estimated 13 meters at Fukushima Dai-ichi -- high enough to flood all of the backup generators and wash away power lines. Further, the 2010 Chilean earthquake (magnitude 8.8) should have been a wake-up call to TEPCO, said Synolakis, who describes it as the "last chance to avoid the accident." TEPCO conducted a new safety assessment of Fukushima Dai-ichi -- but used 5.7 meters as the maximum possible height of a tsunami, against the published recommendations of some of its own scientists. TEPCO concluded in November 2010 that it had "assessed and confirmed the safety of the nuclear plants," presenting its findings at a nuclear engineering conference in Japan. "The problem is that all of TEPCO's studies were done internally; there were no safety factors built into the analysis, which anyway lacked context.
Globally, we lack standards for the tsunami-specific training and certification of engineers and scientists who perform hazard studies, and for the regulators who review them, who can in principle ensure that changes be made, if needed," Synolakis said. "How many licensing boards have tsunami-specific questions when granting professional accreditation?" Lacking tsunami-specific training, certification and licensing, the potential for similar mistakes to occur in hazard studies for other coastal nuclear power plants exists, he said. He points to recent studies around the world where lack of experience and context produced tsunami inundation projections with Fukushima-size underestimation of the hazard. Synolakis and Kânoğlu's paper was published on September 21. Their research was supported by ASTARTE Grant 603839 and the National Science Foundation, Award CMMI 1313839. In the same issue of the Philosophical Transactions, another review paper from the universities of Oxford, Cambridge and USC discusses hazards in the Eastern Mediterranean, where nuclear power plants are being planned for construction in the next few years.
Earthquakes
2015
September 14, 2015
https://www.sciencedaily.com/releases/2015/09/150914124653.htm
Seismic signature of small underground chemical blasts linked to gas released in explosion
After analyzing the seismic waves produced by small underground chemical explosions at a test site in Vermont, scientists say that some features of seismic waves could be affected by the amount of gas produced in the explosion.
This unexpected finding may have implications for how scientists use these types of chemical explosions to indirectly study the seismic signal of nuclear detonations. Researchers use chemical blasts to learn more about the specific seismic signatures produced by explosions--which differ from those produced by earthquakes--to help efforts to detect and trace nuclear test explosions under entities such as the Comprehensive Nuclear-Test-Ban Treaty. Chemical explosions are only a proxy for nuclear explosions, however, and it is difficult to say how or if the results of the new study may apply to seismic monitoring of nuclear explosions, cautioned study author Anastasia Stroujkova of Weston Geophysical Corp. In particular, the lingering, non-condensable gas produced in the explosions seemed to affect the size of the low-frequency portion of P waves, which could be important for seismic monitoring, said Stroujkova. High-frequency seismic waves weaken faster than low-frequency waves, sometimes becoming lost among the background noise of other seismic signals before they can reach monitoring stations a thousand or more kilometers away, "while the low frequencies can be detected and analyzed further away from the sources," she explained. In nuclear explosions, the low-frequency amplitude is also proportional to the yield, or amount of energy discharged by a detonation, Stroujkova noted, making it "an important observable characteristic of seismic waves." The amount of gas produced in the chemical explosions could affect the low-frequency P waves in two possible ways, Stroujkova suggested. The expanding gas could contribute to an increase in the volume of the rock cavity produced by the explosion, or the gas may be fracturing the surrounding rock for a long time after the initial explosion.
"More research is needed to better understand and clarify these possible effects," she said. The explosions studied by Stroujkova are part of the New England Damage Experiment (NEDE) in Vermont, conducted by Weston Geophysical Corp. in collaboration with New England Research, Inc. The main goal of the NEDE is to study seismic waves generated by explosives that differ in detonation power. In particular, the project has focused on the slower shear, or S, waves, in which the ground oscillates perpendicular to the direction of wave propagation. The small explosions studied by Stroujkova were fueled by either an ammonium nitrate and fuel oil combination or Composition B, an explosive mix often used in artillery projectiles and land mines. The explosives were detonated in boreholes in the granite bedrock at depths ranging from about 11 to 13 meters (36 to 46 feet). The explosions were performed by a team of professional blasters within a rock quarry. The explosions, designed to be fully contained underground, were smaller than the majority of the blasts used in quarry mining.
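Stroujkova's point about high frequencies fading before they reach distant stations follows from the standard anelastic attenuation relation, in which amplitude decays as exp(-πfx/(Qv)). A minimal sketch, where the quality factor Q and the wave speed are illustrative crustal-path assumptions, not values from the study:

```python
import math

def amplitude_ratio(freq_hz, distance_km, q=600.0, velocity_km_s=3.5):
    """Fraction of initial amplitude surviving anelastic attenuation:
    A/A0 = exp(-pi * f * x / (Q * v)). Q and v are illustrative values."""
    return math.exp(-math.pi * freq_hz * distance_km / (q * velocity_km_s))

# Low frequencies survive to regional monitoring distances far better
# than high frequencies, matching the behavior described above.
for f in (1.0, 10.0):
    print(f"{f:4.0f} Hz after 1000 km: {amplitude_ratio(f, 1000.0):.3g} of initial amplitude")
```

Under these assumptions, a 1 Hz wave retains roughly a fifth of its amplitude after 1,000 km, while a 10 Hz wave is attenuated by more than six orders of magnitude, which is why the low frequencies dominate at distant stations.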
Earthquakes
2015
September 10, 2015
https://www.sciencedaily.com/releases/2015/09/150910144055.htm
Megathrust quake faults weaker, less stressed than thought
Some of the inner workings of Earth's subduction zones and their "megathrust" faults are revealed in a newly published paper.
Subduction zone megathrust faults produce most of the world's largest earthquakes. The stresses are the forces acting on the subduction zone fault system, and they are the forces that drive the earthquakes. Understanding these forces will allow scientists to better model the physical processes of subduction zones, and the results of these physical models may give us more insight into earthquake hazards. "Even a 'weak' fault, meaning a fault with low frictional strength, can accumulate enough stress to produce a large earthquake. It may even be easier for a weak fault to produce a large earthquake, because once an earthquake starts, there aren't as many strongly stuck patches of the fault that could stop the rupture," explained lead author and USGS geophysicist Hardebeck. Although the physical properties of these faults are difficult to observe and measure directly, their frictional strength can be estimated indirectly by calculating the directions and relative magnitudes of the stresses that act on them. The frictional strength of a fault determines how much stress it can take before it slips, creating an earthquake. Evaluating the orientations of thousands of smaller earthquakes surrounding the megathrust fault, Hardebeck calculated the orientation of stress, and from that inferred that all of the faults comprising the subduction zone system have similar strength. Together with prior evidence showing that some subduction zone faults are "weak," this implies that all of the faults are "weak," and that subduction zones are "low-stress" environments. A "strong" fault has a frictional strength equivalent to that of an artificial fault cut in a rock sample in the laboratory. However, the stress released in earthquakes is only about one tenth of the stress that a "strong" fault should be able to withstand. A "weak" fault, in contrast, has only the strength to hold about one earthquake's worth of stress.
A large earthquake on a "weak" fault releases most of the stress, and before the next large earthquake the stress is reloaded due to motion of Earth's tectonic plates.
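The "strong" versus "weak" distinction can be made concrete with the Coulomb friction relation τ = μσn. A minimal sketch, where the friction coefficients and the effective normal stress are illustrative assumptions rather than Hardebeck's measured values:

```python
def frictional_strength_mpa(mu, effective_normal_stress_mpa):
    """Shear stress a fault can hold before slipping: tau = mu * sigma_n."""
    return mu * effective_normal_stress_mpa

# Laboratory ("strong") friction vs. a fault ten times weaker, at an
# illustrative 250 MPa effective normal stress (roughly 10 km depth).
strong = frictional_strength_mpa(0.6, 250.0)
weak = frictional_strength_mpa(0.06, 250.0)
print(f"strong fault: {strong:.0f} MPa, weak fault: {weak:.0f} MPa")
```

Typical earthquake stress drops are on the order of 1 to 10 MPa, i.e. about one "weak"-fault load, which is the sense in which a weak fault holds roughly one earthquake's worth of stress.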
Earthquakes
2015
September 8, 2015
https://www.sciencedaily.com/releases/2015/09/150908210655.htm
Earthquake baseline set in UK to inform future fracking
Seismic activity across the UK has been analyzed for the first time to set a national baseline for earthquakes caused by human activity ahead of any future decisions around fracking.
The study, published today in an academic journal, was carried out by ReFINE (Researching Fracking in Europe), an independent research consortium focusing on the issue of shale gas and oil exploitation using fracking methods. Research lead Professor Richard Davies, of Newcastle University, said: "Earthquakes triggered or induced by humans are not a new concept for us here in the UK, but earthquakes related to fracking are. Understanding what the current situation is and setting a national baseline is imperative, otherwise how can we say with any confidence in the future what the impact of fracking has been nationwide? What this research shows is that in recent years, an average of at least three earthquakes a year, with local magnitudes greater than or equal to 1.5, are a result of human activity. If widespread exploitation of the UK's shale reservoirs is granted and numbers consistently rise then, in conjunction with local monitoring data, we should be able to confidently demonstrate a causal link." The first human-induced earthquake in the UK probably occurred in 1755, due to the collapse of lead mines in Derbyshire. Data collected by the British Geological Survey between 1970 and 2012 shows there have been approximately 8,000 onshore recorded seismic events in the UK, with a range of origins including mining, deep geothermal energy, industrial explosions, meteorological phenomena such as lightning strikes, and natural causes. Analysing the 1,769 seismic events over a forty-year period that were above or equal to local magnitude 1.5 -- the minimum detectable threshold -- the team of experts from Newcastle, Durham and Keele universities showed that at least 21% were related to human activity, at least 40% were naturally occurring and 39% were 'undefined'. "We have been careful only to include earthquakes where there has been a strongly indicated link to human activity," explains Professor Davies.
"Using historic data means that isn't always possible, so some man-made events remain undefined." The data shows a sharp decline in the number of earthquakes from the 1980s, mirroring the demise of the UK coal industry. From 1999 onwards, there has been an average of at least three earthquakes a year related to human activity, with an annual range between zero and eight. Taking into account the undefined earthquakes, the authors' average would still be just 12 a year, providing a useful baseline prior to any future widespread use of fracking for the exploitation of the UK's shale reservoirs. The link between coal mining and man-made seismicity has been known for the last 150 years and became particularly apparent when the British Geological Survey launched its National Seismic Monitoring system in the late 1960s. This latest research suggests that by the mid-1980s, a third of all detected seismic events in the UK were coal mining related, with the majority occurring in the Derbyshire, Nottinghamshire, Staffordshire and Yorkshire coalfields. The number of coal mining related earthquakes drops by more than 95% from 1991, correlating with the closure of the UK's deep mines. The first UK exploratory fracking operation, at Preese Hall, Lancashire, in 2011, resulted in a magnitude 2.3 earthquake. Two months later, a second earthquake of magnitude 1.5 was recorded and operations were suspended. "Historically, fracking-related earthquakes have been small, but the UK is criss-crossed by faults -- some of which may be critically stressed -- and if triggered these could result in earthquakes that people can feel," explains Professor Davies. "Worldwide, the biggest published example of a fracking earthquake to date is 4.4 in magnitude, recorded in Canada last year, although an event of this size in the UK is highly unlikely. We are the first country in the world to analyse historical earthquakes to establish a baseline.
Our research provides the UK with its first pre-fracking national baseline against which, allied to other data sets, we will be able to make informed decisions about shale exploitation and the impact it has, or hasn't had, in the future."
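The percentages reported above imply approximate event counts; a quick sketch of the bookkeeping, rounding from the study's 1,769 events and its quoted shares:

```python
total_events = 1769  # onshore UK events at or above local magnitude 1.5, 1970-2012

# Shares quoted in the article ("at least 21%" human, "at least 40%" natural).
shares = {"human-induced": 0.21, "natural": 0.40, "undefined": 0.39}

counts = {cause: round(total_events * share) for cause, share in shares.items()}
print(counts)
```

This gives roughly 371 human-induced, 708 natural and 690 undefined events; since the first two shares are stated as lower bounds, the undefined category sets the upper limit on how far the baseline could shift.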
Earthquakes
2015
September 3, 2015
https://www.sciencedaily.com/releases/2015/09/150903121716.htm
Study shows how investments reflected shift in environmental views
A new study from The University of Texas at Dallas examines the differences in climate change perceptions in the United States and Europe by looking at investor behavior.
Dr. Anastasia Shcherbakova, clinical assistant professor of finance and managerial economics in the Naveen Jindal School of Management, said the researchers used the 2011 Fukushima crisis in Japan as a natural experiment to evaluate responses of U.S. and European investors to a shifting view of nuclear power. The study was published in a recent journal issue. "This earthquake was something that was not foreseen by anybody, but it managed to change the way that people thought about nuclear power rather dramatically," said Shcherbakova, who also serves as director of the Master of Science in Energy Management program at UT Dallas. In the aftermath of the earthquake and subsequent tsunami and nuclear disaster, there was worldwide pessimism about nuclear energy, Shcherbakova said. Many countries took their nuclear power plants offline for safety inspections, and some countries announced they were getting out of nuclear power. "The global population is growing, which means we are going to require more energy," Shcherbakova said. "When governments start shedding nuclear generation capacity, we are going to need other energy sources to fill the gap." The study examined the behavior of investors in U.S. and European financial markets that reflected their perceptions about the future profitability of fossil fuels and renewable energy. This profitability assessment is based on the perceived potential of these fuel types to substitute for nuclear generation, should it become scarcer in the aftermath of the Fukushima crisis. The researchers used 2010-2011 stock market data from the Center for Research in Security Prices, the New York Stock Exchange and the Bloomberg database. Results show that investment behavior reflects investors' environmental perceptions. The researchers observed a significant increase in returns to coal in the U.S., implying that investors put more money into coal stocks, Shcherbakova said.
This suggests that they perceived cost efficiency and reliability of energy supply to be more pertinent issues than climate change. In Europe, investors put significantly more money into renewable energy stocks, suggesting that they reflect the region's environmentally conscious attitudes and willingness to pay for environmental outcomes, relative to investors in U.S. markets, Shcherbakova said. "In the U.S., there is a very long history of coal," she said. "If you want a cheap and reliable source of energy, you just run your coal plant all day long. In Europe, there's been a significant backlash against coal. People like their countryside and are much more nature-oriented. In Europe, people have been much more willing to embrace renewable energy even though it's much more expensive. In the U.S., we observe a much more price-sensitive energy consumer." Shcherbakova said one implication of the findings is that financial markets can be important contributors to successes and failures of government policy. For example, if the government said that by 2025, 25 percent of the country's energy must come from renewable resources, it would be difficult to achieve that goal if the policy isn't aligned with the reality of economics, she said.
Earthquakes
2015
August 27, 2015
https://www.sciencedaily.com/releases/2015/08/150827083528.htm
What would a tsunami in the Mediterranean look like?
A team of European researchers have developed a model to simulate the impact of tsunamis generated by earthquakes and applied it to the Eastern Mediterranean. The results show how tsunami waves could hit and inundate coastal areas in southern Italy and Greece. The study is published today (27 August) in Ocean Science, an open access journal of the European Geosciences Union (EGU).
Though not as frequent as in the Pacific and Indian oceans, tsunamis also occur in the Mediterranean, mainly due to earthquakes generated when the African plate slides underneath the Eurasian plate. About 10% of all tsunamis worldwide happen in the Mediterranean, with, on average, one large tsunami happening in the region once a century. The risk to coastal areas is high because of the high population density in the area -- some 130 million people live along the sea's coastline. Moreover, tsunami waves in the Mediterranean need to travel only a very short distance before hitting the coast, reaching it with little advance warning. The new study shows the extent of flooding in selected areas along the coasts of southern Italy and Greece, if hit by large tsunamis in the region, and could help local authorities identify vulnerable areas. "The main gap in relevant knowledge in tsunami modelling is what happens when tsunami waves approach the nearshore and run inland," says Achilleas Samaras, the lead author of the study and a researcher at the University of Bologna in Italy. The nearshore is the zone where waves transform -- becoming steeper and changing their propagation direction -- as they propagate over shallow water close to the shore. "We wanted to find out how coastal areas would be affected by tsunamis in a region that is not only the most active in the Mediterranean in terms of seismicity and tectonic movements, but has also experienced numerous tsunami events in the past." The team developed a computer model to represent how tsunamis in the Mediterranean could form, propagate and hit the coast, using information about the seafloor depth, shoreline and topography. "We simulate tsunami generation by introducing earthquake-generated displacements at either the sea bed or the surface," explains Samaras.
"The model then simulates how these disturbances -- the tsunami waves -- propagate and are transformed as they reach the nearshore and inundate coastal areas." As detailed in the Ocean Science study, the team applied their model to tsunamis generated by earthquakes of approximately magnitude 7.0 off the coasts of eastern Sicily and southern Crete. Results show that, in both cases, the tsunamis would inundate the low-lying coastal areas up to approximately 5 metres above sea level. The effects would be more severe for Crete, where some 3.5 square kilometres of land would be under water. "Due to the complexity of the studied phenomena, one should not arbitrarily extend the validity of the presented results by assuming that a tsunami with a magnitude at generation five times larger, for example, would result in an inundation area five times larger," cautions Samaras. "It is reasonable, however, to consider such results as indicative of how different areas in each region would be affected by larger events." "Although the simulated earthquake-induced tsunamis are not small, there has been a recorded history of significantly larger events, in terms of earthquake magnitude and mainshock areas, taking place in the region," says Samaras. For example, a clustering of earthquakes, the largest with magnitude between 8.0 and 8.5, hit off the coast of Crete in 365 AD. The resulting tsunami destroyed ancient cities in Greece, Italy and Egypt, killing some 5,000 people in Alexandria alone. More recently, an earthquake of magnitude of about 7.0 hit the Messina region in Italy in 1908, causing a tsunami that killed thousands, with observed waves locally exceeding 10 metres in height. The team sees the results as a starting point for a more detailed assessment of coastal flooding risk and mitigation along the coasts of the Eastern Mediterranean.
"Our simulations could be used to help public authorities and policy makers create a comprehensive database of tsunami scenarios in the Mediterranean, identify vulnerable coastal regions for each scenario, and properly plan their defence."
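The nearshore transformation Samaras describes follows from the shallow-water relations: phase speed c = √(gh) falls as depth decreases, and, to first order, Green's law scales wave amplitude as h^(-1/4). A minimal sketch with illustrative depths, not values from the study:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def phase_speed_ms(depth_m):
    """Shallow-water tsunami phase speed: c = sqrt(g * h)."""
    return math.sqrt(G * depth_m)

def shoaled_amplitude_m(offshore_amplitude_m, offshore_depth_m, nearshore_depth_m):
    """Green's law: amplitude scales as depth**(-1/4) during shoaling."""
    return offshore_amplitude_m * (offshore_depth_m / nearshore_depth_m) ** 0.25

# A 0.5 m open-sea wave over 2000 m of water approaching the 10 m isobath:
print(f"offshore speed: {phase_speed_ms(2000.0) * 3.6:.0f} km/h")
print(f"nearshore amplitude: {shoaled_amplitude_m(0.5, 2000.0, 10.0):.2f} m")
```

Under these assumptions the wave slows from about 500 km/h to about 36 km/h while nearly quadrupling in height, which is one reason the short travel distances in the Mediterranean leave so little warning time.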
Earthquakes
2015
August 26, 2015
https://www.sciencedaily.com/releases/2015/08/150826135726.htm
Mechanism behind 'strange' earthquakes discovered
It's not a huge mystery why Los Angeles experiences earthquakes. The city sits near a boundary between two tectonic plates -- they shift, we shake. But what about places that aren't along tectonic plate boundaries?
For example, seismicity on the North American plate occurs as far afield as southern Missouri, where earthquakes between 1811 and 1812, estimated at around magnitude 7, caused the Mississippi River to flow backward for hours. Until now, the cause of that seismicity has remained unclear. While earthquakes along tectonic plate boundaries are caused by motion between the plates, earthquakes away from fault lines are primarily driven by motion beneath the plates, according to a new study published by USC scientist Thorsten Becker. Just beneath the Earth's crust is a layer of hot, semi-liquid rock that is continually flowing -- heating up and rising, then cooling and sinking. That convective process, interacting with the ever-changing motion of the plates at the surface, is driving intraplate seismicity and determining in large part where those earthquakes occur. To a lesser extent, the structure of the crust above also influences the location, according to their models. "This will not be the last word on the origin of strange earthquakes. However, our work shows how imaging advances in seismology can be combined with mantle flow modeling to probe the links between seismicity and mantle convection," said Becker, lead author of the study and professor of Earth sciences at the USC Dornsife College of Letters, Arts and Sciences. Becker and his team used an updated mantle flow model to study the motion beneath the mountain belt that cuts north to south through the interior of the Western United States. The area is seismically active -- the reason Yellowstone has geysers is that it sits atop a volcanic hotspot. Previously, scientists had suggested that the varying density of the plates was the main cause. (Imagine a mountain's own weight causing it to want to flow apart and thin out.) Instead, the team found that the small-scale convective currents beneath the plate correlated with seismic events above in a predictable way.
They also tried using the varying plate density, or "gravitational potential energy variations," to predict seismic events and found a much poorer correlation. "This study shows a direct link between deep convection and shallow earthquakes that we didn't anticipate, and it charts a course for improved seismic hazard mapping in plate interiors," said Tony Lowry, co-author of the paper and associate professor of geophysics and geodynamics at Utah State University.
Earthquakes
2015
August 24, 2015
https://www.sciencedaily.com/releases/2015/08/150824212055.htm
Catastrophic landslides post-earthquake
In the last few months, it has once more become clear that large earthquakes can trigger catastrophic landsliding. In the wake of the Nepal earthquake, the landslide community has been warning of persistent and damaging mass wasting due to monsoon rainfall in the epicentral area. However, very little is actually known about the legacy of earthquakes on steep, unstable hillslopes.
Using a dense time series of satellite images and air photos, Odin Marc and colleagues reconstructed the history of landsliding in four mountain areas hit by large, shallow earthquakes. Their reconstructions show that the rate of landsliding caused by rainfall is systematically elevated after an earthquake, up to 20-fold, and then recovers over a period of months to years. The magnitude of this response and the duration of the recovery phase are possibly related to the size of the earthquake. Ruling out other mechanisms, Marc and colleagues found evidence suggesting that heightened landslide rates and their gradual decrease are due to shaking-induced damage of rocks very near Earth's surface and active healing processes. These findings show that in mountain areas intensely shaken by large earthquakes, people should reckon with higher than normal landslide risks during the recovery and rebuilding phase. This risk can be anticipated and monitored, and rates of mass wasting are likely to return to pre-earthquake levels eventually.
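The elevated-then-recovering landslide rate can be sketched as an exponential relaxation toward the background rate. The 20-fold peak comes from the findings above; the recovery timescale here is an illustrative assumption, since the study reports recovery over months to years:

```python
import math

def landslide_rate(t_years, background=1.0, peak_factor=20.0, tau_years=1.5):
    """Rainfall-driven landslide rate after an earthquake, relaxing from
    peak_factor * background at t = 0 back toward the background rate."""
    return background * (1.0 + (peak_factor - 1.0) * math.exp(-t_years / tau_years))

for t in (0.0, 1.0, 3.0, 6.0):
    print(f"t = {t:3.1f} yr: {landslide_rate(t):5.2f} x background rate")
```

With an assumed 1.5-year timescale, the rate is still roughly 3.6 times background after three years and approaching background after six, consistent with the months-to-years healing described above.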
Earthquakes
2015
August 22, 2015
https://www.sciencedaily.com/releases/2015/08/150822154920.htm
Self-healing landscape: Landslides after earthquake
In mountainous regions, earthquakes often cause strong landslides, which can be exacerbated by heavy rain. However, after an initial increase, the frequency of these mass wasting events, often enormous and dangerous, declines, independently of meteorological events and aftershocks. These new findings are presented by a German-Franco-Japanese team of geoscientists, led by the GFZ German Research Centre for Geosciences, in the current issue of the journal Geology. Even after a strong earthquake, landslide activity returns to its pre-earthquake background level over the course of one to four years.
The interactions over time between earthquakes and the processes shaping the landscape are still not well understood. The geoscientists investigated areas affected by landslides related to four moderate to severe earthquakes (magnitude 6.6 to 7.6). "The main difficulty was that one must distinguish between the meteorological and the seismic causes of landsliding. Heavy rain can also produce landslides and can enhance landsliding after an earthquake," says GFZ scientist Odin Marc, the lead author of the study. Two processes interact here. A strong earthquake shakes the soil layer loose from the underlying bedrock and also damages the rock below the topsoil. Water seeps into the resulting cracks and crevices and acts like a lubricating film on which a mountain slope slides into the valley. With the present results from the team of geoscientists, this conceptual model has to be modified. "We analytically separated the effect of the rain from the seismic activity and so were able to determine that the decrease of landslides through time is based on an internal healing process of the landscape," said Odin Marc. The destabilization of the landscape caused by the quake gradually recovers. Over the course of months to years, depending on weather, rocks and the strength of the earthquake, the slide rates return to the pre-earthquake level: the cracks slowly close again or are filled with sand and earth. The landscape self-heals its underlayer and returns to its background hazard potential. This research is highly relevant: the GFZ is currently analyzing these processes in the context of the Nepal quake of April this year. "We had the chance to start a series of measurements directly after the quake and continue for the next few years," explained Niels Hovius, Head of the Section "Geomorphology" at the GFZ, about the current deployment of his team in the Himalayas.
Earthquakes
2015
August 19, 2015
https://www.sciencedaily.com/releases/2015/08/150819120655.htm
Computer models show significant tsunami strength for Ventura and Oxnard, California
Few can forget the photos and videos of apocalyptic destruction a tsunami caused in 2011 in Sendai, Japan. Could Ventura and Oxnard in California be vulnerable to the effects of a local earthquake-generated tsunami? Yes, albeit on a much smaller scale than the 2011 Japan earthquake and tsunami, according to computer models used by a team of researchers, led by seismologists at the University of California, Riverside.
According to their numerical 3D models of an earthquake and resultant tsunami on the Pitas Point and Red Mountain faults -- faults located offshore Ventura, Calif. -- a magnitude 7.7 earthquake would result in many parts of the regional coastline being inundated a few kilometers inland by a tsunami wave, with inundation in places greater than that indicated by the state of California's current reference inundation line. "The hazard from earthquake-generated tsunamis in the Ventura/Oxnard area has received relatively little attention," said Kenny J. Ryan, a graduate student in the Department of Earth Sciences at UC Riverside and the first author of the research paper. "For our study, the shape of the coastline and seafloor produce the most interesting effects on the tsunami, causing a southward moving tsunami to refract -- and therefore rotate -- and focus on the Ventura/Oxnard area. Unfortunately, the Ventura/Oxnard area has relatively flat topography along the coast, so a tsunami can inundate that area quite effectively." Tsunamis are mainly generated by earthquakes. Sustained by gravity, they are long ocean waves that increase in amplitude (the tsunamis become larger) as water depth decreases. Since water depth is generally shallow near coastlines, a tsunami can grow in size as it approaches land, becoming particularly hazardous along heavily populated coastlines such as the Southern California coastline. Capable of achieving propagation speeds of about 435 miles per hour in deep water, tsunamis can be reflected and refracted by changes in topography and bathymetry along shorelines. In their study, the researchers used two different modeling codes: one for the earthquake and one for the tsunami. The vertical seafloor deformation from the earthquake model was used as input to the tsunami model to generate the tsunami.
The tsunami code then calculated tsunami propagation and inundation. "Our study is different in that we use a dynamic earthquake model to calculate seafloor displacement from the earthquake," said coauthor David D. Oglesby, a professor of geophysics in whose lab Ryan works. "Dynamic models such as these calculate movement in time by looking at the forces on and around the fault in time. They are physics-based, and fault slip distribution and ground motion are calculated results of the models." A magnitude 7.7 earthquake generated by the researchers' models along the Pitas Point and Red Mountain faults results in the following scenario: "The models result in large tsunami amplitudes northward and eastward of the fault due to the shape of the coastline and seafloor," Ryan explained. "The probability of such an event in a given time frame is low compared to smaller earthquake events. Nonetheless, it is crucial to investigate the possible effects from such rare but plausible earthquake and tsunami scenarios so that a full hazard assessment can be made. Results from such modeling efforts can help reveal potential regions of high tsunami hazard." Research has shown that the faults in the Ventura basin in Southern California are capable of generating earthquakes of magnitude 7 or greater, as well as significant local tsunamis. Research has also shown that tsunamis generated locally by faulting and landslides offshore California can impact the California coastline in a matter of minutes. "Our study describes one potential earthquake and tsunami scenario along the Pitas Point and Red Mountain faults, and is designed to illustrate the usefulness of rupture modeling in determining tsunami inundation," Ryan cautioned. "It is not intended to give an overall distribution of all possible earthquakes and tsunami hazards in this region. Our models simply give an indication of what may be possible in this region."
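For a sense of scale, a magnitude 7.7 event can be related to fault dimensions through the seismic moment definition M0 = μAD and the standard moment magnitude relation Mw = (2/3)(log10 M0 − 9.1), with M0 in newton-meters. The rupture length, width, slip and rigidity below are illustrative assumptions, not the parameters of the UC Riverside models:

```python
import math

def moment_magnitude(length_km, width_km, slip_m, rigidity_pa=3.0e10):
    """Moment magnitude from seismic moment M0 = mu * A * D (in N*m):
    Mw = (2/3) * (log10(M0) - 9.1)."""
    moment = rigidity_pa * (length_km * 1e3) * (width_km * 1e3) * slip_m
    return (2.0 / 3.0) * (math.log10(moment) - 9.1)

# An illustrative rupture roughly consistent with a magnitude ~7.7 event:
print(f"Mw = {moment_magnitude(length_km=130.0, width_km=25.0, slip_m=4.5):.1f}")
```

Because the relation is logarithmic, each unit of magnitude corresponds to about a 32-fold increase in seismic moment, which is why a 7.7 scenario is so much more energetic than the smaller, more probable events Ryan mentions.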
Earthquakes
2015
August 18, 2015
https://www.sciencedaily.com/releases/2015/08/150818131526.htm
Cascadia initiative to monitor Northwest Pacific seismic risks
Early data coming in from a massive, four-year deployment of seismometers onshore and offshore in the Pacific Northwest are giving scientists a clearer picture of the Cascadia subduction zone, a region with a past and potential future of devastating "megathrust" earthquakes.
The preliminary results from the Cascadia Initiative include a report of previously undetected small earthquakes offshore, and seismic imaging that reveals new offshore structures at the subduction zone. The reports are published as a focus section in the September-October 2015 issue of a seismology journal. The Cascadia subduction zone (CSZ) is a 1,100-kilometer (680-mile) Pacific fault that runs roughly from Cape Mendocino, California, in the south to northern Vancouver Island, British Columbia. The zone marks the place where the Juan de Fuca and Gorda tectonic plates slip beneath the North American plate at a rate of about 2.3 to 4 centimeters (0.9 to 1.6 inches) per year. At subduction zones like this throughout the globe, the tremendous strain built up in these crustal collisions has been released in the world's largest recorded earthquakes. These megathrust quakes include the 2004 magnitude 9.1 Sumatra-Andaman earthquake that devastated parts of Indonesia, and the 2011 magnitude 9.0 Tohoku earthquake in Japan. Although the CSZ has been relatively quiet in recent years, researchers have compiled a historical record of full and partial ruptures of the massive fault, with a magnitude 9.0 earthquake and tsunami last occurring in 1700. Scientists estimate that these megathrust quakes occur at 400- to 600-year intervals. Agencies such as the U.S. Federal Emergency Management Agency (FEMA) have warned of catastrophic damage along the U.S. Northwest coast in the wake of a megathrust quake. Funded from 2011 to 2015 by the National Science Foundation, the Cascadia Initiative was designed in part to collect information on the potential seismic threat of the CSZ. The project includes 27 new inland seismic stations, upgrades to 232 other land stations, and the deployment of 60 new seismometers on the ocean bottom, spread across the tectonic plates.
The data collected by the initiative is openly available to the full scientific community in a database managed by the Incorporated Research Institutions for Seismology (IRIS) Data Management Center. The project "offers a unique opportunity to image the seismic structures associated with an entire plate, including its spreading center and subduction zone, within an easily accessible part of the continental and offshore United States," said University of Massachusetts Amherst researcher Haiying Gao, a guest editor of the focus section. "I don't think we can predict the time or location of the next megathrust earthquake in the CSZ based on the current research progress," Gao cautioned. "Nevertheless, the Cascadia Initiative significantly contributes to a better understanding of the structure of the downgoing oceanic plates and thus to the assessment and mitigation of potential seismic and tsunamic hazards." For instance, a paper by New Mexico Tech researchers Emily Morton and Susan Bilek describes 96 small new earthquakes, occurring in 2011 and 2012, that were detected with the help of the Initiative's ocean floor seismometers. These earthquakes occurred in the shallow, offshore "locked" part of the CSZ, where the fault is stuck in place, and had not been observed by land-based instruments. Detecting and locating these small seismic events can help researchers understand how strain on the megathrust fault may be changing, and help predict how the megathrust might behave during a large rupture. The seismic data collected by the Initiative have also helped Gao and her colleague Yang Shen at the University of Rhode Island, along with a separate study by Columbia University scientist Helen Janiszewski and Cornell University researcher Geoffrey Abers, to compile a picture of the CSZ structure that points to new places where the crushing pressure of subduction is squeezing water from and transforming rock at the trench where the Juan de Fuca plate is bending under the North American plate.
The newly deployed seafloor seismometers, said Gao, have offered an unprecedented look at how released fluid can affect the fault's strength and behavior at the offshore trench. Offshore instruments are important tools for observing and detecting tsunami risks in the region, according to a paper by Anne Sheehan of the University of Colorado Boulder and colleagues. They compared readings from several types of seafloor pressure gauges to study the tsunami caused by the 2012 Haida Gwaii earthquake, to evaluate how well the gauges could detect the timing and size of a tsunami. Other papers in the issue examine how deep-sea sediments may affect seismic wave readings, and evaluate how the Cascadia Initiative's data collection from ocean bottom seismometers has improved over the first three years of the study.
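The convergence rate and recurrence interval quoted in this section imply a simple slip-deficit estimate: a fully locked fault accumulates the plate rate times the elapsed time. A sketch using the article's 2.3 to 4 cm/yr and the time since the 1700 rupture; full locking over the whole interval is an assumption:

```python
def slip_deficit_m(rate_cm_per_yr, years):
    """Slip deficit (meters) accumulated on a fully locked fault."""
    return rate_cm_per_yr * years / 100.0

years_since_rupture = 2015 - 1700  # years since the last CSZ megathrust quake
for rate in (2.3, 4.0):  # convergence rates quoted above, cm/yr
    print(f"{rate} cm/yr -> {slip_deficit_m(rate, years_since_rupture):.1f} m accumulated")
```

That is roughly 7 to 13 meters of potential slip, comparable in scale to the slip released in magnitude ~9 megathrust events, which is consistent with the hazard warnings described above.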
Earthquakes
2015
August 18, 2015
https://www.sciencedaily.com/releases/2015/08/150818112431.htm
Examining the fate of Fukushima contaminants
An international research team reports results of a three-year study of sediment samples collected offshore from the Fukushima Daiichi Nuclear Power Plant in a new paper published August 18, 2015, in a journal of the American Chemical Society.
The research aids in understanding what happens to Fukushima contaminants after they are buried on the seafloor off coastal Japan. Led by Ken Buesseler, a senior scientist and marine chemist at the Woods Hole Oceanographic Institution (WHOI), the team found that a small fraction of contaminated seafloor sediments off Fukushima are moved offshore by typhoons that resuspend radioactive particles in the water, which then travel laterally with southeasterly currents into the Pacific Ocean. "Cesium is one of the dominant radionuclides that was released in unprecedented amounts with contaminated water from Japan's Fukushima Daiichi nuclear power plant following the March 11, 2011, earthquake and tsunami," says Buesseler. "A little over 99 percent of it moved with the water offshore, but a very small fraction--less than one percent--ended up on the sea floor as buried sediment." "We've been looking at the fate of that buried sediment on the continental shelf and tracking how much of that contaminated sediment gets offshore through re-suspension from the ocean bottom," he adds. The research team, which included colleagues from the Japan Agency for Marine-Earth Science and Technology and the Japan Atomic Energy Agency, analyzed three years' worth of data collected from time-series sediment traps. Researchers deployed the pre-programmed, funnel-shaped instruments 115 kilometers (approximately 70 miles) southeast of the nuclear power plant at depths of 500 meters (1,640 feet) and 1,000 meters (3,280 feet).
The two traps began collecting samples on July 19, 2011--130 days after the March 11th earthquake and tsunami--and were recovered and reset annually. After analyzing the data, researchers found radiocesium from the Fukushima Daiichi Nuclear Power Plant accident in the sediment samples along with a high fraction of clay material, which is characteristic of shelf and slope sediments, suggesting a nearshore source. "This was a bit of a surprise because when we think of sediment in the ocean, we think of it as sinking vertically, originating from someplace above. But what this study clearly shows is that the only place that the material in our sediment traps could have come from was the continental shelf and slope buried nearshore. We know this because the coastal sediments from the shelf have a unique Fukushima radioactive and mineral signal," says Buesseler. The data also revealed that peak movements of the sediments with radiocesium coincided with passing typhoons, which likely triggered the resuspension of coastal sediments. Radiocesium was still detected in sediment samples from July 2014. "The total transport is small, though it is readily detectable. One percent or less of the contaminated sediment that's moving offshore every year means things aren't going to change very fast," Buesseler says. "What's buried is going to stay buried for decades to come. And that's what may be contributing to elevated levels of cesium in fish--particularly bottom-dwelling fish off Japan." While there were hundreds of different radionuclides released from the Fukushima Daiichi Nuclear Power Plant during the disaster, after the initial decay of contaminants with half-lives (the time it takes for one-half of a given amount of a radionuclide to decay) of days to weeks or less, much of the attention has remained focused on cesium-137 and -134, two of the more abundant contaminants.
Cesium-134 has a half-life of a little over two years, and so any found in the ocean could come only from the reactors at Fukushima. Cesium-137 has a half-life of roughly 30 years and is also known to have entered the Pacific as a result of aboveground nuclear weapons tests in the 1950s and '60s, providing a benchmark against which to measure any additional releases from the reactors. In October, Buesseler and the research team will return to Japan to redeploy more sediment traps. The continued study will help estimate how long it takes to decrease the level of radiocesium in seafloor sediments near the Fukushima Daiichi Nuclear Power Plant.
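The tracer logic above is simple exponential-decay arithmetic. A minimal sketch (generic decay math, not code from the study) shows why cesium-134 acts as a Fukushima-specific fingerprint:

```python
def remaining_fraction(t_years, half_life_years):
    """Fraction of a radionuclide left after t_years of exponential decay."""
    return 0.5 ** (t_years / half_life_years)

# Half-lives: cesium-134 ~2 years, cesium-137 ~30 years.
# Three years after the 2011 release, most Cs-134 has decayed away while
# Cs-137 is nearly intact -- so any Cs-134 still detected must be recent,
# reactor-derived material rather than 1950s-60s weapons-test fallout.
print(f"Cs-134 remaining: {remaining_fraction(3, 2):.1%}")
print(f"Cs-137 remaining: {remaining_fraction(3, 30):.1%}")
```

Because each isotope decays at its own rate, the measured Cs-134/Cs-137 ratio in a sample also constrains when the cesium was released.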
Earthquakes
2015
August 10, 2015
https://www.sciencedaily.com/releases/2015/08/150810132126.htm
Scientists pioneer method to track water flowing through glaciers
Researchers for the first time have used seismic sensors to track meltwater flowing through glaciers and into the ocean, an essential step to understanding the future of the world's largest glaciers as climate changes.
The University of Texas Institute for Geophysics (UTIG) helped pioneer this new method on glaciers in Greenland and Alaska; the study will be published Aug. 10. Meltwater moving through a glacier into the ocean is critically important because it can increase melting and destabilize the glacier in a number of ways: The water can speed the glacier's flow downhill toward the sea; it can move rocks, boulders and other sediments toward the terminus of the glacier along its base; and it can churn and stir warm ocean water, bringing it in contact with the glacier. "It's like when you drop an ice cube into a pot of warm water. It will eventually melt, but it will melt a lot faster if you stir that water," said Timothy Bartholomaus, a postdoctoral fellow at UTIG and the study's lead author. "Subglacial discharge provides that stirring." The new technique offers scientists a tool for tracking meltwater at glaciers that end in the ocean, called tidewater glaciers. Unlike landlocked glaciers, where scientists can simply measure the meltwater flowing in glacial rivers, there previously had not been a method available to track what's occurring within tidewater glaciers. "All of the biggest glaciers in Greenland, all of the biggest glaciers in Antarctica, they end in the ocean," Bartholomaus said. "We need to understand how these glaciers are moving and how they are melting at their front. If we want to answer those questions, we need to know what's occurring with the meltwater being discharged from the glacier." UTIG research associate Jake Walter worked on the study. The team also includes researchers from the University of Alaska Southeast, the U.S. Geological Survey and the University of Alaska Fairbanks.
Bartholomaus did his fieldwork while studying for his doctorate at the University of Alaska Fairbanks, but he analyzed the data and wrote the study while at UTIG. UTIG is a research unit of The University of Texas at Austin Jackson School of Geosciences. The team discovered the new method while trying to study earthquakes caused by iceberg calving -- when large chunks of ice break off glaciers. Bartholomaus said the ability to identify these earthquakes, known as icequakes, varied over the season, and that they were much more difficult to detect during summer because seismic background noise was obscuring the icequake signals. The team set about trying to determine what was causing the background noise, investigating potential causes such as rainfall, iceberg calving and the movement of the glacier over the ground. Eventually, as the researchers discounted these theories, they discovered that the seismic vibrations being detected by the equipment were caused by meltwater percolating down through the glacier and weaving its way through the complicated plumbing system in the interior of the ice. Researchers tested the theory on glaciers with meltwater rivers and found that the timing of the meltwater and the seismic signals synced perfectly. The method is very good at identifying when the glacial discharge is flowing into the ocean, Bartholomaus said, but it will take more research to determine exactly how much water is flowing out. "Now that we know when subglacial discharge is faster or slower, we can make better measurements of glacier change," Bartholomaus said. "My hope is that this method will really help us understand how the glaciers and the oceans are coupled, and how the ocean might be affecting the behavior of tidewater glaciers."
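The "synced perfectly" claim is the kind of statement that can be checked with a simple correlation between the two time series. A toy illustration with made-up data (the variable names and values here are hypothetical, not the study's measurements):

```python
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(120)

# Hypothetical melt season: river discharge (m^3/s) rises and falls,
# and seismic background-noise power tracks it plus measurement noise.
discharge = 50 + 40 * np.sin(np.pi * days / 120) + rng.normal(0, 3, days.size)
noise_power = 0.02 * discharge + rng.normal(0, 0.1, days.size)

# A high correlation is consistent with subglacial water flow being
# the source of the summer seismic tremor.
r = np.corrcoef(discharge, noise_power)[0, 1]
print(f"correlation r = {r:.2f}")
```

On glaciers with meltwater rivers, where discharge can be measured directly, this kind of agreement is what let the team validate the seismic proxy.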
Earthquakes
2015
August 6, 2015
https://www.sciencedaily.com/releases/2015/08/150806144555.htm
April 2015 earthquake in Nepal reviewed in detail
For more than 20 years, Caltech geologist Jean-Philippe Avouac has collaborated with the Department of Mines and Geology of Nepal to study the Himalayas--the most active, above-water mountain range on Earth--to learn more about the processes that build mountains and trigger earthquakes. Over that period, he and his colleagues have installed a network of GPS stations in Nepal that allows them to monitor the way Earth's crust moves during and in between earthquakes. So when he heard on April 25 that a magnitude 7.8 earthquake had struck near Gorkha, Nepal, not far from Kathmandu, he thought he knew what to expect--utter devastation throughout Kathmandu and a death toll in the hundreds of thousands.
"At first when I saw the news trickling in from Kathmandu, I thought there was a problem of communication, that we weren't hearing the full extent of the damage," says Avouac, Caltech's Earle C. Anthony Professor of Geology. "As it turns out, there was little damage to the regular dwellings, and thankfully, as a result, there were far fewer deaths than I originally anticipated." Using data from the GPS stations, an accelerometer that measures ground motion in Kathmandu, data from seismological stations around the world, and radar images collected by orbiting satellites, an international team of scientists led by Caltech has pieced together the first complete account of what physically happened during the Gorkha earthquake--a picture that explains how the large earthquake wound up leaving the majority of low-story buildings unscathed while devastating some treasured taller structures. The findings are described in two papers that now appear online. In the first study, the researchers show that the earthquake occurred on the Main Himalayan Thrust (MHT), the main megathrust fault along which northern India is pushing beneath Eurasia at a rate of about two centimeters per year, driving the Himalayas upward. Based on GPS measurements, scientists know that a large portion of this fault is "locked." Large earthquakes typically release stress on such locked faults--as the lower tectonic plate (here, the Indian plate) pulls the upper plate (here, the Eurasian plate) downward, strain builds in these locked sections until the upper plate breaks free, releasing strain and producing an earthquake. There are areas along the fault in western Nepal that are known to be locked and have not experienced a major earthquake since a big one (larger than magnitude 8.5) in 1505.
But the Gorkha earthquake ruptured only a small fraction of the locked zone, so there is still the potential for the locked portion to produce a large earthquake. "The Gorkha earthquake didn't do the job of transferring deformation all the way to the front of the Himalaya," says Avouac. "So the Himalaya could certainly generate larger earthquakes in the future, but we have no idea when." The epicenter of the April 25 event was located in the Gorkha District of Nepal, 75 kilometers to the west-northwest of Kathmandu, and the rupture propagated eastward at a rate of about 2.8 kilometers per second, causing slip in the north-south direction--a progression that the researchers describe as "unzipping" a section of the locked fault. "With the geological context in Nepal, this is a place where we expect big earthquakes. We also knew, based on GPS measurements of the way the plates have moved over the last two decades, how 'stuck' this particular fault was, so this earthquake was not a surprise," says Jean Paul Ampuero, assistant professor of seismology at Caltech and a coauthor of the study. In this case, one of the surprises was that the quake did not rupture all the way to the surface. Records of past earthquakes on the same fault--including a powerful one (possibly as strong as magnitude 8.4) that shook Kathmandu in 1934--indicate that ruptures have previously reached the surface. But Avouac, Ampuero, and their colleagues used satellite Synthetic Aperture Radar data and a technique called back projection that takes advantage of the dense arrays of seismic stations in the United States, Europe, and Australia to track the progression of the earthquake, and found that it was quite contained at depth. The high-frequency waves that were largely produced in the lower section of the rupture occurred at a depth of about 15 kilometers. "That was good news for Kathmandu," says Ampuero.
"If the earthquake had broken all the way to the surface, it could have been much, much worse." The researchers note, however, that the Gorkha earthquake did increase the stress on the adjacent portion of the fault that remains locked, closer to Kathmandu. It is unclear whether this additional stress will eventually trigger another earthquake or if that portion of the fault will "creep," a process that allows the two plates to move slowly past one another, dissipating stress. The researchers are building computer models and monitoring post-earthquake deformation of the crust to try to determine which scenario is more likely. Another surprise from the earthquake, one that explains why many of the homes and other buildings in Kathmandu were spared, is described in the second paper. The GPS records revealed a smooth onset of slip, which limited the induced shaking. "It would be good news if the smooth onset of slip, and hence the limited induced shaking, were a systematic property of the Himalayan megathrust fault, or of megathrust faults in general," says Avouac. "Based on observations from this and other megathrust earthquakes, this is a possibility." In contrast to what they saw with high-frequency waves, the researchers found that the earthquake produced an unexpectedly large amount of low-frequency waves with longer periods of about five seconds. This longer-period shaking was responsible for the collapse of taller structures in Kathmandu, such as the Dharahara Tower, a 60-meter-high tower that survived larger earthquakes in 1833 and 1934 but collapsed completely during the Gorkha quake. To understand this, consider plucking the strings of a guitar. Each string resonates at a certain natural frequency, or pitch, depending on the length, composition, and tension of the string. Likewise, buildings and other structures have a natural pitch or frequency of shaking at which they resonate; in general, the taller the building, the longer the period at which it resonates.
If a strong earthquake causes the ground to shake with a frequency that matches a building's pitch, the shaking will be amplified within the building, and the structure will likely collapse. Turning to the GPS records from two of Avouac's stations in the Kathmandu Valley, the researchers found that the effect of the low-frequency waves was amplified by the geological context of the Kathmandu basin. The basin is an ancient lakebed that is now filled with relatively soft sediment. For about 40 seconds after the earthquake, seismic waves from the quake were trapped within the basin and continued to reverberate, ringing like a bell with a period of five seconds. "That's just the right frequency to damage tall buildings like the Dharahara Tower because it's close to their natural period," Avouac explains. In follow-up work, Domniki Asimaki, professor of mechanical and civil engineering at Caltech, is examining the details of the shaking experienced throughout the basin. On a recent trip to Kathmandu, she documented very little damage to low-story buildings throughout much of the city but identified a pattern of intense shaking experienced at the edges of the basin, on hilltops or in the foothills where sediment meets the mountains. This was largely due to the resonance of seismic waves within the basin. Asimaki notes that Los Angeles is also built atop sedimentary deposits and is surrounded by hills and mountain ranges that would also be prone to this type of increased shaking intensity during a major earthquake. "In fact," she says, "the buildings in downtown Los Angeles are much taller than those in Kathmandu and therefore resonate with a much lower frequency. So if the same shaking had happened in L.A., a lot of the really tall buildings would have been challenged." That points to one of the reasons it is important to understand how the land responded to the Gorkha earthquake, Avouac says.
"Such studies of the site effects in Nepal provide an important opportunity to validate the codes and methods we use to predict the kind of shaking and damage that would be expected as a result of earthquakes elsewhere, such as in the Los Angeles Basin."
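The height-period relationship in the article can be made concrete with a common engineering rule of thumb (fundamental period of roughly 0.1 s per story). The rule and the story counts below are illustrative assumptions for a sketch, not figures from the papers:

```python
def natural_period(n_stories):
    """Rule-of-thumb fundamental period of a building, in seconds:
    roughly 0.1 s per story (a standard engineering approximation)."""
    return 0.1 * n_stories

basin_period = 5.0  # seconds; reverberation period of the Kathmandu basin

for stories in (2, 5, 20, 50):
    T = natural_period(stories)
    near_resonance = abs(T - basin_period) < 1.0
    print(f"{stories:3d} stories: T = {T:.1f} s, near basin resonance: {near_resonance}")
```

Low-rise homes, with periods well under a second, sit far from the basin's five-second ringing, which is consistent with the article's account of why they were largely spared while slender towers were not.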
Earthquakes
2015
August 6, 2015
https://www.sciencedaily.com/releases/2015/08/150806112054.htm
Typhoon Haiyan's destructive tsunami-like waves generated by surf beat over a coral reef
Researchers from the International Research Institute of Disaster Science (IRIDeS) at Tohoku University in Sendai, Japan, have been looking into how tsunami-type waves can originate from massive storm systems, independent of earthquakes or landslides.
According to Volker Roeber and Jeremy D. Bricker, massive storm systems can be the cause of devastating tsunami-type waves. It happened during Typhoon Haiyan, which struck the Philippines in November 2013. Typhoon Haiyan was one of the strongest typhoons ever recorded, causing more than 6,000 casualties. A development aid worker caught a scene on video in which a wave devastated parts of a small fishing village (see Fig. 1). The wave swept away entire houses and was reminiscent of the tsunami waves that followed the Great East Japan Earthquake in 2011. Roeber and Bricker have investigated the generation and characteristics of this phenomenon and have found a surprisingly simple explanation called "surf beat." As waves travel in the open ocean, their wavelength determines the propagation speed. Longer waves overtake shorter ones, and the superposition of multiple waves leads to the formation of wave groups, often referred to as "sets." The wave groups can be considered as an additional long wave component embedded in the sea state and bound to the storm waves. Offshore of the wave-breaking zone, the long wave component is not directly visible because it is very long and overlain by other short waves. "The long group waves can have wavelengths of several kilometers but their height is much lower than that of the individual storm waves. This is similar to what characterizes nearshore tsunamis," explains Roeber. As the wave groups approach shallow areas, they break and transport water onshore. Consequently, the accumulated water level pulsates with the wave groups. This is what's called the "surf beat," and it can be observed at many beaches worldwide. While a surf beat is usually harmless and goes unnoticed, it had a devastating impact on the town of Hernani during Typhoon Haiyan. What's more surprising is that Hernani is located behind a 0.5 km wide fringing reef, which should have protected it.
"Until now, we have believed that reefs serve as a reliable protection from storm waves," says Bricker. "But during Haiyan, the opposite seemed to have happened." It turned out that the steep slope of the reef that leads towards the open ocean allows for only a very short wave-breaking zone. When waves break, they dissipate energy. But over the steep reef slope, the breaking process affected mostly the short storm waves. In contrast, the long group waves, which form the surf beat, were able to keep almost all of their energy. These waves propagated freely over the reef flat and even steepened at the beach into a turbulent breaking wave, which destroyed the town's seawall and many houses behind it. "We computed this tsunami-like wave with our numerical models and found its energy to be very similar to waves from past tsunamis in the Pacific," says Roeber. In addition, the researchers state that the Hernani wave was not a worst-case scenario. If the period of the group waves had been in sync with the natural oscillation period of the fringing reef, the wave would have amplified due to resonance and could have been even more destructive. This scenario would have occurred if the reef had been about half as wide. As Typhoon Haiyan showed, tropical storms not only cause devastation due to strong winds. In coastal areas they also cause flooding from storm surges. A storm surge is an abnormal increase in the local sea level. It is mainly driven by the wind -- which pushes water landward -- and the low pressure of the storm, which causes the water to arch like a convex lens. A storm surge can flood low-lying coastal areas similar to an extremely high tide level. In general, disaster management agencies determine coastal flood hazard zones by studying the storm surge inundation. In most countries, this is currently done by using computer models.
However, the conventional storm surge models neglect the important dynamics of the individual storm waves, which is a plausible explanation for the destruction at Hernani. "The storm surge models do a great job for what they were designed for, but they simply cannot account for phenomena such as what we have seen in Hernani," says Bricker. Both researchers therefore think that it is necessary to additionally utilize a new generation of models that resolve individual waves for hazard mitigation purposes. "We have developed accurate and powerful numerical tools which are able to compute these dangerous tsunami-type waves," says Roeber. He especially recommends that the flood maps for coastal communities sheltered by fringing reefs be re-assessed. This includes many islands in the tropical and sub-tropical latitudes that have experienced strong storms in the past, such as Okinawa and Hawaii. "Tsunami-like waves such as the one in Hernani will happen again," he says. "But we can now be better prepared."
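The group mechanism described above is ordinary wave superposition: two swells with nearby periods beat at the difference frequency, producing a slow envelope. A minimal numerical sketch (the periods are illustrative, not the study's values):

```python
import numpy as np

T1, T2 = 10.0, 12.0                 # periods of two swell components, s
t = np.linspace(0.0, 300.0, 6001)   # five minutes of sea-surface record

# Superposed sea surface: individual waves ride inside a slow envelope.
eta = np.sin(2 * np.pi * t / T1) + np.sin(2 * np.pi * t / T2)

# The envelope (the wave "groups") repeats at the difference frequency:
group_period = 1.0 / abs(1.0 / T1 - 1.0 / T2)
print(f"group (surf beat) period: {group_period:.0f} s")
```

A 60-second, kilometers-long group wave of this kind breaks very little over a steep reef slope, which is why the surf beat can cross the reef flat largely intact, as the article describes.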
Earthquakes
2015
August 4, 2015
https://www.sciencedaily.com/releases/2015/08/150804143555.htm
Precariously balanced rocks provide clues for unearthing underground fault connections
Stacked in gravity-defying arrangements in the western San Bernardino Mountains, near the San Andreas Fault, granite boulders that should have been toppled by earthquakes long ago resolutely remain. In exploring why these rocks still stand, researchers have uncovered connections between Southern California's San Jacinto and San Andreas faults that could change how the region plans for future earthquakes.
In a study to be published online Aug. 5, the researchers report evidence that the two faults have interacted in past earthquakes. One such interaction, according to the researchers, might be a rupture that began on the San Andreas Fault but then jumped over to the San Jacinto Fault, near the Cajon Pass. "These faults influence each other, and it looks like sometimes they have probably ruptured together in the past," Grant Ludwig said. "We can't say so for sure, but that's what our data point toward, and it's an important possibility that we should think about in doing our earthquake planning." The Cajon Pass, she noted, is the site of "some very crucial lifeline infrastructure, like I-15, and we should be considering the chance that there might be broader disruptions in that area." Most of the earthquake planning scenarios that engineers and others use to guide the design of buildings, aqueducts and other important infrastructure only account for the consequences of ruptures along one fault, she said. "This paper suggests that we might consider the impact of a rupture that involves both the San Jacinto and San Andreas faults, which has the potential to affect more people than just the San Andreas or just the San Jacinto," Grant Ludwig said. She and her colleagues examined 36 of these perilously arranged boulders near Silverwood Lake and Grass Valley that are only 7 to 10 kilometers from the San Andreas and San Jacinto faults. They're at least 10,000 years old and should have experienced ground shaking from 50 to 100 large, surface-rupturing earthquakes over that time. Scientists gauge the fragility of these rocks by studying their geometry and conducting field tests such as tilt analyses, in which they put a pulley on a precariously balanced boulder and determine the force required to "tilt it to the point where, if you let it go, it will fall under the influence of gravity," Grant Ludwig explained. The researchers report this force as a measure of acceleration.
Just as a person in the passenger seat of a car tilts back when the driver steps on the gas pedal, the delicately balanced rocks tilt in response to the ground accelerating beneath them due to an earthquake. Grant Ludwig and colleagues compared the fragility of the piled-up rocks with the expected ground acceleration in three earthquake scenarios created by the U.S. Geological Survey's ShakeMap program: a magnitude-7.8 rupture of the southern San Andreas Fault, a magnitude-7.4 San Andreas quake near San Bernardino, and the magnitude-7.9 Fort Tejon temblor in 1857. According to these scenarios -- and national seismic hazard maps for the area -- the 36 precariously balanced boulders should have toppled long ago. "It was a real scientific puzzle, a real head-scratcher," Grant Ludwig said. "How can you have these rocks right next to the San Andreas Fault? It's an interesting scientific question, but it also has practical implications, because we want our seismic hazard maps to be as good as possible." After a decade of investigating many potential explanations, the researchers concluded that only interaction between the San Jacinto and San Andreas faults could have produced the kind of rupture pattern that would preserve the area's precariously stacked rocks. Recognizing this interaction could change earthquake planning scenarios for the area, Grant Ludwig concluded. "The San Jacinto Fault has been very seismically active; it's produced a lot of earthquakes during [recorded history]. And the southern San Andreas Fault has not; it's been pretty quiet since 1857," she said. "This brings up the question of whether we might have an earthquake on the San Jacinto that triggers one on the southern San Andreas, or vice versa."
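The tilt test measures a toppling threshold that has a simple quasi-static form: a rigid block tips once horizontal ground acceleration exceeds g·tan(α), where α is the angle from vertical between the rocking edge and the center of mass. A sketch with an illustrative angle (the formula is a standard fragility estimate; the numbers are not from the paper):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def toppling_acceleration(alpha_deg):
    """Quasi-static toppling threshold for a rigid block: horizontal
    acceleration g * tan(alpha), where alpha is the angle from vertical
    between the rocking edge and the center of mass. Slender blocks
    (small alpha) topple at lower accelerations."""
    return G * math.tan(math.radians(alpha_deg))

# A slender boulder with alpha = 15 degrees survives only if peak ground
# acceleration stayed below roughly 0.27 g over its 10,000-year history --
# the kind of constraint that can conflict with hazard-map predictions.
a = toppling_acceleration(15)
print(f"threshold: {a:.2f} m/s^2 (~{a / G:.2f} g)")
```

Comparing such thresholds against scenario ground motions is, in essence, how the survival of the boulders constrains what shaking the site can have experienced.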
Earthquakes
2015
July 24, 2015
https://www.sciencedaily.com/releases/2015/07/150724093736.htm
Residential tourism model implemented in tourist destinations has increased earthquake risk
Antonio Aledo, Professor of Sociology at the University of Alicante, warns that "because of real estate speculation and the management of public budgets based on income from the real estate business, seismic risk has been forgotten." Taking as reference the town of Torrevieja, where one of the biggest earthquakes in the province of Alicante took place in 1829, leaving more than 389 dead and 209 wounded, the professor has published an article on seismic risk in tourist destinations, since "the technological solutions proposed in its Local Action Plan against earthquakes do not seem enough," he says. Aledo asserts that the population, "including foreign residents and tourists, should have more information about how to behave in the face of an earthquake."
As the professor indicates, "the residential tourism model implemented in Torrevieja has not only created economic risk through the construction crisis and social risk (it is the poorest municipality in Spain according to the National Institute of Statistics), but it has also increased seismic vulnerability." Aledo suggests the strict application of seismic-resistant construction measures and the recovery of the model developed after the earthquake of 1829 by engineer José Larramendi. These arguments are developed in his paper "The unquestionability of risk: social vulnerability and earthquake risk in tourist destinations." "Certainly, a key factor in this new approach to seismic risk in tourist destinations is its inclusion in the current and future management of the city; the preventive approach through social communication; and the implementation of risk preparation plans to build up security and confidence among the citizenry," adds the author.
Earthquakes
2015
July 23, 2015
https://www.sciencedaily.com/releases/2015/07/150723133125.htm
Researchers map out trajectory of April 2015 earthquake in Nepal
Researchers from Scripps Institution of Oceanography at UC San Diego have accurately mapped out the movement of the devastating 7.8-magnitude Nepal earthquake that killed over 9,000 and injured over 23,000 people. Scientists have determined that the earthquake was a rupture consisting of three different stages. The study could help a rapidly growing region understand its future seismic risks.
The Himalayan region is particularly prone to earthquakes, and this study will serve as an important benchmark for understanding where future earthquakes may occur, especially since the area has experienced high population growth over the past few decades. The study assessed the presence of low-frequency and high-frequency waves over the three stages of the earthquake. High-frequency waves cause more shaking, thereby posing the greatest risks for structural damage. Low-frequency waves are less violent and less damaging to buildings and infrastructure. "The Nepal earthquake is a warning sign that the region is of high seismic risk, and each earthquake behaves differently. Some earthquakes jump from one fault line to another, whereas the Nepal quake apparently occurred on the same fault line in three different stages, moving eastward," said Scripps geophysicist Peter Shearer. "Using this research, we can better understand and identify areas of high seismic hazard in the region." This first peer-reviewed study on the April 2015 earthquake in Nepal, "Detailed rupture imaging of the 25 April 2015 Nepal earthquake using teleseismic P waves," was published online July 16 in a journal of the American Geophysical Union (AGU). Using the Global Seismic Network (GSN), Shearer and Scripps graduate student Wenyuan Fan were able to unravel the complex evolution of fault slips during this earthquake. The study concludes that the rupture traveled mostly eastward and occurred in three distinct stages: Stage 1 was weak and slow; Stage 2 was near Kathmandu and had the greatest slip but was relatively deficient in high-frequency radiation; and Stage 3 was relatively slow as well.
Overall, this earthquake was more complicated, with multi-stage movements on multiple faults, than smooth models of continuous rupture on a single fault plane. "Using the GSN instead of regional array data really enhanced the spatial resolution of the back-projection images and helped us see that frequency-dependent rupture was one of the main features of this earthquake," said Fan. "Stage 2 was high-frequency-deficient and occurred closest to Kathmandu, which was probably why ground shaking was less severe than expected for such a high-magnitude earthquake." The Global Seismic Network provides high-quality broadband digital seismic data for monitoring earthquakes and learning about Earth structure. A precursor to this network was initiated by Scripps researchers in the 1960s and is still in use today. Scripps currently operates one-third of the 153 global seismometers of the GSN. Fan and Shearer used the GSN data because they are open-source (available to anyone), have good coverage of the Nepal region, and have a long history of reliable recordings. "In general, understanding large earthquakes will inform our ability to forecast the nature of future earthquakes," said Shearer. Shearer and Fan hope to use the same methodology to study other large, global earthquakes from the past decade to provide a broader picture of earthquake behavior and help in predicting ground shaking for future events.
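Back projection, at its core, is delay-and-sum stacking: shift each station's record by the predicted travel time from a candidate source point and sum, so that energy radiated from that point adds coherently. A toy version with synthetic traces (not the authors' code; a real implementation scans a grid of candidate source points with many stations):

```python
import numpy as np

def back_project(traces, delays_s, dt):
    """Delay-and-sum stack: undo each station's travel-time delay for one
    candidate source point, then sum the aligned traces. A strong peak in
    the stack means energy radiated coherently from that point."""
    n = traces.shape[1]
    stack = np.zeros(n)
    for trace, delay in zip(traces, delays_s):
        shift = int(round(delay / dt))
        stack[: n - shift] += trace[shift:]
    return stack

# Synthetic test: one pulse radiated at t = 5 s, recorded at three
# stations with different travel-time offsets.
dt = 0.1
t = np.arange(0.0, 20.0, dt)
pulse = np.exp(-((t - 5.0) ** 2))
delays = [0.0, 1.2, 2.7]
traces = np.array([np.roll(pulse, int(round(d / dt))) for d in delays])

stack = back_project(traces, delays, dt)
print("stacked energy peaks at t =", t[np.argmax(stack)], "s")
```

Repeating this stack for many candidate source points, and in separate frequency bands, is what lets studies like this one map where and when high- versus low-frequency energy was radiated along the rupture.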
Earthquakes
2015
July 22, 2015
https://www.sciencedaily.com/releases/2015/07/150722164225.htm
Researchers quantify nature's role in human well-being
The benefits people reap from nature -- or the harm they can suffer from natural disasters -- can seem as obvious as an earthquake. Yet putting numbers to changes in those ecosystem services and how human well-being is affected has fallen short, until now.
A team of researchers from Michigan State University and Chinese Academy of Sciences are advancing new modeling technology to quantify human dependence on nature, human well-being, and relationships between the two. The latest step is published in this week's The paper notes that people who depended on multiple types of ecosystem services -- such as agricultural products, non-timber forest products, ecotourism -- fared better than those who had all their earning eggs in one natural resource basket."Quantifying the complex human-nature relationships will open the doors to properly respond to environmental changes and guide policies that support both people and the environment across telecoupled human and natural systems," said Jianguo "Jack" Liu, Rachel Carson Chair in Sustainability and director of the Center for Systems Integration and Sustainability (CSIS) at Michigan State University.Wu Yang, who received his doctoral degree while at CSIS, led an effort to scrutinize the impacts of China's devastating 2008 Wenchuan earthquake to show that amidst the devastation not everyone suffered equally and, while ecosystems played an important role in disaster impacts, not every ecosystem service carried the same weight in terms of delivering benefits to people. The results of the study showed that the poor, and those who had low access to social capital, suffered more after the earthquake."We created new ways to quantify human dependence on ecosystem services, measure human well-being, and understand to what extent the dependence on nature affected the well-being of people," said Yang, who now is an associate scientist at Conservation International (CI) in Arlington, VA. 
"Now we're showing the quantitative linkages between nature and human well-being."In the case of the Wolong Nature Reserve, which was near the earthquake's epicenter, many people there depend on the area's rich biodiversity and its powerful tourist draw as the home of the endangered giant pandas for their livelihood. The earthquake opened an opportunity to understand the many layers of destruction the earthquake caused, and how humans responded to it."Those highly dependent on ecosystem services, who are typically impoverished and have little social capital, will be forced to take desperate actions, potentially illegal logging or poaching, if recovery programs still focus on reserve-level infrastructure development and ecological restoration without targeting those poor households," Yang said. "This is why this new thinking is important -- we need quantitative understanding of the linkages between nature and people in order to guide policy making."The methods outlined in the paper can be applied across the globe, using either new data from surveys or existing sources such as statistical yearbooks and censuses. The new approach uses this information to measure multiple dimensions of human well-being such as basic material, security, health, social relationships, and freedom of choice and action, Yang said. For example, he and colleagues from Conservation International recently have also expanded the methods to a national spatially-explicit assessment of human well-being in Cambodia. The scientists also have developed ways to validate their findings with the people who live in areas of study, to confirm the data properly reflects reality.
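A minimal sketch of the kind of quantification described here: a household's dependence on ecosystem services as the share of income they provide, plus a diversification score. The income categories and amounts below are hypothetical, not survey data from the study.

```python
def dependence_and_diversity(income_by_source, ecosystem_sources):
    """Return (share of income from ecosystem services, 1 - Herfindahl index)."""
    total = sum(income_by_source.values())
    es_income = sum(v for k, v in income_by_source.items()
                    if k in ecosystem_sources)
    dependence = es_income / total
    shares = [v / total for v in income_by_source.values() if v > 0]
    diversity = 1.0 - sum(s * s for s in shares)  # higher = more diversified
    return dependence, diversity

household = {"agriculture": 3000, "ecotourism": 2000, "wages": 5000}
dep, div = dependence_and_diversity(household, {"agriculture", "ecotourism"})
print(round(dep, 2), round(div, 2))  # 0.5 0.62
```

A household with all its income in one natural resource would score near zero on diversity, which is the pattern the paper links to worse post-disaster outcomes.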
Earthquakes
2015
July 21, 2015
https://www.sciencedaily.com/releases/2015/07/150721102952.htm
Satellites peer into rock 50 miles beneath Tibetan Plateau
Gravity data captured by satellite has allowed researchers to take a closer look at the geology deep beneath the Tibetan Plateau.
The analysis, published in the journal There, the Indian tectonic plate presses continually northward into the Eurasian tectonic plate, giving rise to the highest mountains on Earth -- and deadly earthquakes, such as the one that killed more than 9,000 people in Nepal earlier this year.The study supports what researchers have long suspected: Horizontal compression between the two continental plates is the dominant driver of geophysical processes in the region, said C.K. Shum, professor and Distinguished University Scholar in the Division of Geodetic Science, School of Earth Sciences at The Ohio State University and a co-author of the study."The new gravity data onboard the joint NASA-German Aerospace Center GRACE gravimeter mission and the European Space Agency's GOCE gravity gradiometer mission enabled scientists to build global gravity field models with unprecedented accuracy and resolution, which improved our understanding of the crustal structure," Shum said. "Specifically, we're now able to better quantify the thickening and buckling of the crust beneath the Tibetan Plateau."Shum is part of an international research team led by Younghong Shin of the Korea Institute of Geosciences and Mineral Resource. With other researchers in Korea, Italy and China, they are working together to conduct geophysical interpretations of the Tibetan Plateau geodynamics using the latest combined gravity measurements by the GOCE gravity gradiometer and the GRACE gravimeter missions.Satellites such as GRACE and GOCE measure small changes in the force of gravity around the planet. Gravity varies slightly from place to place in part because of an uneven distribution of rock in Earth's interior.The resulting computer model offers a 3-D reconstruction of what's happening deep within earth.As the two continental plates press together horizontally, the crust piles up. 
Like traffic backing up on a congested freeway system, the rock follows whatever side roads may be available to relieve the pressure.But unlike cars on a freeway, the rock beneath Tibet has two additional options for escape. It can push upward to form the Himalayan mountain chain, or downward to form the base of the Tibetan Plateau.The process takes millions of years, but caught in the 3-D image of the computer model, the up-and-down and side-to-side motions create a complex interplay of wavy patterns at the boundary between the crust and the mantle, known to researchers as the Mohorovičić discontinuity, or "Moho.""What's particularly useful about the new gravity model is that it reveals the Moho topography is not random, but rather has a semi-regular pattern of ranges and folds, and agrees with the ongoing tectonic collision and current crustal movement measured by GPS," Shin said.As such, the researchers hope that the model will provide new insights into the analysis of collisional boundaries around the world.Co-author Carla Braitenberg of the University of Trieste said that the study has already helped explain one curious aspect of the region's geology: the sideways motion of the Tibetan Plateau. While India is pushing the plateau northward, GPS measurements show that portions of the crust are flowing eastward and even turning to the southeast."The GOCE data show that the movement recorded at the surface has a deep counterpart at the base of the crust," Braitenberg said. Connecting the rock flow below to movement above will help researchers better understand the forces at work in the region.Those same forces led to the deadly Nepal earthquake in April 2015. 
But Shum said that the new model almost certainly won't help with earthquake forecasting -- at least not in the near future."I would say that we would understand the mechanism more if we had more measurements," he said, but such capabilities "would be very far away."Even in California -- where, Shum pointed out, different tectonic processes are at work than in Tibet -- researchers are unable to forecast earthquakes, despite having abundant GPS, seismic and gravity data. Even less is known about Tibet, in part because the rough terrain makes installing GPS equipment difficult.Other co-authors on the study included Sang Mook Lee of Seoul National University; Sung-Ho Na of the University of Science and Technology in Daejeon, Korea; Kwang Sun Choi of Pusan National University; Houtse Hsu of the Institute of Geodesy & Geophysics, Chinese Academy of Sciences; and Young-Sue Park and Mutaek Lim of the Korea Institute of Geosciences and Mineral Resource.This research was supported by the Basic Research Project of the Korea Institute of Geoscience and Mineral Resources, funded by the Ministry of Science, ICT and Future Planning of Korea. Shum was partially supported by NASA's GRACE Science Team Program and Concept in Advanced Geodesy Program. Braitenberg was partially supported by the European Space Agency's Center for Earth Observation as part of the GOCE User ToolBox project.
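The crustal thickening and deep Moho that the gravity data reveal can be illustrated with a back-of-envelope Airy isostasy calculation. The densities are typical textbook values for crust and mantle, not numbers taken from this study.

```python
RHO_CRUST = 2800.0   # kg/m^3
RHO_MANTLE = 3300.0  # kg/m^3

def airy_root(elevation_km):
    """Thickness (km) of the crustal root needed to buoyantly support topography."""
    return elevation_km * RHO_CRUST / (RHO_MANTLE - RHO_CRUST)

# A ~5 km high plateau requires a root of roughly 28 km, which is why the
# Moho beneath Tibet is so deep and shows up so strongly in gravity data.
print(round(airy_root(5.0), 1))  # 28.0
```

Departures of the observed Moho from this simple balance, like the wavy "ranges and folds" the study maps, are what carry information about ongoing tectonic compression.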
Earthquakes
2015
July 9, 2015
https://www.sciencedaily.com/releases/2015/07/150709144850.htm
Volcanic rocks resembling Roman concrete explain record uplift in Italian caldera
The discovery of a fiber-reinforced, concrete-like rock formed in the depths of a dormant supervolcano could help explain the unusual ground swelling that led to the evacuation of an Italian port city and inspire durable building materials in the future, Stanford scientists say.
The "natural concrete" at the Campi Flegrei volcano is similar to Roman concrete, a legendary compound invented by the Romans and used to construct the Pantheon, the Coliseum, and ancient shipping ports throughout the Mediterranean. "This implies the existence of a natural process in the subsurface of Campi Flegrei that is similar to the one that is used to produce concrete," said Tiziana Vanorio, an experimental geophysicist at Stanford's School of Earth, Energy & Environmental Sciences. Campi Flegrei lies at the center of a large depression, or caldera, that is pockmarked by craters formed during past eruptions, the last of which occurred nearly 500 years ago. Nestled within this caldera is the colorful port city of Pozzuoli, which was founded in 600 B.C. by the Greeks and called "Puteoli" by the Romans. Beginning in 1982, the ground beneath Pozzuoli began rising at an alarming rate. Within a two-year span, the uplift exceeded six feet -- an amount unprecedented anywhere in the world. "The rising sea bottom rendered the Bay of Pozzuoli too shallow for large craft," Vanorio said. Making matters worse, the ground swelling was accompanied by swarms of micro-earthquakes. Many of the tremors were too small to be felt, but when a magnitude 4 quake juddered Pozzuoli, officials evacuated the city's historic downtown. Pozzuoli became a ghost town overnight. A teenager at the time, Vanorio was among the approximately 40,000 residents forced to flee Pozzuoli and settle in towns scattered between Naples and Rome. The event made an impression on the young Vanorio, and inspired her interest in the geosciences. 
Now an assistant professor at Stanford, Vanorio decided to apply her knowledge about how rocks in the deep Earth respond to mechanical and chemical changes to investigate how the ground beneath Pozzuoli was able to withstand so much warping before cracking and setting off micro-earthquakes. "Ground swelling occurs at other calderas such as Yellowstone or Long Valley in the United States, but never to this degree, and it usually requires far less uplift to trigger earthquakes at other places," Vanorio said. "At Campi Flegrei, the micro-earthquakes were delayed by months despite really large ground deformations." To understand why the surface of the caldera was able to accommodate incredible strain without suddenly cracking, Vanorio and a post-doctoral associate, Waruntorn Kanitpanyacharoen, studied rock cores from the region. In the early 1980s, a deep drilling program probed the active geothermal system of Campi Flegrei to a depth of about 2 miles. When the pair analyzed the rock samples, they discovered that Campi Flegrei's caprock -- a hard rock layer located near the caldera's surface -- is rich in pozzolana, or volcanic ash from the region. The scientists also noticed that the caprock contained tobermorite and ettringite -- fibrous minerals that are also found in humanmade concrete. These minerals would have made Campi Flegrei's caprock more ductile, and their presence explains why the ground beneath Pozzuoli was able to withstand significant bending before breaking and shearing. But how did tobermorite and ettringite come to form in the caprock? Once again, the drill cores provided the crucial clue. 
The samples showed that the deep basement of the caldera -- the "wall" of the bowl-like depression -- consisted of carbonate-bearing rocks similar to limestone, and that interspersed within the carbonate rocks was a needle-shaped mineral called actinolite. "The actinolite was the key to understanding all of the other chemical reactions that had to take place to form the natural cement at Campi Flegrei," said Kanitpanyacharoen, who is now at Chulalongkorn University in Thailand. From the actinolite and graphite, the scientists deduced that a chemical reaction called decarbonation was occurring beneath Campi Flegrei. They believe that the combination of heat and circulating mineral-rich waters decarbonates the deep basement, prompting the formation of actinolite as well as carbon dioxide gas. As the CO2 escapes, the remaining lime combines with water to form calcium hydroxide, which reacts with the pozzolana to produce a natural cement. "This is the same chemical reaction that the ancient Romans unwittingly exploited to create their famous concrete, but in Campi Flegrei it happens naturally," Vanorio said. In fact, Vanorio suspects that the inspiration for Roman concrete came from observing interactions between the volcanic ash at Pozzuoli and seawater in the region. The Roman philosopher Seneca, for example, noted that the "dust at Puteoli becomes stone if it touches water." "The Romans were keen observers of the natural world and fine empiricists," Vanorio said. "Seneca, and before him Vitruvius, understood that there was something special about the ash at Pozzuoli, and the Romans used the pozzolana to create their own concrete, albeit with a different source of lime." Pozzuoli was the main commercial and military port for the Roman Empire, and it was common for ships to use pozzolana as ballast while trading grain from the eastern Mediterranean. As a result of this practice, volcanic ash from Campi Flegrei -- and the use of Roman concrete -- spread across the ancient world. 
Archeologists have recently found that piers in Alexandria, Caesarea, and Cyprus are all made from Roman concrete and have pozzolana as a primary ingredient. Interestingly, the same chemical reaction that is responsible for the unique properties of Campi Flegrei's caprock can also trigger its downfall. If too much decarbonation occurs -- as might happen if a large amount of saltwater, or brine, gets injected into the system -- an excess of carbon dioxide, methane and steam is produced. As these gases rise toward the surface, they bump up against the natural cement layer, warping the caprock. This is what lifted Pozzuoli in the 1980s. When strain from the pressure buildup exceeded the strength of the caprock, the rock sheared and cracked, setting off swarms of micro-earthquakes. As pent-up gases and fluids vented into the atmosphere, the ground swelling subsided. Vanorio and Kanitpanyacharoen suspect that as more calcium hydroxide was produced at depth and transported to the surface, the damaged caprock was slowly repaired, its cracks "healed" as more natural cement was produced. Vanorio believes the conditions and processes responsible for the exceptional rock properties at Campi Flegrei could be present at other calderas around the world. A better understanding of the conditions and processes that formed Campi Flegrei's caprock could also allow scientists to recreate it in the lab, and perhaps even improve upon it to engineer more durable and resilient concretes that are better able to withstand large stresses and shaking, or to heal themselves after damage. "There is a need for eco-friendly materials and concretes that can accommodate stresses more easily," Vanorio said. "For example, extracting natural gas by hydraulic fracturing can cause rapid stress changes that cause concrete well casings to fail and lead to gas leaks and water contamination."
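The decarbonation reaction driving this system has a simple mass balance (CaCO3 -> CaO + CO2), which gives a feel for how much gas the process can liberate. Standard molar masses in g/mol:

```python
M_CACO3 = 100.09
M_CO2 = 44.01

def co2_released(kg_carbonate):
    """kg of CO2 released by fully decarbonating a given mass of carbonate rock."""
    return kg_carbonate * M_CO2 / M_CACO3

# Roughly 440 kg of CO2 per tonne of carbonate decarbonated
print(round(co2_released(1000.0)))  # 440
```

That nearly half the rock's mass can leave as gas helps explain how excess decarbonation pressurizes and warps the caprock.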
Earthquakes
2015
July 6, 2015
https://www.sciencedaily.com/releases/2015/07/150706091348.htm
Geology: Slow episodic slip probably occurs in the plate boundary
A University of Tokyo research group has discovered slow-moving low-frequency tremors which occur at the shallow subduction plate boundary in Hyuga-nada, off east Kyushu. This indicates the possibility that the plate boundary in the vicinity of the Nankai Trough is slipping episodically and slowly (over days or weeks) without inducing a strong seismic wave.
It was thought that the shallow part of the plate boundary was completely "uncoupled," being able to slowly slip relative to the neighboring plate. However, after the 2011 Great East Japan Earthquake, it was discovered that this is not entirely correct, and it is very important, particularly in the Nankai Trough, an area in which a major earthquake is expected, to understand the coupling state of the plate boundary. Hyuga-nada is located off east Kyushu in the western part of the Nankai Trough, a highly seismically active area in which M7-class interplate earthquakes occur every few decades, but interplate slip at the shallow plate boundary in this region is insufficiently understood. A research group comprising Project Researcher Yusuke Yamashita, Assistant Professor Tomoaki Yamada, Professor Masanao Shinohara and Professor Kazushige Obara at the University of Tokyo Earthquake Research Institute and researchers at Kyushu University, Kagoshima University, Nagasaki University, and the National Research Institute for Earth Science and Disaster Prevention, carried out ocean bottom seismological observation using 12 ocean bottom seismometers installed on the seafloor of Hyuga-nada from April to July 2013. The research group discovered migrating (moving) shallow low-frequency tremors, which are thought to be triggered by slow episodic slip (a slow slip event) at the shallow plate boundary. The shallow tremors had migration properties similar to deep low-frequency tremors that occur at the deep subducting plate interface, and they also occurred synchronized in time and space with shallow very-low-frequency tremors that are also thought to be triggered by slow slip events. These observations indicate that episodic slow slip events are probably occurring at the shallow plate boundary in the vicinity of the Nankai Trough. After the 2011 Great East Japan Earthquake, a fundamental review of the shallow plate boundary interface is required. 
These new findings provide important insight into slip behavior at a shallow plate boundary and will improve understanding and modeling of subduction megathrust earthquakes and tsunamis in the future.This research was published in the journal
Earthquakes
2015
July 1, 2015
https://www.sciencedaily.com/releases/2015/07/150701152333.htm
Creating a stopwatch for volcanic eruptions
We've long known that beneath the scenic landscapes of Yellowstone National Park sleeps a supervolcano with a giant chamber of hot, partly molten rock below it.
Though it hasn't risen from slumber in nearly 70,000 years, many wonder when Yellowstone volcano will awaken and erupt again. According to new research at Arizona State University, there may be a way to predict when that happens.While geological processes don't follow a schedule, petrologist Christy Till, a professor in ASU's School of Earth and Space Exploration, has produced one way to estimate when Yellowstone might erupt again."We find that the last time Yellowstone erupted after sitting dormant for a long time, the eruption was triggered within 10 months of new magma moving into the base of the volcano, while other times it erupted closer to the 10 year mark," says Till.The new study, published Wednesday in the journal This does not mean that Yellowstone will erupt in 10 months, or even 10 years. The countdown clock starts ticking when there is evidence of magma moving into the crust. If that happens, there will be some notice as Yellowstone is monitored by numerous instruments that can detect precursors to eruptions such as earthquake swarms caused by magma moving beneath the surface.And if history is a good predictor of the future, the next eruption won't be cataclysmic.Geologic evidence suggests that Yellowstone has produced three enormous eruptions within the past 2.1 million years, but these are not the only type of eruptions that can occur. Volcanologists say there have been more than 23 smaller eruptions at Yellowstone since the last major eruption approximately 640,000 years ago. The most recent small eruption occurred approximately 70,000 years ago.If a magma doesn't erupt, it will sit in the crust and slowly cool, forming crystals. The magma will sit in that state -- mostly crystals with a tiny amount of liquid magma -- for a very long time. 
Over thousands of years, the last little bit of this magma will crystallize unless it becomes reheated and reignites another eruption.For Till and her colleagues, the question was, "How quickly can you reheat a cooled magma chamber and get it to erupt?"Till collected samples from lava flows and analyzed the crystals in them with the NanoSIMS. The crystals from the magma chamber grow zones like tree rings, which allow a reconstruction of their history and changes in their environment through time."Our results suggest an eruption at the beginning of Yellowstone's most recent volcanic cycle was triggered within 10 months after reheating of a mostly crystallized magma reservoir following a 220,000-year period of volcanic quiescence," says Till. "A similarly energetic reheating of Yellowstone's current sub-surface magma bodies could end approximately 70,000 years of volcanic repose and lead to a future eruption over similar timescales."
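The crystal "tree rings" put a clock on reheating through diffusion chronometry: a compositional zone of width x survives at magmatic temperature only for roughly t ~ x^2 / D. The zone width and diffusivity below are hypothetical round numbers for illustration, not the study's measurements.

```python
def diffusion_time_days(width_m, diffusivity_m2_per_s):
    """Characteristic time (days) for a zone of the given width to diffuse away."""
    return (width_m ** 2 / diffusivity_m2_per_s) / 86400.0

# A 10-micrometre zone with D = 1e-17 m^2/s relaxes on a timescale of
# months, the same order as the ~10-month trigger window quoted above.
print(round(diffusion_time_days(10e-6, 1e-17)))
```

Because sharp zoning is erased quickly at high temperature, finding it preserved in erupted crystals is what bounds the reheating-to-eruption interval so tightly.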
Earthquakes
2015
June 30, 2015
https://www.sciencedaily.com/releases/2015/06/150630121707.htm
Research redefines the properties of faults when rock melts
Geoscientists at the University of Liverpool have used friction experiments to investigate the processes of fault slip.
Fault slip occurs in many natural environments -- including during earthquakes -- when large stress build-ups are rapidly released as two sliding tectonic plates grind together. In this process a large amount of the energy released can be converted to heat, which leads to frictional melting. Frictional melts, when cooled, are preserved in the rock record as pseudotachylytes, but their influence is much greater than just this. As Professor Lavallée and co-workers have demonstrated, the flow properties of the frictional melt help control fault slip. The researchers, from the University's School of Environmental Sciences, warn of the inadequacy of simple Newtonian viscous analyses to describe molten rock along faults, and instead call for the more realistic application of viscoelastic theory. Melt may be considered a liquid, which is able to undergo a glass transition as a result of changing temperature and/or strain rate. This catastrophic transition allows the melt to either flow or fracture, according to the fault slip conditions. Professor Lavallée said: "Even once frictional melt forms, slip can continue as if there was no melt; if the slip rate is fast enough the melt behaves as a solid." Using slip analysis models, the researchers describe the conditions that result in either viscous remobilisation or fracture of the melt, a description which will be of great use in the understanding of fault slip in melt-bearing slip zones. Professor Lavallée added: "This new description of fault slip is not just important for our understanding of earthquake fault rheology, it has far reaching implications for magma transport in volcanic eruptions, for landslide and sector collapse instabilities, and within material sciences; namely for the glass and ceramic industries."
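One standard way to frame "if the slip rate is fast enough the melt behaves as a solid" is viscoelastic: compare the melt's Maxwell relaxation time (tau = viscosity / shear modulus) with the deformation rate via a Deborah number, De = tau * strain rate. De >> 1 is solid-like (fracture); De << 1 is liquid-like (viscous flow). All values below are illustrative, not the paper's.

```python
def deborah_number(viscosity_pa_s, shear_modulus_pa, strain_rate_per_s):
    tau = viscosity_pa_s / shear_modulus_pa  # Maxwell relaxation time, s
    return tau * strain_rate_per_s

G = 10e9    # Pa, typical shear modulus of silicate glass
ETA = 1e8   # Pa*s, hypothetical frictional-melt viscosity

slow = deborah_number(ETA, G, 1e-3)  # slow creep: melt flows
fast = deborah_number(ETA, G, 1e3)   # rapid coseismic shear: melt fractures
print(f"De(slow) = {slow:g}, De(fast) = {fast:g}")
```

The same melt thus crosses the glass transition purely by changing strain rate, which is why a Newtonian viscosity alone cannot describe slip in melt-bearing zones.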
Earthquakes
2015
June 30, 2015
https://www.sciencedaily.com/releases/2015/06/150630100603.htm
Earthquakes in western Solomon Islands have long history, study shows
Researchers have found that parts of the western Solomon Islands, a region thought to be free of large earthquakes until an 8.1 magnitude quake devastated the area in 2007, have a long history of big seismic events.
The findings, published online in The team, led by researchers at The University of Texas at Austin, analyzed corals for the study. The coral, in addition to providing a record of when large earthquakes happened during the past 3,000 years, helped provide insight into the relationship between earthquakes and more gradual geological processes, such as tectonic plate convergence and island building through uplift. "We're using corals to bridge this gap between earthquakes and long-term deformation, how the land evolves," said lead researcher Kaustubh Thirumalai, a doctoral student at the University of Texas Institute for Geophysics (UTIG), a research unit within the Jackson School of Geosciences. The 2007 event was the only large earthquake recorded in 100 years of monitoring the region that started with British colonization in the 1900s. While studying uplifted coral at multiple sites along the eastern coast of the island of Ranongga, the researchers found evidence for six earthquakes in the region during the past 3,000 years, with some being as large as or larger than the 2007 earthquake. "This just shows the importance of paleoseismology and paleogeodesy," Thirumalai said. "If we have 100 years of instrumental data saying there's no big earthquakes here, but we have paleo-records that say we've had something like five giant ones in the last few thousand years, that gives you a different perspective on hazards and risk assessment." During an earthquake, land near its epicenter can be lifted as much as several meters. When the land is shallow-water seafloor, such as it is around the islands, corals can be lifted out of the water with it. The air kills the soft polyps that form coral, leaving behind their network of skeletons and giving the uplifted corals a rock-like appearance. Uplifted coral make good records for earthquakes because they record the time an earthquake occurs and help estimate how strong it was. 
The coral's time of death, which can be deduced through a chemical analysis similar to carbon dating, shows when the earthquake occurred, while the amount of uplift present in the land where the coral was found gives clues about its strength."If we have multiple corals going back in time, and we can date them very precisely, we can go from one earthquake, to many earthquakes, to thousands of years of deformation of the land," Thirumalai said.The UTIG research team comprised Thirumalai, Frederick Taylor, Luc Lavier, Cliff Frohlich and Laura Wallace. They collaborated with scientists from National Taiwan University, including Chuan-Chou "River" Shen, an expert in coral dating, and researchers from the Chinese Academy of Science; the Department of Mines, Energy and Water Resources in the Solomon Islands; and locals who live on Ranongga Island.The earthquakes in the region are a result of plate tectonic motion near the island; only four kilometers offshore the Pacific Plate starts to subduct beneath the Australian Plate. A theory of island building says that uplifts during earthquakes are one of the main drivers of land creation and uprising.However, the earthquake record suggested by the corals was not enough to account for the measured rate of tectonic convergence. This suggests that other geological processes besides those that directly cause earthquakes play an important role in tectonic plate movement and uplift of the islands.Learning the detailed relationship between earthquakes and these forces will take more research, said Thirumalai. But this study has shown uplifted coral are important geological tools.Data collected during a rapid-response mission to study uplifted corals in the wake of the 2007 earthquake served an important role in the research, Thirumalai said. The mission, which Taylor led and the Jackson School funded, provided data that served as a benchmark for analyzing the strength of earthquakes that happened before 2007.
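The corals' "time of death" comes from radiometric dating. Real coral geochronology uses uranium-thorium systematics with more terms than this, but the core decay arithmetic looks like the following; the daughter/parent ratio below is hypothetical.

```python
import math

def radiometric_age_years(daughter_parent_ratio, half_life_years):
    """Age from ingrowth of a daughter isotope: t = ln(1 + D/P) / lambda."""
    lam = math.log(2) / half_life_years  # decay constant, 1/yr
    return math.log(1 + daughter_parent_ratio) / lam

# With the ~75,600-year half-life of thorium-230, a small ingrown ratio
# corresponds to an age of a few thousand years:
print(round(radiometric_age_years(0.028, 75_600)))
```

Precise dates of this kind, paired with the measured uplift at each coral site, are what turn a stack of dead corals into a 3,000-year earthquake chronology.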
Earthquakes
2015
June 29, 2015
https://www.sciencedaily.com/releases/2015/06/150629162230.htm
Helium leakage from Earth's mantle in Los Angeles Basin
UC Santa Barbara geologist Jim Boles has found evidence of helium leakage from Earth's mantle along a 30-mile stretch of the Newport-Inglewood Fault Zone in the Los Angeles Basin. Using samples of casing gas from two dozen oil wells ranging from LA's Westside to Newport Beach in Orange County, Boles discovered that more than one-third of the sites -- some of the deepest ones -- show evidence of high levels of helium-3 (3He).
Considered primordial, 3He is a vestige of the Big Bang. Its only terrestrial source is the mantle. Leakage of 3He suggests that the Newport-Inglewood fault is deeper than scientists previously thought. Boles's findings appear in "The results are unexpected for the area, because the LA Basin is different from where most mantle helium anomalies occur," said Boles, professor emeritus in UCSB's Department of Earth Science. "The Newport-Inglewood fault appears to sit on a 30-million-year-old subduction zone, so it is surprising that it maintains a significant pathway through the crust." When Boles and his co-authors analyzed the 24 gas samples, they found that high levels of 3He inversely correlate with carbon dioxide (CO2). Blueschist found at the bottom of nearby deep wells indicates that the Newport-Inglewood fault is an ancient subduction zone -- where two tectonic plates collide -- even though its location is more than 40 miles west of the current plate boundary of the San Andreas Fault System. Found 20 miles down, blueschist is a metamorphic rock only revealed when regurgitated to the surface via geologic upheaval. "About 30 million years ago, the Pacific plate was colliding with the North American plate, which created a subduction zone at the Newport-Inglewood fault," Boles explained. "Then somehow that intersection jumped clear over to the present San Andreas Fault, although how this occurred is really not known. This paper shows that the mantle is leaking more at the Newport-Inglewood fault zone than at the San Andreas Fault, which is a new discovery." The study's findings contradict a scientific hypothesis that supports the existence of a major décollement -- a low-angle thrust fault -- below the surface of the LA Basin. 
"We show that the Newport-Inglewood fault is not only deep-seated but also directly or indirectly connected with the mantle," Boles said."If the décollement existed, it would have to cross the Newport-Inglewood fault zone, which isn't likely," he added. "Our findings indicate that the Newport-Inglewood fault is a lot more important than previously thought, but time will tell what the true importance of all this is."Study co-authors include Grant Garven of Tufts University; Hilario Camacho of Occidental Oil and Gas Corp.; and John Lupton of the National Oceanic and Atmospheric Administration's Pacific Marine Environmental Laboratory.This research was supported by the U.S. Department of Energy's Office of Science and Office of Basic Energy Sciences and by the NOAA Pacific Marine Environmental Laboratory.
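Helium sourcing of the kind described here is conventionally expressed by normalizing a measured 3He/4He ratio R to the atmospheric ratio Ra. Radiogenic crustal helium sits near ~0.02 Ra, while mantle helium reaches several Ra. The threshold and sample values below are common rules of thumb and made-up inputs, not data from the study.

```python
RA = 1.39e-6  # atmospheric 3He/4He ratio

def helium_source(r_measured):
    """Classify a helium sample by its ratio relative to atmosphere (R/Ra)."""
    r_over_ra = r_measured / RA
    if r_over_ra > 1.0:
        return r_over_ra, "significant mantle component"
    return r_over_ra, "dominantly crustal (radiogenic)"

print(helium_source(2.8e-6))  # ~2 Ra: a clear mantle signal
print(helium_source(5.6e-8))  # ~0.04 Ra: crustal helium
```

Because crustal rock produces almost no 3He, even a modest R/Ra above 1 implies a working conduit from the mantle, which is the crux of the paper's argument about the fault's depth.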
Earthquakes
2015
June 29, 2015
https://www.sciencedaily.com/releases/2015/06/150629123431.htm
Indonesian mud volcano likely human-caused, study suggests
New research led by the University of Adelaide hopes to close the debate on whether a major mud volcano disaster in Indonesia was triggered by an earthquake or had human-made origins.
A mud volcano suddenly opened up in the city of Sidoarjo in East Java, Indonesia, in May 2006. Nine years later the eruption continues -- having buried more than 6.5 km². Results of new research, published today as correspondence in the journal, aim to close that debate. "There has been intense debate over the cause of the mud volcano ever since it erupted," Adjunct Associate Professor Tingay says. "Some researchers argue that the volcano was human-made and resulted from a drilling accident (a blowout) in a nearby gas well. Others have argued that it was a natural event that was remotely triggered by a large earthquake that occurred 250 km away and two days previously. There has been no scientific consensus about this, and it's a very hot topic politically in Indonesia." "Our new research essentially disproves all existing earthquake-triggering models and, in my opinion, puts the matter to rest," he says. The study by Adjunct Associate Professor Tingay and colleagues in the US (Portland University; University of California, Berkeley) and UK (Newcastle University) is the first to use actual physical data collected in the days before and after the earthquake, rather than models and comparisons. "The earthquake-trigger theory proposes that seismic shaking induced liquefaction of a clay layer at the disaster location. Clay liquefaction is always associated with extensive gas release, and it is this large gas release that has been argued to have helped the mud flow upwards and erupt on the surface. However, we examined precise and continuous subsurface gas measurements from the adjacent well and show that there was no gas release following the earthquake," Adjunct Associate Professor Tingay says. "The rocks showed no response to the earthquake, indicating that the earthquake could not have been responsible for the mud flow disaster.
Furthermore, the measurements highlight that the onset of underground activity preceding the mud eruption only started when the drilling 'kick' occurred, strongly suggesting that the disaster was initiated by a drilling accident. "We also use gas signatures from different rocks and the mud eruption itself to 'fingerprint' the initial source of erupting fluids. We demonstrate that erupting fluids were initially sourced from a deep formation, which is only predicted to occur in the drilling-trigger hypothesis. Taken together, our data strongly supports a human-made trigger. We hope this closes the debate on whether an earthquake caused this unique disaster," he says.
Earthquakes
2015
June 25, 2015
https://www.sciencedaily.com/releases/2015/06/150625143926.htm
Backward-moving glacier helps scientists explain glacial earthquakes
The relentless flow of a glacier may seem unstoppable, but a team of researchers from the United Kingdom and the U.S. has shown that during some calving events -- when an iceberg breaks off into the ocean -- the glacier moves rapidly backward and downward, causing the characteristic glacial earthquakes which until now have been poorly understood.
This new insight into glacier behavior, gained by combining field observations in Greenland with laboratory calving experiments, should enable scientists to measure glacier calving remotely and will improve the reliability of models that predict future sea-level rise in a warming climate. The research is scheduled for publication in Science Express on June 25. The lead author is Tavi Murray of Swansea University. Co-authors include U-M's L. Mac Cathles, an assistant professor in the Department of Earth and Environmental Sciences and the Department of Atmospheric, Oceanic and Space Sciences and a postdoctoral fellow in the Michigan Society of Fellows. The Greenland ice sheet is an important contributor to global sea level, and nearly half of the ice sheet's annual mass loss occurs through the calving of icebergs to the ocean. Glacial earthquakes have increased sevenfold in the last two decades and have been migrating north, suggesting an increase in rates of mass loss from the ice sheet through calving. "Our new understanding is a crucial step toward developing tools to remotely measure the mass loss that occurs when icebergs break off ice sheets," Cathles said. "Combining field observations with laboratory measurements from scaled-model calving experiments provided insights into the dynamics of calving and glacial earthquakes that would not have otherwise been possible." Helheim Glacier is one of the largest glaciers in southeast Greenland. At 6 kilometers (3.7 miles) wide and more than 200 kilometers (124.3 miles) long, it can flow as fast as 30 meters (98 feet) a day.
Icebergs calving from Helheim Glacier have been measured at up to 4 kilometers (2.5 miles) across, with a volume of about 1.25 cubic kilometers (0.3 cubic miles). During summer 2013, researchers from Swansea, Newcastle and Sheffield universities installed a robust wireless network of Global Positioning System devices on the chaotic surface of Helheim to measure velocity and displacement of the glacier surface. With U.S. collaborators from U-M, Columbia University and Emory University, earthquake data from the Global Seismographic Network and scaled-down models in water tanks were used to explain the unexpected movements of the glacier in the minutes surrounding the calving events. "We were really surprised to see the glacier flowing backward in our GPS data. The motion happens every time a large iceberg is calved and a glacial earthquake is produced," said Swansea's Murray. "A theoretical model for the earthquakes and the laboratory experiments has allowed us to explain the backward and downward movement." U-M's Cathles helped design and run the laboratory experiments of iceberg calving presented in the paper. The international collaboration grew out of a conversation that Cathles and Murray had at an International Glaciological Society meeting in Chamonix, France, last summer. "We both presented in the same session and realized that I was measuring in the lab a very similar signal to what Professor Murray was observing in the field," Cathles said. "That started a year-long collaboration in which the paper's co-authors talked regularly and collectively developed a model to explain the GPS observations and a deeper understanding of how glacial earthquakes are generated during an iceberg calving event." Understanding this glacier behavior and the associated glacial earthquakes is a crucial step toward remote measurement of calving events and their contribution to sea-level change.
This tool has the potential to provide unprecedented global, near-real-time estimates of iceberg loss from the ice sheet. The research was supported by the U.K. Natural Environment Research Council, the U.S. National Science Foundation and the Climate Change Consortium of Wales and Thales U.K.
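The backward-and-downward step described above shows up in a GPS record as a brief decrease in along-flow position. A minimal sketch of such a reversal detector, using hypothetical numbers rather than the Helheim data or the authors' actual processing chain:

```python
def detect_reversals(positions):
    """Return the indices of steps where along-flow displacement
    decreases, i.e. the glacier momentarily moves backward."""
    return [i for i in range(1, len(positions))
            if positions[i] < positions[i - 1]]

# Hypothetical along-flow GPS positions (metres): steady advance
# interrupted by a brief backward step during a calving event.
positions = [0.0, 1.2, 2.5, 3.8, 3.1, 2.9, 4.0, 5.3]

print(detect_reversals(positions))  # → [4, 5]
```

In practice the field data are noisy, so a real detector would smooth the series and require the reversal to coincide with a teleseismically detected glacial earthquake, as the study's joint GPS-seismic analysis does.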
Earthquakes
2015
June 23, 2015
https://www.sciencedaily.com/releases/2015/06/150623200018.htm
Understanding subduction zone earthquakes: The 2004 Sumatra earthquake
The 26 December 2004 Mw ~9.2 Indian Ocean earthquake (also known as the Sumatra-Andaman or Aceh-Andaman earthquake), which generated massive, destructive tsunamis, especially along the Aceh coast of northern Sumatra, Indonesia, clearly demonstrated the need for a better understanding of how frequently subduction zone earthquakes and tsunamis occur. Toward that end, Harvey M. Kelsey of Humboldt State University and colleagues present a study of earthquake history in the area.
Using subsidence stratigraphy, the team traced the different modes of coastal sedimentation over the course of time in the eastern Indian Ocean, where relative sea-level change evolved from rapidly rising to static from 8,000 years ago to the present day. Kelsey and colleagues discovered that 3,800 to 7,500 years ago, while sea level was gradually rising, there were seven subduction zone earthquakes recorded in coastal deposits. This was determined in part by the fact that each earthquake caused burial of a mangrove soil by sediment and/or deposition of tsunami sand at the time of the earthquake. The team also discovered that sea level gradually stopped rising about 3,800 years ago, which meant that buried soils no longer formed. Thus, detecting subduction zone earthquakes required a different approach. They found a record of successive earthquakes in a sequence of stacked tsunami deposits on the coastal plain. Individual tsunami deposits were 0.2 to 0.5 m thick. Based on this information, Kelsey and colleagues determined that in the past 3,800 years there were between four and six tsunamis caused by Andaman-Aceh-type earthquakes. The authors conclude that knowing the relative sea-level record for a coastal region on a subduction zone margin is the initial step in investigating paleoseismic history. For mid-latitude coasts that border subduction zones, sequences of buried soils may provide a long-duration, subsidence-stratigraphic paleoseismic record that spans to the present. In other settings, such as the Aceh coastal plain, joint research approaches -- for example, targeted foraminiferal analyses and palynology -- are required both to exploit the changing form of the relative sea-level curve and to characterize coastal evolution in the context of the diminishing importance of accommodation space.
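The deposit counts above imply a rough recurrence interval. The arithmetic below is purely illustrative (a simple record span divided by event count, not the paper's statistical treatment), using the four to six tsunamis in ~3,800 years reported:

```python
# Rough recurrence arithmetic from the stacked tsunami deposits:
# 4-6 Andaman-Aceh-type events recorded over the past ~3,800 years.
record_span_yr = 3800

for n_events in (4, 6):
    mean_interval = record_span_yr / n_events
    print(f"{n_events} tsunamis -> mean recurrence ~{mean_interval:.0f} years")
# → 4 tsunamis -> mean recurrence ~950 years
# → 6 tsunamis -> mean recurrence ~633 years
```

A mean interval of roughly 600-950 years is only a long-run average; as the records discussed elsewhere in this collection show, great-earthquake recurrence is far from periodic.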
Earthquakes
2015
June 22, 2015
https://www.sciencedaily.com/releases/2015/06/150622122849.htm
Uplifted island: The island Isla Santa María in south of central Chile
Charles Darwin and his captain Robert Fitzroy witnessed the great earthquake of 1835 in south central Chile. The "Beagle" captain's precise measurements showed an uplift of the island Isla Santa María of 2 to 3 meters after the earthquake. What Darwin and Fitzroy could not know was that, 175 years later, an equally strong earthquake would recur at nearly the same position.
Along the South American west coast, the Pacific Ocean floor moves under the South American continent. Through the buildup and release of tension, the earth's crust broke along the entire continental margin, from Tierra del Fuego to Peru, in a series of earthquakes within one and a half centuries. The earthquake of 1835 was the beginning of such a seismic cycle in this area. After examining the results of the Maule earthquake in 2010, a team of geologists from Germany, Chile and the US was for the first time able to measure and simulate the vertical movement of the earth's crust through a complete seismic cycle at this location. In the current online edition of the journal, the team presents its findings. The Maule earthquake is among the great earthquakes that were fully recorded, and therefore well documented, by a modern network of space-geodetic and geophysical measuring systems on the ground. More difficult was the reconstruction of the processes in 1835. But nautical charts from 1804 (before the earthquake), from 1835 and from 1886, as well as the precise documentation of Captain Fitzroy, allow, in combination with present-day methods, a sufficiently accurate determination of the vertical movement of the earth's crust along a complete seismic cycle. At the beginning of such a cycle, energy is stored by elastic deformation of the earth's crust, then released at the time of the earthquake. "But interestingly, our observations hint at a variable subsidence rate during the seismic cycle," explains Marcos Moreno from the GFZ German Research Centre for Geosciences, one of the co-authors. "Between great earthquakes the plates beneath Isla Santa María are largely locked, dragging the edge of the South American plate, and the island upon it, downward and eastward. During the earthquakes, motion is suddenly reversed, and the edge of the South American plate and the island are thrust upward and to the west." This complex movement pattern was confirmed by a numerical model.
In total, a permanent vertical uplift of 10 to 20% of the complete uplift accumulates over time. Earthquake records show that there are neither periodic repetition times nor consistently repeating magnitudes. Compiling and measuring the deformation of the earth's crust through an entire seismic cycle is therefore an important instrument for better estimating the risks posed by earthquakes.
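The permanent-uplift fraction reduces to simple arithmetic. A hedged sketch combining Fitzroy's 2-3 m coseismic measurement with the study's 10-20% retained fraction (the pairing of specific values is illustrative, not a result stated in the article):

```python
def permanent_uplift_m(coseismic_m, retained_fraction):
    """Uplift retained per seismic cycle: the coseismic jump times the
    fraction not recovered by interseismic subsidence."""
    return coseismic_m * retained_fraction

# Fitzroy measured 2-3 m of coseismic uplift in 1835; the study infers
# that 10-20% of each cycle's uplift is permanent.
for coseismic in (2.0, 3.0):
    for frac in (0.10, 0.20):
        print(f"{coseismic:.0f} m coseismic x {frac:.0%} retained = "
              f"{permanent_uplift_m(coseismic, frac):.2f} m per cycle")
```

At 0.2-0.6 m retained per cycle, many such cycles are needed to build the island's topography, which is why a single geodetic snapshot cannot substitute for observing a full cycle.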
Earthquakes
2015
June 19, 2015
https://www.sciencedaily.com/releases/2015/06/150619103524.htm
Earth science: New estimates of deep carbon cycle
Over billions of years, the total carbon content of the outer part of the Earth -- in its upper mantle, crust, oceans, and atmosphere -- has gradually increased, scientists reported this month in the journal.
Craig Manning, a professor of geology and geochemistry at UCLA, and Peter Kelemen, a geochemistry professor at Columbia University, present new analyses that represent an important advance in refining our understanding of Earth's deep carbon cycle. Manning and Kelemen studied how carbon, the chemical basis of all known life, behaves in a variety of tectonic settings. They assessed, among other factors, how much carbon is added to Earth's crust and how much carbon is released into the atmosphere. The new model combines measurements, predictions and calculations. Their research includes analysis of existing data on samples taken at sites around the world as well as new data from Oman. The carbon 'budget' near the Earth's surface exerts important controls on global climate change and our energy resources, and has important implications for the origin and evolution of life, Manning said. Yet much more carbon is stored in the deep Earth. The surface carbon that is so important to us is made available chiefly by volcanic processes originating deep in the planet's interior. Today carbon can return to Earth's deep interior only by subduction -- the geologic process by which one tectonic plate moves under another tectonic plate and sinks into the Earth's mantle. Previous research suggested that roughly half of the carbon stored in subducted oceanic mantle, crust and sediments makes it into the deep mantle. Kelemen and Manning's new analysis suggests instead that subduction may return almost no carbon to the mantle, and that 'exchange between the deep interior and surface reservoirs is in balance.' Some carbon must make it past subduction zones. Diamonds form in the mantle both from carbon that has never traveled to Earth's surface, known as primordial carbon, and from carbon that has cycled from the mantle to the surface and back again, known as recycled carbon.
Manning and Kelemen corroborated their findings with a calculation based on the characteristics of diamonds, which form from carbon in the earth's mantle. Deep carbon is important because the carbon at the Earth's surface, on which we depend, 'exists only by permission of the deep Earth,' Manning said, quoting a friend. At times in the Earth's history, the planet has been warmer (in the Cretaceous period, for example), and shallow seas covered North America. The new research sheds light on the Earth's climate over geologic time scales.
Earthquakes
2015
June 18, 2015
https://www.sciencedaily.com/releases/2015/06/150618145901.htm
US mid-continent seismicity linked to high-rate injection wells
A dramatic increase in the rate of earthquakes in the central and eastern U.S. since 2009 is associated with fluid injection wells used in oil and gas development, says a new study by the University of Colorado Boulder and the U.S. Geological Survey.
The number of earthquakes associated with injection wells has skyrocketed from a handful per year in the 1970s to more than 650 in 2014, according to CU-Boulder doctoral student Matthew Weingarten, who led the study. The increase included several damaging quakes in 2011 and 2012 ranging between magnitudes 4.7 and 5.6 in Prague, Oklahoma; Trinidad, Colorado; Timpson, Texas; and Guy, Arkansas. "This is the first study to look at correlations between injection wells and earthquakes on a broad, nearly national scale," said Weingarten of CU-Boulder's geological sciences department. "We saw an enormous increase in earthquakes associated with these high-rate injection wells, especially since 2009, and we think the evidence is convincing that the earthquakes we are seeing near injection sites are induced by oil and gas activity." A paper on the subject appears in the June 18 issue of the journal. The researchers found that "high-rate" injection wells -- those pumping more than 300,000 barrels of wastewater a month into the ground -- were much more likely to be associated with earthquakes than lower-rate injection wells. Injections are conducted either for enhanced oil recovery, which involves the pumping of fluid into depleted oil reservoirs to increase oil production, or for the disposal of salty fluids produced by oil and gas activity, said Weingarten. Co-authors on the study include CU-Boulder Professor Shemin Ge of the geological sciences department and Jonathan Godt, Barbara Bekins and Justin Rubinstein of the U.S. Geological Survey (USGS). Godt is based in Denver; Bekins and Rubinstein are based in Menlo Park, California. The team assembled a database of roughly 180,000 injection wells in the study area, which ranged from Colorado to the East Coast.
More than 18,000 wells were associated with earthquakes -- primarily in Oklahoma and Texas -- and 77 percent of associated injection wells remain active, according to the study authors. Of the wells associated with earthquakes, 66 percent were oil recovery wells, said Ge. But active saltwater disposal wells were 1.5 times as likely as oil recovery wells to be associated with earthquakes. "Oil recovery wells involve an input of fluid to 'sweep' oil toward a second well for removal, while wastewater injection wells only put fluid into the system, producing a larger pressure change in the reservoir," Ge said. Enhanced oil recovery wells differ from hydraulic fracturing, or fracking wells, in that they usually inject for years or decades and are operated in tandem with conventional oil production wells, said Weingarten. In contrast, fracking wells typically inject for just hours or days. The team noted that thousands of injection wells have operated during the last few decades in the central and eastern U.S. without a ramp-up in seismic events. "It's really the wells that have been operating for a relatively short period of time and injecting fluids at high rates that are strongly associated with earthquakes," said Weingarten. In addition to looking at injection rates of individual wells over the study area, the team also looked at other aspects of well operations including a well's cumulative injected volume of fluid over time, the monthly injection pressure at individual wellheads, the injection depth, and their proximity to "basement rock" where earthquake faults may lie. None showed significant statistical correlation to seismic activity at a national level, according to the study. Oklahoma had the most seismic activity of any state associated with wastewater injection wells.
But parts of Colorado, west Texas, central Arkansas and southern Illinois also showed concentrations of earthquakes associated with such wells, said Weingarten. In Colorado, the areas most affected by earthquakes associated with injection wells were the Raton Basin in the southern part of the state and near Greeley north of Denver. "People can't control the geology of a region or the scale of seismic stress," Weingarten said. "But managing rates of fluid injection may help decrease the likelihood of induced earthquakes in the future." The study was supported by the USGS John Wesley Powell Center for Analysis and Synthesis, which provides opportunities for collaboration between government, academic and private sector scientists.
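The "1.5 times as likely" contrast between well types reduces to a ratio of simple association rates. The sketch below uses made-up counts (the study's actual database held roughly 180,000 wells); only the ratio logic, not the numbers, reflects the paper:

```python
# Hypothetical counts per well type: (wells associated with quakes, total).
wells = {
    "enhanced_oil_recovery": (1200, 12000),
    "saltwater_disposal":    (900, 6000),
}

# Association rate = fraction of wells of that type near earthquakes.
rates = {kind: assoc / total for kind, (assoc, total) in wells.items()}
ratio = rates["saltwater_disposal"] / rates["enhanced_oil_recovery"]

print(rates)
print(f"disposal wells ~{ratio:.1f}x as likely to be associated")
```

With these illustrative counts the disposal-well rate (0.15) is 1.5 times the recovery-well rate (0.10), echoing the contrast the study attributes to disposal wells adding fluid without removing any.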
Earthquakes
2015
June 18, 2015
https://www.sciencedaily.com/releases/2015/06/150618145809.htm
Oklahoma earthquakes linked to oil and gas drilling
Stanford geophysicists have identified the triggering mechanism responsible for the recent spike of earthquakes in parts of Oklahoma -- a crucial first step in eventually stopping them.
In a new study published in the June 19 issue of the journal, Mark Zoback and Rall Walsh showed that the primary source of the quake-triggering wastewater is not so-called "flow back water" generated after hydraulic fracturing operations. Rather, the culprit is "produced water" -- brackish water that naturally coexists with oil and gas within the Earth. Companies separate produced water from extracted oil and gas and typically reinject it into deeper disposal wells. "What we've learned in this study is that the fluid injection responsible for most of the recent quakes in Oklahoma is due to production and subsequent injection of massive amounts of wastewater, and is unrelated to hydraulic fracturing," said Zoback, the Benjamin M. Page Professor in the School of Earth, Energy & Environmental Sciences. The Stanford study results were a major contributing factor in the recent decision by the Oklahoma Geological Survey (OGS) to issue a statement that said it was "very likely" that most of the state's recent earthquakes are due to the injection of produced water into disposal wells that extend down to, or even beyond, the Arbuckle formation. Before 2008, Oklahoma experienced one or two magnitude 4 earthquakes per decade, but in 2014 alone, the state experienced 24 such seismic events. Although the earthquakes are felt throughout much of the state, they pose little danger to the public, but scientists say that the possibility of triggering larger, potentially damaging earthquakes cannot be discounted. In the study, Zoback and Walsh looked at three study areas -- centered around the towns of Cherokee, Perry and Jones -- in Oklahoma that have experienced the greatest number of earthquakes in recent years. All three areas showed clear increases in quakes following increases in wastewater disposal.
Three nearby control areas that did not have much wastewater disposal did not experience increases in the number of quakes. Because the pair were also able to review data about the total amount of wastewater injected at wells, as well as the total amount of hydraulic fracturing happening in each study area, they were able to conclude that the bulk of the injected water was produced water generated using conventional oil extraction techniques, not during hydraulic fracturing. "We know that some of the produced water came from wells that were hydraulically fractured, but in the three areas of most seismicity, over 95 percent of the wastewater disposal is produced water, not hydraulic fracturing flowback water," said Zoback, who is also a senior fellow at Stanford's Precourt Institute for Energy and director of the university's recently launched Natural Gas Initiative, which is focused on ensuring that natural gas is developed and used in ways that are economically, environmentally, and societally optimal. The three study areas in Oklahoma that Zoback and Walsh looked at all showed a time delay between peak injection rate and the onset of seismicity, as well as spatial separations between the epicenter of the quakes and the injection well sites. Some of the quakes occurred months or even years after injection rates peaked and in locations that were sometimes located miles away from any wells. These discrepancies had previously puzzled scientists, and had even been used by some to argue against a link between wastewater disposal and triggered earthquakes, but Zoback said they are easily explained by a simple conceptual model for Oklahoma's seismicity that his team has developed. According to this model, wastewater disposal is increasing the pore pressure in the Arbuckle formation, the disposal zone that sits directly above the crystalline basement, the rock layer where earthquake faults lie.
Pore pressure is the pressure of the fluids within the fractures and pore spaces of rocks at depth. The earth's crust contains many pre-existing faults, some of which are geologically active today. Shear stress builds up slowly on these faults over the course of geologic time, until it finally overcomes the frictional strength that keeps the two sides of a fault clamped together. When this happens, the fault slips, and energy is released as an earthquake. Active faults in Oklahoma might trigger an earthquake every few thousand years. However, by increasing the fluid pressure through disposal of wastewater into the Arbuckle formation in the three areas of concentrated seismicity -- from about 20 million barrels per year in 1997 to about 400 million barrels per year in 2013 -- humans have sped up this process dramatically. "The earthquakes in Oklahoma would have happened eventually," Walsh said. "But by injecting water into the faults and pressurizing them, we've advanced the clock and made them occur today." Moreover, because pressure from the wastewater injection is spreading throughout the Arbuckle formation, it can affect faults located far from well sites, creating the observed time delay. "You can easily imagine that if a fault wasn't located directly beneath a well, but several miles away, it would take time for the fluid pressure to propagate," Walsh said. Now that the source of the recent quakes in Oklahoma is known, scientists and regulators can work on ways to stop them. One possible solution, Zoback said, would be to cease injection of produced water into the Arbuckle formation entirely, and instead inject it back into producing formations such as the Mississippian Lime, an oil-rich limestone layer where much of the produced water in Oklahoma comes from in the first place. Some companies already reinject water back into reservoirs in order to displace remaining oil and make it easier to recover.
The Stanford study found that this technique, called enhanced oil recovery, does not result in increased earthquakes. Even if companies opt to use producing formations to store wastewater, however, the quakes won't cease immediately. "They've already injected so much water that the pressure is still spreading throughout the Arbuckle formation," Zoback said. "The earthquakes won't stop overnight, but they should subside over time."
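The time delays and well-to-fault distances have a standard back-of-envelope explanation in pressure diffusion, with characteristic time t ≈ r²/(4D). Both the formula's use here and the diffusivity value are illustrative assumptions for this sketch, not numbers taken from the study:

```python
def diffusion_lag_days(distance_m, diffusivity_m2_per_s):
    """Characteristic time t ~ r^2 / (4D) for a pore-pressure change to
    diffuse a distance r through rock of hydraulic diffusivity D."""
    seconds = distance_m ** 2 / (4.0 * diffusivity_m2_per_s)
    return seconds / 86400.0  # convert seconds to days

# A fault ~5 km ("several miles") from a disposal well, with an assumed
# hydraulic diffusivity of 0.5 m^2/s for the Arbuckle formation.
print(f"~{diffusion_lag_days(5000.0, 0.5):.0f} days")
```

Under these assumed values the pressure front takes on the order of months to reach the fault, which is consistent with quakes lagging peak injection by months to years at sites miles from any well.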
Earthquakes
2015
June 16, 2015
https://www.sciencedaily.com/releases/2015/06/150616102401.htm
'Unprecedented' earthquake evidence in Africa discovered
Lead researcher Hannah Hilbert-Wolf and supervisor Dr. Eric Roberts used innovative methods to examine the ground around Mbeya in Tanzania where a large earthquake occurred some 25,000 years ago.
They found evidence of fluidization (where soil behaves like quicksand) and upward displacement of material unprecedented in a continental setting, raising questions of how resilient the rapidly growing cities of the region would be in a major shake. 'We can now use this to evaluate how the ground would deform in a modern earthquake,' said Dr. Roberts. 'This is important because the approach is inexpensive and can be used to model how structures might be affected by future events, providing a valuable tool in hazard assessment.' Hilbert-Wolf said the team found evidence of massive ground deformation and previously unknown styles of liquefaction and fluidization, caused by past earthquakes. 'This could be a major concern for the growing urban population of East Africa, which has similar tectonic settings and surface conditions,' she said. The study comes on the back of a series of damaging earthquakes already this year, including in Nepal and Papua New Guinea, and may be of much use in predicting the effects of earthquakes in those countries. 'What we have shown is that in developing countries in particular, which may lack extensive seismic monitoring, the rock record can be used to not only investigate the timing and frequency of past events, but also provide important insights into how the ground will behave in certain areas to seismic shock,' said Hilbert-Wolf. In 1910, 7.5 million people lived in Tanzania when the most powerful earthquake in Africa of the twentieth century struck, collapsing houses and triggering liquefaction and fluidization. By 2050 it is estimated that around 130 million people will live in Tanzania, mostly in constructed urban settings that are more susceptible to earthquake damage and surface deformation than traditionally fabricated buildings.
Earthquakes
2015
June 15, 2015
https://www.sciencedaily.com/releases/2015/06/150615130857.htm
Origins of Red Sea's mysterious 'cannon earthquakes' revealed
For many generations, Bedouin people living in the Abu Dabbab area on the Egyptian Red Sea coast have heard distinct noises--like the rumbling of a quarry blast or cannon shot--accompanying small earthquakes in the region. Now, a new study published in the
Seismic activity in the area of the Egyptian seaside resort Abu Dabbab may be caused by an active fault that lies below a 10-kilometer-thick block of old, now-rigid igneous rock. The surface of the block slides along the active parts of the fault, lubricated by fluids from the Red Sea that have penetrated the crust, according to Sami El Khrepy of King Saud University in Riyadh, Saudi Arabia, and colleagues. The researchers think this large and rigid block of igneous crust acts as a sort of broadcaster, allowing the full sounds of seismic movement to rise through the rock with little weakening of the acoustic signal. The high-frequency sounds of earthquakes can then be heard by humans at the surface. Earlier studies had suggested that the Abu Dabbab earthquakes were caused by magma rising through the crust, but the new report "found that a volcanic origin of the seismicity is unlikely, and the area is not expected to be subjected to volcanic hazard," said El Khrepy. Earthquake swarms are frequent in this area of the northern Red Sea near Abu Dabbab, but most of the earthquakes are weak, ranging in magnitude from 0.3 to 3.5. The largest well-documented earthquakes, measuring magnitude 6.1 and magnitude 5.1, occurred in 1955 and 1984, respectively. El Khrepy and colleagues decided to take a closer look at the structure of the Abu Dabbab crust, to determine the origin of this seismicity. To peer into the crust, they combined data from local earthquake monitoring with a new set of local and regional earthquake data collected by the National Seismic Network of Egypt (ENSN), which was completed in 2002.
They then applied a technique called seismic tomography, which uses data on the speed of seismic waves traveling through different rock types to develop a 3-D map of some of the subsurface geological features in the area. "This study is the first detailed look at the seismic tomography in this region of Abu Dabbab's cannon earthquakes," said El Khrepy. "It was not possible to do this work without ENSN deployment." The researchers determined that the earthquakes at Abu Dabbab extend in a line from the coast into the Red Sea, "and the seismicity pattern is arc-shaped in depth, confined to the dome-like structure of the rigid igneous body that formed during the Precambrian era" above an active fault, El Khrepy said. The strike-slip and thrust movements of the fault may drive water from the Red Sea in between the fault and the surface of the igneous block, allowing the two to slip past each other, the scientists suggest. "Based on the new results and also the historical data," El Khrepy concluded, "we report that the confined seismicity in this zone is of tectonic, and not volcanic, origin."
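Seismic tomography rests on a simple relation: travel time equals path length divided by wave speed, so timing anomalies reveal what rock a wave crossed. A toy two-rock comparison (the speeds are assumed textbook-style values, not the study's velocity model):

```python
def travel_time_s(path_km, speed_km_s):
    """Straight-ray travel time through a uniform block of rock."""
    return path_km / speed_km_s

# Assumed P-wave speeds: fast rigid igneous basement vs. slower cover.
igneous_km_s, sediment_km_s = 6.0, 3.0
path_km = 30.0

t_igneous = travel_time_s(path_km, igneous_km_s)
t_sediment = travel_time_s(path_km, sediment_km_s)
print(f"igneous: {t_igneous:.1f} s, sediment: {t_sediment:.1f} s, "
      f"residual: {t_sediment - t_igneous:.1f} s")
```

A real tomographic inversion combines thousands of such ray paths from many earthquakes and stations to solve for the 3-D speed structure; here the faster arrivals would flag the rigid igneous dome the study images.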
Earthquakes
2015
June 10, 2015
https://www.sciencedaily.com/releases/2015/06/150610111135.htm
Social media should play greater role in disaster communication
When Typhoon Haiyan slammed into the Philippines in 2013, thousands of people were killed, in part because they didn't know it was coming or didn't know how to protect themselves.
Could an increased use of social media, particularly on the part of the nation's government, have made a difference? While that question remains open, it is clear that social media should play a larger role in emergency preparedness, says Bruno Takahashi, a Michigan State University assistant professor of journalism who studies the issue. Using the Philippines' typhoon as a case study, Takahashi and his fellow researchers looked into the matter and determined that more tweets and Facebook messaging might have made a difference. "We need to think of social media not as an afterthought," he said. "It needs to be integrated into emergency-preparedness plans." He said as the typhoon, one of the strongest storms ever recorded on Earth, made landfall, many individuals and some journalists were using Twitter to spread information. However, the government was not. "All of the coordination of relief and what to do to seek shelter came after the storm hit," Takahashi said. "Maybe that is something governments should do ahead of time -- be more proactive." For this study, which was published in the journal, the researchers also examined how survivors used social media to cope. "We have to think about social media not just as this place online where people go to have fun or share mindless thoughts," he said. "It's apparent that social media can be a really powerful tool, not only for preparedness, but also as a coping mechanism." Just as radio was years ago, social media helps people connect with others, lets them know there are others out there sharing the same problems. "It lets people know they are somehow connected to others," he said. "People use social media to share their feelings, as well as help them try to make sense of the tragedy." One way in which Tacloban City, which took the brunt of the storm, used social media afterward was that officials set up a center where people could log onto Facebook.
They were given three minutes to send a message, letting friends and loved ones know they were all right.

How effective can social media, particularly Twitter, be at spreading news? Takahashi said that social media messages can spread faster than natural disasters, including earthquakes like the one in Nepal last month. "There was an instance in which people who had not felt an earthquake got a tweet about it, then felt it seconds later," he said.
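The anecdote works because seismic shear waves cross the crust at only a few kilometers per second, while a tweet reaches readers at network speed. A back-of-the-envelope sketch (not from the article; the wave speed and network latency below are illustrative assumptions) shows how far from the epicenter a prompt tweet can arrive before the shaking does:

```python
# Sketch: why a tweet can outrun an earthquake. Assumes a typical
# crustal shear-wave speed (~3.5 km/s) and an assumed end-to-end
# tweet delay (post + delivery + read) of 5 seconds.

S_WAVE_SPEED_KM_S = 3.5   # assumed average shear-wave speed in the crust
TWEET_LATENCY_S = 5.0     # assumed total delay before a reader sees the tweet

def shaking_arrival_time(distance_km: float) -> float:
    """Seconds until strong shaking (the S wave) reaches a site."""
    return distance_km / S_WAVE_SPEED_KM_S

def warning_margin(distance_km: float) -> float:
    """Seconds of warning a reader at this distance gets from a prompt tweet."""
    return shaking_arrival_time(distance_km) - TWEET_LATENCY_S

for d in (20, 50, 100, 200):
    margin = warning_margin(d)
    status = f"{margin:+.0f} s of warning" if margin > 0 else "no useful warning"
    print(f"{d:>3} km from epicenter: shaking arrives in "
          f"{shaking_arrival_time(d):5.1f} s -> {status}")
```

Under these assumptions, anyone more than roughly 20 km from the epicenter can, in principle, read about the quake before feeling it, which matches the "got a tweet about it, then felt it seconds later" account.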
Earthquakes
2015
June 10, 2015
https://www.sciencedaily.com/releases/2015/06/150610093215.htm
Researchers to help create 'early-warning systems' through social media to combat future disasters
University of Leicester researchers are examining how communities can use social media to improve their resilience to both human-made and natural disasters -- such as the recent Nepal earthquake or the sinking of ships that left thousands of Rohingya and Bangladeshi migrants stranded at sea.
Researchers from the University of Leicester's Department of Media and Communication, led by Dr Paul Reilly, are contributing to a European Commission Horizon 2020-funded project which will examine how social media can be used to crowdsource information during a crisis situation -- and how this information can help reduce response and recovery times and raise awareness about the risk of future disasters.

The project, 'IMPROVER: Improved risk evaluation and implementation of resilience concepts to critical infrastructure', will see the Leicester team look specifically at how community representatives and those involved in emergency management can use social media to create early-warning systems that can be activated during such events. They hope to identify examples of good practice for information dissemination to the public during crises. These will be used to develop a communication strategy for emergency services and incident managers that will raise public awareness about the risks associated with these events.

Dr Paul Reilly, who is leading the Leicester project, said: "We hope this research will provide valuable evidence for agencies involved in emergency management and members of the public.

"We will look at the value of crowdsourced crisis information for those involved in emergency management.
In particular, we will explore how members of the public can be empowered to provide accurate and timely information during these events that decreases response and recovery times."

The researchers will examine: natural catastrophes, such as earthquakes and flooding; fires in buildings and tunnels; and outdoor events such as pop concerts. Examples such as the Flood Alert app in the UK and 'One Source One Message' in Australia will be used to explore how the live information provided by social media users might be used by resilience practitioners to increase disaster preparedness within communities vulnerable to such incidents.

The Leicester team will consist of Dr Paul Reilly and Research Associate Dr Dimitrinka Atanasova from the University of Leicester's Department of Media and Communication, with this part of the project due to finish in December 2016.

Dr Atanasova added: "Crowdsourcing efforts are on the rise. In the very recent Nepal earthquake we have witnessed another wave of crowdsourcing efforts, such as tweet and image maps of people trapped in debris. As these efforts are rising, there is also a growing need to critically evaluate them, which is what we hope to do in this project."

The IMPROVER project also includes: SP Technical Research Institute of Sweden; Euro-Mediterranean Seismological Centre; The Arctic University of Norway in Tromsø; DBI (Denmark); INOV INESC Inovação; INERIS (the French National Institute for Industrial Environment and Risks); SP Fire Research AS; University College London; and the European Commission's Joint Research Centre.

Dr Reilly and Dr Atanasova will be discussing the research at the IMPROVER kick-off meeting, which will take place at the SP Technical Research Institute of Sweden, Borås, Sweden on 11 and 12 June.
Earthquakes
2015
June 9, 2015
https://www.sciencedaily.com/releases/2015/06/150609124543.htm
'Myths' persist about the increase in human-caused seismic activity
Seismologists studying the recent dramatic upswing in earthquakes triggered by human activity want to clear up a few common misconceptions about the trend.
There is increasing evidence that these earthquakes are caused by injecting fluids from oil and gas operations deep into the earth. These human-caused earthquakes are sometimes called "induced earthquakes."

Guest editor Justin Rubinstein, a scientist with the U.S. Geological Survey, explains that most of the induced earthquakes felt in the United States are from the disposal of large amounts of wastewater from oil and gas production. The majority of this wastewater is ancient ocean brine that was trapped in rock layers along with gas and oil deposits. Only a small percentage of induced seismicity comes from fracking processes that inject liquid into the ground to break up rock layers to free oil and gas for recovery.

Wastewater disposal from oil and gas operations has increased in the U.S. in the past decade, especially in states like Oklahoma, where the amount of wastewater disposal doubled between 1999 and 2013.

"Wastewater disposal is expanding and waste fluids are being injected into new locations. There have been changes in production practices as well, so in some areas there is much more wastewater that needs to be disposed," Rubinstein noted.

Not all fluid injection causes earthquakes that can be detected or felt, Rubinstein added. Only a few dozen of the tens of thousands of wastewater disposal, enhanced oil recovery and hydraulic fracture wells in the U.S.
have been linked to induced earthquakes that can be felt.

The central United States has experienced a surge in seismicity in the past six years, rising from an average of 24 earthquakes of magnitude 3.0 or larger per year between 1973 and 2008 to an average of 193 earthquakes of this size every year between 2009 and 2014, with 688 occurring in 2014 alone.

Researchers are also tracking induced earthquakes in Canada, and the current batch of studies suggests that fracking might be more significant than wastewater disposal for causing earthquakes in that country, according to focus section co-editor David Eaton of the University of Calgary. "There appear to be interregional differences between the U.S. and Canada," he noted, "but it's too early to say yet whether those reflect operational differences, differences in the geological site conditions, or if it simply reflects the focus of studies that have been completed to date."

As research continues in both countries, experts are recommending a more proactive approach to the risks of induced seismicity. A focus section article by Randi Jean Walters and colleagues at Stanford University outlines a possible workflow to reduce pre- and post-injection risks at oil and gas sites. The workflow would incorporate seismic monitoring, a thorough understanding of a region's past and present geology, and detailed information on the industrial methods used in an oil and gas operation. Perhaps most important, they write, an ongoing risk assessment would take into account what sorts of resources--from buildings to natural settings--would be affected by seismic activity, and what kinds of seismic activity the surrounding population is willing to tolerate.

Another focus section paper by James Dieterich and colleagues at the University of California, Riverside explores the mechanics of induced seismicity.
Their study uses an earthquake simulation program called RSQSim to explore how simple faults with various levels of pre-existing stress respond to fluid injection. Their model is able to reproduce many of the observed characteristics of induced seismicity and relate them to physical quantities such as injection duration and injected volumes. If the simulator can model more complex situations in future trials, it may offer guidance on managing the seismic risks at injection sites and estimating the probabilities of inducing earthquakes.

Other articles in the issue investigate characteristics that have been proposed to differ between natural and induced earthquakes, including their ground shaking and faulting styles.
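The core mechanism behind these studies -- injected fluid raising pore pressure and pushing an already-stressed fault toward failure -- can be illustrated with a toy Mohr-Coulomb calculation. This is a simplification for intuition only, not RSQSim, and every numerical value below is an illustrative assumption rather than a figure from the research:

```python
# Toy illustration of injection-induced fault failure (not RSQSim).
# Pore pressure from injected fluid lowers the effective normal stress
# clamping a fault; failure occurs when the pre-existing shear stress
# exceeds friction times that effective normal stress. All values are
# assumed for illustration.

MU = 0.6                  # assumed friction coefficient (dimensionless)
NORMAL_STRESS_MPA = 50.0  # assumed fault-normal stress at depth
SHEAR_STRESS_MPA = 28.0   # assumed pre-existing shear stress on the fault

def fault_fails(pore_pressure_mpa: float) -> bool:
    """Mohr-Coulomb criterion with pore pressure reducing normal stress."""
    effective_normal = NORMAL_STRESS_MPA - pore_pressure_mpa
    return SHEAR_STRESS_MPA >= MU * effective_normal

# Sweep pore pressure upward, as sustained wastewater injection would do.
for p in range(0, 11):
    print(f"pore pressure {p:2d} MPa -> "
          f"{'FAILS' if fault_fails(p) else 'stable'}")
```

With these numbers the fault sits close to failure (28 MPa of shear against a 30 MPa frictional threshold), so only a few megapascals of added pore pressure trigger slip -- capturing why faults with higher pre-existing stress, the variable the RSQSim study explores, respond to injection so readily.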
Earthquakes
2015