Date | Link | Title | Summary | Body | Category | Year
---|---|---|---|---|---|---|
June 4, 2015 | https://www.sciencedaily.com/releases/2015/06/150604084443.htm | Simulation helps to prepare for the consequences of natural disasters | A simulation tool has been developed under the European CRISMA project coordinated by VTT. The tool helps users to prepare for unexpected catastrophes and natural disasters. A pilot case in Finland focused on winter storms and the resulting power cuts and evacuations. | Modelling and simulation were used to assess various probable or fictional crises with immediate impacts on human lives and society, and the impact of various types of response and preparedness actions. The tool was also used to simulate coastal flooding and submersion, earthquakes and the subsequent forest fires, chemical spills in inhabited areas and capacity management in the event of a catastrophe. The progress of coastal submersion in France, an earthquake in Italy, chemical spills in Israel, and catastrophe management in Germany were also simulated. Project website: | Earthquakes | 2,015 |
May 29, 2015 | https://www.sciencedaily.com/releases/2015/05/150529131822.htm | Little-known quake, tsunami hazards lurk offshore of Southern California | While their attention may be inland on the San Andreas Fault, residents of coastal Southern California could be surprised by very large earthquakes -- and even tsunamis -- from several major faults that lie offshore, a new study finds. | The latest research into the little-known, fault-riddled undersea landscape off Southern California and northern Baja California has revealed more worrisome details about a tectonic train wreck in the Earth's crust with the potential for magnitude 7.9 to 8.0 earthquakes. The new study supports the likelihood that these vertical fault zones have displaced the seafloor in the past, which means they could send out tsunami-generating pulses towards the nearby coastal mega-city of Los Angeles and neighboring San Diego. "We're dealing with continental collision," said geologist Mark Legg of Legg Geophysical in Huntington Beach, California, regarding the cause of the offshore danger. "That's fundamental. That's why we have this mess of a complicated logjam." Legg is the lead author of the new analysis, which has been accepted for publication. The logjam Legg referred to is composed of blocks of the Earth's crust caught in the ongoing tectonic battle between the North American tectonic plate and the Pacific plate. The blocks are wedged together all the way from the San Andreas Fault on the east, to the edge of the continental shelf on the west, from 150 to 200 kilometers (90 to 125 miles) offshore. These chunks of crust get squeezed and rotated as the Pacific plate slides northwest, away from California, relative to the North American plate. The mostly underwater part of this region is called the California Continental Borderland, and includes the Channel Islands. By combining older seafloor data and digital seismic data from earthquakes along with 4,500 kilometers (2,796 miles) of new seafloor depth measurements, or bathymetry, collected in 2010, Legg and his colleagues were able to take a closer look at the structure of two of the larger seafloor faults in the Borderland: the Santa Cruz-Catalina Ridge Fault and the Ferrelo Fault. What they were searching for are signs, like those seen along the San Andreas, that indicate how much the faults have slipped over time and whether some of that slippage caused some of the seafloor to thrust upwards. What they found along the Santa Cruz-Catalina Ridge Fault are ridges, valleys and other clear signs that the fragmented, blocky crust has been lifted upward, while also slipping sideways like the plates along the San Andreas Fault do. Further out to sea, the Ferrelo Fault zone showed thrust faulting -- which is an upwards movement of one side of the fault. The vertical movement means that blocks of crust are being compressed as well as sliding horizontally relative to each other, which Legg describes as "transpression." Compression comes from the blocks of the Borderland being dragged northwest, but then slamming into the roots of the Transverse Ranges -- which are east-west running mountains north and west of Los Angeles.
In fact, the logjam has helped build the Transverse Ranges, Legg explained. "The Transverse Ranges rose quickly, like a mini Himalaya," Legg said. The real Himalaya arose from a tectonic-plate collision in which the crumpled crust on both sides piled up into fast-growing, steep mountains rather than getting pushed down into Earth's mantle as happens at some plate boundaries. As Southern California's pile-up continues, the plate movements that build up seismic stress on the San Andreas are also putting stress on the long Santa Cruz-Catalina Ridge and Ferrelo Faults. And there is no reason to believe that those faults and others in the Borderlands can't rupture in the same manner as the San Andreas, said Legg. "Such large faults could even have the potential of a magnitude 8 quake," said geologist Christopher Sorlien of the University of California at Santa Barbara, who is not a co-author on the new paper. "This continental shelf off California is not like other continental shelves -- like in the Eastern U.S.," said Sorlien. Whereas most continental shelves are about twice as wide and inactive, like that off the U.S. Atlantic coast, the California continental shelf is very narrow and is dominated by active faults and tectonics. In fact, it's unlike most continental shelves in the world, he said. It's also one of the least well mapped and understood. "It's essentially terra incognita." "This is one of the only parts of the continental shelf of the 48 contiguous states that didn't have complete ... high-resolution bathymetry years ago," Sorlien said. And that's why getting a better handle on the hazards posed by the Borderland's undersea faults has been long in coming and slow to catch on, even among earth scientists, he said. NOAA was working on complete high-resolution bathymetry of the U.S. Exclusive Economic Zone -- the waters within 200 miles of shore -- until the budget was cut, said Legg. That left out Southern California and left researchers like himself using whatever bits and pieces of smaller surveys to assemble a picture of what's going on in the Borderland, he explained. "We've got high resolution maps of the surface of Mars," Legg said, "yet we still don't have decent bathymetry for our own backyard." | Earthquakes | 2,015 |
May 29, 2015 | https://www.sciencedaily.com/releases/2015/05/150529101108.htm | New technique harnesses everyday seismic waves to image Earth | A new technique developed at Stanford University harnesses the buzz of everyday human activity to map the interior of the Earth. "We think we can use it to image the subsurface of the entire continental United States," said Stanford geophysics postdoctoral researcher Nori Nakata. | Using tiny ground tremors generated by the rumble of cars and trucks across highways, the activities within offices and homes, pedestrians crossing the street and even airplanes flying overhead, a team led by Nakata created detailed three-dimensional subsurface maps of the California port city of Long Beach. The maps are detailed in a recent journal issue. The subsurface maps were created by applying a new signal processing technique to a particular type of seismic waves -- energy waves that travel across the Earth's surface and through its interior. Seismic waves can be generated naturally, during earthquakes and volcanic eruptions, for example, or by artificial means such as explosions. There are two major types of seismic waves: surface waves and body waves. As their name suggests, surface waves travel along the surface of the Earth. Scientists have long been able to harness surface waves to study the upper layers of the planet's crust, and recently they have even been able to extract surface waves from the so-called ambient seismic field. Also known as ambient noise, these are very weak but continuous seismic waves that are generated by colliding ocean waves, among other things. Body waves, in contrast, travel through the Earth, and as a result can provide much better spatial resolution of the planet's interior than surface waves. "Scientists have been performing body-wave tomography with signals from earthquakes and explosives for decades," said study coauthor Jesse Lawrence, an assistant professor of geophysics at Stanford. "But you can't control when and where an earthquake happens, and explosives are expensive and often damaging." For this reason, geophysicists have long sought to develop a way to perform body-wave tomography without relying on earthquakes or resorting to explosives. This has proven challenging, however, because body waves have lower amplitudes than surface waves, and are therefore harder to observe. "Usually you need to combine and average lots and lots of data to even see them," Lawrence said. In the new study, the Stanford team applied a new software processing technique, called a body-wave extraction filter. Nakata developed the filter to analyze ambient noise data gathered from a network of thousands of sensors that had been installed across Long Beach to monitor existing oil reservoirs beneath the city. While experimenting with different types of filters for parsing and analyzing ambient noise, Nakata came up with an idea for a filter of his own that focused specifically on body waves. "When I saw the Long Beach data, I realized I had all of the pieces in my hand to isolate body-wave energy from ambient noise," Nakata said.
"I was excited, but at the same time I was skeptical my idea would work."The filter Nakata developed and then refined with help from his Stanford colleagues represents a new way of processing the ambient noise by comparing each observation to every other observation, which boosts the body-wave signal while reducing the noise.Using its filter, the team was able to create maps that revealed details about the subsurface of Long Beach down to a depth of more than half a mile (1.1. kilometers). The body-wave maps were comparable to, and in some cases better than, existing imaging techniques.One map, for example, clearly revealed the Newport-Inglewood fault, an active geological fault that cuts through Long Beach. This fault also shows up in surface-wave maps, but the spatial resolution of the body-wave velocity map was much higher, and revealed new information about the velocity of seismic waves traveling through the fault's surrounding rocks, which in turn provides valuable clues about their composition and organization."This has been something of a holy grail in Earth imaging, and Nori's work is a first-of-its-kind study," said geophysicist Greg Beroza, the Wayne Loel Professor at Stanford, who was not involved in the study. "His groundbreaking achievement is sure to be widely emulated."Lawrence says the real power of the new technique will come when it is combined with surface wave tomography. "Primary waves, which are a type of body wave, are sensitive to compressional forces, whereas surface waves are more sensitive to shear, or sliding, forces," Lawrence said. "To characterize the subsurface properly, one must measure both shear and compressional properties. Using one wave type and not the other is like trying to study a painting by looking at it through a frosted window."Now that ambient-noise body wave tomography has been shown to work, Nakata says he plans to apply his technique to much larger test areas. | Earthquakes | 2,015 |
May 28, 2015 | https://www.sciencedaily.com/releases/2015/05/150528104110.htm | Aftershock assessment: Buildings collapse during earthquake aftershocks | Earthquakes kill, but their aftershocks can cause the rapid collapse of buildings left standing in the aftermath of the initial quake. Research published in the | Negar Nazari and John W. van de Lindt of the Department of Civil and Environmental Engineering at Colorado State University in Fort Collins and Yue Li of Michigan Technological University, in Houghton, USA, point out that it is relatively obvious that buildings that survive a main shock will be at varying degrees of risk of collapse as aftershocks travel through the earthquake zone. Aftershocks are usually several orders of magnitude less intense than the primary earthquake, but can nevertheless have high ground motion intensity, last longer and occur at different vibration frequencies. In addition, changes in the structure of a building and its foundations, whether crippling or not, mean that the different energy content of the ground acceleration during an aftershock further complicates any analysis. This adds up to a very difficult risk assessment for surviving buildings. In order to compute the probability of collapse for a building damaged by a main shock, the team has introduced a method based on two key earthquake variables: magnitude and site-to-source distance. They carried out tests using different site-to-source distances with an incremental dynamic analysis based on simulated ground motions from the main shock and aftershocks, and applied this, as a proof of principle, to a computer-modeled, two-storey, timber-frame building in a hypothetical Californian town relatively close to a geological fault line. Full-scale structural data was available from an actual building. The team found that collapse probability increased if there were a sequence of aftershocks following a main shock just 10 kilometers distant from the building. Stronger aftershocks mean greater risk, which correlates with the magnitude of the shock; as one might also expect, if the site-to-source distance is greater, the risk is lower. Overall, however, the analysis allows the team to quantify this risk based on the two variables, distance and aftershock magnitude. | Earthquakes | 2,015 |
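The article does not reproduce the authors' model, but the standard machinery behind this kind of assessment is a fragility curve fit from incremental dynamic analysis: the probability of collapse given a ground-motion intensity, with that intensity estimated from the aftershock magnitude and site-to-source distance. Below is a minimal sketch in that spirit; the lognormal fragility form is standard, while the attenuation relation and every constant are invented placeholders.

```python
import math

def sa_at_site(magnitude, distance_km):
    """Toy ground-motion attenuation: spectral acceleration (g) at the
    site as a function of aftershock magnitude and site-to-source
    distance. Constants are illustrative, not a published relation."""
    return math.exp(-1.5 + 0.8 * magnitude - 1.1 * math.log(distance_km + 10.0))

def collapse_probability(sa, theta=0.9, beta=0.5):
    """Lognormal fragility curve, the standard form fit from incremental
    dynamic analysis: P(collapse | Sa) = Phi(ln(Sa / theta) / beta).
    theta = median collapse capacity (g), beta = dispersion; both are
    illustrative numbers here."""
    z = math.log(sa / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

for magnitude in (5.0, 6.0, 7.0):
    for distance_km in (10.0, 30.0):
        p = collapse_probability(sa_at_site(magnitude, distance_km))
        print(f"M{magnitude:.1f} at {distance_km:>4.0f} km: P(collapse) = {p:.3f}")
```

Lowering `theta` models a building whose capacity was eroded by the main shock, which is how main-shock damage would feed into the aftershock collapse risk in a scheme like this.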
May 21, 2015 | https://www.sciencedaily.com/releases/2015/05/150521144038.htm | Seismic signals used to track above-ground explosions | Lawrence Livermore researchers have determined that a tunnel bomb explosion by Syrian rebels was smaller than the 60 tons claimed by sources. | Using seismic stations in Turkey, Livermore scientists Michael Pasyanos and Sean Ford created a method to determine source characteristics of near earth surface explosions. They found the above-ground tunnel bomb blast under the Wadi al-Deif Army Base near Aleppo last spring was likely not as large as originally estimated and was closer to 40 tons. Seismology has long been used to determine the source characteristics of underground explosions, such as yield and depth, and plays a prominent role in nuclear explosion monitoring. But now some of the same techniques have been modified to determine the strength and source of near and above-ground blasts. The new method to track above-ground explosions serves as a forensic tool for investigators and governmental agencies seeking to understand the precise cause of an explosion. "The technique accounts for the reduction in amplitudes as the explosion depth approaches the free surface and less energy is coupled into the ground," said Michael Pasyanos, an LLNL geophysicist and lead author of a paper appearing in an upcoming issue. The team, which also included LLNL scientist Sean Ford, used the method on a series of shallow explosions in New Mexico where the yields and depths were known. Pasyanos and Ford's examination of source characteristics of near-surface explosions is an extension of the regional amplitude envelope method. This technique was developed and applied to North Korean nuclear explosions, then applied to chemical explosions and nuclear tests in Nevada. "The technique takes an earthquake or explosion source model and corrects for the wave propagation to generate predicted waveform envelopes at any particular frequency band," Pasyanos said. Methods for determining the yields of contained events range from teleseismic amplitudes and P-wave spectra to regional P-wave amplitudes and magnitudes. Pasyanos developed a method to characterize underground explosions based on regional amplitude envelopes across a broad range of frequencies. One advantage of the method is that examining the signal over a wide frequency band can reduce some of the strong tradeoffs between yield and depth, Pasyanos said. "By allowing the methodology to consider shallow, uncontained events just below, at, or even above Earth's surface, we make the method relevant to new classes of events including mining events, military explosions, industrial accidents, plane crashes or potential terrorist attacks," Pasyanos said. "A yield estimate is often very important to investigators and governmental agencies seeking to understand the precise cause of an explosion." For the Syrian explosion, the team did not have local seismic data from Syria, but it was well recorded by regional stations from the Continental Dynamics: Central Anatolian Tectonics (CD-CAT) deployment in Turkey. If the explosion occurred well above the surface, a yield of 100 tons TNT equivalent would be required to produce the observed seismic signal. "Given the video footage of the explosion, however, we know that it was neither at nor above the free surface, nor fully coupled," Ford said. "We estimate a chemical yield ranging from 6 to 50 tons depending on the depth, with the best estimate between 20-40 tons.
Including independent information on the depth, we could narrow this considerably. If, for instance, we definitively knew that the explosion occurred at 2 meters below the surface, then we would estimate the yield at 40 tons." The team found that though there are expected tradeoffs between yield and depth/height, when constrained by other information, the yields are consistent with the ground-truth yields of the New Mexico tests and with reasonable values for the Syrian blast, given what Pasyanos and Ford know about it. | Earthquakes | 2,015 |
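The regional amplitude-envelope method itself is far more involved, but the yield/depth tradeoff described above can be caricatured in a few lines: the shallower (less coupled) the shot, the larger the yield needed to explain the same seismic amplitude. The coupling function, exponent and calibration below are assumptions for illustration, not LLNL's model.

```python
import math

def coupling(depth_m, scale_m=20.0):
    """Toy coupling factor: the fraction of explosion energy coupled
    into the ground, approaching 1 for well-buried shots and falling
    off as the charge nears the free surface. The functional form and
    length scale are illustrative only."""
    return 1.0 / (1.0 + math.exp(-depth_m / scale_m))

def yield_from_amplitude(observed, reference, ref_yield_tons, depth_m, n=0.75):
    """Invert a toy amplitude scaling A ~ coupling(depth) * W**n for the
    yield W, relative to a calibration shot of known yield. The exponent
    n and the calibration are assumptions."""
    return ref_yield_tons * (observed / (reference * coupling(depth_m))) ** (1.0 / n)

# Same seismic signal, different assumed depths: shallower burial couples
# less energy into the ground, so the inferred yield grows -- the
# yield/depth tradeoff that independent information (such as the video
# footage) helps to break.
for depth in (2.0, 10.0, 50.0):
    w = yield_from_amplitude(observed=1.0, reference=1.0,
                             ref_yield_tons=20.0, depth_m=depth)
    print(f"depth {depth:>5.1f} m -> inferred yield ~ {w:5.1f} tons")
```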
May 18, 2015 | https://www.sciencedaily.com/releases/2015/05/150518135131.htm | Signs of ancient earthquakes may raise risks for New Zealand | Researchers have uncovered the first geologic evidence that New Zealand's southern Hikurangi margin can rupture during large earthquakes. The two earthquakes took place within the last 1000 years, and one was accompanied by a tsunami, according to the study published in the | The earthquakes took place roughly 350 years apart, according to the analysis by Kate Clark of GNS Science and colleagues. This may mean that the time between large earthquakes in this region is shorter than scientists have thought. The current seismic models account for these types of earthquakes every 500 to 1000 years. A worst-case, M8.9 Hikurangi earthquake could cause about 3350 deaths and 7000 injuries, and lead to $13 billion in costs in New Zealand's capital Wellington alone, according to a calculation made in 2013 by a different set of scientists. The Hikurangi margin marks the area where the Pacific and Australian tectonic plates collide to the east of New Zealand. The margin is one of the few places around the Pacific where a major subduction interface earthquake -- which occurs deep in the crust where one plate is thrust under another -- has not occurred in historic times. However, "subduction earthquakes are not a 'new' risk for New Zealand, as we have always assumed they can occur, and they are accounted for in our seismic hazard models," Clark said. "This study is significant in that it confirms that risk." "We have a record of three to five past earthquakes on most of the major upper plate faults in the [New Zealand] lower North Island and upper South Island, but there was previously no evidence of past subduction earthquakes on the southern Hikurangi margin," Clark explained. "Subduction earthquakes have the potential to be significantly larger in magnitude than upper plate fault ruptures, affect a much larger spatial area and are much more likely to trigger tsunami." To look for evidence of past earthquakes on the margin, the researchers performed a painstaking examination of the geologic layers contained within a salt marsh at Big Lagoon in the southeastern Wairau River valley on South Island. They analyzed cores drilled from the marsh to look at differences in the kinds of sediment and the shells from tiny marine animals called foraminifera, deposited throughout a stretch of the lagoon's history. These data revealed that the lagoon sank relatively suddenly twice during the past 1000 years, suggesting that the land was subsiding as a result of significant earthquakes. The more recent earthquake occurred between 520 and 470 years ago. Another earlier earthquake probably took place between 880 and 800 years ago. Judging by the sedimentary debris found at that time, this earthquake was accompanied by a tsunami that swept more than 360 meters inland at the study site. Clark said the findings will help researchers better understand the risks posed by large subduction interface earthquakes in the region.
Studies have shown, for example, that there have been subduction earthquakes on the central Hikurangi margin, "and we wanted to understand if the central and southern Hikurangi margins are likely to rupture in the same earthquake," said Clark. "We can see that the [southern Hikurangi margin] subduction earthquake at about 500 years before present possibly correlates with a central Hikurangi margin earthquake, implying both segments may have ruptured in the same earthquake," she continued, "but the radiocarbon dating is not yet precise enough to be certain -- possibly there were two earthquakes closely spaced in time." Clark said that she and other scientists are looking at other locations in the lower North Island to find evidence of the same paleoearthquakes, which could help provide a better picture of how big these quakes might have been and how they impacted the region. "In addition we would like to go further back in time and find evidence of older subduction earthquakes," Clark said. "With a longer record of past subduction earthquakes we can get a better constraint on the recurrence of such earthquakes, which will help to forecast future subduction earthquakes." | Earthquakes | 2,015 |
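As a back-of-the-envelope illustration of why the inferred ~350-year spacing matters for forecasting, here is the simplest possible recurrence model, a Poisson (memoryless) process. This is a common first-order hazard approximation, not the model actually used in New Zealand's seismic hazard assessments.

```python
import math

def poisson_prob(mean_interval_yr, window_yr):
    """Probability of at least one subduction earthquake in the next
    `window_yr` years under a Poisson recurrence model with the given
    mean interval between events."""
    return 1.0 - math.exp(-window_yr / mean_interval_yr)

# The previously assumed 500-1000 year recurrence versus the ~350-year
# spacing suggested by the two paleoearthquakes: a shorter mean interval
# raises the probability assigned to any fixed future window.
for mean in (1000.0, 500.0, 350.0):
    print(f"mean recurrence {mean:6.0f} yr -> "
          f"P(event in 50 yr) = {poisson_prob(mean, 50.0):.3f}")
```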
May 18, 2015 | https://www.sciencedaily.com/releases/2015/05/150518121652.htm | Common mechanism for shallow and deep earthquakes proposed | Earthquakes are labeled "shallow" if they occur at less than 50 kilometers depth. They are labeled "deep" if they occur at 300-700 kilometers depth. When slippage occurs during these earthquakes, the faults weaken. How this fault weakening takes place is central to understanding earthquake sliding. | A new study published online proposes a unifying mechanism. "Although shallow earthquakes -- the kind that threaten California -- must initiate differently from the very deep ones, our new work shows that, once started, they both slide by the same physics," said deep-earthquake expert Harry W. Green II, a distinguished professor of the Graduate Division in UC Riverside's Department of Earth Sciences, who led the research project. "Our research paper presents a new, unifying model of how earthquakes work. Our results provide a more accurate understanding of what happens during earthquake sliding that can lead to better computer models and could lead to better predictions of seismic shaking danger." The physics of the sliding is the self-lubrication of the earthquake fault by flow of a new material consisting of tiny new crystals, the study reports. Both shallow earthquakes and deep ones involve phase transformations of rocks that produce tiny crystals of new phases on which sliding occurs. "Other researchers have suggested that fluids are present in the fault zones or generated there," Green said. "Our study shows fluids are not necessary for fault weakening. As earthquakes get started, local extreme heating takes place in the fault zone. The result of that heating in shallow earthquakes is to initiate reactions like the ones that take place in deep earthquakes so they both end up lubricated in the same way." Green explained that at 300-700 kilometers depth, the pressure and temperature are so high that rocks in this deep interior of the planet cannot break by the brittle processes seen on Earth's surface. In the case of shallow earthquakes, stresses on the fault increase slowly in response to slow movement of tectonic plates, with sliding beginning when these stresses exceed static friction. While deep earthquakes also get started in response to increasing stresses, the rocks there flow rather than break, except under special conditions. "Those special conditions of temperature and pressure induce minerals in the rock to break down to other minerals, and in the process of this phase transformation a fault can form and suddenly move, radiating the shaking -- just like at shallow depths," Green said. The research explains why large faults like the San Andreas Fault in California do not have a heat-flow anomaly around them. Were shallow earthquakes to slide by the grinding and crunching of rock, as geologists once imagined, the process would generate enough heat so that major faults like the San Andreas would be a little warmer along their length than they would be otherwise. "But such a predicted warm region along such faults has never been found," Green said. "The logical conclusion is that the fault must move more easily than we thought. Extreme heating in a very thin zone along the fault produces the very weak lubricant.
The volume of material that is heated is very small and survives for a very short time -- seconds, perhaps -- followed by very little heat generation during sliding because the lubricant is very weak." The new research also explains why faults with glass on them (reflecting the fact that during the earthquake the fault zone melted) are rare. As shallow earthquakes start, the temperature rises locally until it is hot enough to start a chemical reaction -- usually the breakdown of clays or carbonates or other hydrous phases in the fault zone. The reactions that break down the clays or carbonates stop the temperature from climbing higher, with heat being used up in the reactions that produce the nanocrystalline lubricant. If the fault zone does not have hydrous phases or carbonates, the sudden heating that begins when sliding starts raises the local temperature on the fault all the way to the melting temperature of the rock. In such cases, the melt behaves like a lubricant and the sliding surface ends up covered with melt (that would quench to a glass) instead of the nanocrystalline lubricant. "The reason this does not happen often, that is, the reason we do not see lots of faults with glass on them, is that the Earth's crust is made up to a large degree of hydrous and carbonate phases, and even the rocks that don't have such phases usually have feldspars that get crushed up in the fault zone," Green explained. "The feldspars will 'rot' to clays during the hundred years or so between earthquakes as water moves along the fault zone. In that case, when the next earthquake comes, the fault zone is ready with clays and other phases that can break down, and the process repeats itself." The research involved the study of laboratory earthquakes -- high-pressure earthquakes as well as high-speed ones -- using electron microscopy in friction and faulting experiments. It was Green's laboratory that first conducted a serendipitous series of experiments, in 1989, on the right kind of mantle rocks that give geologists insight into how deep earthquakes work. In the new work, Green and his team also investigated the Punchbowl Fault, an ancestral branch of the San Andreas Fault that has been exhumed by erosion from several kilometers depth, and found nanometric materials within the fault -- as predicted by their model. | Earthquakes | 2,015 |
May 14, 2015 | https://www.sciencedaily.com/releases/2015/05/150514152825.htm | Earthquakes reveal deep secrets beneath East Asia | A new work based on 3-D supercomputer simulations of earthquake data has found hidden rock structures deep under East Asia. Researchers from China, Canada, and the U.S. worked together to publish their results in March 2015 in an American Geophysical Union journal. | The scientists used seismic data from 227 East Asia earthquakes during 2007-2011, which they used to image depths to about 900 kilometers, or about 560 miles below ground. Notable structures include a high velocity colossus beneath the Tibetan plateau, and a deep mantle upwelling beneath the Hangai Dome in Mongolia. The researchers say their line of work could potentially help find hidden hydrocarbon resources, and more broadly it could help explore the Earth under East Asia and the rest of the world. "With the help of supercomputing, it becomes possible to render crystal-clear images of Earth's complex interior," principal investigator and lead author Min Chen said of the study. Chen is a postdoctoral research associate in the department of Earth Sciences at Rice University. Chen and her colleagues ran simulations on the Stampede and Lonestar4 supercomputers of the Texas Advanced Computing Center through an allocation by XSEDE, the eXtreme Science and Engineering Discovery Environment funded by the National Science Foundation. "We are combining different kinds of seismic waves to render a more coherent image of the Earth," Chen said. "This process has been helped by supercomputing power that is provided by XSEDE." "What is really new here is that this is an application of what is sometimes referred to as full waveform inversion in exploration geophysics," study co-author Jeroen Tromp said. Tromp is a professor of Geosciences and Applied and Computational Mathematics, and the Blair Professor of Geology at Princeton University. In essence the application combined seismic records from thousands of stations for each earthquake to produce scientifically accurate, high-res 3-D tomographic images of the subsurface beneath immense geological formations. XSEDE provided more than just time on supercomputers for the science team. Through the Campus Champions program, researchers worked directly with Rice XSEDE champion Qiyou Jiang of Rice's Center for Research Computing and with former Rice staffer Roger Moye, who used Rice's DAVinCI supercomputer to help Chen with different issues she had with high performance computing. "They are the contacts I had with XSEDE," Chen said. "These collaborations are really important," said Tromp of XSEDE. "They cannot be done without the help and advice of the computational science experts at these supercomputing centers. Without access to these computational resources, we would not be able to do this kind of work." Like a thrown pebble generates ripples in a pond, earthquakes make waves that can travel thousands of miles through the Earth. A seismic wave slows down or speeds up a small percentage as it travels through changes in rock composition and temperature. The scientists mapped these wave speed changes to model the physical properties of rock hidden below ground. Tromp explained that the goal for his team was to match the observed ground-shaking information at seismographic stations to fully numerical simulations run on supercomputers. "In the computer, we set off these earthquakes," says Tromp. "The waves ripple across southeast Asia.
We simulate what the ground motion should look like at these stations. Then we compare that to the actual observations. The differences between our simulations and the observations are used to improve our models of the Earth's interior," Tromp said. "What's astonishing is how well those images correlate with what we know about the tectonics, in this case, of East Asia from surface observations." The Tibetan Plateau, known as 'the roof of the world,' rises about three miles, or five kilometers above sea level. The details of how it formed remain hidden to scientists today. The leading theory holds that the plateau formed and is maintained by the northward motion of the India plate, which forces the plateau to shorten horizontally and move upward simultaneously. Scientists can't yet totally account for the speed of the movement of ground below the surface at the Tibetan Plateau or what happened to the Tethys Ocean that once separated the India and Eurasia plates. But a piece of the puzzle might have been found. "We found that beneath the Tibetan plateau, the world's largest and highest plateau, there is a sub-vertical high velocity structure that extends down to the bottom of the mantle transition zone," Chen said. The bottom of the transition zone goes to depths of 660 kilometers, she said. "Three-dimensional geometry of the high velocity structure depicts the lithosphere beneath the plateau, which gives clues of the fate of the subducted oceanic and the continental parts of the Indian plate under the Eurasian plate," Chen said. The collision of plates at the Tibetan Plateau has caused devastating earthquakes, such as the recent 2015 Nepal earthquake at the southern edge of where the two plates meet. Scientists hope to use earthquakes to model the substructure and better understand the origins of these earthquakes. To reach any kind of understanding, the scientists first grappled with some big data, 1.7 million frequency-dependent traveltime measurements from seismic waveforms. "We applied this very sophisticated imaging technique called adjoint tomography with a key component that is a numerical code package called SPECFEM3D_GLOBE," Chen said. Specifically, they used SPECFEM3D_GLOBE, open source software maintained by the UC Davis Computational Infrastructure for Geodynamics. "It uses parallel computing to simulate the very complex seismic waves through the Earth," Chen said. Even with the tools in place, the study was still costly. "The cost is in the simulations of the wave propagation," says Tromp. "That takes hundreds of cores for tens of minutes at a time per earthquake. As you can imagine, that's a very expensive proposition just for one iteration simulating all these 227 earthquakes." In all, the study used about eight million CPU hours on the Stampede and Lonestar4 supercomputers. "The big computing power of supercomputers really helped a lot in terms of shortening the simulation time and in getting an image of the Earth within a reasonable timeframe," said Chen. "It's still very challenging. It took us two years to develop this current model beneath East Asia. Hopefully, in the future it's going to be even faster." Three-D imaging inside the Earth can help society find new resources, said Tromp. The iterative inversion methods they used to model structures deep below are the same ones used in exploration seismology to look for hidden hydrocarbons. "There's a wonderful synergy at the moment," Tromp said.
"The kinds of things we're doing here with earthquakes to try and image the Earth's crust and upper mantle and what people are doing in exploration geophysics to try and image hydrocarbon reservoirs.""In my point of view, it's the era of big seismic data," Chen said. She said their ultimate goal is to make everything about seismic imaging methods automatic and accessible by anyone to better understand the Earth.It sounded something like a Google Earth for inside the Earth itself. "Right, exactly. Assisted by the supercomputing systems of XSEDE, you can have a tour inside the Earth and possibly make some new discoveries." Chen said. | Earthquakes | 2,015 |
May 11, 2015 | https://www.sciencedaily.com/releases/2015/05/150511095610.htm | The origins and future of Lake Eyre and the Murray-Darling Basin | Geoscientists have, for the first time, discovered the origins of Australia's two largest basins: Lake Eyre and the Murray-Darling Basin. The research also implies that in 30 million years' time both basins will cease to exist. | Monash University geoscientist Associate Professor Wouter Schellart, and his colleague Professor Wim Spakman from Utrecht University, have discovered how the floor of an entire ocean basin that was destroyed 70 to 50 million years ago off the North coast of New Guinea is currently located at 800-1200km depth below Central and South-eastern Australia. Using supercomputers, the researchers found that this dense piece of ocean floor material (called a lithospheric slab) is slowly sinking into the Earth's mantle and is responsible for the formation of the Lake Eyre Basin, one of the Earth's largest internally drained basins and home to the lowest point in Australia at 15m below sea level, as well as the Murray-Darling Basin, home to the largest river system in Australia. With a combined surface area exceeding 2 million square kilometres, both basins are located directly above the deep mantle slab. The research also predicts that in 30 million years from now, when Australia has moved about 1500km northwards, the fossil slab will be located below the Southern Ocean and, as a consequence, the Lake Eyre Basin and Murray Darling Basin will cease to exist. Using geological and geophysical data from the New Guinea region, Schellart was able to reconstruct the geological evolution of the region over the last 70 million years, including the motion of the tectonic plates and plate boundaries. He discovered that the occurrence of deep ocean floor rocks, volcanic rocks and deformed rocks, which are currently found in the mountain ranges of New Guinea, points to the existence of a 4000km wide subduction zone. At subduction zones such as these, an oceanic tectonic plate sinks (subducts) into the Earth's interior, the mantle. With these plate tectonic reconstructions Schellart was able to predict where the fossil subduction zone was during its lifetime some 50-70 million years ago, and therefore where the lithospheric slab disappeared into the mantle. With a global seismic tomography model that makes use of seismic waves to map the internal structure of the Earth's mantle, Schellart and Spakman were able to identify the fossil slab structure below central and south-eastern Australia at a location and depth predicted by the reconstructions. "When we first compared the predictions of our reconstructions with the mantle tomography model we were amazed by how perfectly they aligned," Associate Professor Schellart said. The researchers then developed a computer model to simulate flow in the Earth's mantle to be able to predict the sinking velocity of the fossil slab and to investigate how this sinking might affect the Earth's surface.
They found that although the slab is sinking at a rate of less than 1cm per year, this slow sinking generates a downward flow in the mantle that is sufficient to pull down the Earth's surface and create these huge basins. "It is incredible to think that tectonic processes that took place millions of years ago at the northern margin of New Guinea are responsible for the landscape we see today in Central and South-eastern Australia," Associate Professor Schellart said. The research was undertaken with the assistance of resources from the National Computational Infrastructure (NCI), which is supported by the Australian Government. The research was published in the international journal | Earthquakes | 2,015 |
May 6, 2015 | https://www.sciencedaily.com/releases/2015/05/150506125115.htm | Explosive volcanoes fueled by water | University of Oregon geologists have tapped water in surface rocks to show how magma forms deep underground and produces explosive volcanoes in the Cascade Range. | "Water is a key player," says Paul J. Wallace, a professor in the UO's Department of Geological Sciences and coauthor of a paper published in May. A five-member team, led by UO doctoral student Kristina J. Walowski, methodically examined water and other elements contained in olivine-rich basalt samples that were gathered from cinder cone volcanoes that surround Lassen Peak in Northern California, at the southern edge of the Cascade chain. The discovery helps solve a puzzle about plate tectonics and Earth's deep water cycle beneath the Pacific Ring of Fire, which scientists began studying in the 1960s to understand the region's propensity for big earthquakes and explosive volcanoes. The ring stretches from New Zealand, along the eastern edge of Asia, north across the Aleutian Islands of Alaska and south along the coast of North and South America. It contains more than 75 percent of the planet's volcanoes. To understand how water affects subduction of the oceanic plate, in which layers of different rock types sink into the mantle, the UO team studied hydrogen isotopes in water contained in tiny blobs of glass trapped in olivine crystals in basalt. To do so, the team used equipment in Wallace's lab, CAMCOR, the Carnegie Institution in Washington, D.C., and a lab at Oregon State University. CAMCOR is UO's Advanced Materials Characterization in Oregon, a high-tech extension service located in the underground Lorry I. Lokey Laboratories. Next, the team fed data gained from the rocks into a complex computer model developed by co-author Ikuko Wada, then of Japan's Tohoku University. She has since joined the University of Minnesota. That combination opened a window on how rising temperatures during subduction drive water out of different parts of the subducted oceanic crust, Walowski said. Water migrates upwards and causes the top of the subducted oceanic crust to melt, producing magma beneath the Cascade volcanoes. The key part of the study, Wallace said, involved hydrogen isotopes. "Most of the hydrogen in water contains a single proton," he said. "But there's also a heavy isotope, deuterium, which has a neutron in addition to the proton. It is important to measure the ratio of the two isotopes. We use this ratio as a thermometer, or probe, to study what's happening deep inside the earth." "Melting of the subducting oceanic crust and the mantle rock above it would not be possible without the addition of water," Walowski said. "Once the melts reach the surface, the water can directly affect the explosiveness of magma. However, evidence for this information is lost to the atmosphere during violent eruptions." | Earthquakes | 2,015 |
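For reference, the deuterium-to-hydrogen ratio Wallace describes is conventionally reported in delta notation relative to the VSMOW (Vienna Standard Mean Ocean Water) standard; this is standard geochemical practice rather than anything specific to this paper:

```latex
\delta \mathrm{D} = \left( \frac{(\mathrm{D}/\mathrm{H})_{\mathrm{sample}}}{(\mathrm{D}/\mathrm{H})_{\mathrm{VSMOW}}} - 1 \right) \times 1000 \quad \text{(per mil)}
```

Because dehydration reactions fractionate deuterium from ordinary hydrogen in a temperature-dependent way, the measured δD of the glass inclusions tracks the thermal history of the subducting slab, which is the "thermometer" Wallace refers to.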
May 6, 2015 | https://www.sciencedaily.com/releases/2015/05/150506111200.htm | Earthquakes: Supercycles in subduction zones | On 11 March 2011, a massive release of stress between two overlapping tectonic plates occurred beneath the ocean floor off the coast of Japan, triggering a giant tsunami. The Tohoku quake resulted in the death of more than 15,000 people, the partial or total destruction of nearly 400,000 buildings, and major damage to the Fukushima nuclear power plant. This "superquake" may have been the largest in a series of earthquakes, thus marking the end of what's known as a supercycle: a sequence of several large earthquakes. | A research team at ETH Zurich headed by Taras Gerya, professor of geophysics, and Ylona van Dinther is studying supercycles such as this in subduction zones. Geologists use the term "subduction zone" to refer to the boundary between two tectonic plates along a megathrust fault, where one plate underthrusts the other and moves into the earth's mantle. These zones are found all over the world: off the South American coast, in the US's Pacific Northwest, off Sumatra -- and of course in Japan. However, earthquakes don't occur at just any point along a megathrust fault, but only in the fault's seismogenic zones. Why? In these zones, friction prevents relative movement of the plates over long periods of time. "This causes stresses to build up; an earthquake releases them all of a sudden," explains ETH doctoral student Robert Herrendörfer. After the quake has released these stresses, the continued movement of the plates builds up new stresses, which are then released by new earthquakes -- and an earthquake cycle is born. In a supercycle, the initial quakes rupture only parts of a subduction zone segment, whereas the final "superquake" affects the entire segment. Several different theories have been advanced to explain this "gradual rupture" phenomenon, but they all assume that individual segments along the megathrust fault are governed by different frictional properties. "This heterogeneity results in a kind of 'patchwork rug'," says Herrendörfer. "To begin with, earthquakes rupture individual smaller patches, but later a 'superquake' ruptures several patches all at once." In a new article recently published, the ETH researchers propose an alternative explanation. To understand this, you first have to picture the physical forces at work in a subduction zone. As one plate dives beneath the other at a particular angle, the plates along the megathrust fault become partially coupled together, so the lower plate pulls the upper one down with it. The ETH researchers ran computer simulations of this process, with the overriding plate represented by a wedge and the lower by a rigid slab. Since the plates are connected to each other only within the seismogenic zone, the wedge is deformed and physical stresses build up. In the adjacent earthquake-free zones, the plates can move relative to each other. These stresses build up most rapidly at the edges of the seismogenic zone. If the stress there becomes greater than the plate's frictional resistance, the wedge decouples from the lower plate and begins to move relative to the subducting plate. As the relative speed increases, the frictional resistance decreases -- allowing the wedge to move even faster. The result is a rapid succession of interactions: an earthquake. The earthquake spreads out, stopping only when it reaches a point where the frictional resistance is once again greater than the stress.
That is where the slip event ends and both plates couple together again. As part of his dissertation work, Herrendörfer has investigated how the width of the seismogenic zone affects this process. The models show that at the start of a supercycle, the difference between the stress and the frictional resistance is very large -- and the wider the seismogenic zone, the larger the difference. "This means that the first earthquakes in this area will only partially rupture the seismogenic zone," says Herrendörfer. In narrower zones, it takes just one earthquake to rupture the entire zone. In wider zones that are about 120 km or more across, the stress is released in a series of several quakes and ultimately in a superquake. Empirical data supports this explanation. "To date, supercycles have been observed only in subduction zones with a larger-than-average seismogenic zone about 110 km across," says Herrendörfer. Based on their findings, the ETH researchers have defined further regions in addition to those already known as places that could be affected by supercycles -- namely, the subduction zones off Kamchatka, the Antilles, Alaska and Java. However, Herrendörfer cautions against jumping to conclusions. "Our theoretical models represent nature only to a limited extent, and aren't suitable for predicting earthquakes," he emphasises. "Our efforts were aimed at improving our understanding of the physical processes at work in an earthquake cycle. In future, this knowledge could be used for generating long-term estimates of the risk of earthquakes." The method can also be applied to continental collision zones, such as the Himalayan mountain range, where Nepal was recently struck by a devastating quake. Subduction zones are convergent boundaries of tectonic plates, areas where plates move towards and against each other. These convergent boundaries also include continental collision zones such as the Alps and the Himalayas, where the Indian plate is colliding with the Asian plate. Other plate boundaries are divergent, where the plates are moving away from each other, such as in Iceland. On transform plate boundaries, plates slide past each other horizontally on a vertical fault. Examples include the San Andreas Fault in California and Turkey's North Anatolian Fault. | Earthquakes | 2,015 |
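The ETH simulations cannot be reproduced in a snippet, but the basic cycle they build on -- slow tectonic loading, rupture once stress exceeds static friction, weaker (dynamic) friction while sliding -- is captured by this zero-dimensional stick-slip cartoon with made-up numbers. The supercycle behaviour itself (partial ruptures on wide seismogenic zones) needs a spatially extended fault, which this toy deliberately omits.

```python
# Zero-dimensional stick-slip cartoon of the earthquake cycle: steady
# tectonic loading raises stress on the fault; when stress exceeds the
# static frictional resistance the fault slips, resistance drops to a
# weaker dynamic value (the self-weakening during sliding), and the
# stress falls back to that level. All numbers are illustrative.
loading_rate = 1.0        # stress units per year
static_strength = 100.0   # resistance while locked
dynamic_strength = 30.0   # resistance once sliding has started

stress, events = 0.0, []
for year in range(1000):
    stress += loading_rate
    if stress >= static_strength:
        events.append((year, stress - dynamic_strength))  # (time, stress drop)
        stress = dynamic_strength

for year, drop in events[:3]:
    print(f"year {year:4d}: earthquake, stress drop {drop:.0f}")
print(f"recurrence interval: {events[2][0] - events[1][0]} years")
```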
May 4, 2015 | https://www.sciencedaily.com/releases/2015/05/150504120815.htm | Mystery of India’s rapid move toward Eurasia 80 million years ago explained | In the history of continental drift, India has been a mysterious record-holder. | More than 140 million years ago, India was part of an immense supercontinent called Gondwana, which covered much of the Southern Hemisphere. Around 120 million years ago, what is now India broke off and started slowly migrating north, at about 5 centimeters per year. Then, about 80 million years ago, the continent suddenly sped up, racing north at about 15 centimeters per year -- about twice as fast as the fastest modern tectonic drift. The continent collided with Eurasia about 50 million years ago, giving rise to the Himalayas. For years, scientists have struggled to explain how India could have drifted northward so quickly. Now geologists at MIT have offered up an answer: India was pulled northward by the combination of two subduction zones -- regions in the Earth's mantle where the edge of one tectonic plate sinks under another plate. As one plate sinks, it pulls along any connected landmasses. The geologists reasoned that two such sinking plates would provide twice the pulling power, doubling India's drift velocity. The team found relics of what may have been two subduction zones by sampling and dating rocks from the Himalayan region. They then developed a model for a double subduction system, and determined that India's ancient drift velocity could have depended on two factors within the system: the width of the subducting plates, and the distance between them. If the plates are relatively narrow and far apart, they would likely cause India to drift at a faster rate. The group incorporated the measurements they obtained from the Himalayas into their new model, and found that a double subduction system may indeed have driven India to drift at high speed toward Eurasia some 80 million years ago. "In earth science, it's hard to be completely sure of anything," says Leigh Royden, a professor of geology and geophysics in MIT's Department of Earth, Atmospheric and Planetary Sciences. "But there are so many pieces of evidence that all fit together here that we're pretty convinced." Royden and colleagues including Oliver Jagoutz, an associate professor of earth, atmospheric, and planetary sciences at MIT, and others at the University of Southern California have published their results this week. Based on the geologic record, India's migration appears to have started about 120 million years ago, when Gondwana began to break apart. India was sent adrift across what was then the Tethys Ocean -- an immense body of water that separated Gondwana from Eurasia. India drifted along at an unremarkable 40 millimeters per year until about 80 million years ago, when it suddenly sped up to 150 millimeters per year. India kept up this velocity for another 30 million years before hitting the brakes -- just when the continent collided with Eurasia. "When you look at simulations of Gondwana breaking up, the plates kind of start to move, and then India comes slowly off of Antarctica, and suddenly it just zooms across -- it's very dramatic," Royden says. In 2011, scientists believed they had identified the driving force behind India's fast drift: a plume of magma that welled up from the Earth's mantle.
According to their hypothesis, the plume created a volcanic jet of material underneath India, which the subcontinent could effectively "surf" at high speed. However, when others modeled this scenario, they found that any volcanic activity would have lasted, at most, for 5 million years -- not nearly enough time to account for India's 30 million years of high-velocity drift. Instead, Royden and Jagoutz believe that India's fast drift may be explained by the subduction of two plates: the tectonic plate carrying India and a second plate in the middle of the Tethys Ocean. In 2013, the team, along with 30 students, trekked through the Himalayas, where they collected rocks and took paleomagnetic measurements to determine where the rocks originally formed. From the data, the researchers determined that about 80 million years ago, an arc of volcanoes formed near the equator, which was then in the middle of the Tethys Ocean. A volcanic arc is typically a sign of a subduction zone, and the group identified a second volcanic arc south of the first, near where India first began to break away from Gondwana. The data suggested that there may have been two subducting plates: a northern oceanic plate, and a southern tectonic plate that carried India. Back at MIT, Royden and Jagoutz developed a model of double subduction involving a northern and a southern plate. They calculated how the plates would move as each subducted, or sank into the Earth's mantle. As plates sink, they squeeze material out between their edges. The more material that can be squeezed out, the faster a plate can migrate. The team calculated that plates that are relatively narrow and far apart can squeeze more material out, resulting in faster drift. "Imagine it's easier to squeeze honey through a wide tube, versus a very narrow tube," Royden says. "It's exactly the same phenomenon." Royden and Jagoutz's measurements from the Himalayas showed that the northern oceanic plate remained extremely wide, spanning nearly one-third of the Earth's circumference. However, the southern plate carrying India underwent a radical change: About 80 million years ago, a collision with Africa cut that plate down to 3,000 kilometers -- right around the time India started to speed up. The team believes the diminished plate allowed more material to escape between the two plates. Based on the dimensions of the plates, the researchers calculated that India would have sped up from 50 to 150 millimeters per year. While others have calculated similar rates for India's drift, this is the first evidence that double subduction acted as the continent's driving force. "It's a lucky coincidence of events," says Jagoutz, who sees the results as a starting point for a new set of questions. "There were a lot of changes going on in that time period, including climate, that may be explained by this phenomenon. So we have a few ideas we want to look at in the future." | Earthquakes | 2,015 |
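Royden's honey analogy maps onto classical viscous channel flow. As a pure scaling cartoon -- emphatically not the MIT model -- treat the mantle escaping from between the two slabs as Poiseuille flow along an escape path whose length scales with plate width: halving the width halves the path and doubles the flux at the same driving pressure. The gap, pressure drop and assumed pre-collision width are invented; the viscosity is a common order-of-magnitude mantle value.

```python
def escape_flux_per_width(gap_m, dp_pa, path_m, mu_pa_s=1e21):
    """Poiseuille flow between parallel walls, flux per unit width:
    q = h^3 * dP / (12 * mu * L). Here it stands in for mantle material
    escaping from between two subducting slabs along a path of length L."""
    return gap_m ** 3 * dp_pa / (12.0 * mu_pa_s * path_m)

wide = escape_flux_per_width(1e5, 1e7, 6.0e6)    # ~6000 km escape path (assumed)
narrow = escape_flux_per_width(1e5, 1e7, 3.0e6)  # plate cut down to ~3000 km
print(f"flux ratio after the plate was trimmed: {narrow / wide:.1f}x")
```

Faster escape of the intervening material lets the slabs sink faster, and with them the plate that was dragging India northward.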
May 1, 2015 | https://www.sciencedaily.com/releases/2015/05/150501221550.htm | Seafloor sensors record possible eruption of underwater volcano | If a volcano erupts at the bottom of the sea, does anybody see it? If that volcano is Axial Seamount, about 300 miles offshore and 1 mile deep, the answer is now: yes. | Thanks to a set of high-tech instruments installed last summer by the University of Washington to bring the deep sea online, what appears to be an eruption of Axial Volcano on April 23 was observed in real time by scientists on shore. "It was an astonishing experience to see the changes taking place 300 miles away with no one anywhere nearby, and the data flowed back to land at the speed of light through the fiber-optic cable connected to Pacific City -- and from there, to here on campus by the Internet, in milliseconds," said John Delaney, a UW professor of oceanography who led the installation of the instruments as part of a larger effort sponsored by the National Science Foundation. [Image caption: This custom-built precise pressure sensor detects the seafloor's rise and fall as magma, or molten rock, moves in and out of the underlying magma chamber. Three are installed on the caldera of the underwater volcano. Credit: NSF-OOI/UW/CSSF] Delaney organized a workshop on campus in mid-April at which marine scientists discussed how this high-tech observatory would support their science. Then, just before midnight on April 23 until about noon the next day, the seismic activity went off the charts. The gradually increasing rumblings of the mountain were documented over recent weeks by William Wilcock, a UW marine geophysicist who studies such systems. During last week's event, the earthquakes increased from hundreds per day to thousands per day, and the center of the volcanic crater dropped by about 6 feet (2 meters) over the course of 12 hours. "The only way that could have happened was to have the magma move from beneath the caldera to some other location," Delaney said, "which the earthquakes indicate is right along the edge of the caldera on the east side." The seismic activity was recorded by eight seismometers that measure shaking up to 200 times per second around the caldera and at the base of the 3,000-foot seamount. The height of the caldera was tracked by the bottom pressure tilt instrument, which measures the pressure of the water overhead and then removes the effect of tides and waves to calculate its position (a minimal sketch of this pressure-to-depth reduction follows this article). The depth instrument was developed by Bill Chadwick, an oceanographer at Oregon State University and the National Oceanic and Atmospheric Administration who has also been tracking the activity at Axial Volcano and predicted that the volcano would erupt in 2015. The most recent eruptions were in 1998 and 2011. The volcano is located about 300 miles west of Astoria, Oregon, on the Juan de Fuca Ridge, part of the globe-girdling mid-ocean ridge system -- a continuous, 70,000 km (43,500 miles) long submarine volcanic mountain range stretching around the world like the strings on a baseball, and where about 70 percent of the planet's volcanic activity occurs.
The highly energetic Axial Seamount, Delaney said, is viewed by many scientists as being representative of the myriad processes operating continuously along the powerful subsea volcanic chain that is present in every ocean. "This exciting sequence of events documented by the OOI-Cabled Array at Axial Seamount gives us an entirely new view of how our planet works," said Richard Murray, division director for ocean sciences at the National Science Foundation. "Although the OOI-Cabled Array is not yet fully operational, even with these preliminary observations we can see how the power of innovative instrumentation has the potential to teach us new things about volcanism, earthquakes and other vitally important scientific phenomena." The full set of instruments in the deep-sea observatory is scheduled to come online this year. A first maintenance cruise leaves from the UW in early July, and will let researchers and students further explore the aftermath of the volcanic activity. | Earthquakes | 2,015
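The article notes that the bottom pressure instrument infers the caldera's height from the weight of the water overhead. As a rough back-of-envelope illustration of that hydrostatic conversion (not the instrument's actual processing chain; the density and gravity values are nominal assumptions):

```python
# Hydrostatic balance: delta_p = rho * g * delta_h, so a tide/wave-corrected
# pressure change maps directly to a seafloor depth change.
RHO_SEAWATER = 1027.0   # kg/m^3, typical deep-ocean value (assumed)
G = 9.81                # m/s^2

def depth_change_from_pressure(delta_p_pa: float) -> float:
    """Convert a corrected bottom-pressure change (Pa) to a depth change (m)."""
    return delta_p_pa / (RHO_SEAWATER * G)

# The ~2 m caldera drop reported at Axial corresponds to roughly a 20 kPa
# pressure increase at the seafloor sensor:
delta_p = RHO_SEAWATER * G * 2.0
print(f"{delta_p / 1000:.1f} kPa -> {depth_change_from_pressure(delta_p):.2f} m")
```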
April 30, 2015 | https://www.sciencedaily.com/releases/2015/04/150430170755.htm | Did dinosaur-killing asteroid trigger largest lava flows on Earth? | The asteroid that slammed into the ocean off Mexico 66 million years ago and killed off the dinosaurs probably rang the Earth like a bell, triggering volcanic eruptions around the globe that may have contributed to the devastation, according to a team of University of California, Berkeley, geophysicists. | Specifically, the researchers argue that the impact likely triggered most of the immense eruptions of lava in India known as the Deccan Traps, explaining the "uncomfortably close" coincidence between the Deccan Traps eruptions and the impact, which has always cast doubt on the theory that the asteroid was the sole cause of the end-Cretaceous mass extinction. "If you try to explain why the largest impact we know of in the last billion years happened within 100,000 years of these massive lava flows at Deccan ... the chances of that occurring at random are minuscule," said team leader Mark Richards, UC Berkeley professor of earth and planetary science. "It's not a very credible coincidence." Richards and his colleagues marshal evidence for their theory that the impact reignited the Deccan flood lavas in a forthcoming paper. While the Deccan lava flows, which started before the impact but erupted for several hundred thousand years after re-ignition, probably spewed immense amounts of carbon dioxide and other noxious, climate-modifying gases into the atmosphere, it's still unclear if this contributed to the demise of most of life on Earth at the end of the Age of Dinosaurs, Richards said. "This connection between the impact and the Deccan lava flows is a great story and might even be true, but it doesn't yet take us closer to understanding what actually killed the dinosaurs and the 'forams,'" he said, referring to tiny sea creatures called foraminifera, many of which disappeared from the fossil record virtually overnight at the boundary between the Cretaceous and Tertiary periods, called the KT boundary. The disappearance of the landscape-dominating dinosaurs is widely credited with ushering in the age of mammals, eventually including humans. He stresses that his proposal differs from an earlier hypothesis that the energy of the impact was focused around Earth to a spot directly opposite, or antipodal, to the impact, triggering the eruption of the Deccan Traps. The "antipodal focusing" theory died when the impact crater, called Chicxulub, was found off the Yucatán coast of Mexico, which is about 5,000 kilometers from the antipode of the Deccan Traps. Richards proposed in 1989 that plumes of hot rock, called "plume heads," rise through Earth's mantle every 20-30 million years and generate huge lava flows, called flood basalts, like the Deccan Traps. It struck him as more than coincidence that the last four of the six known mass extinctions of life occurred at the same time as one of these massive eruptions. "Paul Renne's group at Berkeley showed years ago that the Central Atlantic Magmatic Province is associated with the mass extinction at the Triassic/Jurassic boundary 200 million years ago, and the Siberian Traps are associated with the end Permian extinction 250 million years ago, and now we also know that a big volcanic eruption in China called the Emeishan Traps is associated with the end-Guadalupian extinction 260 million years ago," Richards said.
"Then you have the Deccan eruptions -- including the largest mapped lava flows on Earth -- occurring 66 million years ago coincident with the KT mass extinction. So what really happened at the KT boundary?"Richards teamed up with experts in many areas to try to discover faults with his radical idea that the impact triggered the Deccan eruptions, but instead came up with supporting evidence. Renne, a professor in residence in the UC Berkeley Department of Earth and Planetary Science and director of the Berkeley Geochronology Center, re-dated the asteroid impact and mass extinction two years ago and found them essentially simultaneous, but also within approximately 100,000 years of the largest Deccan eruptions, referred to as the Wai subgroup flows, which produced about 70 percent of the lavas that now stretch across the Indian subcontinent from Mumbai to Kolkata.Michael Manga, a professor in the same department, has shown over the past decade that large earthquakes -- equivalent to Japan's 9.0 Tohoku quake in 2011 -- can trigger nearby volcanic eruptions. Richards calculates that the asteroid that created the Chicxulub crater might have generated the equivalent of a magnitude 9 or larger earthquake everywhere on Earth, sufficient to ignite the Deccan flood basalts and perhaps eruptions many places around the globe, including at mid-ocean ridges."It's inconceivable that the impact could have melted a whole lot of rock away from the impact site itself, but if you had a system that already had magma and you gave it a little extra kick, it could produce a big eruption," Manga said.Similarly, Deccan lava from before the impact is chemically different from that after the impact, indicating a faster rise to the surface after the impact, while the pattern of dikes from which the supercharged lava flowed -- "like cracks in a soufflé," Renne said -- are more randomly oriented post-impact."There is a profound break in the style of eruptions and the volume and composition of the eruptions," said Renne. "The whole question is, 'Is that discontinuity synchronous with the impact?'"Richards, Renne and graduate student Courtney Sprain, along with Deccan volcanology experts Steven Self and Loÿc Vanderkluysen, visited India in April 2014 to obtain lava samples for dating, and noticed that there are pronounced weathering surfaces, or terraces, marking the onset of the huge Wai subgroup flows. Geological evidence suggests that these terraces may signal a period of quiescence in Deccan volcanism prior to the Chicxulub impact. Apparently never before noticed, these terraces are part of the western Ghats, a mountain chain named after the Hindu word for steps."This was an existing massive volcanic system that had been there probably several million years, and the impact gave this thing a shake and it mobilized a huge amount of magma over a short amount of time," Richards said. "The beauty of this theory is that it is very testable, because it predicts that you should have the impact and the beginning of the extinction, and within 100,000 years or so you should have these massive eruptions coming out, which is about how long it might take for the magma to reach the surface." | Earthquakes | 2,015 |
April 30, 2015 | https://www.sciencedaily.com/releases/2015/04/150430134933.htm | Rupture along the Himalayan Front | In their article, Morell and colleagues define the ramp-flat geometry of an active detachment fault that is likely to host a great earthquake in the northwest Himalaya. | A little more than a month on, the area experienced a magnitude 7.8 earthquake, centered in Nepal (25 Apr. 2015). In their study, Morell and colleagues use a series of complementary geomorphic and erosion rate data to define the ramp-flat geometry of the active detachment fault that is likely to host a large earthquake within the hinterland of the northwest Himalaya. Their analysis indicates that this detachment is sufficiently large to host another great earthquake in the western half of the central Himalayan seismic gap. Specifically, their data sets point to a distinctive physiographic transition at the base of the high Himalaya in the state of Uttarakhand, India, characterized by abrupt strike-normal increases in channel steepness and a tenfold increase in erosion rates. When combined with previously published geophysical imaging and seismicity data sets, Morell and colleagues interpret the observed spatial distribution of erosion rates and channel steepness to reflect the landscape response to spatially variable rock uplift due to a structurally coherent ramp-flat system of the Main Himalayan Thrust. They write, "Although it remains unresolved whether the kinematics of the Main Himalayan Thrust ramp involve an emergent fault or duplex, the landscape and erosion rate patterns suggest that the décollement beneath the state of Uttarakhand provides a sufficiently large and coherent fault segment capable of hosting a great earthquake." In conclusion, they note, "While this hypothesis remains speculative, it is supported by independent records of historical seismicity." | Earthquakes | 2,015
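The release does not spell out how channel steepness was quantified; a common convention in such studies is the normalized steepness index k_sn from slope-area scaling. A minimal sketch using a conventional (assumed) reference concavity and invented example values:

```python
import numpy as np

THETA_REF = 0.45  # reference concavity commonly assumed in such analyses

def normalized_steepness(slope: np.ndarray, drainage_area_m2: np.ndarray) -> np.ndarray:
    """k_sn from the stream-power scaling S = k_sn * A^(-theta),
    i.e. k_sn = S * A^theta."""
    return slope * drainage_area_m2 ** THETA_REF

# Hypothetical segments downstream vs. upstream of a physiographic transition,
# at the same drainage area; the steeper reach yields a higher k_sn:
slope = np.array([0.02, 0.08])   # channel gradient (m/m), invented
area = np.array([5e7, 5e7])      # drainage area (m^2), invented
print(normalized_steepness(slope, area))
```

An abrupt jump in k_sn at constant drainage area, like the fourfold contrast above, is the kind of signal the authors tie to spatially variable rock uplift.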
April 28, 2015 | https://www.sciencedaily.com/releases/2015/04/150428141921.htm | Landslides, mudslides likely to remain a significant threat in Nepal for months | The threat of landslides and mudslides remains high across much of Nepal's high country, and the risk is likely to increase when the monsoon rains arrive this summer, according to a University of Michigan researcher. | U-M geomorphologist Marin Clark and two colleagues have assessed the landslide hazard in Nepal following Saturday's magnitude-7.8 earthquake. They looked for locations where landslides likely occurred during the earthquake, as well as places that are at high risk in the coming weeks and months. The analysis revealed tens of thousands of locations at high risk, Clark said. "The majority of them, we expect, have already happened and came down all at once with the shaking on Saturday," she said. "But there will still be slopes that have not yet failed but were weakened. So there will be a continued risk during aftershocks and with the recent rainfall, and again when the monsoon rains arrive this summer." Information from the U-M-led study has been shared with the U.S. Geological Survey, NASA, the U.S. Agency for International Development and other responding agencies. It is being used to help prioritize both satellite observations and the analysis of data from those satellites, said Clark, an associate professor in the U-M Department of Earth and Environmental Sciences. "The satellites looked first at places where lots of people live, including Kathmandu and the foothills areas to the south," Clark said. "Those areas do not look significantly impacted by landsliding, but we're worried about the high country," she said. The region at highest risk for landslides and mudslides is the mountainous area along the Nepal-Tibet border, north of Kathmandu and west of Mount Everest, directly above the fault rupture. The highest-risk zone is at elevations above 8,200 feet in a region that covers 17,550 square miles, which is roughly twice the size of Massachusetts. Cloud cover has blocked observation of much of that region since Saturday's earthquake. But news stories and social media reports of landslides in Nepal's Gorkha District and Langtang Valley are consistent with the Clark team's assessment, which showed that those areas are at high risk, she said. Remote villages are scattered throughout the high-risk zone, which also contains the main highway that connects Kathmandu and Tibet. The area is popular with trekkers and mountaineers, as well. "Many small Nepalese villages throughout this region have likely been cut off from the rescue operation," Clark said. "This is also high season for trekking and mountaineering, so I expect there are a large number of foreign tourists there, as well." The Clark team's assessment of the landslide risk was based on a computer analysis that looked at earthquake shaking, slope steepness and the strength of various rock types. Their initial analysis was completed Saturday afternoon and was shared with the U.S. Geological Survey and other agencies on Saturday evening. It was revised Sunday morning and distributed through the National Earthquake Hazards Reduction Program on Sunday afternoon. More than 200,000 landslides occurred following a magnitude-7.9 earthquake in a mountainous region of Sichuan Province, China, in 2008, according to Clark. Many of those landslides blocked roads, which slowed response and recovery efforts.
The final death toll for that quake was about 70,000. Landsliding is a general term for slowly to very rapidly descending rock and debris. A mudslide or mudflow is a fluid mix of mud and debris that moves down a slope. Landslides in mountainous regions can also block river valleys, creating a significant flooding hazard. Water builds up behind those dam-like structures, creating the potential for catastrophic flooding if the dams are overtopped and then fail. "With the satellite images, we'll be looking first at the highest-risk landslide areas that are close to big rivers," Clark said. "Those locations are high priorities." Clark's collaborators on the landslide hazard assessment are U-M's Nathan Niemi and Sean Gallen, a former U-M postdoctoral researcher under Clark who recently accepted a position at ETH Zurich in Switzerland. | Earthquakes | 2,015
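The release says the hazard analysis combined earthquake shaking, slope steepness and rock strength, but gives no equations. A standard screening approach consistent with that description is an infinite-slope factor of safety paired with a Newmark critical acceleration; the sketch below uses that textbook formulation with invented parameter values, not the Michigan team's actual model:

```python
import math

G = 9.81  # m/s^2

def factor_of_safety(c_pa, gamma_n_m3, z_m, theta_deg, phi_deg):
    """Static factor of safety for a dry infinite slope:
    FS = c / (gamma * z * sin(t) * cos(t)) + tan(phi) / tan(t)."""
    t = math.radians(theta_deg)
    cohesion_term = c_pa / (gamma_n_m3 * z_m * math.sin(t) * math.cos(t))
    friction_term = math.tan(math.radians(phi_deg)) / math.tan(t)
    return cohesion_term + friction_term

def critical_acceleration(fs, theta_deg):
    """Newmark critical acceleration a_c = (FS - 1) * g * sin(theta):
    shaking stronger than a_c starts to displace the slope."""
    return (fs - 1.0) * G * math.sin(math.radians(theta_deg))

# Invented values: weak rock (cohesion 10 kPa, friction angle 30 deg),
# unit weight 22 kN/m^3, 2 m failure depth, 35-degree slope.
fs = factor_of_safety(c_pa=10e3, gamma_n_m3=22e3, z_m=2.0, theta_deg=35, phi_deg=30)
print(f"FS = {fs:.2f}, a_c = {critical_acceleration(fs, 35):.2f} m/s^2")
```

A slope with a factor of safety barely above 1 has a low critical acceleration, so even modest aftershock or monsoon-weakened conditions can push it to failure, which is the pattern Clark describes.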
April 27, 2015 | https://www.sciencedaily.com/releases/2015/04/150427183113.htm | Tidal tugs on 'Teflon' faults drive slow-slipping earthquakes | Unknown to most people, the Pacific Northwest experiences a magnitude-6.6 earthquake about once a year. The reason nobody notices is that the movement happens slowly and deep underground, in a part of the fault whose behavior, known as slow-slip, was only recently discovered. | A University of Washington seismologist who studies slow-slip quakes has looked at how they respond to tidal forces from celestial bodies and used the result to make a first direct calculation of friction deep on the fault. Though these events occur much deeper and on a different type of fault than the recent catastrophe in Nepal, the findings could improve general understanding of when and how faults break. The new study was published online April 27. "I was able to tease out the effect of friction and found that it is not the friction of normal rocks in the lab -- it's much lower," said author Heidi Houston, a UW professor of Earth and space sciences. "It's closer to Teflon than to sandpaper." The surprising results of the new study could help to better model the physics of these systems, and they could even tell us something about the frictional forces in the locked portion of the fault where hazardous earthquakes occur. The research looked at six recent slow-slip events along the Cascadia subduction zone, one of the best-studied places for these enigmatic slow quakes. The slow quakes are accompanied by tremors, weak seismic vibrations previously thought to be random noise. The tremors begin every 12 to 14 months below Puget Sound, Washington, and then travel north and south at about 5 miles (8 kilometers) per day for several weeks, affecting each section of the fault for about five days. The paper looks at how the gravitational pull of the sun and moon, which slightly deform Earth and oceans, affect forces along, across and inside the fault, and what that means for the slow-slip seismic activity more than 20 miles (35 kilometers) underground. Results show that on the first day of tremors, the tidal forces don't matter much. But starting at about 1 1/2 days -- when Houston thinks minerals that had been deposited from the surrounding fluid and that held the fault together may have broken -- the additional pull of the tides does have an effect. "Three days after the slipping has begun, the fault is very sensitive to the tides, and it almost slips only when the tides are encouraging it," Houston said. "It implies that something is changing on the fault plane over those five days." By averaging across many sections of the fault, and over all six events, she found that the amount of the tremor increases exponentially with increasing tidal force. Regular fast earthquakes are also very slightly affected by the tides, but they are overwhelmed by other forces and the effect is almost too small to detect. There is no need for worry, Houston says -- even when celestial bodies line up to generate the biggest king tides, the effect would only rarely be enough to actually trigger a slow-slip quake, much less a regular earthquake. But it does tell us something about the physics of a crucial process that can't be easily studied in the lab. "We want to understand the physics of what is happening, to understand and predict how the fault is storing stress, and how it's going to relieve stress," Houston said. "Friction is a big piece of that.
And it turns out that this part of the fault is much slipperier than previously thought." Slow-slip earthquakes relieve stress right where they slipped, but the movement actually places more stress on neighboring parts of the fault, including the so-called locked zone, where a rupture can cause the most damaging type of earthquakes. In Cascadia's slow-slip events, the fault will move about an inch (3 centimeters) over several days, with different parts of the fault moving at different times. When the shallower "locked zone" ruptures, by contrast, a large section of the fault can lurch over 60 feet (18 meters) in minutes. When this occurs, as it does about every 500 years in a magnitude 9 on the Cascadia subduction zone, it generates strong damaging seismic waves in Earth's crust. Still unknown is how slow-slip events are related to the more damaging quakes. A shallower slow-slip event was detected in the weeks before the deadly 2011 Tohoku earthquake and tsunami, on a fault like Cascadia's where an ocean plate plunges below a continental plate. "Understanding slow slip and tremor could give us some way to monitor what is happening in the shallower, locked part of the fault," Houston said. "Geophysicists started with the picture of just a flat plane of sandpaper, but that picture is evolving." The study was funded by the National Science Foundation. | Earthquakes | 2,015
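Houston's key observation, tremor increasing exponentially with tidal stress, amounts to a straight-line fit in log space. A minimal illustration with made-up counts (not the study's data):

```python
import numpy as np

# Hypothetical hourly tremor counts binned by tidal shear stress (kPa):
tidal_stress_kpa = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
tremor_rate = np.array([3.0, 6.0, 12.0, 25.0, 48.0])  # counts/hour, invented

# Exponential dependence rate = A * exp(b * stress) is linear in log space,
# so an ordinary least-squares fit on log(rate) recovers the sensitivity b:
b, log_a = np.polyfit(tidal_stress_kpa, np.log(tremor_rate), 1)
print(f"rate ~ {np.exp(log_a):.1f} * exp({b:.2f} * stress_kPa)")
```

The fitted sensitivity b is the quantity that, in the study's framework, constrains how low the effective friction on the deep fault must be for such small tidal stresses to modulate slip so strongly.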
April 24, 2015 | https://www.sciencedaily.com/releases/2015/04/150424121301.htm | Aid workers should read through archaeologists' notebooks on building houses | Aid workers who provide shelter following natural disasters, such as hurricanes or earthquakes, should consider long-term archaeological information about how locals constructed their homes in the past, and what they do when they repair and rebuild. Archaeologists and international humanitarian organizations are both involved in recovery, with the former doing this for the past, and the latter for the present. So says Alice Samson of the University of Cambridge in the UK, leader of an archaeological overview of building practices used in the Caribbean 1,400 to 450 years ago. It is published in a Springer journal. | Research conducted while Samson was at Leiden University in the Netherlands used data gathered from 150 excavated structures at 16 sites in the Greater Antilles, Turks and Caicos, Virgin Islands and northern Lesser Antilles. They also analyzed sixteenth-century sketches and recollections of house structures by Spanish conquistadors on Haiti and the Dominican Republic, the Bahamas, and Cuba. According to Samson, the so-called "Caribbean architectural mode" emerged some 1,400 years ago. Inhabitants reproduced its basic features up until European colonization over 500 years ago. In a region of considerable local settlement diversity, the mode was an effective and preferred way of meeting community and cultural needs, dealing with cyclical environmental hazards, and longer-term historical change. There was no single, one-size-fits-all template. Small, carefully designed and evenly anchored structures with high-pitched roofs and reinforced facades were generally built near the shore, in settlements of varying sizes. An excavation of the settlement of El Cabo in the Dominican Republic, for instance, showed that the houses were circular, between 6.5 and 10 meters in diameter. They consisted of an outer perimeter wall of closely spaced, slender posts, and eight larger, roof-bearing posts, aligned on an inland facing doorway. People worked in sheltered areas outside the fronts of houses, honored their dead and ancestors there, and buried their personal possessions inside when they moved. The structures were well adapted for dealing with the winds, rains and heat of the Caribbean. Wind exposure was reduced by locating houses in an irregular pattern and using windbreaks and partitions. Houses lasted for centuries because they were deliberately rebuilt and repaired. The islands across the Caribbean archipelago share a maritime climate with marked variations in wind and temperature. Humanitarian relief is often needed following dramatic climatic and seismic events such as hurricanes, tropical storms, tsunamis, earthquakes and volcanoes. Samson's study sheds light on how communities who have been struck by such disasters can be better supported by humanitarian efforts to rebuild homes. She believes this can be done by encouraging a dialogue between different disciplines dealing with housing, to develop a greater appreciation of historical shelter processes.
Although in many places, especially recently urbanized settings, there is no direct connection between archaeological houses and those of today, everywhere has a prior housing process, whether ancient or more recent. Co-author Kate Crawford, an engineer and previously Shelter Field Advisor for CARE International, contends that, "Humanitarian decision-making often happens in a context of scant evidence and overwhelming data. This was one of the challenges faced in the response to the earthquake in Haiti in 2010. What we need is a better appreciation of strategic questions, research on alternatives such as repair, and support for local responses, not just technical solutions." "We hope that this long-term, regional analysis of house features might contribute to greater engagement between archaeologists and those responsible for building, or rebuilding the present," says Samson. | Earthquakes | 2,015
April 24, 2015 | https://www.sciencedaily.com/releases/2015/04/150424085009.htm | SWIFTS spectrometer to study Earth's tides and quakes | The world's smallest spectrometer has successfully measured tiny deformations of Earth's crust, of the order of one millimeter, over a length of one thousand kilometers. Researchers at the Institut des Sciences de la Terre (CNRS/Université Joseph Fourier/IRD/IFSTTAR/Université de Savoie) and the Institut de Planétologie et d'Astrophysique de Grenoble (CNRS/Université Joseph Fourier) used the SWIFTS[1] spectrometer to detect these as yet poorly understood movements. Their work[2] was published on 23 April 2015. | The team carried out the measurements at the Low Noise Underground Laboratory (CNRS/Université de Nice/Université d'Avignon), located in a former nuclear missile firing station on the Plateau d'Albion in southern France. The exceptional conditions at the site allow accurate research to be carried out, shielded from variations in pressure and temperature at a depth of three hundred meters underground. As a result, the system used was able to measure deformation, notably caused by Earth tides, at a resolution of one part per billion. This is equivalent to a variation of one millimeter over a length of one thousand kilometers. As well as measurements related to Earth tides, the instrument also measured a signal from the 2014 Iquique earthquake in Chile. The system works by using white light that travels through a fiber and is reflected by two Fabry-Pérot interferometers[3], each of which is made up of two Bragg gratings. A Bragg grating is a microscopic mirror inside an optical fiber, obtained by treatment with ultraviolet light. This series of reflections only transmits certain wavelengths to SWIFTS. The spectrometer can then determine, in a fraction of a second, the relative position of the two mirrors to the nearest nanometer, thus making it possible to assess the extent of the deformation. The SWIFTS spectrometer was developed by Resolution Spectra System, a start-up company stemming from research carried out at Joseph Fourier University in Grenoble. The extremely small size of the spectrometer, 30 x 1.5 x 1.5 millimeters, means that measurements can be taken using an instrument the size of a matchbox. This could eventually make it possible to set up a network of sensors in areas of tectonic activity or on volcanoes. Studies carried out on the 1999 Izmit earthquake in Turkey and the 2011 Tohoku-Oki earthquake in Japan revealed the existence of slow precursory displacements before the main quake. This is precisely the type of phenomenon that SWIFTS can measure, which could contribute to forecasting the onset of such natural disasters. [1] Stationary-wave integrated Fourier transform spectrometer. [2] Carried out as part of labex OSUG@2020 at the Observatoire des Sciences de l'Univers, Grenoble. [3] Interferometry is a measurement method that makes use of interference between several waves. | Earthquakes | 2,015
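The quoted resolution is easy to check dimensionally: strain is simply ΔL/L, so one millimeter over one thousand kilometers is one part per billion, the same ratio as nanometer-level positioning over a meter-scale gauge (the meter-scale gauge length below is an assumption for illustration, not the instrument's specification):

```python
# Strain is dimensionless: delta_L / L.
delta_l_m = 1e-3   # 1 mm
length_m = 1e6     # 1,000 km
print(delta_l_m / length_m)   # 1e-09, i.e. one nanostrain

# Equivalently, nanometer interferometric positioning over an assumed
# 1 m gauge length gives the same sensitivity:
print(1e-9 / 1.0)             # also 1e-09
```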
April 24, 2015 | https://www.sciencedaily.com/releases/2015/04/150424105400.htm | Continental U.S.: Map shows content and origins of the geologic basement | A map showing the many different pieces of Earth's crust that comprise the nation's geologic basement is now available from the U.S. Geological Survey. This is the first map to portray these pieces, from the most ancient to recent, by the events that influenced their composition, starting with their origin. This product provides a picture of the basement for the U.S., including Alaska, that can help scientists produce regional and national mineral resource assessments, starting with the original metal endowments in source rocks. | "Traditionally, scientists have assessed mineral resources using clues at or near the Earth's surface to determine what lies below," said USGS scientist Karen Lund, who led the project. "This map is based on the concept that the age and origins of basement rocks influenced the nature and location of mineral deposits. It offers a framework to examine mineral resources and other geologic aspects of the continent from its building blocks up," said Lund. More than 80 pieces of crust have been added to the nation's basement since the Earth began preserving crust about 3.6 billion years ago. These basement domains had different ages and origins before they became basement rocks, and this map includes these as key factors that determined their compositions and the original metals that may be available for remobilization and concentration into ore deposits. The map further classifies the basement domains according to how and when they became basement, as these events also influence the specific metals and deposit types that might be found in a region. Users can identify domains potentially containing specific metals or deposit types. They can configure the companion database to show the construction of the U.S. through time. The map also provides a template to correlate regional to national fault and earthquake patterns. The map is also available on a separate site, where users can combine data and overlay known mineral sites or other features on the domains. Basement rocks are crystalline rocks lying above the mantle and beneath all other rocks and sediments. They are sometimes exposed at the surface, but often they are buried under miles of rock and sediment and can only be mapped over large areas using remote geophysical surveys. This map was compiled using a variety of methods, including data from national-scale gravity and aeromagnetic surveys. Crustal rocks are modified several times before they become basement, and these transitions alter their composition. Basement rocks are continental crust that has been modified by a wide variety of plate tectonic events involving deformation, metamorphism, deposition, partial melting and magmatism. Ultimately, continental crust forms from pre-existing oceanic crust and overlying sediments that have been thus modified. It is not only the myriad processes that result in varying basement rock content but also the time when these processes occurred during the Earth's history. For example, because the Earth has evolved as a planet during its 4.5 billion year history, early deposit types formed when there was less oxygen in the atmosphere and the thin crust was hotter. The ancient domains are now more stable and less likely to be altered by modern processes that could cause metals to migrate.
By contrast, basement rocks that formed out of crust that is less than one billion years old have origins that can be interpreted according to the present-day rates and scales of plate tectonic processes that reflect a more mature planet with a thicker crust. By incorporating ancient to modern processes, this map offers a more complete and consistent portrait of the nation's geologic basement than previous maps and presents a nationwide concept of basement for future broad-scale mineral resource assessments and other geologic studies. | Earthquakes | 2,015
April 23, 2015 | https://www.sciencedaily.com/releases/2015/04/150423215641.htm | Fracking? Injecting wastewater? New insight on ground shaking from human-made earthquakes | Significant strides in science have been made to better understand potential ground shaking from induced earthquakes, which are earthquakes triggered by human practices. | Earthquake activity has sharply increased since 2009 in the central and eastern United States. The increase has been linked to industrial operations that dispose of wastewater by injecting it into deep wells. The U.S. Geological Survey (USGS) released a report today that outlines a preliminary set of models to forecast how hazardous ground shaking could be in the areas where sharp increases in seismicity have been recorded. The models ultimately aim to calculate how often earthquakes are expected to occur in the next year and how hard the ground will likely shake as a result. This report looked at the central and eastern United States; future research will incorporate data from the western states as well. This report also identifies issues that must be resolved to develop a final hazard model, which is scheduled for release at the end of the year after the preliminary models are further examined. These preliminary models should be considered experimental in nature and should not be used for decision-making. USGS scientists identified 17 areas within eight states with increased rates of induced seismicity. Since 2000, several of these areas have experienced high levels of seismicity, with substantial increases since 2009 that continue today. This is the first comprehensive assessment of the hazard levels associated with induced earthquakes in these areas. A detailed list of these areas is provided in the accompanying map, including the states of Alabama, Arkansas, Colorado, Kansas, New Mexico, Ohio, Oklahoma, and Texas. Scientists developed the models by analyzing earthquakes in these zones and considering their rates, locations, maximum magnitude, and ground motions. "This new report describes for the first time how injection-induced earthquakes can be incorporated into U.S. seismic hazard maps," said Mark Petersen, Chief of the USGS National Seismic Hazard Modeling Project. "These earthquakes are occurring at a higher rate than ever before and pose a much greater risk to people living nearby. The USGS is developing methods that overcome the challenges in assessing seismic hazards in these regions in order to support decisions that help keep communities safe from ground shaking." In 2014, the USGS released updated National Seismic Hazard Maps, which describe hazard levels for natural earthquakes. Those maps are used in building codes, insurance rates, emergency preparedness plans, and other applications. The maps forecast the likelihood of earthquake shaking within a 50-year period, which is the average lifetime of a building. However, these new induced seismicity products display intensity of potential ground shaking from induced earthquakes in a one-year period. This shorter timeframe is appropriate because the induced activity can vary rapidly with time and is subject to commercial and policy decisions that could change at any point. These new methods and products result in part from a workshop hosted by the USGS and the Oklahoma Geological Survey.
The workshop, described in the new report, brought together a broad group of experts from government, industry and academic communities to discuss the hazards from induced earthquakes. Wastewater that is salty or polluted by chemicals needs to be disposed of in a manner that prevents contaminating freshwater sources. Large volumes of wastewater can result from a variety of processes, such as a byproduct from energy production. Wastewater injection increases the underground pore pressure, which may lubricate nearby faults thereby making earthquakes more likely to occur. Although the disposal process has the potential to trigger earthquakes, most wastewater disposal wells do not produce felt earthquakes. Many questions have been raised about whether hydraulic fracturing -- commonly referred to as "fracking" -- is responsible for the recent increase of earthquakes. USGS's studies suggest that the actual hydraulic fracturing process is only occasionally the direct cause of felt earthquakes. Read the newly published USGS report for more details. | Earthquakes | 2,015
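The one-year forecasts described here are naturally expressed as occurrence rates; turning a rate into a "chance of at least one event" typically assumes a Poisson process. A minimal sketch with an illustrative rate (not a value from the USGS models):

```python
import math

def exceedance_probability(rate_per_year: float, years: float = 1.0) -> float:
    """Probability of at least one event in a time window under a Poisson
    process: P = 1 - exp(-rate * t). The rate is what hazard models forecast;
    the example value below is purely illustrative."""
    return 1.0 - math.exp(-rate_per_year * years)

print(f"{exceedance_probability(0.05):.1%} chance in one year at 0.05 events/yr")
print(f"{exceedance_probability(0.05, 50):.1%} chance over a 50-year building life")
```

The contrast between the two printed numbers is why a one-year product can look very different from the familiar 50-year hazard maps, even for the same underlying rate.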
April 23, 2015 | https://www.sciencedaily.com/releases/2015/04/150423142717.htm | Scientists see deeper Yellowstone magma | University of Utah seismologists discovered and made images of a reservoir of hot, partly molten rock 12 to 28 miles beneath the Yellowstone supervolcano, and it is 4.4 times larger than the shallower, long-known magma chamber. | The hot rock in the newly discovered, deeper magma reservoir would fill the 1,000-cubic-mile Grand Canyon 11.2 times, while the previously known magma chamber would fill the Grand Canyon 2.5 times, says postdoctoral researcher Jamie Farrell, a co-author of the study published online today. "For the first time, we have imaged the continuous volcanic plumbing system under Yellowstone," says first author Hsin-Hua Huang, also a postdoctoral researcher in geology and geophysics. "That includes the upper crustal magma chamber we have seen previously plus a lower crustal magma reservoir that has never been imaged before and that connects the upper chamber to the Yellowstone hotspot plume below." Contrary to popular perception, the magma chamber and magma reservoir are not full of molten rock. Instead, the rock is hot, mostly solid and spongelike, with pockets of molten rock within it. Huang says the new study indicates the upper magma chamber averages about 9 percent molten rock -- consistent with earlier estimates of 5 percent to 15 percent melt -- and the lower magma reservoir is about 2 percent melt. So there is about one-quarter of a Grand Canyon worth of molten rock within the much larger volumes of either the magma chamber or the magma reservoir, Farrell says. The researchers emphasize that Yellowstone's plumbing system is no larger -- nor closer to erupting -- than before, only that they now have used advanced techniques to make a complete image of the system that carries hot and partly molten rock upward from the top of the Yellowstone hotspot plume -- about 40 miles beneath the surface -- to the magma reservoir and the magma chamber above it. "The magma chamber and reservoir are not getting any bigger than they have been, it's just that we can see them better now using new techniques," Farrell says. Study co-author Fan-Chi Lin, an assistant professor of geology and geophysics, says: "It gives us a better understanding of the Yellowstone magmatic system. We can now use these new models to better estimate the potential seismic and volcanic hazards." The researchers point out that the previously known upper magma chamber was the immediate source of three cataclysmic eruptions of the Yellowstone caldera 2 million, 1.2 million and 640,000 years ago, and that isn't changed by discovery of the underlying magma reservoir that supplies the magma chamber. "The actual hazard is the same, but now we have a much better understanding of the complete crustal magma system," says study co-author Robert B. Smith, a research and emeritus professor of geology and geophysics at the University of Utah. The three supervolcano eruptions at Yellowstone -- on the Wyoming-Idaho-Montana border -- covered much of North America in volcanic ash. A supervolcano eruption today would be cataclysmic, but Smith says the annual chance is 1 in 700,000. Before the new discovery, researchers had envisioned partly molten rock moving upward from the Yellowstone hotspot plume via a series of vertical and horizontal cracks, known as dikes and sills, or as blobs.
They still believe such cracks move hot rock from the plume head to the magma reservoir and from there to the shallow magma chamber. Yellowstone is among the world's largest supervolcanoes, with frequent earthquakes and Earth's most vigorous continental geothermal system. The three ancient Yellowstone supervolcano eruptions were only the latest in a series of more than 140 as the North American plate of Earth's crust and upper mantle moved southwest over the Yellowstone hotspot, starting 17 million years ago at the Oregon-Idaho-Nevada border. The hotspot eruptions progressed northeast before reaching Yellowstone 2 million years ago. Here is how the new study depicts the Yellowstone system, from bottom to top: -- Previous research has shown the Yellowstone hotspot plume rises from a depth of at least 440 miles in Earth's mantle. Some researchers suspect it originates 1,800 miles deep at Earth's core. The plume rises from the depths northwest of Yellowstone. The plume conduit is roughly 50 miles wide as it rises through Earth's mantle and then spreads out like a pancake as it hits the uppermost mantle about 40 miles deep. Earlier Utah studies indicated the plume head was 300 miles wide. The new study suggests it may be smaller, but the data aren't good enough to know for sure. -- Hot and partly molten rock rises in dikes from the top of the plume at 40 miles depth up to the bottom of the 11,200-cubic-mile magma reservoir, about 28 miles deep. The top of this newly discovered blob-shaped magma reservoir is about 12 miles deep, Huang says. The reservoir measures 30 miles northwest to southeast and 44 miles southwest to northeast. "Having this lower magma body resolved the missing link of how the plume connects to the magma chamber in the upper crust," Lin says. -- The 2,500-cubic-mile upper magma chamber sits beneath Yellowstone's 40-by-25-mile caldera, or giant crater. Farrell says it is shaped like a gigantic frying pan about 3 to 9 miles beneath the surface, with a "handle" rising to the northeast. The chamber is about 19 miles from northwest to southeast and 55 miles southwest to northeast. The handle is the shallowest, long part of the chamber that extends 10 miles northeast of the caldera. Scientists once thought the shallow magma chamber was 1,000 cubic miles. But at science meetings and in a published paper this past year, Farrell and Smith showed the chamber was 2.5 times bigger than once thought. That has not changed in the new study. Discovery of the magma reservoir below the magma chamber solves a longstanding mystery: Why Yellowstone's soil and geothermal features emit more carbon dioxide than can be explained by gases from the magma chamber, Huang says. Farrell says a deeper magma reservoir had been hypothesized because of the excess carbon dioxide, which comes from molten and partly molten rock. As with past studies that made images of Yellowstone's volcanic plumbing, the new study used seismic imaging, which is somewhat like a medical CT scan but uses earthquake waves instead of X-rays to distinguish rock of various densities.
Quake waves go faster through cold rock, and slower through hot and molten rock. For the new study, Huang developed a technique to combine two kinds of seismic information: Data from local quakes detected in Utah, Idaho, the Teton Range and Yellowstone by the University of Utah Seismograph Stations and data from more distant quakes detected by the National Science Foundation-funded EarthScope array of seismometers, which was used to map the underground structure of the lower 48 states. The Utah seismic network has closely spaced seismometers that are better at making images of the shallower crust beneath Yellowstone, while EarthScope's seismometers are better at making images of deeper structures. "It's a technique combining local and distant earthquake data better to look at this lower crustal magma reservoir," Huang says. | Earthquakes | 2,015
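The volume figures quoted in the article are internally consistent, which is easy to verify from the numbers it gives:

```python
GRAND_CANYON = 1_000.0   # cubic miles, as used in the article

chamber = 2_500.0        # upper magma chamber, cubic miles
reservoir = 11_200.0     # lower magma reservoir, cubic miles

print(f"reservoir/chamber size ratio: {reservoir / chamber:.1f}x")      # ~4.5
print(f"reservoir fills the canyon {reservoir / GRAND_CANYON:.1f}x")    # 11.2
print(f"chamber fills the canyon {chamber / GRAND_CANYON:.1f}x")        # 2.5

# The quoted melt fractions give nearly identical molten volumes, each about
# one-quarter of a Grand Canyon -- exactly the article's claim:
print(f"chamber melt:   {0.09 * chamber:.0f} cubic miles")    # ~225
print(f"reservoir melt: {0.02 * reservoir:.0f} cubic miles")  # ~224
```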
April 23, 2015 | https://www.sciencedaily.com/releases/2015/04/150423125840.htm | The 2011 Tohoku-Oki earthquake was felt from space | For the first time, a natural source of infrasonic waves on Earth has been measured directly from space -- 450 kilometers above the planet's surface. The source was the massive 2011 Tohoku-Oki earthquake in Japan, and its signature was detected at this orbital altitude only eight minutes after the arrival of seismic and infrasonic waves, according to Jet Propulsion Laboratory and Caltech's Yu-Ming (Oscar) Yang and colleagues in collaboration with the University of New Brunswick, Canada, who present their research today at the annual meeting of the Seismological Society of America (SSA). | Infrasonic waves, with frequencies much lower than an audible voice, can be generated by sources as diverse as volcanoes, earthquakes, rocket launches and meteor air blasts. Their signature can be detected in the upper atmosphere as either fluctuations in air pressure or as electron density disruptions. In the case of the 2011 quake, the Gravity Recovery and Climate Experiment (GRACE) satellites orbiting Earth captured the significant ionospheric signature produced by the quake's infrasonic wave output. The GRACE measurements were similar to those captured by ground-based GPS and seismic stations, the researchers note. The findings suggest that these satellite-based measurements could be useful in future early warning systems of natural hazards. The analysis also lends more support to the idea that such detection systems could be useful in measuring seismic activity from space, as would be the case for infrasonic detection missions like those envisioned for the atmosphere of Venus. | Earthquakes | 2,015
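The eight-minute delay is roughly what an upward-propagating acoustic disturbance would need to climb to orbital altitude. The effective speed below is a crude assumed average (sound speed varies strongly with altitude), so this is a plausibility check, not the paper's calculation:

```python
# Back-of-envelope travel time for an acoustic disturbance reaching GRACE:
altitude_km = 450.0
effective_speed_km_s = 0.9   # assumed mean vertical acoustic speed, illustrative
minutes = altitude_km / effective_speed_km_s / 60
print(f"~{minutes:.1f} minutes to reach {altitude_km:.0f} km altitude")  # ~8.3
```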
April 22, 2015 | https://www.sciencedaily.com/releases/2015/04/150422175004.htm | More Americans at risk from strong earthquakes, says new report | More than 143 million Americans living in the 48 contiguous states are exposed to potentially damaging ground shaking from earthquakes, with as many as 28 million people likely to experience strong shaking during their lifetime, according to research discussed at the annual meeting of the Seismological Society of America. The report puts the average long-term value of building losses from earthquakes at $4.5 billion per year, with roughly 80 percent of losses attributed to California, Oregon and Washington. | "This analysis of data from the new National Seismic Hazard Maps reveals that significantly more Americans are exposed to earthquake shaking, reflecting both the movement of the population to higher risk areas on the west coast and a change in hazard assessments," said co-author Bill Leith, senior science advisor at USGS. By comparison, FEMA estimated in 1994 that 75 million Americans in 39 states were at risk from earthquakes. Kishor Jaiswal, a research contractor with the U.S. Geological Survey (USGS), presented the research conducted with colleagues from USGS, FEMA and California Geological Survey. They analyzed the 2014 National Seismic Hazard Maps and the latest data on infrastructure and population from LandScan, a product of Oak Ridge National Laboratory. The report focuses on the 48 contiguous states, where more than 143 million people are exposed to ground motions from earthquakes, but Leith noted that nearly half the U.S. population, or nearly 150 million Americans, are at risk of shaking from earthquakes when Alaska, Puerto Rico and Hawaii are also considered. In the highest hazard zones, where 28 million Americans will experience strong shaking during their lifetime, key infrastructure could also experience a shaking intensity sufficient to cause moderate to extensive damage. The analysis identified more than 6,000 fire stations, more than 800 hospitals and nearly 20,000 public and private schools that may be exposed to strong ground motion from earthquakes. Using the 2010 Census data and the 2012 replacement cost values for buildings, and using FEMA's Hazus program, researchers systematically calculated the losses that could occur in any given year, ranging from no losses to a very high value of loss. However, the long-term average loss to the buildings in the contiguous U.S. is $4.5 billion per year, with most financial losses occurring in California, Oregon and Washington states. "Earthquakes remain an important threat to our economy," said Jaiswal. "While the west coast may carry the larger burden of potential losses and the greatest threat from the strongest shaking, this report shows that the threat from earthquakes is widespread." | Earthquakes | 2,015
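The $4.5 billion figure is an average annualized loss (AAL): losses summed over scenarios weighted by their annual occurrence rates. The toy scenario table below is invented purely to show the bookkeeping -- it is tuned to sum to the report's headline number and has no basis in the actual Hazus runs:

```python
# Minimal sketch of an average annualized loss calculation:
scenarios = [
    # (annual occurrence rate, building loss in $ billions) -- all invented
    (0.10,  2.0),     # frequent, moderate-loss events
    (0.01, 150.0),    # rare, high-loss events
    (0.001, 2800.0),  # very rare, catastrophic events
]

aal = sum(rate * loss for rate, loss in scenarios)
print(f"AAL = ${aal:.1f} billion per year")  # 0.2 + 1.5 + 2.8 = 4.5
```

The point of the structure is that rare catastrophic events can dominate the average even though no single year looks "average."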
April 22, 2015 | https://www.sciencedaily.com/releases/2015/04/150422121718.htm | Earthquake potential where there is no earthquake history | It may seem unlikely that a large earthquake would take place hundreds of kilometers away from a tectonic plate boundary, in areas with low levels of strain on the crust from tectonic motion. But major earthquakes such as the Mw 7.9 2008 Chengdu quake in China and New Zealand's 2011 Mw 6.3 quake have shown that large earthquakes do occur and can cause significant infrastructure damage and loss of life. So what should seismologists look for if they want to identify where an earthquake might happen despite the absence of historical seismic activity? | Roger Bilham of the University of Colorado shows that some of these regions had underlying features that could have been used to identify that the region was not as "aseismic" as previously thought. Some of these warning signs include debris deposits from past tsunamis or landslides, ancient mid-continent rifts that mark the scars of earlier tectonic boundaries, or old fault scarps worn down by hundreds or thousands of years of erosion. Earth's populated area where there is no written history makes for an enormous "search grid" for earthquakes. For example, the Caribbean coast of northern Colombia resembles a classic subduction zone with the potential for tsunamigenic M>8 earthquakes at millennial time scales, but the absence of a large earthquake since 1492 is cause for complacency among local populations. These areas are not restricted to the Americas. Bilham notes that in many parts of Asia, where huge populations now reside and critical facilities exist or are planned, a similar historical silence exists. Parts of the Himalaya and central and western India that have not had any major earthquake in more than 500 years could experience shaking at levels and durations that are unprecedented in their written histories. | Earthquakes | 2,015
April 22, 2015 | https://www.sciencedaily.com/releases/2015/04/150422121715.htm | Magma intrusion is likely source of Colombia-Ecuador border quake swarms | The "seismic crisis" around the region of the Chiles and Cerro Negro de Mayasquer volcanoes near the Colombia-Ecuador border is likely caused by intruding magma, according to a report by R. Corredor Torres of the Servicio Geológico Colombiano and colleagues presented at the annual meeting of the Seismological Society of America (SSA). | The intruding magma appears to be interacting with the regional tectonics to spawn micro-earthquakes, which at their peak of activity numbered thousands of micro-earthquakes each day. Most of the earthquakes were less than magnitude 3, although the largest quake to date was a magnitude 5.6 event in October 2014. When the earthquake swarms began in 2013, the Colombian Servicio Geológico Colombiano and the Ecuadoran Instituto Geofísico of the Escuela Politécnica Nacional collaborated to set up a monitoring system to observe the swarms and judge the risk of volcanic eruption for the surrounding population. The largest perceived threat of eruption came in the fall of 2014, when the activity level was changed from yellow to orange, meaning a probable occurrence of eruption in days to weeks. Due to the occurrence of a magnitude 5.6 earthquake and subsequent aftershocks, some houses in the area were damaged and local residents decided to sleep in tents to feel safe, accepting support from the Colombian Disaster Prevention Office, said Torres. Data collected by the new monitoring stations suggest that most of the earthquakes in the area are volcano-tectonic quakes, which occur when the movement of magma -- and the fluids and gases it releases -- creates pressure changes in the rocks above. Based on the seismic activity in the area, the researchers infer that millions of cubic meters of magma have moved into the area deep under the Chiles and Cerro Negro volcanoes. However, both volcanoes appear to have been dormant for at least 10,000 years, and the tectonic stress in the region is compressive -- both of which may be holding the magma back from erupting to the surface. So far, there have been no signs of ground swelling or outgassing at the surface, and the rate of earthquakes has slowed considerably this year from its peak of 7,000 to 8,000 micro-quakes per day in the fall of 2014. | Earthquakes | 2,015
April 21, 2015 | https://www.sciencedaily.com/releases/2015/04/150421132039.htm | Likely cause of 2013-14 earthquakes: Combination of gas field fluid injection and removal | A seismology team led by Southern Methodist University (SMU), Dallas, finds that high volumes of wastewater injection combined with saltwater (brine) extraction from natural gas wells is the most likely cause of earthquakes occurring near Azle, Texas, from late 2013 through spring 2014. | In an area where the seismology team identified two intersecting faults, they developed a sophisticated 3D model to assess the changing fluid pressure within a rock formation in the affected area. They used the model to estimate stress changes induced in the area by two wastewater injection wells and the more than 70 production wells that remove both natural gas and significant volumes of salty water known as brine. Conclusions from the modeling study integrate a broad range of estimates for uncertain subsurface conditions. Ultimately, better information on fluid volumes, flow parameters, and subsurface pressures in the region will provide more accurate estimates of the fluid pressure along this fault. "The model shows that a pressure differential develops along one of the faults as a combined result of high fluid injection rates to the west and high water removal rates to the east," said Matthew Hornbach, SMU associate professor of geophysics. "When we ran the model over a 10-year period through a wide range of parameters, it predicted pressure changes significant enough to trigger earthquakes on faults that are already stressed." Model-predicted stress changes on the fault were typically tens to thousands of times larger than stress changes associated with water level fluctuations caused by the recent Texas drought. "What we refer to as induced seismicity -- earthquakes caused by something other than strictly natural forces -- is often associated with subsurface pressure changes," said Heather DeShon, SMU associate professor of geophysics. "We can rule out stress changes induced by local water table changes. While some uncertainties remain, it is unlikely that natural increases to tectonic stresses led to these events." DeShon explained that some ancient faults in the region are more susceptible to movement -- "near critically stressed" -- due to their orientation and direction. "In other words, surprisingly small changes in stress can reactivate certain faults in the region and cause earthquakes," DeShon said. The study, "Causal Factors for Seismicity near Azle, Texas," has been published online. It was produced by a team of scientists from SMU's Department of Earth Sciences in Dedman College of Humanities and Sciences, the U.S. Geological Survey, the University of Texas Institute for Geophysics and the University of Texas Department of Petroleum and Geosystems Engineering. SMU scientists Hornbach and DeShon are the lead authors. SMU seismologists have been studying earthquakes in North Texas since 2008, when the first series of felt tremors hit near DFW International Airport between Oct. 30, 2008, and May 16, 2009. Next came a series of quakes in Cleburne between June 2009 and June 2010, and this third series in the Azle-Reno area northwest of Fort Worth occurred between November 2013 and January 2014.
The SMU team also is studying an ongoing series of earthquakes in the Irving-Dallas area that began in April 2014. In both the DFW sequence and the Cleburne sequence, the operation of injection wells used in the disposal of natural gas production fluids was listed as a possible cause of the seismicity. The introduction of fluid pressure modeling of both industry activity and water table fluctuations in the Azle study represents the first of its kind, and has allowed the SMU team to move beyond assessment of possible causes to the most likely cause identified in this report. Prior to the DFW Airport earthquakes in 2008, an earthquake large enough to be felt had not been reported in the North Texas area since 1950. The North Texas earthquakes of the last seven years have all occurred in areas developed for natural gas extraction from a geologic formation known as the Barnett Shale. The Texas Railroad Commission reports that production in the Barnett Shale grew exponentially from 216 million cubic feet a day in 2000, to 4.4 billion cubic feet a day in 2008, to a peak of 5.74 billion cubic feet of gas a day in 2012. While the SMU Azle study adds to the growing body of evidence connecting some injection wells and, to a lesser extent, some oil and gas production to induced earthquakes, SMU's team notes that there are many thousands of injection and/or production wells that are not associated with earthquakes. The area of study addressed in the report is in the Newark East Gas Field (NEGF), north and east of Azle. In this field, hydraulic fracturing is applied to loosen and extract gas trapped in the Barnett Shale, a sedimentary rock formation formed approximately 350 million years ago. The report explains that along with natural gas, production wells in the Azle area of the NEGF can also bring to the surface significant volumes of water from the highly permeable Ellenburger Formation -- both naturally occurring brine as well as fluids that were introduced during the fracking process. Subsurface fluid pressures are known to play a key role in causing seismicity. A primer produced by the U.S. Department of Energy explains the interplay of fluids and faults: "The fluid pressure in the pores and fractures of the rocks is called the 'pore pressure.' The pore pressure acts against the weight of the rock and the forces holding the rock together (stresses due to tectonic forces). If the pore pressures are low (especially compared to the forces holding the rock together), then only the imbalance of natural in situ earth stresses will cause an occasional earthquake. If, however, pore pressures increase, then it would take less of an imbalance of in situ stresses to cause an earthquake, thus accelerating earthquake activity. This type of failure...is called shear failure. Injecting fluids into the subsurface is one way of increasing the pore pressure and causing faults and fractures to "fail" more easily, thus inducing an earthquake. Thus, induced seismicity can be caused by injecting fluid into the subsurface or by extracting fluids at a rate that causes subsidence and/or slippage along planes of weakness in the earth." All seismic waveform data used in the compilation of the report are publicly available at the IRIS Data Management Center. Wastewater injection, brine production and surface injection pressure data are publicly available at the Texas Railroad Commission (TRC).
Craig Pearson at the TRC; Bob Patterson from the Upper Trinity Groundwater Conservation District; and scientists at XTO Energy, ExxonMobil, MorningStar Partners and EnerVest provided valuable discussions and, in some instances, data used in the completion of the report. "This report points to the need for even more study in connection with earthquakes in North Texas," said Brian Stump, SMU's Albritton Chair in Earth Sciences. "Industry is an important source for key data, and the scope of the research needed to understand these earthquakes requires government support at multiple levels." | Earthquakes | 2,015
April 10, 2015 | https://www.sciencedaily.com/releases/2015/04/150410165310.htm | Researchers test smartphones for earthquake warning | Smartphones and other personal electronic devices could, in regions where they are in widespread use, function as early warning systems for large earthquakes according to newly reported research. This technology could serve regions of the world that cannot afford higher quality, but more expensive, conventional earthquake early warning systems, or could contribute to those systems. | The study, led by scientists at the U.S. Geological Survey and published April 10 in the inaugural volume of the new AAAS journal Science Advances, explores this possibility. Using crowd-sourced observations from participating users' smartphones, earthquakes could be detected and analyzed, and customized earthquake warnings could be transmitted back to users. "Crowd-sourced alerting means that the community will benefit by data generated from the community," said Sarah Minson, USGS geophysicist and lead author of the study. Minson was a post-doctoral researcher at Caltech while working on this study. Earthquake early warning systems detect the start of an earthquake and rapidly transmit warnings to people and automated systems before they experience shaking at their location. While much of the world's population is susceptible to damaging earthquakes, EEW systems are currently operating in only a few regions around the globe, including Japan and Mexico. "Most of the world does not receive earthquake warnings mainly due to the cost of building the necessary scientific monitoring networks," said USGS geophysicist and project lead Benjamin Brooks. Researchers tested the feasibility of crowd-sourced EEW with a simulation of a hypothetical magnitude 7 earthquake, and with real data from the 2011 magnitude 9 Tohoku-oki, Japan earthquake. The results show that crowd-sourced EEW could be achieved with only a tiny percentage of people in a given area contributing information from their smartphones. For example, if phones from fewer than 5000 people in a large metropolitan area responded, the earthquake could be detected and analyzed fast enough to issue a warning to areas farther away before the onset of strong shaking. "The speed of an electronic warning travels faster than the earthquake shaking does," explained Craig Glennie, a report author and professor at the University of Houston. The authors found that the sensors in smartphones and similar devices could be used to issue earthquake warnings for earthquakes of approximately magnitude 7 or larger, but not for smaller, yet potentially damaging earthquakes. Comprehensive EEW requires a dense network of scientific instruments. Scientific-grade EEW, such as the U.S. Geological Survey's ShakeAlert system that is currently being implemented on the west coast of the United States, will be able to help minimize the impact of earthquakes over a wide range of magnitudes. However, in many parts of the world where there are insufficient resources to build and maintain scientific networks, but consumer electronics are increasingly common, crowd-sourced EEW has significant potential. "The U.S. earthquake early warning system is being built on our high-quality scientific earthquake networks, but crowd-sourced approaches can augment our system and have real potential to make warnings possible in places that don't have high-quality networks," said Douglas Given, USGS coordinator of the ShakeAlert Earthquake Early Warning System. The U.S.
Agency for International Development has already agreed to fund a pilot project, in collaboration with the Chilean Centro Sismologico Nacional, to test a hybrid earthquake warning system comprising stand-alone smartphone sensors and scientific-grade sensors along the Chilean coast. "The use of mobile phone fleets as a distributed sensor network -- and the statistical insight that many imprecise instruments can contribute to the creation of more precise measurements -- has broad applicability including great potential to benefit communities where there isn't an existing network of scientific instruments," said Bob Iannucci of Carnegie Mellon University, Silicon Valley. "Thirty years ago it took months to assemble a crude picture of the deformations from an earthquake. This new technology promises to provide a near-instantaneous picture with much greater resolution," said Thomas Heaton, a coauthor of the study and professor of Engineering Seismology at Caltech. "Crowd-sourced data are less precise, but for larger earthquakes that cause large shifts in the ground surface, they contain enough information to detect that an earthquake has occurred, information necessary for early warning," said study co-author Susan Owen of NASA's Jet Propulsion Laboratory, Pasadena, California. | Earthquakes | 2,015
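Glennie's point that "an electronic warning travels faster than the earthquake shaking does" comes down to simple arithmetic: alerts move at network speed while damaging S waves travel only a few kilometers per second. The sketch below makes that trade-off concrete; the wave speeds and detection delay are assumed, typical values, not figures from the study.

```python
# Back-of-envelope for why an electronic alert outruns the shaking.
# Wave speeds and the detection delay are assumed, illustrative values.

S_WAVE_KM_S = 3.5     # typical crustal S-wave (strong shaking) speed
DETECT_DELAY_S = 5.0  # assumed time to detect/analyze near the epicenter

def warning_time(distance_km):
    """Seconds of warning at a site `distance_km` from the epicenter.

    The alert itself moves at network speed (treated as instantaneous
    here); the warning window is the S-wave travel time minus the time
    spent detecting and analyzing the event.
    """
    return distance_km / S_WAVE_KM_S - DETECT_DELAY_S

for d in (25, 50, 100, 200):
    print(f"{d:>3} km from epicenter: ~{warning_time(d):4.1f} s of warning")
# Distant sites get tens of seconds of warning; sites near the
# epicenter get little or none -- the well-known "blind zone."
```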
April 6, 2015 | https://www.sciencedaily.com/releases/2015/04/150406144554.htm | New research complicates seismic hazard for British Columbia, Alaska region | The Pacific and North America plate boundary off the coast of British Columbia and southeastern Alaska is a complex system of faults capable of producing very large earthquakes. The recent 2012 Mw 7.8 Haida Gwaii and 2013 Mw 7.5 Craig earthquakes released strain built up over years, but did not release strain along the Queen Charlotte Fault, which remains the likely source of a future large earthquake, according to reports published in a special issue of the Bulletin of the Seismological Society of America. | "The study of these two quakes revealed rich details about the interaction between the Pacific and North America Plates, advancing our understanding of the seismic hazard for the region," said Thomas James, research scientist at Geological Survey of Canada and one of the guest editors of the special issue, which includes 19 technical articles on both the Haida Gwaii and Craig events. The Haida Gwaii and Craig earthquakes offered new information about the tectonic complexity of the region. Prior to the 2012 earthquake, the Queen Charlotte Fault, a strike-slip fault similar to the San Andreas Fault in California, was the dominant tectonic structure in the area. Nykolaishen et al. used GPS observations of crustal motion to locate the earthquake's rupture offshore to the west of Haida Gwaii, rather than beneath the islands. A close study of the Haida Gwaii mainshock by Kao et al. revealed the Pacific plate slid at a low angle below the North American plate on a previously suspected thrust fault, confirming the presence of subduction activity in the area. "This was an event on the thrust interface of the plate boundary system, confirming that there is a subduction system in the Haida Gwaii area," said Honn Kao, seismologist with the Geological Survey of Canada, who, along with his colleagues, examined the source parameters--causative faults, rupture processes and depths--of the mainshock and sequence of strong aftershocks. "The implication of a confirmed subduction zone is that in addition to the Queen Charlotte Fault, we now have another source which can produce devastating megathrust earthquakes in the area," said Kao. The aftershocks clustered around the periphery of the rupture zone, both on the seaward and landward side of the plate boundary, and reflected normal faulting behavior--caused by the bending, extending or stretching of rock--rather than the thrust faulting of the mainshock. "Our observations of normal faulting imply that the mainshock of the Haida Gwaii earthquake dramatically altered the stress field in the rupture zone, especially in a neighboring region," said Kao. The distribution of aftershocks occurred to the north of a previously identified seismic gap where large earthquakes have not occurred in historic times. The gap is located to the south of where the 1949 M8.1 Queen Charlotte earthquake ruptured. Though the Haida Gwaii earthquake may have activated some part of the Queen Charlotte Fault, said Kao, it was limited and did not relieve stress along the seismic gap. The Haida Gwaii rupture shook southeastern Alaska, and the northwest directivity of ground motion may have influenced the timing of the January 2013 Craig earthquake, suggests James et al.
in the introduction to the overall special issue. A report by Stephen Holtkamp and Natalia Ruppert at the University of Alaska Fairbanks examines 1,785 aftershocks in the Craig earthquake sequence, identifying a mix of faulting behavior that suggests the region is still in a state of transpression--the plates are both sliding past each other and colliding at an angle. The articles in this special issue will appear in print in early May and online in April. The special issue features three main themes. The regional tectonic framework and the nature of the interaction between the Pacific and North America plates at the Queen Charlotte Fault zone are presented in five papers. Three papers focus on the Craig earthquake and examine the main shock, aftershocks and crustal motions. Ten papers discuss the Haida Gwaii event. | Earthquakes | 2,015
April 3, 2015 | https://www.sciencedaily.com/releases/2015/04/150403095931.htm | California quake risk: Newly discovered link between Calaveras, Hayward faults means potentially larger earthquakes | University of California, Berkeley seismologists have proven that the Hayward Fault is essentially a branch of the Calaveras Fault that runs east of San Jose, which means that both could rupture together, resulting in a significantly more destructive earthquake than previously thought. | "The maximum earthquake on a fault is proportional to its length, so by having the two directly connected, we can have a rupture propagating across from one to the other, making a larger quake," said lead researcher Estelle Chaussard, a postdoctoral fellow in the Berkeley Seismological Laboratory. "People have been looking for evidence of this for a long time, but only now do we have the data to prove it." The 70-kilometer-long Hayward Fault is already known as one of the most dangerous in the country because it runs through large population areas from its northern limit on San Pablo Bay at Richmond to its southern end south of Fremont. In an update of seismic hazards last month, the U.S. Geological Survey estimated a 14.3 percent likelihood of a magnitude 6.7 or greater earthquake on the Hayward Fault in the next 30 years, and a 7.4 percent chance on the Calaveras Fault. These are based on the assumption that the two faults are independent systems, and that the maximum quake on the Hayward Fault would be between magnitudes 6.9 and 7.0. Given that the Hayward and Calaveras faults are connected, the energy released in a simultaneous rupture could be 2.5 times greater, or a magnitude 7.3 quake. "A rupture from Richmond to Gilroy would produce about a 7.3 magnitude quake, but it would be even greater if the rupture extended south to Hollister, where the Calaveras Fault meets the San Andreas Fault," Chaussard said. Chaussard and her colleagues, including Roland Bürgmann, a UC Berkeley professor of earth and planetary science, reported their findings in the journal Geophysical Research Letters. Chaussard said there has always been ambiguity about whether the two faults are connected. The Hayward Fault ends just short of the Calaveras Fault, which runs about 123 kilometers from north of Danville south to Hollister in the Salinas Valley. The UC Berkeley team used 19 years of satellite data to map ground deformation using interferometric synthetic aperture radar (InSAR) and measure creep along the southern end of the Hayward Fault, and found, surprisingly, that the creep didn't stop south of Fremont, the presumed southern end of the fault, but continued as far as the Calaveras Fault. "We found that it continued on another 15 kilometers and that the trace merged with the trace of the Calaveras Fault," she said. In addition, seismic data show that micro-earthquakes on these faults 3-5 kilometers underground also merge. "With this evidence from surface creep and seismicity, we can argue for a direct junction on the surface and at depth for the two faults." Both are strike-slip faults -- the western side moves northward relative to the eastern side. The researchers found that the underground portion of the Hayward Fault meets the Calaveras Fault 10 kilometers farther north than where the creeping surface traces of both faults meet.
This geometry implies that the Hayward Fault dips at an angle where it meets the Calaveras Fault. Chaussard said that the many years of InSAR data, in particular from the European Space Agency's ERS and Envisat satellites from 1992 to 2011, were critical to connecting the two faults. Creep, or the surface movement along a fault, is evidenced by offset curbs, streets and home foundations. It is normally determined by measuring points on opposite sides of a fault every few years, but that is hard to do along an entire fault or in difficult terrain. InSAR provides data over large areas even in vegetated terrains and outside of urban areas, and with the repeated measurements over many years InSAR can detect deformation with a precision of 2 millimeters per year. "With InSAR, we have access to much larger spatial coverage," said Chaussard, who has been expanding the use of InSAR to measure water resources and now ground deformation that occurs between earthquakes. "Instead of having a few points, we have over 200,000 points in the Bay Area. And we have access to areas we couldn't go to on the ground." She noted that while creep relieves stress on a fault gradually, eventually the surface movement must catch up with the long-term underground fault movement. The Hayward Fault moves at about 10 millimeters per year underground, but it creeps at only 3 to 8 millimeters per year. Earthquakes occur when the surface suddenly catches up with a fault's underground long-term movement. "Creep is delaying the accumulation of stress needed to get to an earthquake, but it does not cancel the earthquake," Chaussard said. | Earthquakes | 2,015
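The "2.5 times greater energy, or a magnitude 7.3 quake" figure in the Hayward-Calaveras report above follows from the standard Gutenberg-Richter energy-magnitude relation. The short sketch below reproduces that arithmetic; it uses only the generic seismological formula, not code from the study.

```python
# How "2.5 times more energy" maps to magnitude, via the standard
# Gutenberg-Richter energy-magnitude relation log10(E) = 1.5*M + 4.8
# (E in joules). Generic formula, not code from the Berkeley study.
import math

def energy_joules(m):
    """Radiated seismic energy implied by moment magnitude m."""
    return 10 ** (1.5 * m + 4.8)

m_single = 7.0   # Hayward Fault rupturing alone
ratio = 2.5      # energy of a joint Hayward-Calaveras rupture
m_joint = m_single + (2.0 / 3.0) * math.log10(ratio)

print(f"E(M7.0) = {energy_joules(m_single):.2e} J")
print(f"2.5x that energy corresponds to M {m_joint:.2f}")  # ~M 7.27, i.e. ~7.3
```

Because magnitude is logarithmic, even a 2.5-fold jump in energy moves the magnitude by only about 0.3 units -- which is why a connected Hayward-Calaveras system matters more than the modest-sounding number suggests.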
March 30, 2015 | https://www.sciencedaily.com/releases/2015/03/150330122406.htm | Seabed samples rewrite earthquake history near Istanbul | Major earthquakes along the North Anatolian Fault (NAF) system, which runs beneath the Marmara Sea, have repeatedly struck what is present-day Istanbul and the surrounding region, but determining the recurrence rate has proven difficult since the faults are offshore. Cores of marine sediment reveal an earthquake history of the Cinarcik Segment, a main branch of the NAF, and suggest a seismic gap where the next earthquake is likely to rupture, as detailed in a newly published study. | The area has experienced several large earthquakes (>M6), and the scientific community has debated the exact location of the ruptures along the North Anatolian Fault, which extends nearly 750 miles across northern Turkey and into the Aegean Sea. Most of the deformation on the fault is localized on the northern branch of the NAF, which crosses the Marmara Sea. "The important part of this study is that it assigns past earthquakes to specific segments of the fault," said lead author Laureen Drab, a seismologist at the Ecole Normale Superieure in Paris, France. "Knowing which segment ruptured when has a big impact on the recurrence rate of earthquakes on the main fault segment that affects Istanbul." Drab and her colleagues examined two cores of sediment deposits removed from the seabed to identify and date widespread quake-induced disturbances. Large earthquakes on submarine faults can cause underwater landslides, shaking up sediments that result in rapidly deposited layers, or turbidites, of silt and sand of jumbled grain sizes, minerals and specific geochemical properties. Radiocarbon dating and other tests of the two core samples identified the age and timing of deposits. Combining the historical catalogue and the new data from the core samples, Drab reconstructed the timing of earthquakes along the NAF's main segment. The turbidites reveal six large earthquake-related events, from 136 to 1896 AD, along the Cinarcik Fault and reassigned the 1766 AD rupture previously thought to have occurred on the Cinarcik Fault to another segment. "The combined records show three entire rupture sequences on the NAF, with the current sequence incomplete along the Cinarcik Fault," said Drab. "Based on this new data, we see that there is a seismic gap on the Cinarcik Segment, which, from my point of view, is where the next earthquake is likely to occur." | Earthquakes | 2,015
March 24, 2015 | https://www.sciencedaily.com/releases/2015/03/150324101006.htm | Soils retain, contain radioactivity in Fukushima | Radiation suddenly contaminates the land your family has farmed and lived on for generations. Can soil play a role in protecting crops and human health? | Research in Fukushima, Japan may lend an answer. On March 11, 2011, a magnitude 9.0 earthquake and tsunami caused widespread destruction in Japan. This included the Fukushima Daiichi nuclear power plant. The plant's nuclear meltdown released a large amount of radioactivity into the environment. The Japanese government evacuated over 100,000 people in the 30 km zone around the plant. Lead researcher Atsushi Nakao's study is the first to investigate the soil's physical and chemical properties in rice fields around the Fukushima site. The study, published in the Journal of Environmental Quality, examined factors affecting soil-to-plant transfer of radioactive cesium (radiocesium) in the Fukushima area. Radiocesium dissolves easily in water, allowing it to spread quickly. However, different soils have the ability to retain various toxins and prevent them from spreading or entering the food chain. The authors measured the ability of a large number of soil samples collected from Fukushima to intercept radiocesium. They found success depends on various factors. One key factor is the presence of rough or weathered edges of certain minerals, such as mica, in the soil. These rough edges catch the radiocesium and prevent its movement. The abundance of these sites is known as the frayed edge site (FES) concentration. Nakao explains, however, that "quantification of the FES with a simple experiment has proven difficult." A "surrogate" measurement used by soil scientists is the radiocesium interception potential (RIP). This measurement is time-consuming and requires specialized facilities, preventing its measurement at local institutes. Thus, Nakao's study looked for -- and found -- other, more easily measured soil properties that predict the RIP of a soil. "These findings may be useful in screening soils that are particularly vulnerable to transferring radiocesium to plants grown in them," Nakao says. "However, the amounts of radiocesium transferred to plants are normally negligible, because most of the radiocesium is strongly fixed on the frayed edge site." For example, soils with rich organic material show low RIP. The high concentration (over 6%) of carbon and the low acidity of the richer soil decrease the RIP. Soils with higher phosphate absorption also show lower RIP. However, soils with high clay or silt content adsorb radiocesium more readily. The higher mica content in these soils means more frayed edges and a higher RIP. Additionally, potassium interferes with the root's ability to absorb radiocesium. Although the study obtained a rough relationship between RIP and soil properties, Nakao observes that additional factors may be involved. He suggests that future research should examine the contribution of mica in the silt or sand fraction to RIP. | Earthquakes | 2,015
March 23, 2015 | https://www.sciencedaily.com/releases/2015/03/150323130849.htm | A stiff new layer in Earth's mantle | By crushing minerals between diamonds, a University of Utah study suggests the existence of an unknown layer inside Earth: part of the lower mantle where the rock gets three times stiffer. The discovery may explain a mystery: why slabs of Earth's sinking tectonic plates sometimes stall and thicken 930 miles underground. | The findings were published today in the journal Nature Geoscience. "The Earth has many layers, like an onion," says Lowell Miyagi, an assistant professor of geology and geophysics at the University of Utah. "Most layers are defined by the minerals that are present. Essentially, we have discovered a new layer in the Earth. This layer isn't defined by the minerals present, but by the strength of these minerals." Earth's main layers are the thin crust 4 to 50 miles deep (thinner under oceans, thicker under continents), a mantle extending 1,800 miles deep and the iron core. But there are subdivisions. The crust and some of the upper mantle form 60- to 90-mile-thick tectonic or lithospheric plates that are like the top side of conveyor belts carrying continents and seafloors. Oceanic plates collide head-on with continental plates offshore from Chile, Peru, Mexico, the Pacific Northwest, Alaska, Kamchatka, Japan and Indonesia. In those places, the leading edge of the oceanic plate bends into a slab that dives or "subducts" under the continent, triggering earthquakes and volcanism as the slabs descend into the mantle, which is like the bottom part of the conveyor belt. The subduction process is slow, with a slab averaging roughly 300 million years to descend, Miyagi estimates. Miyagi and fellow mineral physicist Hauke Marquardt, of Germany's University of Bayreuth, identified the likely presence of a superviscous layer in the lower mantle by squeezing the mineral ferropericlase between gem-quality diamond anvils in presses. They squeezed it to pressures like those in Earth's lower mantle. Bridgmanite and ferropericlase are the dominant minerals in the lower mantle. The researchers found that ferropericlase's strength starts to increase at pressures equivalent to those 410 miles deep -- the upper-lower mantle boundary -- and the strength increases threefold by the time it peaks at pressure equal to a 930-mile depth. And when they simulated how ferropericlase behaves mixed with bridgmanite deep underground in the upper part of the lower mantle, they calculated that the viscosity or stiffness of the mantle rock at a depth of 930 miles is some 300 times greater than at the 410-mile-deep upper-lower mantle boundary. "The result was exciting," Miyagi says. "This viscosity increase is likely to cause subducting slabs to get stuck -- at least temporarily -- at about 930 miles underground. In fact, previous seismic images show that many slabs appear to 'pool' around 930 miles, including under Indonesia and South America's Pacific coast. This observation has puzzled seismologists for quite some time, but in the last year, there is new consensus from seismologists that most slabs pool." How stiff or viscous is the viscous layer of the lower mantle? On the pascal-second scale, the viscosity of water is 0.001, peanut butter is 200 and the stiff mantle layer is 1,000 billion billion (or 10 to the 21st power), Miyagi says. For the new study, Miyagi's funding came from the U.S.
National Science Foundation and Marquardt's from the German Science Foundation. "Plate motions at the surface cause earthquakes and volcanic eruptions," Miyagi says. "The reason plates move on the surface is that slabs are heavy, and they pull the plates along as they subduct into Earth's interior. So anything that affects the way a slab subducts is, up the line, going to affect earthquakes and volcanism." He says the stalling and buckling of sinking slabs due to a stiff layer in the mantle may explain some deep earthquakes higher up in the mantle; most quakes are much shallower and in the crust. "Anything that would cause resistance to a slab could potentially cause it to buckle or break higher in the slab, causing a deep earthquake." Miyagi says the stiff upper part of the lower mantle also may explain different magmas seen at two different kinds of seafloor volcanoes. Recycled crust and mantle from old slabs eventually emerges as new seafloor during eruptions of volcanic vents along midocean ridges -- the rising end of the conveyor belt. The magma in this new plate material has the chemical signature of more recent, shallower, well-mixed magma that had been subducted and erupted through the conveyor belt several times. But in island volcanoes like Hawaii, created by a deep hotspot of partly molten rock, the magma is older, from deeper sources and less well-mixed. Miyagi says the viscous layer in the lower mantle may be what separates the sources of the two different magmas that supply the two different kinds of volcanoes. Another implication of the stiff layer is that "if you decrease the ability of the rock in the mantle to mix, it's also harder for heat to get out of the Earth, which could mean Earth's interior is hotter than we think," Miyagi says. He says scientists believe the average temperature and pressure 410 miles deep at the upper-lower mantle boundary is 2,800 degrees Fahrenheit and 235,000 times the atmospheric pressure on Earth's surface. He calculates that at the viscous layer's stiffest area, 930 miles deep, the temperature averages 3,900 degrees Fahrenheit and pressure is 640,000 times the air pressure at Earth's surface. Such conditions prevent geophysicists from visiting Earth's mantle, so "we know a lot more about the surface of Mars than we do Earth's interior," Miyagi says. "We can't get down there, so we have to do experiments to see how these minerals behave under a wide range of conditions, and use that to simulate the behavior of the Earth." To do that, "you take two gem quality diamonds and trap a sample between the tips," he says. "The sample is about the diameter of a human hair. Because the diamond tips are so small, you generate very high pressure just by turning the screws on the press by hand with Allen wrenches." Using diamond anvils, the researchers squeezed thousands of crystals of ferropericlase at pressures up to 960,000 atmospheres.
They used ferropericlase with 10 percent and 20 percent iron to duplicate the range found in the mantle. To observe and measure the spacing of atoms in ferropericlase crystals as they were squeezed in diamond anvils, the geophysicists bombarded the crystals with X-rays from an accelerator at Lawrence Berkeley National Laboratory in California, revealing the strength of the mineral at various pressures and enabling the simulations that show how the rock becomes 300 times more viscous at the 930-mile depth than at 410 miles. The finding was a surprise because researchers previously believed that viscosity varied only a little bit at temperatures and pressures in the planet's interior. The study's simulations also determined that just below the 930-mile-deep zone of highest viscosity, slabs sink more easily again as the lower mantle becomes less stiff, which happens because atoms can move more easily within ferropericlase crystals. Descending slabs have been seen as deep as the core-mantle boundary 1,800 miles underground. As the bottom of the conveyor-belt-like mantle slowly moves, the slabs mix with the surrounding rock before the mixture erupts anew millions of years later and thousands of miles away at midocean ridges. | Earthquakes | 2,015
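The pressures and viscosities quoted in the mantle-layer article above are easier to compare once they are put in common units. The short sketch below does only that unit bookkeeping, using the article's own figures and standard conversion constants; nothing else is assumed.

```python
# Quick unit checks on the figures quoted in the article above:
# converting the quoted pressures to gigapascals and putting the
# viscosity values on one scale. Only standard conversion constants
# are assumed.

ATM_PA = 101_325  # pascals per standard atmosphere

for label, atm in [("410-mile boundary", 235_000),
                   ("930-mile viscosity peak", 640_000),
                   ("max diamond-anvil pressure", 960_000)]:
    print(f"{label}: {atm:,} atm = {atm * ATM_PA / 1e9:.0f} GPa")

viscosity_pa_s = {"water": 1e-3,
                  "peanut butter": 2e2,
                  "stiff mantle layer": 1e21}
for name, eta in viscosity_pa_s.items():
    print(f"{name}: {eta:.0e} Pa*s "
          f"({eta / viscosity_pa_s['water']:.0e}x water)")
```

Run, this shows the diamond anvils reaching roughly 97 GPa -- comfortably bracketing the ~24-65 GPa range between the 410-mile and 930-mile depths -- and puts the mantle layer some 24 orders of magnitude above water in viscosity.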
March 19, 2015 | https://www.sciencedaily.com/releases/2015/03/150319193120.htm | An 'octopus' robot with eight limbs developed to clear rubble in Fukushima, Japan | Researchers in Japan have jointly developed a robot with four arms and four crawlers which can perform multiple tasks simultaneously to help clean up the rubble left after the 2011 quake-tsunami disasters in Minamisoma, Fukushima. | On March 13th, a remote-controlled four-armed, four-wheeled crawler robot designed to clear rubble and save lives in areas with complex terrain was unveiled at the Kikuchi plant in Minamisoma, Fukushima, previously a designated no-go zone from the nuclear disaster crisis. The robot is a collaborative effort between Waseda University's Future Robotics Organization and the Kikuchi Corporation. The robot's name, "Octopus," derives from the fact that it has eight limbs; it is 1.7 meters in height and weighs 70 kilograms. The robot can be equipped with a fiber laser capable of cutting through stone and a grappler capable of dealing with radioactive waste. It is expected to have a wide range of applications, including assistance in lifesaving efforts for people trapped in buildings destroyed by earthquakes, tsunamis and fires, as well as radioactive waste management. Robots of this variety have generally been focused on performing one function at a time on flat terrain. However, the Octopus robot's ability to utilize its four wheels and crawlers to traverse complex terrain and rubble, and its ability to utilize all four of its arms simultaneously thanks to its hydraulic capabilities, allows it to perform a wide range of tasks such as clearing rubble and fallen trees and extinguishing fires. When traversing uneven terrain, the robot uses its two rear arms to support its body while climbing with its two front arms and crawlers. Each arm is capable of lifting objects of up to 200 kilograms, and all four arms can be used to lift the robot's body from the ground. Robots that can utilize four arms simultaneously are very rare. Presently the robot is operated by two people from a remote location but is expected to be operated by one in the future. The Octopus robot was revealed at a conference for the Fukushima Disaster and Medical Welfare Project. Professor Masakatsu Fujie's robot was presented alongside other robots designed to deal with the issues presented by the Fukushima nuclear disaster and assist in reconstruction efforts. | Earthquakes | 2,015
March 16, 2015 | https://www.sciencedaily.com/releases/2015/03/150316092959.htm | Finding fault: New information may help understand earthquakes | New modeling and analyses of fault geometry in the Earth's crust by geoscientist Michele Cooke and colleagues at the University of Massachusetts Amherst are advancing knowledge about fault development in regions where one geologic plate slides past or over another, such as along California's San Andreas Fault and the Denali Fault in central Alaska. | Findings may help more accurately predict earthquake hazards and allow scientists to better understand how Earth evolved. Geologists have long been uncertain about the factors that govern how new faults grow, says Cooke, who was recently elected to the board of directors for the Southern California Earthquake Center. This month, in an early online journal issue, she and her colleagues report their findings. Fault efficiency refers to a dynamic fault system's effectiveness at transforming input energy from the motions of tectonic plates into movement. For example, a straight fault is more efficient at accommodating strain than a curving fault. An important question is how the efficiency of fault bends evolves with increasing deformation of Earth's crust. Master's student Alex Hatem, who did much of the work in these experiments, along with Cooke and postdoctoral scholar Elizabeth Madden, reports that fault efficiency increases as new faults grow and link, then reaches a steady state. This implies that bends along crustal faults may persist. The straight fault is the most efficient geometry, Cooke points out. "It's interesting that bends increase in efficiency through new fault growth but they never become as efficient as straight faults." Because earthquakes may stop at restraining bends, the work further suggests a new understanding: faults segmented by restraining bends may remain in a sort of stasis rather than developing into systems where earthquakes would rupture the entire length of the fault. Cooke explains that, compared with a straight fault, a fault with a bend is more likely to have smaller earthquakes that stop at the bend rather than long earthquake ruptures that pass all the way along the fault. Her UMass Amherst lab is one of only a handful worldwide to use a state-of-the-art modeling technique based on kaolin clay rather than sand to understand the behavior of the Earth's crust. The lab's advanced techniques with the clay include pixel tracking and other quantitative measurements that allow rich details to be obtained from the models and compared with faults around the world. When scaled properly, data from clay experiments conducted over several hours in a table-top device are useful in modeling restraining bend evolution over thousands of years and at the scale of tens of kilometers.
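The "when scaled properly" step above is the heart of analog modeling: a fixed length ratio and time ratio map centimeters and hours in the clay box onto kilometers and millennia in the crust. The sketch below illustrates that bookkeeping; the specific scale factors are my own assumed round numbers for the sake of the example, not the scaling the paper derives from clay strength and model dimensions.

```python
# Illustrative scaling from a table-top kaolin experiment to a crustal
# fault system. The scale factors below are assumptions chosen for the
# example; the actual study derives its own scaling laws.

LENGTH_SCALE = 1e-5   # assumed: model length / nature length (1 cm ~ 1 km)
TIME_SCALE = 3e-8     # assumed: model time / nature time (~1 s ~ 1 yr)
SECONDS_PER_YEAR = 3.15e7

def to_nature(model_length_cm, model_duration_hr):
    """Map a clay-box size and run time onto crustal scales."""
    length_km = (model_length_cm / 100) / LENGTH_SCALE / 1000
    years = (model_duration_hr * 3600) / TIME_SCALE / SECONDS_PER_YEAR
    return length_km, years

length_km, years = to_nature(model_length_cm=30, model_duration_hr=4)
print(f"A 30 cm, 4-hour clay model ~ a {length_km:.0f} km fault system "
      f"deforming for ~{years:,.0f} years")
# With these assumed factors, a few lab hours stand in for the
# "thousands of years" and "tens of kilometers" quoted above.
```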
Digital image correlation allows Cooke's team to measure the details of deformation throughout the experiments. For this work, they conducted kaolin experiments to model strike-slip rates measured in a restraining bend along a Dead Sea fault in Israel, fault growth along the Denali Fault in Alaska, and deformation through the San Gorgonio Knot along the San Andreas Fault in southern California. "We apply the results to the southern San Andreas Fault where a restraining bend has persisted for 25 million years, but during that time its active fault configuration has changed in ways that resemble what we observed in our experiments," the authors note. They add, "Results of the clay box experiments provide critical insights into the evolution of restraining bends. Because the experiments scale to crustal lengths and strengths, we can extrapolate from the experiments to kilometer-scale systems. The models show progressive deformation by the successive outboard growth of dipping faults in some cases and persistence of vertical faults in others. Understanding the conditions that foster these distinct patterns helps us interpret the geometry and loading of faults within Earth's crust in order to better constrain earthquake behavior." Cooke says, "Using new digital image correlation techniques allows us very detailed measurements of the displacement in the experiments to provide insights we didn't have before. For the fault bends that we tested, the new analysis reveals that efficiency of the faults increases as new faults grow and link and then reaches a steady state. This suggests that restraining bends along crustal faults may persist." | Earthquakes | 2,015
March 10, 2015 | https://www.sciencedaily.com/releases/2015/03/150310174541.htm | New long-term earthquake forecast for California | A new California earthquake forecast by the U.S. Geological Survey and partners revises scientific estimates for the chances of having large earthquakes over the next several decades. | The Third Uniform California Earthquake Rupture Forecast, or UCERF3, improves upon previous models by incorporating the latest data on the state's complex system of active geological faults, as well as new methods for translating these data into earthquake likelihoods. The study confirms many previous findings, sheds new light on how future earthquakes will likely be distributed across the state and estimates how big those earthquakes might be. Compared to the previous assessment issued in 2008, UCERF2, the estimated rate of earthquakes around magnitude 6.7, the size of the destructive 1994 Northridge earthquake, has gone down by about 30 percent. The expected frequency of such events statewide has dropped from an average of one per 4.8 years to about one per 6.3 years. However, in the new study, the estimate for the likelihood that California will experience a magnitude 8 or larger earthquake in the next 30 years has increased from about 4.7% for UCERF2 to about 7.0% for UCERF3. "The new likelihoods are due to the inclusion of possible multi-fault ruptures, where earthquakes are no longer confined to separate, individual faults, but can occasionally rupture multiple faults simultaneously," said lead author and USGS scientist Ned Field. "This is a significant advancement in terms of representing a broader range of earthquakes throughout California's complex fault system." Two kinds of scientific models are used to inform decisions of how to safeguard against earthquake losses: an Earthquake Rupture Forecast, which indicates where and when the Earth might slip along the state's many faults, and a Ground Motion Prediction model, which estimates the ground shaking given one of the fault ruptures. The UCERF3 model is of the first kind, and is the latest earthquake-rupture forecast for California. It was developed and reviewed by dozens of leading scientific experts from the fields of seismology, geology, geodesy, paleoseismology, earthquake physics and earthquake engineering. The USGS partner organizations that contributed to this product include the Southern California Earthquake Center, the California Geological Survey and the California Earthquake Authority. "We are fortunate that seismic activity in California has been relatively low over the past century. But we know that tectonic forces are continually tightening the springs of the San Andreas fault system, making big quakes inevitable," said Tom Jordan, Director of the Southern California Earthquake Center and a co-author of the study. "The UCERF3 model provides our leaders and the public with improved information about what to expect, so that we can better prepare." | Earthquakes | 2,015
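The 30-year probabilities in the UCERF3 report above can be translated into average recurrence intervals with a simple time-independent (Poisson) assumption; the conversion below is that generic translation, not a calculation from the forecast itself, which uses more sophisticated time-dependent models.

```python
# Converting UCERF 30-year probabilities into equivalent average
# recurrence intervals, assuming a simple time-independent Poisson
# model: P(at least one event in t years) = 1 - exp(-rate * t).
# This is a rough translation, not part of the UCERF3 methodology.
import math

def recurrence_years(prob, window_years=30):
    """Mean recurrence interval implied by the stated window probability."""
    annual_rate = -math.log(1.0 - prob) / window_years
    return 1.0 / annual_rate

for label, p in [("UCERF2, M>=8 in 30 yr", 0.047),
                 ("UCERF3, M>=8 in 30 yr", 0.070)]:
    print(f"{label}: ~1 per {recurrence_years(p):,.0f} years")
# The jump from ~4.7% to ~7.0% shortens the implied statewide M>=8
# recurrence from roughly one per ~620 years to one per ~410 years.
```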
March 10, 2015 | https://www.sciencedaily.com/releases/2015/03/150310105228.htm | Friction means Antarctic glaciers more sensitive to climate change than we thought | One of the biggest unknowns in understanding the effects of climate change today is the melting rate of glacial ice in Antarctica. Scientists agree rising atmospheric and ocean temperatures could destabilize these ice sheets, but there is uncertainty about how fast they will lose ice. | The West Antarctic Ice Sheet is of particular concern to scientists because it contains enough ice to raise global sea level by up to 16 feet, and its physical configuration makes it susceptible to melting by warm ocean water. Recent studies have suggested that the collapse of certain parts of the ice sheet is inevitable. But will that process take several decades or centuries? Research by Caltech scientists now suggests that estimates of future rates of melt for the West Antarctic Ice Sheet--and, by extension, of future sea-level rise--have been too conservative. In a new study, published online on March 9 in the Journal of Glaciology, a team led by Victor Tsai, an assistant professor of geophysics at Caltech, explains why. Unlike other ice sheets that are moored to land above the ocean, most of West Antarctica's ice sheet is grounded on a sloping rock bed that lies below sea level. In the past decade or so, scientists have focused on the coastal part of the ice sheet where the land ice meets the ocean, called the "grounding line," as vital for accurately determining the melting rate of ice in the southern continent. "Our results show that the stability of the whole ice sheet and our ability to predict its future melting is extremely sensitive to what happens in a very small region right at the grounding line. It is crucial to accurately represent the physics here in numerical models," says study coauthor Andrew Thompson, an assistant professor of environmental science and engineering at Caltech. Part of the seafloor on which the West Antarctic Ice Sheet rests slopes upward toward the ocean in what scientists call a "reverse slope gradient." The end of the ice sheet also floats on the ocean surface so that ocean currents can deliver warm water to its base and melt the ice from below. Scientists think this "basal melting" could cause the grounding line to retreat inland, where the ice sheet is thicker. Because ice thickness is a key factor in controlling ice discharge near the coast, scientists worry that the retreat of the grounding line could accelerate the rate of interior ice flow into the oceans. Grounding line recession also contributes to the thinning and melting away of the region's ice shelves--thick, floating extensions of the ice sheet that help reduce the flow of ice into the sea. According to Tsai, many earlier models of ice sheet dynamics tried to simplify calculations by assuming that ice loss is controlled solely by viscous stresses, that is, forces that apply to "sticky fluids" such as honey--or in this case, flowing ice. The conventional models thus accounted for the flow of ice around obstacles but ignored friction. "Accounting for frictional stresses at the ice sheet bottom in addition to the viscous stresses changes the physical picture dramatically," Tsai says. In their new study, Tsai's team used computer simulations to show that even though Coulomb friction affects only a relatively small zone on an ice sheet, it can have a big impact on ice stream flow and overall ice sheet stability. In most previous models, the ice sheet sits firmly on the bed and generates a downward stress that helps keep it attached to the seafloor.
Furthermore, the models assumed that this stress remains constant up to the grounding line, where the ice sheet floats, at which point the stress disappears. Tsai and his team argue that their model provides a more realistic representation--in which the stress on the bottom of the ice sheet gradually weakens as one approaches the coasts and grounding line, because the weight of the ice sheet is increasingly counteracted by water pressure at the glacier base. "Because a strong basal shear stress cannot occur in the Coulomb model, it completely changes how the forces balance at the grounding line," Thompson says. Tsai says the idea of investigating the effects of Coulomb friction on ice sheet dynamics came to him after rereading a classic study on the topic by American metallurgist and glaciologist Johannes Weertman from Northwestern University. "I wondered how might the behavior of the ice sheet differ if one factored in this water-pressure effect from the ocean, which Weertman didn't know would be important when he published his paper in 1974," Tsai says. Tsai thought about how this could be achieved and realized the answer might lie in another field in which he is actively involved: earthquake research. "In seismology, Coulomb friction is very important because earthquakes are thought to be the result of the edge of one tectonic plate sliding against the edge of another plate frictionally," Tsai said. "This ice sheet research came about partly because I'm working on both glaciology and earthquakes." If the team's Coulomb model is correct, it could have important implications for predictions of ice loss in Antarctica as a result of climate change. Indeed, for any given increase in temperature, the model predicts a bigger change in the rate of ice loss than is forecasted in previous models. "We predict that the ice sheets are more sensitive to perturbations such as temperature," Tsai says. Hilmar Gudmundsson, a glaciologist with the British Antarctic Survey in Cambridge, UK, called the team's results "highly significant." "Their work gives further weight to the idea that a marine ice sheet, such as the West Antarctic Ice Sheet, is indeed, or at least has the potential to become, unstable," says Gudmundsson, who was not involved in the study. Glaciologist Richard Alley, of Pennsylvania State University, noted that historical studies have shown that ice sheets can remain stable for centuries or millennia and then switch to a different configuration suddenly. "If another sudden switch happens in West Antarctica, sea level could rise a lot, so understanding what is going on at the grounding lines is essential," says Alley, who also did not participate in the research. "Tsai and coauthors have taken another important step in solving this difficult problem," he says. | Earthquakes | 2,015
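The Coulomb picture described in the glacier study above -- basal resistance proportional to ice overburden minus water pressure, vanishing at flotation -- can be sketched in a few lines. The geometry and friction coefficient below are illustrative assumptions, not the study's parameters; only the densities and the flotation logic are standard.

```python
# Minimal sketch of the Coulomb basal-stress idea from the glacier study
# above: resistance scales with effective pressure (ice overburden minus
# water pressure), so it tapers to zero as the ice approaches flotation
# at the grounding line. Geometry and friction are assumed values.

RHO_ICE, RHO_SEA, G = 917.0, 1028.0, 9.81  # kg/m^3, kg/m^3, m/s^2
FRICTION = 0.5                              # assumed Coulomb coefficient

def coulomb_basal_stress_kpa(ice_thickness_m, bed_depth_m):
    """Basal shear resistance (kPa) for ice of the given thickness
    resting on a bed `bed_depth_m` below sea level."""
    overburden = RHO_ICE * G * ice_thickness_m
    water_pressure = RHO_SEA * G * bed_depth_m
    effective = max(overburden - water_pressure, 0.0)  # zero at flotation
    return FRICTION * effective / 1e3

for h in (1500, 1000, 700):
    print(f"ice {h} m thick over a 600 m deep bed: "
          f"{coulomb_basal_stress_kpa(h, 600):7.0f} kPa of resistance")
# Resistance collapses as the ice thins toward flotation
# (flotation thickness here: ~1028/917 * 600 m ~ 673 m).
```

This is why the Coulomb model "completely changes how the forces balance at the grounding line": unlike a constant basal stress, the frictional resistance necessarily goes to zero exactly where the ice begins to float.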
March 3, 2015 | https://www.sciencedaily.com/releases/2015/03/150303153849.htm | Pre-1950 structures suffered the most damage from August 2014 Napa quake | An analysis of buildings tagged red and yellow by structural engineers after the August 2014 earthquake in Napa links pre-1950 buildings and the underlying sedimentary basin to the greatest shaking damage, according to one of six reports on the Napa quake published in the March/April issue of Seismological Research Letters. | "This data should spur people to retrofit older homes," said John Boatwright, a geophysicist with the U.S. Geological Survey (USGS) in Menlo Park and the lead author of a study that analyzed buildings tagged by the City of Napa. The South Napa earthquake was the largest earthquake to strike the greater San Francisco Bay Area since the magnitude 6.9 Loma Prieta earthquake in 1989, damaging residential and commercial buildings from Brown's Valley through historic downtown Napa. "The larger faults, like the San Andreas and Hayward faults, get the public's attention, but lesser known faults, like the West Napa fault, can cause extensive damage. Unreinforced brick masonry and stone buildings have been shown to be especially vulnerable to earthquakes," said Erol Kalkan, a research structural engineer at USGS and guest editor of the special issue. The South Napa earthquake occurred on the West Napa Fault system, a recognized but poorly studied fault lying between the longer Rodgers Creek and Green Valley faults, and caused strong ground motions, as detailed in the paper by Tom Brocher et al. The mapped surface rupture was unusually large for a moderate quake, extending nearly eight miles from Cuttings Wharf in the south to just west of Alston Park in the north. An extensive sedimentary basin underlies much of Napa Valley, including the City of Napa. The basin, which may be as much as 2 km deep beneath the city, appears to have amplified the ground motion. A close look at the damaged buildings within the city revealed a clear pattern. "Usually I look to certain factors that influence ground motion at a specific site -- proximity to the fault rupture, directivity of the rupture process and the geology underneath the site," said Boatwright. "The source distance and the direction of rupture did not strongly condition the shaking damage in Napa." Boatwright et al. analyzed data provided by structural engineers who inspected and tagged damaged buildings after the earthquake. The 165 red tags (prohibited access) and 1,707 yellow tags (restricted access) stretched across the city but were primarily concentrated within the residential section that lies between State Route 29 and the Napa River, including the historic downtown area. Comparing the distribution of red- and yellow-tagged buildings to the underlying sedimentary basin, to the pre-1950 development of Napa and to the recent alluvial geology of Napa Valley reveals that the most severe damage correlates with the age of the buildings--pre-1950 construction--and their location within the basin. Less damaged areas to the east and west of central Napa lie outside of the sedimentary basin, and the moderately damaged neighborhoods to the north lie inside the basin but are of more recent construction. Although the city's buildings suffered extensive damage, there were few reports of ground failure, such as liquefaction and landslides. Brocher et al.
suggest that the timing of the earthquake near the end of the dry season, together with the three-year-long drought and the resulting low water table, inhibited liquefaction of the top layers of sandy deposits, sparing the area greater damage. | Earthquakes | 2,015
March 3, 2015 | https://www.sciencedaily.com/releases/2015/03/150303123927.htm | A new level of earthquake understanding: Surprise findings from San Andreas Fault rock sample | As everyone who lives in the San Francisco Bay Area knows, the Earth moves under our feet. But what about the stresses that cause earthquakes? How much is known about them? Until now, our understanding of these stresses has been based on macroscopic approximations. Now, the U.S. Department of Energy (DOE)'s Lawrence Berkeley National Laboratory (Berkeley Lab) is reporting the successful study of stress fields along the San Andreas fault at the microscopic scale, the scale at which earthquake-triggering stresses originate. | Working with a powerful microfocused X-ray beam at Berkeley Lab's Advanced Light Source (ALS), a DOE Office of Science User Facility, researchers applied Laue X-ray microdiffraction, a technique commonly used to map stresses in electronic chips and other microscopic materials, to study a rock sample extracted from the San Andreas Fault Observatory at Depth (SAFOD). The results could one day lead to a better understanding of earthquake events. "Stresses released during an earthquake are related to the strength of rocks and thus in turn to the rupture mechanism," says Martin Kunz, a beamline scientist with the ALS's Experimental Systems Group. "We found that the distribution of stresses in our sample were very heterogeneous at the micron scale and much higher than what has been reported with macroscopic approximations. This suggests there are different processes at work at the microscopic and macroscopic scales." Kunz is one of the co-authors of a paper describing this research. Most earthquakes occur when stress that builds up in rocks along active faults, such as the San Andreas, is suddenly released, sending out seismic waves that make the ground shake. The pent-up stress results from the friction caused by tectonic forces that push two plates of rock against one another. "In an effort to better understand earthquake mechanisms, several deep drilling projects have been undertaken to retrieve material from seismically active zones of major faults such as SAFOD," says co-author Wenk, a geology professor with the University of California (UC) Berkeley's Department of Earth and Planetary Science and the leading scientist of this study. "These drill-core samples can be studied in the laboratory for direct information about physical and chemical processes that occur at depth within a seismically active zone. The data can then be compared with information about seismicity to advance our understanding of the mechanisms of brittle failure in the Earth's crust from microscopic to macroscopic scales." Kunz, Wenk and their colleagues measured remnant or "fossilized" stress fields in fractured quartz crystals from a sample taken out of a borehole in the San Andreas Fault near Parkfield, California at a depth of 2.7 kilometers. The measurements were made using X-ray Laue microdiffraction, a technique that can determine elastic deformation with a high degree of accuracy.
Since minerals get deformed by the tectonic forces that act on them during earthquakes, measuring elastic deformation reveals how much stress acted on the minerals during the quake. "Laue microdiffraction has been around for quite some time and has been exploited by the materials science community to quantify elastic and plastic deformation in metals and ceramics, but has been so far only scarcely applied to geological samples," says co-author Tamura, a staff scientist with the ALS's Experimental Systems Group who spearheads the Laue diffraction program at the ALS. The measurements were obtained at ALS beamline 12.3.2, a hard (high-energy) X-ray diffraction beamline specialized for Laue X-ray microdiffraction. "ALS Beamline 12.3.2 is one of just a few synchrotron-based X-ray beamlines in the world that can be used to measure residual stresses using Laue micro diffraction," Tamura says. In their analysis, the Berkeley researchers found that while some of the areas within individual quartz fragments showed no elastic deformation, others were subjected to stresses in excess of 200 million pascals (about 30,000 psi). This is much higher than the tens of millions of pascals of stress reported in previous indirect strength measurements of SAFOD rocks. "Although there are a variety of possible origins of the measured stresses, we think these measured stresses are records of seismic events shocking the rock. It is the only mechanism consistent with the geological setting and microscopic observations of the rock," says co-author Chen of China's Xi'an Jiaotong University. The authors believe their Laue X-ray microdiffraction technique has great potential for measuring the magnitude and orientation of residual stresses in rocks, and that with this technique quartz can serve as a "paleo-piezometer" for a variety of geological settings and different rock types. "Understanding the stress fields under which different types of rock fail will help us better understand what triggers earthquakes," says Kunz. "Our study could mark the beginning of a whole new era of quantifying the forces that shape the Earth." | Earthquakes | 2,015
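What Laue microdiffraction actually measures is elastic lattice strain, which is then converted to stress. The back-of-envelope below shows what the 200 MPa figure quoted in the Berkeley article implies, using Hooke's law; the quartz stiffness is an assumed, rough textbook-scale value, not a number from the paper.

```python
# Back-of-envelope for the stress figures in the article above: what a
# 200 MPa residual stress implies as elastic lattice strain (the
# quantity Laue diffraction actually sees) and in psi. The quartz
# modulus is an assumed, rough textbook-scale value.

YOUNGS_MODULUS_QUARTZ = 95e9   # Pa, assumed illustrative stiffness
PSI_PER_PA = 1 / 6894.76

stress_pa = 200e6
strain = stress_pa / YOUNGS_MODULUS_QUARTZ   # Hooke's law: strain = stress/E
print(f"200 MPa = {stress_pa * PSI_PER_PA:,.0f} psi")   # ~29,000 psi
print(f"implied lattice strain ~ {strain:.1e}")          # ~2e-3
print(f"i.e. atomic spacings shifted by ~{strain * 100:.2f}%")
# Shifts of a few tenths of a percent in lattice spacing are readily
# resolvable with synchrotron Laue microdiffraction.
```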
February 19, 2015 | https://www.sciencedaily.com/releases/2015/02/150219133132.htm | Impact of tsunami on the Columbia River | Engineers at Oregon State University have completed one of the most precise evaluations yet of the impact of a major tsunami event on the Columbia River, the forces most important in controlling water flow, and the areas that might be inundated. | They found, in general, that tidal stages are far more important than river flow in determining the impact of a tsunami; that it would have its greatest effect at the highest tides of the year; and that a tsunami would be largely dissipated within about 50 miles of the river's mouth, near Longview, Wash. Any water level increases caused by a tsunami would be so slight as to be almost immeasurable around the Portland metropolitan area or Bonneville Dam, the study showed. But water could rise as much as 13 feet just inside the mouth of the Columbia River, and almost 7 feet within a few miles of Astoria. "There have been previous models of Columbia River run-up as a result of a tsunami, but they had less resolution than this work," said David Hill, an associate professor of civil engineering in the OSU College of Engineering. "We carefully considered the complex hydrodynamics, subsidence of grounds that a tsunami might cause, and the impacts during different scenarios." The impact of tsunamis on rivers is difficult to predict, researchers say, because many variables are involved that can either dampen or magnify their effect. Such factors can include the width and shape of river mouths, bays, river flow, tidal effects, and other forces. But the major tsunami in Japan in 2011, which was caused by geologic forces similar to those facing the Pacific Northwest, also included significant inland reach and damage on local rivers. As a result, researchers are paying increased attention to the risks facing residents along such rivers. The OSU research has been published in the Journal of Waterway, Port, Coastal and Ocean Engineering, by Hill and OSU graduate student Kirk Kalmbacher. It's based on a major earthquake on the Cascadia Subduction Zone and a resulting tsunami, with simulations done at different river flows and at high, low, flood and ebb tides. Notably, a tsunami adds the least additional wave height at high tide, yet its overall flooding impact is then the greatest because tide levels are already so high. Because of complex hydrodynamic interactions, the study also found that only on a flood tide would water actually wash up and over the southern spit of the Columbia River mouth, with some local flooding based on that. Tides, overall, had much more impact on the reach of a tsunami than did the amount of water flowing in the river. "We were a little surprised that the river's water flow didn't really matter that much," Hill said. "The maximum reach of a tsunami on the Columbia will be based on the tidal level at the time, and of course the magnitude of the earthquake causing the event." Based on a maximum 9.0 magnitude earthquake and associated tsunami at the highest tide of the year, the research concluded that water could rise as much as 13 feet just inside the river's mouth and almost 7 feet near Astoria, with effects largely dissipated by Longview and nearly immeasurable around Portland and Bonneville Dam. Maps have been developed as a result of this research that make more precise estimates of the areas which might face tsunami-induced flooding. They should aid land owners and land use planners, Hill said, in making improved preparations for an event that researchers now say is inevitable in the region's future.
Experts believe this region faces subduction zone earthquakes every 300-600 years, and the last one occurred in January 1700. There are some noted differences in the projections between these newer maps and older ones, Hill said. | Earthquakes | 2,015
February 11, 2015 | https://www.sciencedaily.com/releases/2015/02/150211123741.htm | Buildings with 'rocking' technology would be more earthquake-resilient | Buildings that rock during an earthquake and return to plumb would withstand seismic shaking better than structural designs commonly used in vulnerable zones of California and elsewhere, a Case Western Reserve University researcher has found. | Those buildings would also be more easily and cheaply repaired and put back into use more quickly, said Michael Pollino, an assistant civil engineering professor at Case School of Engineering. Although Pollino didn't invent the technology, he developed a computer model that compares what are called "rocking steel-braced frames" to current earthquake standards used in low- to mid-rise buildings. His findings are featured in a newly published journal article. "Currently, engineers are designing low-rise structures for an earthquake that has a 10 percent chance of occurring in a 50-year lifetime," he said. "We accept there will be damage, but no collapse or loss of life." "But what about an event that has a 50 percent chance of occurring?" he continued. "You may still have to tear the building down afterward... I think this design should do more to make the building usable and repairable afterward." Pollino is among a growing number of researchers who are finding advantages to the design, which has not yet made it into practice. There are still details to investigate. He and colleagues are discussing forming a technical committee of civil engineers that would advance the technology into practice. Pollino's modeling suggests optimal sizes for two key components of rocking steel-braced frames: viscous damping devices, which are akin to shock absorbers, and steel-yielding devices, which have been likened to electrical fuses because they limit the amount of force transferred to the rest of the structure. But unlike fuses that break to prevent an electrical overload, the steel in steel-yielding devices stretches back and forth during a quake, dissipating seismic energy that would otherwise take its toll on the building structure and contents. Buildings are typically constructed to resist the vertical loads of gravity and weight, but earthquakes create horizontal or lateral loads. Current quake designs rely on building deformation and damage to absorb the loads and prevent collapse during quakes. The loads will stretch and deform or push and buckle traditional braces or the heavy joints where beams and columns meet. The rocking frame would provide a better alternative, Pollino said. To enable a three-story building to rock, the columns of the braced frame are not anchored to the building foundation, but tethered to the foundation by dampers and steel-yielding devices. When seismic shaking strikes, the building rocks as the frame lifts off the foundation and tilts.
The devices stretch and compress, dissipating seismic energy. A restoring force provided by the building's own self-weight and post-tensioning strands enables the building to return to plumb when the quake has subsided. To understand what's happening inside the building, Pollino simulated and measured the motion passed from the ground to the floors of the building, including deformations and accelerations that tip bookshelves and damage air-conditioning and heating ducts, partition walls and plumbing. That information was added to the requirements for protecting the building frame to calculate the optimal size of the dampers and yielding devices and the locations of their connections to the foundation and frame. "Others who have looked at rocking steel-braced frames have come to the same conclusion: there are small upfront costs but clear benefits," Pollino said. Pollino is now applying for funding to begin physically testing designs in the university's structures laboratory. His goal is to help develop design standards for engineers building in earthquake zones. | Earthquakes | 2,015
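The self-centering behavior described above can be reduced to a static moment balance. A minimal sketch under stated assumptions, not Pollino's actual model: the frame is taken to pivot about one column base, self-weight and post-tensioning act at mid-width, the yielded devices resist at the far column, and every number is hypothetical.

```python
# Static re-centering check for a rocking braced frame (illustrative only).
# Assumed geometry: frame pivots about one column base; weight and
# post-tensioning act at mid-width, the yielding devices at the far column.
def recenters(weight_kn: float, pt_force_kn: float,
              device_force_kn: float, base_width_m: float) -> bool:
    restoring_knm = (weight_kn + pt_force_kn) * base_width_m / 2.0
    resisting_knm = device_force_kn * base_width_m
    return restoring_knm > resisting_knm

# Hypothetical numbers: a 2000 kN frame with 800 kN of post-tensioning
# re-centers against 1200 kN of residual device force.
print(recenters(2000.0, 800.0, 1200.0, base_width_m=6.0))  # True
```

Design methods for self-centering systems typically enforce an inequality of this kind, so that the residual strength of the energy-dissipating devices cannot hold the structure out of plumb after the shaking stops.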
February 10, 2015 | https://www.sciencedaily.com/releases/2015/02/150210155930.htm | Earthquake activity linked to injection wells may vary by region | The Williston Basin in the north-central U.S. has produced fewer earthquakes linked to wastewater injection than comparable regions in Texas, suggesting the link between seismicity and production activities may vary by region, according to a new study published in the journal Seismological Research Letters (SRL). | Ongoing since the 1950s, petroleum and gas production in the Williston Basin, underlying parts of Montana, North Dakota, South Dakota and Saskatchewan in Canada, changed in recent years to include hydraulic fracturing and horizontal drilling. Scientists from the University of Texas at Austin took advantage of new monitoring data to explore the connection between seismicity and petroleum production near the Bakken Formation, an area of historically low seismicity, but with a recent history of increased hydraulic fracturing and wastewater disposal. "I'm looking at the relationship between seismicity and industry activity across different geographical areas," said Cliff Frohlich, lead author of the study and associate director of the Institute for Geophysics at the University of Texas at Austin, who previously conducted similar studies, including one of the Barnett Shale near Fort Worth, Texas. Frohlich and his colleagues analyzed data recorded by the EarthScope USArray, a National Science Foundation-funded network of temporary broadband seismometers, between September 2008 and May 2011, and from IHS, Inc., a private company that compiles information about petroleum operations from state regulatory sources. The authors identified nine regional earthquakes in the Williston Basin, comprising an area of approximately 100,000 square kilometers. Three of the nine earthquakes occurred near active injection wells, suggesting a connection to the disposal of wastewater. Using a similar method, Frohlich's previous study of the Barnett Shale region in Texas found significantly greater induced seismicity -- 55 earthquakes within a 5,000 square kilometer region near Fort Worth. "Why are earthquakes triggered in some areas and not in others? It's an important question for regulators and the scientific community. Some answers are emerging," said Frohlich, who cites differences in geology, orientation of pre-existing faults, local fault strength, injection practices and the timing and duration of oil and gas production as factors that might influence seismicity. "Before we implement severe regulations or schemes to manage injection activity in a particular region, we need to do the homework -- survey the relationship between seismicity and injection activity there to determine what's warranted," said Frohlich. The study, "Analysis of Transportable Array (USArray) Data Shows Earthquakes are Scarce Near Injection Wells in the Williston Basin, 2008-2011," will be published online February 11, 2015 and in the March/April print edition of SRL. In addition to Frohlich, co-authors include Jacob I. Walter and Julia F. W. Gale of the University of Texas at Austin. The Seismological Society of America publishes the journal SRL. | Earthquakes | 2,015
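Normalizing the quoted counts by survey area makes the regional contrast concrete; a quick back-of-the-envelope check using only the figures in the entry above:

```python
# Earthquake counts per unit area, from the numbers quoted in the article.
williston = 9 / 100_000   # 9 regional events over ~100,000 km^2
barnett = 55 / 5_000      # 55 events over ~5,000 km^2
print(f"Williston: {williston:.2e} events/km^2")
print(f"Barnett:   {barnett:.2e} events/km^2, ~{barnett / williston:.0f}x denser")
```

And only three of the nine Williston events were near active injection wells, so the disparity in plausibly injection-linked seismicity is starker still.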
February 9, 2015 | https://www.sciencedaily.com/releases/2015/02/150209113222.htm | Earth's surprise inside: Geologists unlock mysteries of the planet's inner core | Seismic waves are helping scientists to plumb the world's deepest mystery: the planet's inner core. | Thanks to a novel application of earthquake-reading technology, a research team at the University of Illinois and colleagues at Nanjing University in China have found that the Earth's inner core has an inner core of its own, which has surprising properties that could reveal information about our planet. Led by Xiaodong Song, a professor of geology at the U. of I., and visiting postdoctoral researcher Tao Wang, the team recently published its work. "Even though the inner core is small -- smaller than the moon -- it has some really interesting features," said Song. "It may tell us about how our planet formed, its history, and other dynamic processes of the Earth. It shapes our understanding of what's going on deep inside the Earth." Researchers use seismic waves from earthquakes to scan below the planet's surface, much like doctors use ultrasound to see inside patients. The team used a technology that gathers data not from the initial shock of an earthquake, but from the waves that resonate in the earthquake's aftermath. The earthquake is like a hammer striking a bell; much like a listener hears the clear tone that resonates after the bell strike, seismic sensors collect a coherent signal in the earthquake's coda. "It turns out the coherent signal enhanced by the technology is clearer than the ring itself," said Song. "The basic idea of the method has been around for a while, and people have used it for other kinds of studies near the surface. But we are looking all the way through the center of the Earth." Looking through the core revealed a surprise at the center of the planet -- though not of the type envisioned by novelist Jules Verne. The inner core, once thought to be a solid ball of iron, has some complex structural properties. The team found a distinct inner-inner core, about half the diameter of the whole inner core. The iron crystals in the outer layer of the inner core are aligned directionally, north-south. However, in the inner-inner core, the iron crystals point roughly east-west. Not only are the iron crystals in the inner-inner core aligned differently, they behave differently from their counterparts in the outer-inner core. This means that the inner-inner core could be made of a different type of crystal, or a different phase. "The fact that we have two regions that are distinctly different may tell us something about how the inner core has been evolving," Song said. "For example, over the history of the Earth, the inner core might have had a very dramatic change in its deformation regime. It might hold the key to how the planet has evolved. We are right in the center -- literally, the center of the Earth." | Earthquakes | 2,015
February 5, 2015 | https://www.sciencedaily.com/releases/2015/02/150205101921.htm | Methane seepage from Arctic seabed occurring for millions of years | Natural seepage of methane offshore of the Arctic archipelago Svalbard has been occurring periodically for at least 2.7 million years. Major events of methane emissions happened at least twice during this period, according to a new study. | We worry about the greenhouse gas methane. Its lifetime in the atmosphere is much shorter than that of CO2. Some 60 percent of the methane in the atmosphere comes from emissions from human activities. But methane is also a natural gas, with gigatonnes of it trapped under the ocean floor in the Arctic. And it is leaking, and has been leaking for longer than humans have roamed the Earth. "Our planet is leaking methane gas all the time. If you go snorkeling in the Caribbean you can see bubbles rising from the ocean floor at 25 meters depth. We studied this type of release, only in a much deeper, colder and darker environment. And found out that it has been going on, periodically, for as far back as 2.7 million years," says Andreia Plaza Faverola, the primary author behind a new paper on the findings. She is talking about Vestnesa Ridge in Fram Strait, a thousand meters under the surface of the Arctic Ocean, offshore of West-Svalbard. Here, enormous, 800-meter-high gas flares rise from the seabed. That's the size of the tallest manmade structure in the world -- Burj Khalifa in Dubai. "Half of Vestnesa Ridge is showing very active seepage of methane. The other half is not. But there are obvious pockmarks on the inactive half, cavities and dents in the ocean floor, that we recognized as old seepage features. So we were wondering what activates, or deactivates, the seepage in this area," says Plaza Faverola. She, and a team of marine geophysicists from CAGE, used the P-Cable technology to figure it out. It is a seismic instrument that is towed behind a research vessel. It recorded the sediments beneath these pockmarks. P-Cable renders images that look like layers of a cake and enables scientists to visualize deep sediments in 3D. "We know from other studies in the region that the sediments we are looking at in our seismic data are at least 2.7 million years old. This is the period of increased glaciations in the Northern Hemisphere, which influenced the sediment. The P-Cable helped us to see features in this sediment associated with gas release in the past." "These features can be buried pinnacles, or cavities, that form what we call gas chimneys in the seismic data. Gas chimneys appear like vertical disturbances in the layers of our sedimentary cake. This enabled us to reconstruct the evolution of gas expulsion from this area, for at least 2.7 million years," says Andreia Plaza Faverola. The seismic signal penetrated into 400 to 500 meters of sediment to map this timescale. By using this method, scientists were able to identify two major events of gas emission throughout this time period: one 1.8 million years ago, the other 200,000 years ago. This means that there is something that activated and deactivated the emissions several times. Plaza Faverola's paper gives a plausible explanation: it is the movement of the tectonic plates that influences the gas release. Vestnesa is not like California though, riddled with earthquakes because of the moving plates. The ridge is on a so-called passive margin.
But as it turns out, it doesn't take a huge tectonic shift to release the methane stored under the ocean floor. "Even though Vestnesa Ridge is on a passive margin, it is between two oceanic ridges that are slowly spreading. These spreading ridges resulted in separation of Svalbard from Greenland and opening of the Fram Strait. The spreading influences the passive margin of West-Svalbard, and even small mechanical collapse in the sediment can trigger seepage," says Plaza Faverola. Where does the methane come from? The methane is stored as gas hydrates, chunks of frozen gas and water, up to hundreds of meters under the seabed. Vestnesa hosts a large gas hydrate system. There is some concern that global warming of the oceans may melt this icy gas and release it into the atmosphere. That is not very likely in this area, according to Andreia Plaza Faverola. "This is a deep water gas hydrate system, which means that it is in permanently cold waters and under a lot of pressure. This pressure keeps the hydrates stable and the whole system is not vulnerable to global temperature changes. But under the stable hydrates there is gas that is not frozen. The amount of this gas may increase if hydrates melt at the base of this stability zone, or if gas from deeper in the sediments arrives into the system. This could increase the pressure in this part of the system, and the free gas may escape the seafloor through chimneys. Hydrates would still remain stable in this scenario." Throughout Earth's history there have been several short periods of significant increase in temperature. These periods often coincide with peaks of methane in the atmosphere, as recorded in ice cores. Scientists such as Plaza Faverola are still debating the cause of this methane release in the past. "One hypothesis is that massive gas release from geological sources, such as volcanoes or ocean sediments, may have influenced global climate. What we know is that there is a lot of methane released at present from the ocean floor. What we need to find out is if it reaches the atmosphere, or if it ever did." Historical events of methane release, such as the ones at Vestnesa Ridge, provide crucial information that can be used in future climate modeling. Knowing if these events repeat, and identifying what makes them happen, may help us to better predict the potential influence of methane from the oceans on future climate. | Earthquakes | 2,015
February 5, 2015 | https://www.sciencedaily.com/releases/2015/02/150205083030.htm | Randomness of megathrust earthquakes implied by rapid stress recovery after the Japan earthquake | Associate Professor Bogdan Enescu, Faculty of Life and Environmental Sciences, University of Tsukuba, collaborated with colleagues at the Swiss Federal Institute of Technology in Zurich (ETH Zurich) to show that the stress recovery following the 2011 M9.0 Tohoku-oki earthquake has been significantly faster than previously anticipated; specifically, the stress-state at the plate interface returned within just a few years to levels observed before the megathrust event. In addition, since there is no observable spatial difference in the stress state along the megathrust zone, it is difficult to predict the location and extent of future large ruptures. | Constraining the recurrence of megathrust earthquakes is genuinely important for hazard assessment and mitigation. The prevailing approach to model such events worldwide relies on the segmentation of the subduction zone and quasi-periodic recurrence due to constant tectonic loading. The researchers analyzed earthquakes recorded along a 1,000-km-long section of the subducting Pacific Plate beneath Japan since 1998 to map the relative frequency of small to large earthquakes (the so-called "b-value" parameter, which on average is close to 1.0), in both space and time. Evidence from laboratory experiments, numerical modeling and natural seismicity indicates that the b-value is negatively correlated with the differential stress. The present analysis reveals that the spatial distribution of b-values reflects well the tectonic processes accompanying plate motion. However, there is no evidence of distinct earthquake-generation regions along the megathrust, associated with the so-called "characteristic earthquakes." Nevertheless, the authors show that parts of the plate interface that ruptured during the 2011 Tohoku-oki earthquake were highly stressed in the years leading up to the earthquake, as expressed by mapped, very low regional b-values. Although the stress was largely released during the 2011 rupture, thus leading to an increase in b-values immediately after the megathrust event, the stress levels (as tracked by the b-values) quickly recovered to pre-megaquake levels within just a few years. This suggests that the megathrust zone is likely ready for large earthquakes at any time, with a low but on average constant probability. The study concludes that large earthquakes may not have a characteristic location, size or recurrence interval, and might therefore be more randomly distributed in time. The authors also present strong evidence that the size distribution of earthquakes is sensitive to stress variations and that its careful monitoring can improve the seismic hazard assessment of the megathrust zone. | Earthquakes | 2,015
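The b-value discussed above is the slope of the Gutenberg-Richter relation log10 N(>=M) = a - b*M and is routinely estimated by maximum likelihood. A minimal sketch of that standard estimator (Aki's formula), checked against a synthetic catalog; the completeness magnitude and the catalog itself are invented for the demonstration and are not the study's data.

```python
import numpy as np

def b_value(magnitudes: np.ndarray, m_c: float) -> float:
    """Maximum-likelihood b-value (Aki 1965) for events at or above the
    completeness magnitude m_c. For binned catalogs, m_c is usually replaced
    by m_c minus half the bin width (Utsu's correction)."""
    m = magnitudes[magnitudes >= m_c]
    return np.log10(np.e) / (m.mean() - m_c)

# Synthetic catalog obeying Gutenberg-Richter with b = 1.0 above M4:
# exceedances above m_c are exponentially distributed with scale log10(e)/b.
rng = np.random.default_rng(0)
mags = 4.0 + rng.exponential(scale=np.log10(np.e) / 1.0, size=5000)
print(f"estimated b ~ {b_value(mags, m_c=4.0):.2f}")  # prints approximately 1.00
```

In this framework, the mapped low b-values before 2011 mean proportionally fewer small events relative to large ones, the signature the authors tie to high differential stress.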
February 2, 2015 | https://www.sciencedaily.com/releases/2015/02/150202114156.htm | To speed up magma, add water | It was a bit like making a CT scan of a patient's head and finding he had very little brain, or making a PET scan of a dead fish and seeing hot spots of oxygen consumption. Scientists making seismic images of the mantle beneath a famous geological site saw the least magma where they expected to see the most. | What was going on? Was it an artifact of their technique or the revelation of something real but not yet understood? In the online Feb. 2 issue of Nature, a team of scientists led by S. Shawn Wei and Douglas Wiens of Washington University in St. Louis published a three-dimensional seismic image of the mantle beneath the Lau Basin in the South Pacific that had an intriguing anomaly. The basin is an ideal location for studying the role of water in volcanic and tectonic processes. Because the basin is widening, it has many spreading centers, or deep cracks through which magma rises to the surface. And because it is shaped like a V, these centers lie at varying distances from the Tonga Trench, where water is copiously injected into Earth's interior. The scientists knew that the chemistry of the magma erupted through the spreading centers varies with their distance from the trench. Those to the north, toward the opening of the V, erupt a drier magma than those to the south, near the point of the V, where the magma has more water and chemical elements associated with water. Because water lowers the melting temperature of rock, spreading centers to the north also produce less magma than those to the south. Before they constructed images from their seismic data, the scientists expected the pattern in the mantle to match that on the surface. In particular they expected to find molten rock pooled to the south, where the water content in the mantle is highest. Instead the seismic images indicated less melt in the south than in the north. They were flummoxed. After considerable debate, they concluded they were seeing something real. Water, they suggest, increases melting but makes the melt less viscous, speeding its transport to the surface, rather like mixing water with honey makes it flow quicker. Because water-laden magma flushes out so quickly, there is less of it in the mantle at any given moment even though more is being produced over time. The finding brings scientists one step closer to understanding how the Earth's water cycle affects nearly everything on the planet -- not just the clouds and rivers above the surface, but also processes that take place in silence and darkness in the plutonic depths. The Lau Basin, which lies between the island archipelagos of Tonga and Fiji, was created when the giant slab of the Pacific Ocean's floor collided with the Australian plate. The older, thus colder and denser, Pacific plate nosed under the Australian plate and sank into the depths along the Tonga Trench, also called the Horizon Deep. Water given off by the wet slab lowered the melting point of the rock above, causing magma to erupt and form a volcanic arc, called the Tonga-Kermadec ridge. About 4 to 6 million years ago, the Pacific slab started to pull away from the Australian slab, dragging material from the mantle beneath it. The Australian plate's margin stretched, splitting the ridge, creating the basin and, over time, rifting the floor of the basin.
Magma upwelling through these rifts created new spreading centers. "Leave it long enough," said Wei, a McDonnell Scholar and doctoral student in earth and planetary science in Arts & Sciences, "and the basin will probably form a new ocean." "The Tonga subduction zone is famous in seismology," Wiens said, "because two-thirds of the world's deep earthquakes happen there and subduction is faster there than anywhere else in the world. In the northern part of this area the slab is sinking at about 24 centimeters (9 inches) a year, or nearly a meter every four years. That's four times faster than the San Andreas fault is moving." The images of the mantle published in Nature are based on the rumblings of 200 earthquakes picked up by 50 ocean-bottom seismographs deployed in the Lau Basin in 2009 and 2010 and 17 seismographs installed on the islands of Tonga and Fiji. The 2009-2010 campaign, led by Wiens, professor of earth and planetary sciences in Arts & Sciences, was one of the largest long-term deployments of ocean-bottom seismographs ever undertaken. When the scientists looked carefully at the southern part of the images created from their seismograph recordings, they were mystified by what they saw -- or rather didn't see. In their color-coded images, the area along the southern end of the spreading center in the tip of the basin, called the ELSC, should have been deep red. Instead it registered pale yellow or green, as though it held little magma. "It was the reverse of our expectation," Wei said. To make these images, the scientists calculated the departure of the seismic velocities from reference values, Wei explained. "If the velocities are much lower than normal we attribute that to the presence of molten rock," he said. "The lower the velocity, the more magma." Because water lowers melting temperatures, the scientists expected to see the lowest velocities and the most magma to the south, where the water content is highest. Instead the velocities in the north were much lower than those in the south, signaling less magma in the south. This was all the more mystifying because the southern ELSC is intensely volcanic. If the volcanoes were spewing magma, why wasn't it showing up in the seismic images? After considerable debate, the scientists concluded that water from the slab must make melt transport more efficient as well as lowering melting temperatures. "Where there's very little water, the system is not moving the magma through very efficiently," Wiens said. "So where we see a lot of magma pooled in the north, that doesn't necessarily mean there's a lot of melting there, just that the movement of the magma is slow." "Conversely, where there's water, the magma is quickly and efficiently transported to the surface. So where we see little magma in the south, there is just as much melting going on, but the magma is moving through so fast that at any one moment we see less of it in the rock," Wiens said. "The only way to explain the distribution of magma is to take account of the effect of water on the transport of magma," Wei said. "We think water makes the magma significantly less viscous, and that's why transport is more efficient." "This is the first study to highlight the effect of water on the efficiency of magma extraction and the speed of magma transport," Wiens said. Wei plans to continue following the water. "A recent paper in Nature described evidence that water is carried down to 410 kilometers (250 miles). I'd like to look deeper," he said. | Earthquakes | 2,015
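The resolution of the paradox above, equal melt production but faster extraction in the wet south, amounts to a steady-state balance: the melt visible in a seismic snapshot is the production rate times the residence time. A toy sketch of that balance, not the authors' actual model; the tenfold speed contrast and all numbers are assumptions for illustration.

```python
# Steady state: melt observed at any instant = production rate x residence
# time, and residence time = path length / ascent speed. Illustrative values.
def melt_in_transit(production_per_yr: float, path_km: float,
                    ascent_km_per_yr: float) -> float:
    return production_per_yr * (path_km / ascent_km_per_yr)

PRODUCTION = 1.0   # arbitrary units of melt produced per year
PATH_KM = 50.0     # assumed transport distance

print(melt_in_transit(PRODUCTION, PATH_KM, ascent_km_per_yr=0.5))  # dry north: 100
print(melt_in_transit(PRODUCTION, PATH_KM, ascent_km_per_yr=5.0))  # wet south: 10
```

With equal production, melt that ascends ten times faster spends a tenth of the time in transit, so a tenth as much of it is present at any instant, which is exactly what the seismic snapshot would see.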
January 28, 2015 | https://www.sciencedaily.com/releases/2015/01/150128152246.htm | Ocean waves used to monitor offshore oil and gas fields | A technology developed by Stanford scientists for passively probing the seafloor using weak seismic waves generated by the ocean could revolutionize offshore oil and natural gas extraction by providing real-time monitoring of the subsurface while lessening the impact on marine life. | "We've shown that we can generate images of the subsurface nearly every day instead of taking snapshots just two or three times a year," said Biondo Biondi, professor of geophysics at Stanford's School of Earth Sciences. Currently, many energy companies use a technique called time-lapse reflection seismology to monitor offshore oil and gas deposits to optimize production and look for hazards such as hidden gas pockets. Reflection seismology involves ships towing arrays of "air guns" that fire every 10 to 15 seconds to produce loud sound pulses. The pulses bounce off the seafloor and geological formations beneath, then journey back to the surface, where they are recorded by hydrophones. The data are then deciphered to reveal details about subsurface structures. Each survey can cost tens of millions of dollars, and as a result surveys are only conducted two to three times a year. Environmental groups and marine biologists have expressed concerns that the air guns contribute to noise pollution in the ocean that can disturb or even injure marine animals, including humpback whales and giant squid. The new technique developed by Biondi and Sjoerd de Ridder, a student of Biondi's who is now a postdoctoral scientist at the University of Edinburgh, is different. It exploits naturally occurring seismic waves generated by Earth's oceans that are several orders of magnitude weaker than those produced by earthquakes. As ocean waves collide with one another, they create pressures on the sea floor, where they generate seismic waves that then propagate in every direction. Scientists have known about this "ambient seismic field" for nearly a century, but it was only recently that they understood ways to harness it. "We knew the ambient seismic energy was there, but we didn't know what we could do with it," De Ridder said. "That understanding has only been developed in recent years. Our technique provides the first large-scale application to harness it for oil and gas production." The technique that Biondi and De Ridder developed, called ambient seismic field noise-correlation tomography, or ASNT, uses sensors embedded in the seafloor. The sensors, which are typically installed by robotic submersibles, are connected to one another by cables and arranged into parallel rows that can span several kilometers of the seafloor. Another cable connects the sensor array to a platform in order to collect data in real time. The sensors record ambient seismic waves traveling through Earth's crust. The waves are ubiquitous, continuously generated and traveling in every direction, but using careful signal-processing schemes they developed, Biondi and De Ridder can digitally isolate only those waves that are passing through one sensor and then another one downstream.
When this is done repeatedly, and for multiple sensors in the network, what emerges is a "virtual" seismic wave pattern that is remarkably similar to the kind generated by air guns. Because the ASNT technique is entirely passive, meaning it does not require a controlled explosion or a loud air gun blast to create a seismic wave signature, it can be performed for a fraction of the cost of an active-reflection-seismology survey and should be far less disruptive to marine life, the scientists say. Since 2007, Biondi and De Ridder have been testing and refining their technique in a real-world laboratory in Europe. The scientists worked with the energy companies BP and ConocoPhillips to study recordings from existing sensor arrays in the Valhall and Ekofisk oil fields in the North Sea that are capable of recording ambient seismic waves. The proof-of-concept experiment has been successful, and the scientists have demonstrated that they can image the subsurface at Valhall down to a depth of nearly 1,000 feet. "We've now shown that our technique can very reliably and repeatedly retrieve an image of the near-surface," De Ridder said. "Our hope is that they can also reveal changes in the rocks that could signal an impending problem." The Stanford scientists outlined their technique and detailed some of their results from Valhall, as well as from Ekofisk, in a series of technical papers, the latest of which was recently published. | Earthquakes | 2,015
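The correlation step at the heart of such processing can be sketched compactly. A minimal illustration of generic seismic interferometry, assuming two synchronized, equally sampled traces; it is not Stanford's actual ASNT processing chain, and the window length and amplitude-normalization choices here are arbitrary.

```python
import numpy as np

def virtual_source(trace_a: np.ndarray, trace_b: np.ndarray,
                   win: int = 1024) -> np.ndarray:
    """Stack cross-correlations of successive windows; the stack converges
    toward the signal sensor B would record if sensor A were a source."""
    n_win = min(len(trace_a), len(trace_b)) // win
    stack = np.zeros(2 * win - 1)
    for i in range(n_win):
        a = trace_a[i * win:(i + 1) * win]
        b = trace_b[i * win:(i + 1) * win]
        a = (a - a.mean()) / (a.std() + 1e-12)  # crude amplitude normalization
        b = (b - b.mean()) / (b.std() + 1e-12)
        stack += np.correlate(b, a, mode="full")
    return stack / max(n_win, 1)

# Synthetic test: the same noise wavefield reaches sensor B 25 samples
# after it passes sensor A.
rng = np.random.default_rng(1)
noise = rng.standard_normal(1024 * 50 + 25)
trace_a, trace_b = noise[25:], noise[:-25]

cc = virtual_source(trace_a, trace_b)
print(np.argmax(cc) - (1024 - 1))  # prints 25: the A-to-B travel time emerges
```

Stacking over many windows is what suppresses the incoherent part of the noise and leaves the coherent travel-time information between the two sensors.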
January 19, 2015 | https://www.sciencedaily.com/releases/2015/01/150119154507.htm | Geophysicists find the crusty culprits behind sudden tectonic plate movements | Yale-led research may have solved one of the biggest mysteries in geology -- namely, why do tectonic plates beneath the Earth's surface, which normally shift over the course of tens to hundreds of millions of years, sometimes move abruptly? | A new study published Jan. 19 offers an explanation. Of course, in this case "speedy" still means a million years or longer. "Our planet is probably most distinctly marked by the fact that it has plate tectonics," said Yale geophysicist David Bercovici, lead author of the research. "Our work here looks at the evolution of plate tectonics. How and why do plates change directions over time?" Traditionally, scientists believed that all tectonic plates are pulled by subducting slabs -- which result from the colder, top boundary layer of the Earth's rocky surface becoming heavy and sinking slowly into the deeper mantle. Yet that process does not account for sudden plate shifts. Such abrupt movement requires that slabs detach from their plates, but doing this quickly is difficult since the slabs should be too cold and stiff to detach. According to the Yale study, there are additional factors at work. Thick crust from continents or oceanic plateaux is swept into the subduction zone, plugging it up and prompting the slab to break off. The detachment process is then accelerated when mineral grains in the necking slab start to shrink, causing the slab to weaken rapidly. The result is tectonic plates that abruptly shift horizontally, or continents suddenly bobbing up. "Understanding this helps us understand how the tectonic plates change through the Earth's history," Bercovici said. "It adds to our knowledge of the evolution of our planet, including its climate and biosphere." The study's co-authors are Gerald Schubert of the University of California-Los Angeles and Yanick Ricard of the Université de Lyon in France. | Earthquakes | 2,015
January 7, 2015 | https://www.sciencedaily.com/releases/2015/01/150107204453.htm | How the 'beast quake' is helping scientists track real earthquakes | It's not just the football players who have spent a year training. University of Washington seismologists will again be monitoring the ground-shaking cheers of Seahawks fans, this year with a bigger team, better technology and faster response times. | Scientists with the Pacific Northwest Seismic Network will install instruments this Thursday to provide real-time monitoring of the stadium's movement during the 2015 NFL playoffs. This year, the UW researchers have also upped their game. A new QuickShake tool will provide a faster connection between the sensors and the website. This Saturday will be the first test of the software that displays vibrations within three seconds -- five to 10 times faster and more reliably than readings from the same sensors installed last year. The Pacific Northwest Seismic Network monitors earthquake and volcanic activity throughout the region. Network scientists first got interested in football when a seismometer a block away from the stadium showed vibrations during Marshawn Lynch's legendary Jan. 8, 2011, touchdown run. The resulting seismogram became a celebrity in its own right and coined the term "Beast Quake." After a couple of quieter years, the group got permission last year to place two strong-motion earthquake sensors inside the stadium. The project was a huge hit and the group added a third sensor for the 2014 playoff game. A Beast Quake happens when the energetic jumping and stomping of so many fans at once shakes the stadium and reverberates through the surrounding soil. Seahawks fans also generate record-breaking noise, of course, but sound waves don't rock the building. A guaranteed shaking event with significant public interest is a great test case. "We're mostly interested in the speed and the reliability of the communications," said John Vidale, a UW professor of Earth & space sciences and director of the seismic network. "It's hard to simulate thousands of people using this tool all at once. When we can get a lot of people looking, we can see problems that we'd encounter during an actual earthquake." For fans at home, the faster data transfer means that TV viewers may get a tipoff to a big play they'll see on the screen after the 10-second broadcast delay. The researchers have dubbed them "Early Football Rowdiness Warnings." The foot-stomping is a real-world test of technology to detect the bigger shaking that originates underground. The seismic group is working with the U.S. Geological Survey to offer early warnings for the Pacific Northwest that could provide tens of seconds to several minutes' notice of incoming strong shaking. This year some public agencies and large businesses will have a first chance to try out the system that will eventually be available to the public. "The Seahawks experiment should provide us and the Internet-connected public with a feel for the minimum time early warning might provide," said Steve Malone, a UW professor emeritus of Earth & space sciences. "In this case it's football fan activity that generates a signal as a warning for what shows up on TV some seconds later. In the future, it might be seconds to minutes of warning after an earthquake starts." This weekend the group will be beefing up its social-media presence to post updates and respond to questions during the game. That also helps the group get ready for an emergency situation. "During the rumblings on Mt. St.
Helens a decade ago there was a huge influx of Web visits and phone calls," Malone said. "Now with social media, it's a whole new ballgame. We've got to learn how to deal with that because it's going to snow us under if we're not prepared." The group will have more staff monitoring social media during the game, and more robust websites that they hope won't slow down or crash during heavy traffic. On the scientific side, they hope to explore the different readings between the three sensors placed at different levels. They also hope to explain some mysterious patterns of shaking during commercial breaks, what one researcher hypothesizes may be a "dance quake." Several researchers will be at the UW campus lab Saturday monitoring the sensors. Two group members will be at the stadium providing eyes on the ground to help explain what could be causing any unusual spikes. They will be rooting for a victory for the Seahawks -- and for science. "We're developing these new Web tools, and monitoring the game really motivates everyone to get excited," Vidale said, "and we're rooting for a second helping of roars and rumbles against the Packers or Cowboys to perfect the system." | Earthquakes | 2,015
January 5, 2015 | https://www.sciencedaily.com/releases/2015/01/150105182448.htm | Fracking in Ohio confirmed as cause of rare earthquake strong enough to be felt | A new study links the March 2014 earthquakes in Poland Township, Ohio, to hydraulic fracturing that activated a previously unknown fault. The induced seismic sequence included a rare felt earthquake of magnitude 3.0, according to newly published research. | In March 2014, a series of five recorded earthquakes, ranging from magnitude 2.1 to 3.0, occurred within one kilometer (0.6 miles) of a group of oil and gas wells operated by Hilcorp Energy, which was conducting active hydraulic fracturing operations at the time. Because the magnitude 3.0 event occurred so close to a well, the Ohio Department of Natural Resources (ODNR) halted operations at the Hilcorp well on March 10, 2014. Hydraulic fracturing, or fracking, is a method for extracting gas and oil from shale rock by injecting a high-pressure water mixture directed at the rock to release the oil and gas trapped inside. The process of fracturing the rocks normally results in micro-earthquakes much smaller than humans can feel. It remains rare for hydraulic fracturing to cause larger earthquakes that are felt by humans. However, due to seismic monitoring advances and the increasing popularity of hydraulic fracturing to recover hydrocarbons, the number of earthquakes -- felt and unfelt -- associated with hydraulic fracturing has increased in the past decade. "These earthquakes near Poland Township occurred in the Precambrian basement, a very old layer of rock where there are likely to be many pre-existing faults," said Robert Skoumal, who co-authored the study with Michael Brudzinski and Brian Currie at Miami University in Ohio. "This activity did not create a new fault, rather it activated one that we didn't know about prior to the seismic activity." Using a technique called template matching, the researchers sifted through seismic data recorded by the EarthScope Transportable Array, a network of seismic stations, looking for repeating signals similar to the known Poland Township earthquakes, which were treated like seismic "fingerprints." They identified 77 earthquakes with magnitudes between 1.0 and 3.0 that occurred between March 4 and 12 in the Poland Township area. The local community reported feeling only one earthquake, the magnitude 3.0, on March 10. Skoumal and his colleagues compared the identified earthquakes to well stimulation reports, released in August 2014 by the ODNR, and found the earthquakes coincided temporally and spatially with hydraulic fracturing at specific stages of the stimulation. The seismic activity outlined a roughly vertical, east-west oriented fault within one kilometer of the well. Industry activities at other nearby wells produced no seismicity, suggesting to the authors that the fault is limited in extent. "Because earthquakes were identified at only the northeastern extent of the operation, it appears that a relatively small portion of the operation is responsible for the events," said Skoumal, who suggests the template matching technique offers a cost-effective and reliable means to monitor seismicity induced by hydraulic fracturing operations. "We just don't know where all the faults are located," said Skoumal. "It makes sense to have close cooperation among government, industry and the scientific community as hydraulic fracturing operations expand in areas where there's the potential for unknown pre-existing faults." | Earthquakes | 2,015
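The template-matching idea above is straightforward to sketch: slide a known event waveform over the continuous record and flag windows whose normalized cross-correlation is high. A minimal illustration with synthetic data; the threshold, window handling and all names are choices made for this sketch, not details of the published detector.

```python
import numpy as np

def match_template(continuous: np.ndarray, template: np.ndarray,
                   threshold: float = 0.8) -> list[int]:
    """Flag sample offsets where the window correlates strongly with the
    template waveform (the seismic "fingerprint")."""
    m = len(template)
    t = (template - template.mean()) / (template.std() * m)
    hits = []
    for i in range(len(continuous) - m + 1):
        w = continuous[i:i + m]
        s = w.std()
        if s == 0:
            continue
        cc = np.dot(t, (w - w.mean()) / s)  # Pearson correlation in [-1, 1]
        if cc >= threshold:
            hits.append(i)
    return hits

# Usage: bury two scaled copies of the template in noise and recover them.
rng = np.random.default_rng(2)
tmpl = np.sin(np.linspace(0, 8 * np.pi, 200)) * np.hanning(200)
data = 0.1 * rng.standard_normal(5000)
data[1000:1200] += tmpl          # a "repeat" of the template event
data[3000:3200] += 0.5 * tmpl    # a smaller event with the same fingerprint
print(match_template(data, tmpl))  # offsets clustered near 1000 and 3000
```

Because correlation is insensitive to amplitude, the same fingerprint recovers the half-size event, which is how such studies can detect events well below the magnitudes in standard catalogs.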
December 18, 2014 | https://www.sciencedaily.com/releases/2014/12/141218120727.htm | The tsunami early warning system for the Indian Ocean: Ten years after | The day after Christmas this year will mark the 10th anniversary of the tsunami disaster in the Indian Ocean. On 26 December 2004, a quarter of a million people lost their lives, five million required immediate aid and 1.8 million citizens were rendered homeless. The natural disaster caused extreme devastation over huge areas, and the accompanying grief and anxiety, especially in Indonesia, Thailand and Sri Lanka, exceeded the imaginable. It reached such drastic dimensions mainly because the entire Indian Ocean region lacked a warning facility and a disaster management plan at the time. | Germany and the international community of states reacted with immediate support. Within the framework of the German Flood Victim Aid, the Federal Government commissioned the Helmholtz Association of German Research Centres, under the direction of the GFZ German Research Centre for Geosciences, with the development of an early warning system for the Indian Ocean. The system was developed and installed within a large-scale project from 2005 to 2011. On 12 October 2011 the exercise drill "IOWAVE11" was carried out in the Indian Ocean. Since this drill, Indonesia has operated one of the most modern tsunami early warning systems. On the basis of data from approx. 300 measuring stations a warning can be issued within a maximum of five minutes after an earthquake. These measuring stations include e.g. seismometers, GPS stations and coastal tide gauges. With the data gained from the sensors, and using modern evaluation systems such as SeisComP3 (developed by GFZ scientists for the analysis of earthquake data) together with a tsunami simulation system in the warning centre, it is possible to compile a comprehensive picture of the situation. With the aid of a decision support system, appropriately classified warnings for the affected coastal areas can then be issued. A total of 70 people are involved in the operation of the Warning Centre in Jakarta, with 30 employees working solely in a full shift system. According to information provided by the BMKG, a total of 1700 earthquakes with a magnitude of more than M=5 and 11 quakes with a magnitude of 7 and higher have been evaluated, and six tsunami warnings have been issued to the public by the Earthquake Monitoring and Tsunami Early Warning Centre since the handover in March 2011. Schooling, training and disaster precautions (capacity development) for the local communities and town and district councils have received special emphasis. This capacity development has been carried out since 2006 in three "typical" regions: Padang (Sumatra), Cilacap (South Java) and Denpasar (Bali, a tourist stronghold). Here particular emphasis was placed on understanding both the warnings issued and the planned evacuation measures. Local disaster management structures are established with local decision-makers, and disaster risk reduction strategies are developed. Specifically, the education of trainers who are, in turn, responsible for the further spreading of the developed concepts plays a significant role. Another key element is the determination of hazard and risk maps as a basis for local evacuation planning as well as for future town and land-use planning.
In Bali, communication with the hotel industry was an additional factor. No early warning system will ever be able to prevent a strong earthquake and a resulting tsunami, and there will still be loss of life and material damage in the future. However, through the existence of an early warning system and the integration of organizational measures together with comprehensive capacity building, the adverse effects of such a natural disaster can certainly be reduced. | Earthquakes | 2,014
December 15, 2014 | https://www.sciencedaily.com/releases/2014/12/141215114101.htm | Scientists observe the Earth grow a new layer under an Icelandic volcano | New research into an Icelandic eruption has shed light on how the Earth’s crust forms, according to a newly published paper. | When the Bárðarbunga volcano, which is buried beneath Iceland’s Vatnajökull ice cap, reawakened in August 2014, scientists had a rare opportunity to monitor how the magma flowed through cracks in the rock away from the volcano. The molten rock forms vertical sheet-like features known as dykes, which force the surrounding rock apart. Study co-author Professor Andy Hooper from the Centre for Observation and Modelling of Earthquakes, Volcanoes and Tectonics (COMET) at the University of Leeds explained: “New crust forms where two tectonic plates are moving away from each other. Mostly this happens beneath the oceans, where it is difficult to observe. “However, in Iceland this happens beneath dry land. The events leading to the eruption in August 2014 are the first time that such a rifting episode has occurred there and been observed with modern tools, like GPS and satellite radar.” Although it has a long history of eruptions, Bárðarbunga has been increasingly restless since 2005. There was a particularly dynamic period in August and September this year, when more than 22,000 earthquakes were recorded in or around the volcano in just four weeks, due to stress being released as magma forced its way through the rock. Using GPS and satellite measurements, the team were able to track the path of the magma for over 45km before it reached a point where it began to erupt, and continues to do so to this day. The rate of dyke propagation was variable and slowed as the magma reached natural barriers, which were overcome by the build-up of pressure, creating a new segment. The dyke grows in segments, breaking through from one to the next by the build-up of pressure. This explains how focused upwelling of magma under central volcanoes is effectively redistributed over large distances to create new upper crust at divergent plate boundaries, the authors conclude. As well as the dyke, the team found ‘ice cauldrons’ – shallow depressions in the ice with circular crevasses, where the base of the glacier had been melted by magma. In addition, radar measurements showed that the ice inside Bárðarbunga’s crater had sunk by 16m, as the volcano floor collapsed. COMET PhD student Karsten Spaans from the University of Leeds, a co-author of the study, added: “Using radar measurements from space, we can form an image of caldera movement occurring in one day. Usually we expect to see just noise in the image, but we were amazed to see up to 55cm of subsidence.” Like other liquids, magma flows along the path of least resistance, which explains why the dyke at Bárðarbunga changed direction as it progressed. Magma flow was influenced mostly by the lie of the land to start with, but as it moved away from the steeper slopes, the influence of plate movements became more important. Summarising the findings, Professor Hooper said: “Our observations of this event showed that the magma injected into the crust took an incredibly roundabout path and proceeded in fits and starts. “Initially we were surprised at this complexity, but it turns out we can explain all the twists and turns with a relatively simple model, which considers just the pressure of rock and ice above, and the pull exerted by the plates moving apart.” | Earthquakes | 2,014
December 15, 2014 | https://www.sciencedaily.com/releases/2014/12/141215101732.htm | 2011 Japan earthquake: Fault had been relieving stress at accelerating rate for years | Stanford scientists have found evidence that sections of the fault responsible for the 9.0 magnitude Tohoku earthquake that devastated northern Japan in 2011 were relieving seismic stress at a gradually accelerating rate for years before the quake. | This "decoupling" process, in which the edges of two tectonic plates that are frictionally locked together slowly became unstuck, transferred stress to adjacent sections that were still locked. As a result, the quake, which was the most powerful ever recorded to hit Japan, may have occurred earlier than it might have otherwise, said Andreas Mavrommatis, a graduate student in Stanford's School of Earth Sciences. Mavrommatis and his advisor, Paul Segall, a professor of geophysics at Stanford, reached their conclusions after analyzing 15 years' worth of GPS measurements from the Japanese island of Honshu. Their results were published earlier this year. "We looked at northeastern Japan, which has one of the densest and longest-running high-precision GPS networks in the world," Mavrommatis said. Segall said, "The measurements indicated the plate boundary was gradually becoming less locked over time. That was surprising." The scientists will present their work, "Decadal-Scale Decoupling of the Japan Trench Prior to the 2011 Tohoku-Oki Earthquake from Geodetic and Repeating-Earthquake Observations," Dec. 17 at the American Geophysical Union's Fall Meeting in San Francisco. The talk will take place at 5 p.m. PT at the Moscone Convention Center in Moscone South, Room 306. The pair's hypothesis is further supported by a recent analysis they conducted of so-called repeating earthquakes offshore of northern Honshu. The small quakes, which were typically magnitude 3 or 4, occurred along the entire length of the fault line, but each one occurred at the same spot every few years. Furthermore, many of them were repeating not at a constant rate but at an accelerating one, the scientists found. This acceleration would be expected if the fault were becoming less locked over time, Mavrommatis said, because the decoupling process would have relieved pent-up stress along some sections of the fault but increased stress on adjacent sections. "According to our model, the decoupling process would have had the effect of adding stress to the section of the fault that nucleated the Tohoku quake," Segall said. "We suspect this could have accelerated the occurrence of the earthquake." The scientists caution that their results cannot be used to predict the occurrence of the next major earthquake in Japan, but they could shed light on the physical processes that operate on faults that generate the world's largest quakes. | Earthquakes | 2,014
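A simple way to quantify "repeating not at a constant rate but at an accelerating one" is to regress the inter-event times of a repeating sequence against event index; a negative slope means the recurrence interval is shrinking. A minimal sketch in that spirit, with an invented sequence; the Stanford analysis itself was more involved.

```python
import numpy as np

def recurrence_trend(event_times_yr: np.ndarray) -> float:
    """Linear trend of inter-event times versus event index; a negative
    value (years per repeat) indicates an accelerating repeat rate."""
    intervals = np.diff(event_times_yr)
    slope, _ = np.polyfit(np.arange(len(intervals)), intervals, 1)
    return slope

# Synthetic sequence whose recurrence interval shrinks from 2.8 to 1.6 years.
times = np.cumsum([3.0, 2.8, 2.6, 2.4, 2.2, 2.0, 1.8, 1.6])
print(f"interval trend: {recurrence_trend(times):+.2f} yr per repeat")  # -0.20
```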
December 8, 2014 | https://www.sciencedaily.com/releases/2014/12/141208145757.htm | Re-thinking Southern California earthquake scenarios in Coachella Valley, San Andreas Fault | New three-dimensional (3D) numerical modeling that captures far more geometric complexity of an active fault segment in southern California than any other suggests that the overall earthquake hazard for towns on the west side of the Coachella Valley such as Palm Springs and Palm Desert may be slightly lower than previously believed. | New simulations of deformation on three alternative fault configurations for the Coachella Valley segment of the San Andreas Fault, conducted by geoscientists Michele Cooke and Laura Fattaruso of the University of Massachusetts Amherst with Rebecca Dorsey of the University of Oregon, appear in a new paper published this December. The Coachella Valley segment is the southernmost section of the San Andreas Fault in California. It has a high likelihood for a large rupture in the near future, since it has a recurrence interval of about 180 years but has not ruptured in over 300 years, the authors point out. The researchers acknowledge that their new modeling offers "a pretty controversial interpretation" of the data. Many geoscientists do not accept a dipping active fault geometry for the San Andreas Fault in the Coachella Valley, they say. Some argue that the data do not confirm the dipping structure. "Our contribution to this debate is that we add an uplift pattern to the data that support a dipping active fault and it rejects the other models," say Cooke and colleagues. Their new model yields an estimated 10 percent increase in shaking overall for the Coachella segment. But for the towns to the west of the fault where most people live, it yields decreased shaking due to the dipping geometry. It yields a doubling of shaking in mostly unpopulated areas east of the fault. "This isn't a direct outcome of our work but an implication," they add. Cooke says, "Others have used a dipping San Andreas in their models but they didn't include the degree of complexity that we did. By including the secondary faults within the Mecca Hills we more accurately capture the uplift pattern of the region." Fattaruso adds, "Others were comparing to different data sets, such as geodesy, and since we were comparing to uplift it is important that we have this complexity." In this case, geodesy is the science of measuring and representing the Earth and its crustal motion, taking into account the competition of geological processes in 3D over time. Most other models of deformation, stress, rupture and ground shaking have assumed that the southern San Andreas Fault is vertical, say Cooke and colleagues. However, seismic imaging, aerial magnetometric surveys and GPS-based strain observations suggest that the fault dips 60 to 70 degrees toward the northeast, a hypothesis they set out to investigate. Specifically, they explored three alternative geometric models of the fault's Coachella Valley segment with added complexity such as including smaller faults in the nearby Indio and Mecca Hills. "We use localized uplift patterns in the Mecca Hills to assess the most plausible geometry for the San Andreas Fault in the Coachella Valley and better understand the interplay of fault geometry and deformation," they write. Cooke and colleagues say the fault structures in their favored model agree with distributions of local seismicity, and are consistent with geodetic observations of recent strain.
"Crustal deformation models that neglect the northeast dip of the San Andreas Fault in the Coachella Valley will not replicate the ground shaking in the region and therefore inaccurately estimate seismic hazard," they note.This work was supported by the National Science Foundation. | Earthquakes | 2,014 |
December 4, 2014 | https://www.sciencedaily.com/releases/2014/12/141204143130.htm | Source of volcanoes may be much closer than thought: Geophysicists challenge traditional theory underlying origin of mid-plate volcanoes | A long-held assumption about the Earth is challenged in newly published research. | The discovery challenges conventional thought that volcanoes are caused when plates that make up the planet's crust shift and release heat. Instead of coming from deep within the interior of the planet, the responsibility is closer to the surface, about 80 kilometers to 200 kilometers deep -- a layer in the Earth's upper mantle known as the asthenosphere. "For nearly 40 years there has been a debate over a theory that volcanic island chains, such as Hawaii, have been formed by the interaction between plates at the surface and plumes of hot material that rise from the core-mantle boundary nearly 1,800 miles below the Earth's surface," King said. "Our paper shows that a hot layer beneath the plates may explain the origin of mid-plate volcanoes without resorting to deep conduits from halfway to the center of the Earth." Traditionally, the asthenosphere has been viewed as a passive structure that separates the moving tectonic plates from the mantle. As tectonic plates move several inches every year, the boundaries between the plates spawn most of the planet's volcanoes and earthquakes. "As the Earth cools, the tectonic plates sink and displace warmer material deep within the interior of the Earth," explained King. "This material rises as two broad, passive updrafts that seismologists have long recognized in their imaging of the interior of the Earth." The work of Anderson and King, however, shows that the hot, weak region beneath the plates acts as a lubricating layer, preventing the plates from dragging the material below along with them as they move. The researchers show this lubricating layer is also the hottest part of the mantle, so there is no need for heat to be carried up to explain mid-plate volcanoes. "We're taking the position that plate tectonics and mid-plate volcanoes are the natural results of processes in the plates and the layer beneath them," King said. | Earthquakes | 2,014
December 1, 2014 | https://www.sciencedaily.com/releases/2014/12/141201161121.htm | Exploring a large, restless volcanic field in Chile | If Brad Singer knew for sure what was happening three miles under an odd-shaped lake in the Andes, he might be less eager to spend a good part of his career investigating a volcanic field that has erupted 36 times during the last 25,000 years. As he leads a large scientific team exploring a region in the Andes called Laguna del Maule, Singer hopes the area remains quiet. | But the primary reason to expend so much effort on this area boils down to one fact: The rate of uplift is among the highest ever observed by satellite measurement for a volcano that is not actively erupting. That uplift is almost certainly due to a large intrusion of magma -- molten rock -- beneath the volcanic complex. For seven years, an area larger than the city of Madison has been rising by 10 inches per year. That rapid rise provides a major scientific opportunity: to explore a mega-volcano before it erupts. That effort, and the hazard posed by the restless magma reservoir beneath Laguna del Maule, are described in a major research article in the December issue of the Geological Society of America's magazine, GSA Today. "We've always been looking at these mega-eruptions in the rear-view mirror," says Singer. "We look at the lava, dust and ash, and try to understand what happened before the eruption. Since these huge eruptions are rare, that's usually our only option. But we look at the steady uplift at Laguna del Maule, which has a history of regular eruptions, combined with changes in gravity, electrical conductivity and swarms of earthquakes, and we suspect that conditions necessary to trigger another eruption are gathering force." Laguna del Maule looks nothing like a classic, cone-shaped volcano, since the high-intensity erosion caused by heavy rain and snow has carried most of the evidence to the nearby Pacific Ocean. But the overpowering reason for the absence of "typical volcano cones" is the nature of the molten rock underground. It's called rhyolite, and it's the most explosive type of magma on the planet. The eruption of a rhyolite volcano is too quick and violent to build up a cone. Instead, this viscous, water-rich magma often explodes into vast quantities of ash that can form deposits hundreds of yards deep, followed by a slower flow of glassy magma that can be tens of yards tall and measure more than a mile in length. The next eruption could be in the size range of Mount St. Helens -- or it could be vastly bigger, Singer says. "We know that over the past million years or so, several eruptions at Laguna del Maule or nearby volcanoes have been more than 100 times larger than Mount St. Helens," he says. "Those are rare, but they are possible." Such a mega-eruption could change the weather, disrupt the ecosystem and damage the economy. Trying to anticipate what Laguna del Maule holds in store, Singer is heading a new $3 million, five-year effort sponsored by the National Science Foundation to document its behavior before an eruption. With colleagues from Chile, Argentina, Canada, Singapore, and Cornell and Georgia Tech universities, he is masterminding an effort to build a scientific model of the underground forces that could lead to eruption. "This model should capture how this system has evolved in the crust at all scales, from the microscopic to basinwide, over the last 100,000 years," Singer says.
"It's like a movie from the past to the present and into the future."Over the next five years, Singer says he and 30 colleagues will "throw everything, including the kitchen sink, at the problem -- geology, geochemistry, geochronology and geophysics -- to help measure, and then model, what's going on."One key source of information on volcanoes is seismic waves. Ground shaking triggered by the movement of magma can signal an impending eruption. Team member Clifford Thurber, a seismologist and professor of geoscience at UW-Madison, wants to use distant earthquakes to locate the underground magma body.As many as 50 seismometers will eventually be emplaced above and around the magma at Laguna del Maule, in the effort to create a 3-D image of Earth's crust in the area.By tracking multiple earthquakes over several years, Thurber and his colleagues want to pinpoint the size and location of the magma body -- roughly estimated as an oval measuring five kilometers (3.1 miles) by 10 kilometers (6.2 miles).Each seismometer will record the travel time of earthquake waves originating within a few thousand kilometers, Thurber explains. Since soft rock transmits sound less efficiently than hard rock, "we expect that waves that pass through the presumed magma body will be delayed," Thurber says. "It's very simple. It's like a CT scan, except instead of density we are looking at seismic wave velocity."As Singer, who has been visiting Laguna del Maule since 1998, notes, "The rate of uplift -- among the highest ever observed -- has been sustained for seven years, and we have discovered a large, fluid-rich zone in the crust under the lake using electrical resistivity methods. Thus, there are not many possible explanations other than a big, active body of magma at a shallow depth."The expanding body of magma could freeze in place -- or blow its top, he says. "One thing we know for sure is that the surface cannot continue rising indefinitely." | Earthquakes | 2,014 |
November 21, 2014 | https://www.sciencedaily.com/releases/2014/11/141121082911.htm | Erosion may trigger earthquakes | Researchers from laboratories at Géosciences Rennes (CNRS/Université de Rennes 1)*, Géosciences Montpellier (CNRS/Université de Montpellier 2) and Institut de Physique du Globe de Paris (CNRS/IPGP/Université Paris Diderot), in collaboration with a scientist in Taiwan, have shown that surface processes, i.e. erosion and sedimentation, may trigger shallow earthquakes (less than five kilometers deep) and favor the rupture of large deep earthquakes up to the surface. Although plate tectonics was generally thought to be the only persistent mechanism able to influence fault activity, it appears that surface processes also increase stresses on active faults, such as those in Taiwan, one of the world's most seismic regions. | The work has now been published. Over the last few decades, many studies have focused on the evolution of mountain range landscapes over geological time (1 to 100 million years). The aim is to better understand the dynamics and interactions between erosion, sedimentation and tectonic deformation processes. Recent work has shown that Earth's surface can undergo major changes in just a few days, months or years, for instance during extreme events such as typhoons or high magnitude earthquakes. Such events cause many landslides and an increase in sedimentary transport into rivers, as was the case in 2009 when typhoon Morakot struck Taiwan, leading to abrupt erosion of landscapes. Such rapid changes to the shape of Earth's surface alter the balance of forces at the site of deep active faults. In Taiwan, where erosion and deformation rates are among the highest in the world, the researchers showed that erosion rates of the order of 0.1 to 20 millimeters per year can cause an increase of the order of 0.1 to 10 bar in stresses on faults located nearby. Such forces are probably enough to trigger shallow earthquakes (less than five kilometers deep) or to favor the rupture of deep earthquakes up to the surface, especially if they are amplified by extreme erosion events caused by typhoons and high magnitude earthquakes. The researchers have thus shown that plate tectonics is not the only persistent mechanism able to influence the activity of seismic faults, and that surface processes such as erosion and sedimentation can increase stresses on active faults sufficiently to cause shallow earthquakes. Thanks to an analysis of the relationships between surface processes and active deformation of Earth in near real-time, this study provides new perspectives for understanding the mechanisms that trigger earthquakes. *The Géosciences Rennes laboratory is part of the Observatoire des Sciences de l'Univers de Rennes. | Earthquakes | 2,014
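The quoted numbers are consistent with the simple lithostatic unloading relation Δσ = ρ g Δh. The hedged sketch below reproduces the order of magnitude; the rock density and the 1,000-year window are assumed illustrative values, while the erosion rates match the 0.1-20 mm/yr range in the article:

```python
# Unloading stress from erosion via d(sigma) = rho * g * dh.
# RHO_ROCK and the 1000-yr duration are assumed; erosion rates are
# the range quoted in the article.
RHO_ROCK = 2700.0   # kg/m^3, typical crustal density (assumed)
G = 9.81            # m/s^2

def unloading_stress_bar(erosion_rate_mm_per_yr, duration_yr):
    dh_m = erosion_rate_mm_per_yr * 1e-3 * duration_yr  # removed thickness
    stress_pa = RHO_ROCK * G * dh_m
    return stress_pa / 1e5  # 1 bar = 1e5 Pa

for rate in (0.1, 1.0, 20.0):  # mm/yr, spanning the quoted range
    print(f"{rate:5.1f} mm/yr over 1000 yr -> "
          f"{unloading_stress_bar(rate, 1000.0):5.2f} bar")
# ~0.03, 0.26 and 5.30 bar -- the same order as the 0.1-10 bar quoted above
```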
November 20, 2014 | https://www.sciencedaily.com/releases/2014/11/141120141752.htm | Geologists discover ancient buried canyon in South Tibet | A team of researchers from Caltech and the China Earthquake Administration has discovered an ancient, deep canyon buried along the Yarlung Tsangpo River in south Tibet, north of the eastern end of the Himalayas. The geologists say that the ancient canyon--thousands of feet deep in places--effectively rules out a popular model used to explain how the massive and picturesque gorges of the Himalayas became so steep, so fast. | "I was extremely surprised when my colleagues, Jing Liu-Zeng and Dirk Scherler, showed me the evidence for this canyon in southern Tibet," says Jean-Philippe Avouac, the Earle C. Anthony Professor of Geology at Caltech. "When I first saw the data, I said, 'Wow!' It was amazing to see that the river once cut quite deeply into the Tibetan Plateau because it does not today. That was a big discovery, in my opinion." Geologists like Avouac and his colleagues, who are interested in tectonics--the study of the earth's surface and the way it changes--can use tools such as GPS and seismology to study crustal deformation that is taking place today. But if they are interested in studying changes that occurred millions of years ago, such tools are not useful because the activity has already happened. In those cases, rivers become a main source of information because they leave behind geomorphic signatures that geologists can interrogate to learn about the way those rivers once interacted with the land--helping them to pin down when the land changed and by how much, for example. "In tectonics, we are always trying to use rivers to say something about uplift," Avouac says. "In this case, we used a paleocanyon that was carved by a river. It's a nice example where by recovering the geometry of the bottom of the canyon, we were able to say how much the range has moved up and when it started moving." The team reports its findings in the current issue. Last year, civil engineers from the China Earthquake Administration collected cores by drilling into the valley floor at five locations along the Yarlung Tsangpo River. Shortly after, former Caltech graduate student Jing Liu-Zeng, who now works for that administration, returned to Caltech as a visiting associate and shared the core data with Avouac and Dirk Scherler, then a postdoc in Avouac's group. Scherler had previously worked in the far western Himalayas, where the Indus River has cut deeply into the Tibetan Plateau, and immediately recognized that the new data suggested the presence of a paleocanyon. Liu-Zeng and Scherler analyzed the core data and found that at several locations there were sedimentary conglomerates, rounded gravel and larger rocks cemented together, that are associated with flowing rivers, until a depth of 800 meters or so, at which point the record clearly indicated bedrock. This suggested that the river once carved deeply into the plateau. To establish when the river switched from incising bedrock to depositing sediments, they measured two isotopes, beryllium-10 and aluminum-26, in the lowest sediment layer.
The isotopes are produced when rocks and sediment are exposed to cosmic rays at the surface and decay at different rates once buried, which allowed the geologists to determine that the paleocanyon started to fill with sediment about 2.5 million years ago. The researchers' reconstruction of the former valley floor showed that the slope of the river once increased gradually from the Gangetic Plain to the Tibetan Plateau, with no sudden changes, or knickpoints. Today, the river, like most others in the area, has a steep knickpoint where it meets the Himalayas, at a place known as the Namche Barwa massif. There, the uplift of the mountains is extremely rapid (on the order of 1 centimeter per year, whereas in other areas 5 millimeters per year is more typical) and the river drops by 2 kilometers in elevation as it flows through the famous Tsangpo Gorge, known by some as the Yarlung Tsangpo Grand Canyon because it is so deep and long. Combining the depth and age of the paleocanyon with the geometry of the valley, the geologists surmised that the river existed in this location prior to about 3 million years ago, but at that time, it was not affected by the Himalayas. However, as the Indian and Eurasian plates continued to collide and the mountain range pushed northward, it began impinging on the river. Suddenly, about 2.5 million years ago, a rapidly uplifting section of the mountain range got in the river's way, damming it, and the canyon subsequently filled with sediment. "This is the time when the Namche Barwa massif started to rise, and the gorge developed," says Scherler, one of two lead authors on the paper and now at the GFZ German Research Center for Geosciences in Potsdam, Germany. That picture of the river and the Tibetan Plateau, which involves the river incising deeply into the plateau millions of years ago, differs quite a bit from the typically accepted geologic vision. Typically, geologists believe that when rivers start to incise into a plateau, they eat at the edges, slowly making their way into the plateau over time. However, the rivers flowing across the Himalayas all have strong knickpoints and have not incised much at all into the Tibetan Plateau. Therefore, the thought has been that the rapid uplift of the Himalayas has pushed the rivers back, effectively pinning them, so that they have not been able to make their way into the plateau. But that explanation does not work with the newly discovered paleocanyon. The team's new hypothesis also rules out a model that has been around for about 15 years, called tectonic aneurysm, which suggests that the rapid uplift seen at the Namche Barwa massif was triggered by intense river incision. In tectonic aneurysm, a river cuts down through the earth's crust so fast that it causes the crust to heat up, making a nearby mountain range weaker and facilitating uplift. The model is popular among geologists, and indeed Avouac himself published a modeling paper in 1996 that showed the viability of the mechanism. "But now we have discovered that the river was able to cut into the plateau way before the uplift happened," Avouac says, "and this shows that the tectonic aneurysm model was actually not at work here. The rapid uplift is not a response to river incision." | Earthquakes | 2,014
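The two-nuclide burial-dating logic described above can be sketched in a few lines: because 26Al decays faster than 10Be once a sample is shielded from cosmic rays, the 26Al/10Be ratio falls at a known rate. The half-lives below are published values; the surface production ratio and the "measured" ratio are assumed for illustration, not data from the paper:

```python
import math

# Two-nuclide (26Al/10Be) burial dating. Half-lives are published values;
# the production ratio (~6.75) and the measured ratio are assumed.
T_HALF_BE10 = 1.387e6  # yr
T_HALF_AL26 = 0.705e6  # yr
LAM_BE10 = math.log(2) / T_HALF_BE10
LAM_AL26 = math.log(2) / T_HALF_AL26

def burial_age_yr(ratio_measured, ratio_surface=6.75):
    """Burial age from the post-burial decline of the 26Al/10Be ratio."""
    return math.log(ratio_surface / ratio_measured) / (LAM_AL26 - LAM_BE10)

# A measured ratio of ~2.0 (assumed here) implies burial ~2.5 Myr ago,
# matching the age reported for the onset of paleocanyon filling.
print(f"{burial_age_yr(2.0) / 1e6:.2f} Myr")
```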
November 19, 2014 | https://www.sciencedaily.com/releases/2014/11/141119125420.htm | Geologists shed light on formation of Alaska Range | Geologists in the College of Arts and Sciences have recently figured out what has caused the Alaska Range to form the way it has and why the range boasts such an enigmatic topographic signature. The narrow mountain range is home to some of the world's most dramatic topography, including 20,320-foot Denali (Mount McKinley), North America's highest mountain. | Professor Paul Fitzgerald and a team of students and fellow scientists have been studying the Alaska Range along the Denali fault. They think they know why the fault is located where it is and what accounts for the alternating asymmetrical, mountain-scale topography along the fault. Their findings were the subject of a recent paper. In 2002, the Denali fault, which cuts across south-central Alaska, was the site of a magnitude-7.9 earthquake that was felt as far away as Texas and Louisiana. It was the largest earthquake of its kind in more than 150 years. "Following the earthquake, researchers flocked to the area to examine the effects," says Fitzgerald, who serves as professor of Earth Sciences and an associate dean for the college. "They were fascinated by how the frozen ground behaved; the many landslides [the earthquake] caused; how bridges responded; and how the Trans-Alaska oil pipeline survived, as it was engineered to do." Geologists were also surprised by how the earthquake began on a previously unknown thrust-fault; then propagated eastward, along the Denali fault, and finally jumped onto another fault, hundreds of kilometers away. "From our perspective, the earthquake has motivated analyses of why the highest mountains in the central Alaska Range occur." He attributes the Alaska Range's alternating topographic signatures to a myriad of factors: contrasting lithospheric strength between large terranes (i.e., distinctly different rock units); the location of the curved Denali fault; the transfer of strain inland from southern Alaska's active plate margin; and the shape of the controlling former continental margin against weaker suture-zone rocks. It's no secret that Alaska is one of the most geologically active areas on the planet. For instance, scientists know that the North American Plate is currently overriding the Pacific Plate at the latter's southern coast, while the Yakutat microplate is colliding with North America. As a result of plate tectonics, Alaska is an amalgamation of terranes that have collided with the North American craton and have accreted to become part of North America. Cratons are pieces of continents that have been largely stable for hundreds of millions of years. Terranes often originate as volcanic islands (like those of Hawaii) and, after colliding with one another or a continent, are separated by large discrete faults. When terranes collide and accrete, they form a suture, also known as a collision zone, which is made up of weak, crushed rock. During deformation, suture-zone rocks usually deform first, especially if they are adjacent to a strong rock body. "Technically, the Denali fault is what we'd call an 'intercontinental right-lateral strike-slip fault system,'" says Fitzgerald, adding that a strike-slip fault occurs when rocks move horizontally past one another, usually on a vertical fault.
"This motion includes a component of slip In Alaska, the shape of the accreted terranes generally controls the location of the Denali fault and the mountains that form along it, especially at the bends in the trace of the fault.Fitzgerald: "Mount McKinley and the central Alaska Range lie within the concave curve of the Denali fault. There, higher topography and greater exhumation [uplift of rock] occur south of the Denali fault, exactly where you'd expect a mountain range to form, given the regional tectonics. In the eastern Alaska Range, higher topography and greater exhumation are found north of the fault, on its convex side-not an expected pattern at all and very puzzling."Using mapped surface geology, geophysical data, and thermochronology (i.e., time-temperature history of the rocks), Fitzgerald and colleagues have determined that much of Alaska's uplift and deformation began some 25 million years ago, when the Yakutat microplate first started colliding with North America. The bold, glacier-clad peaks comprising the Alaska Range actually derive from within the aforementioned "weak suture-zone rocks" between the terranes.While mountains are high and give the impression of strength, they are built largely from previously fractured rock units. Rock movement along the Denali fault drives the uplift of the mountains, which form at bends in the fault, where previously fractured suture-zone rocks are pinned against the stronger former North American continental margin."The patterns of deformation help us understand regional tectonics and the formation of the Alaska Range, which is fascinating to geologists and non-geologists alike," says Fitzgerald. "Being able to determine patterns or how to reveal them, while others see chaos, is often the key to finding the answer to complex problems. … To us scientists, the real significance of this work is that it helps us understand the evolution of our planet, how faults and mountain belts form, and why earthquakes happen. It also provides a number of hypotheses about Alaskan tectonics and rock deformation that we can test, using the Alaska Range as our laboratory."In addition to Fitzgerald, the paper was co-authored by Sarah Roeske, a research scientist at the University of California, Davis; Jeff Benowitz, a research scientist at the Geophysical Institute at the University of Alaska Fairbanks; Steven Riccio and Stephanie Perry, graduate students in Earth Sciences at Syracuse; and Phillip Armstrong, professor and chair of geological sciences at California State University, Fullerton. | Earthquakes | 2,014 |
November 18, 2014 | https://www.sciencedaily.com/releases/2014/11/141118141711.htm | High earthquake danger in Tianjin, China | With a population of 11 million and located about 100 km from Beijing (22 million people) and Tangshan (7 million people), Tianjin lies on top of the Tangshan-Hejian-Cixian fault that has been the site of 15 devastating earthquakes in the past 1,000 years. An example of the disastrous events is the 1976 magnitude 7.6 Tangshan Earthquake, which killed a quarter million people. | To assess future seismic hazards along the fault, scientists from the University of California at Los Angeles (UCLA) and the Chinese Earthquake Administration (CEA) have reconstructed, for the first time, a spatial pattern of major earthquakes along the fault. Their reconstruction is based on (1) detailed analysis of the available instrumental records in the past few decades; (2) historical records in the past ~4,000 years; and (3) pre-historical records tracing back nearly 11,000 years. A surprising finding from this work is the existence of a 160-km seismic gap centered at Tianjin, which has not been ruptured by any major earthquake for more than 8,400 years. As the average earthquake cycle is about 8,700 years, the authors suggest that the 160-km Tianjin fault segment, capable of generating a devastating earthquake similar to the 1976 Tangshan earthquake, may be the next to rupture. | Earthquakes | 2,014
November 17, 2014 | https://www.sciencedaily.com/releases/2014/11/141117164121.htm | Subtle shifts in the Earth could forecast earthquakes, tsunamis | Earthquakes and tsunamis can be giant disasters no one sees coming, but now an international team of scientists led by a University of South Florida professor has found that subtle shifts in Earth's offshore plates can be a harbinger of the size of the disaster. | In a new paper, "Giant earthquakes and tsunamis in the last decade -- Sumatra in 2004 and Japan in 2011 -- are a reminder that our ability to forecast these destructive events is painfully weak," Dixon said. Dixon was involved in the development of high precision GPS for geophysical applications, and has been making GPS measurements in Costa Rica since 1988, in collaboration with scientists at Observatorio Vulcanológico y Sismológico de Costa Rica, the University of California-Santa Cruz, and Georgia Tech. The project is funded by the National Science Foundation. Slow slip events have some similarities to earthquakes (caused by motion on faults) but release their energy slowly, over weeks or months, and cannot be felt or even recorded by conventional seismographs, Dixon said. Their discovery in 2001 by Canadian scientist Herb Dragert at the Pacific Geoscience Center had to await the development of high precision GPS, which is capable of measuring subtle movements of the Earth. The scientists studied the Sept. 5, 2012 earthquake on the Costa Rica subduction plate boundary, as well as motions of the Earth in the previous decade. High precision GPS recorded numerous slow slip events in the decade leading up to the 2012 earthquake. The scientists made their measurements from a peninsula overlying the shallow portion of a megathrust fault in northwest Costa Rica. The 7.6-magnitude quake was one of the strongest earthquakes ever to hit the Central American nation and unleashed more than 1,600 aftershocks. Marino Protti, one of the authors of the paper and a resident of Costa Rica, has spent more than two decades warning local populations of the likelihood of a major earthquake in their area and recommending enhanced building codes. A tsunami warning was issued after the quake, but only a small tsunami occurred. The group's finding shed some light on why: slow slip events in the offshore region in the decade leading up to the earthquake may have released much of the stress and strain that would normally occur on the offshore fault. While the group's findings suggest that slow slip events have limited value in knowing exactly when an earthquake and tsunami will strike, they suggest that these events provide critical hazard assessment information by delineating rupture area and the magnitude and tsunami potential of future earthquakes. The scientists recommend monitoring slow slip events in order to provide accurate forecasts of earthquake magnitude and tsunami potential. | Earthquakes | 2,014
November 17, 2014 | https://www.sciencedaily.com/releases/2014/11/141117084627.htm | Climate capers of the past 600,000 years | If you want to see into the future, you have to understand the past. An international consortium of researchers under the auspices of the University of Bonn has drilled deposits on the bed of Lake Van (Eastern Turkey) which provide unique insights into the last 600,000 years. The samples reveal that the climate has done its fair share of mischief-making in the past. Furthermore, there have been numerous earthquakes and volcanic eruptions. The results of the drilling project also provide a basis for assessing how dangerous natural hazards are for today's population. In a special edition of the | null | Earthquakes | 2,014
November 13, 2014 | https://www.sciencedaily.com/releases/2014/11/141113195158.htm | Seismic hazard in the Puget Lowland, Washington state, USA | Seismic hazards in the Puget Lowland of northwestern Washington include deep earthquakes associated with the Cascadia subduction zone and shallow earthquakes associated with crustal faults across the region. Research presented in | Paleoseismic investigations on the Darrington-Devils Mountain fault zone by Stephen F. Personius and colleagues, using three-dimensional trenching, document a large-magnitude (M 6.7 to 7.0) earthquake about 2,000 years ago and show evidence of a similar earthquake about 8,000 years ago. An additional surprising result is evidence indicating that the sense of slip on the fault zone during these earthquakes was primarily right-lateral, with a smaller component of north-side-up vertical slip. Holocene north-side-up, right-lateral oblique slip is opposite the south-side-up, left-lateral oblique sense of slip inferred from deformation of Eocene and older rocks along the fault zone. According to Personius and colleagues, the cause of this slip reversal is unknown, but may be related to ongoing clockwise rotation of northwestern Washington State into a position more favorable to right-lateral slip in the current stress field. | Earthquakes | 2,014
November 7, 2014 | https://www.sciencedaily.com/releases/2014/11/141107091453.htm | Understanding the 1989 Loma Prieta earthquake in an urban context | In an urban environment, the effect of a major earthquake such as the 17 Oct. 1989 Loma Prieta event can be pieced together by the infrastructure damaged or destroyed. | This study by Kevin M. Schmidt and colleagues details the effects of the Loma Prieta earthquake still detectable 25 years on and sheds light on the potential damage to infrastructure from future earthquakes along the San Andreas fault or the neighboring Foothills thrust belt. Despite the absence of primary surface rupture from the 1989 Loma Prieta earthquake, patterns of damage to pavement and utility pipes can be used to assess ground deformation near the southwest margin of the densely populated Santa Clara or "Silicon" Valley, California, USA. Schmidt and colleagues utilized more than 1,400 damage sites as an urban strain gage to determine relationships between ground deformation and previously mapped faults. Post-earthquake surveys of established monuments and the concrete channel lining of Los Gatos Creek reveal belts of deformation consistent with regional geologic structure. The authors conclude that reverse movement largely along preexisting faults, probably enhanced significantly by warping combined with enhanced ground shaking, produced the widespread ground deformation. Such damage, with a preferential NE-SW sense of shortening, occurred in response to the 1906 and 1989 earthquakes and will likely repeat itself in future earthquakes in the region. | Earthquakes | 2,014
November 4, 2014 | https://www.sciencedaily.com/releases/2014/11/141104141912.htm | Tectonic plates not rigid, deform horizontally in cooling process | The puzzle pieces of tectonic plates that make up the outer layer of earth are not rigid and don't fit together as nicely as we were taught in high school. | Using large-scale numerical modeling as well as GPS velocities from the largest GPS data-processing center in the world -- the Nevada Geodetic Laboratory at the University of Nevada, Reno -- Kreemer and Gordon have shown in a newly published study that cooling of the lithosphere, the outermost layer of Earth, makes some sections of the Pacific plate contract horizontally at faster rates than other sections. This causes the plate to deform. Gordon's idea is that the plate cooling, which makes the ocean deeper, also affects horizontal movement and that there is shortening and deformation of the plates due to the cooling. In partnering with Kreemer, the two put their ideas and expertise together to show that the deformation could explain why some parts of the plate tectonic puzzle didn't fall neatly into place in recent plate motion models, which are based on spreading rates along mid-oceanic ridges. Kreemer and Gordon also showed that there is a positive correlation between where the plate is predicted to deform and where intraplate earthquakes occur. Their work was supported by the National Science Foundation. Results of the study suggest that plate-scale horizontal thermal contraction is significant, and that it may be partly released seismically. The pair of researchers are, as the saying goes, rewriting the textbooks. "This is plate tectonics 2.0, it revolutionizes the concepts of plate rigidity," Kreemer, who teaches in the University's College of Science, said. "We have shown that the Pacific plate deforms, that it is pliable. We are refining the plate tectonic theory and have come up with an explanation for mid-plate seismicity." The oceanic plates are shortening due to cooling, which causes relative motion inside the plate, Kreemer said. The oceanic crust of the Pacific plate offshore California is moving 2 mm to the south every year relative to the Pacific/Antarctic plate boundary. "It may not sound like much, but it is significant considering that we can measure crustal motion with GPS within a fraction of a millimeter per year," he said. "Unfortunately, all existing GPS stations on Pacific islands are in the old part of the plate that is not expected nor shown to deform. New measurements will be needed within the young parts of the plate to confirm this study's predictions, either on very remote islands or through sensors on the ocean floor." This work is complementary to Kreemer's ongoing effort to quantify the deformation in all of Earth's plate boundary zones with GPS velocities -- data that are for a large part processed in the Nevada Geodetic Laboratory. The main goal of the global modeling is to convert the strain rates to earthquake forecast maps. "Because we don't have GPS data in the right places of the Pacific plate, our prediction of how that plate deforms can supplement the strain rates I've estimated in parts of the world where we can quantify them with GPS data," Kreemer said. "Ultimately, we hope to have a good estimate of strain rates everywhere so that the models not only forecast earthquakes for places like Reno and San Francisco, but also for places where you may expect them the least." | Earthquakes | 2,014
November 4, 2014 | https://www.sciencedaily.com/releases/2014/11/141104121306.htm | Geologist reveals correlation between earthquakes, landslides in Peru | A geologist in Syracuse University's College of Arts and Sciences has demonstrated that earthquakes--not climate change, as previously thought--affect the rate of landslides in Peru. | The finding is the subject of a journal article. "Geologic records of landslide activity offer rare glimpses into landscapes evolving under the influence of tectonics and climate," says McPhillips, whose expertise includes geomorphology and tectonics. "Because deposits from individual landslides are unlikely to be preserved, it's difficult to reconstruct landslide activity in the geologic past. Therefore, we've developed a method that measures landslide activity before and after the last glacial-interglacial climate transition in Peru." McPhillips and his team have spent the past several years in the Western Andes Mountains, studying cobbles in the Quebrada Veladera river channel and in an adjacent fill terrace. By measuring the amount of a nuclide known as Beryllium-10 (Be-10) in each area's cobble population, they've been able to calculate erosion rates over tens of thousands of years. The result? The range of Be concentrations in terrace cobbles from a relatively wet period, more than 16,000 years ago, was no different from those found in river channel cobbles from more recent arid periods. "This suggests that the amount of erosion from landslides has not changed in response to climatic changes," McPhillips says. "Our integrated millennial-scale record of landslides implies that earthquakes may be the primary landslide trigger." McPhillips says the study is the first to study landslides by measuring individual particles of river sediment, as opposed to amalgamating all the particles and then measuring a single concentration. "These concentrations provide a robust record of hill-slope behavior over long timescales," he adds. "Millennial-scale records of landslide activity, especially in settings without preserved landslide deposits, are an important complement to studies documenting modern landslide inventories." Earthquakes are a regular occurrence in Peru, which is located at the nexus of the small Nazca oceanic plate and the larger South American crustal plate. The ongoing subduction, or sliding, of the Nazca Plate under the South American Plate has spawned considerable tectonic activity. "Peru is rife with earthquakes, landslides, volcanic eruptions, and tectonic uplift," McPhillips adds. "By studying its past, we may be able to better predict and prepare for future calamities." | Earthquakes | 2,014
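The Be-10 approach mentioned above rests on a standard steady-state relation: the slower a surface erodes, the longer it is exposed to cosmic rays and the more nuclide it accumulates, so E ≈ P·Λ/(ρ·N). The sketch below uses typical textbook parameters; every number is an assumed illustration, not a value from the paper:

```python
# Steady-state cosmogenic erosion rate, E = P * Lambda / (rho * N).
# All parameter values below are assumed for illustration.
P = 5.0          # 10Be production rate at the surface, atoms/g/yr (assumed)
LAMBDA = 160.0   # attenuation mass depth, g/cm^2 (typical for neutrons)
RHO = 2.7        # rock density, g/cm^3

def erosion_rate_mm_per_kyr(n_atoms_per_g):
    """Steady-state erosion rate implied by a measured 10Be concentration."""
    e_cm_per_yr = (P * LAMBDA) / (RHO * n_atoms_per_g)
    return e_cm_per_yr * 10.0 * 1000.0  # cm/yr -> mm/kyr

# Lower nuclide concentration => faster erosion (less cosmic-ray exposure).
for n in (1e5, 5e5, 2e6):  # atoms/g, assumed range
    print(f"N = {n:.0e} atoms/g -> {erosion_rate_mm_per_kyr(n):6.1f} mm/kyr")
```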
November 1, 2014 | https://www.sciencedaily.com/releases/2014/11/141101173219.htm | Study of Chile earthquake finds new rock structure that affects earthquake rupture | Researchers from the University of Liverpool have found an unusual mass of rock deep in the active fault line beneath Chile which influenced the rupture size of a massive earthquake that struck the region in 2010. | The geological structure, which was not previously known about, is unusually dense and large for this depth in Earth's crust. The body was revealed using 3-D seismic images of Earth's interior based on the monitoring of vibrations on the Pacific seafloor caused by aftershocks from the magnitude 8.8 Chile earthquake. This imaging works in a similar way to CT scans that are used in hospitals. Analysis of the 2010 earthquake also revealed that this structure played a key role in the movement of the fault, causing the rupture to suddenly slow down. Seismologists think that the block of rock was once part of Earth's mantle and may have formed around 220 million years ago, during the period of time known as the Triassic. Liverpool seismologist Stephen Hicks, from the School of Environmental Sciences, who led the research, said: "It was previously thought that dense geological bodies in an active fault zone may cause more movement of the fault during an earthquake." "However, our research suggests that these blocks of rock may in fact cause the earthquake rupture to suddenly slow down. But this slowing down can generate stronger shaking at the surface, which is more damaging to human-made structures." "It is now clear that ancient geology plays a big role in the generation of future earthquakes and their subsequent aftershocks." Professor Andreas Rietbrock, head of the Earthquake Seismology and Geodynamics research group, added: "This work has clearly shown the potential of 3D 'seismic' images to further our understanding of the earthquake rupture process. We are currently establishing the Liverpool Earth Observatory (LEO), which will allow us, together with our international partners, to carry out similar studies in other tectonically active regions such as northern Chile, Indonesia, New Zealand and the northwest coast of the United States. This work is vital for understanding risk exposure in these countries from both ground shaking and tsunamis." Chile is located on the Pacific Ring of Fire, where the sinking of tectonic plates generates many of the world's largest earthquakes. The 2010 magnitude 8.8 earthquake in Chile is one of the best-recorded earthquakes, giving seismologists the best insight to date into the ruptures of mega-quakes. | Earthquakes | 2,014
October 30, 2014 | https://www.sciencedaily.com/releases/2014/10/141030142024.htm | Magma pancakes beneath Indonesia's Lake Toba: Subsurface sources of mega-eruptions | The tremendous amounts of lava that are emitted during super-eruptions accumulate in the Earth's crust over millions of years prior to the event. These reservoirs consist of magma that intrudes into the crust in the form of numerous horizontally oriented sheets resting on top of each other like a pile of pancakes. | A team of geoscientists from Novosibirsk, Paris and Potsdam presents these results in the current issue. Geoscientists were interested in finding out: How can the gigantic amounts of eruptible material required to form such a supervolcano accumulate in the Earth's crust? Was this a singular event thousands of years ago, or can it happen again? Researchers from the GFZ German Research Centre for Geosciences successfully installed a seismometer network in the Toba area to investigate these questions and provided the data to all participating scientists via the GEOFON data archive. GFZ scientist Christoph Sens-Schönfelder, a co-author of the study, explains: "With a new seismological method we were able to investigate the internal structure of the magma reservoir beneath the Toba caldera. We found that the middle crust below the Toba supervolcano is horizontally layered." The answer thus lies in the structure of the magma reservoir. Here, below 7 kilometers the crust consists of many, mostly horizontal, magmatic intrusions still containing molten material. It was already suspected that the large volume of magma ejected during the supervolcanic eruption had slowly accumulated over the last few millions of years in the form of successively emplaced intrusions. This could now be confirmed with the results of field measurements. The GFZ scientists used a novel seismological method for this purpose. Over a six-month period they recorded the ambient seismic noise, the natural vibrations which usually are regarded as disturbing signals. With a statistical approach they analyzed the data and discovered that the velocity of seismic waves beneath Toba depends on the direction in which the waves shear the Earth's crust. Above 7 kilometers depth the deposits of the last eruption formed a zone of low velocities. Below this depth the seismic anisotropy is caused by horizontally layered intrusions that structure the reservoir like a pile of pancakes. This is reflected in the seismic data. Not only in Indonesia, but also in other parts of the world there are such supervolcanoes, which erupt only every couple of hundred thousand years, but then in gigantic eruptions. Because of their size those volcanoes do not build up mountains but manifest themselves with the huge crater formed during the eruption -- the caldera. Other known supervolcanoes include the area of Yellowstone Park, volcanoes in the Andes, and the caldera of Lake Taupo in New Zealand. The present study helps to better understand the processes that lead to such super-eruptions. | Earthquakes | 2,014
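The core idea behind ambient-noise methods like the one used at Toba is that cross-correlating long noise recordings from two stations recovers the travel time of waves propagating between them. Below is a minimal synthetic sketch of that principle; the sampling rate, lag, and noise levels are all assumed, and it is not the (anisotropy-sensitive) analysis the GFZ team actually performed:

```python
import numpy as np
from scipy import signal

# Toy ambient-noise interferometry: a common noise source recorded at two
# stations, with station B receiving it 4 s later plus local noise.
rng = np.random.default_rng(0)
fs = 50.0                         # sampling rate, Hz (assumed)
n = int(fs * 3600)                # one hour of synthetic "noise"
true_lag_s = 4.0                  # assumed inter-station travel time

src = rng.standard_normal(n)
sta_a = src
sta_b = np.roll(src, int(true_lag_s * fs)) + 0.5 * rng.standard_normal(n)

# FFT-based cross-correlation; the lag of the peak estimates the travel time.
xcorr = signal.correlate(sta_b, sta_a, mode="full", method="fft")
lag_s = (np.argmax(xcorr) - (n - 1)) / fs
print(f"recovered inter-station travel time: {lag_s:.2f} s")  # ~4.00 s
```

Repeating this over many station pairs and wave polarizations is what lets seismologists measure direction-dependent velocities, and hence anisotropy, without any earthquakes at all.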
October 29, 2014 | https://www.sciencedaily.com/releases/2014/10/141029203945.htm | Urban seismic network detects human sounds | When listening to the Earth, what clues can seismic data reveal about the impact of urban life? Although naturally occurring vibrations have proven extremely useful to seismologists, until now the vibrations caused by humans haven't been explored in any real depth. | Scripps Institution of Oceanography researchers Nima Riahi, a postdoctoral fellow, and Peter Gerstoft, a geophysicist, will describe their efforts to tap into an urban seismic network to monitor the traffic of trains, planes, automobiles and other modes of human transport. They will present the work this week at the 168th Meeting of the Acoustical Society of America (ASA), which will be held October 27-31, 2014, at the Indianapolis Marriott Downtown Hotel. Traffic in urban areas generates both acoustic and seismic "noise." While seismic noise typically isn't perceptible by humans, it could prove to be an interesting data source for traffic information systems in the near future. "Earlier this year an industrial partner offered us access to a large vibration dataset acquired over the city of Long Beach, Calif., so we seized the opportunity," explained Riahi. This particular dataset consists of a 5,300-geophone network -- deployed as part of a hydrocarbon industry survey -- covering an area of more than 70 km². Geophone devices are commonly used to record energy waves reflected by the subsurface geology as a way of mapping out geologic structures or tracking earthquakes. "By recording vibrations via geophones spaced roughly every 100 meters (300 feet), we were able to look into activity in Long Beach with a resolution below a typical city block," said Riahi. This begs the question: What urban processes can the space and time structure of vibrational intensity reveal? Much to their surprise, Riahi and Gerstoft discovered that "by using mostly standard signal processing, we can follow a metro schedule, count aircraft and their acceleration on a runway, and even see larger vehicles on a 10-lane highway." More refined techniques and algorithms may well uncover many other types of human-made signals within the Earth. These findings indicate that urban vibrations can serve as a new data source to observe cities. "Traffic monitoring tasks are an important and obvious application, but other uses may be involved in urban area characterization in which the type and schedule of activities can be visualized, so that it's possible to vibrationally identify industrial, residential or office zones," Riahi added. | Earthquakes | 2,014
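A hedged sketch of the kind of "mostly standard signal processing" the authors describe: band-pass a geophone trace around plausible traffic frequencies, then track its smoothed RMS intensity so periodic sources (a metro schedule, say) show up as repeating bursts. The trace, frequencies, and thresholds below are all synthetic assumptions, not the Long Beach data:

```python
import numpy as np
from scipy import signal

# Synthetic geophone trace: background noise plus a 12 Hz "train" burst
# every three minutes. All values are assumed for illustration.
fs = 250.0                                   # sampling rate, Hz
t = np.arange(0, 600, 1 / fs)                # ten synthetic minutes
trace = 0.1 * np.random.default_rng(1).standard_normal(t.size)
for start in (60, 240, 420):                 # burst start times, s
    burst = (t > start) & (t < start + 20)
    trace[burst] += np.sin(2 * np.pi * 12 * t[burst])

# Band-pass around assumed traffic frequencies, then smoothed RMS envelope.
sos = signal.butter(4, [5, 30], btype="bandpass", fs=fs, output="sos")
filtered = signal.sosfiltfilt(sos, trace)
win = int(5 * fs)                            # 5 s smoothing window
rms = np.sqrt(np.convolve(filtered**2, np.ones(win) / win, mode="same"))

# Peaks in the envelope reveal the "schedule" of the vibration source.
peaks, _ = signal.find_peaks(rms, height=0.3, distance=int(60 * fs))
print("burst times (s):", (peaks / fs).round())  # one peak per 20 s burst
```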
October 28, 2014 | https://www.sciencedaily.com/releases/2014/10/141028082133.htm | Radiation exposure linked to aggressive thyroid cancers, researchers confirm for the first time | For the first time, researchers have found that exposure to radioactive iodine is associated with more aggressive forms of thyroid cancer, according to a careful study of nearly 12,000 people in Belarus who were exposed when they were children or adolescents to fallout from the 1986 Chernobyl nuclear power plant accident. | Researchers examined thyroid cancers diagnosed up to two decades after the Chernobyl accident and found that higher thyroid radiation doses estimated from measurements taken shortly after the accident were associated with more aggressive tumor features. "Our group has previously shown that exposures to radioactive iodine significantly increase the risk of thyroid cancer in a dose-dependent manner. The new study shows that radiation exposures are also associated with distinct clinical features that are more aggressive," said the paper's first author, Lydia Zablotska, MD, PhD, associate professor in the Department of Epidemiology and Biostatistics at UC San Francisco (UCSF). The paper will be published online. Zablotska said the findings have implications for those exposed to radioactive iodine fallout from the 2011 nuclear reactor incidents in Fukushima, Japan, after the reactors were damaged by an earthquake-induced tsunami. "Those exposed as children or adolescents to the fallout are at highest risk and should probably be screened for thyroid cancer regularly, because these cancers are aggressive, and they can spread really fast," Zablotska said. "Clinicians should be aware of the aggressiveness of radiation-associated tumors and closely monitor those at high risk." Chernobyl studies led by Zablotska also showed for the first time that exposures to the radioactive iodine after the Chernobyl nuclear plant accident are associated with a whole spectrum of thyroid diseases, from benign to malignant. Benign encapsulated tumors of the thyroid gland are called follicular adenomas, and are treated in the same way as thyroid cancer -- by removing the thyroid gland, then giving patients pills to replace the hormones that are lost. Lifelong hormone supplementation treatment is both costly and complicated for patients. Thyroid cancer is ordinarily rare among children, with less than one new case per million diagnosed each year. Among adults, about 13 new cases will be diagnosed each year for every 100,000 people, according to the Surveillance, Epidemiology and End Results (SEER) Program of the National Cancer Institute (NCI). But in the Belarus cohort, the researchers diagnosed 158 thyroid cancers among 11,664 subjects during three rounds of screening. Those who had received higher radiation doses also were more likely to have solid or diffuse variants of thyroid cancer, as well as to have more aggressive tumor features, such as spread to lymphatic vessels and several simultaneous cancer lesions in the thyroid gland. | Earthquakes | 2,014
October 23, 2014 | https://www.sciencedaily.com/releases/2014/10/141023100430.htm | Hippos-Sussita excavation: Silent evidence of the earthquake of 363 CE | Silent evidence of a large earthquake in 363 CE -- the skeleton of a woman with a dove-shaped pendant was discovered under the tiles of a collapsed roof by archeologists from the University of Haifa during this excavation season at Hippos-Sussita. They also found a large muscular marble leg and artillery ammunition from some 2,000 years ago. "The data is finally beginning to form a clear historical-archaeological picture," said Dr. Michael Eisenberg, head of the international excavation team. | The past fifteen excavation seasons at Hippos-Sussita, run by archeologists from the Zinman Institute of Archaeology at the University of Haifa, have not stopped providing a constant flow of fascinating findings. The team digging at the city site -- situated east of the Sea of Galilee in the Sussita National Park, which is under the management of the Israel Nature and Parks Authority -- has grown over the years, with more and more teams and excavators from various countries joining them. This time, the security situation in the south of Israel "sent" them a Canadian team, led by Dr. Stephen Chambers, as reinforcement. The city of Hippos-Sussita, which was founded in the second century BCE, experienced two strong and well-documented earthquakes. The first was in the year 363 CE and it caused heavy damage. The city did, however, recover. The great earthquake of 749 CE destroyed the city, which was subsequently abandoned completely. Evidence of the extensive damage caused by the earthquake of 363 was found in earlier seasons. None, however, was as violent, thrilling and eerie as the evidence discovered this year. To the north of the basilica, the largest building in town that served as the commercial, economic and judicial center of the city, the dig's senior area supervisor Haim Shkolnik and his team unearthed the remains of several skeletons that had been crushed by the weight of the collapsed roof. Among the bones of one of the women lay a gold dove-shaped pendant. This year, evidence was found for the first time that the great earthquake of 363 CE had destroyed the Roman bathhouse, which was uncovered by the team run by Arleta Kowalewska from Poland. Like the basilica, it too was not rebuilt. According to Dr. Eisenberg, the evidence found so far shows that the earthquake was so powerful it completely destroyed the city, which took some twenty years to be rebuilt. Among the wreckage from the bathhouse, an excellent Roman marble sculpture of a muscular right leg of a man leaning against a tree trunk was found. "It is too early to determine who the man depicted in the sculpture was. It could be the sculpture of a god or an athlete; it was more than two meters tall. We hope to find more parts of the sculpture in the coming seasons to shed some light on his identity," said Dr. Eisenberg. Excavations were resumed in the bastion, the main defense post of the Roman period city built on the southern edge of the cliff, where the work focused on the fortified position of a projectile machine that propelled/launched ballista stones. The catapult was some eight meters long according to the size of the chamber. So far the archeologists have found a number of ballista balls that fit the massive catapult, as well as smaller balls that were used on smaller ballista machines.
These machines were positioned above the bastion's vaults and were used to launch basalt ballista balls slightly smaller than soccer balls as far as 350 meters. A section of the western part of the city's main colonnaded street, which traversed its entire length of 600 meters from east to west (the decumanus maximus), was excavated this year with the help of a Canadian team, after their planned dig in the south was cancelled. The archeologists uncovered another original piece of the wall that supported the street columns, confirming the theory that it had been a magnificent colonnaded street similar to those of the Roman East cities that were built at the peak of the Pax Romana -- the Roman era of peace during the first few centuries CE. While working on the dig, the team also invested a lot of work on the site's conservation. "I am extremely proud that we were able to organize a sizable conservation team this year as well, from our own internal budgets and with the help of the Western Galilee College in Acre. Twenty-two students from the college's Department of Conservation together with five experienced conservators under the direction of Julia Burdajewicz from the Academy of Fine Arts in Warsaw conducted the conservation work. This is one of the major tourist destinations in the northern part of the country, and as such I see this as a national mission, even if the budget comes primarily from our own sources, without government support," concluded Dr. Eisenberg. | Earthquakes | 2,014
October 23, 2014 | https://www.sciencedaily.com/releases/2014/10/141023090736.htm | Should the Japanese give nuclear power another chance? | On September 9, 2014, the Japan Times reported an increasing number of suicides coming from the survivors of the March 2011 disaster. In Minami Soma Hospital, which is located 23 km away from the power plant, the number of patients experiencing stress has also increased since the disaster. What's more, many of the survivors are now jobless and therefore facing an uncertain future. | This is not the first time that nuclear power has victimized the Japanese people. In 1945, atomic bombs exploded in Hiroshima and Nagasaki, creating massive fears about nuclear power in the Japanese population. It took 20 years for the public to erase the trauma of these events. It was then -- in the mid-1960s -- that the Fukushima Daiichi Nuclear Power Plant was built. According to Professor Tetsuo Sawada, Assistant Professor in the Laboratory of Nuclear Reactors at Tokyo University, it took a lot of effort to assure people that nuclear power was safe and beneficial. The first step was a legal step: In 1955, the Japanese government passed a law decreeing that nuclear power could only be used for peaceful purposes. "But that law was not enough to assure people to accept the establishment of nuclear power," said Prof. Sawada. He explained that the economy plays an important role in public acceptance of nuclear power. Through the establishment of nuclear power plants, more jobs were created, which boosted the economy of the Fukushima region at that time. "Before the Fukushima disaster, we could find many pro-nuclear people in the area of nuclear power plants since it gave them money," said Prof. Sawada. Now, more than forty years have passed and the public's former confidence has evolved into feelings of fear about nuclear power and distrust toward the government. According to a study conducted by Noriko Iwai from the Japanese General Social Survey Research Center, the Fukushima nuclear accident has heightened people's perception of disaster risks, fears of nuclear accident, and recognition of pollution, and has changed public opinion on nuclear energy policy. "Distance from nuclear plants and the perception of earthquake risk interactively correlate with opinions on nuclear issues: among people whose evaluation of earthquake risk is low, those who live nearer to the plants are more likely to object to the abolishment of nuclear plants," said Iwai. This finding is in line with the perception of Sokyu Genyu, a chief priest in Fukujuji temple, Miharu Town, Fukushima Prefecture. As a member of the Reconstruction Design Council in Response to the Great East Japan Earthquake, he argued that both the Fukushima Daiichi and Daini nuclear power plants should be shut down in response to the objection of 80% of Fukushima residents. However, the Japanese government, local scientists and international authorities have announced that Fukushima is safe. Radiation levels are below 1 mSv/y, a number that, according to them, we should not be worried about. But the public do not believe in numbers. But Genyu was not saying that these numbers are scientifically false. Rather, he argues that the problem lies more in the realm of social psychology. Despite the announcement about low-radiation levels, the Japanese people are still afraid of radiation. "It is reasonable for local residents in Fukushima to speak out very emotionally. Within three months of the disaster, six people had committed suicide.
They were homeless and jobless," said Genyu. It is heart-breaking to know that victims of the Fukushima Daiichi nuclear accident died not because of radiation, but instead because of depression. Besides the increasing number of suicides, the number of patients suffering from cerebrovascular disease (strokes) has also risen. In Minami-Soma Hospital, the population of stroke patients increased by more than 100% after the disaster. Local doctors and scientists are now actively educating students in Fukushima, convincing them that the radiation will not affect their health. Dr. Masaharu Tsubokura, a practicing doctor at Minami-Soma Hospital, has been informing students that Fukushima is safe. But sadly, their responses are mostly negative and full of apathy. "I think the Fukushima disaster is not about nuclear radiation but is rather a matter of public trust in the technology," said Dr. Tsubokura. Dr. Tsubokura has given dosimeters, devices used to measure radiation, to children living in Minami-Soma city. But apparently, this was not enough to eliminate people's fears. In 2012, Professor Ryogo Hayano, a physicist from the University of Tokyo, joined Dr. Tsubokura in Minami-Soma Hospital and invented BABYSCAN technology, a whole-body scanner to measure radiation in small children as well as to allay the fears of Fukushima parents. "BABYSCAN is unnecessary but necessary. It is unnecessary because we know that the radiation is low. But it is necessary to assure parents that their children are going to be okay," said Prof. Hayano. After witnessing the fears of the Fukushima people, Prof. Hayano thinks that nuclear power is no longer appropriate for Japan. He believes that the government should shut down nuclear power plants. "As a scientist, I know that nuclear power is safe and cheap. But looking at the public's fear in Fukushima, I think it should be phased out," said Prof. Hayano. But, does the government care about the public when it comes to politics? It has only been three years since the disaster, and Prime Minister Shinzo Abe has been keen to revive the country's nuclear power plants. The operations of more than 50 nuclear power plants in Japan have been suspended because of the Daiichi power plant meltdown. Last month, Japan's Nuclear Regulation Authority approved the reopening of a power plant in Sendai for 2015. | Earthquakes | 2,014
October 21, 2014 | https://www.sciencedaily.com/releases/2014/10/141021115349.htm | Rising above the risk: America's first tsunami refuge | Washington's coast is so close to the seismically active Cascadia Subduction Zone that if a megathrust earthquake were to occur, a tsunami would hit the Washington shoreline in just 25 minutes. | One coastal community is preparing for such a disaster by starting construction on the nation's first tsunami evacuation refuge, large enough to shelter more than 1,000 people who are within 20-minute walking distance. The vertical evacuation refuge will be the roof of the gym of the new school in Grays Harbor County, Washington. The Ocosta Elementary School and Tsunami Safe Haven will be the first of its kind in the nation and will be the culmination of 18 years of effort, said Tim Walsh, chief hazard geologist at the Department of Natural Resources, who has been working on this project since the National Tsunami Hazard Mitigation Program was formed in 1995. Walsh will present the project design for the school and structure, along with the detailed tsunami modeling used to find the best location for the refuge, at the Annual Meeting of the Geological Society of America in Vancouver, Canada, on 21 October. The Cascadia subduction zone is a 700-mile-long (over 1,000 kilometers) fault along the West Coast, where the Juan de Fuca Plate is being forced under the North American Plate. The subduction zone is capable of producing massive earthquakes; scientists have calculated that magnitude-9 earthquakes along this fault line could generate a massive tsunami that would hit the coastlines of British Columbia, Washington, Oregon, and California within 20 to 30 minutes. "It used to be thought that Cascadia was not an active fault," said Walsh. Not only has Cascadia been found to be an active fault, it has a 10 percent chance that it will cause an earthquake in the next 50 years, he said. "It is more than 10 times more likely than the chance you will be killed in a traffic accident," said Walsh. "But you aren't looking at the statistics of a single person, but an earthquake that would have an effect on thousands of miles of shoreline." The biggest challenge was at the very beginning, trying to come up with a location that could be effective and accessible to people, said Walsh. "It was difficult in the beginning to go to the public meetings in these communities and present the hazards, but have no solution for them," he said. Project Safe Haven brought together structural engineers, oceanographers, geographers, and scientists from many other disciplines to create a safe and accessible refuge. Walsh and his colleagues used a model called GeoClaw to research the risk of a tsunami, factoring in any potential landslides caused by the wave or megaquake. Using this model for the Grays Harbor County community, the scientists determined the best place for the school, and how much force the structure would have to withstand to protect refugees. The school will be built on a dune ridge, so the roof of the evacuation shelter will be about 55 feet (almost 17 meters) above sea level. The structure is designed to withstand earthquakes and the impact of a storm surge, with reinforced concrete cores at each corner of the gym and staircases leading to the roof.
The school, and refuge, is expected to be finished and operating for the 2015-2016 academic year. Walsh would like to see other scientists and community groups working together to create novel solutions for tsunami risk, he said. Currently, the Washington coast has very few tall buildings, and barely any are taller than three stories, leaving thousands of people at risk in the event of a tsunami, he said. | Earthquakes | 2,014
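The 25-minute arrival time cited for the Washington shoreline follows from shallow-water wave physics: a tsunami crosses open water at roughly c = sqrt(g*h). Below is a minimal sketch of that estimate; the average water depth and rupture-to-shore distance are assumed round numbers for illustration, not figures from the article.

```python
import math

# Minimal sketch: shallow-water tsunami speed, c = sqrt(g * h), and the
# resulting travel time to shore. Depth and distance are assumed values.

G = 9.81  # m/s^2, gravitational acceleration

def tsunami_speed_ms(depth_m):
    return math.sqrt(G * depth_m)

avg_depth_m = 500.0   # assumed average depth between the rupture and shore
distance_km = 100.0   # assumed offshore distance to the Cascadia rupture

speed = tsunami_speed_ms(avg_depth_m)            # ~70 m/s (~250 km/h)
minutes = distance_km * 1000.0 / speed / 60.0
print(f"speed ~ {speed * 3.6:.0f} km/h, arrival in ~ {minutes:.0f} minutes")
```

With these round numbers the wave arrives in roughly 24 minutes, consistent with the 25-minute figure quoted above.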
October 21, 2014 | https://www.sciencedaily.com/releases/2014/10/141021101608.htm | A global surge of great earthquakes from 2004-2014 and implications for Cascadia | The last ten years have been a remarkable time for great earthquakes. Since December 2004 there have been no fewer than 18 quakes of Mw8.0 or greater -- a rate of more than twice that seen from 1900 to mid-2004. Hundreds of thousands of lives have been lost and massive damage has resulted from these great earthquakes. But as devastating as such events can be, these recent great quakes have come with a silver lining: They coincide with unprecedented advances in technological and scientific capacity for learning from them. | "We previously had very limited information about how ruptures grow into great earthquakes and interact with regions around them," said seismologist Thorne Lay of the University of California at Santa Cruz. "So we are using the recorded data for these recent events to guide our understanding of future earthquakes. We've gained a new level of appreciation for how one earthquake can influence events in other zones." High on the list of areas ripe for a great quake is Cascadia, the Pacific Northwest, where the risk for great quakes had long been underappreciated. Evidence began surfacing about 20 years ago that there had been a great quake in the region in the year 1700. Since then the view of the great quake risk in Cascadia has shifted dramatically. "We don't know many details about what happened in 1700," said Lay. There were no instruments back then to observe and record it. And so the best way to try and understand the danger and what could happen in Cascadia is to study the recent events elsewhere. Over the last decade Lay and his colleagues have been able to gather fine details about these giant earthquakes using data from an expanded global network of seismometers, GPS stations, tsunami gauges, and new satellite imaging capabilities such as GRACE, InSAR, and Landsat interferometry. Among the broader conclusions they have come to is that great quakes are very complicated and idiosyncratic. Lay will be presenting some of those idiosyncrasies at the meeting of the Geological Society of America in Vancouver on Oct. 21. "What we've seen is that we can have multiple faults activated," said Lay. "We've seen it off Sumatra and off Japan. Once earthquakes get going they can activate faulting in areas that were thought not physically feasible." The great Sumatra-Andaman earthquake of Dec. 26, 2004, for instance, unzipped a 1,300-kilometer-long segment of the subduction zone and unleashed one of history's most destructive, deadly tsunamis. Much of the rupture was along a region with very limited plate convergence. In Japan, the Kuril Islands, and the Solomon Islands, great mega-thrust ruptures have ruptured portions of the subduction zones that were thought too warm or weak to experience earthquakes. "These earthquakes ruptured right through areas that had been considered to have low risk," said Lay. "We thought that would not happen. But it did, so we have to adjust our understanding." Perhaps the best recent analogy to Cascadia is off the coast of Iquique, Chile, said Lay. A great quake struck there in 1877, and there has been a conspicuous gap in quakes ever since. As with the 1700 Cascadia earthquake, there is little data for the 1877 event, which killed more than 2,500 people. In both subduction zones, the converging plates are thought to be accumulating strain which could be released in a very large and violent rupture.
On April 1 of this year, some of that strain was released offshore of Iquique. There was a Mw8.1 rupture in the northern portion of the seismic gap. But it involved slip over less than 20 percent of the region that seismologists believe to have accumulated strain since 1877. "We have no idea why only a portion of the 1877 zone ruptured," said Lay. "But clearly, 80 percent of that zone is still unruptured. We don't have a good basis for assessment of how the rest will fail. It's the same for Cascadia. We don't know if it always goes all at once or sometimes in sequences of smaller events, with alternating pattern. It is prudent to prepare for the worst case of failure of the entire region in a single event, but it may not happen that way every time." What is certain is that studying these recent big earthquakes has given geophysicists the best information ever about how they work and points to new ways to begin understanding what could be in Cascadia's future. | Earthquakes | 2,014
October 20, 2014 | https://www.sciencedaily.com/releases/2014/10/141020121529.htm | Massive debris pile reveals risk of huge tsunamis in Hawaii | A mass of marine debris discovered in a giant sinkhole in the Hawaiian islands provides evidence that at least one mammoth tsunami, larger than any in Hawaii's recorded history, has struck the islands, and that a similar disaster could happen again, new research finds. Scientists are reporting that a wall of water up to nine meters (30 feet) high surged onto Hawaiian shores about 500 years ago. A 9.0-magnitude earthquake off the coast of the Aleutian Islands triggered the mighty wave, which left behind up to nine shipping containers' worth of ocean sediment in a sinkhole on the island of Kauai. | The tsunami was at least three times the size of a 1946 tsunami that was the most destructive in Hawaii's recent history, according to the new study that examined deposits believed to have come from the extreme event and used models to show how it might have occurred. Tsunamis of this magnitude are rare events. An earthquake in the eastern Aleutian Trench big enough to generate a massive tsunami like the one in the study is expected to occur once every thousand years, meaning that there is a 0.1 percent chance of it happening in any given year -- the same probability as the 2011 Tohoku earthquake that struck Japan, according to Gerard Fryer, a geophysicist at the Pacific Tsunami Warning Center in Ewa Beach, Hawaii. Nevertheless, the new research has prompted Honolulu officials to revise their tsunami evacuation maps to account for the possibility of an extreme tsunami hitting the county of nearly 1 million people. The new maps would more than double the area of evacuation in some locations, according to Fryer. "You're going to have great earthquakes on planet Earth, and you're going to have great tsunamis," said Rhett Butler, a geophysicist at the University of Hawaii at Manoa and lead author of the new study, which was published online. Hawaiians have told stories about colossal tsunamis hitting the islands for generations, but possible evidence of these massive waves was only first detected in the late 1990s when David Burney, a paleoecologist at the National Tropical Botanical Garden in Kalaheo, was excavating the Makauwahi sinkhole, a collapsed limestone cave on the south shore of Kauai. Two meters (six and a half feet) below the surface he encountered a layer of sediment marked by coral fragments, mollusk shells and coarse beach sand that could only have come from the sea. But the mouth of the sinkhole was separated from the shore by 100 meters (328 feet) of land and seven-meter (23-foot) high walls. Burney speculated that the deposit could have been left by a massive tsunami, but he was unable to verify the claim. The deposits remained a mystery until the Tohoku earthquake hit Japan in 2011. It caused water to surge inland like a rapidly rising tide, reaching heights up to 39 meters (128 feet) above the normal sea level. After that tsunami deluged the island nation, scientists began to question Hawaii's current tsunami evacuation maps. The maps are based largely upon the 1946 tsunami, which followed a magnitude 8.6 earthquake in the Aleutian Islands and caused water to rise only two and a half meters (8 feet) up the side of the Makauwahi sinkhole. "[The Japan earthquake] was bigger than almost any seismologist thought possible," said Butler. "Seeing [on live TV] the devastation it caused, I began to wonder, did we get it right in Hawaii?
Are our evacuation zones the correct size?" To find out, the study's authors used a wave model to predict how a tsunami would flood the Kauai coastline. They simulated earthquakes with magnitudes between 9.0 and 9.6 originating at different locations along the Aleutian-Alaska subduction zone, a 3,400-kilometer (2,113-mile) long ocean trench stretching along the southern coast of Alaska and the Aleutian Islands where the Pacific tectonic plate is slipping under the North American plate. The researchers found that the unique geometry of the eastern Aleutians would direct the largest post-earthquake tsunami energy directly toward the Hawaiian Islands. Inundation models showed that an earthquake with a magnitude greater than 9.0 in just the right spot could produce water levels on the shore that reached eight to nine meters (26 to 30 feet) high, easily overtopping the Makauwahi sinkhole wall where the ocean deposits were found. The authors used radiocarbon-dated marine deposits from Sedanka Island off the coast of Alaska and along the west coasts of Canada and the United States dating back to the same time period as the Makauwahi deposit to show that all three sediments could have come from the same tsunami and provide some evidence that the event occurred, according to the study. "[The authors] stitched together geological evidence, anthropological information as well as geophysical modeling to put together this story that is tantalizing for a geologist but it's frightening for people in Hawaii," said Robert Witter, a geologist at the U.S. Geological Survey in Anchorage, Alaska, who was not involved in the study. According to Witter, it is possible that a massive tsunami hit Hawaii hundreds of years ago, based on the deposits found in the Kauai sinkhole, but he said it is difficult to determine if all three locations experienced the same event based on radiocarbon dating alone. Radiocarbon dating only gives scientists a rough estimate of the age of a deposit, he said. All three locations offer evidence of a great tsunami occurring between 350 and 575 years ago, but it is hard to know if it was the same tsunami or ones that occurred hundreds of years apart. "An important next thing to do is to look for evidence for tsunamis elsewhere in the Hawaiian island chain," said Witter. Fryer, of the Pacific Tsunami Warning Center, is confident that more evidence of the massive tsunami will be found, confirming that events of this magnitude have rocked the island chain in the not-so-distant past. "I've seen the deposit," said Fryer, who was not involved in the study. "I'm absolutely convinced it's a tsunami, and it had to be a monster tsunami." Fryer is so convinced that he has worked with the city and county of Honolulu to update their tsunami evacuation maps to include the possibility of a massive tsunami the size of the one detailed in the new study hitting the islands. The county hopes to have the new maps distributed to residents by the end of the year, he said. "We prepared ourselves for the worst tsunami that's likely to happen in one hundred years," Fryer said of the current tsunami evacuation maps based on the 1946 event. "What hit Japan was a thousand-year event … and this scenario [in the eastern Aleutians] is a thousand-year event." | Earthquakes | 2,014
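The "0.1 percent chance in any given year" quoted above can be converted into a chance over a longer exposure window by treating the years as independent. A minimal sketch of that conversion:

```python
# Minimal sketch: P(at least one event in N years) = 1 - (1 - p)**N,
# assuming independent years. p matches the article's once-per-thousand-years
# figure for a tsunami-generating eastern Aleutian earthquake.

p_annual = 0.001

for n_years in (50, 100, 500):
    p_window = 1.0 - (1.0 - p_annual) ** n_years
    print(f"P(>=1 event in {n_years:3d} yr) = {p_window:.1%}")
```

Even a rare thousand-year event accumulates to about a 5 percent chance over a 50-year planning horizon, which is one way to see why the evacuation maps are being revised.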
October 20, 2014 | https://www.sciencedaily.com/releases/2014/10/141020104928.htm | Earthquakes in the ocean: Towards a better understanding of their precursors | Published on 14 September in Nature Geoscience, the study conducted by researchers from several institutes, including IFREMER (French Research Institute for Exploitation of the Sea), CNRS and IFSTTAR, offers the first theoretical model that, based on fluid-related processes, explains the seismic precursors of an underwater earthquake. Using quantitative measurements, this innovative model established a link between observed precursors and the mainshock of an earthquake. The results open a promising avenue of research for guiding future investigations on detecting earthquakes before they strike. | The data used to construct the model presented in the article were collected from subsea observatories* deployed in the North-East Pacific fracture zones. The researchers showed that the properties of the fluids that circulate in submarine fault zones change over time, during what is called the "seismic cycle". This term describes the cycle during which strain accumulates along a fault until it exceeds the frictional forces that prevent the fault from slipping. An earthquake results at the moment of rupture, due to the sudden release of built-up strain. A new cycle begins with strain accumulating and continues until the next rupture occurs along the fault... Due to their proximity to mid-ocean ridges, the fluids that circulate in the faults undergo tremendous pressure and extremely high temperatures. These fluids can reach the supercritical state. The physical properties of supercritical fluids (density, viscosity, diffusivity) are intermediate to those of liquids and gases. The compressibility of a supercritical fluid varies greatly with pressure, and, according to the study's analysis, this change in compressibility may trigger an earthquake, occurring after a short period of foreshocks. Seismic precursors are the early warning signs before an earthquake strikes. Many different types of earthquake precursors have been studied by the scientific community: ground movements, seismic signals, fluid or gas emissions, electrical signals, thermal signals, animal behaviour, etc. For an event as large as an earthquake, which releases a considerable amount of energy, there must be a preparatory phase. The problem in predicting earthquakes does not lie in the absence of precursors (hindsight observations are numerous), but in the capacity to detect these forerunners before the mainshock. The results of the model can help guide future research in the detection of seismic precursors with, ultimately, potential applications for earthquake prediction. Supercritical fluids require very specific conditions; they are also encountered on land in hydrothermal and volcanic areas, such as Iceland. Under the effect of tectonic forces, two antagonistic effects are usually in play near transform faults. First, increasing shear stress tends to break rocks and weaken resistance in the transform fault. Second, as slip increases the volume of the pore space between rock beds, the pressure of the fluid contained in the fault drops. This effect acts as a stabilising suction cup, counterbalancing the 'weakening' of the rock bed and delaying the triggering of an earthquake. The efficiency of this counterbalancing mechanism depends on fluid compressibility.
It is highest in the presence of fluids in the liquid state, whose low compressibility causes a dramatic decrease in fluid pressure in response to small increases in volume. Conversely, for gas-type fluids, which are highly compressible, the suction cup effect is nearly nonexistent. When a change in the 'liquid-gas' state of the fluid occurs during a fault slip, the counterbalancing mechanism fails, allowing a major shock to be triggered. This transition occurs over several days and has numerous signs, including many small foreshocks. *Subsea observatories are comparable to a laboratory on the seafloor. Equipped with a series of instruments, they record many types of data that can be used to study the geophysical events that occur in the ocean. | Earthquakes | 2,014
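To make the liquid-versus-gas compressibility contrast concrete: for a small pore dilation ΔV/V, the induced pressure drop is roughly Δp ≈ K·ΔV/V, where K is the fluid's bulk modulus. The sketch below uses textbook values (liquid water K ≈ 2.2 GPa; an isothermal ideal gas has K equal to its ambient pressure), not figures from the study.

```python
# Minimal sketch: pressure drop from a small pore-volume increase,
# dp ~ K * (dV / V). Bulk moduli are illustrative textbook values.

K_WATER = 2.2e9          # Pa, liquid water
P_PORE = 30e6            # Pa, assumed pore pressure at depth
K_GAS = P_PORE           # Pa, isothermal ideal gas: K equals ambient pressure

dV_over_V = 1e-3         # 0.1% dilation of the pore space during slip

for name, K in (("liquid", K_WATER), ("gas-like", K_GAS)):
    dp_mpa = K * dV_over_V / 1e6
    print(f"{name:8s} fluid: pressure drop ~ {dp_mpa:.2f} MPa")
```

The liquid's response is nearly two orders of magnitude stronger; that is the 'suction cup' that stabilises the fault, and it all but vanishes once the fluid behaves like a gas.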
October 14, 2014 | https://www.sciencedaily.com/releases/2014/10/141014211753.htm | Hydraulic fracturing linked to earthquakes in Ohio | Hydraulic fracturing triggered a series of small earthquakes in 2013 on a previously unmapped fault in Harrison County, Ohio, according to a study published in the journal | Nearly 400 small earthquakes occurred between Oct. 1 and Dec. 13, 2013, including 10 "positive" magnitude earthquakes, none of which were reported felt by the public. The 10 positive magnitude earthquakes, which ranged from magnitude 1.7 to 2.2, occurred between Oct. 2 and 19, coinciding with hydraulic fracturing operations at nearby wells. This series of earthquakes is the first known instance of seismicity in the area. Hydraulic fracturing, or fracking, is a method for extracting gas and oil from shale rock by injecting a high-pressure mixture of water, sand and chemicals directed at the rock to create cracks and release the gas inside. The process of cracking rocks results in micro-earthquakes. Hydraulic fracturing usually creates only small earthquakes, ones that have magnitude in the range of negative 3 (-3) to negative 1 (-1). "Hydraulic fracturing has the potential to trigger earthquakes, and in this case, small ones that could not be felt, however the earthquakes were three orders of magnitude larger than normally expected," said Paul Friberg, a seismologist with Instrumental Software Technologies, Inc. (ISTI) and a co-author of the study. The earthquakes revealed an east-west trending fault that lies in the basement formation at approximately two miles deep and directly below the three horizontal gas wells. The EarthScope Transportable Array Network Facility identified the first earthquakes on Oct. 2, 2013, locating them south of Clendening Lake near the town of Uhrichsville, Ohio. A subsequent analysis identified 190 earthquakes during a 39-hour period on Oct. 1 and 2, just hours after hydraulic fracturing began on one of the wells. The micro-seismicity varied, corresponding with the fracturing activity at the wells. The timing of the earthquakes, along with their tight linear clustering and similar waveform signals, suggests a unique source for the earthquakes -- the hydraulic fracturing operation. The fracturing likely triggered slip on a pre-existing fault, though one that is located below the formation expected to confine the fracturing, according to the authors. "As hydraulic fracturing operations explore new regions, more seismic monitoring will be needed since many faults remain unmapped," said Friberg, who co-authored the paper with Ilya Dricker, also with ISTI, and Glenda Besana-Ostman, originally with the Ohio Department of Natural Resources and now with the Bureau of Reclamation at the U.S. Department of the Interior. | Earthquakes | 2,014
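The "three orders of magnitude larger than normally expected" comparison can be made concrete with standard magnitude scaling: each unit of magnitude corresponds to a factor of 10 in ground-motion amplitude and about 10^1.5 (~32x) in radiated energy. A minimal sketch:

```python
# Minimal sketch: standard magnitude scaling. One magnitude unit = 10x in
# ground-motion amplitude and ~10^1.5 (~32x) in radiated energy.

def amplitude_ratio(m1, m2):
    return 10.0 ** (m1 - m2)

def energy_ratio(m1, m2):
    return 10.0 ** (1.5 * (m1 - m2))

# Largest Harrison County event (M 2.2) vs. a typical fracking
# micro-earthquake (M -1):
print(f"amplitude ratio: ~{amplitude_ratio(2.2, -1.0):,.0f}x")
print(f"energy ratio:    ~{energy_ratio(2.2, -1.0):,.0f}x")
```

A 3.2-unit jump in magnitude is roughly a 1,600-fold jump in amplitude, which is the "three orders of magnitude" Friberg describes.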
October 14, 2014 | https://www.sciencedaily.com/releases/2014/10/141014211711.htm | Many older adults still homebound after 2011 Great East Japan Earthquake | A new study, published online in the journal | A team of researchers led by Naoki Kondo of the University of Tokyo's School of Public Health studied data from the city of Rikuzentakata, an area that was seriously damaged by the disaster. Of its total population of 23,302 before the events of 2011, 1,773 people died or are still missing. Of 7,730 houses, 3,368 (43.6%) were affected, with 3,159 "completely destroyed." Much of the population had been concentrated in flat coastal areas, and since the community infrastructure was totally shattered, many people who lost their houses insisted on moving to areas in the mountains. This study used home-visit interviews with 2,327 adults over 65 years old (1,027 men; 1,300 women), and was carried out between August 2012 and October 2013. Interviewers gathered information on current morbidity, socio-economic status, health behaviour (diet, smoking, and alcohol intake), frequency of going out, and social support. 19.6% of men and 23.2% of women were shown to be homebound, defined as leaving the house only once every four or more days. Of those older adults who were classified as homebound, around 40% also had no contact with neighbours. Information was also obtained on the locations of grocery stores, convenience stores, and shopping centres from the online community directory database in August 2012. Information on shopper bus stops and hawker sites was provided by a disaster support team, and the team also collated road network data. This geographical analysis indicated that distance to retail stores was associated with the risk of people being homebound. Lead author Naoki Kondo says: "This study has important implications for public health, especially in the setting of post-disaster community reconstruction. First, community diagnoses in a post-disaster setting should cover the built environment, including access to shopping facilities. Second, to prevent older victims of a disaster such as the Great East Japan Earthquake being homebound, it is clearly essential to provide access to the facilities that fulfil their daily needs. "Given the findings of this study, such access could be increased by the private sector, suggesting the importance of public-private partnerships for post-disaster reconstruction." | Earthquakes | 2,014
October 13, 2014 | https://www.sciencedaily.com/releases/2014/10/141013190612.htm | Some sections of the San Andreas Fault system in San Francisco Bay Area are locked, overdue | Four urban sections of the San Andreas Fault system in Northern California have stored enough energy to produce major earthquakes, according to a new study that measures fault creep. Three fault sections -- Hayward, Rodgers Creek and Green Valley -- are nearing or past their average recurrence interval, according to the study published in the | The earthquake cycle reflects the accumulation of strain on a fault, its release as slip, and its re-accumulation and re-release. Fault creep is the slip and slow release of strain in the uppermost part of the Earth's crust that occurs on some faults between large earthquakes, when much greater stress is released in only seconds. Where no fault creep occurs, a fault is considered locked and stress will build until it is released by an earthquake. This study estimates how much creep occurs on each section of the San Andreas Fault system in Northern California. Enough creep on a fault can diminish the potential size of its next earthquake rupture. "The extent of fault creep, and therefore locking, controls the size and timing of large earthquakes on the Northern San Andreas Fault system," said James Lienkaemper, a co-author of the study and research geophysicist at U.S. Geological Survey (USGS). "The extent of creep on some fault sections is not yet well determined, making our first priority to study the urban sections of the San Andreas, which is directly beneath millions of Bay Area residents." Understanding the amount and extent of fault creep directly impacts seismic hazard assessments for the region. The San Andreas Fault system in Northern California consists of five major branches that combine for a total length of approximately 1,250 miles. Sixty percent of the fault system releases energy through fault creep, ranging from 0.1 to 25.1 mm (0.004 to 1 inch) per year, and about 28 percent remains locked at depth, according to the authors. Monitoring of creep on Bay Area faults has expanded in recent years. The alignment array measurements made by the San Francisco State University Creep Project and recently expanded GPS station networks provide the primary data on surface creep, which the authors used to estimate the average depth of creep for each fault segment. Where available, details of past ruptures of individual faults, unearthed in previous paleoseismic studies, allowed the authors to calculate recurrence rates and the probable timing and size of future earthquakes. According to the study, four faults have accumulated sufficient strain to produce a major earthquake. Three creeping faults have large locked areas (less than 1 mm, or 0.04 inches, of creep per year) that have not ruptured in a major earthquake of at least magnitude 6.7 since the reporting of earthquakes by local inhabitants: Rodgers Creek, northern Calaveras and southern Green Valley. The southern Hayward fault, which produced a magnitude 6.8 earthquake in 1868, is now approaching its mean recurrence time based on paleoseismic studies. The authors also estimate that three faults appear to be nearing or to have exceeded their mean recurrence time and have accumulated sufficient strain to produce large earthquakes: the Hayward (M 6.8), Rodgers Creek (M 7.1) and Green Valley (M 7.1). "The San Andreas Fault and its two other large branches, the Hayward and Northern Calaveras, have been quiet for decades.
This study offers a good reminder to prepare today for the next major earthquake," said Lienkaemper. | Earthquakes | 2,014 |
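The arithmetic behind "accumulated sufficient strain" is a slip deficit: the long-term slip rate minus the creep rate, integrated over the time since the last rupture. A minimal sketch with illustrative round numbers, not values taken from the study:

```python
# Minimal sketch: slip deficit on a partially creeping fault segment.
# deficit = (long-term slip rate - creep rate) * time since last rupture.
# The rates and elapsed time below are illustrative, not from the study.

def slip_deficit_m(slip_rate_mm_yr, creep_rate_mm_yr, years):
    return (slip_rate_mm_yr - creep_rate_mm_yr) * years / 1000.0

# A Hayward-like segment: ~9 mm/yr long-term rate, ~4 mm/yr surface creep,
# ~146 years elapsed since its 1868 rupture (hypothetical round numbers).
deficit = slip_deficit_m(9.0, 4.0, 146.0)
print(f"accumulated slip deficit ~ {deficit:.2f} m")
```

The faster creep releases strain, the smaller this deficit grows; a fully locked section banks its entire long-term rate toward the next rupture.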
October 2, 2014 | https://www.sciencedaily.com/releases/2014/10/141002123239.htm | Underwater landslide doubled size of 2011 Japanese tsunami | An ocean engineer at the University of Rhode Island has found that a massive underwater landslide, combined with the 9.0 earthquake, was responsible for triggering the deadly tsunami that struck Japan in March 2011. | Professor Stephan Grilli, an international leader in the study of tsunamis, said the generally accepted explanation for the cause of the tsunami had been the earthquake, the fifth largest ever measured, which created a significant uplift and subsidence of the seafloor. While that adequately explains the 10-meter surge that affected much of the impacted area, Grilli said it cannot account for the 40-meter waves that struck a 100-kilometer area of Japan's mountainous Sanriku Coast. "Computer models have not been able to explain the large inundation and run-up on the Sanriku Coast using the earthquake alone," Grilli said. "Our model could only get inundation up to 16 or 18 meters, not 40. So we knew there must be another cause." His findings were published this week. In a series of models, Grilli and his former doctoral student Jeff Harris worked backwards in time to recreate the movement of the seafloor from the earthquake and concluded that an additional movement underwater about 100 kilometers north of the earthquake's epicenter must have occurred to propagate the large waves that struck Sanriku. So the URI engineers and colleagues at the British Geological Survey and the University of Tokyo went looking for evidence that something else happened there. Reviewing surveys of the seafloor conducted by Japanese scientists before and after the earthquake, the scientists found signs of a large slump on the seafloor -- a rotational landslide 40 kilometers by 20 kilometers in extent and 2 kilometers thick that traveled down the slope of the Japan Trench, leaving a horizontal footprint the size of Paris that could only have been created by a 100-meter uplift in the seafloor. The earthquake only raised the seafloor 10 meters. "Underwater landslides tend to create shorter-period tsunami waves, and they tend to concentrate their energy in a small stretch of coastline," said Grilli. "The train of waves from the landslide, combined with the earthquake-generated waves, together created the 40-meter inundation along the Sanriku Coast." Grilli said it has been difficult to convince his Japanese colleagues of his research group's results. Most assumed that the massive size of the earthquake was enough to create the waves that were observed. "It raises questions about how we've been doing tsunami predictions in the past," he said. "We generally have just considered the largest possible earthquake, but we seldom consider underwater landslides as an additional source," even though large tsunamis in 1998 in Papua New Guinea and in 1946 in the Aleutian Islands were found to be generated by a combination of earthquakes and underwater landslides. Grilli also said that his analysis is under considerable scrutiny because it brings into question whether Japan had adequately prepared for natural disasters prior to the 2011 event. "There is a lot at stake in Japan," he said. "Tsunami scientists working for government agencies use tsunami return periods that are much too low in their calculations, leading them to underestimate the tsunami risk.
All of the safety procedures they have in place, including at nuclear power plants, are still based on underestimating the maximum earthquake likely to strike Japan, and they underestimate the maximum tsunami, too. Japan is working toward revising their approach to tsunami hazard assessment, but this will take time." | Earthquakes | 2,014 |
September 23, 2014 | https://www.sciencedaily.com/releases/2014/09/140923101432.htm | Drilling Into an Active Earthquake Fault in New Zealand | Three University of Michigan geologists are participating in an international effort to drill nearly a mile beneath the surface of New Zealand this fall to bring back rock samples from an active fault known to generate major earthquakes. | The goal of the Deep Fault Drilling Project is to better understand earthquake processes by sampling the Alpine Fault, which is expected to trigger a large event in the coming decades. "We're trying to understand why some faults are more earthquake-prone than others, and that requires fundamental knowledge about the processes at work," said Ben van der Pluijm, the Bruce R. Clark Collegiate Professor of Geology in the U-M Department of Earth and Environmental Sciences. Van der Pluijm and two of his EES colleagues -- doctoral student Austin Boles and research scientist Anja Schleicher -- are part of the team scheduled to start the two-month drilling project early next month. Schleicher will spend October at the site, and Boles will be there for about six weeks starting in early November. It will be only the second science project to drill deep into an active earthquake fault and return samples. Several years ago, scientists drilled a nearly 2-mile-deep hole into California's San Andreas Fault. Van der Pluijm was a member of that team, as well. "I hope we find something different this time, a different rock signature that contrasts with what we saw at the San Andreas," he said. The goal is to drill 0.8 miles (1.3 kilometers) into the 530-mile-long Alpine Fault, which marks the boundary between the Australian and Pacific tectonic plates, on New Zealand's South Island. Though most of the movement along the fault is lateral rather than vertical, the fault is responsible for lifting the Southern Alps, the rugged mountain range featured in the "Lord of the Rings" movies. Earthquakes occur on the Alpine Fault every 200 to 400 years at magnitudes of 7.5 to 8.0, with an average time between successive large earthquakes of about 330 years. Though earthquakes of that size that originate at shallow depths are capable of tremendous damage, the region is sparsely populated. The last Alpine Fault quake occurred in 1717, and the probability of another big one occurring there in the next 50 years has been calculated at about 28 percent. So the $2.5 million Deep Fault Drilling Project presents a rare opportunity to collect and analyze samples from a major fault before it breaks. The task for van der Pluijm and his colleagues is to analyze the possible role of clay minerals and friction melting in the fault zone. Radiometric dating, X-ray studies and isotopic-analysis techniques will be used to determine how much clay is in the rock samples and when those clays formed, as well as the likely source of the water that helped produce them. "The information we can extract from these clays is remarkably rich," said Boles, who will use data from the New Zealand study in his doctoral dissertation.
"These clay minerals are a key tool that we can use to better understand the physical and chemical processes happening in an active fault."Clay minerals can help reduce friction and heat generation along a fault, lubricating it so that pressure is released through steady, relatively small and nondestructive "creeping" motions rather than the periodic violent jolts known as earthquakes.Creeping motions were observed along the portion of the San Andreas Fault drilled by scientists several years ago. Temperatures in that fault were relatively low, and clay-rich rocks from the active zone were returned to the surface."We think that clays are a significant player in making faults less earthquake-prone," van der Pluijm said. "We know that the section of the Alpine Fault we'll be drilling has a history of producing large earthquakes. So finding little clay and, instead, evidence for frictional melting in the rock would better fit the large-earthquake scenario. That would be a fantastic breakthrough."In addition to sampling the fault during the two-month drilling program, researchers will install permanent pressure, temperature and seismic-monitoring sensors in the borehole.The U-M researchers are hoping to obtain a rock sample about the volume of a baseball from deep within the Alpine Fault. That would be plenty to complete their various studies, which are funded by the National Science Foundation and the International Continental Scientific Drilling Program."Getting the right samples is more important than the amount," van der Pluijm said. "Returning samples to the surface from depth is always a challenge, but I'm confident that it will work." | Earthquakes | 2,014 |
September 17, 2014 | https://www.sciencedaily.com/releases/2014/09/140917131814.htm | New explanation for origin of plate tectonics: What set Earth's plates in motion? | The mystery of what kick-started the motion of our earth's massive tectonic plates across its surface has been explained by researchers at the University of Sydney. | "Earth is the only planet in our solar system where the process of plate tectonics occurs," said Professor Patrice Rey, from the University of Sydney's School of Geosciences. "The geological record suggests that until three billion years ago the Earth's crust was immobile so what sparked this unique phenomenon has fascinated geoscientists for decades. We suggest it was triggered by the spreading of early continents then eventually became a self-sustaining process." Professor Rey is lead author of an article on the findings. The other authors on the paper are Nicolas Flament, also from the School of Geosciences, and Nicolas Coltice, from the University of Lyon. There are eight major tectonic plates that move above Earth's mantle at rates up to 150 millimetres every year. In simple terms the process involves plates being dragged into the mantle at certain points and moving away from each other at others, in what has been dubbed 'the conveyor belt'. Plate tectonics depends on the inverse relationship between density of rocks and temperature. At mid-oceanic ridges, rocks are hot and their density is low, making them buoyant or more able to float. As they move away from those ridges they cool down and their density increases until, where they become denser than the underlying hot mantle, they sink and are 'dragged' under. But three to four billion years ago, Earth's interior was hotter, volcanic activity was more prominent and tectonic plates did not become cold and dense enough to sink spontaneously. "So the driving engine for plate tectonics didn't exist," said Professor Rey. "Instead, thick and buoyant early continents erupted in the middle of immobile plates. Our modelling shows that these early continents could have placed major stress on the surrounding plates. Because they were buoyant they spread horizontally, forcing adjacent plates to be pushed under at their edges." "This spreading of the early continents could have produced intermittent episodes of plate tectonics until, as the Earth's interior cooled and its crust and plate mantle became heavier, plate tectonics became a self-sustaining process which has never ceased and has shaped the face of our modern planet." The new model also makes a number of predictions explaining features that have long puzzled the geoscience community. | Earthquakes | 2,014
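The "inverse relationship between density of rocks and temperature" that makes cooled plates sink can be written as ρ(T) = ρ0·(1 − α·(T − T0)). A minimal sketch with typical textbook values for mantle rock, not parameters from the Sydney study:

```python
# Minimal sketch: thermal density of mantle rock,
# rho(T) = rho0 * (1 - alpha * (T - T_ref)). Textbook values, illustrative only.

RHO0 = 3300.0    # kg/m^3, mantle density at the reference temperature
ALPHA = 3.0e-5   # 1/K, coefficient of thermal expansion
T_REF = 1350.0   # deg C, hot mantle reference temperature

def density(temp_c):
    return RHO0 * (1.0 - ALPHA * (temp_c - T_REF))

cold_plate, hot_mantle = density(650.0), density(1350.0)
print(f"cooled plate: {cold_plate:.0f} kg/m^3, hot mantle: {hot_mantle:.0f} kg/m^3")
print(f"excess density: {cold_plate - hot_mantle:.0f} kg/m^3 -> the plate sinks")
```

A few tens of kg/m^3 of excess density is all the 'slab pull' engine needs; on the early, hotter Earth the plates never cooled enough to reach it.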
September 15, 2014 | https://www.sciencedaily.com/releases/2014/09/140915202949.htm | Wastewater injection is culprit for most earthquakes in southern Colorado and northern New Mexico, study finds | The deep injection of wastewater underground is responsible for the dramatic rise in the number of earthquakes in Colorado and New Mexico since 2001, according to a study to be published in the | The Raton Basin, which stretches from southern Colorado into northern New Mexico, was seismically quiet until shortly after major fluid injection began in 1999. Since 2001, there have been 16 magnitude > 3.8 earthquakes (including M 5.0 and 5.3), compared to only one (M 4.0) in the previous 30 years. The increase in earthquakes is limited to the area of industrial activity and within 5 kilometers (3.1 miles) of wastewater injection wells. In 1994, energy companies began producing coal-bed methane in Colorado and expanded production to New Mexico in 1999. Along with the production of methane, there is the production of wastewater, which is injected underground in disposal wells and can raise the pore pressure in the surrounding area, inducing earthquakes. Several lines of evidence suggest the earthquakes in the area are directly related to the disposal of wastewater, a by-product of extracting methane, and not to hydraulic fracturing occurring in the area. Beginning in 2001, the production of methane expanded, with the number of high-volume wastewater disposal wells increasing (21 presently in Colorado and 7 in New Mexico) along with the injection rate. Since mid-2000, the total injection rate across the basin has ranged from 1.5 to 3.6 million barrels per month. The authors, all scientists with the U.S. Geological Survey, detail several lines of evidence directly linking the injection wells to the seismicity. The timing and location of seismicity correspond to the documented pattern of injected wastewater. Detailed investigations of two seismic sequences (2001 and 2011) place them in proximity to high-volume, high-injection-rate wells, and both sequences occurred after a nearby increase in the rate of injection. A comparison between seismicity and wastewater injection in Colorado and New Mexico reveals similar patterns, suggesting seismicity is initiated shortly after an increase in injection rates. | Earthquakes | 2,014
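A standard way to see why seismicity can appear within a few kilometers of injection wells is the diffusion length of a pore-pressure pulse, r ≈ sqrt(4·D·t). The hydraulic diffusivity below is an assumed order-of-magnitude value for fractured basement rock, not a number from the study.

```python
import math

# Minimal sketch: pore-pressure diffusion distance, r ~ sqrt(4 * D * t).
# D is an assumed order-of-magnitude hydraulic diffusivity, not from the study.

D = 0.05  # m^2/s, assumed hydraulic diffusivity

SECONDS_PER_YEAR = 365.25 * 24 * 3600.0

def diffusion_km(years, diffusivity=D):
    return math.sqrt(4.0 * diffusivity * years * SECONDS_PER_YEAR) / 1000.0

for yrs in (1, 5, 13):  # 13 yr ~ 2001-2014, the period of elevated seismicity
    print(f"{yrs:2d} yr of injection -> pressure front ~ {diffusion_km(yrs):.0f} km out")
```

With this assumed diffusivity the pressure front reaches a few kilometers within years, comparable to the 5-kilometer footprint the study reports around the wells.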
September 15, 2014 | https://www.sciencedaily.com/releases/2014/09/140915202947.htm | Mega-quake possible for subduction zones along 'Ring of Fire,' new study suggests | The magnitude of the 2011 Tohoku quake (M 9.0) caught many seismologists by surprise, prompting some to revisit the question of calculating the maximum magnitude earthquake possible for a particular fault. New research offers an alternate view that uses the concept of probable maximum magnitude events over a given period, providing the magnitude and the recurrence rate of extreme events in subduction zones for that period. Most circum-Pacific subduction zones can produce earthquakes of magnitude greater than 9.0, suggests the study. | The idea of identifying the maximum magnitude for a fault isn't new, and its definition varies based on context. This study, published online by the Bulletin of the Seismological Society of America (BSSA), calculates the "probable maximum earthquake magnitude within a time period of interest," estimating the probable magnitude of subduction zone earthquakes for various time periods, including 250, 500 and 10,000 years. "Various professionals use the same terminology -- maximum magnitude -- to mean different things. The most interesting question for us was what was going to be the biggest magnitude earthquake over a given period of time?" said co-author Yufang Rong, a seismologist at the Center for Property Risk Solutions of FM Global, a commercial and industrial property insurer. "Can we know the exact, absolute maximum magnitude? The answer is no, however, we developed a simple methodology to estimate the probable largest magnitude within a specific time frame." The study's results indicated most of the subduction zones can generate M 8.5 or greater over a 250-year return period; M 8.8 or greater over 500 years; and M 9.0 or greater over 10,000 years. "Just because a subduction zone hasn't produced a magnitude 8.8 in 499 years, that doesn't mean one will happen next year," said Rong. "We are talking about probabilities." The instrumental and historical earthquake record is brief, complicating any attempt to confirm recurrence rates and estimate with confidence the maximum magnitude of an earthquake in a given period. The authors validated their methodology by comparing their findings to the seismic history of the Cascadia subduction zone, revealed through deposits of marine sediment along the Pacific Northwest coast. While some subduction zones have experienced large events during recent history, the Cascadia subduction zone has remained quiet. Turbidite and onshore paleoseismic studies have documented a rich seismic history, identifying 40 large events over the past 10,000 years. "Magnitude limits of subduction zone earthquakes" is co-authored by Rong, David Jackson of UCLA, Harold Magistrale of FM Global, and Chris Goldfinger of Oregon State University. The paper will be published online Sept. 16 by BSSA as well as in its October print edition. | Earthquakes | 2,014
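The idea of a magnitude tied to a return period can be illustrated with a Gutenberg-Richter recurrence law, log10 N(≥M) = a − b·M events per year: the magnitude reached on average once per T years satisfies M = (a + log10 T)/b. The a and b values below are hypothetical, chosen only to illustrate the calculation; they are not the study's parameters.

```python
import math

# Minimal sketch: return-period magnitude under Gutenberg-Richter,
# N(>=M) = 10**(a - b*M) per year; setting N = 1/T gives M = (a + log10(T)) / b.
# a and b are hypothetical values chosen only to illustrate the idea.

def return_period_magnitude(t_years, a=6.1, b=1.0):
    return (a + math.log10(t_years)) / b

for t in (250, 500, 10_000):
    print(f"T = {t:6d} yr -> M ~ {return_period_magnitude(t):.1f}")
```

Note how the pure extrapolation keeps growing without bound at long return periods; real analyses like this study's taper the distribution near a physical maximum, which is one reason the method is more involved than this sketch.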
September 10, 2014 | https://www.sciencedaily.com/releases/2014/09/140910152518.htm | Major earthquake may occur off coast of Istanbul, seismic shifts suggest | When a segment of a major fault line goes quiet, it can mean one of two things: The "seismic gap" may simply be inactive -- the result of two tectonic plates placidly gliding past each other -- or the segment may be a source of potential earthquakes, quietly building tension over decades until an inevitable seismic release. | Researchers from MIT and Turkey have found evidence for both types of behavior on different segments of the North Anatolian Fault -- one of the most energetic earthquake zones in the world. The fault, similar in scale to California's San Andreas Fault, stretches for about 745 miles across northern Turkey and into the Aegean Sea. The researchers analyzed 20 years of GPS data along the fault, and determined that the next large earthquake to strike the region will likely occur along a seismic gap beneath the Sea of Marmara, some five miles west of Istanbul. In contrast, the western segment of the seismic gap appears to be moving without producing large earthquakes. "Istanbul is a large city, and many of the buildings are very old and not built to the highest modern standards compared to, say, southern California," says Michael Floyd, a research scientist in MIT's Department of Earth, Atmospheric and Planetary Sciences. "From an earthquake scientist's perspective, this is a hotspot for potential seismic hazards." Although it's impossible to pinpoint when such a quake might occur, Floyd says this one could be powerful -- on the order of a magnitude 7 temblor, or stronger. "When people talk about when the next quake will be, what they're really asking is, 'When will it be, to within a few hours, so that I can evacuate?' But earthquakes can't be predicted that way," Floyd says. "Ultimately, for people's safety, we encourage them to be prepared. To be prepared, they need to know what to prepare for -- that's where our work can contribute." Floyd and his colleagues, including Semih Ergintav of the Kandilli Observatory and Earthquake Research Institute in Istanbul and MIT research scientist Robert Reilinger, have published their seismic analysis. In recent decades, major earthquakes have occurred along the North Anatolian Fault in a roughly domino-like fashion, breaking sequentially from east to west. The most recent quake occurred in 1999 in the city of Izmit, just east of Istanbul. The initial shock, which lasted less than a minute, killed thousands. As Istanbul sits at the fault's western end, many scientists have thought the city will be near the epicenter of the next major quake. To get an idea of exactly where the fault may fracture next, the MIT and Turkish researchers used GPS data to measure the region's ground movement over the last 20 years.
The group took data along the fault from about 100 GPS locations, including stations where data are collected continuously and sites where instruments are episodically set up over small markers on the ground, the positions of which can be recorded over time as the Earth slowly shifts. "By continuously tracking, we can tell which parts of the Earth's crust are moving relative to other parts, and we can see that this fault has relative motion across it at about the rate at which your fingernail grows," Floyd says. From their ground data, the researchers estimate that, for the most part, the North Anatolian Fault must move at about 25 millimeters -- or one inch -- per year, sliding quietly or slipping in a series of earthquakes. As there's currently no way to track the Earth's movement offshore, the group also used fault models to estimate the motion off the Turkish coast. The team identified a segment of the fault under the Sea of Marmara, west of Istanbul, that is essentially stuck, with the "missing" slip accumulating at 10 to 15 millimeters per year. This section -- called the Princes' Island segment, for a nearby tourist destination -- last experienced an earthquake 250 years ago. Floyd and colleagues calculate that the Princes' Island segment should have slipped about 8 to 11 feet -- but it hasn't. Instead, strain has likely been building along the segment for the last 250 years. If this tension were to break the fault in one cataclysmic earthquake, the Earth could shift by as much as 11 feet within seconds. Although such accumulated strain may be released in a series of smaller, less hazardous rumbles, Floyd says that given the historical pattern of major quakes along the North Anatolian Fault, it would be reasonable to expect a large earthquake off the coast of Istanbul within the next few decades. "Earthquakes are not regular or predictable," Floyd says. "They're far more random over the long run, and you can go many lifetimes without experiencing one. But it only takes one to affect many lives. In a location like Istanbul that is known to be subject to large earthquakes, it comes back to the message: Always be prepared." | Earthquakes | 2,014
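The "8 to 11 feet" of missing slip is straightforward arithmetic on the numbers in the article: a 10-15 mm/yr slip deficit accumulating over the roughly 250 years since the segment's last earthquake.

```python
# Checking the article's arithmetic: slip deficit = deficit rate * elapsed time.

MM_PER_FOOT = 304.8
YEARS_SINCE_RUPTURE = 250

for rate_mm_yr in (10, 15):
    deficit_mm = rate_mm_yr * YEARS_SINCE_RUPTURE
    print(f"{rate_mm_yr} mm/yr * {YEARS_SINCE_RUPTURE} yr = "
          f"{deficit_mm / 1000:.1f} m (~{deficit_mm / MM_PER_FOOT:.0f} ft)")
```

That gives roughly 8 to 12 feet, bracketing the quoted 8 to 11 feet; the exact upper figure depends on how the rate and elapsed time are rounded.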
September 10, 2014 | https://www.sciencedaily.com/releases/2014/09/140910093227.htm | New study reconstructs mega-earthquakes timeline in Indian Ocean | A new study on the frequency of past giant earthquakes in the Indian Ocean region shows that Sri Lanka, and much of the Indian Ocean, is affected by large tsunamis at highly variable intervals, from a few hundred to more than one thousand years. The findings suggest that the accumulation of stress in the region could generate tsunamis as large as, or even larger than, the one that resulted from the 2004 magnitude-9.2 Sumatra earthquake. | Researchers from the University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science and the University of Peradeniya in Sri Lanka collected and analyzed 22 sediment cores from Karagan Lagoon, Hambantota, in southeastern Sri Lanka, to expand the historical record of giant earthquakes along the Sumatra-Andaman subduction zone, where the Indo-Australian plate and Eurasian plate meet. Using sand deposited in the lagoon during the 2004 Indian Ocean tsunami and seven older paleo-tsunami deposits as proxies for large earthquakes in the region, the scientists reconstructed the timeline for mega-earthquakes along the Indian Ocean's plate boundary from Myanmar to Indonesia, assuming that the tsunamis were all generated by large earthquakes. "In Sri Lanka, coastal lagoons were inundated by this tsunami and others that occurred over thousands of years," said Gregor Eberli, professor of Marine Geosciences and director of UM's CSL -- Center for Carbonate Research. "These lagoons are ideal repositories for tsunami sand layers because after deposition, the tsunami sands were sealed with mud." The Dec. 26, 2004 M-9.2 Sumatra earthquake resulted in a trans-oceanic tsunami, with wave heights up to 100 feet (30 meters) in some places, which impacted much of the Indian Ocean region, causing widespread damage in southeastern Sri Lanka. Within the 7,000-year record of Indian Ocean tsunamis preserved in the sediment, the research team found evidence that put the time period between consecutive tsunamis at anywhere from 181 (up to 517) years to 1,045 (± 334) years. The longest period was nearly twice the time period prior to the 2004 earthquake. "These results are very important to better understand the tsunami hazard in Sri Lanka," said Kelly Jackson, UM Rosenstiel School Ph.D. candidate and lead author of the study. "A scary result is a 1,000-year time period without a tsunami, which is nearly twice as long as the lull period prior to the 2004 earthquake," said Falk Amelung, professor of geophysics within the department of Marine Geosciences at the UM Rosenstiel School. "This means that the subduction zone is capable of generating earthquakes almost twice as big as in 2004, although we don't have any evidence yet that this actually happened." "The 2004 tsunami caught us completely by surprise, although we should have known better because there is a Sri Lankan legend in which the sea came ashore in 200 B.C.," says Chandra Jayasena, a geologist at the University of Peradeniya. "We now need to study other lagoons to further expand the historical record of large tsunami-generating earthquakes in the region and get a better understanding of the earthquake frequency in this highly populated region." The region's subduction zone exhibits great variability in rupture modes, putting it on the list with the Cascadia Subduction Zone, which stretches from Vancouver Island to northern California, and Chile, according to the authors.
| Earthquakes | 2,014 |
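The recurrence intervals quoted above come from differencing the ages of successive dated deposits. A minimal sketch of that bookkeeping; the deposit ages below are invented for illustration and are not the dated Karagan Lagoon layers.

```python
# Minimal sketch: inter-event intervals from a dated sequence of tsunami
# deposits. Ages (years before present) are hypothetical placeholders.

deposit_ages_bp = [0, 600, 1100, 2150, 3000, 4000, 5200, 6250]

intervals = [b - a for a, b in zip(deposit_ages_bp, deposit_ages_bp[1:])]
print("intervals (yr):", intervals)
print(f"mean {sum(intervals) / len(intervals):.0f} yr, "
      f"range {min(intervals)}-{max(intervals)} yr")
```

In practice each radiocarbon age carries an uncertainty of decades to centuries, which is why the study reports intervals with wide ranges rather than single numbers.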
September 8, 2014 | https://www.sciencedaily.com/releases/2014/09/140908152924.htm | Textbook theory behind volcanoes may be wrong | In the typical textbook picture, volcanoes, such as those that are forming the Hawaiian islands, erupt when magma gushes out as narrow jets from deep inside Earth. But that picture is wrong, according to a new study from researchers at Caltech and the University of Miami in Florida. | New seismology data are now confirming that such narrow jets don't actually exist, says Don Anderson, the Eleanor and John R. McMillian Professor of Geophysics, Emeritus, at Caltech. In fact, he adds, basic physics doesn't support the presence of these jets, called mantle plumes, and the new results corroborate those fundamental ideas. "Mantle plumes have never had a sound physical or logical basis," Anderson says. "They are akin to Rudyard Kipling's 'Just So Stories' about how giraffes got their long necks." Anderson and James Natland, a professor emeritus of marine geology and geophysics at the University of Miami, describe their analysis online in the September 8 issue. According to current mantle-plume theory, Anderson explains, heat from Earth's core somehow generates narrow jets of hot magma that gush through the mantle and to the surface. The jets act as pipes that transfer heat from the core, and how exactly they're created isn't clear, he says. But they have been assumed to exist, originating near where Earth's core meets the mantle, almost 3,000 kilometers underground -- nearly halfway to the planet's center. The jets are theorized to be no more than about 300 kilometers wide, and when they reach the surface, they produce hot spots. While the top of the mantle is a sort of fluid sludge, the uppermost layer is rigid rock, broken up into plates that float on the magma-bearing layers. Magma from the mantle beneath the plates bursts through the plate to create volcanoes. As the plates drift across the hot spots, a chain of volcanoes forms -- such as the island chains of Hawaii and Samoa. "Much of solid-Earth science for the past 20 years -- and large amounts of money -- have been spent looking for elusive narrow mantle plumes that wind their way upward through the mantle," Anderson says. To look for the hypothetical plumes, researchers analyze global seismic activity. Everything from big quakes to tiny tremors sends seismic waves echoing through Earth's interior. The type of material that the waves pass through influences the properties of those waves, such as their speeds. By measuring those waves using hundreds of seismic stations installed on the surface, near places such as Hawaii, Iceland, and Yellowstone National Park, researchers can deduce whether there are narrow mantle plumes or whether volcanoes are simply created from magma that's absorbed in the sponge-like shallower mantle. No one has been able to detect the predicted narrow plumes, although the evidence has not been conclusive. The jets could have simply been too thin to be seen, Anderson says. Very broad features beneath the surface have been interpreted as plumes or super-plumes, but, still, they're far too wide to be considered narrow jets. But now, thanks in part to more seismic stations spaced closer together and improved theory, analysis of the planet's seismology is good enough to confirm that there are no narrow mantle plumes, Anderson and Natland say.
Instead, data reveal that there are large, slow, upward-moving chunks of mantle a thousand kilometers wide. In the mantle-plume theory, Anderson explains, the heat that is transferred upward via jets is balanced by the slower downward motion of cooled, broad, uniform chunks of mantle. The behavior is similar to that of a lava lamp, in which blobs of wax are heated from below and then rise before cooling and falling. But a fundamental problem with this picture is that lava lamps require electricity, he says, and that is an outside energy source that an isolated planet like Earth does not have. The new measurements suggest that what is really happening is just the opposite: Instead of narrow jets, there are broad upwellings, which are balanced by narrow channels of sinking material called slabs. What is driving this motion is not heat from the core, but cooling at Earth's surface. In fact, Anderson says, the behavior is the regular mantle convection first proposed more than a century ago by Lord Kelvin. When material in the planet's crust cools, it sinks, displacing material deeper in the mantle and forcing it upward. "What's new is incredibly simple: upwellings in the mantle are thousands of kilometers across," Anderson says. The formation of volcanoes then follows from plate tectonics -- the theory of how Earth's plates move and behave. Magma, which is less dense than the surrounding mantle, rises until it reaches the bottom of the plates or fissures that run through them. Stresses in the plates, cracks, and other tectonic forces can squeeze the magma out, much as water is squeezed out of a sponge. That magma then erupts out of the surface as volcanoes. The magma comes from within the upper 200 kilometers of the mantle and not thousands of kilometers deep, as the mantle-plume theory suggests. "This is a simple demonstration that volcanoes are the result of normal broad-scale convection and plate tectonics," Anderson says. He calls this theory "top-down tectonics," based on Kelvin's initial principles of mantle convection. In this picture, the engine behind Earth's interior processes is not heat from the core but cooling at the planet's surface. This cooling and plate tectonics drives mantle convection, the cooling of the core, and Earth's magnetic field. Volcanoes and cracks in the plate are simply side effects. The results also have an important consequence for rock compositions -- notably the ratios of certain isotopes, Natland says. According to the mantle-plume idea, the measured compositions derive from the mixing of material from reservoirs separated by thousands of kilometers in the upper and lower mantle. But if there are no mantle plumes, then all of that mixing must have happened within the upwellings and nearby mantle in Earth's top 1,000 kilometers. The paper is titled "Mantle updrafts and mechanisms of oceanic volcanism."
September 3, 2014 | https://www.sciencedaily.com/releases/2014/09/140903162437.htm | New, inexpensive method for understanding earthquake topography | Using high-resolution topography models not available in the past, geologists can greatly enrich their research. However, current methods of acquisition are costly and require trained personnel with high-tech, cumbersome equipment. In light of this, Kendra Johnson and colleagues have developed a new system that takes advantage of affordable, user-friendly equipment and software to produce topography data over small, sparsely vegetated sites at comparable (or better) resolution and accuracy to standard methods. | Their workflow is based on structure from motion (SfM), which uses overlapping photographs of a scene to produce a 3-D model that represents the shape and scale of the terrain. To acquire the photos, Johnson and colleagues attached a camera programmed to take time-lapse photos to a helium balloon or small, remote-controlled glider. They augmented the aerial data by recording a few GPS points of ground features that would be easily recognized in the photographs. Using a software program called Agisoft Photoscan, they combined the photographs and GPS data to produce a robust topographic model. Johnson and colleagues note that this SfM workflow can be used for many geologic applications. In this study, they targeted two sites in southern California, each of which has existing topography data collected using well-established, laser-scanning methods. The first site covers a short segment of the southern San Andreas fault that historically has not had a large earthquake; however, the ground surface reveals evidence of prehistoric ruptures that help estimate the size and frequency of earthquakes on this part of the fault. The team notes that this evidence is more easily quantified using high-resolution topography data than by geologists working in the field. The second site covers part of the surface rupture formed during the 1992 Landers earthquake (near Palm Springs, California, USA). Johnson and colleagues chose this site to test the capability of their workflow as part of the scientific response that immediately follows an earthquake. At each site, they compared their SfM data to the existing laser scanner data and found that the values closely matched. Johnson and colleagues conclude that their new SfM workflow produces topography data at sufficient quality for use in earthquake research. | Earthquakes | 2,014
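Whether balloon or glider photos can support topography at laser-scanner quality comes down to the ground sample distance of the imagery, GSD = pixel pitch × flying height / focal length. The camera parameters below are assumed consumer-camera values, not the equipment used in the study.

```python
# Minimal sketch: ground sample distance (GSD) of low-altitude aerial photos,
# GSD = pixel_pitch * height / focal_length. Camera values are assumed.

PIXEL_PITCH_M = 4.0e-6    # m, ~4 micron sensor pixels
FOCAL_LENGTH_M = 0.02     # m, 20 mm lens

def gsd_cm(height_m):
    return PIXEL_PITCH_M * height_m / FOCAL_LENGTH_M * 100.0

for height in (50, 100, 200):  # plausible tethered-balloon/glider heights, m
    print(f"height {height:3d} m -> ~{gsd_cm(height):.0f} cm per pixel")
```

Centimeter-scale pixels from tens of meters up are what let the reconstructed surface rival lidar over small, sparsely vegetated sites.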
September 2, 2014 | https://www.sciencedaily.com/releases/2014/09/140902114224.htm | Can a stack of computer servers survive an earthquake? | How do you prevent an earthquake from destroying expensive computer systems? | That's the question earthquake engineer Claudia Marin-Artieda, PhD, associate professor of civil engineering at Howard University, aims to answer through a series of experiments conducted at the University at Buffalo. "The loss of functionality of essential equipment and components can have a disastrous impact. We can limit these sorts of equipment losses by improving their seismic performance," Marin-Artieda said. In buildings such as data centers, power plants and hospitals, it could be catastrophic to have highly sensitive equipment swinging, rocking, falling and generally bashing into things. In high-seismic regions, new facilities often are engineered with passive protective systems that provide overall seismic protection. But often, existing facilities are conventional fixed-base buildings in which seismic demands on sensitive equipment located within are significantly amplified. In such buildings, sensitive equipment needs to be secured from these damaging earthquake effects, Marin-Artieda said. The stiffer the building, the greater the magnification of seismic effects, she added. "It is like when you are riding a rollercoaster," she said. "If your body is relaxed, you don't feel strong inertial effects. But if you hold your body rigid, you'll feel the inertial effects much more, and you'll get knocked about in the car." The experiments were conducted this month at the University at Buffalo's Network for Earthquake Engineering Simulation (NEES), a shared network of laboratories based at Purdue University. Marin-Artieda and her team used different devices for supporting 40 computer servers donated by Yahoo Labs. The researchers attached the servers to a frame in multiple configurations on seismically isolated platforms. They then subjected the frame to a variety of three-directional ground motions with the servers in partial operation to monitor how they react to an earthquake simulation. Preliminary work confirmed, among other things, that globally and locally installed seismic isolation and damping systems can significantly reduce damage to computer systems and other electronic equipment. Base isolation is a technique that sets objects atop an energy-absorbing base; damping employs energy-absorbing devices within the object to be protected from an earthquake's damaging effects. Marin-Artieda plans to expand the research by developing a framework for analysis, design and implementation of the protective measures. The research is funded by the National Science Foundation. In addition to Yahoo Labs, industry partners include Seismic Foundation Control Inc., The VMC Group, Minus K Technology Inc., Base Isolation of Alaska, and Roush Industries Inc. All provided in-kind materials for the experiments. Video showing one of the tests, which mimics 80 percent of the force of 1994's Northridge earthquake: | Earthquakes | 2,014
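The physics behind the isolation result can be illustrated with the textbook transmissibility of a single-degree-of-freedom oscillator under harmonic base shaking. This is a simplified sketch, not the team's analysis, and the frequency ratios and damping values are assumed for illustration:

    import math

    def transmissibility(r: float, zeta: float) -> float:
        """Transmitted/input acceleration ratio for a single-degree-of-freedom
        system; r = shaking frequency / natural frequency, zeta = damping ratio."""
        num = 1.0 + (2.0 * zeta * r) ** 2
        den = (1.0 - r ** 2) ** 2 + (2.0 * zeta * r) ** 2
        return math.sqrt(num / den)

    # Rigidly anchored rack resonating with the shaking (r ~ 1): amplification.
    print(transmissibility(1.0, 0.05))   # ~10, i.e. tenfold amplification
    # Base-isolated platform with a low natural frequency (r ~ 4): attenuation.
    print(transmissibility(4.0, 0.15))   # ~0.1, i.e. ~90 percent reduction

The isolated case echoes the rollercoaster analogy: lowering the system's natural frequency and adding damping keeps the equipment "relaxed" relative to the ground motion.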
September 2, 2014 | https://www.sciencedaily.com/releases/2014/09/140902092959.htm | Seismic hazards reassessed in the Andes | Although being able to predict the date on which the next big earthquake will occur is still some way off becoming a reality, it is now possible to identify the areas where they will occur. IRD researchers and their French, Ecuadorian and Peruvian partners have just measured the current deformation in the northern part of the Andes for the first time using GPS, where the tectonics of the Pacific and South American plates govern the high seismic activity in the region. The scientists then identified the areas where the fault, located at the interface of these two plates, is capable of generating large earthquakes or not. | This work, now published, helps reassess seismic hazard in the region. The Andes have had three of the largest earthquakes ever recorded: on the border between Colombia and Ecuador in 1906, as well as in Chile, in 1960 and again in 2010. When will one of these major earthquakes happen there again? It is impossible to say... But scientists can now identify the areas where it will occur. Researchers from the Géoazur, ISTerre and ISTEP laboratories and their partners from geophysical and geographical institutes in Ecuador and Peru have just measured the deformation in the northern Andes caused by the subduction of the Pacific oceanic plate under the South American continental plate. Using a vast GPS network which has been deployed since 2008 and observational data collected since the 1990s, they have quantified the movements of 100 measurement points from central Peru to southern Colombia, with an accuracy of about one millimetre per year. The researchers were able to locate the areas at risk. Only two fault segments can produce mega-earthquakes (greater than 8.5 on the Richter scale), potentially accompanied by tsunamis: the first is located in central Peru and the second is further north, extending from northern Ecuador to southern Colombia. In between these two active segments, the research team identified a third subduction segment. Somewhat surprisingly, this is characterised by sliding that is mainly "aseismic." So in this area spanning more than 1,000 km from the north of Peru to the south of Ecuador, or 20% of the length of the Andean subduction, the accumulated energy seems insufficient to produce a mega-earthquake. Across the region, earthquakes remain more superficial and more modest in magnitude, as shown in recent history. These studies have also enabled the researchers to discover a large continental block, wedged between the Pacific and South American plates. This piece of continent was called the "sliver Inca" by the authors of the study and is more than 1,500 km long and 300 to 400 km wide. It is separated from the continental plate and moves 5 to 6 mm per year towards the south-east in relation to it. This finding suggests that the current deformation of the Andes from Venezuela to southern Chile and the seismic activity in the region are dominated by the movements of several microplates of that type. The discovery of the "sliver Inca" also explains the location of major tectonic structures. For example, the Bolivian highlands, the second highest plateau in the world, was created by the "sliver Inca" and the central Andes microplate coming together.
In contrast, the opening of the Gulf of Guayaquil in Ecuador is a result of the divergence of the Inca block and the northern Andes microplate. These studies allow a better understanding of recent developments in the Andes and their continental margins. They therefore make better estimates of seismic hazards in the region possible. | Earthquakes | 2,014
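The underlying measurement is conceptually simple: fit a straight line to years of daily GPS positions and read off the slope as a velocity. A toy Python version with synthetic data; the rate, noise level, and time span are assumptions chosen to mimic the numbers in the article:

    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(0.0, 15.0, 1.0 / 365.0)        # ~15 years of daily solutions
    true_rate = 5.5                               # mm/yr, like the "sliver Inca"
    pos = true_rate * t + rng.normal(0.0, 3.0, t.size)  # 3 mm daily scatter

    rate, intercept = np.polyfit(t, pos, 1)       # least-squares velocity
    print(f"estimated rate: {rate:.2f} mm/yr")    # converges near 5.5 mm/yr

With enough daily solutions, the slope uncertainty falls well below a millimetre per year, which is how networks of this kind reach the accuracy quoted above.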
September 1, 2014 | https://www.sciencedaily.com/releases/2014/09/140901211409.htm | Likely near-simultaneous earthquakes complicate seismic hazard planning for Italy | Before the shaking from one earthquake ends, shaking from another might begin, amplifying the effect of ground motion. Such sequences of closely timed, nearly overlapping, consecutive earthquakes account for devastating seismic events in Italy's history and should be taken into account when building new structures, according to research published this September. | "It's very important to consider this scenario of earthquakes, occurring possibly seconds apart, one immediately after another," said co-author Anna Tramelli, a seismologist with the Istituto Nazionale di Geofisica e Vulcanologia in Naples, Italy. "Two consecutive mainshocks of magnitude 5.8 could have the effect of a magnitude 6 earthquake in terms of energy release. But the effect on a structure could be even larger than what's anticipated from a magnitude 6 earthquake due to the longer duration of shaking that would negatively impact the resilience of a structure." Historically, multiple triggered mainshocks, with time delays of seconds to days, have caused deadly earthquakes along the Italian Apennine belt, a series of central mountain ranges extending the length of Italy. The 1997-98 Umbria-Marche seismic sequence numbered six mainshocks of moderate magnitude, ranging M 5.2 -- 6.0. The 1980 Irpinia earthquakes included a sequence of three events, occurring at intervals within 20 seconds of each other. The 2012 Emilia sequence started with an M 5.9 event, with the second largest mainshock (M 5.8) occurring nine days later, and included more than 2000 aftershocks. In this study, Tramelli and her colleagues used the recorded waveforms from the 2012 Emilia seismic sequence to simulate a seismic sequence that triggered end-to-end earthquakes along adjacent fault patches, observing the effect of continuous ruptures on the resulting ground motion and, consequently, its impact on critical structures, such as dams, power plants, hospitals and bridges. "We demonstrated that consecutively triggered earthquakes can enhance the amount of energy produced by the ruptures, exceeding the design specifications expected for buildings in moderate seismic hazard zones," said Tramelli, whose analysis suggests that the shaking from multiple magnitude 5.0 earthquakes would be significantly greater than from an individual magnitude 5.0 event. And back-to-back earthquakes are more than theoretical, say the authors, who note that this worst-case scenario has happened at least once in Italy's recent history. Previous studies identified three sub-events at intervals of 20 seconds in the seismic signals recorded during the 1980 Irpinia earthquake sequence, whose combined ground motion caused more than 3000 deaths and significant damage to structures. A "broader and modern approach" to seismic risk mitigation in Italy, suggest the authors, would incorporate the scenario of multiple triggered quakes, along with the present understanding of active fault locations, mechanisms and interaction. | Earthquakes | 2,014
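Tramelli's magnitude arithmetic follows from the standard Gutenberg-Richter energy-magnitude relation, in which radiated energy grows as 10^(1.5M); doubling the energy therefore adds only about 0.2 magnitude units:

    \[
    E \propto 10^{1.5M}, \qquad
    M_{\mathrm{eq}} = 5.8 + \tfrac{2}{3}\log_{10} 2 \approx 6.0 ,
    \]

which is why two back-to-back magnitude-5.8 mainshocks release about as much energy as a single magnitude 6 -- while shaking the structure for roughly twice as long.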
August 27, 2014 | https://www.sciencedaily.com/releases/2014/08/140827111948.htm | Pacific plate shrinking as it cools | The tectonic plate that dominates the Pacific "Ring of Fire" is not as rigid as many scientists assume, according to researchers at Rice University and the University of Nevada. | Rice geophysicist Richard Gordon and his colleague, Corné Kreemer, an associate professor at the University of Nevada, Reno, have determined that cooling of the lithosphere -- the outermost layer of Earth -- makes some sections of the Pacific plate contract horizontally at faster rates than others and cause the plate to deform. Gordon said the effect detailed this month in Geology is most pronounced in the youngest parts of the lithosphere -- about 2 million years old or less -- that make up some of the Pacific Ocean's floor. They predict the rate of contraction to be 10 times faster than in older parts of the plate that were created about 20 million years ago and 80 times faster than in very old parts of the plate that were created about 160 million years ago. The tectonic plates that cover Earth's surface, including both land and seafloor, are in constant motion; they imperceptibly surf the viscous mantle below. Over time, the plates scrape against and collide into each other, forming mountains, trenches and other geological features. On the local scale, these movements cover only inches per year and are hard to see. The same goes for deformations of the type described in the new paper, but when summed over an area the size of the Pacific plate, they become statistically significant, Gordon said. The new calculations showed the Pacific plate is pulling away from the North American plate a little more -- approximately 2 millimeters a year -- than the rigid-plate theory would account for, he said. Overall, the plate is moving northwest about 50 millimeters a year. "The central assumption in plate tectonics is that the plates are rigid, but the studies that my colleagues and I have been doing for the past few decades show that this central assumption is merely an approximation -- that is, the plates are not rigid," Gordon said. "Our latest contribution is to specify or predict the nature and rate of deformation over the entire Pacific plate." The researchers already suspected cooling had a role from their observation that the 25 large and small plates that make up Earth's shell do not fit together as well as the "rigid model" assumption would have it. They also knew that lithosphere as young as 2 million years was more malleable than hardened lithosphere as old as 170 million years. "We first showed five years ago that the rate of horizontal contraction is inversely proportional to the age of the seafloor," he said. "So it's in the youngest lithosphere (toward the east side of the Pacific plate) where you get the biggest effects." The researchers saw hints of deformation in a metric called plate circuit closure, which describes the relative motions where at least three plates meet. If the plates were rigid, their angular velocities at the triple junction would have a sum of zero. But where the Pacific, Nazca and Cocos plates meet west of the Galápagos Islands, the nonclosure velocity is 14 millimeters a year, enough to suggest that all three plates are deforming. "When we did our first global model in 1990, we said to ourselves that maybe when we get new data, this issue will go away," Gordon said.
"But when we updated our model a few years ago, all the places that didn't have plate circuit closure 20 years ago still didn't have it."There had to be a reason, and it began to become clear when Gordon and his colleagues looked beneath the seafloor. "It's long been understood that the ocean floor increases in depth with age due to cooling and thermal contraction. But if something cools, it doesn't just cool in one direction. It's going to be at least approximately isotropic. It should shrink the same in all directions, not just vertically," he said.A previous study by Gordon and former Rice graduate student Ravi Kumar calculated the effect of thermal contraction on vertical columns of oceanic lithosphere and determined its impact on the horizontal plane, but viewing the plate as a whole demanded a different approach. "We thought about the vertically integrated properties of the lithosphere, but once we did that, we realized Earth's surface is still a two-dimensional problem," he said.For the new study, Gordon and Kreemer started by determining how much the contractions would, on average, strain the horizontal surface. They divided the Pacific plate into a grid and calculated the strain on each of the nearly 198,000 squares based on their age, as determined by the seafloor age model published by the National Geophysical Data Center."That we could calculate on a laptop," Gordon said. "If we tried to do it in three dimensions, it would take a high-powered computer cluster."The surface calculations were enough to show likely strain fields across the Pacific plate that, when summed, accounted for the deformation. As further proof, the distribution of recent earthquakes in the Pacific plate, which also relieve the strain, showed a greater number occurring in the plate's younger lithosphere. "In the Earth, those strains are either accommodated by elastic deformation or by little earthquakes that adjust it," he said."The central assumption of plate tectonics assumes the plates are rigid, and this is what we make predictions from," said Gordon, who was recently honored by the American Geophysical Union for writing two papers about plate movements that are among the top 40 papers ever to appear in one of the organization's top journals. "Up until now, it's worked really well.""The big picture is that we now have, subject to experimental and observational tests, the first realistic, quantitative estimate of how the biggest oceanic plate departs from that rigid-plate assumption."The National Science Foundation supported the research. Gordon is the Keck Professor of Geophysics and chairman of the Earth Science Department at Rice. | Earthquakes | 2,014 |
August 21, 2014 | https://www.sciencedaily.com/releases/2014/08/140821141542.htm | Severe drought is causing the western US to rise like a spring uncoiling | The severe drought gripping the western United States in recent years is changing the landscape well beyond localized effects of water restrictions and browning lawns. Scientists at Scripps Institution of Oceanography at UC San Diego have now discovered that the growing, broad-scale loss of water is causing the entire western U.S. to rise up like an uncoiled spring. | Investigating ground positioning data from GPS stations throughout the west, Scripps researchers Adrian Borsa, Duncan Agnew, and Dan Cayan found that the water shortage is causing an "uplift" effect up to 15 millimeters (more than half an inch) in California's mountains and on average four millimeters (0.15 of an inch) across the west. From the GPS data, they estimate the water deficit at nearly 240 gigatons (62 trillion gallons of water), equivalent to a six-inch layer of water spread out over the entire western U.S. Results of the study, which was supported by the U.S. Geological Survey (USGS), appear online August 21. While poring through various sets of data of ground positions from highly precise GPS stations within the National Science Foundation's Plate Boundary Observatory and other networks, Borsa, a Scripps assistant research geophysicist, kept noticing the same pattern over the 2003-2014 period: All of the stations moved upwards in the most recent years, coinciding with the timing of the current drought. Agnew, a Scripps Oceanography geophysics professor who specializes in studying earthquakes and their impact on shaping Earth's crust, says the GPS data can only be explained by rapid uplift of the tectonic plate upon which the western U.S. rests (Agnew cautions that the uplift has virtually no effect on the San Andreas fault and therefore does not increase the risk of earthquakes). For Cayan, a research meteorologist with Scripps and USGS, the results paint a new picture of the dire hydrological state of the west. "These results quantify the amount of water mass lost in the past few years," said Cayan. "It also represents a powerful new way to track water resources over a very large landscape. We can home in on the Sierra Nevada mountains and critical California snowpack. These results demonstrate that this technique can be used to study changes in fresh water stocks in other regions around the world, if they have a network of GPS sensors." | Earthquakes | 2,014
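The article's figures are mutually consistent, as a quick unit check shows; the only derived quantity is the land area implied by the stated six-inch layer:

    # 240 gigatons of water: 1 Gt = 1e12 kg = 1e9 m^3 (at 1000 kg/m^3)
    volume_m3 = 240e9
    gallons = volume_m3 / 3.785e-3          # 1 US gallon = 3.785 liters
    print(f"{gallons:.2e} US gallons")      # ~6.3e13, i.e. ~62-63 trillion

    layer_m = 6 * 0.0254                    # a six-inch layer of water
    area_km2 = volume_m3 / layer_m / 1e6
    print(f"implied area: {area_km2:.2e} km^2")   # ~1.6 million km^2

That implied area of roughly 1.6 million square kilometers is a reasonable footprint for the western United States.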
August 17, 2014 | https://www.sciencedaily.com/releases/2014/08/140817215912.htm | New mechanism of erosion: Gorges are eradicated by downstream sweep erosion | Local surface uplift can block rivers, particularly in mountainous regions. The impounded water, however, always finds its way downstream, often cutting a narrow gorge into the rocks. Subsequent erosion of the rocks can lead to a complete eradication of this initial incision, until not a trace is left of the original breakthrough. In extreme cases the whole gorge disappears, leaving behind a broad valley with a flat floodplain. Previously, the assumption was that this transition from a narrow gorge to a wide valley was driven by gorge widening and the erosion of the walls of the gorges. | A team of scientists from the GFZ German Research Centre for Geosciences in Potsdam has now revealed a new mechanism that drives this process of fluvial erosion (Nature Geoscience, 17.08.2014). The geoscientists analyzed the development of a gorge on the Da'an Chi river in Taiwan over a period of almost ten years. There, uplift that was caused by the Jiji earthquake of 1999 (magnitude 7.6), and that runs transverse to the river, had formed a blockage. Earthquakes of that size occur there every 300 to 500 years. "Before the quake there was no sign of a gorge at all in this riverbed, which is one and a half kilometers wide," explains Kristen Cook of the GFZ. "We have here the world's first real-time observation of the evolution of gorge width by fluvial erosion over the course of several years." Currently the gorge is roughly a kilometer long, 25 meters wide and up to 17 meters deep. Initially, the gorge walls were eroded at a rate of five meters per year, and today are still retreating one and a half meters per year. The scientists identified a hitherto unknown mechanism by which the gorge is destroyed. "Downstream sweep erosion," they termed this process. "A wide braided channel upstream of the gorge is necessary," explains co-author Jens Turowski (GFZ). "The course of this channel changes regularly and it has to flow in sharp bends to run into the gorge. In these bends, the bed-load material that is transported by the river hits the upper edge of the gorge causing rapid erosion." This mechanism gradually washes away all of the bedrock surrounding the gorge and, therefore, is the cause for the planation of the riverbed over the complete width of the valley. Assuming the current erosion rate of 17 meters per year, it will take here at the Da'an Chi River only 50 to 100 years until a flat beveled channel again fills the valley. In contrast, lateral erosion in the gorge would be too slow to eradicate the gorge in the time of one earthquake cycle. The newly discovered downstream sweep erosion is far more effective. | Earthquakes | 2,014
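The 50-to-100-year lifetime quoted above is essentially one division -- the kilometer-long gorge consumed from its upstream end at the observed sweep rate; the rounding here is mine:

    gorge_length_m = 1000.0       # "roughly a kilometer long"
    sweep_rate_m_per_yr = 17.0    # current downstream sweep erosion rate
    print(gorge_length_m / sweep_rate_m_per_yr)   # ~59 years, within 50-100

Variations in the sweep rate over time would push the answer toward either end of the quoted range.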
August 14, 2014 | https://www.sciencedaily.com/releases/2014/08/140814124648.htm | Study of Chilean quake shows potential for future earthquake | Near real-time analysis of the April 1 earthquake in Iquique, Chile, showed that the 8.2 event occurred in a gap on the fault unruptured since 1877 and that the April event was not what the scientists had expected, according to an international team of geologists. | "We assumed that the area of the 1877 earthquake would eventually rupture, but all indications are that this 8.2 event was not the 8.8 event we were looking for," said Kevin P. Furlong, professor of geophysics, Penn State. "We looked at it to see if this was the big one." But according to the researchers, it was not. Seismologists expect that areas of faults will react the same way over and over. However, the April earthquake was about nine times less energetic than the one in 1877 and was incapable of releasing all the stress on the fault, leaving open the possibility of another earthquake. The Iquique earthquake took place on the northern portion of the subduction zone formed when the Nazca tectonic plate slides under the South American plate. This is one of the longest uninterrupted plate boundaries on the planet and the site of many earthquakes and volcanoes. The 8.2 earthquake was foreshadowed by a systematic sequence of foreshocks recorded at 6.0, 6.5, 6.7 and 6.2 with each foreshock triggering the next until the main earthquake occurred. These earthquakes relieved the stresses on some parts of the fault. Then the 8.2 earthquake relieved more stress, followed by a series of aftershocks in the range of 7.7. While the aftershocks did fill in some of the gaps left by the 8.2 earthquake, the large earthquake and aftershocks could not fill in the entire gap where the fault had not ruptured in a very long time. That area is unruptured and still under stress. The foreshocks eased some of the built-up stress on 60 to 100 miles of fault, and the main shock released stress on about 155 miles, but about 155 miles of fault remain unchanged, the researchers report today (Aug. 13). "There can still be a big earthquake there," said Furlong. "It didn't release the total hazard, but it told us something about this large earthquake area. That an 8.8 rupture doesn't always happen." The researchers were able to do this analysis in near real time because of the availability of large computing power and previously laid groundwork. The computing power allowed researchers to model the fault more accurately. In the past, subduction zones were modeled as if they were on a plane, but the plate that is subducting curves underneath the other plate, creating a 3-dimensional fault line. The researchers used a model that accounted for this curving and so more accurately recreated the stresses on the real geology at the fault. "One of the things the U.S. Geological Survey and we have been doing is characterizing the major tectonic settings," said Furlong. "So when an earthquake is imminent, we don't need a lot of time for the background." In essence, they are creating a library of information about earthquake faults and have completed the first level, a general set of information on areas such as Japan, South America and the Caribbean.
Now they are creating the next, finer levels, such as north and south Japan, or Chile, Peru and Ecuador. Knowing where the old earthquake occurred, how large it was and how long ago it happened, the researchers could look at the foreshocks, see how much stress they relieved and anticipate, at least in a small way, what would happen. "This is what we need to do in the future in near real time for decision makers," said Furlong. | Earthquakes | 2,014
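The "about nine times less energetic" comparison is the standard 1.5-exponent energy-magnitude scaling applied to the difference between the two magnitudes; a one-line check (the scaling law is standard, the rounding is the article's):

    energy_ratio = 10 ** (1.5 * (8.8 - 8.2))
    print(f"{energy_ratio:.1f}x")   # ~7.9x, i.e. roughly an order of magnitude

The exact factor depends on how the magnitudes were rounded, so anything from about eight to ten times is consistent with the quoted estimate.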
August 13, 2014 | https://www.sciencedaily.com/releases/2014/08/140813132120.htm | Foreshock series controls earthquake rupture | A long-lasting foreshock series controlled the rupture process of this year's great earthquake near Iquique in northern Chile. The earthquake was heralded by a three-quarter-year-long foreshock series of ever-increasing magnitudes culminating in a Mw 6.7 event two weeks before the mainshock. The mainshock (magnitude 8.1) finally broke a central piece out of the most important seismic gap along the South American subduction zone on April 1st. An international research team under the leadership of the GFZ German Research Centre for Geosciences has now revealed that the Iquique earthquake occurred in a region where the two colliding tectonic plates were only partly locked. | The Pacific Nazca plate and the South American plate are colliding along South America's western coast. While the Pacific sea floor submerges into an oceanic trench under the South American coast, the plates accumulate stress that is occasionally relieved by earthquakes. Over about 150 years, the entire plate margin from Patagonia in the south to Panama in the north breaks through completely in great earthquakes. This cycle is almost complete, with the exception of a last segment -- the seismic gap near Iquique in northern Chile. The last great earthquake in this gap occurred back in 1877. On the initiative of the GFZ, this gap was monitored in an international cooperation (GFZ, Institut de Physique du Globe Paris, Centro Sismologico Nacional -- Universidad de Chile, Universidad Catolica del Norte, Antofagasta, Chile) by the Integrated Plate Boundary Observatory Chile (IPOC), with, among other instruments, seismographs and continuous GPS. This long and continuous monitoring effort makes the Iquique earthquake the best-recorded subduction megathrust earthquake globally. The fact that IPOC data are distributed to the scientific community in near real time allowed this timely analysis. The mainshock of magnitude 8.1 broke the 150 km long central piece of the seismic gap, leaving, however, two large segments north and south intact. GFZ scientist Bernd Schurr headed the newly published study. Despite the fact that the IPOC instruments delivered continuous data before, during and after the earthquake, the GFZ HART (Hazard And Risk Team) group went into the field to meet with international colleagues to conduct additional investigations. More than a dozen researchers continue to measure on-site deformation and record aftershocks in the aftermath of this great rupture. Because the seismic gap is still not closed, IPOC is being further developed. So far 20 multi-parameter stations have been deployed. These consist of seismic broadband and strong-motion sensors, continuous GPS receivers, magneto-telluric and climate sensors, as well as creepmeters, which transmit data in near real-time to Potsdam. The European Southern Observatory has also been integrated into the observation network. | Earthquakes | 2,014
August 10, 2014 | https://www.sciencedaily.com/releases/2014/08/140810214206.htm | 2010 Chilean earthquake causes icequakes in Antarctica | Seismic events aren't rare occurrences on Antarctica, where sections of the frozen desert can experience hundreds of micro-earthquakes an hour due to ice deformation. Some scientists call them icequakes. But in March of 2010, the ice sheets in Antarctica vibrated a bit more than usual because of something more than 3,000 miles away: the 8.8-magnitude Chilean earthquake. A new Georgia Institute of Technology study finds that the distant quake triggered icequakes across the continent. | To study the quake's impact on Antarctica, the Georgia Tech team looked at seismic data from 42 stations in the six hours before and after the 3:34 a.m. event. The researchers used the same technology that allowed them to "hear" the seismic response at large distances for the devastating 2011 magnitude 9 Japan earthquake as it rumbled through Earth. In other words, they simply removed the longer-period signals as the seismic waves spread from the distant epicenter to identify high-frequency signals from nearby sources. Nearly 30 percent (12 of the 42 stations) showed clear evidence of high-frequency seismic signals as the surface wave arrived on Antarctica. "We interpret these events as small icequakes, most of which were triggered during or immediately after the passing of long-period Rayleigh waves generated from the Chilean mainshock," said Zhigang Peng, an associate professor in the School of Earth and Atmospheric Sciences who led the study. "This is somewhat different from the micro-earthquakes and tremor caused by both Love and Rayleigh-type surface waves that traditionally occur in other tectonically active regions thousands of miles from large earthquakes." Peng says the subtle difference is that micro-earthquakes respond to both shearing and volumetric deformation from distant events. The newly found icequakes respond only to volumetric deformation. "Such differences may be subtle, but they tell us that the mechanisms of these triggered icequakes and small earthquakes are different," Peng added. "One is more like cracking, while the other is like a shear slip event. It's similar to two hands passing each other." Some of the icequakes were quick bursts and over in less than one second. Others were long duration, tremor-like signals up to 10 seconds. They occurred in various parts of the continent, including seismic stations along the coast and near the South Pole. The researchers found the clearest indication of induced high-frequency signals at station HOWD near the northwest corner of the Ellsworth Mountains. Short bursts occurred when the P wave hit the station, then continued again when the Rayleigh wave arrived. The triggered icequakes had highly similar waveform patterns, which indicates repeated failure at a single location, possibly by the opening of cracks. Peng says the source locations of the icequakes are difficult to determine because there isn't extensive seismic network coverage in Antarctica. "But at least some of the icequakes themselves create surface waves, so they are probably formed very close to the ice surface," he added. "While we cannot be certain, we suspect they simply reflect fracturing of ice in the near surface due to alternating volumetric compressions and expansions as the Rayleigh waves passed through Antarctica's frozen ice." Antarctica was originally not on the research team's target list.
While examining seismic stations in the Southern Hemisphere, Peng "accidentally" found the triggered icequakes at a few openly available stations. He and former Georgia Tech postdoctoral student Jake Walter (now a research scientist at the Institute for Geophysics at UT Austin) then reached out to other seismologists (the paper's four co-authors) who were in charge of deploying more broadband seismometers in Antarctica. | Earthquakes | 2,014
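The "removing the longer-period signals" step is, in signal-processing terms, a high-pass filter applied to each seismogram. A minimal sketch with synthetic data; the corner frequency, sampling rate, and amplitudes are assumptions, not the study's actual processing parameters:

    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 100.0                                   # samples per second (assumed)
    t = np.arange(0.0, 60.0, 1.0 / fs)
    # Synthetic trace: a 20-s-period Rayleigh wave plus a brief 10 Hz "icequake"
    trace = np.sin(2 * np.pi * 0.05 * t)
    trace[3000:3100] += 0.2 * np.sin(2 * np.pi * 10.0 * t[3000:3100])

    b, a = butter(4, 5.0, btype="highpass", fs=fs)   # 5 Hz corner (assumed)
    hf = filtfilt(b, a, trace)
    print(np.abs(hf).argmax() / fs)              # burst stands out near t = 30 s

After filtering, the large-amplitude teleseismic surface wave all but vanishes and the local burst dominates, which is how 12 of the 42 stations revealed their triggered icequakes.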
July 24, 2014 | https://www.sciencedaily.com/releases/2014/07/140724124445.htm | Fukushima accident underscores need for U.S. to seek out new information about nuclear plant hazards | A new congressionally mandated report from the National Academy of Sciences concludes that the overarching lesson learned from the 2011 Fukushima Daiichi nuclear accident is that nuclear plant licensees and their regulators must actively seek out and act on new information about hazards with the potential to affect the safety of nuclear plants. The committee that wrote the report examined the causes of the Japan accident and identified findings and recommendations for improving nuclear plant safety and offsite emergency responses to nuclear plant accidents in the U.S. | The accident at the Fukushima Daiichi plant was initiated by the Great East Japan Earthquake and tsunami on March 11, 2011. The earthquake knocked out offsite AC power to the plant, and the tsunami inundated portions of the plant site. Flooding of critical equipment resulted in the extended loss of onsite power with the consequent loss of reactor monitoring, control, and cooling functions in multiple units. Three reactors -- Units 1, 2, and 3 -- sustained severe core damage, and three reactor buildings -- Units 1, 3, and 4 -- were damaged by hydrogen explosions. Offsite releases of radioactive materials contaminated land in Fukushima and several neighboring prefectures, prompting widespread evacuations, distress among the population, large economic losses, and the eventual shutdown of all nuclear power plants in Japan. Personnel at the Fukushima Daiichi plant responded to the accident with courage and resilience, and their actions likely reduced its severity and the magnitude of offsite radioactive material releases, the committee said. However, several factors relating to the management, design, and operation of the plant prevented plant personnel from achieving greater success and contributed to the overall severity of the accident. Nuclear plant operators and regulators in the U.S. and other countries are taking useful actions to upgrade nuclear plant systems, operating procedures, and operator training in response to the Fukushima Daiichi accident. As the U.S. nuclear industry and its regulator, the U.S. Nuclear Regulatory Commission (USNRC), implement these actions, the report recommends particular attention to improving the availability, reliability, redundancy, and diversity of specific nuclear plant systems:
· DC power for instrumentation and safety system control
· tools for estimating real-time plant status during loss of power
· reactor heat removal, reactor depressurization, and containment venting systems and protocols
· instrumentation for monitoring critical thermodynamic parameters -- for example temperature and pressure -- in reactors, containments, and spent-fuel pools
· hydrogen monitoring, including monitoring in reactor buildings, and mitigation
· instrumentation for both onsite and offsite radiation and security monitoring
· communications and real-time information systems
To further improve the resilience of U.S. nuclear plants, the report also recommends:
· The U.S. nuclear industry and the USNRC should give specific attention to improving resource availability and operator training, including training for developing and implementing ad hoc responses to deal with unanticipated complexities.
· The U.S. nuclear industry and USNRC should strengthen their capabilities for assessing risks from events that could challenge the design of nuclear plant structures and components and lead to a loss of critical safety functions. Part of this effort should focus on events that have the potential to affect large geographic regions and multiple nuclear plants, including earthquakes, tsunamis and other geographically extensive floods, and geomagnetic disturbances. USNRC should support these efforts by providing guidance on approaches and overseeing rigorous peer review.
· USNRC should further incorporate modern risk concepts into its nuclear safety regulations using these strengthened capabilities.
· USNRC and the U.S. nuclear industry must continuously monitor and maintain a strong safety culture and should examine opportunities to increase the transparency of and communication about their efforts to assess and improve nuclear safety.
Until now, U.S. safety regulations have been based on ensuring plants are designed to withstand certain specified failures or abnormal events, or "design-basis events" -- such as equipment failures, loss of power, and inability to cool the reactor core -- that could impair critical safety functions. However, four decades of analysis and experience have demonstrated that reactor core-damage risks are dominated by "beyond-design-basis events," the report says. The Fukushima Daiichi, Three Mile Island, and Chernobyl accidents were all initiated by beyond-design-basis events. The committee found that current approaches for regulating nuclear plant safety, which have been based traditionally on deterministic concepts such as the design-basis accident, are clearly inadequate for preventing core-melt accidents and mitigating their consequences. A more complete application of modern risk-assessment principles in licensing and regulation could help address this inadequacy and enhance the overall safety of all nuclear plants, present and future. The Fukushima Daiichi accident raised the question of whether offsite emergency preparedness in the U.S. would be challenged if a similar-scale event -- involving several concurrent disasters -- occurred here. The committee lacked time and resources to perform an in-depth examination of U.S. preparedness for severe nuclear accidents. The report recommends that the nuclear industry and organizations with emergency management responsibilities assess their preparedness for severe nuclear accidents associated with offsite regional-scale disasters. Emergency response plans, including plans for communicating with affected populations, should be revised or supplemented to ensure that there are scalable and effective strategies, well-trained personnel, and adequate resources for responding to long-duration accident/disaster scenarios. In addition, industry and emergency management organizations should assess the balance of protective actions -- such as evacuation, sheltering-in-place, and potassium iodide distribution -- for affected offsite populations and revise the guidelines as appropriate. Particular attention should be given to protective actions for children, those who are ill, and the elderly and their caregivers; long-term social, psychological, and economic impacts of sheltering-in-place, evacuation, and/or relocation; and decision making for resettlement of evacuated populations in areas that were contaminated by radioactive material. Report: | Earthquakes | 2,014
July 22, 2014 | https://www.sciencedaily.com/releases/2014/07/140722152430.htm | Oso disaster had its roots in earlier landslides | The disastrous March 22 landslide that killed 43 people in the rural Washington state community of Oso involved the "remobilization" of a 2006 landslide on the same hillside, a new federally sponsored geological study concludes. | The research indicates the landslide, the deadliest in U.S. history, happened in two major stages. The first stage remobilized the 2006 slide, including part of an adjacent forested slope from an ancient slide, and was made up largely or entirely of deposits from previous landslides. The first stage ultimately moved more than six-tenths of a mile across the north fork of the Stillaguamish River and caused nearly all the destruction in the Steelhead Haven neighborhood. The second stage started several minutes later and consisted of ancient landslide and glacial deposits. That material moved into the space vacated by the first stage and moved rapidly until it reached the trailing edge of the first stage, the study found. The report, released Tuesday on the four-month anniversary of the slide, details an investigation by a team from the Geotechnical Extreme Events Reconnaissance Association, or GEER. The scientists and engineers determined that intense rainfall in the three weeks before the slide likely was a major issue, but factors such as altered groundwater migration, weakened soil consistency because of previous landslides and changes in hillside stresses played key roles. The extreme events group is funded by the National Science Foundation, and its goal is to collect perishable data immediately in the wake of extreme events such as earthquakes, hurricanes, tsunamis, landslides or floods. Recent events for which reports have been filed include earthquakes in New Zealand and Haiti, the 2011 earthquake and tsunami in Japan, and Hurricane Sandy on the U.S. Eastern Seaboard in 2012. "Perhaps the most striking finding is that, while the Oso landslide was a rare geologic occurrence, it was not extraordinary," said Joseph Wartman, a University of Washington associate professor of civil and environmental engineering and a team leader for the study. "We observed several other older but very similar long-runout landslides in the surrounding Stillaguamish River Valley. This tells us these may be prevalent in this setting over long time frames. Even the apparent trigger of the event -- several weeks of intense rainfall -- was not truly exceptional for the region," Wartman said. Team co-leader Jeffrey Keaton, a principal engineering geologist with AMEC Americas, an engineering consultant and project management company, said another important finding is that spring of 2014 was not a big time for landslides in Northwest Washington. "The Oso landslide was the only major one that occurred in Snohomish County or the Seattle area this spring," Keaton said. Other team members are Scott Anderson of the Federal Highway Administration, Jean Benoit of the University of New Hampshire, John deLaChapelle of Golder Associates Inc., Robert Gilbert of the University of Texas and David Montgomery of the University of Washington. The team was formed and approved within days of the landslide, but it began work at the site about eight weeks later, after search and recovery activities were largely completed. The researchers documented conditions and collected data that could be lost over time.
Their report is based largely on data collected during a four-day study of the entire landslide area in late May. It focuses on data and observations directly from the site, but also considers information such as local geologic and climate conditions and eyewitness accounts. The researchers reviewed evidence for a number of large landslides in the Stillaguamish Valley around Oso during the previous 6,000 years, many of them strongly resembling the site of the 2014 slide. There is solid evidence, for example, of a slide just west of this year's slide that also ran out across the valley. In addition, they reviewed published maps showing the entire valley bottom in the Oso area is made up of old landslide deposits or areas where such deposits have been reworked by the river and left on the flood plain. The team estimated that large landslides such as the March event have happened in the same area as often as every 400 years (based on 15 mapped large landslides) to every 1,500 years (based on carbon dating of what appears to be the oldest of four generations of large slides) during the last six millennia. The researchers found that the size of the landslide area grew slowly starting in the 1930s until 2006, when it increased dramatically. That was followed by this year's catastrophically larger slide. Studies in previous decades indicated a high landslide risk for the Oso area, the researchers found, but they noted that it does not appear there was any publicly communicated understanding that debris from a landslide could run as far across the valley as it did in March. In addition to the fatalities, that event seriously injured at least 10 people and caused damage estimated at more than $50 million. "For me, the most important finding is that we must think about landslides in the context of 'risk' rather than 'hazard,'" Wartman said. "While these terms are often used interchangeably, there is a subtle but important difference. Landslide hazard, which was well known in the region, tells us the likelihood that a landslide will occur, whereas landslide risk tells us something far more important -- the likelihood that human losses will occur as a result of a landslide. "From a policy perspective, I think it is very important that we begin to assess and clearly communicate the risks from landslides," he said. Other study conclusions include:
• That past landslides and associated debris deposited by water should be carefully investigated when mapping areas for zoning purposes.
• That the influence of precipitation on destabilizing a slope should consider both cumulative amounts and short-duration intensities in assessing the likelihood of initial or renewed slope movement.
• That methods to identify and delineate potential landslide runout zones need to be revisited and re-evaluated.
The report is available at | Earthquakes | 2,014
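One way to translate the 400-to-1,500-year recurrence estimate into the "risk" framing Wartman advocates is a simple Poisson occurrence model; the model choice and the 50-year exposure window are my assumptions, not GEER's:

    import math

    def prob_at_least_one(recurrence_yr: float, window_yr: float) -> float:
        """Poisson probability of at least one event in a time window."""
        return 1.0 - math.exp(-window_yr / recurrence_yr)

    for recurrence in (400.0, 1500.0):
        p = prob_at_least_one(recurrence, 50.0)
        print(f"recurrence {recurrence:.0f} yr -> {p:.0%} chance in 50 yr")

Under these assumptions, the chance of another large slide somewhere in the valley within a 50-year horizon falls between about 3 and 12 percent -- small in any given year, but far from negligible for land-use decisions.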
July 17, 2014 | https://www.sciencedaily.com/releases/2014/07/140717094607.htm | New view of Mount Rainier's volcanic plumbing: Electrical images show upward flow of fluids to magma chamber | By measuring how fast Earth conducts electricity and seismic waves, a University of Utah researcher and colleagues made a detailed picture of Mount Rainier's deep volcanic plumbing and partly molten rock that will erupt again someday. | "This is the most direct image yet capturing the melting process that feeds magma into a crustal reservoir that eventually is tapped for eruptions," says geophysicist Phil Wannamaker, of the university's Energy & Geoscience Institute and Department of Civil and Environmental Engineering. "But it does not provide any information on the timing of future eruptions from Mount Rainier or other Cascade Range volcanoes." The study was published today. In an odd twist, the image appears to show that at least part of Mount Rainier's partly molten magma reservoir is located about 6 to 10 miles northwest of the 14,410-foot volcano, which is 30 to 45 miles southeast of the Seattle-Tacoma area. But that could be because the 80 electrical sensors used for the experiment were placed in a 190-mile-long, west-to-east line about 12 miles north of Rainier. So the main part of the magma chamber could be directly under the peak, but with a lobe extending northwest under the line of detectors, Wannamaker says. The top of the magma reservoir in the image is 5 miles underground and "appears to be 5 to 10 miles thick, and 5 to 10 miles wide in east-west extent," he says. "We can't really describe the north-south extent because it's a slice view." Wannamaker estimates the reservoir is roughly 30 percent molten. Magma chambers are like a sponge of hot, soft rock containing pockets of molten rock. The new image doesn't reveal the plumbing tying Mount Rainier to the magma chamber 5 miles below it. Instead, it shows water and partly molten and molten rock are generated 50 miles underground where one of Earth's seafloor crustal plates or slabs is "subducting" or diving eastward and downward beneath the North America plate, and how and where those melts rise to Rainier's magma chamber. The study was funded largely by the National Science Foundation's EarthScope program, which also has made underground images of the United States using seismic or sound-wave tomography, much like CT scans show the body's interior using X-rays. The new study used both seismic imaging and magnetotelluric measurements, which make images by showing how electrical and magnetic fields in the ground vary due to differences in how much underground rock and fluids conduct or resist electricity. Wannamaker says it is the most detailed cross-section view yet under a Cascades volcanic system using electrical and seismic imaging. Earlier seismic images indicated water and partly molten rock atop the diving slab. The new image shows melting "from the surface of the slab to the upper crust, where partly molten magma accumulates before erupting," he adds. Wannamaker and Rob L. Evans, of the Woods Hole Oceanographic Institution, conceived the study. First author R Shane McGary -- then at Woods Hole and now at the College of New Jersey -- did the data analysis. Other co-authors were Jimmy Elsenbeck of Woods Hole and Stéphane Rondenay of the University of Bergen. Mount Rainier, the tallest peak in the Cascades, "is an active volcano that will erupt again," says the U.S. Geological Survey.
Rainier sits atop volcanic flows up to 36 million years old. An ancestral Rainier existed 2 million to 1 million years ago. Frequent eruptions built the mountain's modern edifice during the past 500,000 years. During the past 11,000 years, Rainier erupted explosively dozens of times, spewing ash and pumice. Rainier once was taller until it collapsed during an eruption 5,600 years ago to form a large crater open to the northeast, much like the crater formed by Mount St. Helens' 1980 eruption. The 5,600-year-old eruption sent a huge mudflow west to Puget Sound, covering parts or all of the present sites of the Port of Tacoma, Seattle suburbs Kent and Auburn, and the towns Puyallup, Orting, Buckley, Sumner and Enumclaw. Rainier's last lava flows were 2,200 years ago, the last flows of hot rock and ash were 1,100 years ago and the last big mudflow 500 years ago. There are disputed reports of steam eruptions in the 1800s. The "ring of fire" is a zone of active volcanoes and frequent earthquake activity surrounding the Pacific Ocean. It exists where Earth's tectonic plates collide -- specifically, plates that make up the seafloor converge with plates that carry continents. From Cape Mendocino in northern California and north past Oregon, Washington state and into British Columbia, an oceanic plate is being pushed eastward and downward -- a process called subduction -- beneath the North American plate. This relatively small Juan de Fuca plate is located between the huge Pacific plate and the Pacific Northwest. New seafloor rock -- rich with water in cracks and minerals -- emerges from an undersea volcanic ridge some 250 miles off the coast, from northern California into British Columbia. That seafloor adds to the western edge of the Juan de Fuca plate and pushes it east-northeast under the Pacific Northwest, as far as Idaho. The part of the plate diving eastward and downward is called the slab, which ranges from 30 to 60 miles thick as it is jammed under the North American plate. The part of the North American plate above the diving slab is shaped like a wedge. When the leading, eastern edge of the diving slab descends deep enough, where pressures and temperatures are high, water-bearing minerals such as chlorite and amphibole release water from the slab, and the slab and surrounding mantle rock begin to melt. That is why the Cascade Range of active volcanoes extends north-to-south -- above the slab and parallel but about 120 miles inland from the coast -- from British Columbia south to Mount Shasta and Lassen Peak in northern California. In the new image, yellow-orange-red areas correspond to higher electrical conductivity (or lower resistivity) in places where fluids and melts are located. The underground image produced by the new study shows where water and molten rock accumulate atop the descending slab, and the route they take to the magma chamber that feeds eruptions of Mount Rainier:
-- The rock begins to melt atop the slab about 50 miles beneath Mount Rainier. Wannamaker says it is best described as partly molten rock that contains about 2 percent water and "is a mush of crystals within an interlacing network of molten rock."
-- Some water and partly molten rock actually get dragged downward atop the descending slab, to depths of 70 miles or more.
-- Other partly molten rock rises up through the upper mantle wedge, crosses into the crust at a depth of about 25 miles, and then rises into Rainier's magma chamber -- or at least the lobe of the chamber that crosses under the line of sensors used in the study. Evidence suggests the magma moves upward at least 0.4 inches per year.
-- The new magnetotelluric image also shows a shallower zone of fluid perhaps 60 miles west of Rainier and 25 miles deep at the crust-mantle boundary. Wannamaker says it is largely water released from minerals as the slab is squeezed and heated as it dives.
The seismic data were collected during 2008-2009 for other studies. The magnetotelluric data were gathered during 2009-2010 by authors of the new study. Wannamaker and colleagues placed an east-west line of magnetotelluric sensors: 60 that made one-day measurements and looked as deep as 30 miles into the Earth, and 20 that made measurements for a month and looked at even greater depths. | Earthquakes | 2,014
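Two standard magnetotelluric relations underpin an image like this one: apparent resistivity from the ratio of orthogonal horizontal electric and magnetic fields, and the skin depth that sets how deep a given frequency can probe. The representative resistivity in the depth estimate below is an assumed round number, not a value from the study:

    \[
    \rho_a = \frac{1}{\mu_0\,\omega}\left|\frac{E_x}{H_y}\right|^{2},
    \qquad
    \delta \approx 503\,\sqrt{\rho / f}\ \ \mathrm{m}.
    \]

With a crustal resistivity of order 100 ohm-m, reaching roughly 50 kilometers depth requires frequencies near 0.01 Hz -- periods of about 100 seconds and longer -- which is why some stations recorded for a month rather than a single day.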