Date | Link | Title | Summary | Body | Category | Year |
---|---|---|---|---|---|---|
July 16, 2014 | https://www.sciencedaily.com/releases/2014/07/140716141331.htm | 70-foot-long, 52-ton concrete bridge survives series of simulated earthquakes | A 70-foot-long, 52-ton concrete bridge survived a series of earthquakes in the first multiple-shake-table experiment in the University of Nevada, Reno's new Earthquake Engineering Lab. | "It was a complete success. The bridge withstood the design standard very well and today went over and above 2.2 times the design standard," said John Stanton, civil and environmental engineering professor and researcher from the University of Washington. Stanton collaborated with Foundation Professor David Sanders of the University of Nevada, Reno, in the novel experiment. "The bridge performed very well," Sanders said. "There was a lot of movement, about 12 percent deflection -- which is tremendous -- and it's still standing. You could hear the rebar inside the columns shearing, like a zipper opening. Just as it would be expected to do." The set of three columns swayed precariously, the bridge deck twisted and the sound filled the cavernous laboratory as the three 14- by 14-foot, 50-ton-capacity hydraulically driven shake tables moved the massive structure. "Sure we broke it, but we exposed it to extreme, off-the-scale conditions," Stanton said. "The important thing is it's still standing, with the columns coming to rest right where they started, meaning it could save lives and property. I'm quite happy." The bridge was designed and its components pre-cast at the University of Washington in Seattle; it was then built atop the three shake tables in the 24,500-square-foot lab. It was shaken in a series of simulated earthquakes, culminating in large ground motions similar to those recorded in the deadly and damaging 1995 magnitude 6.9 earthquake in Kobe, Japan. The rocking, pre-tensioned concrete bridge support system is a new bridge engineering design the team has developed with the aim of saving lives, reducing on-site construction time and minimizing earthquake damage. "By building the components off-site we can save time with construction on-site, minimizing interruptions in traffic and lowering construction costs," Sanders said. "In this case, the concrete columns and beams were pre-cast and tensioned at the University of Washington. Other components were built here at the University of Nevada, Reno. It took us only a month to build the bridge, in what would otherwise be a lengthy process." "This can't be done anywhere else in the nation, and perhaps the world," Ian Buckle, director of the lab and professor of civil engineering, said of the test. "Of course we've been doing these types of large-scale structures experiments for years, but it's exciting to have this first test using multiple tables in this building complete. It's good to see the equipment up and running successfully." When combined with the University's Large-Scale Structures Laboratory, just steps away from the new lab, the facility is the biggest and most versatile large-scale structures and earthquake/seismic engineering facility in the United States, according to the National Institute of Standards and Technology, and possibly the largest university-based facility of its kind in the world. A grand opening was held recently for the $19 million lab expansion project, funded with $12.2 million from the U.S. Department of Commerce's National Institute of Standards and Technology, along with funds from the Department of Energy, the University and donors. The expansion allows a broader range of experiments, and there is additional space to add a fifth large shake table. "Our facility is unique worldwide and, combined with the excellence of our faculty and students, will allow us to make even greater contributions to the seismic safety of our state, the nation and the world," Manos Maragakis, dean of the College of Engineering, said. "We will test new designs and materials that will improve our homes, hospitals, offices and highway systems. Remarkable research is carried on here. Getting to this point has taken a lot of hard work. It's both a culmination and a beginning, ushering in a new era." | Earthquakes | 2014 |
July 15, 2014 | https://www.sciencedaily.com/releases/2014/07/140715214213.htm | Rainwater discovered at new depths, with high pressure and temperatures over 300 degrees Celsius | University of Southampton researchers have found that rainwater can penetrate below the Earth's fractured upper crust, which could have major implications for our understanding of earthquakes and the generation of valuable mineral deposits. | It had been thought that surface water could not penetrate the ductile crust -- where temperatures of more than 300°C and high pressures cause rocks to flex and flow rather than fracture -- but researchers, led by Southampton's Dr Catriona Menzies, have now found fluids derived from rainwater at these levels. Fluids in the Earth's crust can weaken rocks and may help to initiate earthquakes along locked fault lines. They also concentrate valuable metals such as gold. The new findings suggest that rainwater may be responsible for controlling these important processes, even deep in the Earth. Researchers from the University of Southampton, GNS Science (New Zealand), the University of Otago, and the Scottish Universities Environmental Research Centre studied geothermal fluids and mineral veins from the Southern Alps of New Zealand, where the collision of two tectonic plates forces deeper layers of the Earth closer to the surface. The team looked into the origin of the fluids, how hot they were and to what extent they had reacted with rocks deep within the mountain belt. "When fluids flow through the crust they leave behind deposits of minerals that contain a small amount of water trapped within them," says Dr Menzies, a postdoctoral researcher based at the National Oceanography Centre. "We have analysed these waters and minerals to identify where the fluids deep in the crust came from." "Fluids may come from a variety of sources in the crust. In the Southern Alps fluids may flow upwards from deep in the crust, where they are released from hot rocks by metamorphic reactions, or rainwater may flow down from the surface, forced by the high mountains above. We wanted to test the limits of where rainwater may flow in the crust. Although it has been suggested before, our data shows for the first time that rainwater does penetrate into rocks that are too deep and hot to fracture." Surface-derived waters reaching such depths are heated to over 400°C and react significantly with crustal rocks. However, through testing the researchers were able to establish the water's meteoric origin. Funding for this research, which has been published in Earth and Planetary Science Letters, was provided by the Natural Environment Research Council (NERC). Dr Menzies and her team are now looking further at the implications of their findings in relation to earthquake cycles as part of the international Deep Fault Drilling Project, which aims to drill a hole through the Alpine Fault at a depth of about 1km later this year. | Earthquakes | 2014 |
July 10, 2014 | https://www.sciencedaily.com/releases/2014/07/140710141626.htm | Evidence of super-fast deep earthquake: Rare high-speed rupture off Russia and similar phenomena on shallow fault zones | As scientists learn more about earthquakes that rupture at fault zones near the planet's surface -- and the mechanisms that trigger them -- an even more intriguing earthquake mystery lies deeper in the planet. | Scientists at Scripps Institution of Oceanography at UC San Diego have discovered the first evidence that deep earthquakes, those breaking at more than 400 kilometers (250 miles) below Earth's surface, can rupture much faster than ordinary earthquakes. The finding gives seismologists new clues about the forces behind deep earthquakes as well as fast-breaking earthquakes that strike near the surface. Seismologists have documented a handful of these events, in which an earthquake's rupture travels faster than the shear waves of seismic energy that it radiates. These "supershear" earthquakes have rupture speeds of four kilometers per second (an astonishing 9,000 miles per hour) or more. In a National Science Foundation-funded study reported in June 2014, the Scripps researchers Zhongwen Zhan and Peter Shearer examined the aftermath of the powerful 2013 Sea of Okhotsk earthquake. Details of a magnitude 6.7 aftershock of the event captured Zhan's attention. Analyzing data from the IRIS (Incorporated Research Institutions for Seismology) consortium, which coordinates a global network of seismological instruments, Zhan noted that most seismometers around the world yielded similar records, all suggesting an anomalously short duration for a magnitude 6.7 earthquake. Data from one seismometer, however, stationed closest to the event in Russia's Kamchatka Peninsula, told a different story with intriguing details. After closely analyzing the data, Zhan not only found that the aftershock ruptured extremely deeply at 640 kilometers (400 miles) below Earth's surface, but its rupture velocity was extraordinary -- about eight kilometers per second (five miles per second), nearly 50 percent faster than the shear wave velocity at that depth. "For a 6.7 earthquake you would expect a duration of seven to eight seconds, but this one lasted just two seconds," said Shearer, a geophysics professor in the Cecil H. and Ida M. Green Institute of Geophysics and Planetary Physics (IGPP) at Scripps. "This is the first definitive example of supershear rupture for a deep earthquake, since previously supershear ruptures have been documented only for shallow earthquakes." "This finding will help us understand why deep earthquakes happen," said Zhan. "One quarter of earthquakes occur at large depths, and some of these can be pretty big, but we still don't understand why they happen. So this earthquake provides a new observation for deep earthquakes and high-rupture speeds." Zhan also believes the new information will be useful in examining ultra-fast earthquakes and their potential for impacting fault zones near Earth's surface. Although not of supershear caliber, California's destructive 1994 Northridge earthquake had a comparable size and geometry to that of the 6.7 Sea of Okhotsk aftershock. "If a shallow earthquake such as Northridge goes supershear, it could cause even more shaking and possibly more damage," said Zhan. | Earthquakes | 2014 |
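Shearer's duration argument can be made concrete with a rough scaling calculation. The sketch below is illustrative only and not the study's method: the 3 MPa stress drop, the 5.5 km/s shear-wave speed at depth, and the crude cube-root rupture-dimension scaling are all assumed values.

```python
# Back-of-the-envelope estimate (assumed values, not the study's inversion):
# relate rupture duration to rupture speed for an Mw 6.7 earthquake.

def moment_from_magnitude(mw: float) -> float:
    """Seismic moment M0 in N*m from moment magnitude (standard definition)."""
    return 10 ** (1.5 * mw + 9.1)

M0 = moment_from_magnitude(6.7)                   # ~1.4e19 N*m
STRESS_DROP = 3e6                                 # Pa, typical assumed stress drop
rupture_length = (M0 / STRESS_DROP) ** (1 / 3)    # crude rupture dimension, ~17 km
SHEAR_SPEED = 5500.0                              # m/s, assumed shear speed at 640 km

for duration_s in (7.5, 2.0):                     # "expected" vs. observed duration
    v_rupture = rupture_length / duration_s
    print(f"{duration_s:4.1f} s rupture -> {v_rupture / 1000:4.1f} km/s "
          f"({v_rupture / SHEAR_SPEED:.2f} x shear speed)")
```

With the quoted two-second duration, the implied rupture speed lands near eight kilometers per second, roughly 1.5 times the shear-wave speed, consistent with the supershear behavior described above.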
July 8, 2014 | https://www.sciencedaily.com/releases/2014/07/140708111137.htm | Was da Vinci wrong? New research shows friction and fracture are interrelated, with implications for earthquakes | Overturning conventional wisdom stretching all the way back to Leonardo da Vinci, new Hebrew University of Jerusalem research shows that how things break (fracture) and how things slide (friction) are closely interrelated. The breakthrough study marks an important advance in understanding friction and fracture, with implications for describing the mechanics that drive earthquakes. | Over 500 years ago, da Vinci described how rough blocks slide over one another, providing the basis for our understanding of friction to this day. The phenomenon of fracture was always considered to be something totally different. But new research by Prof. Jay Fineberg and his graduate student Ilya Svetlizky, at the Hebrew University's Racah Institute of Physics, has demonstrated that these two seemingly disparate processes of fracture and friction are actually intimately intertwined. In the newly published study, Fineberg and Svetlizky produced "laboratory earthquakes" showing that the friction caused by the sliding of two contacting blocks can only occur when the connections between the surfaces are first ruptured (that is, fractured or broken) in an orderly, "organized" process that takes place at nearly the speed of sound. How does this happen? Before any motion can occur, the blocks are connected by interlocking rough contacts that define their interface. In order for motion to occur, these connections have to be broken. This physical process of breaking is called a fracture process, and it is described by the theory of crack propagation, say the researchers, meaning that the stresses (or forces) that exist at the front edge of a crack become highly magnified, even if the overall forces being applied are initially quite small. "The insights gained from our study provide a new paradigm for understanding friction and give us a new, fundamental description of the mechanics and behavior that drive earthquakes, the sliding of two tectonic blocks within natural faults," says Fineberg. "In this way, we can now understand important processes that are generally hidden kilometers beneath the Earth's surface." The research was supported by the James S. McDonnell Fund, the European Research Council (grant no. 267256) and the Israel Science Foundation (grant 76/11). | Earthquakes | 2014 |
July 8, 2014 | https://www.sciencedaily.com/releases/2014/07/140708092129.htm | Giant earthquakes help predict volcanic eruptions | Researchers at the Institut des Sciences de la Terre (CNRS/Université Joseph Fourier/Université de Savoie/IRD/IFSTTAR) and the Institut de Physique du Globe de Paris (CNRS/Université Paris Diderot/IPGP), working in collaboration with Japanese researchers, have for the first time observed the response of Japanese volcanoes to seismic waves produced by the giant Tohoku-oki earthquake of 2011. Their conclusions, now published, should help in anticipating the risk of major volcanic eruptions worldwide. | Until the early 2000s, seismic noise* was systematically removed from seismological analyses. This background noise is in fact associated with seismic waves caused by ocean swell. These waves, which can be compared to permanent, continuous microseisms, can be used by seismologists instead of earthquakes (which are highly localized over a limited time period) to image Earth's interior and its evolution over time, rather like an ultrasound scan on a global scale. Now, seismic noise has been used for the continuous measurement of perturbations of the mechanical properties of Earth's crust. Researchers at the Institut des Sciences de la Terre (CNRS/Université Joseph Fourier/Université de Savoie/IRD/IFSTTAR) and the Institut de Physique du Globe de Paris (CNRS/Université Paris Diderot/IPGP) have applied this novel method while working in collaboration with Japanese colleagues using the Hi-net network, which is the world's densest seismic network (comprising more than 800 seismic detectors throughout Japan). After the giant Tohoku-oki earthquake of 2011, the researchers analyzed over 70 terabytes of seismic data from the network. For the first time, they showed that the regions where the perturbations of Earth's crust were the greatest were not those where the shocks were the strongest. They were in fact localized under volcanic regions, especially under Mount Fuji. The new method thus enabled the scientists to observe the anomalies caused by the perturbations from the earthquake in volcanic regions under pressure. Mount Fuji, which exhibits the greatest anomaly, is probably under great pressure, although no eruption has yet followed the Tohoku-oki earthquake. The magnitude 6.4 earthquake that occurred four days after the 2011 quake confirms the critical state of the volcano in terms of pressure. These findings lend support to theories that the last eruption of Mount Fuji, in 1707, was probably triggered by the giant magnitude 8.7 Hoei earthquake, which took place 49 days before the eruption. More generally, the results show how regions affected by high-pressure volcanic fluids can be characterized using seismic data from dense seismic detector networks. This should help to anticipate the risk of major volcanic eruptions worldwide. *Seismic noise includes all the unwanted components affecting an analysis, such as the noise produced by the measuring device itself or external perturbations inadvertently picked up by the measuring devices. | Earthquakes | 2014 |
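The monitoring described above rests on detecting tiny changes in seismic velocity from repeated noise correlations. Below is a minimal sketch of one standard approach, the "stretching" method, run on synthetic data; it is a simplified stand-in for illustration, not the processing actually applied to the Hi-net archive.

```python
# Minimal sketch of the "stretching" method used in ambient-noise monitoring,
# demonstrated on synthetic data (real Hi-net processing is far more involved).
import numpy as np

def dv_over_v(reference, current, t, trial_eps=np.linspace(-0.01, 0.01, 201)):
    """Grid-search the stretch factor eps that best maps current(t*(1+eps))
    onto reference(t); the relative velocity change is dv/v = -eps."""
    best = max(trial_eps,
               key=lambda eps: np.corrcoef(
                   np.interp(t * (1 + eps), t, current), reference)[0, 1])
    return -best

# Synthetic demo: a 1% velocity drop stretches arrival times by ~1%.
t = np.linspace(0.0, 50.0, 5001)
ref = np.sin(2 * np.pi * 0.5 * t) * np.exp(-t / 20.0)   # toy correlation waveform
cur = np.interp(t / 1.01, t, ref)                        # same waveform, 1% slower medium
print(f"estimated dv/v = {dv_over_v(ref, cur, t):+.3%}")  # ~ -1.000%
```

A velocity drop of this kind, mapped across thousands of station pairs, is what localized the anomalies under the volcanic regions.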
July 3, 2014 | https://www.sciencedaily.com/releases/2014/07/140703142340.htm | Oklahoma earthquakes induced by wastewater injection at disposal wells, study finds | The dramatic increase in earthquakes in central Oklahoma since 2009 is likely attributable to subsurface wastewater injection at just a handful of disposal wells, a new study finds. | The research team was led by Katie Keranen, professor of geophysics at Cornell University, who says Oklahoma earthquakes constitute nearly half of all central and eastern U.S. seismicity from 2008 to 2013, many occurring in areas of high-rate water disposal. "Induced seismicity is one of the primary challenges for expanded shale gas and unconventional hydrocarbon development. Our results provide insight into the process by which the earthquakes are induced and suggest that adherence to standard best practices may substantially reduce the risk of inducing seismicity," said Keranen. "The best practices include avoiding wastewater disposal near major faults and the use of appropriate monitoring and mitigation strategies." The study also concluded that monitoring and public reporting are essential: "Earthquake and subsurface pressure monitoring should be routinely conducted in regions of wastewater disposal, and all data from those should be publicly accessible. This should also include detailed monitoring and reporting of pumping volumes and pressures," said Keranen. "In many states the data are more difficult to obtain than for Oklahoma; databases should be standardized nationally. Independent quality assurance checks would increase confidence." | Earthquakes | 2014 |
July 1, 2014 | https://www.sciencedaily.com/releases/2014/07/140701142937.htm | New bridge design improves earthquake resistance, reduces damage and speeds construction | Researchers have developed a new design for the framework of columns and beams that support bridges, called "bents," to improve performance for better resistance to earthquakes, less damage and faster on-site construction. | The faster construction is achieved by pre-fabricating the columns and beams off-site and shipping them to the site, where they are erected and connected quickly. "The design of reinforced concrete bridges in seismic regions has changed little since the mid-1970s," said John Stanton, a professor in the Department of Civil and Environmental Engineering at the University of Washington, Seattle, who developed the concept underlying the new design. The team members include professor Marc Eberhard and graduate research assistants Travis Thonstad and Olafur Haraldsson from the University of Washington; and professor David Sanders and graduate research assistant Islam Mantawy from the University of Nevada, Reno. Research findings are included in a paper being presented during Quake Summit 2014, the annual meeting for the National Science Foundation's George E. Brown, Jr. Network for Earthquake Engineering Simulation (NEES), a shared network of laboratories based at Purdue University. This year's summit is part of the 10th U.S. National Conference on Earthquake Engineering on July 21-25 in Anchorage, Alaska. Until now the majority of bridge bents have been made using concrete that is cast in place, but that approach means time is needed for the concrete to gain strength before the next piece can be added. Pre-fabricating the pieces ahead of time eliminates this requirement, speeding on-site construction and reducing traffic delays. "However, pre-fabricating means the pieces need to be connected on-site, and therein lies a major difficulty," Stanton said. "It is hard enough to design connections that can survive earthquake shaking, or to design them so that they can be easily assembled, but to do both at once is a real challenge." Moreover, the researchers have achieved this goal using only common construction materials, which should smooth the way for owners and contractors to accept the new approach, he said. An important feature of the new system is that the columns are pre-tensioned. "A good analogy is to think of a series of a child's wooden building blocks, each with a hole through it," Stanton said. "Stack them on top of one another, put a rubber band through the central hole, stretch it tight and anchor it at each end. The rubber band keeps the blocks squeezed together. Now stand the assembly of blocks up on its end and you have a pre-tensioned column. If the bottom of the column is attached to a foundation block, you can push the top sideways, as would an earthquake, but the rubber band just snaps the column back upright when you let go." This "re-centering" action is important because it ensures that, directly after an earthquake, the bridge columns are vertical and not leaning over at an angle. This means that the bridge can be used by emergency vehicles in the critical moments immediately following the earthquake. "Of course, the real bridge columns do not contain rubber bands, but very high-strength steel cables are used to achieve the same behavior," Stanton said. To keep the site operations as simple as possible, those cables are stressed and embedded in the concrete at the plant where the columns are fabricated. The columns also contain some conventional rebar, which is also installed in the fabrication plant. The technology was pioneered in the building industry in the 1990s but is now being adapted for use with bridges. When the columns rock during an earthquake, they experience high local stresses at the points of contact, and without special measures the concrete there would crush. To counteract this possibility, the researchers protected the ends of the columns with short steel tubes, or "jackets," that confine the concrete, not unlike the hoops of a barrel, or the steel cap that ranchers use to protect the top of a fence-post while driving it into the ground. "Cyclic tests of the critical connections have demonstrated that the system can deform during strong earthquakes and then bounce back to vertical with minimal damage," Stanton said. Those tests were conducted on individual connections by graduate assistants Olafur Haraldsson, Jeffrey Schaefer and Bryan Kennedy. In July, the team will test a complete bridge built with the system. The test will be conducted at 25 percent of full scale on the earthquake-shaking tables at a facility at the University of Nevada, Reno. The facility is part of NEES. Travis Thonstad led the design and built the components for that test. The column and cap beam components were then shipped to the University of Nevada, Reno, where Islam Mantawy is leading the construction of the bridge. The team from Washington and Nevada will be processing the data from this project, and it will be archived and made available to the public through NEES. The Quake Summit paper was authored jointly by the team. The research was supported by the NSF, the Pacific Earthquake Engineering Research (PEER) Center and the Valle Foundation of the University of Washington. | Earthquakes | 2014 |
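The re-centering behavior described above can be expressed as a simple ratio check: the column snaps back upright when the restoring moment from the stretched tendons and gravity exceeds the opposing moment from the yielded mild-steel rebar. The sketch below is a generic, textbook-style criterion with made-up numbers; the function, its arguments and the lever arms are all hypothetical, not the research team's design equations.

```python
# Generic self-centering check for a hybrid rocking column (illustrative only;
# all names and numbers are hypothetical, not the team's actual design values).
def recentering_ratio(p_tendon, weight, rebar_force, lever_pt, lever_rebar):
    """Ratio of the restoring moment (prestress + gravity) to the opposing
    moment from yielded mild-steel rebar; > 1.0 means the column re-centers."""
    return (p_tendon + weight) * lever_pt / (rebar_force * lever_rebar)

# Illustrative numbers (forces in N, lever arms in m):
ratio = recentering_ratio(p_tendon=2000e3, weight=500e3,
                          rebar_force=800e3, lever_pt=0.30, lever_rebar=0.45)
print(f"re-centering ratio = {ratio:.2f}")   # > 1 -> column snaps back upright
```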
June 30, 2014 | https://www.sciencedaily.com/releases/2014/06/140630193409.htm | Gas-charged fluids creating seismicity associated with a Louisiana sinkhole | Natural earthquakes and nuclear explosions produce seismic waves that register on seismic monitoring networks around the globe, allowing the scientific community to pinpoint the location of the events. In order to distinguish seismic waves produced by a variety of activities -- from traffic to mining to explosions -- scientists study the seismic waves generated by as many types of events as possible. | In August 2012, the emergence of a very large sinkhole at the Napoleonville Salt Dome in Louisiana offered University of California, Berkeley scientists the opportunity to detect, locate and analyze a rich sequence of 62 seismic events that occurred one day prior to its discovery. In June 2012, residents of Bayou Corne reported frequent tremors and unusual gas bubbling in local surface water. The U.S. Geological Survey installed a temporary network of seismic stations, and on August 3, a large sinkhole was discovered close to the western edge of the salt dome. In the newly published study, the point-source equivalent force system describing the motions at the seismic source (called the moment tensor) showed similarities to seismic events produced by explosions and active geothermal and volcanic environments. But at the sinkhole, an influx of natural gas rather than hot magma may be responsible for elevating the pore pressure enough to destabilize pre-existing zones of weakness, such as fractures or faults at the edge of the salt dome. | Earthquakes | 2014 |
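The moment-tensor comparison mentioned above typically involves splitting the tensor into an isotropic (volume-change) part and a deviatoric (shear) part. The sketch below shows that decomposition on a made-up tensor; the matrix values are hypothetical and not data from the sinkhole study.

```python
# Illustrative decomposition of a moment tensor into isotropic and deviatoric
# parts (hypothetical tensor, not the study's data). A large isotropic share
# indicates volume change, as in explosion-like sources.
import numpy as np

M = np.array([[1.8, 0.2, 0.0],
              [0.2, 1.5, 0.1],
              [0.0, 0.1, 1.6]])            # hypothetical moment tensor (arbitrary units)

M_iso = np.trace(M) / 3.0 * np.eye(3)      # isotropic (volumetric) component
M_dev = M - M_iso                          # deviatoric (shear) remainder

iso_share = np.linalg.norm(M_iso) / (np.linalg.norm(M_iso) + np.linalg.norm(M_dev))
print(f"isotropic share ~ {iso_share:.0%}")  # high share -> explosion-like source
```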
June 27, 2014 | https://www.sciencedaily.com/releases/2014/06/140627094541.htm | Extinct undersea volcanoes squashed under Earth's crust cause tsunami earthquakes | New research has revealed the causes and warning signs of rare tsunami earthquakes, which may lead to improved detection measures. | Tsunami earthquakes happen at relatively shallow depths in the ocean and are small in terms of their magnitude. However, they create very large tsunamis, with some earthquakes that measure only 5.6 on the Richter scale generating waves that reach up to ten metres when they hit the shore. A global network of seismometers enables researchers to detect even the smallest earthquakes. However, the challenge has been to determine which small-magnitude events are likely to cause large tsunamis. In 1992, a magnitude 7.2 tsunami earthquake occurred off the coast of Nicaragua in Central America, causing the deaths of 170 people. Six hundred and thirty-seven people died and 164 people were reported missing following a tsunami earthquake off the coast of Java, Indonesia, in 2006, which measured 7.2 on the Richter scale. The new study was carried out by researchers from Imperial College London and GNS Science in New Zealand, who used geophysical data collected for oil and gas exploration and historical accounts from eyewitnesses relating to two tsunami earthquakes, which happened off the coast of New Zealand's North Island in 1947. Tsunami earthquakes were only identified by geologists around 35 years ago, so detailed studies of these events are rare. The team located two extinct volcanoes off the coast of Poverty Bay and Tolaga Bay that have been squashed and sunk beneath the crust off the coast of New Zealand, in a process called subduction. The researchers suggest that the volcanoes provided a "sticking point" between a part of Earth's crust called the Pacific plate, which was trying to slide underneath the New Zealand plate. This caused a build-up of energy, which was released in 1947, causing the plates to "unstick," the Pacific plate to move and the volcanoes to become subsumed under New Zealand. This release of the energy from both plates was unusually slow and close to the seabed, causing large movements of the sea floor, which led to the formation of very large tsunami waves. All these factors combined, say the researchers, contribute to tsunami earthquakes. The researchers say that the 1947 New Zealand tsunami earthquakes provide valuable insights into what geological factors cause these events. They believe the information they've gathered on these events could be used to locate similar zones around the world that could be at risk from tsunami earthquakes. Eyewitnesses from these tsunami earthquakes also describe the type of ground movement that occurred, and this provides valuable clues about possible early warning signals for communities. Dr Rebecca Bell, from the Department of Earth Science and Engineering at Imperial College London, says: "Tsunami earthquakes don't create massive tremors like more conventional earthquakes such as the one that hit Japan in 2011, so residents and authorities in the past haven't had the same warning signals to evacuate. These types of earthquakes were only identified a few decades ago, so little information has been collected on them. Thanks to oil exploration data and eyewitness accounts from two tsunami earthquakes that happened in New Zealand nearly 70 years ago, we are beginning to understand for the first time the factors that cause these events. This could ultimately save lives." By studying the data and reports, the researchers have built up a picture of what happened in New Zealand in 1947 when the tsunami earthquakes hit. In the March earthquake, eyewitnesses around Poverty Bay on the east coast of the country, close to the town of Gisborne, said that they didn't feel violent tremors, which are characteristic of typical earthquakes. Instead, they felt the ground rolling, which lasted for minutes and brought on a sense of sea sickness. Approximately 30 minutes later the bay was inundated by a ten-metre-high tsunami that was generated by a magnitude 5.9 offshore earthquake. In May, an earthquake measuring 5.6 on the Richter scale happened off the coast of Tolaga Bay, causing an approximately six-metre-high tsunami to hit the coast. No lives were lost in the New Zealand earthquakes as the areas were sparsely populated in 1947. However, more recent tsunami earthquakes elsewhere have devastated coastal communities. The researchers are already working with colleagues in New Zealand to develop a better warning system for residents. In particular, new signage is being installed along coastal regions to alert people to the early warning signs that indicate a possible tsunami earthquake. In the future, the team hope to conduct new cutting-edge geophysical surveys over the sites of other sinking volcanoes to better understand their characteristics and the role they play in generating this unusual type of earthquake. | Earthquakes | 2014 |
June 18, 2014 | https://www.sciencedaily.com/releases/2014/06/140618163918.htm | Studying magma formation beneath Mount St. Helens | University and government scientists are embarking on a collaborative research expedition to improve volcanic eruption forecasting by learning more about how a deep-underground feeder system creates and supplies magma to Mount St. Helens. | They hope the research will produce science that will lead to better understanding of eruptions, which in turn could lead to greater public safety. The Imaging Magma Under St. Helens project involves three distinct components: active-source seismic monitoring, passive-source seismic monitoring and magnetotelluric monitoring, using fluctuations in Earth's electromagnetic field to produce images of structures beneath the surface. Researchers are beginning passive-source and magnetotelluric monitoring, while active-source monitoring -- measuring seismic waves generated by underground detonations -- will be conducted later. Passive-source monitoring involves burying seismometers at 70 different sites throughout a 60-by-60-mile area centered on Mount St. Helens in southwestern Washington. The seismometers will record data from a variety of seismic events. "We will record local earthquakes, as well as distant earthquakes. Patterns in the earthquake signatures will reveal in greater detail the geological structures beneath St. Helens," said John Vidale, director of the University of Washington-based Pacific Northwest Seismic Network. Magnetotelluric monitoring will be done at 150 sites spread over an area running 125 miles north to south and 110 miles east to west, which includes both Mount Rainier and Mount Adams. Most of the sites will only be used for a day, with instruments recording electric and magnetic field signals that will produce images of subsurface structures. Besides the UW, collaborating institutions are Oregon State University, Lamont-Doherty Earth Observatory at Columbia University, Rice University, Columbia University, the U.S. Geological Survey and ETH-Zurich in Switzerland. The work is being funded by the National Science Foundation. Mount St. Helens has been the most active volcano in the Cascade Range during the last 2,000 years and has erupted twice in the last 35 years. It also is more accessible than most volcanoes for people and equipment, making it a prime target for scientists trying to better understand how volcanoes get their supply of magma. The magma that eventually comes to the surface probably originates 60 to 70 miles deep beneath St. Helens, at the interface between the Juan de Fuca and North American tectonic plates. The plates first come into contact off the Pacific Northwest coast, where the Juan de Fuca plate subducts beneath the North American plate and reaches great depth under the Cascades. As the magma works its way upward, it likely accumulates as a mass several miles beneath the surface. As the molten rock works its way toward the surface, it is possible that it gathers in a large chamber a few miles beneath the surface. The path from great depth to this chamber is almost completely unknown and is a main subject of the study. The project is expected to conclude in the summer of 2016. | Earthquakes | 2014 |
June 12, 2014 | https://www.sciencedaily.com/releases/2014/06/140612142309.htm | New evidence for 'oceans' of water deep in Earth: Water bound in mantle rock alters view of Earth's composition | Researchers from Northwestern University and the University of New Mexico report evidence for potentially oceans' worth of water deep beneath the United States. Though not in the familiar liquid form -- the ingredients for water are bound up in rock deep in the Earth's mantle -- the discovery may represent the planet's largest water reservoir. | The presence of liquid water on the surface is what makes our "blue planet" habitable, and scientists have long been trying to figure out just how much water may be cycling between Earth's surface and interior reservoirs through plate tectonics. Northwestern geophysicist Steve Jacobsen and University of New Mexico seismologist Brandon Schmandt have found deep pockets of magma located about 400 miles beneath North America, a likely signature of the presence of water at these depths. The discovery suggests water from the Earth's surface can be driven to such great depths by plate tectonics, eventually causing partial melting of the rocks found deep in the mantle. The findings were published June 13 in the journal Science. "Geological processes on the Earth's surface, such as earthquakes or erupting volcanoes, are an expression of what is going on inside the Earth, out of our sight," said Jacobsen, a co-author of the paper. "I think we are finally seeing evidence for a whole-Earth water cycle, which may help explain the vast amount of liquid water on the surface of our habitable planet. Scientists have been looking for this missing deep water for decades." Scientists have long speculated that water is trapped in a rocky layer of the Earth's mantle located between the lower mantle and upper mantle, at depths between 250 miles and 410 miles. Jacobsen and Schmandt are the first to provide direct evidence that there may be water in this area of the mantle, known as the "transition zone," on a regional scale. The region extends across most of the interior of the United States. Schmandt, an assistant professor of geophysics at the University of New Mexico, uses seismic waves from earthquakes to investigate the structure of the deep crust and mantle. Jacobsen, an associate professor of Earth and planetary sciences at Northwestern's Weinberg College of Arts and Sciences, uses observations in the laboratory to make predictions about geophysical processes occurring far beyond our direct observation. The study combined Jacobsen's lab experiments, in which he studies mantle rock under the simulated high pressures of 400 miles below the Earth's surface, with Schmandt's observations using vast amounts of seismic data from the USArray, a dense network of more than 2,000 seismometers across the United States. Jacobsen's and Schmandt's findings converged to produce evidence that melting may occur about 400 miles deep in the Earth, with H2O stored in mantle minerals the likely key to the process. "Melting of rock at this depth is remarkable because most melting in the mantle occurs much shallower, in the upper 50 miles," said Schmandt, a co-author of the paper. If there is a substantial amount of H2O in the transition zone, he noted, then some melting should take place where material flows into the lower mantle, which is consistent with what the team found. If just one percent of the weight of mantle rock located in the transition zone is H2O, that would be equivalent to roughly three times the amount of water in our oceans. This water is not in a form familiar to us -- it is not liquid, ice or vapor. This fourth form is water trapped inside the molecular structure of the minerals in the mantle rock. The weight of 250 miles of solid rock creates such high pressure, along with temperatures above 2,000 degrees Fahrenheit, that a water molecule splits to form a hydroxyl radical (OH), which can be bound into a mineral's crystal structure. Schmandt and Jacobsen's findings build on a discovery reported in March in the journal Nature in which scientists discovered a piece of the mineral ringwoodite inside a diamond brought up from a depth of 400 miles by a volcano in Brazil. That tiny piece of ringwoodite -- the only sample in existence from within the Earth -- contained a surprising amount of water bound in solid form in the mineral. "Whether or not this unique sample is representative of the Earth's interior composition is not known, however," Jacobsen said. "Now we have found evidence for extensive melting beneath North America at the same depths corresponding to the dehydration of ringwoodite, which is exactly what has been happening in my experiments." For years, Jacobsen has been synthesizing ringwoodite, colored sapphire-like blue, in his Northwestern lab by reacting the green mineral olivine with water at high-pressure conditions. (The Earth's upper mantle is rich in olivine.) He found that more than one percent of the weight of the ringwoodite's crystal structure can consist of water -- roughly the same amount of water as was found in the sample reported in the Nature paper. "The ringwoodite is like a sponge, soaking up water," Jacobsen said. "There is something very special about the crystal structure of ringwoodite that allows it to attract hydrogen and trap water. This mineral can contain a lot of water under conditions of the deep mantle." For the new study, Jacobsen used small gem diamonds as hard anvils to compress minerals to deep-Earth conditions. "Because the diamond windows are transparent, we can look into the high-pressure device and watch reactions occurring at conditions of the deep mantle," he said. "We used intense beams of X-rays, electrons and infrared light to study the chemical reactions taking place in the diamond cell." Jacobsen's findings produced the same evidence of partial melt, or magma, that Schmandt detected beneath North America using seismic waves. Because the deep mantle is beyond the direct observation of scientists, they use seismic waves -- sound waves at different speeds -- to image the interior of the Earth. "Seismic data from the USArray are giving us a clearer picture than ever before of the Earth's internal structure beneath North America," Schmandt said. "The melting we see appears to be driven by subduction -- the downwelling of mantle material from the surface." The melting the researchers have detected is called dehydration melting. Rocks in the transition zone can hold a lot of H2O, but rocks entering the lower mantle can hold far less; when water-rich rock moves from the transition zone into the lower mantle, it must shed that water somehow, and it does so by melting slightly. "Once the water is released, much of it may become trapped there in the transition zone," Jacobsen added. Just a little bit of melt, about one percent, is detectable with the new array of seismometers aimed at this region of the mantle because the melt slows the speed of seismic waves, Schmandt said. The USArray is part of EarthScope, a program of the National Science Foundation that deploys thousands of seismic, GPS and other geophysical instruments to study the structure and evolution of the North American continent and the processes that cause earthquakes and volcanic eruptions. The National Science Foundation (grants EAR-0748797 and EAR-1215720) and the David and Lucile Packard Foundation supported the research. | Earthquakes | 2014 |
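The "one percent of the transition zone" figure can be checked with simple shell geometry. The sketch below uses rough, assumed values for the transition-zone depths, rock density and total ocean mass; it is a back-of-the-envelope check, not a calculation from the paper.

```python
# Back-of-the-envelope check of the "one percent" figure, using rough assumed
# values (transition-zone shell geometry, mantle rock density, ocean mass).
import math

R_EARTH = 6.371e6                                  # m
r_top, r_bot = R_EARTH - 410e3, R_EARTH - 660e3    # transition-zone bounds, m
shell_volume = 4 / 3 * math.pi * (r_top**3 - r_bot**3)   # shell volume, m^3
RHO_ROCK = 3800.0                                  # kg/m^3, rough mantle density
water_mass = 0.01 * RHO_ROCK * shell_volume        # 1 wt% of the rock as H2O, kg

M_OCEANS = 1.4e21                                  # kg, approximate mass of the oceans
print(f"~{water_mass:.1e} kg of H2O, about {water_mass / M_OCEANS:.1f}x the oceans")
```

With these inputs the result lands near three ocean masses, which is what makes the transition zone a candidate for the planet's largest water reservoir.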
June 12, 2014 | https://www.sciencedaily.com/releases/2014/06/140612132351.htm | Personal resiliency paramount for future disasters | A Mayo Clinic researcher says individuals need to build disaster readiness and resiliency in order to better recover from the effects of earthquakes, tsunamis, hurricanes, tornadoes, wildfires and other natural disasters. Those who prepare well for disasters are more likely to have a sense of spiritual and emotional well-being and be satisfied with their life. Those findings appear in a peer-reviewed journal. | Health scientist and geologist Monica Gowan, Ph.D., says how well people are prepared for adversity through the presence of meaning and purpose in their lives can play a positive role in how well they manage the uncertainties of disaster risk and recover from devastating experiences to regain health and quality of life. "Even prior to the 2010-2014 New Zealand earthquakes and 2011 Japan tsunami, we recognized a need to explore how well-being was related to evacuation preparedness for future earthquake and tsunami disasters. Our findings are now relevant to any disaster experience, whether it's an earthquake or tsunami in the Pacific or a tornado in the American Midwest," says Dr. Gowan. She says if someone consciously cares about his or her well-being and that of others, and is aware and engaged enough to act on that basis, they have a stronger chance of being better off. She and her colleagues say this is the first scientific study of personal resilience and evacuation readiness prior to large disasters. "Along with the robust survey findings we obtained from our random sample of 695 adults, many people in the study shared anecdotes about why they were preparing for disaster," says Dr. Gowan. "Profoundly personal reasons were a common theme, whether due to their own vulnerabilities and desire to survive, to concern for a loved one, being part of a community, or wanting to serve some other greater good or higher purpose. A number had survived a prior disaster, with experiences ranging from the Holocaust to 9/11, and nearly every type of natural disaster. They all seemed to have found meaningful ways to transcend their unthinkable experiences." The researchers say the study has far-reaching implications. Growing populations and global travel make everyone vulnerable to disaster, and so the need for resilience is universal. | Earthquakes | 2014 |
June 3, 2014 | https://www.sciencedaily.com/releases/2014/06/140603103432.htm | Three years since Japan's disaster: Communities remain scattered and suffering | While western eyes are focused on the ongoing problems of the Fukushima Daiichi nuclear reactor site, thousands of people are still evacuated from their homes in north-eastern Japan following the earthquake, tsunami and nuclear emergency. Many are in temporary accommodation and frustrated by a lack of central government foresight and responsiveness to their concerns. | With the exception of the ongoing problems at the Fukushima Daiichi nuclear reactor, outside of the Tohoku region of Japan the aftereffects of the Great East Japan Earthquake of 2011, and the subsequent tsunami and nuclear disaster, are no longer front-page news. The hard work of recovery is the everyday reality in the region, and for planning schools and consultants across the country the rebuilding of Tohoku dominates practice and study. But while physical reconstruction takes place, progress is not smooth. Many victims of the disasters and members of the wider public feel that the government is more interested in feeding the construction industry than addressing the complex challenges of rebuilding sustainable communities. This is a region that was already suffering from the challenges of an aging population, the exodus of young people to Tokyo and the decline of traditional fisheries-based industries. In the worst cases people are facing the invidious choice of returning to areas that are still saturated with radioactive fallout or never going home. The frustration is reflected in four short pieces in Planning Theory and Practice's Interface section from architecture, design and planning practitioners working in four different parts of Tohoku. Christian Dimmer, Assistant Professor at Tokyo University and founder of TPF2 -- Tohoku Planning Forum, which links innovative redevelopment schemes in the region -- says: "The current Japanese government's obsession with big construction projects, like mega-seawalls that have already been shown to be not likely to be effective, is leading to really innovative community solutions being marginalized, the voices of communities being ignored, and sustainability cast aside." According to community planner and academic Kayo Murakami, who edits this Interface section: "The troubles of the Tohoku reconstruction are not just a concern for Japan. They highlight some of the fundamental challenges for disaster recovery and building sustainable communities, in which people are really involved, all over the world." | Earthquakes | 2014 |
May 23, 2014 | https://www.sciencedaily.com/releases/2014/05/140523094255.htm | Disaster planning: Risk assessment vital to development of mitigation plans | Wildfires and flooding affect many more people in the USA than earthquakes and landslides, and yet the dread, the perceived risk, of the latter two is much greater than for those hazards that are more frequent and cause greater loss of life. New research explores how risk perception can be built into hazard-mitigation planning. | Maura Knutson (née Hurley) and Ross Corotis of the University of Colorado, Boulder, explain that earlier efforts to incorporate a sociological perspective and human risk perception into hazard-mitigation plans commonly used equivalent dollar losses from natural hazard events as the statistic by which to make decisions. Unfortunately, this fails to take into consideration how people view natural hazards, the team reports. Moreover, this can lead to a lack of public support and compliance with emergency plans when disaster strikes, and lead to worse outcomes in all senses. The researchers have therefore developed a framework that combines the usual factors for risk assessment -- injuries, deaths and economic and collateral loss -- with the human perception of the risks associated with natural disasters. The framework includes risk perception by graphing natural hazards against "dread" and "familiarity." These two variables are well known to social psychologists as explaining the greatest variability in an individual's perception of risk, whether considering earthquakes, landslides, wildfires, storms, tornadoes, hurricanes, flooding, avalanches, even volcanic activity. "Understanding how the public perceives the risk for various natural hazards can assist decision makers in developing and communicating policy decisions," the team says. The higher the perceived risk of a natural disaster, the more people want to see that risk reduced, and that means seeing their tax dollars spent on mitigation and preparation. For example, far more money is spent on reducing earthquake risk than on reducing the risk from wildfires, perhaps because the perceived risk is much greater, even though both will cause significant losses of life and property. The team's new framework for risk assessment will act as an aid in decision making for these types of situations, as well as perhaps even offering a way to give members of the public a clearer understanding of actual risk rather than perceived risk. | Earthquakes | 2014 |
May 22, 2014 | https://www.sciencedaily.com/releases/2014/05/140522141451.htm | Deep Earth recycling of the oceanic floor: New insight into the temperature of deep Earth | Scientists from the Magma and Volcanoes Laboratory (CNRS/IRD/Université Blaise Pascal) and the European Synchrotron, the ESRF, have recreated the extreme conditions 600 to 2900 km below Earth's surface to investigate the melting of basalt in the oceanic tectonic plates. They exposed microscopic pieces of rock to these extreme pressures and temperatures while simultaneously studying their structure with the ESRF's extremely powerful X-ray beam. The results show that basalt produced on the ocean floor has a melting temperature lower than the peridotite which forms Earth's mantle. Near the core-mantle boundary, where the temperature rises rapidly, the melting basalt produces liquids rich in silica (SiO2). | Earth is an active planet. The heat it contains is capable of inducing the mantle convection responsible for plate tectonics. This energy comes from the heat accumulated during the formation of our planet, the latent heat of crystallization of the inner core, and radioactive decay. The temperatures inside Earth, however, are not well known. Convection causes hot material to rise to the surface of Earth and cold material to sink towards the core. Thus, when the ascending mantle begins to melt at the base of the oceanic ridges, the basalt flows along the surface to form what we call the oceanic crust. "Over the course of millennia the crust will then undergo subduction, its greater density causing it to sink into the mantle. This is why Earth's continents are known to be several billion years old, while the oldest oceanic crust only dates back 165 million years," said Mohamed Mezouar, scientist at the ESRF. The temperature at the core-mantle boundary (also known as the D" region) is thought to increase by more than 1000 degrees over a few hundred kilometers, which is significant compared to the temperature gradient across the rest of the mantle. Previous authors have suggested that this temperature rise could cause the partial melting of the mantle, but this hypothesis leaves a number of geophysical observations unexplained. Firstly, the anomalies in the propagation speed of seismic waves do not match those expected for a partial melting of the mantle, and secondly, the melting mantle should lead to the production of liquid pockets in the lowermost mantle, a phenomenon which has never been observed. The team led by Professor Denis Andrault from the Université Blaise Pascal decided instead to study the melting point of basalt at great depths, and found that it was significantly lower than that of the mantle. The melting of sub-oceanic basalt piles could therefore be responsible for the previously unexplained seismic anomalies. The researchers also showed that the melting basalt generates a liquid rich in SiO2. As the mantle itself contains large quantities of MgO, the interaction of these liquids with the mantle is expected to produce a rapid reaction leading to the formation of solid MgSiO3. If it is indeed the basalt and not the mantle whose melting in the D" region is responsible for the observed seismic anomalies, then the temperature at the core-mantle boundary must be between 3800 and 4150 Kelvin, between the melting points of basalt and Earth's mantle. If this hypothesis is correct, this would be the most accurate determination of the temperature at the core-mantle boundary available today. "It could solve a long-standing controversy about the peculiar role of the core-mantle boundary in the dynamical properties of the Earth's mantle," said Professor Denis Andrault. "We know now that the cycle of crust formation at the mid-ocean ridges and crust dissolution in the lowermost mantle may have occurred for as long as plate tectonics have been active on our planet," he added. | Earthquakes | 2014 |
May 19, 2014 | https://www.sciencedaily.com/releases/2014/05/140519184532.htm | Earthquakes: The next 'Big One' for the San Francisco Bay Area may be a cluster of major quakes | A cluster of closely timed earthquakes over 100 years in the 17th and 18th centuries released as much accumulated stress on the San Francisco Bay Area's major faults as the Great 1906 San Francisco earthquake, suggesting two possible scenarios for the next "Big One" for the region, according to newly published research. | "The plates are moving," said David Schwartz, a geologist with the U.S. Geological Survey and co-author of the study. "The stress is re-accumulating, and all of these faults have to catch up. How are they going to catch up?" The San Francisco Bay Region (SFBR) is considered within the boundary between the Pacific and North American plates. Energy released during its earthquake cycle occurs along the region's principal faults: the San Andreas, San Gregorio, Calaveras, Hayward-Rodgers Creek, Greenville, and Concord-Green Valley faults. "The 1906 quake happened when there were fewer people, and the area was much less developed," said Schwartz. "The earthquake had the beneficial effect of releasing the plate boundary stress and relaxing the crust, ushering in a period of low-level earthquake activity." The earthquake cycle reflects the accumulation of stress, its release as slip on a fault or a set of faults, and its re-accumulation and re-release. The San Francisco Bay Area has not experienced a full earthquake cycle since it has been occupied by people who have reported earthquake activity, either through written records or instrumentation. Founded in 1776, the Mission Dolores and the Presidio in San Francisco kept records of felt earthquakes and earthquake damage, marking the starting point for the historic earthquake record for the region. "We are looking back at the past to get a more reasonable view of what's going to happen decades down the road," said Schwartz. "The only way to get a long history is to do these paleoseismic studies, which can help construct the rupture histories of the faults and the region. We are trying to see what went on and understand the uncertainties for the Bay Area." Schwartz and colleagues excavated trenches across faults, observing past surface ruptures from the most recent earthquakes on the major faults in the area. Radiocarbon dating of detrital charcoal and the presence of non-native pollen established the dates of paleoearthquakes, expanding the span of information on large events back to 1600. The trenching studies suggest that between 1690 and the founding of the Mission Dolores and Presidio in 1776, a cluster of earthquakes ranging from magnitude 6.6 to 7.8 occurred on the Hayward fault (north and south segments), San Andreas fault (North Coast and San Juan Bautista segments), northern Calaveras fault, Rodgers Creek fault, and San Gregorio fault. There are no paleoearthquake data for the Greenville fault or the northern extension of the Concord-Green Valley fault during this time interval. "What the cluster of earthquakes did in our calculations was to release an amount of energy somewhat comparable to the amount released in the crust by the 1906 quake," said Schwartz. As stress on the region accumulates, the authors see at least two modes of energy release: one is a great earthquake, the other a cluster of large earthquakes. The probability for how the system will rupture is spread out over all faults in the region, making a cluster of large earthquakes more likely than a single great earthquake. "Everybody is still thinking about a repeat of the 1906 quake," said Schwartz. "It's one thing to have a 1906-like earthquake where seismic activity is shut off, and we slide through the next 110 years in relative quiet. But what happens if every five years we get a magnitude 6.8 or 7.2? That's not outside the realm of possibility." | Earthquakes | 2014 |
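Schwartz's "somewhat comparable energy" remark can be illustrated with the standard magnitude-to-moment conversion. The magnitudes below are hypothetical placeholders spanning the reported 6.6-7.8 range, not the study's actual event inventory.

```python
# Illustration with hypothetical magnitudes (not the paper's event list):
# summed seismic moment of a century-long cluster vs. one 1906-size earthquake.
def moment(mw: float) -> float:
    """Seismic moment in N*m from moment magnitude (standard relation)."""
    return 10 ** (1.5 * mw + 9.1)

cluster = [6.6, 7.0, 7.1, 7.2, 7.4, 7.8]     # assumed magnitudes in the reported range
ratio = sum(moment(m) for m in cluster) / moment(7.8)
print(f"cluster releases {ratio:.1f}x the moment of a single M7.8 event")
# Note the logarithmic scaling: the single largest shock dominates the sum,
# which is why it takes a whole cluster to rival one great earthquake.
```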
May 14, 2014 | https://www.sciencedaily.com/releases/2014/05/140514133440.htm | California mountains rise as groundwater depleted in state's Central Valley: May trigger small earthquakes | Winter rains and summer groundwater pumping in California's Central Valley make the Sierra Nevada and Coast Ranges sink and rise by a few millimeters each year, creating stress on the state's earthquake faults that could increase the risk of a quake. | Gradual depletion of the Central Valley aquifer because of groundwater pumping also raises these mountain ranges by a similar amount each year -- about the thickness of a dime -- with a cumulative rise over the past 150 years of up to 15 centimeters (6 inches), according to calculations by a team of geophysicists.While the seasonal changes in the Central Valley aquifer have not yet been firmly associated with any earthquakes, studies have shown that similar levels of periodic stress, such as that caused by the motions of the moon and sun, increase the number of microquakes on the San Andreas Fault, which runs parallel to the mountain ranges. If these subtle seasonal load changes are capable of influencing the occurrence of microquakes, it is possible that they can sometimes also trigger a larger event, said Roland Bürgmann, UC Berkeley professor of earth and planetary science at UC Berkeley."The stress is very small, much less than you need to build up stress on a fault toward an earthquake, but in some circumstances such small stress changes can be the straw that broke the camel's back; it could just give that extra push to get a fault to fail," Bürgmann said.Bürgmann is a coauthor of a report published online this week by the journal Water has been pumped from California's Central Valley for more than 150 years, reducing what used to be a marsh and extensive lake, Tulare Lake, into fertile agricultural fields that feed the world. In that time, approximately 160 cubic kilometers (40 cubic miles) of water was removed -- the capacity of Lake Tahoe -- dropping the water table in some areas more than 120 meters (400 feet) and the ground surface 5 meters (16 feet) or more.The weight of water removed allowed the underlying crust or lithosphere to rise by so-called isostatic rebound, which has raised the Sierra probably as much as half a foot since about 1860, Bürgmann said.The same rebound happens as a result of the state's seasonal rains. Torrential winter storms drop water and snow across the state, which eventually flow into Central Valley streams, reservoirs and underground aquifer, pushing down the crust and lowering the Sierra 1-3 millimeters. In the summer, water flow through the delta into the Pacific Ocean, evaporation and ground water pumping for irrigation, which has accelerated in the past few years because of a drought, allows the crust and surrounding mountains to rise again.Bürgmann said that the flexing of Earth's crust downward in winter would clamp the San Andreas Fault tighter, lowering the risk of quakes, while in summer the upward flexure would relieve this clamping and perhaps increase the risk."The hazard is ever so slightly higher in the summer than in the wintertime," he said. 
"This suggests that climate and tectonics interact; that water changes ultimately affect the deeper Earth too."Millimeter-precision measurements of elevation have been possible only in the last few years, with improved continuous GPS networks -- part of the National Science Foundation-funded Plate Boundary Observatory, which operates 1,100 stations around the western U.S. -- and satellite-based interferometric synthetic aperture radar (InSAR). Synthetic aperture radar is a form of radar in which phase information is used to map elevation.These measurements revealed a steady yearly rise of the Sierra of 1-2 millimeters per year, which was initially ascribed to tectonic activity deep underground, even though the rate was unusually high, Bürgmann said. The new study provides an alternative and more reasonable explanation for the rise of the Sierra in historic times."The Coast Range is doing the same thing as the Sierra Nevada, which is part of the evidence that this can't be explained by tectonics," he said. "Both ranges have uplifted over the last few years and they both exhibit the same seasonal up and down movement in phase. This tells us that something has to be driving the system at a seasonal and long-term sense, and that has to be groundwater recharging and depletion."In response to the current drought, about 30 cubic kilometers (7.5 cubic miles) of water were removed from Central Valley aquifers between 2003 and 2010, causing a rise of about 10 millimeters (2/5 inch) in the Sierra over that time.After the new results were shared with colleagues, Bürgmann said, some geologists suggested that the state could get a better or at least comparable inventory of available water each year by using GPS to measure ground deformation instead of measuring snowpack and reservoir levels.Other coauthors are Colin B. Amos of Western Washington University in Bellingham, Ingrid A. Johanson of UC Berkeley. Funding for the research came from NSF EarthScope and UC Berkeley's Miller Institute. | Earthquakes | 2,014 |
May 7, 2014 | https://www.sciencedaily.com/releases/2014/05/140507114805.htm | Yellowstone geyser eruptions influenced more by internal processes | The intervals between geyser eruptions depend on a delicate balance of underground factors, such as heat and water supply, and interactions with surrounding geysers. Some geysers are highly predictable, with intervals between eruptions (IBEs) varying only slightly. | The predictability of these geysers offers earth scientists a unique opportunity to investigate what may influence their eruptive activity, and to apply that information to rare and unpredictable types of eruptions, such as those from volcanoes. Dr. Shaul Hurwitz took advantage of a decade of eruption records -- spanning 2001 to 2011 -- for two of Yellowstone's most predictable geysers: the cone geyser Old Faithful and the pool geyser Daisy. Dr. Hurwitz's team focused their statistical analysis on possible correlations between the geysers' IBEs and external forces such as weather, earth tides and earthquakes. The authors found no link between weather and Old Faithful's IBEs, but they did find that Daisy's IBEs correlated with cold temperatures and high winds. In addition, Daisy's IBEs were significantly shortened following the 7.9 magnitude earthquake that hit Alaska in 2002. The authors note that atmospheric processes exert a relatively small but statistically significant influence on pool geysers' IBEs by modulating heat transfer rates from the pool to the atmosphere. Overall, internal processes and interactions with surrounding geysers dominate IBEs' variability, especially in cone geysers. | Earthquakes | 2,014
May 4, 2014 | https://www.sciencedaily.com/releases/2014/05/140504133203.htm | New insight may help predict volcanic eruption behavior | A new discovery in the study of how lava dome volcanoes erupt may help in the development of methods to predict how a volcanic eruption will behave, say scientists at the University of Liverpool. | Volcanologists at the University have discovered that a process called frictional melting plays a role in determining how a volcano will erupt, by dictating how fast magma can ascend to the surface and how much resistance it faces en route. The process occurs in lava dome volcanoes when magma and rocks melt as they rub against each other due to intense heat. This creates a stop-start movement in the magma as it makes its way towards Earth's surface. The magma sticks to the rock and stops moving until enough pressure builds up, prompting it to shift forward again (a process called stick-slip). Volcanologist Dr Jackie Kendrick, who led the research, said: "Seismologists have long known that frictional melting takes place when large tectonic earthquakes occur. It is also thought that the stick-slip process that frictional melting generates is concurrent with 'seismic drumbeats', which are the regular, rhythmic small earthquakes recently found to accompany large volcanic eruptions." "Using friction experiments we have shown that the extent of frictional melting depends on the composition of the rock and magma, which determines how fast or slow the magma travels to the surface during the eruption." Analysis of lava collected from Mount St. Helens, USA, and the Soufrière Hills volcano in Montserrat by volcanology researchers from the University's School of Environmental Sciences revealed remnants of pseudotachylyte, a cooled frictional melt. Evidence showed that the process took place in the conduit, the channel which lava passes through on its way to erupt. Dr Kendrick, from the University's School of Environmental Sciences, added: "The closer we get to understanding the way magma behaves, the closer we will get to the ultimate goal: predicting volcanic activity when unrest begins. Whilst we can reasonably predict when a volcanic eruption is about to happen, this new knowledge will help us to predict how the eruption will behave." "With a rapidly growing population inhabiting the flanks of active volcanoes, understanding the behaviour of lava domes becomes an increasing challenge for volcanologists." | Earthquakes | 2,014
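The stop-start ascent described above is often idealized as a spring-slider system with static and dynamic friction. The sketch below is a generic version of that idealization, not the Liverpool group's model; every parameter value is an arbitrary illustration.

```python
# Generic 1-D spring-slider stick-slip model (illustrative parameters,
# not the Liverpool experiments): a block loaded through a spring by a
# steadily moving driver sticks until the spring force exceeds static
# friction, then slips under lower dynamic friction and re-sticks.
k, v_drive, dt = 50.0, 1.0, 1e-3       # spring stiffness, load rate, time step
f_static, f_dynamic, mass = 10.0, 6.0, 1.0

x_block = v = 0.0                      # block position and velocity
x_driver = 0.0
events = []

for step in range(200_000):
    x_driver += v_drive * dt
    force = k * (x_driver - x_block)
    if v == 0.0 and force <= f_static:
        continue                       # stuck: spring loads up
    a = (force - f_dynamic) / mass     # slipping: dynamic friction resists
    v += a * dt
    x_block += v * dt
    if v <= 0.0:                       # slip arrests; block re-sticks
        events.append((step * dt, x_block))
        v = 0.0

print(f"{len(events)} slip events; their regular recurrence mimics 'seismic drumbeats'")
```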
May 1, 2014 | https://www.sciencedaily.com/releases/2014/05/140501132628.htm | Wastewater disposal may trigger quakes at greater distance than previously thought | Oil and gas development activities, including underground disposal of wastewater and hydraulic fracturing, may induce earthquakes by changing the state of stress on existing faults to the point of failure. Earthquakes from wastewater disposal may be triggered at tens of kilometers from the wellbore, which is a greater range than previously thought, according to research to be presented today at the annual meeting of the Seismological Society of America (SSA). As an indication of the growing significance of man-made earthquakes on seismic hazard, the SSA annual meeting will feature a special session to discuss new research findings and approaches to incorporating induced seismicity into seismic hazard assessments and maps. | The number of earthquakes within the central and eastern United States has increased dramatically over the past few years, coinciding with increased hydraulic fracturing of horizontally drilled wells and the injection of wastewater in deep disposal wells in many locations, including Colorado, Oklahoma, Texas, Arkansas and Ohio. According to the U.S. Geological Survey (USGS), an average rate of 100 earthquakes per year above a magnitude 3.0 occurred in the three years from 2010-2012, compared with an average rate of 21 events per year observed from 1967-2000. "Induced seismicity complicates the seismic hazard equation," said Gail Atkinson, professor of earth sciences at Western University in Ontario, Canada, whose research details how a new source of seismicity, such as an injection disposal well, can fundamentally alter the potential seismic hazard in an area. For critical structures, such as dams, nuclear power plants and other major facilities, Atkinson suggests that the hazard from induced seismicity can overwhelm the hazard from pre-existing natural seismicity, increasing the risk to structures that were originally designed for regions of low to moderate seismic activity. A new study of the Jones earthquake swarm, occurring near Oklahoma City since 2008, demonstrates that a small cluster of high-volume injection wells triggered earthquakes tens of kilometers away. Both increasing pore pressure and the number of earthquakes were observed migrating away from the injection wells. "The existing criteria for an induced earthquake do not allow earthquakes associated with the well activity to occur this far away from the wellbore," said Katie Keranen, assistant professor of geophysics at Cornell University, who led the study of the Jones earthquake swarm. "Our results, using seismology and hydrogeology, show a strong link between a small number of wells and earthquakes migrating up to 50 kilometers away," said Keranen.
The study's results will be presented by co-author Geoff Abers, senior research scientist at Lamont-Doherty Earth Observatory. While there are relatively few wells linked to increased seismicity, seismologists seek to anticipate when activity might trigger earthquakes and at what magnitude. "It is important to avoid inducing earthquakes large enough to be felt, that is, earthquakes with magnitudes of about 2.5 or greater, because these are the ones that are of concern to the public," said Art McGarr, a geophysicist with the USGS. McGarr's research investigates the factors that enhance the likelihood of earthquakes induced by fluid injection that are large enough to be felt or, on rare occasions, capable of causing damage. The injection activities considered in McGarr's study include underground disposal of wastewater, development of enhanced geothermal systems and hydraulic fracturing. Of the three activities, wastewater disposal predominates both in terms of volumes of injected liquid and earthquake size, with magnitudes for a few of the earthquakes exceeding 5. "From the results of this study, the total volume of injected fluid seems to be the factor that limits the magnitude, whereas the injection rate controls the frequency of earthquake occurrence," said McGarr. Despite the increasing seismicity in the central and eastern US, induced earthquakes are presently excluded from USGS estimates of earthquake hazard. Justin Rubinstein, a geophysicist with the USGS, will present an approach to account for the increased seismicity without first having to determine the source (induced or natural) of the earthquakes. The USGS is trying to "stay agnostic as to whether the earthquakes are induced or natural," says Rubinstein. "In some sense, from a hazard perspective, it doesn't matter whether the earthquakes are natural or induced. An increase in earthquake rate implies that the probability of a larger earthquake has also risen," said Rubinstein, whose method seeks to balance all of the possible ways the hazard might change given the changing earthquake rate. But what's the likelihood of induced seismicity from any specific well? "We can't answer the question at this time," said Atkinson, who added that the complex problem of assigning seismic hazard to the effects of induced seismicity is just beginning to be addressed. "There is a real dearth of regulations," said Atkinson. "We need a clear understanding of the likely induced seismicity in response to new activity. And who is the onus on to identify the likely seismic hazard?" | Earthquakes | 2,014
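McGarr's remark that total injected volume limits magnitude matches a scaling he has formalized elsewhere, in which the maximum seismic moment is bounded by the shear modulus of the rock times the net injected volume (M0 <= G * dV). The article itself quotes no formula, so the sketch below is a back-of-envelope illustration under that assumption; the shear modulus is a conventional value.

```python
import math

def max_magnitude(injected_volume_m3, shear_modulus_pa=3.0e10):
    """Upper-bound moment magnitude for fluid injection, assuming the
    scaling M0 <= G * dV (an assumption; not stated in the article)."""
    m0 = shear_modulus_pa * injected_volume_m3     # seismic moment, N·m
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)    # moment magnitude Mw

for volume in (1e4, 1e5, 1e6, 1e7):                # cubic meters injected
    print(f"dV = {volume:8.0e} m^3  ->  Mw_max ≈ {max_magnitude(volume):.1f}")
```

Under these assumptions, a million cubic meters of injected wastewater bounds the induced event near magnitude 5, broadly consistent with the few magnitude-5-plus cases the article mentions.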
May 1, 2014 | https://www.sciencedaily.com/releases/2014/05/140501101127.htm | Australian tsunami database reveals threat to continent | Australia's coastline has been struck by up to 145 possible tsunamis since prehistoric times, causing deaths previously unreported in the scientific literature, a UNSW study has revealed. | The largest recorded inundation event in Australia was caused by an earthquake off Java in Indonesia on 17 July 2006, which led to a tsunami that reached up to 7.9 metres above sea level on land at Steep Point in Western Australia. The continent was also the site of the oldest known tsunami in the world -- an asteroid impact that occurred 3.47 billion years ago in what is now the Pilbara district of Western Australia. Details of the 145 modern day and prehistoric events are outlined in a revised Australian tsunami database, which has been extensively updated by UNSW researchers, Professor James Goff and Dr Catherine Chagué-Goff. "Our research has led to an almost three-fold increase in the number of events identified -- up from 55 in 2007. NSW has the highest number of tsunamis in the database, with 57, followed by Tasmania with 40, Queensland with 26 and Western Australia with 23," says Professor Goff, of the UNSW School of Biological, Earth and Environmental Sciences. "Historical documents indicate that up to 11 possible tsunami-related deaths have occurred in Australia since 1883. This is remarkable, because our tsunami-prone neighbour, New Zealand, has only one recorded death." Professor Goff and Dr Chagué-Goff, who also works at the Australian Nuclear Science and Technology Organisation, scoured scientific papers, newspaper reports, historical records and other tsunami databases to update the 2007 Australian database. "And it is still incomplete. Much more work needs to be done, especially to identify prehistoric events and those on the east coast. Our goal is to better understand the tsunami hazard to Australia and the region. The geographical spread of events and deaths suggests the east coast faces the most significant risk," says Professor Goff. The results are published in a peer-reviewed journal. The country's largest tsunami had been listed in 2007 as one that hit Western Australia following an earthquake off Sumba Island in Indonesia on 19 August 1977, but this rating was based on incorrect information about its wave height. Giant wave heights of about 13 metres -- bigger than those of the current record-holding event in 2006 -- have also been attributed to a possible tsunami on 8 April 1911 in Warrnambool in Victoria, but no hard evidence is available as yet to back this up. The study identified three prehistoric events that had an impact across the whole of the South West Pacific Ocean: an asteroid impact 2.5 million years ago and large earthquakes about 2900 years ago and in the mid-15th Century. | Earthquakes | 2,014
May 1, 2014 | https://www.sciencedaily.com/releases/2014/05/140501075937.htm | Network for tracking earthquakes exposes glacier activity: Accidental find offers big potential for research on Alaska's glaciers | Alaska's seismic network records thousands of quakes produced by glaciers, capturing valuable data that scientists could use to better understand their behavior, but instead their seismic signals are set aside as oddities. The current earthquake monitoring system could be "tweaked" to target the dynamic movement of the state's glaciers, suggests State Seismologist Michael West, who will present his research today at the annual meeting of the Seismological Society of America (SSA). | "In Alaska, these glacial events have been largely treated as a curiosity, a by-product of earthquake monitoring," said West, director of the Alaska Earthquake Center, which is responsible for detecting and reporting seismic activity across Alaska. The Alaska seismic network was upgraded in 2007-08, improving its ability to record and track glacial events. "As we look across Alaska's glacial landscape and comb through the seismic record, there are thousands of these glacial events. We see patterns in the recorded data that raise some interesting questions about the glaciers," said West. As a glacier loses large pieces of ice on its leading edge, a process called calving, the Alaska Earthquake Center's monitoring system automatically records the event as an earthquake. Analysts filter out these signals in order to have a clear record of earthquake activity for the region. In the discarded data, West sees opportunity. "We have amassed a large record of glacial events by accident," said West. "The seismic network can act as an objective tool for monitoring glaciers, operating 24/7 and creating a data flow that can alert us to dynamic changes in the glaciers as they are happening." It's when a glacier is perturbed or changing in some way, says West, that the scientific community can learn the most. Since 2007, the Alaska Earthquake Center has recorded more than 2800 glacial events along 600 km of Alaska's coastal mountains. The equivalent earthquake sizes for these events range from about 1 to 3 on the local magnitude scale. While calving accounts for a significant number of the recorded quakes, each glacier's terminus -- the end of any glacier where the ice meets the ocean -- behaves differently. Seasonal variations in weather cause glaciers to move faster or slower, creating an expected seasonal cycle in seismic activity. But West and his colleagues have found surprises, too. In mid-August 2010, the Columbia Glacier's seismic activity changed radically from being relatively quiet to noisy, producing some 400 quakes to date. These types of signals from the Columbia Glacier have been documented every single month since August 2010, about the time when the Columbia terminus became grounded on a sill, stalling its multi-year retreat. That experience highlighted for West the value of the accidental data trove collected by the Alaska Earthquake Center. "The seismic network is blind to the cause of the seismic events, cataloguing observations that can then be validated," said West, who suggests the data may add value to ongoing field studies in Alaska. Many studies of Alaska's glaciers have focused on single-glacier analyses with dedicated field campaigns over short periods of time and have not tracked the entire glacier complex over the course of years.
West suggests leveraging the data stream may help the scientific community observe the entire glacier complex in action or highlight in real time where scientists could look to catch changes in a glacier. "This is low-hanging fruit," said West of the scientific advances waiting to be gleaned from the data. | Earthquakes | 2,014
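Automatic event cataloguing of the kind the Alaska Earthquake Center relies on is conventionally done with a short-term-average over long-term-average (STA/LTA) trigger. The sketch below is a generic numpy implementation run on synthetic data; it is not the center's actual detection pipeline, and all window lengths and thresholds are illustrative.

```python
import numpy as np

def sta_lta(trace, n_sta, n_lta):
    """Classic STA/LTA trigger: short-term average signal energy divided
    by long-term average energy, for windows ending at each sample
    (valid once n_lta samples have accumulated)."""
    energy = np.asarray(trace, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    sta = (csum[n_lta:] - csum[n_lta - n_sta:-n_sta]) / n_sta
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta
    return sta / np.maximum(lta, 1e-12)

# Synthetic 100 Hz trace: one minute of noise with one impulsive "event".
rng = np.random.default_rng(1)
trace = rng.normal(0.0, 1.0, 6_000)
trace[3_000:3_300] += rng.normal(0.0, 8.0, 300)

ratio = sta_lta(trace, n_sta=100, n_lta=2_000)   # 1 s short / 20 s long windows
print("event detected" if ratio.max() > 4.0 else "no trigger",
      f"(peak STA/LTA = {ratio.max():.1f})")
```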
April 30, 2014 | https://www.sciencedaily.com/releases/2014/04/140430083139.htm | Magnitude of maximum earthquake scales with maturity of fault | The oldest sections of transform faults, such as the North Anatolian Fault Zone (NAFZ) and the San Andreas Fault, produce the largest earthquakes, putting important limits on the potential seismic hazard for less mature parts of fault zones, according to a new study to be presented today at the Seismological Society of America (SSA) 2014 Annual Meeting in Anchorage, Alaska. The finding suggests that maximum earthquake magnitude scales with the maturity of the fault. | Identifying the likely maximum magnitude for the NAFZ is critical for seismic hazard assessments, particularly given its proximity to Istanbul. "It has been argued for decades that fault systems evolving over geological time may unify smaller fault segments, forming mature rupture zones with a potential for larger earthquakes," said Marco Bohnhoff, professor of geophysics at the German Research Center for Geosciences in Potsdam, Germany, who sought to clarify the seismic hazard potential from the NAFZ. "With the outcome of this study it would in principle be possible to improve the seismic hazard estimates for any transform fault near a population center, once its maturity can be quantified," said Bohnhoff. Bohnhoff and colleagues investigated the maximum magnitude of historic earthquakes along the NAFZ, which poses significant seismic hazard to northwest Turkey and, specifically, Istanbul. Relying on the region's extensive literary sources, which date back more than 2000 years, Bohnhoff and colleagues used catalogues of historical earthquakes in the region, analyzing earthquake magnitude in relation to fault-zone age and cumulative offset across the fault, including recent findings on fault-zone segmentation along the NAFZ. "What we know of the fault zone is that it originated approximately 12 million years ago in the east and migrated to the west," said Bohnhoff. "In the eastern portion of the fault zone, individual fault segments are longer and the offsets are larger." The largest earthquakes, of approximately M 8.0, are exclusively observed along the older eastern section of the fault zone, says Bohnhoff. The younger western sections, in contrast, have historically produced earthquakes of magnitude no larger than 7.4. "While a 7.4 earthquake is significant, this study puts a limit on the current seismic hazard to northwest Turkey and its largest regional population and economic center, Istanbul," said Bohnhoff. Bohnhoff compared the study of the NAFZ to the San Andreas and the Dead Sea Transform fault systems. While the former is well studied instrumentally but has few historical records, the latter has an extensive record of historical earthquakes but few available modern fault-zone investigations. Both of these major transform fault systems support the findings for the NAFZ, which were derived from a unique combination of long historical earthquake records and in-depth fault-zone studies. Bohnhoff will present his study, "Fault-Zone Maturity Defines Maximum Earthquake Magnitude," today at the SSA Annual Meeting. SSA is an international scientific society devoted to the advancement of seismology and the understanding of earthquakes for the benefit of society. Its 2014 Annual Meeting will be held in Anchorage, Alaska, on April 30 -- May 2, 2014. | Earthquakes | 2,014
April 28, 2014 | https://www.sciencedaily.com/releases/2014/04/140428155820.htm | The thin-crusted U.S. Sierra Nevada Mountains: Where did the Earth go? | In an addition to Geosphere's ongoing themed issue series, "Geodynamics and Consequences of Lithospheric Removal in the Sierra Nevada, California," Craig H. Jones of the University of Colorado Boulder and colleagues present a seismological study of the entire extent of the U.S. Sierra Nevada range, using seismograms collected in the Sierra Nevada EarthScope field experiment from 2005 to 2007. | The southern Sierra Nevada is known to have unusually thin crust for mountains with such high elevations (peaks higher than 4 km/14,000 ft, and average elevations near 3 km/10,000 ft). Jones and his team use measurements of the arrival times of seismic waves (called P-waves) from earthquakes around the globe to image Earth under the Sierra Nevada and neighboring locations. Their results reveal that the entire eastern Sierra overlies low-velocity upper mantle and lacks the dense, quartz-poor lower crust that they say must have existed 80 million years ago when the granites of the range were created. Jones and colleagues write that this missing dense material probably was removed within the past 10 million years. "Previous workers," they note, "have suggested it might be within a high-velocity mantle anomaly under the southeastern San Joaquin Valley," which is "the right size to be the old, dense rock previously under the eastern Sierra." They argue, however, that the geometry and extent of earth within the anomaly do not appear to be consistent with it being a piece of old subducted ocean floor. This would mean that a long strip of dense rock under the Sierra somehow deformed into a steeply plunging ellipsoid at the southwestern edge of the range. This conclusion suggests that the range rose within the past 10 million years as this dense material fell away to the west and south. Finally, Jones and colleagues note that something similar might be underway at the northern edge of the range. | Earthquakes | 2,014
April 16, 2014 | https://www.sciencedaily.com/releases/2014/04/140416112956.htm | Floating nuclear plants could ride out tsunamis: New design for enhanced safety, easier siting and centralized construction | When an earthquake and tsunami struck the Fukushima Daiichi nuclear plant complex in 2011, neither the quake nor the inundation caused the ensuing contamination. Rather, it was the aftereffects -- specifically, the lack of cooling for the reactor cores, due to a shutdown of all power at the station -- that caused most of the harm. | A new design for nuclear plants built on floating platforms, modeled after those used for offshore oil drilling, could help avoid such consequences in the future. Such floating plants would be designed to be automatically cooled by the surrounding seawater in a worst-case scenario, which would indefinitely prevent any melting of fuel rods or escape of radioactive material. The concept is being presented this week at the Small Modular Reactors Symposium, hosted by the American Society of Mechanical Engineers, by MIT professors Jacopo Buongiorno, Michael Golay, and Neil Todreas, along with others from MIT, the University of Wisconsin, and Chicago Bridge and Iron, a major nuclear plant and offshore platform construction company. Such plants, Buongiorno explains, could be built in a shipyard, then towed to their destinations five to seven miles offshore, where they would be moored to the seafloor and connected to land by an underwater electric transmission line. The concept takes advantage of two mature technologies: light-water nuclear reactors and offshore oil and gas drilling platforms. Using established designs minimizes technological risks, says Buongiorno, an associate professor of nuclear science and engineering (NSE) at MIT. Although the concept of a floating nuclear plant is not unique -- Russia is in the process of building one now, on a barge moored at the shore -- none have been located far enough offshore to be able to ride out a tsunami, Buongiorno says. For this new design, he says, "the biggest selling point is the enhanced safety." A floating platform several miles offshore, moored in about 100 meters of water, would be unaffected by the motions of a tsunami; earthquakes would have no direct effect at all. Meanwhile, the biggest issue that faces most nuclear plants under emergency conditions -- overheating and potential meltdown, as happened at Fukushima, Chernobyl, and Three Mile Island -- would be virtually impossible at sea, Buongiorno says: "It's very close to the ocean, which is essentially an infinite heat sink, so it's possible to do cooling passively, with no intervention. The reactor containment itself is essentially underwater." Buongiorno lists several other advantages. For one thing, it is increasingly difficult and expensive to find suitable sites for new nuclear plants: They usually need to be next to an ocean, lake, or river to provide cooling water, but shorefront properties are highly desirable. By contrast, sites offshore, but out of sight of land, could be located adjacent to the population centers they would serve. "The ocean is inexpensive real estate," Buongiorno says. In addition, at the end of a plant's lifetime, "decommissioning" could be accomplished by simply towing it away to a central facility, as is done now for the Navy's carrier and submarine reactors.
That would rapidly restore the site to pristine conditions. This design could also help to address practical construction issues that have tended to make new nuclear plants uneconomical: Shipyard construction allows for better standardization, and the all-steel design eliminates the use of concrete, which Buongiorno says is often responsible for construction delays and cost overruns. There are no particular limits to the size of such plants, he says: They could be anywhere from small, 50-megawatt plants to 1,000-megawatt plants matching today's largest facilities. "It's a flexible concept," Buongiorno says. Most operations would be similar to those of onshore plants, and the plant would be designed to meet all regulatory security requirements for terrestrial plants. "Project work has confirmed the feasibility of achieving this goal, including satisfaction of the extra concern of protection against underwater attack," says Todreas, the KEPCO Professor of Nuclear Science and Engineering and Mechanical Engineering. Buongiorno sees a market for such plants in Asia, which has a combination of high tsunami risks and a rapidly growing need for new power sources. "It would make a lot of sense for Japan," he says, as well as places such as Indonesia, Chile, and Africa. | Earthquakes | 2,014
April 16, 2014 | https://www.sciencedaily.com/releases/2014/04/140416101625.htm | Ant colonies help evacuees in disaster zones | An escape route mapping system based on the behavior of ant colonies could give evacuees a better chance of reaching safe harbor after a natural disaster or terrorist attack by building a map showing the shortest routes to shelters and providing regular updates of current situations such as fires, blocked roads or other damage via the smart phones of emergency workers and those caught up in the disaster. | Koichi Asakura of Daido University in Nagoya and Toyohide Watanabe of the Nagoya Industrial Science Research Institute in Japan have carried out successful simulations of the construction of navigational maps using this approach and report the details in a recent paper. The team's new system has two key features: First, it utilizes the smart phones that are now ubiquitous across cities as networked, mobile sensors that can feed information back to emergency centers. The second feature exploits our understanding of the behavior of an ant colony. This provides a way to determine whether reports of particular problems are recent, just as individual ants use pheromone trails, and the changing concentrations of those pheromones, to assess how recently a colony member left a particular signal and so find the optimal routes to and from the nest via food supplies. By using this approach to analyze the data from myriad smart phones as evacuees head for shelter, it is possible to build an active navigational map using the phones' GPS and other tools. The system circumvents a problem that would be almost inevitable during a disaster: closed-circuit television (CCTV) cameras would be unreliable, whereas enough wireless communication devices might remain active for long enough, given a large number of service providers and communication towers spread widely across the disaster area. The next step will be to develop an ad hoc mobile networking system so that evacuees can themselves access these active maps rather than the present system that provides advice to emergency services for guiding evacuees. Such a network might also circumvent the problem of service provider outages by allowing individual smart phones to create a local network. | Earthquakes | 2,014
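As a rough illustration of the pheromone idea described above, the sketch below applies a generic ant-colony shortest-path scheme with pheromone evaporation to a toy road network. The graph, parameters, and decay rate are invented for illustration and are not taken from Asakura and Watanabe's system; evaporation here plays the role of fading stale reports.

```python
import random

# Toy road network: node -> {neighbor: travel_cost}. A blocked road could
# be simulated by raising an edge cost. All values are illustrative.
graph = {
    "A": {"B": 2, "C": 5},
    "B": {"A": 2, "C": 1, "D": 4},
    "C": {"A": 5, "B": 1, "D": 1},
    "D": {"B": 4, "C": 1},          # D = shelter
}
pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}
EVAPORATION = 0.5                    # older trails fade, like stale reports

def walk(start, goal):
    """One ant's randomized walk, biased by pheromone strength / edge cost."""
    path, node = [start], start
    while node != goal:
        options = [n for n in graph[node] if n not in path] or list(graph[node])
        weights = [pheromone[(node, n)] / graph[node][n] for n in options]
        node = random.choices(options, weights)[0]
        path.append(node)
    return path

def cost(path):
    return sum(graph[u][v] for u, v in zip(path, path[1:]))

for _ in range(200):                 # each iteration: evaporate, then deposit
    path = walk("A", "D")
    for key in pheromone:
        pheromone[key] *= EVAPORATION
    for u, v in zip(path, path[1:]):
        pheromone[(u, v)] += 1.0 / cost(path)   # shorter routes reinforce more

best = max(graph["A"], key=lambda n: pheromone[("A", n)])
print("first hop on strongest trail from A:", best)   # expect B (A-B-C-D, cost 4)
```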
April 15, 2014 | https://www.sciencedaily.com/releases/2014/04/140415133920.htm | Earthquake simulation tops one petaflop mark | A team of computer scientists, mathematicians and geophysicists at Technische Universitaet Muenchen (TUM) and Ludwig-Maximilians Universitaet Muenchen (LMU) have -- with the support of the Leibniz Supercomputing Center of the Bavarian Academy of Sciences and Humanities (LRZ) -- optimized the SeisSol earthquake simulation software on the SuperMUC high performance computer at the LRZ to push its performance beyond the "magical" one petaflop/s mark -- one quadrillion floating point operations per second. | Geophysicists use the SeisSol earthquake simulation software to investigate rupture processes and seismic waves beneath Earth's surface. Their goal is to simulate earthquakes as accurately as possible to be better prepared for future events and to better understand the fundamental underlying mechanisms. However, the calculations involved in this kind of simulation are so complex that they push even supercomputers to their limits. In a collaborative effort, the workgroups led by Dr. Christian Pelties at the Department of Geo and Environmental Sciences at LMU and Professor Michael Bader at the Department of Informatics at TUM have optimized the SeisSol program for the parallel architecture of the Garching supercomputer "SuperMUC," thereby speeding up calculations by a factor of five. Using a virtual experiment, they achieved a new record on the SuperMUC: To simulate the vibrations inside the geometrically complex Merapi volcano on the island of Java, the supercomputer executed 1.09 quadrillion floating point operations per second. SeisSol maintained this unusually high performance level throughout the entire three-hour simulation run using all of SuperMUC's 147,456 processor cores. This was possible only following the extensive optimization and the complete parallelization of the 70,000 lines of SeisSol code, allowing a peak performance of up to 1.42 petaflops. This corresponds to 44.5 percent of SuperMUC's theoretically available capacity, making SeisSol one of the most efficient simulation programs of its kind worldwide. "Thanks to the extreme performance now achievable, we can run five times as many models or models that are five times as large to achieve significantly more accurate results. Our simulations are thus inching ever closer to reality," says the geophysicist Dr. Christian Pelties. "This will allow us to better understand many fundamental mechanisms of earthquakes and hopefully be better prepared for future events." The next steps are earthquake simulations that include rupture processes on the meter scale as well as the resultant destructive seismic waves that propagate across hundreds of kilometers. The results will improve the understanding of earthquakes and allow a better assessment of potential future events. "Speeding up the simulation software by a factor of five is not only an important step for geophysical research," says Professor Michael Bader of the Department of Informatics at TUM. "We are, at the same time, preparing the applied methodologies and software packages for the next generation of supercomputers that will routinely host the respective simulations for diverse geoscience applications." Besides Michael Bader and Christian Pelties, Alexander Breuer, Dr. Alexander Heinecke and Sebastian Rettenberger (TUM), as well as Dr. Alice Agnes Gabriel and Stefan Wenk (LMU), also worked on the project.
In June the results will be presented at the International Supercomputing Conference (ISC'14, Leipzig, June 22-26, 2014; title: Sustained Petascale Performance of Seismic Simulation with SeisSol on SuperMUC). | Earthquakes | 2,014
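The quoted efficiency can be sanity-checked from the core count given in the article. The clock rate and the eight double-precision floating point operations per cycle assumed below are properties commonly cited for SuperMUC's Sandy Bridge cores, not numbers stated in the text, so treat this as a plausibility check rather than an official figure.

```python
# Sanity check of the quoted efficiency. The core count is from the
# article; the 2.7 GHz clock and 8 double-precision flops/cycle (AVX)
# are assumed hardware properties, not stated in the text.
cores = 147_456
clock_hz = 2.7e9
flops_per_cycle = 8

peak = cores * clock_hz * flops_per_cycle          # theoretical peak, flop/s
print(f"theoretical peak:  {peak / 1e15:.2f} Pflop/s")
print(f"peak SeisSol:      {1.42e15 / peak:.1%} of machine peak")
print(f"sustained SeisSol: {1.09e15 / peak:.1%} over the 3-hour run")
```

Under these assumptions the machine peaks near 3.19 petaflop/s, and 1.42 petaflops works out to roughly 44.6 percent of it, within rounding of the article's 44.5 percent figure.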
April 11, 2014 | https://www.sciencedaily.com/releases/2014/04/140411091949.htm | Tibetan Plateau was larger than previously thought, geologists say | Earth scientists in Syracuse University's College of Arts and Sciences have determined that the Tibetan Plateau -- the world's largest, highest, and flattest plateau -- had a larger initial extent than previously documented. | Their discovery is the subject of a recently published journal article. Gregory Hoke, assistant professor of Earth sciences, and Gregory Wissink, a Ph.D. student in his lab, co-authored the article with Jing Liu-Zeng, director of the Division of Neotectonics and Geomorphology at the Institute for Geology, part of the China Earthquake Administration; Michael Hren, assistant professor of chemistry at the University of Connecticut; and Carmala Garzione, professor and chair of Earth and environmental sciences at the University of Rochester. "We've determined the elevation history of the southeast margin of the Tibetan Plateau," says Hoke, who specializes in the interplay between Earth's tectonic and surface processes. "By the Eocene epoch (approximately 40 million years ago), the southern part of the plateau extended some 600 miles more to the east than previously documented. This discovery upends a popular model for plateau formation." Known as the "Roof of the World," the Tibetan Plateau covers more than 970,000 square miles in Asia and India and reaches heights of over 15,000 feet. The plateau also contains a host of natural resources, including large mineral deposits and tens of thousands of glaciers, and is the headwaters of many major drainage basins. Hoke says he was attracted to the topography of the plateau's southeast margin because it presented an opportunity to use information from minerals formed at Earth's surface to infer what happened below them in the crust. "The tectonic and topographic evolution of the southeast margin has been the subject of considerable controversy," he says. "Our study provides the first quantitative estimate of the past elevation of the eastern portions of the plateau." Historically, geologists have thought that lower crustal flow -- a process by which hot, ductile rock material flows from high- to low-pressure zones -- helped elevate parts of the plateau about 20 million years ago. (This uplift model has also been used to explain watershed reorganization among some of the world's largest rivers, including the Yangtze in China.) But years of studying rock and water samples from the plateau have led Hoke to rethink the area's history. For starters, his data indicates that the plateau has been at or near its present elevation since the Eocene epoch. Moreover, surface uplift in the southernmost part of the plateau -- in and around southern China and northern Vietnam -- has been historically small. "Surface uplift, caused by lower crustal flow, doesn't explain the evolution of regional river networks," says Hoke, referring to the process by which a river drainage system is diverted, or captured, from its own bed into that of a neighboring bed.
"Our study suggests that river capture and drainage reorganization must have been the result of a slip on the major faults bounding the southeast plateau margin."Hoke's discovery not only makes the plateau larger than previously thought, but also suggests that some of the topography is millions of years younger."Our data provides the first direct documentation of the magnitude and geographic extent of elevation change on the southeast margin of the Tibetan Plateau, tens of millions years ago," Hoke adds. "Constraining the age, spatial extent, and magnitude of ancient topography has a profound effect on how we understand the construction of mountain ranges and high plateaus, such as those in Tibet and the Altiplano region in Bolivia." | Earthquakes | 2,014 |
April 9, 2014 | https://www.sciencedaily.com/releases/2014/04/140409125851.htm | Scientists reconstruct ancient impact that dwarfs dinosaur-extinction blast | Picture this: A massive asteroid almost as wide as Rhode Island and about three to five times larger than the rock thought to have wiped out the dinosaurs slams into Earth. The collision punches a crater into the planet's crust that's nearly 500 kilometers (about 300 miles) across: greater than the distance from Washington, D.C. to New York City, and up to two and a half times larger in diameter than the hole formed by the dinosaur-killing asteroid. Seismic waves bigger than any recorded earthquakes shake the planet for about half an hour at any one location -- about six times longer than the huge earthquake that struck Japan three years ago. The impact also sets off tsunamis many times deeper than the one that followed the Japanese quake. | Although scientists had previously hypothesized enormous ancient impacts, much greater than the one that may have eliminated the dinosaurs 65 million years ago, a new study now reveals the power and scale of a cataclysmic event some 3.26 billion years ago which is thought to have created geological features found in a South African region known as the Barberton greenstone belt. The research has been accepted for publication in Geochemistry, Geophysics, Geosystems, a journal of the American Geophysical Union. The huge impactor -- between 37 and 58 kilometers (23 to 36 miles) wide -- collided with the planet at 20 kilometers per second (12 miles per second). The jolt, bigger than a 10.8 magnitude earthquake, propelled seismic waves hundreds of kilometers through Earth, breaking rocks and setting off other large earthquakes. Tsunamis thousands of meters deep -- far bigger than recent tsunamis generated by earthquakes -- swept across the oceans that covered most of Earth at that time. "We knew it was big, but we didn't know how big," Donald Lowe, a geologist at Stanford University and a co-author of the study, said of the asteroid. Lowe, who discovered telltale rock formations in the Barberton greenstone a decade ago, thought their structure smacked of an asteroid impact. The new research models for the first time how big the asteroid was and the effect it had on the planet, including the possible initiation of a more modern plate tectonic system that is seen in the region, according to Lowe. The study marks the first time scientists have mapped in this way an impact that occurred more than 3 billion years ago, Lowe added, and is likely one of the first times anyone has modeled any impact that occurred during this period of Earth's evolution. The impact would have been catastrophic to the surface environment. The smaller, dino-killing asteroid crash is estimated to have released more than a billion times more energy than the bombs that destroyed Hiroshima and Nagasaki. The more ancient hit now coming to light would have released much more energy, experts said. The sky would have become red hot, the atmosphere would have been filled with dust and the tops of oceans would have boiled, the researchers said.
The impact sent vaporized rock into the atmosphere, which encircled the globe and condensed into liquid droplets before solidifying and falling to the surface, according to the researchers. The impact may have been one of dozens of huge asteroids that scientists think hit Earth during the tail end of the Late Heavy Bombardment period, a major period of impacts that occurred early in Earth's history -- around 3 billion to 4 billion years ago. Many of the sites where these asteroids landed were destroyed by erosion, movement of Earth's crust and other forces as Earth evolved, but geologists have found a handful of areas in South Africa and Western Australia that still harbor evidence of these impacts that occurred between 3.23 billion and 3.47 billion years ago. The study's co-authors think the asteroid hit Earth thousands of kilometers away from the Barberton Greenstone Belt, although they can't pinpoint the exact location. "We can't go to the impact sites. In order to better understand how big it was and its effect we need studies like this," said Lowe. Scientists must use the geological evidence of these impacts to piece together what happened to the Earth during this time, he said. The study's findings have important implications for understanding the early Earth and how the planet formed. The impact may have disrupted Earth's crust and the tectonic regime that characterized the early planet, leading to the start of a more modern plate tectonic system, according to the paper's co-authors. The pummeling the planet endured was "much larger than any ordinary earthquake," said Norman Sleep, a physicist at Stanford University and co-author of the study. He used physics, models, and knowledge about the formations in the Barberton greenstone belt, other earthquakes and other asteroid impact sites on Earth and the moon to calculate the strength and duration of the shaking that the asteroid produced. Using this information, Sleep recreated how waves traveled from the impact site to the Barberton greenstone belt and caused the geological formations. The geological evidence found in the Barberton that the paper investigates indicates that the asteroid was "far larger than anything in the last billion years," said Jay Melosh, a professor at Purdue University in West Lafayette, Indiana, who was not involved in the research. The Barberton greenstone belt is an area 100 kilometers (62 miles) long and 60 kilometers (37 miles) wide that sits east of Johannesburg near the border with Swaziland. It contains some of the oldest rocks on the planet. The model provides evidence for the rock formations and crustal fractures that scientists have discovered in the Barberton greenstone belt, said Frank Kyte, a geologist at UCLA who was not involved in the study. "This is providing significant support for the idea that the impact may have been responsible for this major shift in tectonics," he said. Reconstructing the asteroid's impact could also help scientists better understand the conditions under which early life on the planet evolved, the paper's authors said. Along with altering Earth itself, the environmental changes triggered by the impact may have wiped out many microscopic organisms living on the developing planet, allowing other organisms to evolve, they said. "We are trying to understand the forces that shaped our planet early in its evolution and the environments in which life evolved," Lowe said.
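The impactor's size range and speed given above allow a back-of-envelope kinetic-energy estimate. The rock density assumed below is a typical value for a stony asteroid and is not stated in the article; the Hiroshima-bomb yield used for comparison (about 6.3e13 joules) is likewise an outside figure, so the results are orders of magnitude only.

```python
import math

# Order-of-magnitude energy of the impact. The diameter range and the
# 20 km/s speed come from the article; the 3000 kg/m^3 rock density is
# an assumed value for a stony asteroid.
density = 3000.0                        # kg/m^3, assumption
speed = 20e3                            # m/s, from the article

for diameter_km in (37, 58):
    r = diameter_km * 1e3 / 2
    mass = density * (4 / 3) * math.pi * r**3
    energy = 0.5 * mass * speed**2      # kinetic energy, joules
    print(f"{diameter_km} km impactor: ~{energy:.1e} J "
          f"(~{energy / 6.3e13:.0e} Hiroshima-scale bombs)")
```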
April 4, 2014 | https://www.sciencedaily.com/releases/2014/04/140404140035.htm | More Earthquakes for Chile? Seismic gap has not been closed | After the strong earthquake that struck Chile on April 2 (CEST), numerous aftershocks, some of them of a considerable magnitude, have struck the region around Iquique. Seismologists from the GFZ German Research Centre for Geosciences doubt that the strong earthquake closed the local seismic gap and decreased the risk of a large earthquake. On the contrary, initial studies of the rupture process and the aftershocks show that only about a third of the vulnerable zone broke. | This vulnerable area is referred to as the seismic gap of Iquique, and a strong earthquake is expected to strike here. The Pacific Nazca plate meets the South American plate at South America's west coast. "In a subsea trench along the coast, the Pacific Ocean floor submerges beneath the continent, building up tension that is released in earthquakes," explains Professor Onno Oncken of the GFZ. "In the course of about 150 years the entire plate boundary from Patagonia in the South to Panama in the North breaks completely with a segmented series of strong earthquakes." This cycle has been completed except for a last segment west of Iquique in northern Chile. As expected, the strong earthquake of April 2 took place exactly at this seismic gap. Initial analyses conducted by GFZ seismologists have shown that there is no sign that tension in the Earth's crust has significantly decreased: "So far tension has been released only in the central section of this vulnerable zone," Oncken further explains. The series of earthquakes began on March 16 with a 6.7-magnitude earthquake. Although the main earthquake with a magnitude of 8.1 broke the central section of the seismic gap over a length of some 100 kilometres, two large segments further north and south remain intact, and these segments are able to cause strong earthquakes with a high risk of ground shaking and tsunamis. Oncken: "This means that the risk of one or even several earthquakes with a magnitude clearly above 8 still exists." Furthermore, the location and magnitude of the aftershocks suggest such a scenario. Since the main quake struck, hundreds of aftershocks have been registered, the strongest, that of April 2 (CEST), with a magnitude of 7.6. This earthquake struck about 100 kilometres south of the main earthquake's epicentre. Together with its associated aftershocks, it forms a second rupture zone. For such extreme events, the GFZ has a task force called HART (Hazard and Risk Team) that will travel to the area affected to conduct further studies. The assignment aims to gain a more detailed understanding of the rupture process and to define the rupture surface more precisely based on the distribution of the aftershocks. Currently, 25 seismometers are being prepared for air transport. Early next week a team of eight GFZ scientists will fly to Chile. The 25 portable seismometers will be used to expand the existing observatory network IPOC (Integrated Plate Boundary Observatory Chile) in order to be able to determine the earthquake epicentres more precisely. In addition, highly precise surface displacements will be measured at 50 GPS measuring points. Two additional continuous GPS stations will be installed to determine how the earthquake has deformed the Earth's crust. The Helmholtz Centre for Ocean Research Geomar in Kiel intends to support the measuring campaign.
Ocean floor seismometers will supplement land-based seismic data by providing measurements of the aftershocks on the seafloor. The GFZ initiated the setup of an observatory directly within the seismic gap in northern Chile in order to be able to precisely measure and capture tectonic processes before, during and after the expected strong earthquake. The observatory, called the Integrated Plate Boundary Observatory Chile (IPOC), is a European-American network of institutions and scientists. Together with several Chilean and German universities, German, French, Chilean and American non-university research institutions operate a decentralized instrumentation system located at Chile's convergent plate boundary to gather data on earthquakes, deformations, magmatism, and surface processes. The mission succeeded in the case of the April 2 earthquake: "All our instruments survived the quake and aftershocks unscathed. We now have a set of data that is unique in the world," says GFZ seismologist Günter Asch with a smile; he was responsible for checking the instruments on site right after the earthquake and is once again on his way to the region. "We believe that these data will help us understand the entire earthquake process -- from the phase that tension builds up to the actual rupture, and also during the post-seismic phase." This understanding will provide insights into earthquake risks in this part of the world as well as elsewhere. The IPOC will further expand. To this day, more than 20 multi-parameter stations have been set up. They comprise broadband seismographs, accelerometers, continuous GPS receivers, magneto-telluric probes, expansion measuring devices and climate sensors. Their data is transferred to Potsdam in real time. The European Southern Observatory on Cerro Paranal is now also part of the observatory network. | Earthquakes | 2,014
April 3, 2014 | https://www.sciencedaily.com/releases/2014/04/140403142025.htm | Hot mantle drives elevation, volcanism along mid-ocean ridges | Scientists have shown that temperature differences deep within Earth's mantle control the elevation and volcanic activity along mid-ocean ridges, the colossal mountain ranges that line the ocean floor. The findings were published April 4 in a scientific journal. | Mid-ocean ridges form at the boundaries between tectonic plates, circling the globe like seams on a baseball. As the plates move apart, magma from deep within Earth rises up to fill the void, creating fresh crust as it cools. The crust formed at these seams is thicker in some places than others, resulting in ridges with widely varying elevations. In some places, the peaks are submerged miles below the ocean surface. In other places -- Iceland, for example -- the ridge tops are exposed above the water's surface. "These variations in ridge depth require an explanation," said Colleen Dalton, assistant professor of geological sciences at Brown and lead author of the new research. "Something is keeping them either sitting high or sitting low." That something, the study found, is the temperature of rocks deep below Earth's surface. By analyzing the speeds of seismic waves generated by earthquakes, the researchers show that mantle temperature along the ridges at depths extending below 400 kilometers varies by as much as 250 degrees Celsius. High points on the ridges tend to be associated with higher mantle temperatures, while low points are associated with a cooler mantle. The study also showed that volcanic hot spots along the ridge -- volcanoes near Iceland as well as the islands of Ascension, Tristan da Cunha, and elsewhere -- all sit above warm spots in Earth's mantle. "It is clear from our results that what's being erupted at the ridges is controlled by temperature deep in the mantle," Dalton said. "It resolves a long-standing controversy and has not been shown definitively before." The mid-ocean ridges provide geologists with a window to the interior of Earth. The ridges form when mantle material melts, rises into the cracks between tectonic plates, and solidifies again. The characteristics of the ridges provide clues about the properties of the mantle below. For example, a higher ridge elevation suggests a thicker crust, which in turn suggests that a larger volume of magma was erupted at the surface. This excess molten rock can be caused by very hot temperatures in the mantle. The problem is that hot mantle is not the only way to produce excess magma. The chemical composition of the rocks in Earth's mantle also controls how much melt is produced. For certain rock compositions, it is possible to generate large volumes of molten rock under cooler conditions. For many decades it has not been clear whether mid-ocean ridge elevations are caused by variations in the temperature of the mantle or variations in the rock composition of the mantle. To distinguish between these two possibilities, Dalton and her colleagues introduced two additional data sets. One was the chemistry of basalts, the rock that forms from solidification of magma at the mid-ocean ridge. The chemical composition of basalts differs depending upon the temperature and composition of the mantle material from which they're derived. The authors analyzed the chemistry of nearly 17,000 basalts formed along mid-ocean ridges around the globe. The other data set was seismic wave tomography.
During earthquakes, seismic waves are sent pulsing through the rocks in the crust and mantle. By measuring the velocity of those waves, scientists can gather data about the characteristics of the rocks through which they traveled. "It's like performing a CAT scan of the inside of the Earth," Dalton said. Seismic wave speeds are especially sensitive to the temperature of rocks. In general, waves propagate more quickly in cooler rocks and more slowly in hotter rocks. Dalton and her colleagues combined the seismic data from hundreds of earthquakes with data on elevation and rock chemistry from the ridges. Correlations among the three data sets revealed that temperature deep in the mantle varied between around 1,300 and 1,550 degrees Celsius underneath about 61,000 kilometers of ridge terrain. "It turned out," said Dalton, "that seismic tomography was the smoking gun. The only plausible explanation for the seismic wave speeds is a very large temperature range." The study showed that as ridge elevation falls, so does mantle temperature. The coolest point beneath the ridges was found near the lowest point, an area of very deep and rugged seafloor known as the Australian-Antarctic discordance in the Indian Ocean. The hottest spot was near Iceland, which is also the ridges' highest elevation point. Iceland is also where scientists have long debated whether a mantle plume -- a vertical jet of hot rock originating from deep inside Earth -- intersects the mid-ocean ridge. This study provides strong support for a mantle plume located beneath Iceland. In fact, this study showed that all regions with above-average temperature are located near volcanic hot spots, which points to mantle plumes as the culprit for the excess volume of magma in these areas. Despite being made of solid rock, Earth's mantle doesn't sit still. It undergoes convection, a slow churning of material from the depths of the Earth toward the surface and back again. "Convection is why we have plate tectonics and earthquakes," Dalton said. "It's also responsible for almost all volcanism at the surface. So understanding mantle convection is crucial to understanding many fundamental questions about the Earth." Two factors influence how that convection works: variations in the composition of the mantle and variations in its temperature. This work, says Dalton, points to temperature as a primary factor in how convection is expressed on the surface. "We get consistent and coherent temperature measurements from the mantle from three independent datasets," Dalton said. "All of them suggest that what we see at the surface is due to temperature, and that composition is only a secondary factor. What is surprising is that the data require the temperature variations to exist not only near the surface but also many hundreds of kilometers deep inside the Earth." The findings from this study will also be useful in future research using seismic waves, Dalton says. Because the temperature readings as indicated by seismology were backed up by the other datasets, they can be used to calibrate seismic readings for places where geochemical samples aren't available. This makes it possible to estimate temperature deep in Earth's mantle all over the globe. That will help geologists gain new insights into how processes deep within Earth mold the ground beneath our feet. | Earthquakes | 2,014
April 3, 2014 | https://www.sciencedaily.com/releases/2014/04/140403134457.htm | NASA model provides a 3-D look at L.A.-area quake | On March 28, residents of Greater Los Angeles experienced the largest earthquake to strike the region since 2008. The magnitude 5.1 quake was centered near La Habra in northwestern Orange County about 21 miles (33 kilometers) east-southeast of Los Angeles, and was widely felt throughout Southern California. There have been hundreds of aftershocks, including one of magnitude 4.1. | Scientists at NASA's Jet Propulsion Laboratory, Pasadena, Calif., have developed a model of the earthquake, based on the distribution of aftershocks and other seismic information from the U.S. Geological Survey. A new image based on the model shows what the earthquake may look like through the eyes of an interferometric synthetic aperture radar, such as NASA's Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR). JPL scientists plan to acquire UAVSAR data from the region of the March 28 quake, possibly as soon as this week, and process the data to validate and improve the results of their model. The model image is available online. The earthquake is believed to be associated with the Puente Hills Thrust fault, a blind thrust fault (meaning it does not break the earth's surface) that zigzags from Orange County northwest through downtown Los Angeles. The same fault was responsible for the magnitude 5.9 Whittier Narrows earthquake on Oct. 1, 1987, which caused eight fatalities, injured several hundred and left about $360 million in property damage. The NASA model is based on a fault estimated to be 5.6 miles (9 kilometers) long, 3.1 miles (5 kilometers) deep and 1.9 miles (3 kilometers) wide. The modeled fault segment dips at a 60-degree angle. The model estimated that in this earthquake, one side of the fault slipped obliquely, moving both horizontally and vertically a total of 3.9 inches (10 centimeters) relative to the other side. The model also estimated the maximum displacement of Earth's surface from the quake at approximately 0.4 inch (1 centimeter), which is at the threshold of what is detectable with UAVSAR. The region of greatest ground displacement is indicated by the darker blue area located in the right center of the image. In Nov. 2008, NASA JPL scientists began conducting a series of UAVSAR flights over regions of Northern and Southern California that are actively deforming and are marked by frequent earthquakes. About every six months, the scientists precisely repeat the same flight paths to produce images of ground deformation called interferograms. From these data, 3-D maps are being created for regions of interest, including the San Andreas and other California faults, extending from the Gulf of California in Mexico to Santa Rosa in the northern San Francisco Bay. UAVSAR, which flies on a NASA C-20A aircraft from NASA's Armstrong Flight Research Center in California, measures ground deformation over large areas to a precision of 0.04 to 0.2 inches (0.1 to 0.5 centimeters). By comparing the repeat-pass radar observations, scientists hope to measure any crustal deformations that may occur between observations, allowing them to "see" the amount of strain building up on fault lines, and giving them a clearer picture of which faults are active and at what rates they're moving, both before earthquakes and after them. The UAVSAR fault mapping project is designed to substantially improve knowledge of regional earthquake hazards in California. 
The 3-D UAVSAR data will allow scientists to bring entire faults into focus, letting them understand faults not just at their surfaces but also at depth. When integrated into computer models, the data should give scientists a much clearer picture of California's complex fault systems. The scientists are estimating the total displacement occurring in each region. As additional observations are collected, they expect to be able to determine how strain is partitioned between individual faults. The UAVSAR flights serve as a baseline for pre-earthquake activity. As earthquakes occur during the course of this project, the team is measuring the deformation at the time of the earthquakes to determine the distribution of slip on the faults, and then monitoring longer-term motions after the earthquakes to learn more about fault zone properties. Airborne UAVSAR mapping can allow a rapid response after an earthquake to determine what fault was the source and which parts of the fault slipped during the earthquake. Information about the earthquake source can be used to estimate what areas were most affected by an earthquake's shaking to guide rescue efforts and damage assessment. The model was developed as part of NASA's QuakeSim project. The JPL-developed QuakeSim is a comprehensive, state-of-the-art software tool for simulating and understanding earthquake fault processes and improving earthquake forecasting. Initiated in 2002, QuakeSim uses NASA remote sensing and other earthquake-related data to simulate and model the behavior of faults in 3-D both individually and as part of complex, interacting systems. This provides long-term histories of fault behavior that can be used for statistical evaluation. QuakeSim also is used to identify regions of increased earthquake probabilities, called hotspots. NASA's Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) project, which provides tools for earthquake disaster management and response using remote sensing data and NASA earthquake modeling software, published the model results, along with automatically generated deformation models and aftershock forecasts, on a La Habra earthquake event page. More information about QuakeSim and UAVSAR is available online. | Earthquakes | 2,014
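The interferogram arithmetic the article describes can be sketched in a few lines: an interferogram's phase difference maps to line-of-sight ground displacement. The wavelength below is the approximate L-band value UAVSAR operates at (~24 cm); treat it, and the example numbers, as assumptions rather than mission specifications.

```python
import math

# Sketch of basic repeat-pass interferometry arithmetic. The factor 4*pi
# appears because the radar path is two-way (out and back).

WAVELENGTH_M = 0.24  # assumed L-band radar wavelength in meters

def los_displacement(phase_rad):
    """Line-of-sight displacement (m) for a given interferometric
    phase change (radians)."""
    return phase_rad * WAVELENGTH_M / (4 * math.pi)

# One full fringe (2*pi of phase) corresponds to half a wavelength of motion:
print(los_displacement(2 * math.pi))  # -> 0.12 m

# The ~1 cm of surface displacement the JPL model predicts is only about a
# twelfth of a fringe -- near the edge of what the technique can resolve.
print(2 * math.pi * 0.01 / (WAVELENGTH_M / 2))  # ~0.52 rad of phase for 1 cm
```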
April 2, 2014 | https://www.sciencedaily.com/releases/2014/04/140402162458.htm | Magnetic anomaly deep within Earth's crust reveals Africa in North America | The repeated cycles of plate tectonics that have led to collision and assembly of large supercontinents and their breakup and formation of new ocean basins have produced continents that are collages of bits and pieces of other continents. Figuring out the origin and make-up of continental crust formed and modified by these tectonic events is vital to understanding Earth's geology and is important for many applied fields, such as oil, gas, and gold exploration. | In many cases, the rocks involved in these collision and pull-apart episodes are still buried deep beneath Earth's surface, so geologists must use geophysical measurements to study these features. This new study by Elias Parker Jr. of the University of Georgia examines a prominent swath of lower-than-normal magnetism -- known as the Brunswick Magnetic Anomaly -- that stretches from Alabama through Georgia and offshore to the North Carolina coast. The cause of this magnetic anomaly has been under some debate. Many geologists attribute the Brunswick Magnetic Anomaly to a belt of 200-million-year-old volcanic rocks that intruded around the time the Atlantic Ocean began to open. In this case, the location of this magnetic anomaly would then mark the initial location where North America split from the rest of Pangea as that ancient supercontinent broke apart. Parker proposes a different source for this anomalous magnetic zone. Drawing upon other studies that have demonstrated deeply buried metamorphic rocks can also have a coherent magnetic signal, Parker has analyzed the detailed characteristics of the magnetic anomalies from data collected across zones in Georgia and concludes that the Brunswick Magnetic Anomaly has a similar, deeply buried source. The anomalous magnetic signal is consistent with an older tectonic event -- the Alleghanian orogeny that formed the Alleghany-Appalachian Mountains when the supercontinent of Pangea was assembled. Parker's main conclusion is that the rocks responsible for the Brunswick Magnetic Anomaly mark a major fault-zone that formed as portions of Africa and North America were sheared together roughly 300 million years ago -- and that more extensive evidence for this collision is preserved along this zone. One interesting implication is that perhaps a larger portion of what is now Africa was left behind in the American southeast when Pangea later broke up. | Earthquakes | 2,014
April 2, 2014 | https://www.sciencedaily.com/releases/2014/04/140402145641.htm | Magnitude 8.2 earthquake off Chile: Thrust faulting at shallow depths near the Chilean coast | A magnitude 8.2 earthquake struck off Chile on April 1, 2014 at 23:46:46 UTC, according to the U.S. Geological Survey. | The following is information from the USGS event page on this earthquake. The April 1, 2014 M8.2 earthquake in northern Chile occurred as the result of thrust faulting at shallow depths near the Chilean coast. The location and mechanism of the earthquake are consistent with slip on the primary plate boundary interface, or megathrust, between the Nazca and South America plates. At the latitude of the earthquake, the Nazca plate subducts eastward beneath the South America plate at a rate of 65 mm/yr. Subduction along the Peru-Chile Trench to the west of Chile has led to uplift of the Andes mountain range and has produced some of the largest earthquakes in the world, including the 2010 M 8.8 Maule earthquake in central Chile, and the largest earthquake on record, the 1960 M 9.5 earthquake in southern Chile. The April 1 earthquake occurred in a region of historic seismic quiescence -- termed the northern Chile or Iquique seismic gap. Historical records indicate an M 8.8 earthquake occurred within the Iquique gap in 1877, which was preceded immediately to the north by an M 8.8 earthquake in 1868. A recent increase in seismicity rates has occurred in the vicinity of the April 1 earthquake. An M6.7 earthquake with a similar faulting mechanism occurred on March 16, 2014, and was followed by 60+ earthquakes of M4+, and 26 earthquakes of M5+. The March 16 earthquake was also followed by three M6.2 events on March 17, March 22, and March 23. The spatial distribution of seismicity following the March 16 event migrated spatially to the north through time, starting near 20°S. The South American arc extends over 7,000 km, from the Chilean margin triple junction offshore of southern Chile to its intersection with the Panama fracture zone, offshore of the southern coast of Panama in Central America. It marks the plate boundary between the subducting Nazca plate and the South America plate, where the oceanic crust and lithosphere of the Nazca plate begin their descent into the mantle beneath South America. The convergence associated with this subduction process is responsible for the uplift of the Andes Mountains, and for the active volcanic chain present along much of this deformation front. Relative to a fixed South America plate, the Nazca plate moves slightly north of eastwards at a rate varying from approximately 80 mm/yr in the south to approximately 65 mm/yr in the north. Although the rate of subduction varies little along the entire arc, there are complex changes in the geologic processes along the subduction zone that dramatically influence volcanic activity, crustal deformation, earthquake generation and occurrence all along the western edge of South America. Most of the large earthquakes in South America are constrained to shallow depths of 0 to 70 km resulting from both crustal and interplate deformation. Crustal earthquakes result from deformation and mountain building in the overriding South America plate and generate earthquakes as deep as approximately 50 km. Interplate earthquakes occur due to slip along the dipping interface between the Nazca and the South American plates. Interplate earthquakes in this region are frequent and often large, and occur between the depths of approximately 10 and 60 km. 
Since 1900, numerous magnitude 8 or larger earthquakes have occurred on this subduction zone interface that were followed by devastating tsunamis, including the 1960 M9.5 earthquake in southern Chile, the largest instrumentally recorded earthquake in the world. Other notable shallow tsunami-generating earthquakes include the 1906 M8.5 earthquake near Esmeraldas, Ecuador, the 1922 M8.5 earthquake near Coquimbo, Chile, the 2001 M8.4 Arequipa, Peru earthquake, the 2007 M8.0 earthquake near Pisco, Peru, and the 2010 M8.8 Maule, Chile earthquake located just north of the 1960 event. Large intermediate-depth earthquakes (those occurring between depths of approximately 70 and 300 km) are relatively limited in size and spatial extent in South America, and occur within the Nazca plate as a result of internal deformation within the subducting plate. These earthquakes generally cluster beneath northern Chile and southwestern Bolivia, and to a lesser extent beneath northern Peru and southern Ecuador, with depths between 110 and 130 km. Most of these earthquakes occur adjacent to the bend in the coastline between Peru and Chile. The most recent large intermediate-depth earthquake in this region was the 2005 M7.8 Tarapaca, Chile earthquake. Earthquakes can also be generated to depths greater than 600 km as a result of continued internal deformation of the subducting Nazca plate. Deep-focus earthquakes in South America are not observed in the depth range of approximately 300 to 500 km. Instead, deep earthquakes in this region occur at depths of 500 to 650 km and are concentrated into two zones: one that runs beneath the Peru-Brazil border and another that extends from central Bolivia to central Argentina. These earthquakes generally do not exhibit large magnitudes. An exception to this was the 1994 Bolivian earthquake in northwestern Bolivia. This M8.2 earthquake occurred at a depth of 631 km, which was until recently the largest deep-focus earthquake instrumentally recorded (superseded in May 2013 by an M8.3 earthquake 610 km beneath the Sea of Okhotsk, Russia), and was felt widely throughout South and North America. Subduction of the Nazca plate is geometrically complex and impacts the geology and seismicity of the western edge of South America. The intermediate-depth regions of the subducting Nazca plate can be segmented into five sections based on their angle of subduction beneath the South America plate. Three segments are characterized by steeply dipping subduction; the other two by near-horizontal subduction. The Nazca plate beneath northern Ecuador, southern Peru to northern Chile, and southern Chile descends into the mantle at angles of 25° to 30°. In contrast, the slab beneath southern Ecuador to central Peru, and under central Chile, is subducting at a shallow angle of approximately 10° or less. In these regions of "flat-slab" subduction, the Nazca plate moves horizontally for several hundred kilometers before continuing its descent into the mantle, and is shadowed by an extended zone of crustal seismicity in the overlying South America plate. Although the South America plate exhibits a chain of active volcanism resulting from the subduction and partial melting of the Nazca oceanic lithosphere along most of the arc, these regions of inferred shallow subduction correlate with an absence of volcanic activity. | Earthquakes | 2,014
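As a rough illustration of why the Iquique gap worried seismologists, the convergence rate quoted above can be turned into a slip deficit and an equivalent moment magnitude. The rupture dimensions and rigidity below are assumed round values for illustration, not figures from the USGS report.

```python
import math

# Sketch: how much moment the Iquique gap could have stored since 1877.
# Dimensions and rigidity are assumed round numbers, not USGS values.

CONVERGENCE_M_PER_YR = 0.065      # 65 mm/yr Nazca-South America convergence
YEARS_SINCE_1877 = 2014 - 1877
RIGIDITY_PA = 33e9                # assumed crustal rigidity (~33 GPa)
LENGTH_M, WIDTH_M = 500e3, 150e3  # assumed locked-zone dimensions

slip_deficit = CONVERGENCE_M_PER_YR * YEARS_SINCE_1877     # ~8.9 m
moment = RIGIDITY_PA * LENGTH_M * WIDTH_M * slip_deficit   # seismic moment, N*m
mw = (2.0 / 3.0) * math.log10(moment) - 6.07               # moment magnitude

print(f"slip deficit ~{slip_deficit:.1f} m, Mw ~{mw:.1f}")  # ~8.9 m, ~M8.8
```

Under these assumptions the gap could have stored roughly an M8.8-class earthquake's worth of slip since 1877, consistent with the historical precedent, which is one way to see why the April 2014 M8.2 event is thought to have released only part of the accumulated deficit.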
March 28, 2014 | https://www.sciencedaily.com/releases/2014/03/140328075811.htm | Great earthquakes, water under pressure, high risk: Research reveals interactions between plate tectonics, fluids and quakes | The largest earthquakes occur where oceanic plates move beneath continents. Obviously, water trapped in the boundary between both plates has a dominant influence on the earthquake rupture process. Analyzing the great Chile earthquake of February 27, 2010, a group of scientists from the GFZ German Research Centre for Geosciences and from Liverpool University found that the water pressure in the pores of the rocks making up the plate boundary zone plays the key role. | The stress build-up before an earthquake and the magnitude of subsequent seismic energy release are substantially controlled by the mechanical coupling between both plates. Studies of recent great earthquakes have revealed that the lateral extent of the rupture and magnitude of these events are fundamentally controlled by the stress build-up along the subduction plate interface. Stress build-up and its lateral distribution in turn are dependent on the distribution and pressure of fluids along the plate interface. "We combined observations of several geoscience disciplines -- geodesy, seismology, petrology. In addition, we have a unique opportunity in Chile that our natural observatory there provides us with long time series of data," says Onno Oncken, director of the GFZ-Department "Geodynamics and Geomaterials." Earth observation (Geodesy) using GPS technology and radar interferometry today allows a detailed mapping of mechanical coupling at the plate boundary from the Earth's surface. A complementary image of the rock properties at depth is provided by seismology. Earthquake data yield a high-resolution three-dimensional image of seismic wave speeds and their variations in the plate interface region. Data on fluid pressure and rock properties, on the other hand, are available from laboratory measurements. All these data had been acquired shortly before the great Chile earthquake of February 2010 struck with a magnitude of 8.8. "For the first time, our results allow us to map the spatial distribution of the fluid pressure with unprecedented resolution showing how they control mechanical locking and subsequent seismic energy release," explains Professor Oncken. "Zones of changed seismic wave speeds reflect zones of reduced mechanical coupling between plates." This state supports creep along the plate interface. In turn, high mechanical locking is promoted in lower pore fluid pressure domains. It is these locked domains that subsequently ruptured during the Chile earthquake, releasing most of the seismic energy and causing destruction at Earth's surface as well as tsunami waves. The authors suggest that the spatial pore fluid pressure variations are related to oceanic water accumulated in an altered oceanic fracture zone within the Pacific oceanic plate. Upon subduction of the latter beneath South America, the fluid volumes are released and trapped along the overlying plate interface, leading to increasing pore fluid pressures. This study provides a powerful tool to monitor the physical state of a plate interface and to forecast its seismic potential. | Earthquakes | 2,014
March 27, 2014 | https://www.sciencedaily.com/releases/2014/03/140327100614.htm | Data mining disaster: Computer technology can mine data from social media during disasters | Computer technology that can mine data from social media during times of natural or other disaster could provide invaluable insights for rescue workers and decision makers, according to an international team writing in the | Adam Zagorecki of the Centre for Simulation and Analytics at Cranfield University, Shrivenham, UK, and David Johnson of Missouri State University, Springfield, USA, and Jozef Ristvej of the University of Zilina, Zilina, Slovakia, explain that when disaster strikes, the situation can change rapidly. Whether that is during flooding, landslide, earthquake or terrorist attack, understanding the complexities of the situation can mean the difference between saving human and animal lives, reducing environmental impact and preventing major economic loss. The team points out that advances in information technology have had a profound impact on disaster management. First, these advances make unprecedented volumes of data available to decision makers. This, however, brings with it the problem of managing and using that data. The team has surveyed the state of the art in data mining and machine learning in this context. They have found that whereas earlier applications were focused on specific problems, such as modeling the dispersion by wind of plumes -- whether from a chemical plant leak, fire or nuclear incident -- and monitoring rescue robots, there are much broader applications, such as constructing situational awareness and real-time threat assessment. Data mining during a disaster can pull in information from unstructured data from news reports, incident activity reports, and announcements, as well as structured textual data from emergency services, situational reports and damage assessment forms. In addition, it can utilize remote sensing data, as well as, more frequently now, mobile phone images and video, and satellite and aerial images. Critically, the team also reveals that the advent of social media is playing an important role in generating a real-time data stream that grows quickly whenever disaster strikes and those involved have access to wireless communications and/or internet connectivity. In particular, data mining of social media can assist with the response phase of disaster management. This information can quickly provide data points for models that are not available in conventional simulations. "Disasters often undergo rapid substantial evolution; therefore, disaster management is a non-uniform process characterized by phases, although these phases are not distinct in nature," the team reports. They have now highlighted the challenges and hinted at future trends that might improve disaster response through the use of modern data mining technology. | Earthquakes | 2,014
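The survey stops short of describing a specific algorithm, but the social-media triage it envisions can be illustrated with a toy keyword filter. Everything here -- keywords, weights, and sample posts -- is invented for illustration; production systems would use trained classifiers over far richer features (location, images, network structure).

```python
import re

# Minimal sketch of keyword-based triage of a social stream during a disaster.

KEYWORDS = {"flood": 2, "earthquake": 2, "trapped": 3, "help": 1, "collapsed": 3}

def relevance(post):
    """Score a post by summing the weights of disaster keywords it contains."""
    tokens = set(re.findall(r"[a-z]+", post.lower()))
    return sum(weight for word, weight in KEYWORDS.items() if word in tokens)

stream = [
    "Building collapsed near the market, people trapped, send help",
    "Great concert tonight!",
    "Earthquake felt downtown, power is out",
]
flagged = [post for post in stream if relevance(post) >= 2]
print(flagged)  # keeps the two disaster-related posts
```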
March 21, 2014 | https://www.sciencedaily.com/releases/2014/03/140321101738.htm | Ground-improvement methods might protect against earthquakes | Researchers from the University of Texas at Austin's Cockrell School of Engineering are developing ground-improvement methods to help increase the resilience of homes and low-rise structures built on top of soils prone to liquefaction during strong earthquakes. | Findings will help improve the safety of structures in Christchurch and the Canterbury region in New Zealand, which were devastated in 2010 and 2011 by a series of powerful earthquakes. Parts of Christchurch were severely affected by liquefaction, in which water-saturated soil temporarily becomes liquid-like and often flows to the surface, creating sand boils. "The 2010-2011 Canterbury earthquakes in New Zealand have caused significant damage to many residential houses due to varying degrees of soil liquefaction over a wide extent of urban areas unseen in past destructive earthquakes," said Kenneth Stokoe, a professor in the Department of Civil, Architectural and Environmental Engineering. "One critical problem facing the rebuilding effort is that the land remains at risk of liquefaction in future earthquakes. Therefore, effective engineering solutions must be developed to increase the resilience of homes and low-rise structures." Researchers have conducted a series of field trials to test shallow-ground-improvement methods. "The purpose of the field trials was to determine if and which improvement methods achieve the objectives of inhibiting liquefaction triggering in the improved ground and are cost-effective measures," said Stokoe, working with Brady Cox, an assistant professor of civil engineering. "This knowledge is needed to develop foundation design solutions." Findings were detailed in a research paper presented in December at the New Zealand -- Japan Workshop on Soil Liquefaction during Recent Large-Scale Earthquakes. The paper was authored by Stokoe, graduate students Julia Roberts and Sungmoon Hwang; Cox and operations manager Farn-Yuh Menq from the University of Texas at Austin; and Sjoerd Van Ballegooy from Tonkin & Taylor Ltd, an international environmental and engineering consulting firm in Auckland, New Zealand. The researchers collected data from test sections of improved and unimproved soils that were subjected to earthquake stresses using a large mobile shaker, called T-Rex, and with explosive charges planted underground. The test sections were equipped with sensors to monitor key factors including ground motion and water pressure generated in soil pores during the induced shaking, providing preliminary data to determine the most effective ground-improvement method. Four ground-improvement methods were initially selected for the testing: rapid impact compaction (RIC); rammed aggregate piers (RAP), which consist of gravel columns; low-mobility grouting (LMG); and construction of a single row of horizontal beams (SRB) or a double row of horizontal beams (DRB) beneath existing residential structures via soil-cement mixing. "The results are being analyzed, but good and poor performance can already be differentiated," Stokoe said. "The ground-improvement methods that inhibited liquefaction triggering the most were RIC, RAP, and DRB. However, additional analyses are still underway." The test site is located along the Avon River in the Christchurch suburb of Bexley. 
The work is part of a larger testing program that began in early 2013 with a preliminary evaluation by Brady Cox of seven potential test sites along the Avon River in the Christchurch area. Funding for the research has been provided, in part, by the National Science Foundation and is affiliated with the NSF's George E. Brown Jr. Network for Earthquake Engineering Simulation (NEES). The remainder of the funding has been provided by the Earthquake Commission of the New Zealand Government. The 64,000-pound T-Rex, operated by NEES@UTexas at UT Austin, is used to simulate a wide range of earthquake shaking levels. NEES is a shared network of 14 experimental facilities, collaborative tools, centralized data repository and earthquake simulation software, all linked by high-speed Internet connections. | Earthquakes | 2,014
March 18, 2014 | https://www.sciencedaily.com/releases/2014/03/140318154940.htm | Fierce 2012 magnetic storm just missed us: Earth dodged huge magnetic bullet from the sun | Earth dodged a huge magnetic bullet from the sun on July 23, 2012. | According to University of California, Berkeley, and Chinese researchers, a rapid succession of coronal mass ejections -- the most intense eruptions on the sun -- sent a pulse of magnetized plasma barreling into space and through Earth's orbit. Had the eruption come nine days earlier, it would have hit Earth, potentially wreaking havoc with the electrical grid, disabling satellites and GPS, and disrupting our increasingly electronic lives. The solar bursts would have enveloped Earth in magnetic fireworks matching the largest magnetic storm ever reported on Earth, the so-called Carrington event of 1859. The dominant mode of communication at that time, the telegraph system, was knocked out across the United States, literally shocking telegraph operators. Meanwhile, the Northern Lights lit up the night sky as far south as Hawaii. In a paper appearing today (Tuesday, March 18), the researchers describe just how severe the event was. "Had it hit Earth, it probably would have been like the big one in 1859, but the effect today, with our modern technologies, would have been tremendous," said Luhmann, who is part of the STEREO (Solar TErrestrial RElations Observatory) team and based at UC Berkeley's Space Sciences Laboratory. A study last year estimated that the cost of a solar storm like the Carrington Event could reach $2.6 trillion worldwide. A considerably smaller event on March 13, 1989, led to the collapse of Canada's Hydro-Quebec power grid and a resulting loss of electricity to six million people for up to nine hours. "An extreme space weather storm -- a solar superstorm -- is a low-probability, high-consequence event that poses severe threats to critical infrastructures of the modern society," warned Liu, who is with the National Space Science Center of the Chinese Academy of Sciences in Beijing. "The cost of an extreme space weather event, if it hits Earth, could reach trillions of dollars with a potential recovery time of 4-10 years. Therefore, it is paramount to the security and economic interest of the modern society to understand solar superstorms." Based on their analysis of the 2012 event, Liu, Luhmann and their STEREO colleagues concluded that a huge outburst on the sun on July 22 propelled a magnetic cloud through the solar wind at a peak speed of more than 2,000 kilometers per second -- four times the typical speed of a magnetic storm. It tore through Earth's orbit but, luckily, Earth and the other planets were on the other side of the sun at the time. Any planets in the line of sight would have suffered severe magnetic storms as the magnetic field of the outburst tangled with the planets' own magnetic fields. The researchers determined that the huge outburst resulted from at least two nearly simultaneous coronal mass ejections (CMEs), which typically release energies equivalent to those of about a billion hydrogen bombs. 
The speed with which the magnetic cloud plowed through the solar wind was so high, they concluded, because another mass ejection four days earlier had cleared the path of material that would have slowed it down. "The authors believe this extreme event was due to the interaction of two CMEs separated by only 10 to 15 minutes," said Joe Gurman, the project scientist for STEREO at NASA's Goddard Space Flight Center in Greenbelt, Md. One reason the event was potentially so dangerous, aside from its high speed, is that it produced a very long-duration, southward-oriented magnetic field, Luhmann said. This orientation drives the largest magnetic storms when they hit Earth because the southward field merges violently with Earth's northward field in a process called reconnection. Storms that normally might dump their energy only at the poles instead dump it into the radiation belts, ionosphere and upper atmosphere and create auroras down to the tropics. "These gnarly, twisty ropes of magnetic field from coronal mass ejections come blasting from the sun through the ambient solar system, piling up material in front of them, and when this double whammy hits Earth, it skews the Earth's magnetic field to odd directions, dumping energy all around the planet," she said. "Some of us wish Earth had been in the way; what an experiment that would have been." "People keep saying that these are rare natural hazards, but they are happening in the solar system even though we don't always see them," she added. "It's like with earthquakes -- it is hard to impress upon people the importance of preparing unless you suffer a magnitude 9 earthquake." All this activity would have been missed if STEREO A -- the STEREO spacecraft ahead of us in Earth's orbit and the twin to STEREO B, which trails in our orbit -- had not been there to record the blast. The goal of STEREO and other satellites probing the magnetic fields of the sun and Earth is to understand how and why the sun sends out these large solar storms and to be able to predict them during the sun's 11-year solar cycle. This event was particularly unusual because it happened during a very calm solar period. "Observations of solar superstorms have been extremely lacking and limited, and our current understanding of solar superstorms is very poor," Liu said. "Questions fundamental to solar physics and space weather, such as how extreme events form and evolve and how severe it can be at the Earth, are not addressed because of the extreme lack of observations." | Earthquakes | 2,014
March 17, 2014 | https://www.sciencedaily.com/releases/2014/03/140317084740.htm | Earthquakes caused by clogged magma a warning sign of volcanic eruption | New research in | 36 hours before the first magmatic explosions, a swarm of 54 earthquakes was detected across the 13-station seismic network on Augustine Island. By analyzing the resulting seismic waves, the authors found that the earthquakes were being triggered from sources within the volcano's magma conduit. "Our article talks about a special type of volcanic earthquake that we think is caused by lava breaking, something that usually can't happen because lava is supposed to flow more like a liquid, rather than crack like a piece of rock," said Dr. Helena Buurman from the University of Alaska Fairbanks. "Much like breaking a piece of chewing gum by stretching it really fast, lab tests show that hot lava can break when stretched quickly enough under certain pressures like those that you might find in the conduit of a volcano." The authors found that over the course of the two-hour swarm, the earthquakes' foci moved 35 meters deeper down into the magma conduit, an indication that the conduit was becoming clogged. The resulting buildup of pressure may have contributed to the explosive eruption the next day. "We think that these earthquakes happened within the lava that was just beginning to erupt at the top of Augustine. The earthquakes show that the lava flow was grinding to a halt and plugging up the system. This caused pressure to build up from below, and resulted in a series of large explosions 36 hours later," concluded Dr. Buurman. "We believe that these types of earthquakes can be used to signal that a volcano is becoming pressurized and getting ready to explode, giving scientists time to alert the public of an imminent eruption." | Earthquakes | 2,014
March 11, 2014 | https://www.sciencedaily.com/releases/2014/03/140311124319.htm | After major earthquake, silence: Dynamic stressing of a global system of faults results in rare seismic silence | In the global aftershock zone that followed the major April 2012 Indian Ocean earthquake, seismologists noticed an unusual pattern -- a dynamic "stress shadow," or period of seismic silence when some faults near failure were temporarily rendered incapable of a large rupture. | The magnitude (M) 8.6 earthquake, a strike-slip event within an oceanic plate, caused global seismic rates of M≥4.5 to spike for several days, even at distances tens of thousands of kilometers from the mainshock site. But beginning two weeks after the mainshock, the rate of M≥6.5 seismic activity subsequently dropped to zero for the next 95 days. Why did this rare period of quiet occur? In a paper published today, Pollitz and his colleagues use a statistical model of global seismicity to show that a transient seismic perturbation of the size of the April 2012 global aftershock would inhibit rupture in 88 percent of the possible M≥6.5 earthquake fault sources over the next 95 days, regardless of how close they were to failure beforehand. This surprising finding, say the authors, challenges the previously held notion that dynamic stresses can only increase earthquake rates rather than inhibit them. But there are still mysteries about this process; for example, the global rate of M≥4.5 and M≥5.5 shocks did not decrease along with the larger shocks. | Earthquakes | 2,014
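A back-of-the-envelope Poisson calculation shows why 95 days of global M≥6.5 silence counts as rare. The background rate below is an assumed round number (roughly 40 such events per year, in line with typical global averages), not the rate fitted in the paper.

```python
import math

# Sketch of the rate reasoning behind calling 95 quiet days "rare."

RATE_PER_YEAR = 40.0  # assumed long-term global rate of M>=6.5 earthquakes
days = 95
expected = RATE_PER_YEAR * days / 365.25  # expected count in the window

# Under a Poisson model with no interaction between earthquakes, the
# probability of observing zero events in the window is exp(-expected):
p_zero = math.exp(-expected)
print(f"expected ~{expected:.1f} events; P(none) ~ {p_zero:.1e}")
# -> expected ~10.4 events; P(none) ~ 3e-05, i.e., very unlikely by chance
```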
March 7, 2014 | https://www.sciencedaily.com/releases/2014/03/140307084046.htm | Urbanization exposes French cities to greater seismic risk | French researchers have looked into data mining to develop a method for extracting information on the vulnerability of cities in regions of moderate risk, creating a proxy for assessing the probable resilience of buildings and infrastructure despite incomplete seismic inventories of buildings. The research exposes significant vulnerability in regions that have experienced an "explosion of urbanization." | "Considering that the seismic hazard is stable in time, we observe that the seismic risk comes from the rapid development of urbanization, which places at the same site goods and people exposed to hazard," said Philippe Gueguen, co-author and senior researcher at Université Joseph Fourier in Grenoble, France. The paper appears today. Local authorities rely on seismic vulnerability assessments to estimate the probable damage on an overall scale (such as a country, region or town) and identify the most vulnerable building categories that need reinforcement. These assessments are costly and require detailed understanding of how buildings will respond to ground motion. Old structures, designed before current seismic building codes, abound in France, and there is insufficient information about how they will respond during an earthquake, say the authors. The last major earthquake in France, which is considered to have moderate seismic hazard, was the 1909 magnitude 6 Lambesc earthquake, which killed 42 people and caused millions of euros of losses in the southeastern region. The authors relied on the French national census for basic descriptions of buildings in Grenoble, a city of moderate seismic hazard, to create a vulnerability proxy, which they validated in Nice and later tested for the historic Lambesc earthquake. The research exposed the effects of urbanization and urban concentration in areas prone to seismic hazard. "In seismicity regions similar to France, seismic events are rare and are of low probability. With urbanization, the consequences of characteristic events, such as Lambesc, can be significant in terms of structural damage and fatalities," said Gueguen. "These consequences are all the more significant because of the moderate seismicity that reduces the perception of risk by local authorities." If the 1909 Lambesc earthquake were to happen now, write the authors, the region would suffer serious consequences, including damage to more than 15,000 buildings. They equate the likely devastation to that observed after recent earthquakes of similar sizes in L'Aquila, Italy, and Christchurch, New Zealand. | Earthquakes | 2,014
March 7, 2014 | https://www.sciencedaily.com/releases/2014/03/140307084044.htm | Activity more than location affects perception of earthquakes | Scientists rely on the public's reporting of ground shaking to characterize the intensity of ground motion produced by an earthquake. How accurate and reliable are those perceptions? | A new study by Italian researchers suggests that a person's activity at the time of the quake influences their perception of shaking more than their location. Whether a person is at rest or walking plays a greater role in their perception of ground motion than whether they were asleep on the first or sixth floor of a building. People in motion had the worst perception. "People are like instruments, more or less sensitive," said Paola Sbarra, co-author and researcher at the Istituto Nazionale di Geofisica e Vulcanologia in Rome, Italy. "A great amount of data and proper statistical analysis allowed us to make a fine-tuning of different conditions for a better interpretation of earthquake effects," said Sbarra. The paper, co-authored by colleagues Patrizia Tosi and Valerio de Rubeis, is published today in the March issue. Sbarra and colleagues sought to analyze two variables -- an observer's "situation" and "location" -- to see how each influenced perception, in order to improve the characterization of low macroseismic intensities felt near small earthquakes or far from larger ones. Contrary to their findings, the current European macroseismic scale, which is the basis for evaluating how strongly an earthquake is felt, considers location the stronger indicator for defining intensity. The authors analyzed data submitted to "Hai-sentito-il-terremoto?," which is similar to the U.S. Geological Survey's "Did You Feel It?" website that analyzes information about earthquakes from people who have felt them. After an earthquake, individuals answer questions about what they felt during the quake, along with other questions regarding their location and activity. Intensity measures the strength of shaking produced by the earthquake at a certain location. Intensity is determined from effects on people, human structures, and the natural environment. The number of people who feel an earthquake is critical to determining intensity levels, and low-intensity earthquakes generate fewer reports, making objective evaluation of shaking difficult. | Earthquakes | 2,014
March 6, 2014 | https://www.sciencedaily.com/releases/2014/03/140306095526.htm | Japanese Town: Half the survivors of mega-earthquake, tsunami, have PTSD symptoms | Though just two of Hirono's 5,418 residents lost their lives in Japan's mega-earthquake and tsunami, a new study shows that the survivors are struggling to keep their sanity. | One year after the quake, Brigham Young University professor Niwako Yamawaki and scholars from Saga University evaluated the mental health of 241 Hirono citizens. More than half of the people evaluated experienced "clinically concerning" symptoms of post-traumatic stress disorder. Two-thirds of the sample reported symptoms of depression. Those rates exceed levels seen in the aftermath of other natural disasters, but what happened in Japan wasn't just a natural disaster. Leaked radiation from nuclear power plants forced residents of Hirono to relocate to temporary housing far from home. "This was the world's fourth-biggest recorded earthquake, and also the tsunami and nuclear plant and losing their homes -- boom boom boom boom within such a short time," said Yamawaki, a psychology professor at BYU. "The prevalence one year after is still much higher than other studies of disasters that we found even though some time had passed." Yamawaki got the idea for this study while shoveling mud from a damaged Japanese home one month after the tsunami flooded coastal towns. She had just arrived for a previously scheduled fellowship at Saga University. During her off-time, she traveled to the affected area and volunteered in the clean-up effort. One seemingly stoic homeowner broke down in tears when Yamawaki and her husband thanked her for the chance to help. "She said 'This is the first time I have cried since the disaster happened,'" Yamawaki said. "She just said 'Thank you. Thank you for letting me cry.'" Back at Saga University, Yamawaki collaborated with Hiroko Kukihara to conduct a study on the mental health and resilience of survivors. Their report has been published. Participants in the study lived in temporary housing provided by the Japanese government when Hirono was evacuated. With an average age of 58, the participants are noticeably older than the population of a typical Japanese town. Yamawaki suspects that young people were more likely to permanently relocate elsewhere in Japan following the disaster. The researchers didn't just measure the rates of mental illness; they also performed a statistical analysis to learn what fostered resilience among the survivors. Eating right, exercising regularly and going to work all promoted resilience and served as a buffer against mental illness. "Having something to do after a disaster really gives a sense of normalcy, even volunteer work," Yamawaki said. As the researchers got to know survivors, they heard from so many that they missed seeing their former neighbors. The mass relocation outside the radiation zone broke up many neighborhood ties. "Japanese are very collectivistic people and their identity is so intertwined with neighbors," Yamawaki said. "Breaking up the community has so much impact on them." While it's hard to fathom the scope of the devastation in the coastal region of Fukushima, most survivors believe something like this will happen again. If so, this new study provides a blueprint for how to help them put their lives back together again. | Earthquakes | 2,014
March 5, 2014 | https://www.sciencedaily.com/releases/2014/03/140305125142.htm | When disaster strikes: Safeguarding networks | Disasters both natural and human-caused can damage or destroy data and communications networks. Several presentations at the 2014 OFC Conference and Exposition, being held March 9-13 in San Francisco, Calif., USA, will present new information on strategies that can mitigate the impacts of these disasters. | Much of our computing these days, from browsing websites and watching online videos to checking email and following social networks, relies on the cloud. The cloud lives in data centers -- massive warehouses filled with thousands of servers that run constantly. Disasters such as earthquakes, tornadoes, or even terrorist attacks can damage the data centers and the communication links between them, causing massive losses in data and costly disruptions. To mitigate such potential damage, researchers from the University of California, Davis (UC Davis), Sakarya University in Turkey, and Politecnico de Milano in Italy first analyzed the risk that a disaster may pose to a communications network, based on the possible damage to a data center or to the links that connect it to users. Then, they created an algorithm that keeps data safe by moving or copying the data from data centers in peril to more secure locations away from the disaster. The algorithm assesses the risks for damage and users' demands on the network to determine, in real time, which locations would provide the safest refuge from a disaster. "Our content placement solution can be implemented with some modifications on any existing settings of data centers and it is adaptable to different dynamic disaster scenarios," said researcher Sifat Ferdousi of UC Davis. "This can highly benefit the network providers in designing disaster-resilient cloud networks." Earthquakes, tsunamis, and other natural disasters can sever the optical fibers that carry data across long distances, leaving telecommunications networks useless. If fiber-optic cables are down, wireless communication can fill the void and be part of a temporary, emergency network. But for such a system to work, wireless technology would have to be integrated with the fiber-optic network that transports data around the world. Such an integrated wireless optical system would combine the speed and bandwidth of fiber optics with the mobility and range of a wireless network. This system could also be applied in home networks, in which data is sent via optical cables to the home and then broadcast wirelessly. One big challenge of an integrated system, however, is to develop the wireless links that can handle the speed and capacity of optical cables. Researchers from Fudan University in Shanghai and ZTE (TX), Inc. in Morristown, N.J., USA, have now developed a new antenna architecture that allows for a simple and high-speed integrated wireless optical system. The design relies on two pairs of antennas, explains Jianjun Yu of ZTE. Because each pair is polarized differently and isolated, there's no interference between the two pairs, allowing for a simpler structure and a larger transmission capacity. The new system achieves a data-transmission rate of 146 gigabits per second (Gb/s), which is the highest bit-rate-per-channel in a wireless signal shown so far, Yu says. | Earthquakes | 2,014
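In the spirit of the content-placement algorithm described above, here is a toy sketch of risk-driven data evacuation. The data-center names, risk scores, and capacities are invented, and the authors' actual method also weighs user demand and network distance; this only illustrates the basic "copy content out of high-risk centers" idea.

```python
# Toy sketch: replicate content out of data centers whose disaster risk is high.

centers = {
    "DC-A": {"risk": 0.9, "capacity": 10, "content": ["maps", "mail"]},
    "DC-B": {"risk": 0.2, "capacity": 1,  "content": ["video"]},
    "DC-C": {"risk": 0.1, "capacity": 10, "content": []},
}

def evacuate(centers, threshold=0.5):
    """Copy content out of any center whose risk exceeds the threshold,
    preferring the lowest-risk refuge that still has spare capacity."""
    for name, dc in centers.items():
        if dc["risk"] <= threshold:
            continue
        refuges = sorted(
            (d for n, d in centers.items()
             if n != name and d["risk"] <= threshold
             and d["capacity"] > len(d["content"])),
            key=lambda d: d["risk"],
        )
        for item in list(dc["content"]):
            if not refuges:
                break
            refuges[0]["content"].append(item)  # replicate to safest refuge

evacuate(centers)
print(centers["DC-C"]["content"])  # ['maps', 'mail'] copied out of harm's way
```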
March 4, 2014 | https://www.sciencedaily.com/releases/2014/03/140304113542.htm | Insights into plate tectonics, the forces behind earthquakes, volcanoes | The Earth's outer layer is made up of a series of moving, interacting plates whose motion at the surface generates earthquakes, creates volcanoes and builds mountains. Geoscientists have long sought to understand the plates' fundamental properties and the mechanisms that cause them to move and drift, and the questions have become the subjects of lively debate. | A study published online Feb. 27 by the journal Science is a significant step toward answering those questions. Researchers led by Caroline Beghein, assistant professor of earth, planetary and space sciences in UCLA's College of Letters and Science, used a technique called seismic tomography to study the structure of the Pacific Plate -- one of eight to 12 major plates at the surface of the Earth. The technique enabled them to determine the plate's thickness, and to image the interior of the plate and the underlying mantle (the layer between the Earth's crust and outer core), which they were able to relate to the direction of flow of rocks in the mantle. "Rocks deform and flow slowly inside the Earth's mantle, which makes the plates move at the surface," said Beghein, the paper's lead author. "Our research enables us to image the interior of the plate and helps us figure out how it formed and evolved." The findings might apply to other oceanic plates as well. Even with the new findings, Beghein said, the fundamental properties of plates "are still somewhat enigmatic." Seismic tomography is similar to commonly used medical imaging techniques like computed tomography, or CT, scans. But instead of using X-rays, seismic tomography employs recordings of the seismic waves generated by earthquakes, allowing scientists to detect variations in the speed of seismic waves inside the Earth. Those variations can reveal different layers within the mantle, and can help scientists determine the temperature and chemistry of the mantle rocks by comparing observed variations in wave speed with predictions from other types of geophysical data. Seismologists often use other types of seismic data to identify this layering: They detect seismic waves that bounce off the interface that separates two layers. In their study, Beghein and co-authors compared the layering they observed using seismic tomography with the layers revealed by these other types of data. 
Comparing results from the different methods is a continuing challenge for geoscientists, but it is an important part of helping them understand the Earth's structure. "We overcame this challenge by trying to push the observational science to the highest resolutions, allowing us to more readily compare observations across datasets," said Nicholas Schmerr, the study's co-author and an assistant research scientist in geology at the University of Maryland. The researchers were the first to discover that the Pacific Plate is formed by a combination of mechanisms: The plate thickens as the rocks of the mantle cool, the chemical makeup of the rocks that form the plate changes with depth, and the mechanical behavior of the rocks changes with depth and their proximity to where the plate is being formed at the mid-ocean ridge. "By modeling the behavior of seismic waves in Earth's mantle, we discovered a transition inside the plate from the top, where the rocks didn't deform or flow very much, to the bottom of the plate, where they are more strongly deformed by tectonic forces," Beghein said. "This transition corresponds to a boundary between the layers that we can image with seismology and that we attribute to changes in rock composition." Oceanic plates form at ocean ridges and disappear into the Earth's mantle, a process known as subduction. Among geoscientists, there is still considerable debate about what drives this evolution. Beghein and her research team advanced our understanding of how oceanic plates form and evolve as they age by using and comparing two sets of seismic data; the study revealed the presence of a compositional boundary inside the plate that appears to be linked to the formation of the plate itself. Other co-authors of the research are Kaiqing Yuan and Zheng Xing, graduate students in UCLA's Department of Earth, Planetary and Space Sciences. | Earthquakes | 2,014
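The first of those mechanisms -- the plate thickening as mantle rocks cool -- is often approximated with the classic half-space cooling model, sketched below. This is a textbook scaling, not the authors' model, and the diffusivity is a standard assumed value for mantle rock.

```python
import math

# Half-space cooling sketch: an oceanic plate's thermal thickness grows
# roughly with the square root of its age.

KAPPA = 1e-6                # assumed thermal diffusivity of mantle rock, m^2/s
SECONDS_PER_MYR = 3.156e13  # seconds in one million years

def thermal_thickness_km(age_myr):
    """Approximate thermal boundary-layer thickness, z ~ 2.32*sqrt(kappa*t)."""
    t = age_myr * SECONDS_PER_MYR
    return 2.32 * math.sqrt(KAPPA * t) / 1e3

for age in (10, 50, 100):
    print(age, "Myr ->", round(thermal_thickness_km(age)), "km")
# -> roughly 41 km at 10 Myr, 92 km at 50 Myr, 130 km at 100 Myr
```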
February 24, 2014 | https://www.sciencedaily.com/releases/2014/02/140224110136.htm | Gauging what it takes to heal a disaster-ravaged forest: Case study in China | Recovering from natural disasters usually means rebuilding infrastructure and reassembling human lives. Yet ecologically sensitive areas need to heal, too, and scientists are pioneering new methods to assess nature's recovery and guide human intervention. | The epicenter of China's devastating Wenchuan earthquake in 2008 was in the Wolong Nature Reserve, a globally important biodiversity hotspot and home to the beloved and endangered giant pandas. Not only did the quake devastate villages and roads, but the earth split open and swallowed sections of the forests and bamboo groves that shelter and feed pandas and other endangered wildlife. Persistent landslides and erosion exacerbated the devastation. Typically such natural damage is assessed with remote sensing, which can be limited in fine details. Scientists at Michigan State University (MSU) and in China embarked on a dangerous boots-on-the-ground effort to understand how well the trees, bamboo and critical ground cover were recovering. Their work, which is relevant to disaster areas worldwide, is reported this week. "Across the world, people are investing billions of dollars to protect valuable natural areas, as well as making enormous investments in restoring such areas after natural disasters," said Jianguo "Jack" Liu, director of MSU's Center for Systems Integration and Sustainability, and a co-author. "It's important we develop ways to understand the fine points of how well recovery efforts are working, so we can direct resources in the right places effectively." Jindong Zhang, a post-doctoral research associate in CSIS, spent several months over a period of four years in Wolong dodging landslides, mudslides and rubble-strewn roads to survey forest recovery at a finer scale than can be observed from satellites and getting a better handle on the nuances of tree species, height and soil conditions. The data was then combined with that from satellite imagery. What was found was that many of the natural areas were on the road to recovery, and that China's $17 million effort at replanting native trees and bamboo was helping in areas handicapped by poor soil and growing conditions. "Our evaluation of the Wolong restoration project will have a guiding role in the restoration scheme areas across the entire area affected by the earthquake," Zhang said. "Our study indicated that forest restoration after natural disasters should not only consider the forest itself, but also take into account the animals inhabiting the ecosystem and human livelihoods." They also noted that such efforts could benefit from more targeting of areas most favored by pandas. The replanting efforts were done by local residents. "We witnessed pretty intense periods when it seemed like everyone in the target areas were out planting," said co-author Vanessa Hull, a CSIS doctoral candidate who studies panda habitat in Wolong. "My field assistants also joined in on the village-wide efforts. It was pretty neat to see." But a potential downside to such efforts was that most of the available labor was near villages, and pandas shy away from human contact. That meant that some of the best assisted-forest recovery was in areas not favored by pandas. Hull noted, however, that there could be an upside to that. 
Healthier forests could mean local residents have less need to venture into more far-flung panda-friendly forests. "We wanted to know if the benefit of this effort was matching up to the investment -- which was significant," Hull said. "It's an important question, and the world needs good ways to evaluate it as natural disasters are growing in frequency and intensity." | Earthquakes | 2,014
February 12, 2014 | https://www.sciencedaily.com/releases/2014/02/140212132938.htm | San Francisco's big 1906 earthquake was third of a series on San Andreas Fault | Research led by a University of Oregon doctoral student in California's Santa Cruz Mountains has uncovered geologic evidence that supports historical narratives for two earthquakes in the 68 years prior to San Francisco's devastating 1906 disaster. | The evidence places the two earthquakes, in 1838 and 1890, on the San Andreas Fault, as theorized by many researchers based on written accounts about damage to Spanish-built missions in the Monterey and San Francisco bay areas. These two quakes, as in 1906, were surface-rupturing events, the researchers concluded. Continuing work, says San Francisco Bay-area native Ashley R. Streig, will dig deeper into the region's geological record -- layers of sediment along the fault -- to determine whether the ensuing seismically quiet years fit a normal pattern of quake frequency along the fault. Streig is lead author of the study, published in this month's issue of the Bulletin of the Seismological Society of America. She collaborated on the project with her doctoral adviser Ray Weldon, professor of the UO's Department of Geological Sciences, and Timothy E. Dawson of the Menlo Park office of the California Geological Survey. The study was the first to fully map the active fault trace in the Santa Cruz Mountains using a combination of on-the-ground observations and airborne Light Detection and Ranging (LiDAR), a remote sensing technology. The Santa Cruz Mountains run for about 39 miles from south of San Francisco to near San Juan Bautista. Hazel Dell is east of Santa Cruz and north of Watsonville. "We found the first geologic evidence of surface rupture by what looks like the 1838 and 1890 earthquakes, as well as 1906," said Streig, whose introduction to major earthquakes came at age 11 during the 1989 Loma Prieta Earthquake on a deep sub-fault of the San Andreas Fault zone. That quake, which disrupted baseball's World Series, forced her family to camp outside their home. Unlike the 1906 quake that ruptured 470 kilometers (about 292 miles) of the fault, the 1838 and 1890 quakes ruptured shorter portions of the fault, possibly limited to the Santa Cruz Mountains. "This is the first time we have had good, clear geologic evidence of these historic 19th century earthquakes," she said. "It's important because it tells us that we had three surface ruptures, really closely spaced in time that all had fairly large displacements of at least half a meter and probably larger." The team identified ax-cut wood chips, tree stumps and charcoal fragments from early logging efforts in unexpectedly deep layers of sediment, 1.5 meters (five feet) below the ground, and documented evidence of three earthquakes since logging occurred at the site. 
The logging story emerged from 16 trenches dug in 2008, 2010 and 2011 along the fault at the Hazel Dell site in the mountain range. High-resolution radiocarbon dating of tree rings from the wood chips and charcoal confirms these are post-European deposits, and the geologic earthquake evidence coincides with written accounts describing local earthquake damage, including damage to Spanish missions in 1838, and with a USGS catalogue of 1890 earthquakes compiled by an astronomer at Lick Observatory. Additionally, individuals living near the Hazel Dell site reported to geologists in 1906 that cracks from that year's earthquake had opened just where cracks had appeared 16 years earlier, in 1890 -- a quake that, Streig and colleagues say, was probably centered in the Hazel Dell region. Another displacement of sediment at the Hazel Dell site matched the timeline of the 1906 quake. The project also allowed the team to conclude that another historically reported quake, in 1865, was not surface rupturing; it was probably deep and, like the 1989 event, occurred on a sub-fault of the San Andreas Fault zone. Conventional thinking, Streig said, has suggested that the San Andreas Fault always ruptures in a long-reaching fashion similar to the 1906 earthquake. This study, however, points to more regionally confined ruptures as well. "This all tells us that there are more frequent surface-rupturing earthquakes on this section of the fault than have been previously identified, certainly in the historic period," Streig said. "This becomes important to earthquake models because it is saying something about the connectivity of all these fault sections -- and how they might link up." The frequency of the quakes in the Santa Cruz Mountains, she added, must have made the 68-year period a terrifying one for settlers. "This study is the first to show three historic ruptures on the San Andreas Fault outside the special case of Parkfield," Weldon said, referring to a region in mountains to the south of the Santa Cruz range where six magnitude 6-plus earthquakes occurred between 1857 and 1966. "The earthquakes of 1838 and 1890 were known to be somewhere nearby from shaking, but now we know the San Andreas Fault ruptured three times on the same piece of the fault in less than 100 years." More broadly, Weldon said, with multiple paleoseismic sites close together on a major fault, geologists now realize that interpretations gleaned from single-site evidence probably aren't reliable. "We need to spend more time reproducing or confirming results rather than rushing to the next fault if we are going to get it right," he said. "Ashley's combination of historical research, C-14 dating, tree rings, pollen and stratigraphic correlation between sites has allowed us to credibly argue for precision that allows identification of the 1838 and 1890 earthquakes." "Researchers at the University of Oregon are using tools and technologies to further our understanding of the dynamic forces that continue to shape our planet and impact its people," said Kimberly Andrews Espy, vice president for research and innovation and dean of the UO Graduate School. "This research furthers our understanding of the connectivity of the various sections of California's San Andreas Fault and has the potential to save lives by leading to more accurate earthquake modeling." The U.S. Geological Survey funded the research through grants 08-HQ-GR-0071, 08-HQ-GR-0072, G10AP00064, G10AP0065 and G11AP20123.
A Geological Society of America Student Research Grant to Streig funded the age-dating of the team's evidence at the Lawrence Livermore National Laboratory's Center for Accelerator Mass Spectrometry. | Earthquakes | 2,014 |
January 29, 2014 | https://www.sciencedaily.com/releases/2014/01/140129114925.htm | Large, deep magma chamber discovered below Kilauea volcano: Largely unknown internal plumbing of volcanoes | A new study led by scientists at the University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science uncovered a previously unknown magma chamber deep below the most active volcano in the world -- Kilauea. This is the first geophysical observation that large magma chambers exist in the deeper parts of the volcano system. | Scientists analyzed the seismic waves that travel through the volcano to understand the internal structure of the volcanic system. Using the seismic data, the researchers developed a three-dimensional velocity model of a magma anomaly to determine the size, depth and composition of the magma chamber, which is several kilometers in diameter and located at a depth of 8-11 km (5-6.8 miles). "It was known before that Kilauea had small, shallow magma chambers," said Guoqing Lin, UM Rosenstiel School assistant professor of geology and geophysics and lead author of the study. "This study is the first geophysical observation that large magma chambers exist in the deep oceanic crust below." The study also showed that the deep chamber is composed of "magma mush," a mixture of 10 percent magma and 90 percent rock. The crustal magma reservoir below Kilauea is similar to those widely observed beneath volcanoes located at mid-ocean ridges. "Understanding these magma bodies is a high priority because of the hazard posed by the volcano," said Falk Amelung, co-author and professor of geology and geophysics at the UM Rosenstiel School. "Kilauea volcano produces many small earthquakes, and paying particular attention to new seismic activity near this body will help us to better understand where future lava eruptions will come from." Scientists are still unraveling the mysteries of the deep internal network of magma chambers and lava tubes of Kilauea, which has been in continuous eruption for more than 30 years and is currently the most active volcano in the world. | Earthquakes | 2,014
January 27, 2014 | https://www.sciencedaily.com/releases/2014/01/140127093207.htm | Is there an ocean beneath our feet? Ocean water may reach upper mantle through deep sea faults | Scientists at the University of Liverpool have shown that deep sea fault zones could transport much larger amounts of water from Earth's oceans to the upper mantle than previously thought. | Water is carried to the mantle by deep sea fault zones which penetrate the oceanic plate as it bends into the subduction zone. Subduction, where an oceanic tectonic plate is forced beneath another plate, causes large earthquakes such as the recent Tohoku earthquake, as well as many earthquakes that occur hundreds of kilometers below Earth's surface. Seismologists at Liverpool have estimated that over the age of Earth, the Japan subduction zone alone could have transported the equivalent of up to three and a half times the water of all Earth's oceans to the mantle. Using seismic modelling techniques, the researchers analysed earthquakes which occurred more than 100 km below Earth's surface in the Wadati-Benioff zone, a plane of earthquakes that occur in the oceanic plate as it sinks deep into the mantle. Analysis of the seismic waves from these earthquakes shows that they occurred on 1-2 km wide fault zones with low seismic velocities. Seismic waves travel slower in these fault zones than in the rest of the subducting plate because the sea water that percolated through the faults reacted with the oceanic rocks to form serpentinite -- a mineral that contains water. Some of the water carried to the mantle by these hydrated fault zones is released as the tectonic plate heats up. This water causes the mantle material to melt, feeding volcanoes above the subduction zone such as those that form the Pacific 'ring of fire'. Some water is transported deeper into the mantle and is stored in the deep Earth. "It has been known for a long time that subducting plates carry oceanic water to the mantle," said Tom Garth, a PhD student in the Earthquake Seismology research group led by Professor Andreas Rietbrock. "This water causes melting in the mantle, which leads to arc volcanism, releasing some of the water back into the atmosphere. Part of the subducted water however is carried deeper into the mantle and may be stored there. We found that fault zones that form in the deep oceanic trench offshore Northern Japan persist to depths of up to 150 km. These hydrated fault zones can carry large amounts of water, suggesting that subduction zones carry much more water from the ocean down to the mantle than has previously been suggested. This supports the theory that there are large amounts of water stored deep in the Earth." Understanding how much water is delivered to the mantle contributes to knowledge of how the mantle convects and how it melts, which helps to understand how plate tectonics began and how the continental crust was formed. | Earthquakes | 2,014
January 23, 2014 | https://www.sciencedaily.com/releases/2014/01/140123141949.htm | Los Angeles would experience stronger-than-expected ground motion in major earthquake, virtual earthquake generator shows | Stanford scientists are using weak vibrations generated by Earth's oceans to produce "virtual earthquakes" that can be used to predict the ground movement and shaking hazard to buildings from real quakes. | The new technique is detailed in a study published Jan. 24. "We used our virtual earthquake approach to reconstruct large earthquakes on the southern San Andreas Fault and studied the responses of the urban environment of Los Angeles to such earthquakes," said lead author Marine Denolle, who recently received her PhD in geophysics from Stanford and is now at the Scripps Institution of Oceanography in San Diego. The new technique capitalizes on the fact that earthquakes aren't the only sources of seismic waves. "If you put a seismometer in the ground and there's no earthquake, what do you record? It turns out that you record something," said study leader Greg Beroza, a geophysics professor at Stanford. What the instruments will pick up is a weak, continuous signal known as the ambient seismic field. This omnipresent field is generated by ocean waves interacting with the solid Earth. When the waves collide with each other, they generate a pressure pulse that travels through the ocean to the sea floor and into Earth's crust. "These waves are billions of times weaker than the seismic waves generated by earthquakes," Beroza said. Scientists have known about the ambient seismic field for about 100 years, but it was largely considered a nuisance because it interferes with their ability to study earthquakes. The tenuous seismic waves that make up this field propagate every which way through the crust. But in the past decade, seismologists have developed signal-processing techniques that allow them to isolate certain waves; in particular, those traveling through one seismometer and then another one downstream. Denolle built upon these techniques and devised a way to make these ambient seismic waves function as proxies for seismic waves generated by real earthquakes. By studying how the ambient waves moved underground, the researchers were able to predict the behavior of much stronger waves from powerful earthquakes. She began by installing several seismometers along the San Andreas Fault to specifically measure ambient seismic waves. Employing data from the seismometers, the group then used mathematical techniques they developed to make the waves appear as if they originated deep within Earth. This was done to correct for the fact that the seismometers Denolle installed were located at Earth's surface, whereas real earthquakes occur at depth. In the study, the team used their virtual earthquake approach to confirm the accuracy of a prediction, made in 2006 by supercomputer simulations, that if the southern San Andreas Fault section of California were to rupture and spawn an earthquake, some of the seismic waves traveling northward would be funneled toward Los Angeles along a 60-mile-long (100-kilometer-long) natural conduit that connects the city with the San Bernardino Valley.
This passageway is composed mostly of sediments and acts to amplify and direct waves toward the Los Angeles region. Until now, there was no way to test whether this funneling action, known as the waveguide-to-basin effect, actually takes place, because a major quake has not occurred along that particular section of the San Andreas Fault in more than 150 years. The virtual earthquake approach also predicts that seismic waves will become further amplified when they reach Los Angeles because the city sits atop a large sedimentary basin. To understand why this occurs, study coauthor Eric Dunham, an assistant professor of geophysics at Stanford, said to imagine taking a block of plastic foam, cutting out a bowl-shaped hole in the middle, and filling the cavity with gelatin. In this analogy, the plastic foam is a stand-in for rocks, while the gelatin is like sediments, or dirt. "The gelatin is floppier and a lot more compliant. If you shake the whole thing, you're going to get some motion in the Styrofoam, but most of what you're going to see is the basin oscillating," Dunham said. As a result, the scientists say, Los Angeles could be at risk for stronger, and more variable, ground motion if a large earthquake -- magnitude 7.0 or greater -- were to occur along the southern San Andreas Fault, near the Salton Sea. "The seismic waves are essentially guided into the sedimentary basin that underlies Los Angeles," Beroza said. "Once there, the waves reverberate and are amplified, causing stronger shaking than would otherwise occur." Beroza's group is planning to test the virtual earthquake approach in other cities around the world that are built atop sedimentary basins, such as Tokyo, Mexico City, Seattle and parts of the San Francisco Bay Area. "All of these cities are earthquake threatened, and all of them have an extra threat because of the basin amplification effect," Beroza said. Because the technique is relatively inexpensive, it could also be useful for forecasting ground motion in developing countries. "You don't need large supercomputers to run the simulations," Denolle said. In addition to studying earthquakes that have yet to occur, the technique could also be used as a kind of "seismological time machine" to recreate the seismic signatures of temblors that shook Earth long ago, according to Beroza. "For an earthquake that occurred 200 years ago, if you know where the fault was, you could deploy instruments, go through this procedure, and generate seismograms for earthquakes that occurred before seismographs were invented," he said. German Prieto, an assistant professor of geophysics at the Massachusetts Institute of Technology and a Stanford alumnus, also contributed to the research. | Earthquakes | 2,014
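The core signal-processing idea behind the virtual earthquake approach can be illustrated in a few lines. The sketch below is not the Stanford group's code; the station geometry, delay and noise data are invented assumptions, chosen only to show how cross-correlating ambient noise recorded at two stations recovers the travel time between them:

```python
# Minimal sketch of ambient-noise cross-correlation (illustrative only;
# synthetic data, assumed 4 s inter-station travel time).
import numpy as np
from scipy.signal import correlate

rng = np.random.default_rng(0)
fs = 20.0                          # sampling rate in Hz (assumed)
n = int(3600 * fs)                 # one hour of ambient noise

# The same diffuse noise field reaches station B a few seconds after
# station A; each station also records its own local noise.
field = rng.standard_normal(n)
delay = int(4.0 * fs)              # assumed 4 s travel time A -> B
station_a = field + 0.5 * rng.standard_normal(n)
station_b = np.roll(field, delay) + 0.5 * rng.standard_normal(n)

# Cross-correlation concentrates the diffuse noise energy at the
# inter-station travel time, approximating the impulse response
# (Green's function) that a real source at A would produce at B.
xcorr = correlate(station_b, station_a, mode="full", method="fft")
lags = np.arange(-n + 1, n) / fs
print(f"recovered travel time: {lags[np.argmax(xcorr)]:.2f} s")  # ~4.00 s
```

In the actual study, such noise-derived impulse responses between many station pairs stand in for the waves a deep earthquake source would radiate, which is why the method needs no supercomputer.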
January 22, 2014 | https://www.sciencedaily.com/releases/2014/01/140122091734.htm | 40 percent of minors in Lorca suffer post-traumatic stress a year after earthquake | Spanish researchers have analysed the impact of the Lorca catastrophe in terms of the percentage of minors suffering post-traumatic stress. Results reveal that 55% of young people displayed this disorder a month on from the earthquake and 40% were still suffering a year later. | On 11 May 2011, Lorca suffered an earthquake measuring 5.1, preceded by another of 4.5, which killed nine people and caused significant material damage. Two experts from the University of Murcia compared the prevalence of post-traumatic stress disorder (PTSD) among minors in the region in its acute phase (one month after the quake) and as a chronic condition (one year later). "The analysis indicates that 55% of the minors suffered from post-traumatic stress one month after the earthquake, while after one year this had decreased to 40%," explained Concepción López Soler, researcher from the University of Murcia and co-author of the study with Juan José López García. The results, published in the journal Gaceta Sanitaria, reveal that 75% of the minors presented re-experiencing symptoms (recurrent thoughts, nightmares and physiological manifestations) after one month and 60% after one year. In addition, a month later, 42% suffered from avoidance of anything related to the tragedy (memory disturbances, emotional blockage), falling to 24% after one year. 51% also displayed hyperarousal (sleeping difficulties, irritability and concentration problems) after a month and 38% a year later. The authors note that post-traumatic reactions generally tend to disappear over time. "After the earthquake, new mental health resources were implemented to assist people with severe post-traumatic stress," stresses López Soler. Primary school pupils in their 3rd and 6th year at educational centres in the municipal area were asked for their voluntary participation in this study. One month after the earthquake the level of PTSD was assessed in 495 minors, and in 374 after one year. "It is important to highlight that the younger age groups and girls are more sensitive to developing these symptoms, which coincides with the results of other studies," the Murcian researcher points out. "Young girls in particular are a special risk group." Among the younger students, 54% of girls showed symptoms of post-traumatic stress compared to 39% of boys. The evaluation was carried out according to the criteria of the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV-TR), using the Child PTSD Symptom Scale (CPSS), a questionnaire developed to evaluate post-traumatic stress in minors after the 1994 Northridge earthquake (Los Angeles, USA). "Natural disasters create a sense of loss of personal safety and endangered survival among the population," adds López Soler. Earthquakes are among the disasters which cause the greatest psychological disturbances in the population, and PTSD is the reaction most associated with adverse conditions. "With previous earthquakes, the prevalence in the affected population has been quite variable," she states.
Three years after the Turkey earthquake in 1999, the prevalence of PTSD was 59%; 18 months after the earthquake in Kashmir (between India and Pakistan), it was 64.8%; and ten months after the disaster in L'Aquila (Italy) in 2009, it was greater than 60%. In comparison with these studies, the prevalence of PTSD in Lorca is somewhat lower, which the authors attribute both to the lesser intensity of the earthquake and its consequences and to the swift normalisation of the environment. | Earthquakes | 2,014
January 20, 2014 | https://www.sciencedaily.com/releases/2014/01/140120173444.htm | Vancouver: Nearby Georgia basin may amplify ground shaking from next quake | Tall buildings, bridges and other long-period structures in Greater Vancouver may experience greater shaking from large (M 6.8+) earthquakes than previously thought, due to the amplification of surface waves passing through the Georgia basin, according to two new studies. | "For very stiff soils, current building codes don't include amplification of ground motion," said lead author Sheri Molnar, a researcher at the University of British Columbia. "While the building codes say there should not be any increase or decrease in ground motion, our results show that there could be an average amplification of up to a factor of three or four in Greater Vancouver." The research provides the first detailed studies of 3D earthquake ground motion for a sedimentary basin in Canada. Since no large crustal earthquakes have occurred in the area since the installation of a local seismic network, these studies offer refined predictions of ground motion from the large crustal earthquakes likely to occur. Southwestern British Columbia is situated above the seismically active Cascadia subduction zone. In this complex tectonic region, earthquakes occur in three zones: on the thrust fault interface where the Juan de Fuca plate slides beneath the North America plate; within the overriding North America plate; and within the subducting Juan de Fuca plate. Molnar and her colleagues investigated the effect the three-dimensional (3D) deep basin beneath Greater Vancouver has on the earthquake-generated waves that pass through it. The Georgia basin is one in a series of basins spanning from California to southern Alaska along the Pacific margin of North America, and it is relatively wide and shallow. The basin is filled with sedimentary layers of silts, sands and glacial deposits. While previous research suggested how approximately 100 meters of material near the surface would affect ground shaking, no studies had looked at the effect of the 3D basin structure on long-period seismic waves. To fill that gap in knowledge, Molnar and colleagues performed numerical modeling of wave propagation, using various scenarios for both shallow quakes (5 km in depth) within the North America plate and deep quakes (40-55 km in depth) within the subducting Juan de Fuca plate, the latter being the most common type of earthquake. The authors did not focus on earthquakes generated by a megathrust rupture of the Cascadia subduction zone, a scenario studied previously by co-author Kim Olsen of San Diego State University. For these two studies, the authors modeled 10 scenario earthquakes in the subducting plate and 8 shallow crustal earthquakes within the North America plate, assuming rupture sites based on known seismicity. The computational analyses suggest the basin distorts the seismic radiation pattern -- how the energy moves through the basin -- and produces a larger area of higher ground motions. Steep basin edges excite the seismic waves, amplifying the ground motion. The largest surface waves generated across Greater Vancouver are associated with earthquakes located approximately 80 km or more south-southwest of the city, the authors suggest. "The results were an eye opener," said Molnar. "Because of the 3D basin structure, there's greater hazard since it will amplify ground shaking.
Now we have a grasp of how much the basin increases ground shaking for the most likely future large earthquakes." In Greater Vancouver, there are more than 700 commercial and residential buildings 12 stories or taller, and large structures -- high-rise buildings, bridges and pipelines -- are more affected by long-period seismic waves, or long-wavelength shaking. "That's where these results have impact," said Molnar. | Earthquakes | 2,014
January 13, 2014 | https://www.sciencedaily.com/releases/2014/01/140113104835.htm | Building 'belt' offers cheap, quick repair of earthquake damage | Four years after the January 2010 earthquake, 145,000 people still remain homeless in Haiti. A cheap and simple technology to repair earthquake-damaged buildings -- developed at the University of Sheffield -- could help to reduce these delays by quickly making buildings safe and habitable. | Recent tests showed that a damaged building repaired using the technique could withstand a major earthquake similar in scale and proximity to the one that collapsed buildings during the Haiti disaster. The technology involves wrapping metal straps around each floor of the building, which are then tensioned either by hand or using compressed-air tools. It is designed for use on reinforced concrete frame buildings -- a common construction technique around the world, including in countries like Haiti. Unlike other repair methods, it does not require expensive materials or a high level of technical knowledge, making it ideal for use in the developing world. Lead researcher Professor Kypros Pilakoutas explains: "The strapping works very much like a weight-lifter's belt, by keeping everything tightly compressed to reduce tension on the concrete columns of the structure. Concrete works well under compression, but not when pulled under tension, and this is why it has to be reinforced for use in construction. When the reinforcement is faulty or damaged, it can be very expensive to repair. Our method not only makes the building stable again very quickly, but it increases the building's ability to deform without breaking, making it more able to withstand further earthquake movement." The team tested the technique on a full-scale, two-storey building, built according to an old European standard with inadequate reinforcing to withstand earthquakes. This construction is typical of many buildings in the developing world, as well as of many Mediterranean buildings built before the 1980s. The building was constructed on a specially designed 'shaking table' which can simulate ground movement caused by earthquakes. During the first test, the building came very near collapse following a small simulated earthquake, similar in scale to a magnitude 4 on the Richter scale and having about 10,000 times less energy than the Haiti earthquake. The building was then repaired using the post-tensioned metal straps and retested. The researchers were unable to make the repaired building fail during a simulated major earthquake similar in scale to the magnitude 7 Haiti earthquake at its epicentre, and they stopped the test at that point. Professor Pilakoutas hopes the new technology will not only speed up the response to major earthquakes but could also prevent the damage happening in the first place. The cost of the materials for a typical small building column is about £20, and it would take a crew of two people around 2 hours to complete the strengthening. For a typical small dwelling with 6 columns, the seismic rehabilitation would cost around £200 and could be completed in a few days, rather than costing several thousand pounds and taking months with traditional rehabilitation techniques such as jacketing with steel plates or concrete. "Ideally, governments shouldn't wait until a disaster happens, but should be identifying buildings at risk and taking steps to make them strong enough to withstand any future earthquakes," he says.
"Because this method causes minimal disruption and is cheap to apply, it's ideal for bringing existing buildings up to standard -- both in the developing world and in earthquake risk areas in Europe as well."Video: | Earthquakes | 2,014 |
January 6, 2014 | https://www.sciencedaily.com/releases/2014/01/140106094428.htm | Supervolcano triggers recreated in X-ray laboratory | Scientists have reproduced the conditions inside the magma chamber of a supervolcano to understand what it takes to trigger its explosion. These rare events represent the biggest natural catastrophes on Earth except for the impact of giant meteorites. Using synchrotron X-rays, the scientists established that supervolcano eruptions may occur spontaneously, driven only by magma pressure without the need for an external trigger. The results have now been published. | The team was led by Wim Malfait and Carmen Sanchez-Valle of ETH Zurich (Switzerland) and comprised scientists from the Paul Scherrer Institute in Villigen (Switzerland), Okayama University (Japan), the Laboratory of Geology of CNRS, Université Lyon 1 and ENS Lyon (France) and the European Synchrotron (ESRF) in Grenoble (France). A well-known supervolcano eruption occurred 600,000 years ago in Wyoming in the United States, creating a huge crater called a caldera in the centre of what today is Yellowstone National Park. When the volcano exploded, it ejected more than 1,000 cubic kilometres of material. According to a 2005 report by the Geological Society of London, "Even science fiction cannot produce a credible mechanism for averting a super-eruption. We can, however, work to better understand the mechanisms involved in super-eruptions, with the goal of being able to predict them ahead of time and provide a warning for society. Preparedness is the key to mitigation of the disastrous effects of a super-eruption." The mechanisms that trigger supervolcano eruptions have remained elusive to date. The main reason is that the processes inside a supervolcano are different from those in conventional volcanoes like Mt. Pinatubo, which are better understood. A supervolcano possesses a much larger magma chamber, and it is always located in an area where the heat flow from the interior of Earth to the surface is very high. As a consequence, the magma chamber is very large and hot but also plastic: its shape changes as a function of the pressure as it gradually fills with hot magma. This plasticity allows the pressure to dissipate more efficiently than in a normal volcano, whose magma chamber is more rigid. Supervolcanoes therefore do not erupt very often. So what changes in the lead-up to an eruption? Wim Malfait explains: "The driving force is an additional pressure which is caused by the different densities of solid rock and liquid magma. It is comparable to a football filled with air under water, which is forced upwards by the denser water around it." Whether this additional pressure alone could eventually become sufficiently high to crack Earth's crust, leading to a violent eruption, or whether an external energy source like an earthquake is required, has only now been answered. Whilst it is virtually impossible to drill a hole into the magma chamber of a supervolcano, given the depth at which these chambers are buried, one can simulate these extreme conditions in the laboratory. "The synchrotron X-rays at the ESRF can then be used to probe the state -- liquid or solid -- and the change in density when magma crystallises into rock," says Mohamed Mezouar, scientist at the ESRF and member of the team.
Jean-Philippe Perrillat from the Laboratory of Geology of CNRS, Université Lyon 1 and ENS Lyon adds: "Temperatures of up to 1,700 degrees and pressures of up to 36,000 atmospheres can be reached inside the so-called Paris-Edinburgh press, where speck-sized rock samples are placed between the tips of two tungsten carbide anvils and then heated with a resistive furnace. This special set-up was used to accurately determine the density of the liquid magma over a wide range of pressures and temperatures." Magma often includes water, which as vapour adds additional pressure; the scientists therefore also determined magma densities as a function of water content. The results of their measurements showed that the pressure resulting from the density difference between solid rock and liquid magma is sufficient in itself to crack more than ten kilometres of Earth's crust above the magma chamber. Carmen Sanchez-Valle concludes: "Our research has shown that the pressure is actually large enough for Earth's crust to break. The magma penetrating into the cracks will eventually reach Earth's surface, even in the absence of water or carbon dioxide bubbles in the magma. As it rises to the surface, the magma will expand violently, which is the well-known origin of a volcanic explosion." | Earthquakes | 2,014
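A back-of-the-envelope calculation makes the "football under water" effect concrete. The density values below are illustrative assumptions, not figures measured in the study; the point is only the shape of the estimate:

```python
# Buoyancy overpressure from the rock/magma density contrast
# (illustrative assumed values, not the study's measurements).
rho_rock = 2700.0     # density of solid crustal rock, kg/m^3 (assumed)
rho_magma = 2450.0    # density of liquid magma, kg/m^3 (assumed)
g = 9.81              # gravitational acceleration, m/s^2
h = 10_000.0          # vertical extent of the magma column, m (~10 km)

# Overpressure at the chamber roof due to the density difference:
overpressure = (rho_rock - rho_magma) * g * h      # in pascals
print(f"buoyancy overpressure: {overpressure / 1e6:.0f} MPa")  # ~25 MPa

# The tensile strength of crustal rock is typically of order 10 MPa,
# so an overpressure of this size can fracture the chamber roof on
# its own -- the study's central conclusion.
```

The larger the chamber and the bigger the measured density contrast, the larger this pressure becomes, which is why the team's density measurements were the key to the result.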
January 6, 2014 | https://www.sciencedaily.com/releases/2014/01/140106094206.htm | Mine landslide triggered earthquakes: Record-breaking slide would bury Central Park 66 feet deep | Last year's gigantic landslide at a Utah copper mine probably was the biggest nonvolcanic slide in North America's modern history, and included two rock avalanches that happened 90 minutes apart and surprisingly triggered 16 small earthquakes, University of Utah scientists discovered. | The landslide -- which moved at an average of almost 70 mph and reached estimated speeds of at least 100 mph -- left a deposit so large it "would cover New York's Central Park with about 20 meters (66 feet) of debris," the researchers report in a January 2014 cover study published by the Geological Society of America. While earthquakes regularly trigger landslides, the gigantic landslide the night of April 10, 2013, is the first known to have triggered quakes. The slide occurred in the form of two huge rock avalanches at 9:30 p.m. and 11:05 p.m. MDT at Rio Tinto-Kennecott Utah Copper's open-pit Bingham Canyon Mine, 20 miles southwest of downtown Salt Lake City. Each rock avalanche lasted about 90 seconds. While the slides were not quakes, they were measured on seismic scales as having magnitudes up to 5.1 and 4.9, respectively. The subsequent real quakes were smaller. Kennecott officials closely monitor movements in the 107-year-old mine -- which produces 25 percent of the copper used in the United States -- and they recognized signs of increasing instability in the months before the slide, closing and removing a visitor center on the south edge of the 2.8-mile-wide, 3,182-foot-deep open pit, which the company claims is the world's largest humanmade excavation. Landslides -- including those at open-pit mines but excluding quake-triggered slides -- killed more than 32,000 people during 2004-2011, the researchers say. But no one was hurt or died in the Bingham Canyon slide. The slide damaged or destroyed 14 haul trucks and three shovels and closed the mine's main access ramp until November. "This is really a geotechnical monitoring success story," says the new study's first author, Kris Pankow, associate director of the University of Utah Seismograph Stations and a research associate professor of geology and geophysics. "No one was killed, and yet now we have this rich dataset to learn more about landslides." There have been much bigger human-caused landslides on other continents, and much bigger prehistoric slides in North America, including one about five times larger than Bingham Canyon some 8,000 years ago at the mouth of Utah's Zion Canyon. But the Bingham Canyon Mine slide "is probably the largest nonvolcanic landslide in modern North American history," said study co-author Jeff Moore, an assistant professor of geology and geophysics at the University of Utah. There have been numerous larger, mostly prehistoric slides -- some hundreds of times larger. Even the landslide portion of the 1980 Mount St. Helens eruption was 57 times larger than the Bingham Canyon slide. News reports initially put the landslide cost at close to $1 billion, but that may end up lower because Kennecott has gotten the mine back in operation faster than expected. Until now, the most expensive U.S.
landslide was the 1983 Thistle slide in Utah, which cost an estimated $460 million to $940 million because the town of Thistle was abandoned, train tracks and highways were relocated and a drainage tunnel was built. Pankow and Moore conducted the study with several colleagues from the university's College of Mines and Earth Sciences: J. Mark Hale, an information specialist at the Seismograph Stations; Keith Koper, director of the Seismograph Stations; Tex Kubacki, a graduate student in mining engineering; Katherine Whidden, a research seismologist; and Michael K. McCarter, professor of mining engineering. The study was funded by state of Utah support of the University of Utah Seismograph Stations and by the U.S. Geological Survey. The University of Utah researchers say the Bingham Canyon slide was among the best-recorded in history, making it a treasure trove of data for studying slides. Kennecott has estimated the landslide weighed 165 million tons. The new study estimated the slide came from a volume of rock roughly 55 million cubic meters (1.9 billion cubic feet). Rock in a landslide breaks up and expands, so Moore estimated the landslide deposit had a volume of 65 million cubic meters (2.3 billion cubic feet). Moore calculated that the deposit not only would bury Central Park 66 feet deep, but also is equivalent to the amount of material in 21 of Egypt's great pyramids of Giza. The landslide's two rock avalanches were not earthquakes but, like mine collapses and nuclear explosions, they were recorded on seismographs, and their magnitudes were calculated on three different scales: surface-wave, Richter and coda magnitudes. Pankow says the larger surface-wave magnitudes more accurately reflect the energy released by the rock avalanches, but the smaller Richter magnitudes better reflect what people felt -- or didn't feel, since the Seismograph Stations didn't receive any such reports. That's because the larger surface-wave magnitudes record low-frequency energy, while Richter and coda magnitudes are based on high-frequency seismic waves that people usually feel during real quakes. So in terms of ground movements people might feel, the rock avalanches "felt like 2.5," Pankow says. "If this was a normal tectonic earthquake of magnitude 5, all three magnitude scales would give us similar answers." The slides were detected throughout the Utah seismic network, including its most distant station some 250 miles south on the Utah-Arizona border, Pankow says. The second rock avalanche was followed immediately by a real earthquake measuring 2.5 in Richter magnitude and 3.0 in coda magnitude, then by three smaller quakes -- all less than one-half mile below the bottom of the mine pit. The Utah researchers sped up the recorded seismic data by 30 times to create an audio file in which the second part of the slide is heard as a deep rumbling, followed by sharp gunshot-like bangs from three of the subsequent quakes. Later analysis revealed another 12 tiny quakes -- measuring from 0.5 to minus 0.8 Richter magnitude. (A minus 1 magnitude has one-tenth the power of a hand grenade.) Six of these tiny tremors occurred between the two parts of the landslide, five happened during the two days after the slide, and one was detected 10 days later, on April 20. No quakes were detected during the 10 days before the double landslide. "We don't know of any case until now where landslides have been shown to trigger earthquakes," Moore says.
"It's quite commonly the reverse."The landslide, from top to bottom, fell 2,790 vertical feet, but its runout -- the distance the slide traveled -- was almost 10,072 feet, or just less than two miles."It was a bedrock landslide that had a characteristically fast and long runout -- much longer than we would see for smaller rockfalls and rockslides," Moore says.While no one was present to measure the speed, rock avalanches typically move about 70 mph to 110 mph, while the fastest moved a quickly as 220 mph.So at Bingham Canyon, "we can safely say the material was probably traveling at least 100 mph as it fell down the steepest part of the slope," Moore says.The researchers don't know why the slide happened as two rock avalanches instead of one, but Moore says, "A huge volume like this can fail in one episode or in 10 episodes over hours."The Seismograph Stations also recorded infrasound waves from the landslide, which Pankow says are "sound waves traveling through the atmosphere that we don't hear" because their frequencies are so low.Both seismic and infrasound recordings detected differences between the landslide's two rock avalanches. For example, the first avalanche had stronger peak energy at the end that was lacking in the second slide, Pankow says."We'd like to be able to use data like this to understand the physics of these large landslides," Moore says.The seismic and infrasound recordings suggest the two rock avalanches were similar in volume, but photos indicate the first slide contained more bedrock, while the second slide contained a higher proportion of mined waste rock -- although both avalanches were predominantly bedrock. | Earthquakes | 2,014 |
January 2, 2014 | https://www.sciencedaily.com/releases/2014/01/140102133219.htm | Longmenshan fault zone still hazardous, suggest new reports | The 60-kilometer segment of the fault northeast of the 2013 Lushan rupture is the place in the region to watch for the next major earthquake, according to newly published research. | Guest edited by Huajian Yao, professor of geophysics at the University of Science and Technology of China, the special section includes eight articles that present current data, descriptions and preliminary analysis of the Lushan event and discuss the potential for future earthquakes in the region. More than 87,000 people were killed or went missing as a result of the 2008 magnitude 7.9 Wenchuan earthquake in China's Sichuan province, the largest quake to hit China since 1950. In 2013, the Lushan quake occurred ~90 km to the south and caused 203 deaths, injured 11,492 and affected more than 1.5 million people. "After the 2008 magnitude 7.9 Wenchuan earthquake along the Longmenshan fault zone in western Sichuan of China, researchers in China and elsewhere have paid particular attention to this region, seeking to understand how the seismic hazard potential changed in the southern segment of the fault and nearby faults," said Yao. "Yet the occurrence of this magnitude 6.6 Lushan event surprised many. The challenge of understanding where and when the next big quake will occur after a devastating seismic event continues after this Lushan event, although we have now gained much more information about this area." The southern part of the Longmenshan fault zone is complex and still only moderately understood. Like the central segment where the 2008 Wenchuan event occurred, the southern segment, which generated the Lushan rupture, includes a series of sub-parallel secondary faults: the Wenchuan-Maoxian fault, the Beichuan-Yingxiu fault, the Pengxian-Guanxian fault and the Dayi faults. Although the Lushan earthquake's mainshock did not break to the surface, the strong shaking still caused significant damage and casualties in the epicentral region. Three papers detail the rupture process of the Lushan quake. Libo Han from the China Earthquake Administration and colleagues provide a preliminary analysis of the Lushan mainshock and two large aftershocks, which appear to have occurred in the upper crust and terminated at a depth of approximately 8 km. While the Lushan earthquake cannot be associated with any identified surface faults, Han and colleagues suggest the quake may have occurred on a blind thrust fault subparallel to the Dayi fault, which lies at and partly defines the edge of the Chengdu basin. Based on observations from extensive trenching and mapping of fault activity after both the Wenchuan and Lushan earthquakes, Chen Lichun and colleagues from the China Earthquake Administration suggest the Lushan quake spread in a "piggyback fashion" toward the Sichuan basin, but with weaker activity and lower seismogenic potential than the Wenchuan quake. And Junju Xie, from the China Earthquake Administration and Beijing University of Technology, and colleagues examined the vertical and horizontal near-source strong motion from the Mw 6.8 Lushan earthquake. The vertical ground motion is relatively weak for this event, likely because the seismic energy dissipated at depths of 12-25 km and the rupture did not break through to the ground surface. Were the Lushan and Wenchuan earthquakes related? And if so, what is the relationship?
Some researchers consider the Lushan quake to be a strong aftershock of the Wenchuan quake, while others see them as independent events. In this special section, researchers tackled the question from various perspectives. To discover whether the Lushan earthquake was truly independent of the Wenchuan quake, researchers need an accurate picture of where the Lushan quake originated. Yong Zhang from the GFZ German Research Centre for Geosciences and the China Earthquake Administration and colleagues begin this process by confirming a new hypocenter for Lushan. To find this place where the fault first began to rupture, the researchers analyzed near-fault strong-motion data (movements recorded up to a few tens of kilometers away from the fault) as well as long-distance (thousands of kilometers) teleseismic data. Using their newly calculated location for the hypocenter, Zhang and colleagues now agree with earlier studies suggesting the initial Lushan rupture was a circular rupture event with no predominant direction. But they note that their calculations place the major slip area of the Lushan quake about 40 to 50 kilometers away from the southwest end of the Wenchuan quake fault. This "gap" between the two faults may hold increased seismic hazards, caution Zhang and colleagues. Ke Jia of Beijing University and colleagues explore the relationship of the two quakes with a statistical analysis of aftershocks in the region, as well as the evolution of shear stress in the lower crust and upper mantle in the broader quake region. Their analyses suggest that the Wenchuan quake did affect the Lushan quake in an immediate sense by changing the overall background seismicity in the region. If these changes in background seismicity are taken into account, the researchers calculate a 62 percent probability that Lushan is a strong aftershock of Wenchuan. Similarly, Yanzhao Wang from the China Earthquake Administration and colleagues quantified the stress loading of area faults due to the Wenchuan quake and suggest the change in stress may have caused the Lushan quake to rupture approximately 28.4 to 59.3 years earlier than expected. They conclude that the Lushan earthquake is at least 85 percent a delayed aftershock of the Wenchuan earthquake, rather than due solely to long-term tectonic loading. After the Wenchuan quake, researchers immediately began calculating stress changes on the major faults surrounding the rupture zone, in part to identify where dangerous aftershocks might occur and to test how well these stress-change calculations might work to predict new earthquakes. As part of these analyses, Tom Parsons of the U.S. Geological Survey and Margarita Segou of GeoAzur compared data collected from the Wenchuan and Lushan quakes with data on aftershocks and stress change in four other major earthquakes: the M 7.4 Landers and Izmit quakes in California and Turkey, respectively, the M 7.9 Denali quake in Alaska and the M 7.1 Canterbury quake in New Zealand. Their comparisons reveal that strong aftershocks similar to Lushan are likely to occur where there is the highest overall aftershock activity, where stress change is the greatest, and on well-developed fault zones. But they also note that by these criteria, the Lushan quake would have been predicted only by stress changes, and not by the clustering of aftershocks following the 2008 Wenchuan event. After Wenchuan and Lushan, where should seismologists and others look for the next big quake in the region?
After the 2008 Wenchuan quake, seismologists were primed with data to help predict where and when the next rupture might occur in the region. The data suggested that the Wenchuan event would increase seismic stress on the southern Longmenshan fault, the site of the 2013 Lushan quake. But that information alone could not predict that the southern Longmenshan fault would be the next to rupture after Wenchuan, say Mian Liu of the University of Missouri and colleagues, because the Wenchuan earthquake also increased the stress on numerous other faults in the region. Additional insights can be gained from seismic moment studies, according to Liu and colleagues. Moment balancing compares how much seismic strain energy is accumulated along a fault over a certain period with the amount of strain energy released over the same period. In the case of the Longmenshan fault, there had been a slow accumulation of strain energy, without release by a major seismic event, for more than a millennium. After the Wenchuan quake, the southern part of the Longmenshan fault became the fault with the greatest potential for a quake. And now, after Lushan, Liu and colleagues say that the 60-kilometer-long segment of the fault northeast of the Lushan rupture is the place in the region to watch for the next major earthquake. | Earthquakes | 2,014
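The moment-balancing idea Liu and colleagues describe can be sketched in a few lines. Every parameter below is an assumed round number chosen to show the shape of the calculation, not the authors' values:

```python
# Illustrative moment-balance sketch (all parameter values assumed).
import math

mu = 3.0e10        # crustal shear modulus, Pa (standard assumption)
length = 60e3      # length of the fault segment named above, m
depth = 15e3       # assumed seismogenic depth, m
slip_rate = 3e-3   # assumed long-term slip rate, m/yr
years = 1000.0     # assumed time since the last major release, yr

# Accumulated, unreleased seismic moment on the segment:
m0 = mu * (length * depth) * (slip_rate * years)   # N*m

# Standard conversion from moment to moment magnitude:
mw = (2.0 / 3.0) * (math.log10(m0) - 9.1)
print(f"moment deficit {m0:.1e} N*m -> equivalent Mw {mw:.1f}")  # ~Mw 7.2
```

Comparing such an accumulated-moment estimate with the moment actually released by recorded earthquakes is what singles out a segment as overdue.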
January 2, 2014 | https://www.sciencedaily.com/releases/2014/01/140102133217.htm | Earthquake lights linked to rift environments, subvertical faults | Rare earthquake lights are more likely to occur on or near rift environments, where subvertical faults allow stress-induced electrical currents to flow rapidly to the surface, according to a new study. | From the early days of seismology, the luminous phenomena associated with some earthquakes have intrigued scholars. Earthquake lights (EQL) appear before or during earthquakes, but rarely after. EQL take a variety of forms, including spheres of light floating through the air. Seconds before the 2009 L'Aquila, Italy, earthquake struck, pedestrians saw 10-centimeter-high flames of light flickering above the stone-paved Francesco Crispi Avenue in the town's historical city center. On Nov. 12, 1988, a bright purple-pink globe of light moved through the sky along the St. Lawrence River near the city of Quebec, 11 days before a powerful quake. And in 1906, about 100 km northwest of San Francisco, a couple saw streams of light running along the ground two nights before that region's great earthquake. Continental rift environments now appear to be the common factor associated with EQL. In a detailed study of 65 documented EQL cases since 1600 A.D., 85 percent appeared spatially on or near rifts, and 97 percent appeared adjacent to subvertical faults (a rift, a graben, strike-slip or transform fault). Intraplate faults are associated with just 5 percent of Earth's seismic activity, but with 97 percent of documented cases of earthquake lights. "The numbers are striking and unexpected," said Robert Thériault, a geologist with the Ministère des Ressources Naturelles of Québec, who, along with colleagues, culled centuries of literature references, limiting the cases in this study to 65 of the best-documented events in the Americas and Europe. "We don't know quite yet why more earthquake light events are related to rift environments than other types of faults," said Thériault, "but unlike other faults that may dip at a 30-35 degree angle, such as in subduction zones, subvertical faults characterize the rift environments in these cases." Two of the 65 EQL events are associated with subduction zones, but Thériault suggests an unknown subvertical fault may be present there. "We may not know the fault distribution beneath the ground," said Thériault. "We have some idea of surface structures, but sedimentary layers or water may obscure the underlying fault structure." While the 65 earthquakes ranged in magnitude from M 3.6 to 9.2, 80 percent were greater than M 5.0. The EQL varied in shape and extent, though they most commonly appeared as globular luminous masses, either stationary or moving, as atmospheric illuminations or as flame-like luminosities issuing from the ground. Timing and distance to the epicenter vary widely. Most EQL are seen before and/or during an earthquake, but rarely after, suggesting to the authors that the processes responsible for EQL formation are related to a rapid build-up of stress prior to fault rupture and to rapid local stress changes during the propagation of the seismic waves. Stress-activated mobile electronic charge carriers, termed positive holes, flow swiftly along stress gradients. Upon reaching the surface, they ionize air molecules and generate the observed luminosities. Eyewitness reports and security cameras captured a large number of light flashes during the 2007 Pisco, Peru, M 8.0 earthquake.
Together with seismic records obtained on a local university campus, the automatic security camera recordings allow exact timing and location of the light flashes that illuminated a large portion of the night sky. The light flashes identified as EQL coincided with the passage of the seismic waves. Thériault likes the account of a local L'Aquila resident who, after seeing flashes of light from inside his home two hours before the main shock, rushed his family outside to safety. "It's one of the very few documented accounts of someone acting on the presence of earthquake lights," said Thériault. "Earthquake lights as a pre-earthquake phenomenon, in combination with other types of parameters that vary prior to seismic activity, may one day help forecast the approach of a major quake." | Earthquakes | 2,014
December 22, 2013 | https://www.sciencedaily.com/releases/2013/12/131222160024.htm | Scientists anticipated size and location of 2012 Costa Rica earthquake | Scientists using GPS to study changes in Earth's shape accurately forecasted the size and location of the magnitude 7.6 Nicoya earthquake that occurred in 2012 in Costa Rica. | The Nicoya Peninsula in Costa Rica is one of the few places where land sits atop the portion of a subduction zone where Earth's greatest earthquakes take place. Costa Rica's location therefore makes it the perfect spot for learning how large earthquakes rupture. Because earthquakes greater than about magnitude 7.5 have occurred in this region roughly every 50 years, with the previous event striking in 1950, scientists had been preparing for this earthquake through a number of geophysical studies. The most recent study used GPS to map out the area along the fault storing energy for release in a large earthquake. "This is the first place where we've been able to map out the likely extent of an earthquake rupture along the subduction megathrust beforehand," said Andrew Newman, an associate professor in the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology. The study was published online Dec. 22, 2013. Subduction zones are locations where one tectonic plate is forced under another one. The collision of tectonic plates during this process can unleash devastating earthquakes, and sometimes devastating tsunamis. The magnitude 9.0 earthquake off the coast of Japan in 2011 occurred at just such a subduction zone. The Cascadia subduction zone in the Pacific Northwest is capable of unleashing a similarly sized quake. Damage from the Nicoya earthquake, however, was not as bad as might be expected from a magnitude 7.6 quake. "Fortunately there was very little damage considering the earthquake's size," said Marino Protti of OVSICORI, the study's lead author. "The historical pattern of earthquakes not only allowed us to get our instruments ready, it also allowed Costa Ricans to upgrade their buildings to be earthquake safe." Plate tectonics is the driving force for subduction zones. As tectonic plates converge, strain temporarily accumulates across the plate boundary when portions of the interface between these tectonic plates, called a megathrust, become locked together. The strain can accumulate to dangerous levels before eventually being released as a massive earthquake. "The Nicoya Peninsula is an ideal natural lab for studying these events, because the coastline geometry uniquely allows us to get our equipment close to the zone of active strain accumulation," said Susan Schwartz, professor of earth sciences at the University of California, Santa Cruz, and a co-author of the study. Through a series of studies starting in the early 1990s using land-based tools, the researchers mapped regions where the tectonic plates were completely locked along the subduction interface. Detailed geophysical observations of the region allowed the researchers to create an image of where the fault had locked. The researchers published a study a few months before the earthquake describing the particular locked patch with the clearest potential for the next large earthquake in the region.
The team projected the total amount of energy that could have accumulated across that region and forecasted that, if the locking had remained similar since the last major earthquake in 1950, there was presently enough energy stored for an earthquake on the order of magnitude 7.8. Because of limits in technology and scientific understanding of the processes controlling fault locking and release, scientists cannot say much about precisely where or when earthquakes will occur. However, earthquakes in Nicoya have occurred about every 50 years, so seismologists had been anticipating another one around 2000, give or take 20 years, Newman said. The earthquake occurred in September 2012 as a magnitude 7.6 quake. "It occurred right in the area we determined to be locked, and it had almost the size we expected," Newman said. The researchers hope to apply what they've learned in Costa Rica to other environments. Virtually every damaging subduction zone earthquake occurs far offshore. "Nicoya is the only place on Earth where we've actually been able to get a very accurate image of the locked patch, because it occurs directly under land," Newman said. "If we really want to understand the seismic potential for most of the world, we have to go offshore." Scientists have been able to reasonably map portions of these locked areas offshore using data collected on land, but the resolution is poor, particularly in the regions that are most responsible for generating tsunamis, Newman said. He hopes that his group's work in Nicoya will be a driver for geodetic studies on the seafloor to observe such Earth deformation; these seafloor geodetic studies are rare and expensive today. "If we want to understand the potential for large earthquakes, then we really need to start doing more seafloor observations," Newman said. "It's a growing push in our community and this study highlights the type of results that one might be able to obtain for most other dangerous environments, including offshore the Pacific Northwest." | Earthquakes | 2,013
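The forecast (magnitude 7.8) and the actual event (magnitude 7.6) can be compared in energy terms with the standard moment-magnitude relation. This is illustrative arithmetic, not a calculation from the study:

```python
# Compare forecast vs. actual event via seismic moment (illustrative).
def seismic_moment(mw):
    """Seismic moment in N*m from moment magnitude: M0 = 10^(1.5*Mw + 9.1)."""
    return 10 ** (1.5 * mw + 9.1)

ratio = seismic_moment(7.8) / seismic_moment(7.6)
print(f"moment ratio, Mw 7.8 vs 7.6: {ratio:.1f}")  # ~2.0
# A 7.8 releases roughly twice the moment of a 7.6, so the 2012 quake
# released about half the projected stored energy -- close enough to
# count as "almost the size we expected."
```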
December 20, 2013 | https://www.sciencedaily.com/releases/2013/12/131220113357.htm | An earthquake or a snow avalanche has its own shape | Predicting earthquakes or snow avalanches is difficult, but for purposes such as reducing risk it is very important to know whether an avalanche event will be big or small. Researchers from Aalto University in Finland, together with colleagues from Oslo and ENS Lyon, have found that such events -- including the acoustic bursts emitted by tearing paper -- have a typical shape, independent of whether they are big or small. | What one observes is crucial, however -- paper fracture and the avalanching of snow behave differently. The results have just been published. Avalanches of snow or earthquakes can be described in other ways than the well-known Gutenberg-Richter scale, which predicts how likely a big avalanche or event is. Each avalanche or burst has its own typical shape, which tells, for instance, when most snow is sliding after the avalanche has started. The shape can be predicted from mathematical models, or one can identify the right model by looking at the measured shape. "We studied results from computer simulations and found different kinds of event shapes," Mikko Alava explains. "We then analyzed them with pen and paper and together with our experimental collaborators, and concluded that our predictions for the avalanche shapes were correct." The results allow experiments to be compared with simplified model systems in much greater depth. The whole shape of an avalanche holds much more information than, say, the Gutenberg-Richter index, even when that is supplemented with a few other so-called critical exponents. | Earthquakes | 2,013
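For readers curious how a "typical shape" is extracted in practice, below is a minimal sketch of the standard procedure in crackling-noise analysis: detect bursts above a threshold, rescale each burst onto a common time axis, and average. The signal and threshold here are illustrative stand-ins, not the authors' data.

```python
# Minimal sketch of average avalanche-shape extraction from a crackling signal
# (e.g., acoustic emission from tearing paper). Illustrative assumptions only.
import numpy as np

def avalanche_shapes(signal, threshold, n_bins=50):
    """Collect above-threshold bursts and rescale each onto a common time axis."""
    active = signal > threshold
    edges = np.diff(active.astype(int))
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    if active[0]:                       # handle a burst already in progress
        starts = np.insert(starts, 0, 0)
    if active[-1]:                      # handle a burst cut off at the end
        ends = np.append(ends, len(signal))
    shapes = []
    for s, e in zip(starts, ends):
        if e - s < 3:                   # too short to resample meaningfully
            continue
        t = np.linspace(0.0, 1.0, n_bins)
        burst = signal[s:e] - threshold
        shapes.append(np.interp(t, np.linspace(0.0, 1.0, e - s), burst))
    return np.array(shapes)

rng = np.random.default_rng(0)
noise = np.abs(rng.normal(size=100_000))   # stand-in for a measured signal
shapes = avalanche_shapes(noise, threshold=1.5)
mean_shape = shapes.mean(axis=0)           # the "typical form" of an event
print(len(shapes), "events; mean shape peaks at bin", int(mean_shape.argmax()))
```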
December 12, 2013 | https://www.sciencedaily.com/releases/2013/12/131212100142.htm | Global map predicts locations for giant earthquakes | A team of international researchers, led by Monash University's Associate Professor Wouter Schellart, has developed a new global map of subduction zones, illustrating which ones are predicted to be capable of generating giant earthquakes and which ones are not. | The new research has just been published. In recent years two giant earthquakes have occurred at subduction zones, one in Chile in February 2010 and one in Japan in March 2011; both caused massive destruction, killed many thousands of people and resulted in billions of dollars of damage. Most earthquakes occur at the boundaries between tectonic plates that cover the Earth's surface. The largest earthquakes on Earth only occur at subduction zones, plate boundaries where one plate sinks (subducts) below the other into the Earth's interior. So far, seismologists have recorded giant earthquakes for only a limited number of subduction zone segments. But accurate seismological records go back to only ~1900, and the recurrence time of giant earthquakes can be many hundreds of years. "The main question is, are all subduction segments capable of generating giant earthquakes, or only some of them? And if only a limited number of them, then how can we identify these?" Dr Schellart said. Dr Schellart, of the School of Geosciences, and Professor Nick Rawlinson from the University of Aberdeen in Scotland used earthquake data going back to 1900 and data from subduction zones to map the main characteristics of all active subduction zones on Earth. They investigated whether those subduction segments that have experienced a giant earthquake share commonalities in their physical, geometrical and geological properties. They found that the main indicators include the style of deformation in the plate overlying the subduction zone, the level of stress at the subduction zone, the dip angle of the subduction zone, as well as the curvature of the subduction zone plate boundary and the rate at which it moves. Through these findings Dr Schellart has identified several subduction zone regions capable of generating giant earthquakes, including the Lesser Antilles, Mexico-Central America, Greece, the Makran, Sunda, North Sulawesi and Hikurangi. "For the Australian region, subduction zones of particular significance are the Sunda subduction zone, running from the Andaman Islands along Sumatra and Java to Sumba, and the Hikurangi subduction segment offshore the east coast of the North Island of New Zealand. Our research predicts that these zones are capable of producing giant earthquakes," Dr Schellart said. "Our work also predicts that several other subduction segments that surround eastern Australia (New Britain, San Cristobal, New Hebrides, Tonga, Puysegur) are not capable of producing giant earthquakes." | Earthquakes | 2,013
December 11, 2013 | https://www.sciencedaily.com/releases/2013/12/131211093834.htm | Runaway process drives intermediate-depth earthquakes | Stanford researchers have uncovered a vital clue about the mechanism behind a type of earthquake that originates deep within Earth and accounts for a quarter of all temblors worldwide, some of which are strong enough to pose a safety hazard. | Stanford scientists may have solved the mystery of what drives a type of earthquake that occurs deep within Earth and accounts for one in four quakes worldwide. Known as intermediate-depth earthquakes, these temblors originate farther down inside Earth than shallow earthquakes, which take place in the uppermost layer of Earth's surface, called the crust. The kinds of quakes that afflict California and most other places in the world are shallow earthquakes. "Intermediate-depth earthquakes occur at depths of about 30 miles down to about 190 miles," said Greg Beroza, a professor of geophysics at Stanford and a coauthor of a new study that will be published in an upcoming issue of the journal Geophysical Research Letters. Unlike shallow earthquakes, the cause of intermediate quakes is not well-understood. Part of the problem is that the mechanism for shallow earthquakes should not physically work for quakes at greater depths. "Shallow earthquakes occur when stress building up at faults overcomes friction, resulting in sudden slip and energy release," Beroza said. "That mechanism shouldn't work at the higher pressures and temperatures at which intermediate-depth earthquakes occur." A better understanding of intermediate-depth quakes could help scientists forecast where they will occur and the risk they pose to buildings and people. "They represent 25 percent of the catalog of earthquakes, and some of them are large enough to produce damage and deaths," said study first author Germán Prieto, an assistant professor of geophysics at the Massachusetts Institute of Technology. There are two main hypotheses for what may be driving intermediate-depth earthquakes. According to one idea, water is squeezed out of rock pores at extreme depths and the liquid acts like a lubricant to facilitate fault sliding. This fits with the finding that intermediate quakes generally occur at sites where one tectonic plate is sliding, or subducting, beneath another. "Typically, subduction involves oceanic plates whose rocks contain lots of water," Beroza said. A competing idea is that as rocks at extreme depths deform, they generate heat due to friction. The heated rocks become more malleable, or plastic, and as a result slide more easily against each other. This can create a positive feedback loop that further weakens the rock and increases the likelihood of fault slippage. "It's a runaway process in which the increasing heat generates more slip, and more slip generates more heat and so on," Prieto said. To distinguish between the two possible mechanisms, the scientists studied a site near the city of Bucaramanga in Colombia that boasts the highest concentration of intermediate quakes in the world. About 18 intermediate-depth temblors rattle Bucaramanga every day. Most are magnitude 2 to 3, weak quakes that are detectable only by sensitive instruments. But about once a month one occurs that is magnitude 5 or greater -- strong enough to be felt by the city's residents. 
Moreover, past studies have revealed that most of the quakes appear to be concentrated at a site located about 90 miles beneath Earth's surface that scientists call the Bucaramanga Nest. This type of clustering is highly unusual and makes the Bucaramanga Nest a "natural laboratory" for studying intermediate-depth earthquakes. Comparison studies of intermediate quakes from different parts of the world are difficult because the makeup of Earth's crust and mantle can vary widely by location. In the Bucaramanga Nest, however, the intermediate quakes are so closely packed together that for the purposes of scientific studies and computer models, it's as if they all occurred at the same spot. This vastly simplifies calculations, Beroza said. "When comparing a magnitude 2 and a magnitude 5 intermediate-depth earthquake that are far apart, you have to model everything, including differences in the makeup of the Earth's surface," he said. "But if they're close together, you can assume that the seismic waves of both quakes suffered the same distortions as they traveled toward the Earth's surface." By studying seismic waves picked up by digital seismometers installed on Earth's surface above the Bucaramanga Nest, the scientists were able to measure two key parameters of the intermediate quakes happening deep underground. One, called the stress drop, allowed the team to estimate the total amount of energy released during the fault slips that caused the earthquakes. The other was radiated energy, which is a measure of how much of the energy generated by the fault slip is actually converted to seismic waves that propagate through Earth to shake the surface. Two things immediately stood out to the researchers. One was that the stress drop for intermediate quakes increased along with their magnitudes. That is, larger intermediate quakes released proportionally more total energy than smaller ones. Second, the amount of radiated energy released by intermediate earthquakes accounted for only a tiny portion of the total energy as calculated by the stress drop. "For these intermediate-depth earthquakes in Colombia, the amount of energy converted to seismic waves is only a small fraction of the total energy," Beroza said. The implication is that intermediate earthquakes are expending most of their energy locally, likely in the form of heat. "This is compelling evidence for a thermal runaway failure mechanism for intermediate earthquakes, in which a slipping fault generates heat. That allows for more slip and even more heat, and a positive feedback loop is created," said study coauthor Sarah Barrett, a Stanford graduate student in Beroza's research group. | Earthquakes | 2,013
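The two measured parameters can be tied together with a standard Brune-type source calculation. The sketch below is illustrative only: the corner frequency, shear-wave speed, shear modulus and radiated energy are assumed values for a hypothetical magnitude-5 intermediate-depth event, not numbers from the study.

```python
# Illustrative Brune-model link between stress drop and radiated energy.
# All inputs are assumptions for a hypothetical Mw 5.0 event at depth.
import math

m0 = 10 ** (1.5 * 5.0 + 9.1)   # seismic moment of an Mw 5.0 event, N*m
beta = 4600.0                  # shear-wave speed at ~150 km depth, m/s (assumed)
fc = 1.2                       # corner frequency, Hz (assumed)
mu = 6.7e10                    # shear modulus at depth, Pa (assumed)
er = 2.0e10                    # measured radiated seismic energy, J (assumed)

r = 0.37 * beta / fc                        # Brune source radius, m
stress_drop = 7.0 / 16.0 * m0 / r**3        # stress drop, Pa
available = stress_drop / (2.0 * mu) * m0   # rough total released energy, J
efficiency = er / available                 # fraction radiated as seismic waves
print(f"stress drop ~ {stress_drop/1e6:.0f} MPa, "
      f"radiation efficiency ~ {efficiency:.2f}")   # a small fraction, ~0.01
```

With these assumed inputs the radiated fraction comes out near one percent, the kind of small ratio that points to most of the energy being spent locally as heat.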
December 5, 2013 | https://www.sciencedaily.com/releases/2013/12/131205141316.htm | Deep-sea study reveals cause of 2011 tsunami: Unusually thin, slippery geological fault found | The devastating tsunami that struck Japan's Tohoku region in March 2011 was touched off by a submarine earthquake far more massive than anything geologists had expected in that zone. | Now, a team of scientists, including McGill University geologist Christie Rowe, has published a set of studies examining what made the quake so massive. Prof. Rowe, of McGill's Department of Earth & Planetary Sciences, was one of 27 scientists from 10 countries who participated in a 50-day expedition in 2012 on the Japanese drilling vessel Chikyu. The team drilled three holes in the Japan Trench area to study the rupture zone of the 2011 earthquake, a fault in the ocean floor where two of Earth's major tectonic plates meet, deep beneath the surface of the Pacific Ocean. The joint where the Pacific and North American plates meet forms what is known as a "subduction" zone, with the North American plate riding over the edge of the Pacific plate. The latter plate bends and plunges deep into Earth, forming the Japan Trench. The conventional view among geologists has been that deep beneath the seafloor, where rocks are strong, movements of the plates can generate a lot of elastic rebound. Closer to the surface of the seafloor, where rocks are softer and less compressed, this rebound effect was thought to taper off. Until 2011, the largest displacement of plates ever recorded along a fault occurred in 1960 off the coast of Chile, where a powerful earthquake displaced the seafloor plates by an average of 20 metres. In the Tohoku earthquake, the slip amounted to 30 to 50 metres -- and the slip actually grew bigger as the subterranean rupture approached the seafloor. This runaway rupture thrust up the seafloor, touching off the horrifying tsunami. The results of last year's drilling by the Chikyu expedition point to several unusual features of the fault. For one thing, the fault itself is very thin -- less than five metres thick in the area sampled. "To our knowledge, it's the thinnest plate boundary on Earth," Rowe says. By contrast, California's San Andreas fault is several kilometers thick in places. The scientists also discovered that the clay deposits that fill the narrow fault are made of extremely fine sediment. "It's the slipperiest clay you can imagine," says Rowe. "If you rub it between your fingers, it feels like a lubricant." The discovery of this unusual clay in the Tohoku slip zone suggests that other subduction zones in the northwest Pacific where this type of clay is present -- from Russia's Kamchatka peninsula to the Aleutian Islands -- may be capable of generating similar, huge earthquakes, Rowe adds. To conduct the studies, the scientists used specially designed deep-water drilling equipment that enabled them to drill more than 800 metres beneath the sea floor, in an area where the water is around 6,900 metres deep. No hole had ever before been drilled that deep in an area of similar water depth. At those extraordinary depths, it took six hours from the time the drill pulled core samples from the fault until it reached the ship. During night shifts on deck, Rowe was in charge of deciding which sections of drill core would go to geochemists for water sampling, and which would go to geologists for studies of the sediment and deformation structures. 
"We X-rayed the core as soon as it came on board, so the geochemists could get their water sample before oxygen was able to penetrate inside the pores of the sediment."The expedition was supported by member countries of the Integrated Ocean Drilling Program (particularly Japan and the US), and Canadian participants were supported by the European Consortium for Ocean Research Drilling, of which Canada is a member. | Earthquakes | 2,013 |
December 1, 2013 | https://www.sciencedaily.com/releases/2013/12/131201140151.htm | What drives aftershocks? | On 27 February 2010 an earthquake of magnitude 8.8 struck South-Central Chile near the town of Maule. The main shock displaced the subduction interface by up to 16 meters. As is usual after strong earthquakes, a series of aftershocks followed in the region, decreasing in size over the next months. A surprising result came from an afterslip study: up to 2 meters of additional slip occurred along the plate interface within just 420 days, in a pulse-like fashion and without associated seismicity. An international research group led by GFZ analysed the main shock as well as the following postseismic phase with a dense network of instruments including more than 60 high-resolution GPS stations. | The aftershocks and the newly discovered "silent" afterslip are key to understanding the processes that occur after strong earthquakes. The GPS data in combination with seismological data allowed for the first time a comparative analysis: are aftershocks triggered solely by stress transfer from the main shock, or are additional mechanisms active? "Our results suggest that the classic view of stress relaxation due to aftershocks is too simple," says Jonathan Bedford from GFZ of the new observation: "Areas with large stress transfer do not correlate with aftershocks in all magnitude classes as hitherto assumed, and stress shadows show surprisingly high seismic activity." One conclusion is that local processes along the plate interface that are not detectable at the surface by GPS monitoring have a significant effect on the local stress field. Pressurized fluids in the crust and mantle could be the agent here. As suspected previously, the main shock and aftershocks might have generated permeabilities in the source region which are explored by hydrous fluids. This affects the local stress field, triggering aftershocks rather independently of the large-scale, main-shock-induced stress transfer. The present study provides evidence for such a mechanism. Volume (3D) seismic tomography, which is sensitive to fluid pressure changes, in combination with GPS monitoring will make it possible to better track the evolution of such processes. The main shock was due to a rupture of the interface between the Nazca and the South American plates. Aftershocks are associated with hazards as they can be of similar size to the main shock and, in contrast to the latter, much shallower in the crust. | Earthquakes | 2,013
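The "classic view" the study challenges is usually quantified with the Coulomb failure stress change. Below is a minimal sketch with illustrative stress values; the pore-pressure term is the knob by which pressurized fluids can trigger aftershocks even inside a stress shadow.

```python
# Minimal sketch of the Coulomb failure stress change used in classic
# aftershock-triggering analyses. Input values are illustrative assumptions.
def coulomb_stress_change(d_shear, d_normal, friction=0.4, d_pore_pressure=0.0):
    """dCFS = d_tau + mu * (d_sigma_n + dP); positive values promote failure.

    d_shear:         shear stress change in the slip direction, MPa
    d_normal:        normal stress change, MPa (unclamping positive)
    d_pore_pressure: pore-fluid pressure change, MPa
    """
    return d_shear + friction * (d_normal + d_pore_pressure)

# A receiver fault loaded purely by main-shock stress transfer...
print(coulomb_stress_change(d_shear=0.15, d_normal=-0.05))              # +0.13 MPa
# ...versus one in a "stress shadow" where a fluid-pressure pulse intervenes:
print(coulomb_stress_change(d_shear=-0.10, d_normal=0.0,
                            d_pore_pressure=0.5))                       # +0.10 MPa
```

The second case illustrates the paper's point: a positive pore-pressure change can push a fault toward failure even where the static stress transfer alone would discourage it.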
November 17, 2013 | https://www.sciencedaily.com/releases/2013/11/131117155609.htm | Volcano discovered smoldering under a kilometer of ice in West Antarctica: Heat may increase rate of ice loss | It wasn't what they were looking for but that only made the discovery all the more exciting. | In January 2010 a team of scientists had set up two crossing lines of seismographs across Marie Byrd Land in West Antarctica. It was the first time the scientists had deployed many instruments in the interior of the continent that could operate year-round even in the coldest parts of Antarctica. Like a giant CT machine, the seismograph array used disturbances created by distant earthquakes to make images of the ice and rock deep within West Antarctica. There were big questions to be asked and answered. The goal, says Doug Wiens, professor of earth and planetary science at Washington University in St. Louis and one of the project's principal investigators, was essentially to weigh the ice sheet to help reconstruct Antarctica's climate history. But to do this accurately the scientists had to know how Earth's mantle would respond to an ice burden, and that depended on whether it was hot and fluid or cool and viscous. The seismic data would allow them to map the mantle's properties. In the meantime, automated-event-detection software was put to work to comb the data for anything unusual. When it found two bursts of seismic events between January 2010 and March 2011, Wiens' PhD student Amanda Lough looked more closely to see what was rattling the continent's bones. Was it rock grinding on rock, ice groaning over ice, or, perhaps, hot gases and liquid rock forcing their way through cracks in a volcanic complex? Uncertain at first, the more Lough and her colleagues looked, the more convinced they became that a new volcano was forming a kilometer beneath the ice. The discovery of the new, as yet unnamed volcano was announced Nov. 17 in an advance online publication. The teams that install seismographs in Antarctica are given first crack at the data. Lough had done her bit as part of the WUSTL team, traveling to East Antarctica three times to install or remove stations. In 2010 many of the instruments were moved to West Antarctica and Wiens asked Lough to look at the seismic data coming in, the first large-scale dataset from this part of the continent. "I started seeing events that kept occurring at the same location, which was odd," Lough said. "Then I realized they were close to some mountains -- but not right on top of them." "My first thought was, 'Okay, maybe it's just coincidence.' But then I looked more closely and realized that the mountains were actually volcanoes and there was an age progression to the range. The volcanoes closest to the seismic events were the youngest ones." The events were weak and very low frequency, which strongly suggested they weren't tectonic in origin. While low-magnitude seismic events of tectonic origin typically have frequencies of 10 to 20 cycles per second, this shaking was dominated by frequencies of 2 to 4 cycles per second. But glacial processes can generate low-frequency events. If the events weren't tectonic, could they be glacial? To probe farther, Lough used a global computer model of seismic velocities to "relocate" the hypocenters of the events to account for the known seismic velocities along different paths through the Earth. 
This procedure collapsed the swarm clusters to a third their original size. It also showed that almost all of the events had occurred at depths of 25 to 40 kilometers (15 to 25 miles below the surface). This is extraordinarily deep -- deep enough to be near the boundary between Earth's crust and mantle, called the Moho, and it more or less rules out a glacial origin. It also casts doubt on a tectonic one. "A tectonic event might have a hypocenter 10 to 15 kilometers (6 to 9 miles) deep, but at 25 to 40 kilometers, these were way too deep," Lough says. A colleague suggested that the event waveforms looked like Deep Long Period earthquakes, or DLPs, which occur in volcanic areas, have the same frequency characteristics and are as deep. "Everything matches up," Lough says. The seismologists also talked to Duncan Young and Don Blankenship of the University of Texas, who fly airborne radar over Antarctica to produce topographic maps of the bedrock. "In these maps, you can see that there's elevation in the bed topography at the same location as the seismic events," Lough says. The radar images also showed a layer of ash buried under the ice. "They see this layer all around our group of earthquakes and only in this area," Lough says. "Their best guess is that it came from Mount Waesche, an existing volcano near Mt Sidley. But that is also interesting because scientists had no idea when Mount Waesche was last active, and the ash layer sets the age of the eruption at 8,000 years ago." The case for volcanic origin has been made. But what exactly is causing the seismic activity? "Most mountains in Antarctica are not volcanic," Wiens says, "but most in this area are. Is it because East and West Antarctica are slowly rifting apart? We don't know exactly. But we think there is probably a hot spot in the mantle here producing magma far beneath the surface." Will the volcano erupt? "Definitely," Lough says. "In fact, because the radar shows a mountain beneath the ice, I think it has erupted in the past, before the rumblings we recorded." The scientists calculated that an enormous eruption, one that released a thousand times more energy than the typical eruption, would be necessary to breach the ice above the volcano. On the other hand, a subglacial eruption and the accompanying heat flow will melt a lot of ice. "The volcano will create millions of gallons of water beneath the ice -- many lakes full," says Wiens. This water will rush beneath the ice towards the sea and feed into the hydrological catchment of the MacAyeal Ice Stream, one of several major ice streams draining ice from Marie Byrd Land into the Ross Ice Shelf. By lubricating the bedrock, it will speed the flow of the overlying ice, perhaps increasing the rate of ice-mass loss in West Antarctica. "We weren't expecting to find anything like this," Wiens says. | Earthquakes | 2,013
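The frequency argument above is easy to make concrete: compute the dominant frequency of a waveform and compare it against the ~10-20 Hz typical of small tectonic events versus the 2-4 Hz observed here. The synthetic waveform and the 5 Hz dividing line below are illustrative assumptions, not the study's classifier.

```python
# Sketch of the frequency-content test used to discriminate event types.
# The waveform and the classification threshold are illustrative assumptions.
import numpy as np

def dominant_frequency(waveform, sampling_rate):
    """Return the frequency (Hz) carrying the most spectral power."""
    spectrum = np.abs(np.fft.rfft(waveform)) ** 2
    freqs = np.fft.rfftfreq(len(waveform), d=1.0 / sampling_rate)
    return freqs[spectrum.argmax()]

fs = 100.0                                   # samples per second
t = np.arange(0, 20, 1 / fs)
# synthetic low-frequency event: 3 Hz carrier with a decaying envelope
event = np.exp(-t / 5.0) * np.sin(2 * np.pi * 3.0 * t)
f_dom = dominant_frequency(event, fs)
label = "volcanic/DLP-like" if f_dom < 5.0 else "tectonic-like"
print(f"dominant frequency: {f_dom:.1f} Hz -> {label}")
```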
November 4, 2013 | https://www.sciencedaily.com/releases/2013/11/131104152726.htm | Gas injection probably triggered small earthquakes near Snyder, Texas | A new study correlates a series of small earthquakes near Snyder, Texas between 2006 and 2011 with the underground injection of large volumes of gas, primarily carbon dioxide (CO2). | Although the study suggests that underground injection of gas triggered the Snyder earthquakes, it also points out that similar rates of injection have not triggered comparable quakes in other fields, bolstering the idea that underground gas injection does not cause significant seismic events in many geologic settings. No injuries or severe damage were reported from the quakes identified in the study. The study represents the first time underground gas injection has been correlated with earthquakes greater than magnitude 3. The results, from Wei Gan and Cliff Frohlich at The University of Texas at Austin's Institute for Geophysics, appear this week in an online journal edition. The study focused on an area of northwest Texas with three large oil and gas fields -- the Cogdell field, the Salt Creek field and the Scurry Area Canyon Reef Operators Committee unit (SACROC) -- which have all produced petroleum since the 1950s. Operators began injecting CO2 into the fields decades ago to boost petroleum production. This latest study was funded by the U.S. Geological Survey and the National Natural Science Foundation of China. Using a high-resolution temporary network of seismometers, Gan and Frohlich identified 93 earthquakes in the Cogdell area from March 2009 to December 2010, three of which were greater than magnitude 3. An even larger earthquake, with magnitude 4.4, occurred in Cogdell in September 2011. Using data on injections and extractions of fluids and gases, they concluded that the earthquakes were correlated with the increase in CO2 injection. "What's interesting is we have an example in Cogdell field, but there are other fields nearby that have experienced similar CO2 injection" without comparable seismicity, the researchers noted. Why the different fields responded so differently to CO2 injection remains to be fully explained. Frohlich suggests one possible explanation for the different response to gas injection in the three fields might be that there are geological faults in the Cogdell area that are primed and ready to move when pressures from large volumes of gas reduce friction on these faults. The other two fields might not have such faults. Frohlich suggests an important next step in understanding seismic risks for proposed carbon capture and storage (CCS) projects would be to create geological models of Cogdell and other nearby fields to better understand why they respond differently to gas injection. Gan and Frohlich analyzed seismic data collected between March 2009 and December 2010 by the EarthScope USArray Program, a National Science Foundation-funded network of broadband seismometers deployed from the Canadian border to the Gulf of Mexico. Because of the high density of instruments, they were able to detect earthquakes down to magnitude 1.5, too weak for people to feel at the surface and many of which were not detected by the U.S. Geological Survey's more limited seismic network. Using the USArray data, the researchers identified and located 93 well-recorded earthquakes. Most occurred in several northeast-southwest trending linear clusters, which might indicate the presence of previously unidentified faults. Three of the quakes identified in the USArray data were greater than magnitude 3. According to U.S. 
Geological Survey observations for the same area from 2006 to 2011, 18 earthquakes greater than magnitude 3 occurred in the study area. Gan and Frohlich also evaluated data on injections and extractions of oil, water and gas in the study area collected by the Texas Railroad Commission, the state agency that regulates oil and gas operations. Since 1990, rates of liquid injection and extraction, as well as gas produced, remained fairly constant in all three oil and gas fields. The only significant change was a substantial increase in injection rates of gas, primarily CO2. Previous work by Frohlich and others has shown that underground injection of liquids can induce earthquakes. This research was partially supported by the National Natural Science Foundation of China (Grant 41174076) and by the U.S. Geological Survey (Award G13AP00023). The University of Texas at Austin is committed to transparency and disclosure of all potential conflicts of interest of its researchers. Frohlich has no research support from the petroleum industry, but he has consulted for geophysical service companies concerning seismic risks for dams, power plants, water pipelines and petroleum fields. Gan has no research support from the petroleum industry. | Earthquakes | 2,013
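The core of such a correlation study can be sketched in a few lines: compare a time series of injection volumes against earthquake counts. The series below are fabricated stand-ins, not Texas Railroad Commission data, and a real analysis would also lag the series, test significance, and contrast fields with similar injection histories but no earthquakes.

```python
# Toy illustration of an injection-seismicity correlation test.
# Both series are fabricated stand-ins for monthly observations.
import numpy as np

rng = np.random.default_rng(1)
months = 60
# injection volume (arbitrary units) with a step increase halfway through
injection = np.concatenate([np.full(months // 2, 1.0), np.full(months // 2, 3.0)])
# earthquake counts that loosely track injection, with Poisson scatter
quakes = rng.poisson(lam=0.5 * injection)

r = np.corrcoef(injection, quakes)[0, 1]
print(f"Pearson r between injection volume and quake count: {r:.2f}")
```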
October 30, 2013 | https://www.sciencedaily.com/releases/2013/10/131030125621.htm | Improving earthquake early warning systems for California and Taiwan | Earthquake early warning systems may provide the public with crucial seconds to prepare for severe shaking. For California, a new study suggests upgrading current technology and relocating some seismic stations would improve warning times, particularly in areas poorly served by the existing network -- from south of the San Francisco Bay Area to northern Los Angeles, and north of the San Francisco Bay Area. | A separate case study focuses on the utility of low-cost sensors to create a high-density, effective network that can be used for issuing early warnings in Taiwan. Both studies appear in the November issue. "We know where most active faults are in California, and we can smartly place seismic stations to optimize the network," said Serdar Kuyuk, assistant professor of civil engineering at Sakarya University in Turkey, who conducted the California study while he was a post-doctoral fellow at the University of California (UC), Berkeley. Richard Allen, director of the Seismological Laboratory at UC Berkeley, is the co-author of this study. Japan started to build its earthquake early warning (EEW) system after the 1995 Kobe earthquake, and the system performed well during the 2011 magnitude 9 Tohoku-Oki earthquake. While the U.S. Geological Survey (USGS)/Caltech Southern California Seismic and TriNet Network in Southern California was upgraded in response to the 1994 Northridge quake, the U.S. is lagging behind Japan and other countries in developing a fully functional warning system. "We should not wait until another major quake before improving the early warning system," said Kuyuk. Noting California's recent law that calls for the creation of a statewide EEW system, Kuyuk says "the study is timely and highlights for policymakers where to deploy stations for optimal coverage." The approach maximizes the warning time and reduces the size of "blind zones" where no warning is possible, while also taking into account budgetary constraints. Earthquake early warning systems detect the initiation of an earthquake and issue alerts of possible forthcoming ground shaking. Seismic stations detect the energy from the compressional P-wave first, followed by the shear and surface waves, which cause the intense shaking and most damage. The warning time that any system generates depends on many factors, with the most important being the proximity of seismic stations to the earthquake epicenter. Once an alert is sent, the amount of warning time is a function of distance from the epicenter, where more distant locations receive more time. Areas in "blind zones" do not receive any warning prior to arrival of the more damaging S-wave. The goal, write Kuyuk and Allen, is to minimize the number of people and key infrastructure within the blind zone. For the more remote earthquakes, such as earthquakes offshore or in unpopulated regions, larger blind zones can be tolerated. "There are large blind zones between the Bay Area and Los Angeles where there are active faults," said Kuyuk. "Why? There are only 10 stations along the 150-mile section of the San Andreas Fault. Adding more stations would improve warning for people in these areas, as well as for people in LA and the Bay Area should an earthquake start somewhere in between." Adding stations may not be so simple, according to Allen. 
"While there is increasing enthusiasm from state and federal legislators to build the earthquake early warning system that the public wants," said Allen, "the reality of the USGS budget for the earthquake program means that it is becoming impossible to maintain the functionality of the existing network operated by the USGS and the universities."The USGS was recently forced to downgrade the telemetry of 58 of the stations in the San Francisco Bay Area in order to reduce costs," said Allen. "While our In California, the California Integrated Seismic Network (CISN) consists of multiple networks, with 2900 seismic stations at varying distances from each other, ranging from 2 to 100 km. Of the some 2900 stations, 377 are equipped to contribute to an EEW system.Kuyuk and Allen estimate 10 km is the ideal distance between seismic stations in areas along major faults or near major cities. For other areas, an interstation distance of 20 km would provide sufficient warning. The authors suggest greater density of stations and coverage could be achieved by upgrading technology used by the existing stations, integrating Nevada stations into the current network, relocating some existing stations and adding new ones to the network.The U.S. Geological Survey (USGS) and the Gordon and Betty Moore Foundation funded this study.In a separate study, Yih-Min Wu of National Taiwan University reports on the successful experiment to use low cost MEMS sensors to build a high-density seismic network to support an early warning system for Taiwan.MEMS accelerometers are tiny sensors used in common devices, such as smart phones and laptops. These sensors are relatively cheap and have proven to be sensitive detectors of ground motion, particularly from large earthquakes.The current EEW system in Taiwan consists of 109 seismic stations that can provide alerts within 20 seconds following the initial detection of an earthquake. Wu sought to reduce the time between earthquake and initial alert, thereby increasing the potential warning time.The EEW research group at National Taiwan University developed a P-wave alert device named "Palert" that uses MEMS accelerometers for onsite earthquake early warning, at one-tenth the cost of traditional strong motion instruments.From June 2012 to May 2013 Wu and his colleagues tested a network of 400 Palert devices deployed throughout Taiwan, primarily at elementary schools to take advantage of existing power and Internet connections and where they can be used to educate students about earthquake hazard mitigation.During the testing period, the Palert system functioned similarly to the existing EEW system, which consists of the conventional strong motion instruments. With four times as many stations, the Palert network can provide a detailed shaking map for damage assessments, which it did for the March 2013 magnitude 6.1 Nantou quake.Wu suggests the relatively low cost Palert device may have commercial potential and can be readily integrated into existing seismic networks to increase coverage density of EEW systems. In addition to China, Indonesia and Mexico, plans call for the Palert devices to be installed near New Delhi, India to test the feasibility of an EEW system there. | Earthquakes | 2,013 |
October 30, 2013 | https://www.sciencedaily.com/releases/2013/10/131030104114.htm | Google street view: Tool for recording earthquake damage | A scientist from Cologne University has used Google's online street view scans to document the damage caused by the 2009 L'Aquila earthquake and suggests that the database would be a useful tool for surveying damage caused by future earthquakes. The findings are published in the November issue. | The magnitude 6.3 2009 L'Aquila earthquake in the Italian Abruzzi Mountains caused widespread damage in the city and surrounding villages. In 2011 Klaus-G. Hinzen, a seismologist with Cologne University in Germany, and colleagues from Italy conducted a field survey, taking 3D laser scans to document earthquake-rotated objects. Later, Hinzen used Google Earth software to map the exact locations of numerous photos of damaged constructions. When consulting Google street views, he discovered the scans had been taken less than one year before the earthquake, providing an unexpected opportunity to compare the locations captured by the 2011 photos with the Google street view scans. Google Earth's aerial views have helped capture an overview of damage to L'Aquila and specific collapsed structures, but the Google street views show the details -- fractures, plaster breaks and collapsed walls. The scans help distinguish damage caused by the quake from pre-existing disrepair due to a lack of building maintenance. Hinzen suggests that any planned systematic survey of earthquake damage could benefit from the use of Google street view, if available for the area under investigation. | Earthquakes | 2,013
October 21, 2013 | https://www.sciencedaily.com/releases/2013/10/131021211710.htm | Earthquake-triggered landslides pose significant hazard for Seattle, new study details potential damage | A new study suggests the next big earthquake on the Seattle fault may cause devastating damage from landslides, greater than previously thought and beyond the areas currently defined as prone to landslides. The study was published online Oct. 22. | "A major quake along the Seattle fault is among the worst-case scenarios for the area since the fault runs just south of downtown. Our study shows the need for dedicated studies on seismically induced landsliding," said co-author Kate Allstadt, doctoral student at the University of Washington. Seattle is prone to strong shaking as it sits atop the Seattle Basin -- a deep sedimentary basin that amplifies ground motion and generates strong seismic waves that tend to increase the duration of the shaking. The broader region is vulnerable to earthquakes from multiple sources, including deep earthquakes within the subducted Juan de Fuca plate, offshore megathrust earthquakes on the Cascadia subduction zone and shallow crustal earthquakes within the North American Plate. For Seattle, a shallow crustal earthquake close to the city would be most damaging. The last major quake along the Seattle fault was in 900 AD, long before the city was established, though native people lived in the area. The earthquake triggered giant landslides along Lake Washington, causing entire blocks of forest to slide into the lake. "There's a kind of haunting precedence that tells us that we should pay attention to a large earthquake on this fault because it happened in the past," said Allstadt, who also serves as duty seismologist for the Pacific Northwest Seismic Network. John Vidale of the University of Washington and Art Frankel of the U.S. Geological Survey (USGS) are co-authors of the study, which was funded by the USGS. While landslides triggered by earthquakes have caused damage and casualties worldwide, they have not often been the subject of extensive quantitative study or fully incorporated into seismic hazard assessments, say the authors of this study, which looks at just one scenario among potentially hundreds for a major earthquake in the Seattle area. Dividing the area into a grid of 210-meter cells, the researchers simulated ground motion for a magnitude 7 Seattle fault earthquake, then further subdivided the grid into 5-meter cells and applied the anticipated amplification of shaking due to the shallow soil layers. This refined framework yielded some surprises. "One-third of the landslides triggered by our simulation were outside of the areas designated by the city as prone to landsliding," said Allstadt. "A lot of people assume that all landslides occur in the same areas, but those triggered by rainfall or human behavior have a different triggering mechanism than landslides caused by earthquakes, so we need dedicated studies." While soil saturation -- whether the soil is dry or saturated with water -- is the most important factor when analyzing the potential impact of landslides, the details of ground motion rank second. The amplification of ground shaking, the directivity of seismic energy and geological features that may affect ground motion are very important to the outcome of ground failure, say the authors. The authors stress that this is just one randomized scenario study of many potential earthquake scenarios that could strike the city. 
While the results do not delineate the exact areas that will be affected in a future earthquake, they do illustrate the extent of landsliding to expect for a similar event. The study suggests the southern half of the city and the coastal bluffs, many of which are developed, would be hardest hit. Depending upon the water saturation level of the soil at the time of the earthquake, several hundred to thousands of buildings could be affected citywide. For dry soil conditions, more than 1,000 buildings lie within all hazard zones, 400 of them in the two highest hazard designation zones. The analysis suggests landslides could also affect some inland slopes, threatening key transit routes and impeding recovery efforts. For saturated soil conditions, it is an order of magnitude worse, with 8,000 buildings within all hazard zones, 5,000 of those within the two highest hazard zones. These numbers reflect only the number of buildings in high-risk areas, not the number of buildings that would necessarily suffer damage. "The extra time we took to include the refined ground motion detail was worth it. It made a significant difference to our understanding of the potential damage to Seattle from seismically triggered landslides," said Allstadt, who would like to use the new framework to run many more scenarios to prepare for future earthquakes in Seattle. | Earthquakes | 2,013
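A common building block for this kind of grid-cell landslide hazard mapping -- though not necessarily the exact method of this study -- is Newmark's sliding-block analysis, which integrates ground acceleration above a slope's critical acceleration into a permanent displacement. The shaking record and critical accelerations below are toy values.

```python
# Newmark sliding-block sketch: a standard ingredient in seismic landslide
# hazard mapping. All inputs are illustrative, not values from the study.
import numpy as np

def newmark_displacement(accel, dt, a_crit):
    """Cumulative downslope displacement (m) of a rigid block on a slope.

    accel:  ground acceleration time series, m/s^2
    a_crit: critical (yield) acceleration of the slope, m/s^2
    """
    vel, disp = 0.0, 0.0
    for a in accel:
        if a > a_crit or vel > 0.0:          # block slides while it has velocity
            vel = max(vel + (a - a_crit) * dt, 0.0)
            disp += vel * dt
    return disp

dt = 0.01
t = np.arange(0, 30, dt)
accel = 2.5 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-t / 10.0)   # toy shaking
for ac in (0.5, 1.0, 2.0):                   # weak, moderate, strong slopes
    d = newmark_displacement(accel, dt, ac)
    print(f"a_crit={ac:.1f} m/s^2 -> displacement {d:.2f} m")
```

Mapping such a displacement for every cell, under both dry and saturated critical accelerations, is how a simulation like this one turns ground motion into counts of buildings at risk.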
October 10, 2013 | https://www.sciencedaily.com/releases/2013/10/131010142750.htm | Iron in Earth's core weakens before melting | The iron in the Earth's inner core weakens dramatically before it melts, explaining the unusual properties that exist in the moon-sized solid centre of our planet that have, up until now, been difficult to understand. | Scientists use seismic waves -- pulses of energy generated during earthquakes -- to measure what is happening in the Earth's inner core, which at 6000 km beneath our feet is completely inaccessible. Problematically for researchers, the results of seismic measurements consistently show that these waves move through the Earth's solid inner core at much slower speeds than predicted by experiments and simulations. Specifically, a type of seismic wave called a 'shear wave' moves particularly slowly through the Earth's core relative to the speed expected for the material -- mainly iron -- from which the core is made. Shear waves move through the body of the object in a transverse motion -- like waves in a rope, as opposed to waves moving through a slinky spring. Now, in a newly published paper, researchers report calculations showing that at temperatures up to 95% of what is needed to melt iron in the Earth's inner core, the speed of the seismic waves moving through the inner core decreases linearly but, after 95%, it drops dramatically. At about 99% of the melting temperature of iron, the team's calculated velocities agree with seismic data for the Earth's inner core. Since independent geophysical results suggest that the inner core is likely to be at 99-100% of its melting temperature, the results presented in this paper give a compelling explanation as to why the seismic wave velocities are lower than those predicted previously. Professor Lidunka Vočadlo, from the UCL Department of Earth Sciences and an author of the paper, said: "The Earth's deep interior still holds many mysteries that scientists are trying to unravel. The proposed mineral models for the inner core have always shown a faster wave speed than that observed in seismic data. This mismatch has given rise to several complex theories about the state and evolution of the Earth's core." The authors stress that this is not the end of the story, as other factors need to be taken into account before a definitive core model can be made. As well as iron, the core contains nickel and light elements, such as silicon and sulphur. Professor Vočadlo said: "The strong pre-melting effects in iron shown in our paper are an exciting new development in understanding the Earth's inner core. We are currently working on how this result is affected by the presence of other elements, and we may soon be in a position to produce a simple model for the inner core that is consistent with seismic and other geophysical measurements." | Earthquakes | 2,013
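The physical link between weakening and wave speed is direct: the shear-wave velocity is vs = sqrt(G/rho), so a softer shear modulus G means slower waves. The numbers below are rough assumed values, chosen only to show that a strongly weakened modulus lands near the ~3.6 km/s that seismic reference models report for the inner core.

```python
# Why pre-melting weakening reconciles mineral models with seismic data:
# vs = sqrt(G / rho). The moduli and density are rough assumed values.
import math

rho = 13000.0            # approximate inner-core density, kg/m^3
g_stiff = 250e9          # shear modulus of a "stiff iron" model, Pa (assumed)
g_soft = 170e9           # modulus after strong pre-melting weakening, Pa (assumed)

for label, g in (("stiff-iron model", g_stiff), ("pre-melting weakened", g_soft)):
    vs = math.sqrt(g / rho) / 1000.0     # convert m/s to km/s
    print(f"{label:22s} vs = {vs:.1f} km/s")
# Seismology observes roughly 3.6 km/s in the inner core -- much closer to
# the weakened value than to the stiff prediction.
```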
September 29, 2013 | https://www.sciencedaily.com/releases/2013/09/130929202752.htm | Tiny sensor used in smart phones could create urban seismic network | A tiny chip used in smart phones to adjust the orientation of the screen could serve to create a real-time urban seismic network, easily increasing the amount of strong-motion data collected during a large earthquake, according to a new study published in the October issue. | Micro-Electro-Mechanical System (MEMS) accelerometers measure the rate of acceleration of ground motion and vibration of cars, buildings and installations. In the 1990s MEMS accelerometers revolutionized the automotive airbag industry, and they are now found in many devices used daily, including smart phones, video games and laptops. Antonino D'Alessandro and Giuseppe D'Anna, both seismologists at the Istituto Nazionale di Geofisica e Vulcanologia in Italy, tested whether inexpensive MEMS accelerometers could reliably and accurately detect ground motion caused by earthquakes. They tested the LIS331DLH MEMS accelerometer installed in the iPhone mobile phone, comparing it to the earthquake sensor EpiSensor ES-T force-balance accelerometer produced by Kinemetrics Inc. The tests suggest that the MEMS accelerometers can detect moderate to strong earthquakes (greater than magnitude 5) when located near the epicenter. The device produces too much self-noise to accurately detect lesser quakes, limiting its use to the monitoring of strong motion. D'Alessandro and D'Anna note that the technology is rapidly evolving, and there will soon be MEMS sensors that are sensitive to quakes of less than magnitude 5. The real advantage, say the authors, is the widespread use of mobile phones and laptops that include MEMS technology, making it possible to dramatically increase coverage when strong earthquakes occur. Current MEMS sensors, the authors suggest, could be used to create an urban seismic network that transmits ground-motion data in real time to a central location for assessment. The rich volume of data could help first responders identify the areas of greatest potential damage, allowing them to allocate resources more effectively. The article, "Suitability of low-cost three-axis MEMS accelerometers in strong-motion seismology: tests on the LIS331DLH (iPhone) accelerometer," is published in the October issue. | Earthquakes | 2,013
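A network of such sensors would still need a trigger algorithm robust to the MEMS noise floor. Below is a sketch of the classic short-term/long-term average (STA/LTA) detector that such a system might run; the noise level, event amplitude, window lengths and threshold are all illustrative assumptions rather than values from the paper.

```python
# STA/LTA trigger sketch for noisy MEMS accelerometer data.
# Signal characteristics and tuning parameters are illustrative assumptions.
import numpy as np

def sta_lta(signal, fs, sta_s=1.0, lta_s=20.0):
    """Ratio of short-term to long-term average of |signal|."""
    x = np.abs(signal)
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    box = lambda n: np.ones(n) / n                   # moving-average kernel
    sta = np.convolve(x, box(sta_n), mode="same")
    lta = np.convolve(x, box(lta_n), mode="same")
    return sta / np.maximum(lta, 1e-12)

fs = 100
rng = np.random.default_rng(2)
trace = rng.normal(scale=0.02, size=fs * 60)          # MEMS noise floor (assumed)
trace[fs * 30 : fs * 40] += rng.normal(scale=0.3, size=fs * 10)  # strong arrival
ratio = sta_lta(trace, fs)
hits = np.where(ratio > 4.0)[0]
if hits.size:
    print(f"trigger at t = {hits[0] / fs:.1f} s")
else:
    print("no trigger")
```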
September 19, 2013 | https://www.sciencedaily.com/releases/2013/09/130919142200.htm | Geologists simulate deep earthquakes in lab | More than 20 years ago, geologist Harry Green, now a distinguished professor of the graduate division at the University of California, Riverside, and colleagues discovered a high-pressure failure mechanism that they proposed then was the long-sought mechanism of very deep earthquakes (earthquakes occurring at more than 400 km depth). | The result was controversial because seismologists could not find a seismic signal in Earth that could confirm the results. Seismologists have now found the critical evidence. Indeed, beneath Japan, they have even imaged the tell-tale evidence and showed that it coincides with the locations of deep earthquakes. In a paper published Sept. 20, Green and colleagues describe the new experimental work. "We confirmed essentially all aspects of our earlier experimental work and extended the conditions to significantly higher pressure," Green said. "What is crucial, however, is that these experiments are accomplished in a new type of apparatus that allows us to view and analyze specimens using synchrotron X-rays in the premier laboratory in the world for this kind of experiments -- the Advanced Photon Source at Argonne National Laboratory." The ability to do such experiments has now allowed scientists like Green to simulate the appropriate conditions within Earth and record and analyze the "earthquakes" in their small samples in real time, thus providing the strongest evidence yet that this is the mechanism by which earthquakes happen at hundreds of kilometers depth. The origin of deep earthquakes fundamentally differs from that of shallow earthquakes (earthquakes occurring at less than 50 km depth). In the case of shallow earthquakes, theories of rock fracture rely on the properties of coalescing cracks and friction. "But as pressure and temperature increase with depth, intracrystalline plasticity dominates the deformation regime so that rocks yield by creep or flow rather than by the kind of brittle fracturing we see at smaller depths," Green explained. "Moreover, at depths of more than 400 kilometers, the mineral olivine is no longer stable and undergoes a transformation resulting in spinel, a mineral of higher density." The research team focused on the role that phase transformations of olivine might play in triggering deep earthquakes. They performed laboratory deformation experiments on olivine at high pressure and found the "earthquakes" only within a narrow temperature range that simulates conditions where the real earthquakes occur in Earth. "Using synchrotron X-rays to aid our observations, we found that fractures nucleate at the onset of the olivine to spinel transition," Green said. "Further, these fractures propagate dynamically so that intense acoustic emissions are generated. These phase transitions in olivine, we argue in our research paper, provide an attractive mechanism for how very deep earthquakes take place." Green was joined in the study by Alexandre Schubnel at the Ecole Normale Supérieure, France; Fabrice Brunet at the Université de Grenoble, France; and Nadège Hilairet, Julian Gasc and Yanbin Wang at the University of Chicago, Ill. | Earthquakes | 2,013
September 19, 2013 | https://www.sciencedaily.com/releases/2013/09/130919142152.htm | Seismologists puzzle over largest deep earthquake ever recorded | A magnitude 8.3 earthquake that struck deep beneath the Sea of Okhotsk on May 24, 2013, has left seismologists struggling to explain how it happened. At a depth of about 609 kilometers (378 miles), the intense pressure on the fault should inhibit the kind of rupture that took place. | "It's a mystery how these earthquakes happen. How can rock slide against rock so fast while squeezed by the pressure from 610 kilometers of overlying rock?" said Thorne Lay, professor of Earth and planetary sciences at the University of California, Santa Cruz. Lay is coauthor, with Ye and others, of a paper on the earthquake published September 20. Deep earthquakes occur in the transition zone between the upper mantle and lower mantle, from 400 to 700 kilometers below the surface. They result from stress in a deep subducted slab where one plate of Earth's crust dives beneath another plate. Such deep earthquakes usually don't cause enough shaking on the surface to be hazardous, but scientifically they are of great interest. The energy released by the Sea of Okhotsk earthquake produced vibrations recorded by several thousand seismic stations around the world. Ye, Lay, and their coauthors determined that it released three times as much energy as the 1994 Bolivia earthquake, comparable to a 35 megaton TNT explosion. The rupture area and rupture velocity were also much larger. The rupture extended about 180 kilometers, by far the longest rupture for any deep earthquake recorded, Lay said. It involved shear faulting with a fast rupture velocity of about 4 kilometers per second (about 9,000 miles per hour), more like a conventional earthquake near the surface than other deep earthquakes. The fault slipped as much as 10 meters, with average slip of about 2 meters. "It looks very similar to a shallow event, whereas the Bolivia earthquake ruptured very slowly and appears to have involved a different type of faulting, with deformation rather than rapid breaking and slippage of the rock," Lay said. The researchers attributed the dramatic differences between these two deep earthquakes to differences in the age and temperature of the subducted slab. The subducted Pacific plate beneath the Sea of Okhotsk (located between the Kamchatka Peninsula and the Russian mainland) is a lot colder than the subducted slab where the 1994 Bolivia earthquake occurred. "In the Bolivia event, the warmer slab resulted in a more ductile process with more deformation of the rock," Lay said. The Sea of Okhotsk earthquake may have involved re-rupture of a fault in the plate produced when the oceanic plate bent down into the Kuril-Kamchatka subduction zone as it began to sink. But the precise mechanism for initiating shear fracture under huge confining pressure remains unclear. The presence of fluid can lubricate the fault, but all of the fluids should have been squeezed out of the slab before it reached that depth. "If the fault slips just a little, the friction could melt the rock and that could provide the fluid, so you would get a runaway thermal effect. But you still have to get it to start sliding," Lay said. "Some transformation of mineral forms might give the initial kick, but we can't directly detect that. We can only say that it looks a lot like a shallow event." | Earthquakes | 2,013
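The "35 megaton" comparison can be sanity-checked with the empirical Gutenberg-Richter energy relation log10(E) = 1.5*M + 4.8 (E in joules). This is a rough standard scaling, not the paper's actual energy estimate, and as the second print shows, magnitude scaling alone does not reproduce the measured threefold ratio with Bolivia.

```python
# Rough energy check using the empirical Gutenberg-Richter relation.
# A standard approximation, not the paper's measured radiated energies.
def radiated_energy_joules(magnitude):
    return 10 ** (1.5 * magnitude + 4.8)

TNT_J_PER_MEGATON = 4.184e15

e_okhotsk = radiated_energy_joules(8.3)
print(f"Mw 8.3 -> ~{e_okhotsk:.1e} J ~ "
      f"{e_okhotsk / TNT_J_PER_MEGATON:.0f} Mt TNT")   # same order as 35 Mt
# The 1994 Bolivia event (Mw 8.2) by the same scaling:
e_bolivia = radiated_energy_joules(8.2)
print(f"energy ratio 8.3/8.2 from scaling alone: {e_okhotsk / e_bolivia:.1f}x")
# The ~3x ratio reported above came from measured waveforms; slow,
# dissipative ruptures like Bolivia radiate less than pure scaling implies.
```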
September 16, 2013 | https://www.sciencedaily.com/releases/2013/09/130916091144.htm | Superconductivity to meet humanity's greatest challenges | The stage is now set for superconductivity to branch out and meet some of the biggest challenges facing humanity today. | This is according to a topical review, 'Superconductivity and the environment: a Roadmap', published today, 16 September, in an IOP Publishing journal. Lance Cooley, a guest editor of the article who is based at the Fermi National Accelerator Laboratory, said: "Superconductivity has been meeting some great challenges over the past 50 years. The Large Hadron Collider, humankind's largest machine, would not exist were it not for superconductivity." "There are many uses of superconductors in other big science projects, laboratory devices, and MRI systems. Now, as the roadmap outlines, new materials and technologies enable researchers and entrepreneurs to be more versatile and apply superconductivity in other ways that contribute to our everyday lives, such as innovations to benefit our environment." By utilising superconducting quantum interference devices (SQUIDs) -- very sensitive contraptions that can measure extremely small changes in magnetic fields -- one section explains how unexploded weapons, otherwise known as unexploded ordnance (UXO), can be detected and safely recovered. Thousands of UXOs are still discovered each year around Europe, especially in areas that were heavily bombed during the Second World War. They can be very unstable and still pose a major threat; however, the sheer scale and complexity of the terrain that needs to be surveyed makes detecting them very complicated. A section by Pascal Febvre, from the University of Savoie, explains how a complete network of SQUIDs dotted around the globe could also aid the detection of solar bursts, which send magnetic particles hurtling towards Earth, potentially wreaking havoc with our communication systems. A similar network of SQUIDs could also help detect the specific magnetic signature of earthquakes before they strike. One area already progressing with the help of superconducting technology is high-speed rail travel. Magnetically levitating (maglev) trains, whereby the carriage is levitated by magnets and has no contact with the track, have already been deployed in Germany, China, Japan and Brazil. These countries are now looking to develop high-temperature superconducting maglev trains, which use liquid nitrogen instead of liquid helium to cool the tracks. This is expected to simplify the cooling process, reduce operational costs, offer more stable levitation and allow lighter carriages to be used, according to Motoaki Terai from the Central Japan Railway Company. Kyeongdal Choi and Woo Seok Kim, from Korea Polytechnic University, explain how high-temperature superconducting technologies could be used to effectively store power from wind and solar plants, whose output depends on the weather, unlike non-renewable sources such as coal and oil, which provide a constant output. Superconducting cables could also carry an electrical current with no resistance across large distances, from wind and solar power plants to cities and towns. 
According to Steven Eckroad, from the Electric Power Research Institute, and Adela Marian, from the Institute for Advanced Sustainability Studies, advances in cryogenics, the development of low-cost wires and AC-to-DC current converters will make this technology cost-effective and environmentally friendly. Professor Shigehiro Nishijima of Osaka University points out the increasing need for clean water for domestic use and describes the possibility of using high-field magnetic separation systems based on superconducting magnets for this purpose. | Earthquakes | 2,013
September 5, 2013 | https://www.sciencedaily.com/releases/2013/09/130905142815.htm | Beneath Earth's surface, scientists find long 'fingers' of heat | Scientists seeking to understand the forces at work beneath the surface of Earth have used seismic waves to detect previously unknown "fingers" of heat, some of them thousands of miles long, in Earth's upper mantle. Their discovery was published Sept. 5 in the journal Science. | Many volcanoes arise at collision zones between the tectonic plates, but hotspot volcanoes form in the middle of the plates. Geologists have hypothesized that upwellings of hot, buoyant rock rise as plumes from deep within Earth's mantle -- the layer between the crust and the core that makes up most of Earth's volume -- and supply the heat that feeds these mid-plate volcanoes. But some hotspot volcano chains are not easily explained by this simple model, a fact that suggests there are more complex interactions between these hot plumes and the upper mantle. Now, a computer modeling approach, developed by University of Maryland seismologist Vedran Lekic and colleagues at the University of California, Berkeley, has produced new seismic wave imagery which reveals that the rising plumes are, in fact, influenced by a pattern of finger-like structures carrying heat deep beneath Earth's oceanic plates. Seismic waves are waves of energy produced by earthquakes, explosions and volcanic eruptions, which can travel long distances below Earth's surface. As they travel through layers of different density and elasticity, their shape changes. A global network of seismographs records these changing waveforms. By comparing the waveforms from hundreds of earthquakes recorded at locations around the world, scientists can make inferences about the structures through which the seismic waves have traveled. The process, known as seismic tomography, works in much the same way that CT scans (computed tomography) reveal structures hidden beneath the surface of the human body. But since we know much less about the structures below Earth's surface, seismic tomography isn't easy to interpret. "The Earth's crust varies a lot, and being able to represent that variation is difficult, much less the structure deeper below," said Lekic, an assistant professor of geology at the College Park campus. Until recently, analyses like the one in the study would have taken up to 19 years of computer time. While studying for his doctorate with the study's senior author, UC Berkeley Prof. Barbara Romanowicz, Lekic developed a method to more accurately model waveform data while still keeping computer time manageable, which resulted in higher-resolution images of the interaction between the layers of Earth's mantle. By refining this method, a research team led by UC Berkeley graduate student Scott French found finger-like channels of low-speed seismic waves flowing about 120 to 220 miles below the sea floor, and stretching out in bands about 700 miles wide and 1,400 miles apart. The researchers also discovered a subtle but important difference in speed: at this depth, seismic waves typically travel about 2.5 to 3 miles per second, but the average seismic velocity in the channels was 4 percent slower. Because higher temperatures slow down seismic waves, the researchers infer that the channels are hotter than the surrounding material. "We estimate that the slowdown we're seeing could represent a temperature increase of up to 200 degrees Celsius," an increase of about 360 degrees Fahrenheit, said French, the study's lead author.
At these depths, absolute temperatures in the mantle are about 1,300 degrees Celsius, or 2,400 degrees Fahrenheit, the researchers said.Geophysicists have long theorized that channels akin to those revealed in the computer model exist, and are interacting with the plumes in Earth's mantle that feed hotspot volcanoes. But the new images reveal for the first time the extent, depth and shape of these channels. And they also show that the fingers align with the motion of the overlying tectonic plate. The researchers hypothesize that these channels may be interacting in complex ways with both the tectonic plates above them and the hot plumes rising from below."This global pattern of finger-like structures that we're seeing, which has not been documented before, appears to reflect interactions between the upwelling plumes and the motion of the overlying plates," Lekic said. "The deflection of the plumes into these finger-like channels represents an intermediate scale of convection in the mantle, between the large-scale circulation that drives plate motions and the smaller scale plumes, which we are now starting to image.""The exact nature of those interactions will need further study," said French, "but we now have a clearer picture that can help us understand the 'plumbing' of Earth's mantle responsible for hotspot volcano islands like Tahiti, Reunion and Samoa." | Earthquakes | 2,013 |
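The velocity-to-temperature inference in the study above can be sanity-checked with a one-line scaling. This is a minimal sketch: the velocity-temperature sensitivity used below is an assumed, order-of-magnitude upper-mantle value chosen for illustration, not a number taken from the paper.

```python
# Rough check: how large a temperature anomaly does a 4 percent velocity
# slowdown imply? ASSUMPTION: a constant sensitivity d(lnV)/dT of about
# -2e-4 per degree C -- an illustrative value, not the study's scaling.

v_background = 4.4e3          # background wave speed, m/s (~2.7 miles/s)
dlnv = -0.04                  # channels are about 4 percent slower
sensitivity = -2.0e-4         # assumed d(lnV)/dT, per degree C

dT = dlnv / sensitivity       # implied temperature excess, degrees C
v_channel = v_background * (1.0 + dlnv)

print(f"channel wave speed ~ {v_channel:.0f} m/s")
print(f"implied temperature excess ~ {dT:.0f} C (~{dT * 9 / 5:.0f} F)")
```

With these assumed numbers the arithmetic lands on the article's figure of roughly 200 degrees Celsius of excess temperature.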
September 3, 2013 | https://www.sciencedaily.com/releases/2013/09/130903151849.htm | Iranian telegraph operator, first to propose earthquake early warning system | In 1909, an Iranian telegraph operator living in the remote desert town of Kerman noticed an unusual movement of the magnetic needle of his telegraph instrument. While other telegraph operators during the late 1800s and early 1900s noticed the phenomenon, the Iranian telegraph operator proposed an earthquake early warning system, as detailed in an article published today by the journal Seismological Research Letters (SRL). | Nineteenth-century telegraph operators in New Zealand, Switzerland, Chile, the Caribbean and elsewhere noted the usefulness of the electric telegraph for recording natural phenomena. But the Iranian telegraph operator and cashier, named Yusef (Joseph), took the next step, suggesting the concept of a local earthquake warning system in a Persian newspaper, The New Iran. He became aware of the anomaly in 1897 and put the knowledge to use in 1909, using the six seconds of warning to urge the building's other occupants to evacuate. "I am confident if a more sophisticated instrument is built," wrote Yusef, "a few minutes after the needle's anomalous move, the earthquake will be felt. And if the system is connected to a big bell (an alarm system), it can be heard by all the people, and their lives will be saved." While J.D. Cooper, M.D., is credited with first proposing an early warning system in 1868, which he described in an article printed by The San Francisco Daily Bulletin, the Iranian telegraph operator living on the edge of the desert likely had no access to American newspapers. Few newspapers existed at that time in Iran, where the literacy rate did not exceed five percent. Manuel Berberian, who authored the SRL paper, called Yusef's attempt to transfer knowledge in the service of others "priceless." He noted that by the 100th anniversary of the printing of Yusef's article, earthquakes had claimed the lives of more than 164,000 Iranians, and no plans for an early warning system were in development. The article, "Early Earthquake Detection and Warning Alarm System in Iran by a Telegraph Operator: A 116-Year-Old Disaster Prevention Attempt," will appear in the September issue of SRL, which is published by the Seismological Society of America. | Earthquakes | 2,013
August 29, 2013 | https://www.sciencedaily.com/releases/2013/08/130829093041.htm | Acoustic waves warn of tsunami | An early warning system against tsunamis has been developed and tailored to the needs of the Mediterranean, but preparedness on the ground is paramount to ensuring people's safety. | When a coastal area is about to be hit by the waves of a tsunami, time is everything. The earlier we know where and when it is going to hit the coast, the more chances there are to evacuate the area. Early warning systems play a crucial role. Until now, seismic signals were used to issue such warnings. These are subsequently confirmed or cleared by measuring sea level height. This approach stems from the fact that shallow submarine earthquakes exceeding a given magnitude are the most likely causes of tsunamis. More recently, an EU-funded project called NEAREST found a better way of identifying a tsunami threat at an early stage. "To do this we developed a new device, called a tsunameter, that we put as close as possible to those places where we know a tsunami is very likely to be generated," says Francesco Chierici, who is the project coordinator and also works as a researcher at the Radio Astronomy Institute (IRA) in Bologna, Italy. The tsunameter can be placed close to the geological faults that are responsible for earthquakes and, accordingly, for tsunamis. Detecting a tsunami near its source is crucial, "especially in peculiar environments such as the Mediterranean, where tsunamis are generated very close to the coasts," says Chierici. Every device is connected to a surface buoy and consists of a set of instruments collecting several kinds of data. These include local acceleration and pressure of the water, seismic waves, and, in particular, the acoustic waves generated by tsunamis. With this information, actual tsunamis can be distinguished from the background noise, "using a specific mathematical algorithm" that interprets the data. Under the project, the tsunameter was tested for a year off the Gulf of Cadiz in Spain, at a water depth of 3,200 metres. Since the project was completed in March 2010, the tsunameters have been tested in a new research programme called Multidisciplinary Oceanic Information SysTem (MOIST), run by the Italian National Institute of Geophysics and Volcanology (INGV) in Rome. The idea of using acoustic waves as a tsunami signal is sound, according to Stefano Tinti, a professor of geology at the University of Bologna, Italy, and an expert in tsunamis. "Their speed of propagation is slower than that of seismic waves, but still quicker than that of the tsunami," he said. The problem is that the technology "is still in an experimental stage and it's not so easy to separate the hydroacoustic signal from the others when the detector is so close to the source." Another issue is the cost of installation and maintenance. "Off-shore detection systems are more expensive than coastal ones," he adds. Tinti also believes it could be more effective to use the many islands spread around the Mediterranean. "The detection could be of no use in terms of a warning system for the very point where the detection takes place," he contends, "but it is still very useful for other areas of the coast." The difficulties of maintaining a costly off-shore observation system are confirmed by Fernando Carrilho of the Portuguese Marine and Atmosphere Institute (IPMA), which was a project partner.
When the Portuguese government decided to build a national tsunami early warning system, he explains, experts opted for a cheaper coastal sea-level observation system. Other experts point to issues related to providing a timely warning to the population. "Measurements would need to be far enough from the land to be affected to give enough time to raise the alarm," says Philippe Blondel, an acoustic remote sensing expert in the Department of Physics at the University of Bath, UK. Regardless of which early warning system is chosen, the difference between saving lives and losing them is preparedness. "For example, if Vesuvius erupts and a flank collapses into the sea, this would affect the millions of people in and around Naples, in Italy. Even with the best organisation, there are only so many roads available for people to get away in a hurry," says Blondel. A different level of preparedness is required in the Mediterranean, compared to the situation in Japan and the United States, where "as soon as a tsunami is confirmed as being underway, alarms will sound in all communities likely to be affected," Blondel explains. Clear evacuation routes have been signposted, he believes, and everybody has been trained to know where to go. He quotes the example of the US West Coast, where many long, romantic sandy beaches now have signs every few hundred meters saying 'tsunami risk: run this way' in clearly understandable icons. | Earthquakes | 2,013
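The "specific mathematical algorithm" of the tsunameter is not spelled out in the article above. A minimal sketch of the general idea -- flagging a coherent pressure transient against background noise with a short-term/long-term average (STA/LTA) trigger, a standard detection scheme in seismology -- might look like the following. All window lengths, the threshold and the synthetic signal are illustrative assumptions, not NEAREST parameters.

```python
import numpy as np

def sta_lta(signal, fs, sta_win=30.0, lta_win=600.0):
    """Short-term / long-term average ratio of signal power."""
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    power = np.asarray(signal, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(power)))
    sta = (csum[sta_n:] - csum[:-sta_n]) / sta_n   # windows ending at each sample
    lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n
    n = min(sta.size, lta.size)                    # align both at the series' end
    return sta[-n:] / np.maximum(lta[-n:], 1e-12)

# Synthetic bottom-pressure record: unit noise plus a slow, ~4-minute bulge.
fs = 1.0                                           # one sample per second
t = np.arange(0.0, 3600.0, 1.0 / fs)
rng = np.random.default_rng(0)
trace = rng.normal(size=t.size) + 6.0 * np.exp(-((t - 2400.0) / 120.0) ** 2)

ratio = sta_lta(trace, fs)
hits = np.where(ratio > 4.0)[0]                    # assumed trigger threshold
if hits.size:
    print(f"triggered: ratio first exceeds threshold at sample {hits[0]}")
else:
    print("no detection")
```

The long LTA window is the design choice that matters here: a tsunami at depth is a slow signal, so the "background" estimate must span far more time than the transient itself.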
August 27, 2013 | https://www.sciencedaily.com/releases/2013/08/130827134756.htm | New 3-D Earth model more accurately pinpoints source of earthquakes, explosions | During the Cold War, U.S. and international monitoring agencies could spot nuclear tests and focused on measuring their sizes. Today, they're looking around the globe to pinpoint much smaller explosives tests. | Under the sponsorship of the National Nuclear Security Administration's Office of Defense Nuclear Nonproliferation R&D, Sandia National Laboratories and Los Alamos National Laboratory have partnered to develop a 3-D model of the Earth's mantle and crust called SALSA3D, or Sandia-Los Alamos 3D. The purpose of this model is to assist the U.S. Air Force and the international Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) in Vienna, Austria, more accurately locate all types of explosions. The model uses a scalable triangular tessellation and seismic tomography to map the Earth's "compressional wave seismic velocity," a property of the rocks and other materials inside the Earth that indicates how quickly compressional waves travel through them, and is one way to accurately locate seismic events, Sandia geophysicist Sandy Ballard said. Compressional waves -- the first to arrive after a seismic event -- move the particles in rocks and other materials minute distances backward and forward along the path between the event and the station detecting it. SALSA3D also reduces the uncertainty in the model's predictions, an important feature for decision-makers who must take action when suspicious activity is detected, he added. "When you have an earthquake or nuclear explosion, not only do you need to know where it happened, but also how well you know that. That's a difficult problem for these big 3-D models. It's mainly a computational problem," Ballard said. "The math is not so tough, just getting it done is hard, and we've accomplished that." A Sandia team has been writing and refining code for the model since 2007 and is now demonstrating that SALSA3D is more accurate than current models. In recent tests, SALSA3D was able to locate the source of seismic events within a geographical area 26 percent smaller than that of the traditional one-dimensional model and 9 percent smaller than that of a recently developed Regional Seismic Travel Time (RSTT) model used with the one-dimensional model. Sandia recently released SALSA3D's framework -- the triangular tessellated grid on which the model is built -- to other Earth scientists, seismologists and the public. By standardizing the framework, the seismological research community can more easily share models of the Earth's structure, and global monitoring agencies can better test different models. Both activities are hampered by the plethora of models available today, Ballard said. "GeoTess makes models compatible and standardizes everything," he said. "This would really facilitate sharing of different models, if everyone agreed on it." Seismologists and researchers worldwide can now download GeoTess, which provides a common model parameterization for multidimensional Earth models and a software support system that addresses the construction, population, storage and interrogation of data stored in the model. GeoTess is not specific to any particular data, so users have considerable flexibility in how they store information in the model. The free package, including source code, is being released under the permissive BSD open-source license.
The code is available in Java and C++, with interfaces to the C++ version written in C and Fortran 90. GeoTess has been tested on multiple platforms, including Linux, SunOS, MacOSX and Windows. When an explosion goes off, the energy travels through the Earth as waves that are picked up by seismometers at U.S. and international ground monitoring stations associated with nuclear explosion monitoring organizations worldwide. Scientists use these signals to determine the location. They first predict the time taken for the waves to travel from their source through the Earth to each station. To calculate that, they have to know the seismic velocity of the Earth's materials from the crust to the inner core, Ballard said. "If you have material that has very high seismic velocity, the waves travel very quickly, but the energy travels less quickly through other kinds of materials, so it takes the signals longer to travel from the source to the receiver," he said. For the past 100 years, seismologists have predicted the travel time of seismic energy from source to receiver using one-dimensional models. These models, which are still widely used today, account only for radial variations in seismic velocity and ignore variations in geographic directions. They yield seismic event locations that are reasonably accurate, but not nearly as precise as locations calculated with high-fidelity 3-D models. Modern 3-D models of the Earth, like SALSA3D, account for distortions of the seismic wavefronts caused by minor lateral differences in the properties of rocks and other materials. For example, waves are distorted when they move through a geological feature called a subduction zone, such as the one beneath the west coast of South America where one tectonic plate under the Pacific Ocean is diving underneath the Andes Mountains. This happens at about the rate at which fingernails grow, but, geologically speaking, that's fast, Ballard said. One-dimensional models, like the widely used ak135 developed in the 1990s, are good at predicting the travel time of waves when the distance from the source to the receiver is large, because these waves spend most of their time traveling through the deepest, most homogeneous parts of the Earth. They don't do so well at predicting travel times for nearby events, where the waves spend most of their time in the Earth's crust or the shallowest parts of the mantle, both of which contain a larger variety of materials than the lower mantle and the Earth's core. RSTT, a previous model developed jointly by Sandia, Los Alamos and Lawrence Livermore national laboratories, tried to solve that problem and works best at ranges of about 60-1,200 miles (100-2,000 kilometers). Still, "the biggest errors we get are close to the surface of the Earth. That's where the most variability in materials is," Ballard said. Today, Earth scientists are mapping three dimensions: radius, latitude and longitude. Anyone who's studied a globe or world atlas knows that the traditional grid of longitudinal and latitudinal lines works all right the closer you are to the equator, but at the poles the lines are too close together. For nuclear explosion monitoring, Earth models must accurately characterize the polar regions even though they are remote, because seismic waves travel under them, Ballard said. Triangular tessellation solves that with nodes, or intersections of the triangles, that can be accurately modeled even at the poles.
The triangles can be smaller where more detail is needed and larger in areas that require less detail, like the oceans. The model also extends into the Earth, like columns of stacked pie slices without the rounded crust edges. The way Sandia calculates the seismic velocities uses the same math that is used to detect a tumor in an MRI, except on a global, rather than a human, scale. Sandia uses historical data from 118,000 earthquakes and 13,000 current and former monitoring stations worldwide, collected in Los Alamos National Laboratory's Ground Truth catalog. "We apply a process called seismic tomography, where we take millions of observed travel times and invert them for the seismic velocities that would create that data set. It's mathematically similar to doing linear regression, but on steroids," Ballard said. Linear regression is a simple mathematical way to model the relationship between observed data and one or more unknown variables. Because the Sandia team models hundreds of thousands of unknown variables, they apply a mathematical method called least squares to minimize the discrepancies between the data from previous seismic events and the predictions. With 10 million data points, Sandia uses a distributed computer network with about 400 processor cores to characterize the seismic velocity at every node. Monitoring agencies could use SALSA3D to precompute the travel time from each station in their network to every point on Earth. When it comes time to compute the location of a new seismic event in real time, source-to-receiver travel times can be retrieved in about a millisecond, pinpointing the energy's source in about a second, he said. But no model is perfect, so Sandia has developed a way to measure the uncertainty in each prediction SALSA3D makes, based on the uncertainty in the velocity at each node and how that uncertainty affects the travel time prediction of each wave from a seismic event to each monitoring station. For users at monitoring stations, SALSA3D estimates the most likely location of a seismic event and the amount of uncertainty in that answer, to help inform their decisions. International test ban treaties require that on-site inspections can only occur within a 1,000-square-kilometer (about 385-square-mile) area surrounding a suspected nuclear test site. Today, 3-D Earth models like SALSA3D are helping to meet and sometimes significantly exceed this threshold in most parts of the world. "It's extremely difficult to do because the problem is so large," Ballard said. "But we've got to know it within 1,000 square kilometers or they might search in the wrong place." | Earthquakes | 2,013
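The tomography-as-least-squares idea Ballard describes can be illustrated with a toy version: straight rays crossing a handful of slowness cells, inverted in one call. Everything below -- grid size, ray geometry, noise level -- is a made-up miniature, not SALSA3D's real parameterization or data.

```python
import numpy as np

# Toy travel-time tomography: observed times t = G s, where s holds cell
# slownesses (1/velocity) and G[i, j] is the length of ray i inside cell j.
# Invert the noisy times for s by least squares, as described above.

rng = np.random.default_rng(42)
n_cells, n_rays = 20, 200

lengths = rng.uniform(0.0, 10.0, (n_rays, n_cells))      # km per cell
G = lengths * (rng.random((n_rays, n_cells)) < 0.3)      # each ray hits ~30% of cells

s_true = 1.0 / rng.uniform(4.0, 8.0, n_cells)            # s/km for 4-8 km/s rock
t_obs = G @ s_true + rng.normal(0.0, 0.01, n_rays)       # travel times + pick noise

s_hat, *_ = np.linalg.lstsq(G, t_obs, rcond=None)        # least-squares inversion
worst = np.abs(1.0 / s_hat - 1.0 / s_true).max()
print(f"worst recovered velocity error: {worst:.3f} km/s")
```

The real problem swaps this dense solve for sparse, distributed solvers over hundreds of thousands of unknowns, but the structure -- a linear system relating path lengths, slownesses and times -- is the same.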
August 27, 2013 | https://www.sciencedaily.com/releases/2013/08/130827112113.htm | Earthquakes and tectonics: First direct observation of subducting continental crust during the collision of two continents | Earthquake damage to buildings is mainly caused by shear waves, which transfer their energy to buildings during an earthquake. These shear waves are significantly influenced by the subsurface and by the topography of the surrounding area. Detailed knowledge of the landform and the near-surface structure is therefore an important prerequisite for local seismic hazard assessment and for evaluating the site effect, which can strongly modify and amplify local ground motion. | As described in a recently published study, the method is based on recording and analysing ambient seismic noise. "We use small, hardly noticeable ground motions as well as anthropogenic ground vibrations," explains Marco Pilz, a scientist at GFZ. "With the help of these small signals we can obtain detailed images of the shallow seismic velocity structure." In particular, images of the subsurface, and of velocity changes caused by earthquakes and landslides, can be obtained in almost real time. "What is new about our method is the direct calculation of the shear wave velocity. Moreover, we are working on a local, small-scale level compared to many other studies," Marco Pilz continues. This method has already been successfully applied: many regions of Central Asia are threatened by landslides. Since the shear wave velocity usually drops significantly before a landslide slips, this technique offers the chance to monitor changes in landslide-prone areas almost in real time. The method also has applications in earthquake research. The authors were able to map the detailed structure of a section of the Issyk-Ata fault, Kyrgyzstan, which runs along the southern border of the capital city, Bishkek, home to approximately 900,000 inhabitants. They showed that, close to the surface, the mapped section splits into two small fault branches. This can influence the speed of rupture propagation along the main fault, or even halt it. Central Asia is highly earthquake-prone; the associated processes and risks are investigated by the Central-Asian Institute of Applied Geosciences (CAIAG) in Bishkek, a joint institution established by the GFZ and the Kyrgyz government. The Pamir and Tien Shan mountains are the result of the collision of two continental plates: the convergence of India and Eurasia built these high ranges. This process is still ongoing and fractures the Earth's crust, with earthquakes as the consequence. A second group of GFZ scientists, together with colleagues from Tajikistan and CAIAG, has investigated the tectonic collision process in this region. They were, for the first time, able to image continental crust descending into the Earth's mantle. In the scientific journal Earth and Planetary Science Letters, the scientists report that such subduction of continental crust had never before been directly observed. To make their images, the scientists applied a special seismological method (so-called receiver function analysis) to seismograms collected in a two-year field experiment in the Tien Shan-Pamir-Hindu Kush area.
Here, the collision of the Indian and Eurasian plates reaches an extreme scale. "These extreme conditions cause the Eurasian lower crust to subduct into the Earth's mantle," explains Felix Schneider from the GFZ German Research Centre for Geosciences. "Such subduction can normally only be observed during the collision of oceanic crust with continental crust, as the ocean floors are heavier than continental rock." Findings at the surface of metamorphic rocks that must have formed under ultra-high pressures deep in the Earth's mantle also provide evidence for subduction of continental crust in the Pamir region. Furthermore, the question arises of how the occurrence of numerous earthquakes at unusual depths of down to 300 km in the upper mantle can be explained. The observation of the subducting part of the Eurasian lower crust may, however, solve this puzzle. | Earthquakes | 2,013
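The ambient-noise trick behind the first study above can be sketched on synthetic data: cross-correlating the "noise" recorded at two stations makes the coherent wavefield stand out at the inter-station travel time, from which a velocity follows. Station spacing, wave speed and sampling below are invented for illustration and are not the Bishkek deployment's values.

```python
import numpy as np

# Two stations record the same diffuse noise field; the wavefront reaches
# station B lag = dist / v seconds after station A. The cross-correlation
# peaks at that lag, so v = dist / lag. Numbers are illustrative only.

fs, dist, v_true = 50.0, 600.0, 300.0      # Hz, metres, m/s (all assumed)
delay = int(dist / v_true * fs)            # 2 s -> 100 samples

rng = np.random.default_rng(1)
n = int(120 * fs)                          # two minutes of recording
src = rng.normal(size=n + delay)
sta_a = src[delay:]                        # station A
sta_b = src[:n]                            # station B: same field, delayed

xcorr = np.correlate(sta_b, sta_a, mode="full")
lag = (np.argmax(xcorr) - (n - 1)) / fs    # seconds, positive = B after A
print(f"recovered velocity: {dist / lag:.0f} m/s (true {v_true:.0f})")
```

Real ambient-noise processing adds filtering, whitening and long stacking, but the core observable -- a correlation peak at the inter-station travel time -- is the one shown here.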
August 19, 2013 | https://www.sciencedaily.com/releases/2013/08/130819102714.htm | How shale fracking led to an Ohio town's first 100 earthquakes | Since records began in 1776, the people of Youngstown, Ohio had never experienced an earthquake. However, from January 2011, 109 tremors were recorded, and new research in the Journal of Geophysical Research: Solid Earth ties them to a nearby wastewater injection well. | In December 2010, Northstar 1, a well built to pump wastewater produced by fracking in the neighboring state of Pennsylvania, came online. In the year that followed, seismometers in and around Youngstown recorded 109 earthquakes, the strongest being a magnitude 3.9 earthquake on December 31, 2011. The study authors analyzed the Youngstown earthquakes, finding that their onset, cessation, and even temporary dips in activity were all tied to the activity at the Northstar 1 well. The first earthquake recorded in the city occurred 13 days after pumping began, and the tremors ceased shortly after the Ohio Department of Natural Resources shut down the well in December 2011. Dips in earthquake activity correlated with Memorial Day, the Fourth of July, Labor Day, and Thanksgiving, as well as other periods when injection at the well was temporarily stopped. "In recent years, waste fluid generated during shale gas production -- hydraulic fracturing -- has been increasing steadily in the United States. Earthquakes were triggered by this waste fluid injection at a deep well in Youngstown, Ohio, from January 2011 to February 2012. We found that the onset and cessation of the earthquakes were tied to the activity at the Northstar 1 deep injection well. The earthquakes were centered in subsurface faults near the injection well. These shocks were likely due to the increase in pressure from the deep wastewater injection, which caused the existing fault to slip," said Dr. Won-Young Kim. "Throughout 2011, the earthquakes migrated from east to west down the length of the fault away from the well -- indicative of the earthquakes being caused by an expanding pressure front." | Earthquakes | 2,013
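The temporal association reported above -- event rates rising with pumping and dipping during holiday shutdowns -- is the kind of comparison that can be sketched against a synthetic catalog. The schedule, rates and pause dates below are invented stand-ins, not the Youngstown data; the real test would use the well's operating log and the observed earthquake catalog.

```python
import numpy as np

# Compare quake rates while the well injects vs. during pauses.
# Daily schedule and Poisson rates are invented for illustration.

rng = np.random.default_rng(7)
days = 365
injecting = np.ones(days, dtype=bool)
for pause_start in (149, 185, 248, 328):     # assumed holiday shutdowns
    injecting[pause_start : pause_start + 4] = False

quakes = rng.poisson(np.where(injecting, 0.3, 0.02))   # assumed events/day

print(f"rate while injecting: {quakes[injecting].mean():.3f} quakes/day")
print(f"rate during pauses:   {quakes[~injecting].mean():.3f} quakes/day")
```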
August 15, 2013 | https://www.sciencedaily.com/releases/2013/08/130815145148.htm | Slow earthquakes may foretell larger events | Monitoring slow earthquakes may provide a basis for reliable prediction in areas where slow quakes trigger normal earthquakes, according to Penn State geoscientists. | "We currently don't have any way to remotely monitor when land faults are about to move," said Chris Marone, professor of geophysics. "This has the potential to change the game for earthquake monitoring and prediction, because if it is right and you can make the right predictions, it could be big." Marone and Bryan Kaproth-Gerecht, a recent Ph.D. graduate, looked at the mechanisms behind slow earthquakes and found that 60 seconds before slow stick-slip began in their laboratory samples, a precursor signal appeared. Normal stick-slip earthquakes typically move at a rate of three to 33 feet per second, but slow earthquakes, while they still stick and slip for movement, move at rates of about 0.004 inches per second, taking months or more to rupture. However, slow earthquakes often occur near traditional earthquake zones and may precipitate potentially devastating earthquakes. "Understanding the physics of slow earthquakes and identifying possible precursory changes in fault zone properties are increasingly important goals," the researchers report online in today's (Aug. 15) issue of the journal Science. Using serpentine, a common mineral often found in slow earthquake areas, Marone and Kaproth-Gerecht performed laboratory experiments applying shear stress to rock samples so that the samples exhibited slow stick-slip movement. The researchers repeated experiments 50 or more times and found that, at least in the laboratory, slow fault zones undergo a transition from a state that supports slow velocity below about 0.0004 inches per second to one that essentially stops movement above that speed. "We recognize that this is complicated and that velocity depends on the friction," said Marone. "We don't know for sure what is happening, but, from our lab experiments, we know that this phenomenon is occurring." The researchers think that what creates this unusual pattern of movement is that frictional contact strength goes down as velocity goes up, but only over a small velocity range. Once the speed increases enough, the frictional contact area becomes saturated -- it can't get any smaller -- and other physical properties, such as thermal effects, take over. This mechanism limits the speed of slow earthquakes. Marone and Kaproth-Gerecht also looked at the primary elastic waves and the secondary shear waves produced by their experiments. "Here we see elastic waves moving and we know what's going on with P and S waves and the acoustic speed," said Marone. "This is important because this is what you can see in the field, what seismographs record." Marone notes that there are not currently sufficient measuring devices adjacent to known fault lines to make any type of prediction from the precursor signature of the movement of the elastic waves. It is, however, conceivable that with the proper instrumentation, a better picture of what happens before a fault moves in stick-slip motion is possible and perhaps could lead to some type of prediction. | Earthquakes | 2,013
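The saturation mechanism Marone describes maps naturally onto a rate-dependent friction law with a cutoff velocity: friction weakens while contact strength can still drop, then strengthens once the contact area saturates, capping slip speed. The functional form and constants below are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

# Steady-state friction vs. slip speed with a cutoff: for v << v_cut the
# weakening term dominates (b > a) and friction falls as speed rises; above
# v_cut that term saturates and friction rises again -- capping slip speed,
# as in the mechanism described above. Constants are illustrative only.

mu0, a, b = 0.60, 0.005, 0.015     # reference friction, rate-state constants
v0, v_cut = 1e-8, 1e-5             # reference and cutoff slip speeds, m/s

def mu_ss(v):
    return mu0 + a * np.log(v / v0) - b * np.log((v / v0) / (1.0 + v / v_cut))

for v in (1e-8, 1e-7, 1e-6, 1e-5, 1e-4, 1e-3):
    print(f"v = {v:.0e} m/s  ->  steady-state friction {mu_ss(v):.3f}")
```

Printed values fall and then climb back up around the cutoff, reproducing the "weakening over a small velocity range" behavior in miniature.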
August 12, 2013 | https://www.sciencedaily.com/releases/2013/08/130812103406.htm | California seafloor mapping reveals hidden treasures | Science and technology have peeled back a veil of water just offshore of California, revealing the hidden seafloor in unprecedented detail. New imagery, specialized undersea maps, and a wealth of data from along the California coast are now available. Three new products in an ongoing series were released today by the U.S. Geological Survey -- a map set for the area offshore of Carpinteria, a catalog of data layers for geographic information systems, and a collection of videos and photos of the seafloor in state waters along the entire California coast. | "A program of this vast scope can't be accomplished by any one organization. By working with other government agencies, universities, and private industry, the USGS could fully leverage all its resources," said USGS Pacific Region Director Mark Sogge. "Each organization brings to the table a unique and complementary set of resources, skills, and know-how." The USGS is a key partner in the California Seafloor Mapping Program, a large, unique, and historically ambitious collaboration between state and federal agencies, academia, and the private sector to create a comprehensive base-map series for all of California's ocean waters. Scientists are collecting sonar data, video and photographic imagery, seismic surveys, and bottom-sediment data to create a series of maps of seafloor bathymetry, habitats, geology, and more, in order to inform coastal managers and planners, government entities, and researchers. With the new maps, decision makers and elected officials can better design and monitor marine reserves, evaluate ocean energy potential, understand ecosystem dynamics, recognize earthquake and tsunami hazards, regulate offshore development, and improve maritime safety. "The Ocean Protection Council recognized early on that seafloor habitats and geology were a fundamental data gap in ocean management," said California's Secretary for Natural Resources and Ocean Protection Council Chair John Laird. "After an impressive effort by many partners to collect and interpret the data, the maps being produced now are providing pioneering science that's changing the way we manage our oceans." "Our collaboration with the state and more than 15 other partners is critical to the success of this program. We've come together to make the maps, and then to use them. We all like to say that you can't manage it, monitor it, or model it if you don't know what the 'it' is, and our seafloor mapping gives that important 'it' to the entire coastal management and research community," said the USGS' lead researcher on this project, Sam Johnson. | Earthquakes | 2,013
August 2, 2013 | https://www.sciencedaily.com/releases/2013/08/130802095140.htm | Revised location of 1906 rupture of San Andreas Fault in Portola Valley | New evidence suggests the 1906 earthquake ruptured the San Andreas Fault in a single trace through Portola Village, the present-day Town of Portola Valley, and indicates a revised location for the fault trace. | Portola Valley, south of San Francisco, has been extensively studied and was the subject of the first geological map published in California. Yet studies have offered conflicting conclusions, caused in part by a misprinted photograph and unpublished data, as to the location and nature of the 1906 surface rupture through the area. "It is critical for the residents and leaders of Portola Valley to know the exact location of the fault -- an active fault near public buildings and structures," said co-author Chester T. Wrucke, a retired geologist with the U.S. Geological Survey and resident of Portola Valley. Independent researcher Robert T. Wrucke and engineering geologist Ted Sayre, with Cotton Shires and Associates, are co-authors of the study, published by the Seismological Society of America. Using a new high-resolution imaging technology, known as bare-earth airborne LiDAR (Light Detection And Ranging), combined with field observations and an extensive review of archival photography, the researchers reinterpreted previous documentation to locate the 1906 fault trace. "People back then were hampered by thick vegetation when trying to see a critical area," said Chester Wrucke. "Modern technology -- LiDAR -- and modern techniques made it possible for us to see the bare ground, interpret correctly where old photographs were taken and identify the fault trace." The 1906 earthquake changed the landscape of Portola Valley, breaking rock formations, cracking roads, creating landslides and forcing other changes masked by vegetation. With easy access to the area, local professors and photographers from Stanford created a rich trove of field observations, photos and drawings. J.C. Branner, then a geology professor at Stanford, was among the scientists who, along with his students, submitted their observations of the 1906 fault rupture to the California Earthquake Commission to include in an official compilation of the cause and effects of the earthquake. While the compilation, published in 1908, contained a final conclusion that the earthquake ruptured along a single fault trace in Portola Valley, a key map of that trace -- Map 22 -- included unintentional errors in the fault location. Studies of the area resumed 50 years later, and those studies relied on the literature, including Map 22. Subsequent studies published variations of Map 22, further altering the assumed location of the fault and suggesting the earthquake ruptured along multiple traces of the fault. The authors sought to answer a seemingly simple question -- where did the fault cross Alpine Road? "With variations in the literature and interpretation of the data, we decided to pay close attention to the original work," said Robert Wrucke. The authors relied on Branner's description, together with 1906 photographs, a hand-drawn map, a student notebook and an analysis of changes to Alpine Road, for clues to confirm where the fault crossed the road. Scanning archives to study all available photos from 1906 and notes from observers, the researchers compared geological features to LiDAR images.
Their forensic analysis suggests the primary rupture in 1906 in Portola Valley was along the western of two main traces of the San Andreas Fault. Their analysis shows that there was no step-over within the town to the other trace. "The biggest practical benefit of knowing the correct fault position is the ability to keep proposed new buildings off the critical rupture zone," said Sayre. "We had the luxury of LiDAR and were able to meld LiDAR data with old photos and make a breakthrough," said Robert Wrucke. "Modern technology helps with geological interpretation. Our experience may be useful for others in situations where there's confusion." | Earthquakes | 2,013
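Bare-earth LiDAR is usually examined as a hillshaded elevation model, which is what makes subtle scarps jump out from under the vegetation. A minimal hillshade sketch over a synthetic grid follows; the scarp geometry and illumination angles are arbitrary choices for illustration, not the Portola Valley data.

```python
import numpy as np

# Hillshade a bare-earth DEM: a gentle regional slope cut by a 2 m scarp.
# Shading follows the standard formula combining slope and aspect with the
# light direction; the scarp shows up as a sharp tonal break.

ny, nx, cell = 200, 200, 1.0                      # grid, metres per cell
y, x = np.mgrid[0:ny, 0:nx]
dem = 0.05 * x + 2.0 * (x > 120)                  # 5% slope + 2 m scarp

az, alt = np.radians(315.0), np.radians(45.0)     # light from the northwest
dz_dy, dz_dx = np.gradient(dem, cell)
slope = np.arctan(np.hypot(dz_dx, dz_dy))
aspect = np.arctan2(-dz_dx, dz_dy)
shade = (np.sin(alt) * np.cos(slope)
         + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))

print("background shade:", round(float(shade[100, 50]), 3),
      " scarp shade:", round(float(shade[100, 120]), 3))
```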
July 31, 2013 | https://www.sciencedaily.com/releases/2013/07/130731164722.htm | Artificial earthquakes could lead to safer, sturdier buildings | Earthquakes never occur when you need one, so a team led by Johns Hopkins structural engineers is shaking up a building itself in the name of science and safety. | Using massive moving platforms and an array of sensors and cameras, the researchers are trying to find out how well a two-story building made of cold-formed steel can stand up to a lab-generated Southern California quake. The testing, taking place this summer in Buffalo, N.Y., marks the culmination of a three-year, $1 million research project involving scientists from six universities and design consultants from the steel industry. The work is taking place in the only facility in the U.S. that is capable of replicating an earthquake in three directions beneath a building measuring 50 feet long, 20 feet wide and 20 feet tall. The trials will wrap up in mid-August, when the researchers will shake the unoccupied structure with forces comparable to those at the epicenter of the catastrophic 1994 Northridge earthquake in Los Angeles, which claimed dozens of lives and caused billions of dollars in damage. The researchers may sound like a wrecking crew, but their work has important implications for the people who construct, live or work in buildings. The results are expected to lead to improved nationwide building codes that will make future cold-formed steel buildings less expensive to construct than current ones. Also, the new codes could, in certain cases, make lightweight cold-formed steel buildings less costly to construct than those made of materials such as timber, concrete or hot-rolled steel. Finally, the research, funded primarily by the National Science Foundation, with added support from the steel industry, could lead to broader use of building components made of environmentally friendly cold-formed steel, which is made of 100 percent recycled steel. Cold-formed steel pieces, commonly used to frame low- and mid-rise buildings, are made by bending sheet metal, roughly one millimeter thick, into structural shapes without using heat. Cold-formed steel already has been used in an array of structures such as college dorms, assisted living centers, small hotels, barracks, apartments and office buildings. Although the material is popular, some large knowledge gaps exist regarding how well cold-formed steel structures will stand up to extreme conditions -- including earthquakes. This has caused engineers to be very conservative in their design methods. The tests being conducted atop two "shake tables" at the University at Buffalo should help close those information gaps and lead to better-constructed buildings, says lead researcher Benjamin Schafer of the Whiting School of Engineering at Johns Hopkins. "This is the first time a full building of cold-formed steel framing has ever been tested in this way, so even the small things we're learning could have a huge impact," said Schafer, the Swirnow Family Scholar, professor and chair of the Department of Civil Engineering. "We'll see code changes and building design changes. We think this will ultimately lead to more economic, more efficient and more sustainable buildings." In May, Schafer's team began supervising a construction crew in assembling a first version of the test building.
This structure, about the size of a small real estate or medical office building, was mainly composed of the cold-formed steel skeleton and oriented strand board (OSB) sheathing. When those first tests were completed, that structure was torn down and replaced by an identical building that also included non-structural components such as stairs and interior walls. The researchers are trying to determine whether these additions, which do not support the frame of the building, can still help reduce damage during a quake. It is the second version of the test building that in August will face the strongest seismic forces, as recorded during the Northridge earthquake. At the test site, the construction of the buildings, the shake trials and the collection of data have been overseen by Kara Peterman of Fairfax, Va., a Johns Hopkins civil engineering doctoral student being supervised by Schafer. She has been gathering data from more than 150 sensors and eight video cameras installed in and around the test buildings. During a simulated quake, these instruments are designed to track the three-dimensional movement of the structure and to record any piece in the building that has "failed," such as beams that have bent or screws that have come loose. Peterman said tests on the first version of the building yielded surprisingly good results. "It moved a lot less than we were predicting," she said. "We did find one small portion of the steel that failed, but that was because of a conflict in the design plans, not because of the way it was constructed. And that small failure was purely local -- it didn't affect the structure as a whole." She said she is anxious to see how much the addition of interior walls and other non-structural components will add to the building's stability during the more powerful tests ahead. Peterman predicted that the final high-intensity test is likely to damage the building, but not enough to cause a catastrophic collapse. When the testing is completed and the results are analyzed, Schafer's team plans to incorporate the findings into computer models that will be shared freely with engineers who want to see on their desktop how their designs are likely to respond in an earthquake. "The modeling," Schafer said, "will create cost efficiencies and potentially save lives." In addition to the Johns Hopkins participants, academic researchers from the following schools have taken part in the project: Bucknell University, McGill University, University of North Texas and Virginia Tech. Steel industry partners who have provided technical expertise, materials and additional funding include Bentley Systems, Incorporated; ClarkDietrich Building Systems; Devco Engineering, Inc.; DSi Engineering; Mader Construction Company, Inc.; Simpson Strong-Tie Company, Inc.; the Steel Framing Industry Association; the Steel Stud Manufacturers Association; and the American Iron and Steel Institute. The research has been funded by National Science Foundation grant number 1041578. | Earthquakes | 2,013
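The desktop response models mentioned above are not reproduced here, but their simplest ancestor is a single-degree-of-freedom oscillator driven by ground acceleration. The sketch below uses made-up structural parameters and a synthetic "quake," stepped with the central-difference method; none of the numbers are from the Buffalo tests.

```python
import numpy as np

# One-storey structure as a single-degree-of-freedom oscillator:
#   m*u'' + c*u' + k*u = -m*ag(t),  u = drift relative to the ground.
# Mass, stiffness, damping and the ground motion are illustrative stand-ins.

m, k, zeta = 2.0e4, 8.0e6, 0.05            # kg, N/m, 5% damping (assumed)
wn = np.sqrt(k / m)                        # natural frequency, rad/s
c = 2.0 * zeta * np.sqrt(k * m)

dt = 0.002
t = np.arange(0.0, 20.0, dt)
rng = np.random.default_rng(3)
ag = 2.0 * rng.normal(size=t.size) * np.exp(-((t - 5.0) / 3.0) ** 2)  # m/s^2

u = np.zeros_like(t)
for i in range(1, t.size - 1):             # central-difference update
    u[i + 1] = (dt**2 * (-m * ag[i] - k * u[i])
                + m * (2.0 * u[i] - u[i - 1])
                + c * dt / 2.0 * u[i - 1]) / (m + c * dt / 2.0)

print(f"natural period {2 * np.pi / wn:.2f} s, "
      f"peak drift {np.abs(u).max() * 1e3:.1f} mm")
```

A design office would run a suite of recorded ground motions through models like this (and far richer ones) to estimate peak drifts against code limits.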
July 31, 2013 | https://www.sciencedaily.com/releases/2013/07/130731133159.htm | 'Highway from Hell' fueled Costa Rican volcano | If some volcanoes operate on geologic timescales, Costa Rica's Irazú had something of a short fuse. In a new study in the journal Nature, researchers trace its 1963-1965 eruption to magma that rose from the mantle in a matter of months. | "If we had had seismic instruments in the area at the time we could have seen these deep magmas coming," said the study's lead author, Philipp Ruprecht, a volcanologist at Columbia University's Lamont-Doherty Earth Observatory. "We could have had an early warning of months, instead of days or weeks." Towering more than 10,000 feet and covering almost 200 square miles, Irazú erupts about every 20 years or less, with varying degrees of damage. When it awakened in 1963, it erupted for two years, killing at least 20 people and burying hundreds of homes in mud and ash. Its last eruption, in 1994, did little damage. Irazú sits on the Pacific Ring of Fire, where oceanic crust is slowly sinking beneath the continents, producing some of Earth's most spectacular fireworks. Conventional wisdom holds that the mantle magma feeding those eruptions rises and lingers for long periods of time in a mixing chamber several miles below the volcano. But ash from Irazú's prolonged eruption is the latest to suggest that some magma may travel directly from the upper mantle, covering more than 20 miles in a few months. "There has to be a conduit from the mantle to the magma chamber," said study co-author Terry Plank, a geochemist at Lamont-Doherty. "We like to call it the highway from hell." Their evidence comes from crystals of the mineral olivine separated from the ashes of Irazú's 1963-1965 eruption, collected on a 2010 expedition to the volcano. As magma rising from the mantle cools, it forms crystals that preserve the conditions in which they formed. Unexpectedly, Irazú's crystals revealed spikes of nickel, a trace element found in the mantle. The spikes told the researchers that some of Irazú's erupted magma was so fresh the nickel had not had a chance to diffuse. "The study provides one more piece of evidence that it's possible to get magma from the mantle to the surface in very short order," said John Pallister, who heads the U.S. Geological Survey (USGS) Volcano Disaster Assistance Program in Vancouver, Wash. "It tells us there's a potentially shorter time span we need to worry about." Deep, fast-rising magma has been linked to other big events. In 1991, Mount Pinatubo in the Philippines spewed so much gas and ash into the atmosphere that it cooled Earth's climate. In the weeks before the eruption, seismographs recorded hundreds of deep earthquakes that USGS geologist Randall White later attributed to magma rising from the mantle-crust boundary. In 2010, a chain of eruptions at Iceland's Eyjafjallajökull volcano that caused widespread flight cancellations also indicated that some magma was coming from down deep. Small earthquakes set off by the eruptions suggested that the magma in Eyjafjallajökull's last two explosions originated 12 miles and 15 miles below the surface, according to a 2012 study by University of Cambridge researcher Jon Tarasewicz in Geophysical Research Letters. Volcanoes give off many warning signs before a blow-up. Their cones bulge with magma. They vent carbon dioxide and sulfur into the air, and throw off enough heat that satellites can detect their changing temperature. Below ground, tremors and other rumblings can be detected by seismographs.
When Indonesia's Mount Merapi roared to life in late October 2010, officials led a mass evacuation later credited with saving as many as 20,000 lives. Still, the forecasting of volcanic eruptions is not an exact science. Even if more seismographs could be placed along the flanks of volcanoes to detect deep earthquakes, it is unclear if scientists would be able to translate the rumblings into a projected eruption date. Most problematically, many apparent warning signs do not lead to an eruption, putting officials in a bind over whether to evacuate nearby residents. "[Several months] leaves a lot of room for error," said Erik Klemetti, a volcanologist at Denison University. "In volcanic hazards you have very few shots to get people to leave." Scientists may be able to narrow the window by continuing to look for patterns between eruptions and the earthquakes that precede them. Olivine minerals with nickel spikes similar to Irazú's have been found in the ashes of arc volcanoes in Mexico, Siberia and the Cascades of the U.S. Pacific Northwest, said Lamont geochemist Susanne Straub, whose ideas inspired the study. "It's clearly not a local phenomenon," she said. The researchers are currently analyzing crystals from past volcanic eruptions in Alaska's Aleutian Islands, Chile and Tonga, but are unsure how many will bear Irazú's fast-rising magma signature. "Some may be capable of producing highways from hell and some may not," said Ruprecht. | Earthquakes | 2,013
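The crystal-clock reasoning behind the study above -- a nickel spike survives only if ascent outpaces diffusion -- reduces to the scaling t ~ x²/D. The diffusivity, zoning width and source depth below are assumed illustrative values, not the study's measurements.

```python
# Order-of-magnitude "crystal clock": a nickel spike of width x in olivine
# smooths out by diffusion on a timescale t ~ x**2 / D. If the spike is
# still sharp, the magma rose faster than that. All numbers are assumed.

D = 5.0e-17                 # assumed Ni diffusivity in hot olivine, m^2/s
x = 20e-6                   # assumed zoning width, 20 micrometres

t_s = x**2 / D              # diffusive smoothing time, seconds
print(f"smoothing time ~ {t_s / (30 * 24 * 3600):.1f} months")

depth_km = 35.0             # assumed mantle-to-surface distance (~22 miles)
print(f"implied minimum ascent rate ~ {depth_km / (t_s / 86400):.2f} km/day")
```

With these assumptions the clock reads a few months, the same order as the ascent time reported for Irazú.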
July 30, 2013 | https://www.sciencedaily.com/releases/2013/07/130730163144.htm | Simulations aiding study of earthquake dampers for structures | Researchers have demonstrated the reliability and efficiency of "real-time hybrid simulation" for testing a type of powerful damping system that might be installed in buildings and bridges to reduce structural damage and injuries during earthquakes. | The magnetorheological-fluid dampers are shock-absorbing devices containing a liquid that becomes far more viscous when a magnetic field is applied. "It normally feels like a thick fluid, but when you apply a magnetic field it transforms into a peanut-butter consistency, which makes it generate larger forces when pushed through a small orifice," said Shirley Dyke, a professor of mechanical engineering and civil engineering at Purdue University. This dramatic increase in viscosity enables the devices to exert powerful forces and to modify a building's stiffness in response to motion during an earthquake. The magnetorheological-fluid dampers, or MR dampers, have seen limited commercial use and are not yet being used routinely in structures. Research led by Dyke and doctoral students Gaby Ou and Ali Ozdagli has now shown that real-time hybrid simulations are reliable in studying the dampers. The research is affiliated with the National Science Foundation's George E. Brown Jr. Network for Earthquake Engineering Simulation (NEES), a shared network of laboratories based at Purdue. Dyke and her students are working with researchers at the Harbin Institute of Technology in China, home to one of only a few large-scale shake-table facilities in the world. Findings will be discussed during the NEES Quake Summit 2013 on Aug. 7-8 in Reno. A research paper also was presented in May during a meeting in Italy related to a consortium called SERIES (Seismic Engineering Research Infrastructures for European Synergies). The paper was authored by Ou, Dyke, Ozdagli, and researchers Bin Wu and Bo Li from the Harbin Institute. "The results indicate that the real-time hybrid simulation concept can be considered as a reliable and efficient testing method," Ou said. The simulations are referred to as hybrid because they combine computational models with data from physical tests. "You have physical models and computational models being combined for one test," Dyke said. Researchers are able to perform structural tests at slow speed, but testing in real time -- at the actual speed of an earthquake -- sheds new light on how the MR dampers perform in structures. The real-time ability has only recently become feasible due to technological advances in computing. "Sometimes real-time testing is necessary, and that's where we focus our efforts," said Dyke, who organized a workshop on the subject to be held during the NEES meeting in Reno. "This hybrid approach is taking off lately. People are getting very excited about it." Ozdagli also is presenting related findings next week during the 2013 Conference of the ASCE Engineering Mechanics Institute in Evanston, Ill. The simulations can be performed in conjunction with research using full-scale building tests. However, there are very few large-scale facilities in the world, and the testing is time-consuming and expensive. "The real-time hybrid simulations allow you to do many tests to prepare for the one test using a full-scale facility," Dyke said. "The nice thing is that you can change the numerical model any way you want.
You can make it a four-story structure one day and the next day it's a 10-story structure. You can test an unlimited number of cases with a single physical setup." The researchers will present two abstracts during the Reno meeting. One focuses on how the simulation method has been improved, and the other describes the overall validation of real-time hybrid simulations. To prove the reliability of the approach, the researchers are comparing pure computational models, pure physical shake-table tests and the real-time hybrid simulation. Research results from this three-way comparison are demonstrating that the hybrid simulations are accurate. Ou has developed a mathematical approach to cancel out "noise" that makes it difficult to use testing data. She combined mathematical tools for a new "integrated control strategy" for the hybrid simulation. "She found that by integrating several techniques in the right mix you can get better performance than in prior tests," Dyke said. The researchers have validated the simulations. "It's a viable method that can be used by other researchers for many different purposes and in many different laboratories," Dyke said. Much of the research is based at Purdue's Robert L. and Terry L. Bowen Laboratory for Large-Scale Civil Engineering Research and has been funded by the NSF through NEES. A portion is supported by the Sohmen Fund, an endowment established by Purdue alumna Anna Pao Sohmen to facilitate faculty and student exchange with the Harbin Institute of Technology and Ningbo University. The fund is managed by International Programs at Purdue. | Earthquakes | 2,013
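A miniature of the hybrid loop itself: the numerical substructure (a one-storey frame) is stepped in time, and at each step the interface motion is "sent" to the physical substructure, whose measured force is fed back. In this sketch the tested damper is played by a software stand-in so the loop runs standalone; every parameter is an illustrative assumption, not the Purdue or Harbin setup.

```python
import numpy as np

# Real-time hybrid simulation, in miniature. Numerical substructure: a
# one-storey frame (m, k). Physical substructure: the damper under test --
# here faked in software. Each step, the frame state goes "out" to the
# damper and the measured force comes back into the equation of motion.

def physical_damper(u, v):
    """Stand-in for the rig's force measurement at displacement u, velocity v."""
    c_mr, f_fric = 5.0e3, 4.0e3                   # assumed damper constants
    return c_mr * v + f_fric * np.tanh(v / 0.01)  # viscous + friction-like term

m, k = 2.0e4, 8.0e6                               # assumed mass (kg), stiffness (N/m)
dt = 0.001
t = np.arange(0.0, 10.0, dt)
ag = 1.5 * np.sin(2 * np.pi * 2.0 * t) * np.exp(-((t - 3.0) / 1.5) ** 2)

u = v = peak = 0.0
for i in range(t.size):                           # semi-implicit Euler stepping
    f_back = physical_damper(u, v)                # "measurement" fed back
    a = (-m * ag[i] - k * u - f_back) / m         # numerical substructure
    v += a * dt
    u += v * dt
    peak = max(peak, abs(u))
print(f"peak drift with the damper in the loop: {peak * 1e3:.2f} mm")
```

The hard part in the real method is doing this exchange at wall-clock earthquake speed with actuators and sensors in the loop; the software skeleton, though, is just this exchange.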
July 23, 2013 | https://www.sciencedaily.com/releases/2013/07/130723073957.htm | Devastating long-distance impact of earthquakes | In 2006, the island of Java, Indonesia, was struck by a devastating earthquake, followed by the onset of a mud eruption to the east that flooded villages over several square kilometers and continues to erupt today. Until now, researchers believed the earthquake was too far from the mud volcano to trigger the eruption. Geophysicists at the University of Bonn, Germany and ETH Zurich, Switzerland have used computer-based simulations to show that such triggering is possible over long distances. The results have been published in Nature Geoscience. | On May 27, 2006, the ground of the Indonesian island of Java shook with a magnitude 6.3 earthquake. The epicenter was located 25 km southwest of the city of Yogyakarta, and the rupture initiated at a depth of 12 km. The earthquake took thousands of lives, injured ten thousand and destroyed buildings and homes. Forty-seven hours later, about 250 km from the earthquake hypocenter, a mud volcano formed that came to be known as "Lusi," short for "Lumpur Sidoarjo." Hot mud erupted in the vicinity of an oil drilling well, shooting mud up to 50 m into the sky and flooding the area. Scientists expect the mud volcano to be active for many more years. Was the eruption of the mud triggered by natural events, or was it human-made, caused by the nearby exploration well? Geophysicists at the University of Bonn, Germany and at ETH Zurich, Switzerland investigated this question with numerical wave-propagation experiments. "Many researchers believed that the earthquake epicenter was too far from Lusi to have activated the mud volcano," says Prof. Dr. Stephen A. Miller from the Department of Geodynamics at the University of Bonn. However, using computer simulations that include the geological features of the Lusi subsurface, the team of Stephen Miller concluded that the earthquake was the trigger, despite the long distance. The overpressured solid mud layer was trapped between layers with different acoustic properties, and this system was shaken by the earthquake and its aftershocks like a bottle of champagne. The key, however, is the reflections provided by the dome-shaped geology underneath Lusi, which focused the seismic waves of the earthquakes like an echo inside a cave. Prof. Stephen Miller explains: "Our simulations show that the dome-shaped structure with different properties focused seismic energy into the mud layer and could very well have liquefied the mud that then injected into nearby faults." Previous studies would have underestimated the energy of the seismic waves, as ground motion was only considered at the surface. However, the geophysicists at the University of Bonn suspect that the motions at the surface were much less intense than those at depth. The dome-like structure "kept" the seismic waves at depth and damped those that reached the surface. "This was actually a lower estimate of the focusing effect because only one wave cycle was input. This effect increases with each wave cycle because of the reducing acoustic impedance of the pressurizing mud layer." In response to claims that the reported highest-velocity layer used in the modeling is a measurement artifact, Miller says "that does not change our conclusions, because this effect will occur whenever a layer of low acoustic impedance is sandwiched between high-impedance layers, irrespective of the exact values of the impedances. And the source of the Lusi mud was the inside of the sandwich."
"It has already been proposed that a tectonic fault is connecting Lusi to a 15 km distant volcanic system. Prof. Miller explains "This connection probably supplies the mud volcano with heat and fluids that keep Lusi erupting actively up to today," explains Miller.With their publication, scientists from Bonn and Zürich point out, that earthquakes can trigger processes over long distances, and this focusing effect may apply to other hydrothermal and volcanic systems. Stephen Miller concludes: "Being a geological rarity, the mud volcano may contribute to a better understanding of triggering processes and relationships between seismic and volcanic activity." Miller also adds „maybe this work will settle the long-standing controversy and focus instead on helping those affected." The island of Java is part of the so called Pacific Ring of Fire, a volcanic belt which surrounds the entire Pacific Ocean. Here, oceanic crust is subducted underneath oceanic and continental tectonic plates, leading to melting of crustal material at depth. The resulting magma uprises and is feeding numerous volcanoes. | Earthquakes | 2,013 |
July 14, 2013 | https://www.sciencedaily.com/releases/2013/07/130714160521.htm | Some volcanoes 'scream' at ever higher pitches until they blow their tops | It is not unusual for swarms of small earthquakes to precede a volcanic eruption. They can reach a point of such rapid succession that they create a signal called harmonic tremor that resembles sound made by various types of musical instruments, though at frequencies much lower than humans can hear. | A new analysis of an eruption sequence at Alaska's Redoubt Volcano in March 2009 shows that the harmonic tremor glided to substantially higher frequencies and then stopped abruptly just before six of the eruptions, five of them coming in succession. "The frequency of this tremor is unusually high for a volcano, and it's not easily explained by many of the accepted theories," said Alicia Hotovec-Ellis, a University of Washington doctoral student in Earth and space sciences. Documenting the activity gives clues to a volcano's pressurization right before an explosion. That could help refine models and allow scientists to better understand what happens during eruptive cycles in volcanoes like Redoubt, she said. The source of the earthquakes and harmonic tremor isn't known precisely. Some volcanoes emit sound when magma -- a mixture of molten rock, suspended solids and gas bubbles -- resonates as it pushes up through thin cracks in Earth's crust. But Hotovec-Ellis believes in this case the earthquakes and harmonic tremor happen as magma is forced through a narrow conduit under great pressure into the heart of the mountain. The thick magma sticks to the rock surface inside the conduit until the pressure is enough to move it higher, where it sticks until the pressure moves it again. Each of these sudden movements results in a small earthquake, ranging in magnitude from about 0.5 to 1.5, she said. As the pressure builds, the quakes get smaller and happen in such rapid succession that they blend into a continuous harmonic tremor. "Because there's less time between each earthquake, there's not enough time to build up enough pressure for a bigger one," Hotovec-Ellis said. "After the frequency glides up to a ridiculously high frequency, it pauses and then it explodes." She is the lead author of a forthcoming paper in the Journal of Volcanology and Geothermal Research that describes the research. Co-authors are John Vidale of the UW and Stephanie Prejean and Joan Gomberg of the U.S. Geological Survey. Hotovec-Ellis is a co-author of a second paper, published online July 14 in Nature Geoscience, that introduces a new "frictional faulting" model as a tool to evaluate the tremor mechanism observed at Redoubt in 2009. The lead author of that paper is Ksenia Dmitrieva of Stanford University, and other co-authors are Prejean and Eric Dunham of Stanford. The pause in the harmonic tremor frequency increase just before the volcanic explosion is the main focus of the Nature Geoscience paper. "We think the pause is when even the earthquakes can't keep up anymore and the two sides of the fault slide smoothly against each other," Hotovec-Ellis said. She documented the rising tremor frequency, starting at about 1 hertz (or cycle per second) and gliding upward to about 30 hertz. 
In humans, the audible frequency range starts at about 20 hertz, but a person lying on the ground directly above the magma conduit might be able to hear the harmonic tremor when it reaches its highest point (it is not an activity she would advise, since the tremor is closely followed by an explosion). Scientists at the USGS Alaska Volcano Observatory have dubbed the highest-frequency harmonic tremor at Redoubt Volcano "the screams" because they reach such a high pitch compared with a 1-to-5 hertz starting point. Hotovec-Ellis created two recordings of the seismic activity. A 10-second recording covers about 10 minutes of seismic sound and harmonic tremor, sped up 60 times. A one-minute recording condenses about an hour of activity that includes more than 1,600 small earthquakes that preceded the first explosion with harmonic tremor. Upward-gliding tremor immediately before a volcanic explosion also has been documented at the Arenal Volcano in Costa Rica and Soufrière Hills volcano on the Caribbean island of Montserrat. "Redoubt is unique in that it is much clearer that that is what's going on," Hotovec-Ellis said. "I think the next step is understanding why the stresses are so high." The work was funded in part by the USGS and the National Science Foundation. | Earthquakes | 2,013
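Tracking a gliding tremor frequency of this kind is typically done with a spectrogram. The sketch below is illustrative only: it analyzes a synthetic chirp standing in for a real Redoubt seismogram, and the sampling rate and window length are arbitrary choices, not the study's parameters.

```python
import numpy as np
from scipy.signal import spectrogram, chirp

# Synthetic stand-in for a seismogram: a tone gliding from 1 Hz to 30 Hz
# over 10 minutes, loosely mimicking the upward-gliding tremor at Redoubt.
fs = 100.0                       # samples per second
t = np.arange(0, 600.0, 1 / fs)  # 10 minutes of data
signal = chirp(t, f0=1.0, f1=30.0, t1=t[-1], method="linear")
signal += 0.3 * np.random.default_rng(0).standard_normal(t.size)  # noise

# Track the dominant frequency in successive windows.
f, times, Sxx = spectrogram(signal, fs=fs, nperseg=1024)
peak_freq = f[np.argmax(Sxx, axis=0)]

for sec, hz in zip(times[::10], peak_freq[::10]):
    print(f"t = {sec:5.1f} s  dominant frequency ~ {hz:4.1f} Hz")
```

The 60x speed-up mentioned in the article also explains why the tremor becomes audible on the recordings: playing the record 60 times faster multiplies every frequency by 60, moving a 1-30 hertz glide into the clearly audible 60-1800 hertz band.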
July 12, 2013 | https://www.sciencedaily.com/releases/2013/07/130712095205.htm | Induced seismicity? Recent spike of earthquakes in the central and eastern U.S. may be linked to human activity | The number of earthquakes has increased dramatically over the past few years within the central and eastern United States. More than 300 earthquakes above a magnitude 3.0 occurred in the three years from 2010-2012, compared with an average rate of 21 events per year observed from 1967-2000. | This increase in earthquakes prompts two important questions: Are they natural, or human-made? And what should be done in the future as we address the causes and consequences of these events to reduce associated risks? U.S. Geological Survey scientists have been analyzing the changes in the rate of earthquakes as well as the likely causes, and they have some answers. USGS scientists have found that at some locations the increase in seismicity coincides with the injection of wastewater in deep disposal wells. Much of this wastewater is a byproduct of oil and gas production and is routinely disposed of by injection into wells specifically designed and approved for this purpose. U.S. Geological Survey geophysicist William Ellsworth reviewed the issue of injection-induced earthquakes in a recent study published in the journal Science. The article focused on the injection of fluids into deep wells as a common practice for disposal of wastewater, and discusses recent events and key scientific challenges for assessing this hazard and moving forward to reduce associated risks. Although it may seem like science fiction, human-made earthquakes have been a reality for decades. It has long been understood that earthquakes can be induced by impoundment of water in reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Water that is salty or polluted by chemicals needs to be disposed of in a manner that prevents it from contaminating freshwater sources. Often, it is most economical to geologically sequester such wastewaters by injecting them underground, deep below any aquifers that provide drinking water. Wastewater can result from a variety of processes related to energy production. For example, water is usually present in rock formations containing oil and gas and therefore will be co-produced during oil and gas production. Wastewater can also occur as flowback from hydraulic fracturing operations that involve injecting water under high pressure into a rock formation to stimulate the movement of oil and gas to a well for production. When wastewater disposal takes place near faults, and underground conditions are right, earthquakes may be more likely to occur, Ellsworth's research showed. Specifically, an earthquake can be triggered by the well-understood mechanism of raising the water pressure inside a fault. If the pressure increases enough, the fault may fail, releasing stored tectonic stress in the form of an earthquake. Even faults that have not moved in millions of years can be made to slip and cause an earthquake if conditions underground are right. While the disposal process has the potential to trigger earthquakes, not every wastewater disposal well produces earthquakes. In fact, very few of the more than 30,000 wells designed for this purpose appear to cause earthquakes. Many questions have been raised about whether hydraulic fracturing -- commonly known as "fracking" -- is responsible for the recent increase in earthquakes. 
USGS's studies suggest that the actual hydraulic fracturing process is only very rarely the direct cause of felt earthquakes. While hydraulic fracturing works by making thousands of extremely small "microearthquakes," they are rarely felt and are too small to cause structural damage. As noted previously, wastewater associated with hydraulic fracturing has been linked to some, but not all, of the induced earthquakes. USGS scientists are dedicated to gaining a better understanding of the geological conditions and industrial practices associated with induced earthquakes, and to determining how seismic risk can be managed. One risk-management approach highlighted in Ellsworth's article involves the setting of seismic activity thresholds for safe operation. Under this "traffic-light" system, if seismic activity exceeds preset thresholds, reductions in injection would be made. If seismicity continued or escalated, operations could be suspended. The current regulatory framework for wastewater disposal wells was designed to protect drinking water sources from contamination and does not address earthquake safety. Ellsworth noted that one consequence is that both the quantity and timeliness of information on injection volumes and pressures reported to the regulatory agencies are far from ideal for managing earthquake risk from injection activities. Thus, improvements in the collection and reporting of injection data to regulatory agencies would provide much-needed information on conditions potentially associated with induced seismicity. In particular, said Ellsworth, daily reporting of injection volumes and peak and average injection pressures would be a step in the right direction, as would measurement of the pre-injection water pressure and tectonic stress. There is a growing interest in understanding the risks associated with injection-induced earthquakes, especially in the areas of the country where damaging earthquakes are rare. For example, wastewater disposal appears to have induced the magnitude-5.6 earthquake that struck rural central Oklahoma in 2011, leading to a few injuries and damage to more than a dozen homes. Damage from an earthquake of this magnitude would be even worse if it were to happen in a more densely populated area. As the use of injection for disposal of wastewater increases, the importance of knowing the associated risks also grows. To meet these challenges, the USGS hopes to increase research efforts to understand the causes and effects of injection-induced earthquakes. The USGS has FAQs online ( | Earthquakes | 2,013
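The headline numbers in this article -- roughly 21 magnitude-3+ events per year from 1967-2000 versus more than 300 in 2010-2012 -- can be sanity-checked with a simple Poisson rate calculation. This back-of-the-envelope sketch is not USGS methodology; it only shows how decisively the observed count rejects the old background rate.

```python
import math

# How unlikely are 300+ M>=3 events in 3 years if the 1967-2000
# background rate of ~21 events/year still held?
background_rate = 21.0        # events per year, 1967-2000 average
years = 3.0
observed = 300

mu = background_rate * years  # expected count under the old rate: 63

# Upper-tail probability P(N >= observed) for a Poisson(mu) variable,
# via a normal approximation with continuity correction (the exact sum
# is numerically hopeless this far into the tail).
z = (observed - 0.5 - mu) / math.sqrt(mu)
p_tail = 0.5 * math.erfc(z / math.sqrt(2.0))

print(f"expected {mu:.0f} events, observed {observed}, z = {z:.1f}")
print(f"P(N >= {observed}) ~ {p_tail:.1e}")
```

A z-score near 30 means the three-year count is wildly inconsistent with the historical rate, which is the quantitative core of the "recent spike" claim.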
July 11, 2013 | https://www.sciencedaily.com/releases/2013/07/130711142401.htm | Distant earthquakes trigger tremors at U.S. waste-injection sites, says study | Large earthquakes from distant parts of the globe are setting off tremors around waste-fluid injection wells in the central United States, says a new study. Furthermore, such distantly triggered minor quakes could be precursors to larger events at sites where pressure from waste injection has pushed faults close to failure, say researchers. | The 2010 Chile earthquake set off tremors near waste-injection sites in central Oklahoma and southern Colorado, says a new study. Among the sites covered: a set of injection wells near Prague, Okla., where the study says a huge earthquake in Chile on Feb. 27, 2010, triggered a mid-size quake less than a day later, followed by months of smaller tremors. This culminated in probably the largest quake yet associated with waste injection, a magnitude 5.7 event which shook Prague on Nov. 6, 2011. Earthquakes off Japan in 2011, and Sumatra in 2012, similarly set off mid-size tremors around injection wells in western Texas and southern Colorado, says the study. The paper appears this week in a leading journal. "The fluids are driving the faults to their tipping point," said lead author Nicholas van der Elst, a postdoctoral researcher at Columbia University's Lamont-Doherty Earth Observatory. "The remote triggering by big earthquakes is an indication the area is critically stressed." Tremors triggered by distant large earthquakes have been identified before, especially in places like Yellowstone National Park and some volcanically active subduction zones offshore, where subsurface water superheated by magma can weaken faults, making them highly vulnerable to seismic waves passing by from somewhere else. A surge in U.S. energy production in the last decade or so has sparked what appears to be a rise in small to mid-sized earthquakes in the United States. Large amounts of water are used both to crack open rocks to release natural gas through hydrofracking, and to coax oil and gas from underground wells using conventional techniques. After the gas and oil have been extracted, the brine and chemical-laced water must be disposed of, and is often pumped back underground elsewhere, sometimes causing earthquakes. "These passing seismic waves are like a stress test," said study coauthor Heather Savage, a geophysicist at Lamont-Doherty. "If the number of small earthquakes increases, it could indicate that faults are becoming critically stressed and might soon host a larger earthquake." The 2010 magnitude 8.8 Chile quake, which killed more than 500 people, sent surface waves rippling across the planet, triggering a magnitude 4.1 quake near Prague 16 hours later, the study says. The activity near Prague continued until the magnitude 5.7 quake on Nov. 6, 2011 that destroyed 14 homes and injured two people. A study earlier this year led by seismologist Katie Keranen, also a coauthor of the new study, now at Cornell University, found that the first rupture occurred less than 650 feet away from active injection wells. 
In April 2012, a magnitude 8.6 earthquake off Sumatra triggered another swarm of earthquakes in the same place. The pumping of fluid into the field continues to this day, along with a pattern of small quakes. The 2010 Chile quake also set off a swarm of earthquakes on the Colorado-New Mexico border, in Trinidad, near wells where wastewater used to extract methane from coal beds had been injected, the study says. The swarm was followed more than a year later, on Aug. 22, 2011, by a magnitude 5.3 quake that damaged dozens of buildings. A steady series of earthquakes had already struck Trinidad in the past, including a magnitude 4.6 quake in 2001 that the U.S. Geological Survey (USGS) has investigated for links to wastewater injection. The new study also found that Japan's devastating magnitude 9.0 earthquake on March 11, 2011, triggered a swarm of earthquakes in the west Texas town of Snyder, where injection of fluid to extract oil from the nearby Cogdell fields has been setting off earthquakes for years, according to a 1989 study in the Bulletin of the Seismological Society of America. About six months after the Japan quake, a magnitude 4.5 quake struck Snyder. The idea that seismic activity can be triggered by separate earthquakes taking place far away was once controversial. One of the first cases to be documented was the magnitude 7.3 earthquake that shook California's Mojave Desert in 1992, near the town of Landers, setting off a series of distant events in regions with active hot springs, geysers and volcanic vents. The largest was a magnitude 5.6 quake beneath Little Skull Mountain in southern Nevada, 150 miles away; the farthest, a series of tiny earthquakes north of Yellowstone caldera, according to a 1993 study in Science led by USGS geophysicist David Hill. In 2002, the magnitude 7.9 Denali earthquake in Alaska triggered a series of earthquakes at Yellowstone, nearly 2,000 miles away, throwing off the schedules of some of its most predictable geysers, according to a 2004 study in Geology led by Stephan Husen, a seismologist at the Swiss Federal Institute of Technology in Zürich. The Denali quake also triggered bursts of slow tremors in and around California's San Andreas, San Jacinto and Calaveras faults, according to a 2008 study in Science led by USGS geophysicist Joan Gomberg. "We've known for at least 20 years that shaking from large, distant earthquakes can trigger seismicity in places with naturally high fluid pressure, like hydrothermal fields," said study coauthor Geoffrey Abers, a seismologist at Lamont-Doherty. "We're now seeing earthquakes in places where humans are raising pore pressure." The new study may be the first to find evidence of triggered earthquakes on faults critically stressed by waste injection. If it can be replicated and extended to other sites at risk of human-made earthquakes, it could "help us understand where the stresses are," said William Ellsworth, an expert on human-induced earthquakes with the USGS who was not involved in the study. In the same issue of | Earthquakes | 2,013
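Remote-triggering studies like this one usually quantify "a swarm of earthquakes" as a statistically significant rate change after the distant mainshock; a common tool for that in this literature is the beta statistic. The sketch below is a generic version with made-up numbers, not the code or data behind this particular paper.

```python
import numpy as np

def beta_statistic(n_after, n_total, t_after, t_total):
    """
    Beta statistic as commonly used in remote-triggering studies:
    compares the count in a window after a distant mainshock with the
    count expected from the long-term rate. Values around 2 or more are
    often read as a significant rate increase. (Generic sketch only.)
    """
    frac = t_after / t_total
    expected = n_total * frac
    variance = n_total * frac * (1.0 - frac)
    return (n_after - expected) / np.sqrt(variance)

# Hypothetical numbers for a well site: 120 small quakes in a 10-year
# catalog, 9 of them in the 30 days after a distant great earthquake.
beta = beta_statistic(n_after=9, n_total=120, t_after=30.0, t_total=3652.5)
print(f"beta = {beta:.1f}")  # ~8 -> a strong apparent rate increase
```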
July 11, 2013 | https://www.sciencedaily.com/releases/2013/07/130711142355.htm | Geothermal power facility induces earthquakes, study finds | An analysis of earthquakes in the area around the Salton Sea Geothermal Field in southern California has found a strong correlation between seismic activity and operations for production of geothermal power, which involve pumping water into and out of an underground reservoir. | "We show that the earthquake rate in the Salton Sea tracks a combination of the volume of fluid removed from the ground for power generation and the volume of wastewater injected," said Emily Brodsky, a geophysicist at the University of California, Santa Cruz, and lead author of the study, published online. "The findings show that we might be able to predict the earthquakes generated by human activities. To do this, we need to take a large view of the system and consider both the water coming in and out of the ground," said Brodsky, a professor of Earth and planetary sciences at UCSC. Brodsky and coauthor Lia Lajoie, who worked on the project as a UCSC graduate student, studied earthquake records for the region from 1981 through 2012. They compared earthquake activity with production data for the geothermal power plant, including records of fluid injection and extraction. The power plant is a "flash-steam facility," which pulls hot water out of the ground, flashes it to steam to run turbines, and recaptures as much water as possible for injection back into the ground. Due to evaporative losses, less water is pumped back in than is pulled out, so the net effect is fluid extraction. During the period of relatively low-level geothermal operations before 1986, the rate of earthquakes in the region was also low. Seismicity increased as the operations expanded. After 2001, both geothermal operations and seismicity climbed steadily. The researchers tracked the variation in net extraction over time and compared it to seismic activity. The relationship is complicated because earthquakes are naturally clustered due to local aftershocks, and it can be difficult to separate secondary triggering (aftershocks) from the direct influence of human activities. The researchers developed a statistical method to separate out the aftershocks, allowing them to measure the "background rate" of primary earthquakes over time. "We found a good correlation between seismicity and net extraction," Brodsky said. "The correlation was even better when we used a combination of all the information we had on fluid injection and net extraction. The seismicity is clearly tracking the changes in fluid volume in the ground." The vast majority of the induced earthquakes are small, and the same is true of earthquakes in general. The key question is what is the biggest earthquake that could occur in the area, Brodsky said. The largest earthquake in the region of the Salton Sea Geothermal Field during the 30-year study period was a magnitude 5.1 earthquake. The nearby San Andreas fault, however, is capable of unleashing extremely destructive earthquakes of at least magnitude 8, Brodsky said. 
The location of the geothermal field at the southern end of the San Andreas fault is cause for concern due to the possibility of inducing a damaging earthquake. "It's hard to draw a direct line from the geothermal field to effects on the San Andreas fault, but it seems plausible that they could interact," Brodsky said. At its southern end, the San Andreas fault runs into the Salton Sea, and it's not clear what faults there might be beneath the water. A seismically active region known as the Brawley Seismic Zone extends from the southern end of the San Andreas fault to the northern end of the Imperial fault. The Salton Sea Geothermal Field, located on the southeastern edge of the Salton Sea, is one of four operating geothermal fields in the area. | Earthquakes | 2,013
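The analysis described here has two steps: estimate a declustered "background rate" of earthquakes, then correlate it with net fluid extraction. The sketch below compresses that idea into a toy calculation on synthetic monthly data; the numbers, the Poisson model and the plain Pearson correlation are all illustrative stand-ins for the study's actual statistical method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical monthly series for a geothermal field (not the real data):
months = 120
net_extraction = np.linspace(1.0, 4.0, months)       # rising net volume removed
net_extraction += 0.2 * rng.standard_normal(months)  # operational noise

# Background seismicity sketched as Poisson counts whose rate tracks net
# extraction -- the relationship the study reports for the Salton Sea.
background_rate = 2.0 * net_extraction
quake_counts = rng.poisson(background_rate)

# Pearson correlation between fluid volume and background earthquake rate.
r = np.corrcoef(net_extraction, quake_counts)[0, 1]
print(f"correlation between net extraction and seismicity: r = {r:.2f}")
```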
July 5, 2013 | https://www.sciencedaily.com/releases/2013/07/130705101538.htm | Ancient jigsaw puzzle of past supercontinent revealed | A new study published today in the journal | Researchers from Royal Holloway University, The Australian National University and Geoscience Australia have helped clear up previous uncertainties on how the plates evolved and where they should be positioned when drawing up a picture of the past. Dr Lloyd White from the Department of Earth Sciences at Royal Holloway University said: "The Earth's tectonic plates move around through time. As these movements occur over many millions of years, it has previously been difficult to produce accurate maps of where the continents were in the past." "We used a computer program to move geological maps of Australia, India and Antarctica back through time and built a 'jigsaw puzzle' of the supercontinent Gondwana. During the process, we found that many existing studies had positioned the plates in the wrong place because the geological units did not align on each plate." The researchers adopted an old technique used by people who discovered the theories of continental drift and plate tectonics, but which had largely been ignored by many modern scientists. "It was a simple technique, matching the geological boundaries on each plate. The geological units formed before the continents broke apart, so we used their position to put this ancient jigsaw puzzle back together again," Dr White added. "It is important that we know where the plates existed many millions of years ago, and how they broke apart, as the regions where plates break are often where we find major oil and gas deposits, such as those that are found along Australia's southern margin." A video demonstrating the ancient jigsaw puzzle can be viewed here: | Earthquakes | 2,013
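Moving "geological maps back through time" means applying finite rotations about Euler poles to points on a sphere. The sketch below shows the core operation; the pole location and rotation angle are invented for illustration, since real reconstructions fit these quantities so that geological boundaries line up across plates.

```python
import numpy as np

def rotation_matrix(pole_lat, pole_lon, angle_deg):
    """Rotation matrix for a finite rotation about an Euler pole (degrees)."""
    lat, lon, ang = np.radians([pole_lat, pole_lon, angle_deg])
    k = np.array([np.cos(lat) * np.cos(lon),   # unit vector along the pole
                  np.cos(lat) * np.sin(lon),
                  np.sin(lat)])
    K = np.array([[0, -k[2], k[1]],            # cross-product matrix
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    # Rodrigues' rotation formula
    return np.eye(3) + np.sin(ang) * K + (1 - np.cos(ang)) * (K @ K)

def rotate_point(lat, lon, R):
    """Rotate a geographic point and return its new latitude/longitude."""
    la, lo = np.radians([lat, lon])
    v = np.array([np.cos(la) * np.cos(lo), np.cos(la) * np.sin(lo), np.sin(la)])
    x, y, z = R @ v
    return np.degrees(np.arcsin(z)), np.degrees(np.arctan2(y, x))

# Hypothetical Euler pole and angle, purely for illustration.
R = rotation_matrix(pole_lat=20.0, pole_lon=-40.0, angle_deg=55.0)
print(rotate_point(-25.0, 135.0, R))  # a point on "Australia", restored
```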
June 26, 2013 | https://www.sciencedaily.com/releases/2013/06/130626142936.htm | Location of upwelling in Earth's mantle discovered to be stable | A study published in | Conrad has studied patterns of tectonic plates throughout his career, and has long noticed that the plates were, on average, moving northward. "Knowing this," explained Conrad, "I was curious if I could determine a single location in the Northern Hemisphere toward which all plates are converging, on average." After locating this point in eastern Asia, Conrad then wondered if other special points on Earth could characterize plate tectonics. "With some mathematical work, I described the plate tectonic 'quadrupole', which defines two points of 'net convergence' and two points of 'net divergence' of tectonic plate motions." When the researchers computed the plate tectonic quadrupole locations for present-day plate motions, they found that the net divergence locations were consistent with the African and central Pacific locations where scientists think that mantle upwellings are occurring today. "This observation was interesting and important, and it made sense," said Conrad. "Next, we applied this formula to the time history of plate motions and plotted the points -- I was astonished to see that the points have not moved over geologic time!" Because plate motions are merely the surface expression of the underlying dynamics of the Earth's mantle, Conrad and his colleagues were able to infer that upwelling flow in the mantle must also remain stable over geologic time. "It was as if I was seeing the 'ghosts' of ancient mantle flow patterns, recorded in the geologic record of plate motions!" Earth's mantle dynamics govern many aspects of geologic change on the Earth's surface. This recent discovery that mantle upwelling has remained stable and centered on two locations (beneath Africa and the Central Pacific) provides a framework for understanding how mantle dynamics can be linked to surface geology over geologic time. For example, the researchers can now estimate how individual continents have moved relative to these two upwelling locations. This allows them to tie specific events that are observed in the geologic record to the mantle forces that ultimately caused these events. More broadly, this research opens up a big question for solid earth scientists: What processes cause these two mantle upwelling locations to remain stable within a complex and dynamically evolving system such as the mantle? One notable observation is that the lowermost mantle beneath Africa and the Central Pacific seems to be composed of rock assemblages that are different from the rest of the mantle. Is it possible that these two anomalous regions at the bottom of the mantle are somehow organizing flow patterns for the rest of the mantle? How? "Answering such questions is important because geologic features such as ocean basins, mountain belts, earthquakes and volcanoes ultimately result from Earth's interior dynamics," Conrad described. "Thus, it is important to understand the time-dependent nature of our planet's interior dynamics in order to better understand the geological forces that affect the planetary surface that is our home." The mantle flow framework that can be defined as a result of this study allows geophysicists to predict surface uplift and subsidence patterns as a function of time. These vertical motions of continents and seafloor cause both local and global changes in sea level. 
In the future, Conrad wants to use this new understanding of mantle flow patterns to predict changes in sea level over geologic time. By comparing these predictions to observations of sea level change, he hopes to develop new constraints on the influence of mantle dynamics on sea level. | Earthquakes | 2,013 |
June 18, 2013 | https://www.sciencedaily.com/releases/2013/06/130618113717.htm | Seismic gap outside of Istanbul: Is this where the expected Marmara earthquake will originate from? | Earthquake researchers have now identified a 30-kilometer-long and ten-kilometer-deep area along the North Anatolian fault zone just south of Istanbul that could be the starting point for a strong earthquake. The group of seismologists, including Professor Marco Bohnhoff of the GFZ German Research Centre for Geosciences, reported their findings in the current online issue of the scientific journal | The Istanbul-Marmara region of northwestern Turkey, with a population of more than 15 million, faces a high probability of being exposed to an earthquake of magnitude 7 or more. To better understand the processes taking place before a strong earthquake at a critically stressed fault zone, a seismic monitoring network was built on the Princes Islands in the Sea of Marmara off Istanbul under the auspices of the Potsdam Helmholtz Centre GFZ together with the Kandilli Earthquake Observatory in Istanbul. The Princes Islands offer the only opportunity to monitor the seismic zone running below the seafloor from a distance of a few kilometers. The data now available allow the scientists working with GFZ researcher Marco Bohnhoff to conclude that the area just outside the historic city of Istanbul is locked at depth: "The block we identified reaches ten kilometers deep along the fault zone and has displayed no seismic activity since measurements began over four years ago. This could be an indication that the expected Marmara earthquake could originate there," says Bohnhoff. This is also supported by the fact that the fracture zone of the last strong earthquake in the region, in 1999, ended precisely in this area -- probably at the same structure, which has been impeding the progressive shift of the Anatolian plate in the south against the Eurasian plate in the north since 1766 and building up pressure. The results are also being compared with findings from other fault zones, such as the San Andreas Fault in California, to better understand the physical processes before an earthquake. Currently, the GFZ is intensifying its activity to monitor the earthquake zone in front of Istanbul. Together with the Disaster and Emergency Management Presidency of Turkey (AFAD), several 300-meter-deep holes are currently being drilled around the eastern Marmara Sea, into which highly sensitive borehole seismometers will be placed. With this Geophysical borehole Observatory at the North Anatolian Fault (GONAF), measurement accuracy and the detection threshold for microearthquakes are improved many times over. In addition, the new data also provide insights into the expected ground motion in the event of an earthquake in the region. Bohnhoff: "Earthquake prediction is scientifically impossible. But studies such as this provide a way to better characterize earthquakes in advance in terms of location, magnitude and rupture progression, and therefore allow a better assessment of damage risk." | Earthquakes | 2,013
June 17, 2013 | https://www.sciencedaily.com/releases/2013/06/130617104614.htm | New 'embryonic' subduction zone found | A new subduction zone forming off the coast of Portugal heralds the beginning of a cycle that will see the Atlantic Ocean close as continental Europe moves closer to America. | Lead author Dr João Duarte, from the School of Geosciences, said the team mapped the ocean floor and found it was beginning to fracture, indicating tectonic activity around the apparently passive South West Iberia plate margin. "What we have detected is the very beginnings of an active margin -- it's like an embryonic subduction zone," Dr Duarte said. "Significant earthquake activity, including the 1755 quake which devastated Lisbon, indicated that there might be convergent tectonic movement in the area. For the first time, we have been able to provide not only evidence that this is indeed the case, but also a consistent driving mechanism." The incipient subduction in the Iberian zone could signal the start of a new phase of the Wilson Cycle -- where plate movements break up supercontinents, like Pangaea, and open oceans, stabilise and then form new subduction zones which close the oceans and bring the scattered continents back together. This break-up and reformation of supercontinents has happened at least three times, over more than four billion years, on Earth. The Iberian subduction will gradually pull Iberia towards the United States over approximately 220 million years. The findings provide a unique opportunity to observe a passive margin becoming active -- a process that will take around 20 million years. Even at this early phase the site will yield data that is crucial to refining the geodynamic models. "Understanding these processes will certainly provide new insights into how subduction zones may have initiated in the past and how oceans start to close," Dr Duarte said. | Earthquakes | 2,013
June 12, 2013 | https://www.sciencedaily.com/releases/2013/06/130612133140.htm | When will the next megathrust hit the west coast of North America? | Understanding the size and frequency of large earthquakes along the Pacific coast of North America is of great importance, not just to scientists, but also to government planners and the general public. The only way to predict the frequency and intensity of the ground motion expected from large and giant "megathrust" earthquakes along Canada's west coast is to analyze the geologic record. | A new study published today does just that. One of the co-authors of the study, Dr. Audrey Dallimore, Associate Professor at Royal Roads University, explains: "Some BC coastal fjords preserve annually layered organic sediments going back all the way to deglacial times. In Effingham Inlet, on the west coast of Vancouver Island, these sediments reveal disturbances we interpret as having been caused by earthquakes. With our very detailed age model that includes 68 radiocarbon dates and the Mazama Ash deposit (from a volcanic eruption that took place 6800 years ago), we have identified 22 earthquake shaking events over the last 11,000 years, giving an estimate of a recurrence interval for large and megathrust earthquakes of about 500 years. However, it appears that the time between major shaking events can stretch up to about 1,000 years." "The last megathrust earthquake originating from the Cascadia subduction zone occurred in 1700 AD. Therefore, we are now in the risk zone of another earthquake. Even though it could be tomorrow or perhaps even centuries before it occurs, paleoseismic studies such as this one can help us understand the nature and frequency of rupture along the Cascadia Subduction Zone, and help Canadian coastal communities to improve their hazard assessments and emergency preparedness plans." "This exceptionally well-dated paleoseismic study by Enkin et al. involved a multi-disciplinary team of Canadian university and federal government scientists, and a core from the 2002 international drill program Marges Ouest Nord Américaines (MONA) campaign," says Dr. Olav Lian, an associate editor of the journal. In addition to analyzing the Effingham Inlet record for earthquake events, this study site has also revealed much information about climate and ocean changes throughout the Holocene to the present. These findings also clearly illustrate the importance of analyzing the geologic record to help today's planners and policy makers, and ultimately to increase the resiliency of Canadian communities. | Earthquakes | 2,013
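The article's numbers -- a mean recurrence of roughly 500 years, gaps stretching to about 1,000 years, and a last event in 1700 AD -- can be turned into a rough conditional hazard estimate. The sketch below assumes a lognormal renewal model with a coefficient of variation of 0.5; both the distribution and its variability are illustrative choices, not parameters from the study.

```python
import numpy as np
from scipy.stats import lognorm

# Illustrative renewal-model calculation from the article's numbers.
mean_recurrence = 500.0                      # years, from the sediment record
cov = 0.5                                    # assumed aperiodicity (not from study)
sigma = np.sqrt(np.log(1 + cov**2))
mu = np.log(mean_recurrence) - sigma**2 / 2  # chosen so the mean is 500 yr

dist = lognorm(s=sigma, scale=np.exp(mu))
elapsed = 2013 - 1700                        # 313 years since the last event

for horizon in (50.0, 100.0):
    # P(event within the horizon, given no event in the elapsed time)
    p = (dist.cdf(elapsed + horizon) - dist.cdf(elapsed)) / dist.sf(elapsed)
    print(f"P(event within {horizon:.0f} yr | quiet for {elapsed} yr) ~ {p:.0%}")
```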
June 6, 2013 | https://www.sciencedaily.com/releases/2013/06/130606155132.htm | Earthquake acoustics can indicate if a massive tsunami is imminent | Stanford scientists have identified key acoustic characteristics of the 2011 Japan earthquake that indicated it would cause a large tsunami. The technique could be applied worldwide to create an early warning system for massive tsunamis. | On March 11, 2011, a magnitude 9.0 undersea earthquake occurred 43 miles off the shore of Japan. The earthquake generated an unexpectedly massive tsunami that washed over eastern Japan roughly 30 minutes later, killing more than 15,800 people and injuring more than 6,100. More than 2,600 people are still unaccounted for. Now, computer simulations by Stanford scientists reveal that sound waves in the ocean produced by the earthquake probably reached land tens of minutes before the tsunami. If correctly interpreted, they could have offered a warning that a large tsunami was on the way. Although various systems can detect undersea earthquakes, they can't reliably tell which will form a tsunami, or predict the size of the wave. There are ocean-based devices that can sense an oncoming tsunami, but they typically provide only a few minutes of advance warning. Because the sound from a seismic event will reach land well before the water itself, the researchers suggest that identifying the specific acoustic signature of tsunami-generating earthquakes could lead to a faster-acting warning system for massive tsunamis. The finding was something of a surprise. The earthquake's epicenter had been traced to the underwater Japan Trench, a subduction zone about 40 miles east of Tohoku, the northeastern region of Japan's larger island. Based on existing knowledge of earthquakes in this area, seismologists puzzled over why the earthquake rupture propagated from the underground fault all the way up to the seafloor, creating a massive upward thrust that resulted in the tsunami. Direct observations of the fault were scarce, so Eric Dunham, an assistant professor of geophysics in the School of Earth Sciences, and Jeremy Kozdon, a postdoctoral researcher working with Dunham, began using the cluster of supercomputers at Stanford's Center for Computational Earth and Environmental Science (CEES) to simulate how the tremors moved through the crust and ocean. The researchers built a high-resolution model that incorporated the known geologic features of the Japan Trench and used CEES simulations to identify possible earthquake rupture histories compatible with the available data. Retroactively, the models accurately predicted the seafloor uplift seen in the earthquake, which is directly related to tsunami wave heights, and also simulated sound waves that propagated within the ocean. In addition to valuable insight into the seismic events as they likely occurred during the 2011 earthquake, the researchers identified the specific fault conditions necessary for ruptures to reach the seafloor and create large tsunamis. The model also generated acoustic data; an interesting revelation of the simulation was that tsunamigenic surface-breaking ruptures, like the 2011 earthquake, produce higher amplitude ocean acoustic waves than those that do not. The model showed how those sound waves would have traveled through the water and indicated that they reached shore 15 to 20 minutes before the tsunami. "We've found that there's a strong correlation between the amplitude of the sound waves and the tsunami wave heights," Dunham said. 
"Sound waves propagate through water 10 times faster than the tsunami waves, so we can have knowledge of what's happening a hundred miles offshore within minutes of an earthquake occurring. We could know whether a tsunami is coming, how large it will be and when it will arrive."The team's model could apply to tsunami-forming fault zones around the world, though the characteristics of telltale acoustic signature might vary depending on the geology of the local environment. The crustal composition and orientation of faults off the coasts of Japan, Alaska, the Pacific Northwest and Chile differ greatly."The ideal situation would be to analyze lots of measurements from major events and eventually be able to say, 'this is the signal'," said Kozdon, who is now an assistant professor of applied mathematics at the Naval Postgraduate School. "Fortunately, these catastrophic earthquakes don't happen frequently, but we can input these site specific characteristics into computer models -- such as those made possible with the CEES cluster -- in the hopes of identifying acoustic signatures that indicates whether or not an earthquake has generated a large tsunami."Dunham and Kozdon pointed out that identifying a tsunami signature doesn't complete the warning system. Underwater microphones called hydrophones would need to be deployed on the seafloor or on buoys to detect the signal, which would then need to be analyzed to confirm a threat, both of which could be costly. Policymakers would also need to work with scientists to settle on the degree of certainty needed before pulling the alarm.If these points can be worked out, though, the technique could help provide precious minutes for an evacuation.The study is detailed in the current issue of the journal | Earthquakes | 2,013 |
June 6, 2013 | https://www.sciencedaily.com/releases/2013/06/130606101728.htm | 'Caldas tear' resolves puzzling seismic activity beneath Colombia | Colombia sits atop a complex geological area where three tectonic plates are interacting, producing seismicity patterns that have puzzled seismologists for years. Now seismologists have identified the "Caldas tear," which is a break in a slab that separates two subducting plates and accounts for curious features, including a "nest" of seismic activity beneath east-central Colombia and high-grade mineral deposits on the surface. | In a paper published in the June issue of the journal, "This paper attempts to provide a unifying concept of how the deformation is proceeding on a regional scale in Colombia," said Mann. The complex regional tectonic activity includes movement of three plates: the Caribbean plate that is subducting or being forced beneath Colombia in the north; the Panama block or Panama plate that is colliding with Colombia in the central part of the country; and the Nazca plate, which is an oceanic plate that is subducting beneath the southern part of Colombia from the Pacific. While Colombians have experienced earthquakes in the past 20 years, none has been exceedingly large, despite the complex zone of convergence beneath the country. "Unlike the high seismicity to the south of the Caldas tear, there are few destructive earthquakes north of the tear, which suggests that there is an accumulation of stresses that could trigger strong motion events resulting from the frontal collision of the Panamanian Arc against Colombia," said Vargas. The authors used the Colombian Seismic Network's database of more than 100,000 seismic events to identify the prominent tear, where the slab is broken along a very distinct break, separating the Nazca oceanic plate, which is coming from the Pacific, from the Panama plate, which is an old island arc (Sandra ridge) that pushes into Colombia from the west. "We think that this Panama block is acting as an indenter. It's a block of thick crust that doesn't subduct easily, rather it subducts at a shallow angle," said Mann. "And because it's thick crust, it acts like a fist or an indenter that's pushing into the whole country of Colombia and northwest South America." Vargas and Mann suggest the crust that preceded the colliding Panama block is being ripped apart from the indenter, which is crust that cannot easily subduct. The Caldas tear forms the southern edge of the indenter, separating it from the Nazca plate. The indenter is breaking from the crust that preceded it, forming the Bucaramanga nest, which is a dense area of seismic activity in a small volume of crust at about 140 km depth. Using tomographic data, the authors inferred that the Caribbean plate crust to the north is subducting at a very shallow angle and producing relatively little deep seismicity. Tomography is a way to map the geometry of a subduction zone on the basis of differences in the seismic attenuation of crusts. Colder subducting crust, for example, will have a lower attenuation than the surrounding mantle. The Caribbean plate in the north is subducting at a slower rate than the Nazca plate in the south, where seismic activity is greater. "In the center of it all is the indenter -- an incredibly important feature for Colombia and for assessing its earthquake hazard," said Mann. Colombia features large strike-slip faults that form a V-shaped pattern, which is symmetrical about the Panama plate. 
Vargas and Mann relate the upper-crust strike-slip faults to the indenter, which they suggest is pushing the crust further east than in areas to the north and south. The Caldas tear is reflected in the landscape. The Magdalena River, which runs northward, changes from broad valleys to steeper-relief gorges as it crosses the tear, suggesting to researchers that the tear may propagate to the surface. There is an alignment of small volcanoes along the tear that have an unusual composition, distinctly different from that seen in the volcanic arc to the south. "We have found that this tectonic structure not only controls the distribution of major mineral deposits, but has also come to control the geometry of several sedimentary basins, and the distribution of hydrocarbons retained in them," said Vargas. "Tearing and breakoff of the subducted slabs as the result of a collision of the Panama arc-indenter with northwestern South America," appears in the June 2013 issue of | Earthquakes | 2,013
June 3, 2013 | https://www.sciencedaily.com/releases/2013/06/130603142313.htm | New explanation for slow earthquakes on San Andreas | New Zealand's geologic hazards agency reported this week that an ongoing, "silent" earthquake that began in January is still going strong. Though it is releasing the energy equivalent of a 7.0 earthquake, New Zealanders can't feel it because its energy is being released slowly, over a long period of time, rather than in a few short seconds. | These so-called "slow slip events" are common at subduction zone faults -- where an oceanic plate meets a continental plate and dives beneath it. They also occur on continents along strike-slip faults like California's San Andreas, where two plates move horizontally in opposite directions. Occurring close to the surface, in the upper 3-5 kilometers (km) of the fault, this slow, silent movement is referred to as "creep events." In a study published this week, researchers offer a mechanical explanation for this behavior. "The observation that faults creep in different ways at different places and times in the earthquake cycle has been around for 40 years without a mechanical model that can explain this variability," says WHOI geologist and co-author Jeff McGuire. "Creep is a basic feature of how faults work that we now understand better." Fault creep occurs in shallow portions of the fault and is not considered a seismic event. There are two types of creep. In one form, creep occurs as a continuous stable sliding of unlocked portions of the fault, and can account for approximately 25 millimeters of motion along the fault per year. The other type is called a "creep event," sudden slow movement, lasting only a few hours, and accommodating approximately 3 centimeters of slip per event. Creep events are separated by long intervals of slow continuous creep. "Normal earthquakes happen when the locked portions of the fault rupture under the strain of accumulated stress and the plates move or slip along the fault," says the study's lead author, WHOI postdoctoral scholar Matt Wei. "This kind of activity is only a portion of the total fault movement per year. However, a significant fraction of the total slip can be attributed to fault creep." Scientists have mapped out the segments of the San Andreas fault that experience these different kinds of creep, and which segments are totally "locked," experiencing no movement at all until an earthquake rupture. They know the source of earthquakes is a layer of unstable rock at about 5-15 km depth along the fault. But they have only recently begun to understand the source of fault creep. For nearly two decades, geologists have accepted and relied upon a mechanical model to explain the geologic source of fault creep. This model explains that continuous creep is generated in the uppermost "stable" sediment layer of the fault plane and episodic creep events originate in a "conditionally stable" layer of rock sandwiched between the sediment and the unstable layer of rock (the seismogenic zone, where earthquakes originate) below it. But when Wei and his colleagues tried to use this mechanical model to reproduce the geodetic data after a 1987 earthquake on southern California's Superstition Hills fault, they found it impossible to match the observations. "Superstition Hills was a very large earthquake. Immediately following the quake, the US Geological Survey installed creepmeters to measure the post-seismic deformation. 
The result is a unique data set that shows both afterslip and creep events," says Wei. The researchers could only match the real-world data set and on-the-ground observations by embedding an additional unstable layer within the top sediment layer of the model. "This layer may result from fine-scale lithological heterogeneities within the stable zone -- frictional behavior varies with lithology, generating the instability," the authors write. "Our model suggests that the displacement of and interval between creep events are dependent on the thickness, stress, and frictional properties of the shallow, unstable layer." There are major strike-slip faults like the San Andreas around the world, but the extent of creep events along those faults is something of a mystery. "Part of the reason is that we don't have creepmeters along these faults, which are often in sparsely populated areas. It takes money and effort, so a lot of these faults are not covered [with instruments]. We can use remote sensing to know if they are creeping, but we don't know if it's from continuous creep or creep events," says Wei. Simulating faults to better understand how stress, strain, and earthquakes work is inherently difficult because of the depth at which the important processes happen. Recovering drill cores and installing instruments at significant depths within Earth is very expensive and still relatively rare. "Rarely are the friction tests done on real cores," says Wei. "Most of the friction tests are done on synthetic cores. Scientists will grind rocks into powder to simulate the fault." Decades of these experiments have provided an empirical framework to understand how stress and slip evolve on faults, but geologists are still a long way from having numerical models tailored to the parameters that hold for particular faults in the Earth. McGuire says the new research is an important step in ground-truthing those lab simulations. "This work has shown that the application of the friction laws derived from the lab can accurately describe some first order variations that we observe with geodesy between different faults in the real world," he says. "This is an important validation of the scaling up of the lab results to large crustal faults." For the scientists, this knowledge is a new beginning for further research into understanding fault motion and the events that trigger it. Creep events are important because they are shallow and release stress, but are still an unknown factor in understanding earthquake behavior. "There's much we still don't know. For example, it's possible that the shallow layer source of creep events could magnify an earthquake as it propagates through these layers," says Wei. Additionally, the findings can help scientists understand the slow slip events happening along subduction zones, like the ongoing event in New Zealand. "By validating the friction models with shallow creep events that have very precise data, we can have more confidence in the mechanical implications of the deep subduction zone events," McGuire says. | Earthquakes | 2,013
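The "friction laws derived from the lab" that McGuire mentions are usually rate-and-state laws, in which stable versus unstable sliding depends on whether the parameter a - b is positive or negative. The sketch below steps the standard Dieterich aging law through a tenfold jump in slip velocity; the parameter values are generic lab-scale numbers, not those fit by Wei and colleagues.

```python
import numpy as np

# Rate-and-state friction with the Dieterich aging law (illustrative values).
a, b = 0.010, 0.006       # a > b: velocity-strengthening (stable creep)
d_c = 1e-5                # critical slip distance, m
mu_0, v_0 = 0.6, 1e-6     # reference friction coefficient and velocity (m/s)

v = np.where(np.arange(20000) < 10000, 1e-6, 1e-5)  # 10x velocity step at t = 10 s
dt = 1e-3                 # time step, s
theta = d_c / v[0]        # state variable, starting at steady state

for i, vi in enumerate(v):
    theta += dt * (1.0 - vi * theta / d_c)          # aging-law state evolution
    if i % 5000 == 0:
        mu = mu_0 + a * np.log(vi / v_0) + b * np.log(v_0 * theta / d_c)
        print(f"t = {i * dt:5.1f} s  v = {vi:.0e} m/s  mu = {mu:.4f}")
```

With a > b as set here, friction jumps up at the velocity step and then settles at a higher value -- velocity-strengthening, stable creep; flipping the sign of a - b gives the weakening behavior that nucleates earthquakes.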
May 20, 2013 | https://www.sciencedaily.com/releases/2013/05/130520114021.htm | Slow earthquakes: It's all in the rock mechanics | Earthquakes that last minutes rather than seconds are a relatively recent discovery, according to an international team of seismologists. Researchers have been aware of these slow earthquakes only for the past five to 10 years, because of new tools and new observations, but these tools may explain the triggering of some normal earthquakes and could help in earthquake prediction. | "New technology has shown us that faults do not just fail in a sudden earthquake or by stable creep," said Demian M. Saffer, professor of geoscience, Penn State. "We now know that earthquakes with anomalously low frequencies -- slow earthquakes -- and slow slip events that take weeks to occur exist." These new observations have put a big wrinkle into our thinking about how faults work, according to the researchers, who also include Chris Marone, professor of geosciences, Penn State; Matt J. Ikari, recent Ph.D. recipient; and Achim J. Kopf, former Penn State postdoctoral fellow, both now at the University of Bremen, Germany. So far, no one has explained the processes that cause slow earthquakes. The researchers thought that the behavior had to be related to the type of rock in the fault and, believing that clay minerals are important in this slip behavior, tested natural samples to see how the rocks reacted. Ikari performed laboratory experiments using natural samples from drilling done offshore of Japan in a place where slow earthquakes occur. The samples came from the Integrated Ocean Drilling Program, an international collaboration. The researchers reported their results recently in Nature Geoscience. These samples are made up of ocean sediment that is mostly clay with a little quartz. "Usually, when you shear clay-rich fault rocks in the laboratory in the way rocks are sheared in a fault, as the speed increases, the rocks become stronger and self-arrest the movement," said Saffer. "Matt noticed another behavior. Initially the rocks reacted as expected, but these clays got weaker as they slid further. They initially became slightly stronger as the slip rate increased, but then, over the long run, they became weaker." The laboratory experiments that produced the largest effect closely matched the velocity at which slow earthquakes occur in nature. The researchers also found that water content in the clays influenced how the shear occurred. "From the physics of earthquake nucleation based on the laboratory experiments we would predict the size of the patch of fault that breaks at tens of meters," said Saffer. "The consistent result for the rates of slip and the velocity of slip in the lab are interesting. Lots of things point in the direction for this to be the solution." The researchers worry about slow earthquakes because there is evidence that swarms of low-frequency events can trigger large earthquake events. In Japan, a combination of broadband seismometers and global positioning system devices can monitor slow earthquakes. For the Japanese and others in earthquake-prone areas, a few days of foreknowledge of a potential earthquake hazard could be valuable and save lives. For slow slip events, collecting natural samples for laboratory experiments is more difficult because the faults where these take place are very deep. Only off the north shore of New Zealand is there a fault that can be sampled. 
Saffer is currently working to arrange a drilling expedition to that fault. The National Science Foundation and the Deutsche Forschungsgemeinschaft supported this work. | Earthquakes | 2,013
May 17, 2013 | https://www.sciencedaily.com/releases/2013/05/130517085819.htm | GPS solution provides three-minute tsunami alerts | Researchers have shown that, by using global positioning systems (GPS) to measure ground deformation caused by a large underwater earthquake, they can provide accurate warning of the resulting tsunami in just a few minutes after the earthquake onset. For the devastating Japan 2011 event, the team reveals that the analysis of the GPS data and issue of a detailed tsunami alert would have taken no more than three minutes. | The results are published on 17 May. Most tsunamis, including those offshore Sumatra, Indonesia, in 2004 and Japan in 2011, occur following underwater ground motion in subduction zones, locations where a tectonic plate slips under another, causing a large earthquake. To a lesser extent, the resulting uplift of the sea floor also affects coastal regions. There, researchers can measure the small ground deformation along the coast with GPS and use this to determine tsunami information. "High-precision real-time processing and inversion of these data enable reconstruction of the earthquake source, described as slip at the subduction interface. This can be used to calculate the uplift of the sea floor, which in turn is used as the initial condition for a tsunami model to predict arrival times and maximum wave heights at the coast," says lead author Andreas Hoechner from the German Research Centre for Geosciences (GFZ). "Japan has a very dense network of GPS stations, but these were not being used for tsunami early warning as of 2011. Certainly this is going to change soon," states Hoechner. The scientists used raw data from the Japanese GPS Earth Observation Network (GEONET) recorded from a day before to a day after the 2011 earthquake. To shorten the time needed to provide a tsunami alert, they only used data from 50 GPS stations on the northeast coast of Japan, out of about 1200 GEONET stations available in the country. At present, tsunami warning is based on seismological methods. However, within the time limit of 5 to 10 minutes, these traditional techniques tend to underestimate the earthquake magnitude of large events. Furthermore, they provide only limited information on the geometry of the tsunami source (see note). Both factors can lead to underprediction of wave heights and tsunami coastal impact. Hoechner and his team say their method does not suffer from the same problems and can provide fast, detailed and accurate tsunami alerts. The next step is to see how the GPS solution works in practice in Japan or other areas prone to devastating tsunamis. As part of the GFZ-led German Indonesian Tsunami Early Warning System project, several GPS stations were installed in Indonesia after the 2004 earthquake and tsunami near Sumatra, and are already providing valuable information for the warning system. "The station density is not yet high enough for an independent tsunami early warning in Indonesia, since it is a requirement for this method that the stations be placed densely close to the area of possible earthquake sources, but more stations are being added," says Hoechner. Note: Traditional tsunami early warning methods use the hypocentre (the point directly beneath the epicentre where the seismic fault begins to rupture) and magnitude only, meaning the source of the earthquake and tsunami is regarded as a point source. 
However, especially in the case of subduction earthquakes, the source can have a large extent: in Japan in 2011 the interface between the tectonic plates broke over a length of about 400 km, and the Sumatra event in 2004 ruptured over some 1500 km. To get a good tsunami prediction, it is important to consider this extent and the spatial slip distribution. | Earthquakes | 2,013
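The "inversion" Hoechner describes is, at its core, a linear least-squares problem: observed GPS displacements d relate to slip s on fault patches through elastic Green's functions G, so d = G s. The sketch below shows the damped solution step with a random matrix standing in for real elastic kernels; everything here is an illustrative toy, not the GEONET processing chain.

```python
import numpy as np

rng = np.random.default_rng(42)

# 50 coastal GPS stations (3 displacement components each), 20 fault patches.
n_stations, n_patches = 50, 20
G = rng.normal(size=(3 * n_stations, n_patches))  # stand-in Green's functions

s_true = np.zeros(n_patches)
s_true[8:13] = 10.0                               # meters of slip on a few patches
d = G @ s_true + 0.01 * rng.normal(size=3 * n_stations)  # noisy GPS offsets

# Damped least squares (Tikhonov): s = (G^T G + lam*I)^-1 G^T d
lam = 1.0
s_hat = np.linalg.solve(G.T @ G + lam * np.eye(n_patches), G.T @ d)

print("recovered slip per patch (m):", np.round(s_hat, 1))
# The recovered slip gives the seafloor uplift that initializes a tsunami
# model -- and this whole chain is fast enough to run within minutes.
```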