Date | Link | Title | Summary | Body | Category | Year |
---|---|---|---|---|---|---|
May 16, 2013 | https://www.sciencedaily.com/releases/2013/05/130516182002.htm | How should geophysics contribute to disaster planning? | Earthquakes, tsunamis, and other natural disasters often showcase the worst in human suffering -- especially when those disasters strike populations who live in rapidly growing communities in the developing world with poorly enforced or non-existent building codes. | This week in Cancun, a researcher from Yale-National University of Singapore (NUS) College in Singapore is presenting a comparison between large-scale earthquakes and tsunamis in different parts of the world, illustrating how nearly identical natural disasters can play out very differently depending on where they strike. The aim of the talk at the 2013 Meeting of the Americas, which is sponsored by the American Geophysical Union (AGU), is to focus on the specific role geoscientists can play in disaster risk reduction and how their work should fit in with the roles played by other experts for any given community. "To reduce the losses from these disasters, a diverse group of researchers, engineers, and policy makers need to come together to benefit from each other's expertise," said Brian McAdoo, professor of science at Yale-NUS College. "Geophysicists play a crucial role in natural hazard identification and determining the key questions of, how often does a geophysical hazard affect a given area and how big will it be when it hits?" McAdoo said. "We need to be aware of how this information is incorporated into the disaster planning architecture." In his talk, McAdoo will present case studies that he and his colleague Vivienne Bryner compiled comparing death counts and economic fallout following geophysical events of similar magnitude in areas with different levels of economic development. What their analysis shows is that deaths tend to be higher in poor countries exposed to severe natural disasters because of existing socioeconomic, environmental, and structural vulnerabilities. At the same time, economic losses tend to be higher in developed nations, but developing countries may be less able to absorb those economic losses that do occur. As an example, he points to the earthquakes that hit Haiti, San Francisco, and Christchurch and Canterbury, New Zealand, in 2010, 1989 and 2010-2011. While the quakes were nearly identical in magnitude, the consequences of these natural disasters were remarkably different. Some 185 people died in the 2011 Christchurch earthquake, which was preceded by the larger Canterbury quake in 2010 in which nobody died. Both quakes and their aftershocks cost New Zealand about $6.5 billion, which was approximately 10-20 percent of its gross domestic product (GDP). The 1989 San Francisco earthquake killed 63 people, and it cost $5.6 billion (the equivalent of about $10 billion in 2010 dollars). The U.S. economy is so large, however, that it only caused a one-tenth of one percent drop in U.S. GDP. The 2010 earthquake in Haiti, on the other hand, killed some 200,000 people and resulted in economic losses approaching an estimated $8 billion, which is more than 80 percent of Haiti's GDP. To address such disparities, McAdoo advocates what is known as Disaster Risk Reduction (DRR) decision making -- a framework for finding solutions to best prepare for natural disasters, lessen their impact, and sensibly engage in post-disaster reconstruction. For such planning to work, he said, it must be broad-based. "We won't ever be able to prevent disasters," he said. "The only way we will effectively minimize the effects of hazards is to collaborate across academic disciplines, businesses, governments, NGOs, and perhaps most critically the exposed community." "Planning for any sort of natural disaster takes insight into what may be expected, which necessarily includes the important perspective of scientists," added Philip ("Bo") Hammer, Associate Vice President for Physics Resources at the American Institute of Physics (AIP) and co-organizer of the session in which McAdoo is speaking. "One reason why we organized this session in the first place was to encourage the sharing of such perspectives within the context of how geophysicists can build local capacity, not only for dealing with acute issues such as disasters, but also longer term challenges like building capacity for economic growth." The talk, "Building Capacity for Disaster Risk Reduction," will be presented by Brian G. McAdoo and Vivienne Bryner on Friday, May 17, 2013, at the 2013 Meeting of the Americas in Cancún, Mexico. McAdoo is affiliated with Yale-NUS College in Singapore, and Bryner is at University of Otago in Dunedin, New Zealand. | Earthquakes | 2,013 |
May 14, 2013 | https://www.sciencedaily.com/releases/2013/05/130514190635.htm | Research helps paint finer picture of massive 1700 earthquake | In 1700, a massive earthquake struck the west coast of North America. Though it was powerful enough to cause a tsunami as far as Japan, a lack of local documentation has made studying this historic event challenging. | Now, researchers from the University of Pennsylvania have helped unlock this geological mystery using a fossil-based technique. Their work provides a finer-grained portrait of this earthquake and the changes in coastal land level it produced, enabling modelers to better prepare for future events. Penn's team includes Benjamin Horton, associate professor and director of the Sea Level Research Laboratory in the Department of Earth and Environmental Science in the School of Arts and Sciences, along with then lab members Simon Engelhart and Andrea Hawkes. They collaborated with researchers from Canada's University of Victoria, the National Taiwan University, the Geological Survey of Canada and the United States Geological Survey. The Cascadia Subduction Zone runs along the Pacific Northwest coast of the United States to Vancouver Island in Canada. This major fault line is capable of producing megathrust earthquakes 9.0 or higher, though, due to a dearth of observations or historical records, this trait was only discovered within the last several decades from geology records. The Lewis and Clark expedition did not make the first extensive surveys of the region until more than 100 years later, and contemporaneous aboriginal accounts were scarce and incomplete. The 1700 Cascadia event was better documented in Japan than in the Americas. Records of the "orphan tsunami" -- so named because its "parent" earthquake was too far away to be felt -- gave earth scientists hints that this subduction zone was capable of such massive seismic activity. Geological studies provided information about the earthquake, but many critical details remained lost to history. "Previous research had determined the timing and the magnitude, but what we didn't know was how the rupture happened," Horton said. "Did it rupture in one big long segment, more than a thousand kilometers, or did it rupture in parcels?" To provide a clearer picture of how the earthquake occurred, Horton and his colleagues applied a technique they have used in assessing historic sea-level rise. They traveled to various sites along the Cascadia subduction zone, taking core samples from up and down the coast and working with local researchers who donated pre-existing data sets. The researchers' targets were microscopic fossils known as foraminifera. Through radiocarbon dating and an analysis of different species' positions within the cores over time, the researchers were able to piece together a historical picture of the changes in land and sea level along the coastline. The research revealed how much the coast suddenly subsided during the earthquake. This subsidence was used to infer how much the tectonic plates moved during the earthquake. "What we were able to show for the first time is that the rupture of Cascadia was heterogeneous, making it similar to what happened with the recent major earthquakes in Japan, Chile and Sumatra," Horton said. This level of regional detail for land level changes is critical for modeling and disaster planning. "It's only when you have that data that you can start to build accurate models of earthquake ruptures and tsunami inundation," Horton said. "There were areas of the west coast of the United States that were more susceptible to larger coastal subsidence than others." The Cascadia subduction zone is of particular interest to geologists and coastal managers because geological evidence points to recurring seismic activity along the fault line, with intervals between 300 and 500 years. With the last major event occurring in 1700, another earthquake could be on the horizon. A better understanding of how such an event might unfold has the potential to save lives. "The next Cascadia earthquake has the potential to be the biggest natural disaster that the United States will have to come to terms with -- far bigger than Sandy or even Katrina," Horton said. "It would happen with very little warning; some areas of Oregon will have less than 20 minutes to evacuate before a large tsunami will inundate the coastline like in Sumatra in 2004 and Japan in 2011." The research was supported by the National Science Foundation, the United States Geological Survey and the University of Victoria. Simon Engelhart and Andrea Hawkes are now assistant professors at the University of Rhode Island and the University of North Carolina, respectively. Their co-authors were Pei-Ling Wang of the University of Victoria and National Taiwan University, Kelin Wang of the University of Victoria and the Geological Survey of Canada's Pacific Geoscience Centre, Alan Nelson of the United States Geological Survey's Geologic Hazards Science Center and Robert Witter of the United States Geological Survey's Alaska Science Center. | Earthquakes | 2,013 |
May 13, 2013 | https://www.sciencedaily.com/releases/2013/05/130513152411.htm | Using earthquake sensors to track endangered whales | The fin whale is the second-largest animal ever to live on Earth. It is also, paradoxically, one of the least understood. The animal's huge size and global range make its movements and behavior hard to study. | A carcass that washed up on a Seattle-area beach this spring provided a reminder that sleek fin whales, nicknamed "greyhounds of the sea," are vulnerable to collisions with fast-moving ships. Knowing their swimming behaviors could help vessels avoid the animals. Understanding where and what they eat could also help support the fin whale's slowly rebounding populations. University of Washington oceanographers are addressing such questions using a growing number of seafloor seismometers, devices that record vibrations. A series of three papers published this winter in the Journal of the Acoustical Society of America interprets whale calls found in earthquake sensor data, an inexpensive and non-invasive way to monitor the whales. The studies are the first to match whale calls with fine-scale swimming behavior, providing new hints at the animals' movement and communication patterns. The research began a decade ago as a project to monitor tremors on the Juan de Fuca Ridge, a seismically active zone more than a mile deep off the Washington coast. That was the first time UW researchers had collected an entire year's worth of seafloor seismic data. "Over the winter months we recorded a lot of earthquakes, but we also had an awful lot of fin-whale calls," said principal investigator William Wilcock, a UW professor of oceanography. At first the fin whale calls, which at 17 to 35 vibrations per second overlap with the seismic data, "were kind of just a nuisance," he said. In 2008 Wilcock got funding from the Office of Naval Research to study the previously discarded whale calls. Dax Soule, a UW doctoral student in oceanography, compared the calls recorded by eight different seismometers. Previous studies have done this for just two or three animals at a time, but the UW group automated the work to analyze more than 300,000 whale calls. The method is similar to how a smartphone's GPS measures a person's location by comparing paths to different satellites. Researchers looked at the fin whale's call at the eight seismometers to calculate a position. That technique let them follow the animal's path through the instrument grid and within 10 miles of its boundaries. Soule created 154 individual fin whale paths and discovered three categories of vocalizing whales that swam south in winter and early spring of 2003. He also found a category of rogue whales that traveled north in the early fall, moving faster than the other groups while emitting a slightly higher-pitched call. "One idea is that these are juvenile males that don't have any reason to head south for the breeding season," Soule said. "We can't say for sure because so little is known about fin whales. To give you an idea, people don't even know how or why they make their sound." The fin whale's call is not melodic, but that's a plus for this approach. The second-long chirp emitted roughly every 25 seconds is consistently loud and at the lower threshold of human hearing, so within range of earthquake monitoring instruments. These loud, repetitive bleeps are ideally suited for computer analysis. Michelle Weirathmueller, a UW doctoral student in oceanography, used Soule's triangulations to determine the loudness of the call. She found the fin whale's call is surprisingly consistent at 190 decibels, which translates to 130 decibels in air -- about as loud as a jet engine. Knowing the consistent amplitude of the fin whale's song will help Weirathmueller track whales with more widely spaced seismometer networks, in which a call is recorded by only one instrument at a time. Those include the Neptune Canada project, the U.S. cabled observatory component of the Ocean Observatories Initiative, and the huge 70-seismometer Cascadia Initiative array that's begun to detect tremors off the Pacific Northwest coast. "We'd like to know where the fin whales are at any given time and how their presence might be linked to food availability, ocean conditions and seafloor geology," Weirathmueller said. "This is an incredibly rich dataset that can start to pull together the information we need to link the fin whales with their deep-ocean environments." | Earthquakes | 2,013 |
May 13, 2013 | https://www.sciencedaily.com/releases/2013/05/130513103731.htm | Western Indian Ocean earthquake and tsunami hazard potential greater than previously thought | Earthquakes similar in magnitude to the 2004 Sumatra earthquake could occur in an area beneath the Arabian Sea at the Makran subduction zone, according to recently published research. | The research was carried out by scientists from the University of Southampton based at the National Oceanography Centre Southampton (NOCS), and the Pacific Geoscience Centre, Natural Resources Canada. The study suggests that the risk from undersea earthquakes and associated tsunami in this area of the Western Indian Ocean -- which could threaten the coastlines of Pakistan, Iran, Oman, India and potentially further afield -- has been previously underestimated. The results highlight the need for further investigation of pre-historic earthquakes and should be fed into hazard assessment and planning for the region. Subduction zones are areas where two of Earth's tectonic plates collide and one is pushed beneath the other. When an earthquake occurs here, the seabed moves horizontally and vertically as the pressure is released, displacing large volumes of water that can result in a tsunami. The Makran subduction zone has shown little earthquake activity since a magnitude 8.1 earthquake in 1945 and magnitude 7.3 in 1947. Because of its relatively low seismicity and limited recorded historic earthquakes it has often been considered incapable of generating major earthquakes. Plate boundary faults at subduction zones are expected to be prone to rupture generating earthquakes at temperatures of between 150 and 450 °C. The scientists used this relationship to map out the area of the potential fault rupture zone beneath the Makran by calculating the temperatures where the plates meet. Larger fault rupture zones result in larger magnitude earthquakes. "Thermal modelling suggests that the potential earthquake rupture zone extends a long way northward, to a width of up to 350 kilometres which is unusually wide relative to most other subduction zones," says Gemma Smith, lead author and PhD student at University of Southampton School of Ocean and Earth Science, which is based at NOCS. The team also found that the thickness of the sediment on the subducting plate could be a contributing factor to the magnitude of an earthquake and tsunami there. "If the sediments between the plates are too weak then they might not be strong enough to allow the strain between the two plates to build up," says Smith. "But here we see much thicker sediments than usual, which means the deeper sediments will be more compressed and warmer. The heat and pressure make the sediments stronger. This results in the shallowest part of the subduction zone fault being potentially capable of slipping during an earthquake." "These combined factors mean the Makran subduction zone is potentially capable of producing major earthquakes, up to magnitude 8.7-9.2. Past assumptions may have significantly underestimated the earthquake and tsunami hazard in this region." | Earthquakes | 2,013 |
May 3, 2013 | https://www.sciencedaily.com/releases/2013/05/130503105033.htm | Hearing the Russian meteor, in America: Sound arrived in 10 hours, lasted 10 more | How powerful was February's meteor that crashed into Russia? Strong enough that its explosive entry into our atmosphere was detected almost 6,000 miles away in Lilburn, Ga., by infrasound sensors -- a full 10 hours after the meteor's explosion. A Georgia Tech researcher has modified the signals and made them audible, allowing audiences to "hear" what the meteor's waves sounded like as they moved around the globe on February 15. | Lilburn is home to one of nearly 400 USArray seismic/infrasound stations in use in the eastern United States. They are part of a large-scale project named "Earthscope," an initiative funded by the National Science Foundation that studies Earth's interior beneath North America. The stations are mainly deployed to record seismic waves generated from earthquakes, but their sound sensors can record ultra long-period sound waves, also known as infrasound waves. The human ear cannot hear these infrasound signals. However, by playing the data faster than true speed, Georgia Tech faculty member Zhigang Peng increased the sound waves' frequency to audible levels. The Incorporated Research Institutions for Seismology's Data Management Center provided the data. "The sound started at about 10 hours after the explosion and lasted for another 10 hours in Georgia," said Peng, an associate professor in the School of Earth and Atmospheric Sciences. He's confident that the sound is associated with the meteor impact because a slow propagation of the sound waves can be seen across the entire collection of USArray stations, as well as other stations in Alaska and polar regions. "They are like tsunami waves induced by large earthquakes," Peng added. "Their traveling speeds are similar, but the infrasound propagates in the atmosphere rather than in deep oceans." Scientists believe the meteor was about 55 feet in diameter, weighed more than 7,000 tons and raced through the sky at 40,000 miles an hour. Its energy was estimated at 30 nuclear bombs. More than 1,500 people were hurt. Using the same sonification process, Peng also converted seismic waves from North Korea's nuclear test on February 12 and an earthquake in Nevada the next day. Each registered as a 5.1 magnitude event but created different sounds. The measurements were collected by seismic instruments located about 100 to 200 miles from each event. For further comparison, Peng has also created a seismic recording of the meteor impact at a similar distance. "The initial sound of the nuclear explosion is much stronger, likely due to the efficient generation of compressional wave (P wave) for an explosive source," said Peng. "In comparison, the earthquake generated stronger shear waves that arrived later than its P wave." Peng says the seismic signal from the meteor is relatively small, even after being amplified by 10 times. According to Peng, this is mainly because most of the energy from the meteor explosion propagated as the infrasound displayed in the initial sound clip. Only a very small portion was turned into seismic waves propagating inside Earth. This isn't the first time Peng has converted seismic data into audible files. He also sonified 2011's historic Tohoku-Oki, Japan, earthquake as it moved through Earth and around the globe. The seismic and sound data generated by the meteor impact and other sources can be used to demonstrate their global impact. Scientists are also using them to better understand their source characterizations and how they propagate above and inside Earth. | Earthquakes | 2,013 |
May 1, 2013 | https://www.sciencedaily.com/releases/2013/05/130501101307.htm | Scientists retrieve temperature data from Japan Trench observatory | With the successful retrieval of a string of instruments from deep beneath the seafloor, an international team of scientists has completed an unprecedented series of operations to obtain crucial temperature measurements of the fault that caused the devastating Tohoku earthquake and tsunami in March 2011. | Emily Brodsky, a professor of Earth and planetary sciences at UC Santa Cruz, helped organize the Japan Trench Fast Drilling Project (JFAST), which successfully drilled across the Tohoku earthquake fault last year and installed a borehole observatory nearly 7 kilometers beneath the ocean surface. UCSC research scientist Patrick Fulton was on board the research vessel Chikyu. This was the last phase of operations for JFAST, designed to investigate the huge slip (50 meters or more) on the shallow portion of the plate boundary fault that was largely responsible for the Tohoku earthquake and tsunami. The data recovered from the sensors provide a very high-precision record of temperature at 55 different depths across the plate boundary. Many of the sensors also recorded water pressure. "We will be analyzing the data to characterize the amount of frictional heat on the fault during the Tohoku earthquake," Fulton said. "We'll also be closely investigating the effects of other processes within the subsurface, such as groundwater flow and seafloor movement due to aftershocks. It is exciting to finally have this amazing data in hand." According to Brodsky, the entire project was unprecedented on many levels. "Nobody had done rapid-response drilling in the ocean, nobody had drilled anything substantial under 7 kilometers of water, nobody had placed an observatory in a fault that deep, and nobody had retrieved a string of instruments from that deep," she said. The scientific drilling vessel Chikyu carried out all of these operations. The recovered sensors provide data that will be used to determine the frictional heat generated by fault slip during the Tohoku earthquake. Scientists will infer the forces on the fault during the earthquake from these measurements of dissipated energy. The new data are critical to understanding the causes of the large, shallow displacements during earthquakes that can generate devastating tsunamis. The JFAST observatory provides the first temperature measurements at a subduction plate boundary fault immediately after an earthquake. Fulton described the recovery operation in an email from aboard the Chikyu. Brodsky and Fulton will be busy analyzing the data over the next few weeks in preparation for the Japan Geoscience Union Meeting, May 19 to 24, when they will present some of the initial results. | Earthquakes | 2,013 |
April 30, 2013 | https://www.sciencedaily.com/releases/2013/04/130430151644.htm | Finding a sensible balance for natural hazard mitigation with mathematical models | Uncertainty issues are paramount in the assessment of risks posed by natural hazards and in developing strategies to alleviate their consequences. | In a paper published last month, "Science tells us a lot about the natural processes that cause hazards, but not everything," says Seth Stein. "Meteorologists are steadily improving forecasts of the tracks of hurricanes, but forecasting their strength is harder. We know a reasonable amount about why and where earthquakes will happen, some about how big they will be, but much less about when they will happen. This situation is like playing the card game '21', in which players see only some of the dealer's cards. It is actually even harder, because we do not fully understand the rules of the game, and are trying to figure them out while playing it." [Figure: How much mitigation is needed? The bottom of a U-shaped curve is a "sweet spot" -- a sensible balance. Photo credit: Jerome Stein and Seth Stein.] Earthquake cycles -- triggered by movement of the Earth's tectonic plates and the resulting stress and strain at plate boundaries -- are irregular in time and space, making it hard to predict the timing and magnitude of earthquakes and tsunamis. Hence, forecasting the probabilities of future rare events presents "deep uncertainty," Stein says. "Deep uncertainties arise when the probabilities of outcomes are poorly known, unknown, or unknowable. In such situations, past events may give little insight into future ones." Another conundrum for authorities in such crisis situations is the appropriate amount of resources to direct toward a disaster zone. "Much of the problem comes from the fact that formulating effective natural hazard policy involves using a complicated combination of geoscience, mathematics, and economics to analyze the problem and explore the costs and benefits of different options. In general, mitigation policies are chosen without this kind of analysis," says Stein. "The challenge is deciding." The Japanese earthquake and tsunami in 2011 toppled seawalls 5-10 meters high. The seawalls being rebuilt are about 12 meters high, and would be expected to protect against large tsunamis expected every few hundred years. But critics argue that it would be more cost effective and efficient to focus on relocation and evacuation strategies for populations that may be affected by such tsunamis rather than building higher seawalls, especially in areas where the population is small and dwindling. In this paper, Stein says, the authors set out to "find the amount of mitigation -- which could be the height of a seawall or the earthquake resistance of buildings -- that is best for society." The objective is to provide methods for authorities to use their limited resources in the best possible way in the face of uncertainty. Selecting an optimum strategy, however, depends on estimating the expected value of damage. This, in turn, requires prediction of the probability of disasters. It is still unknown whether to assume that the probability of a large earthquake on a fault line is constant with time (as routinely assumed in hazard planning) or whether the probability gets smaller after the last incidence and increases with time. Hence, the authors incorporate both these scenarios using the general probability model of drawing balls from an urn. If an urn contains balls that are labeled "E" for event and "N" for no event, each year is like drawing a ball. "If after drawing a ball, we replace it, the probability of an event stays constant. Thus an event is never 'overdue' because one has not happened recently, and the fact that one happened recently does not make another less likely," explains Stein. "In contrast, we can add E-balls after a draw when an event does not occur, and remove E-balls when an event occurs. This makes the probability of an event increase with time until one happens, after which it decreases and then grows again." Since the likelihood of future earthquakes depends on strain accumulation at plate boundaries, the model incorporates parameters for how fast strain accumulates between quake incidences, and strain release that happens during earthquakes. The authors select the optimal mitigation strategy by using a general stochastic model, which is a method used to estimate the probability of outcomes in different situations under constrained data. They minimize the expected present value of damage, the costs of mitigation, and the risk premium, which reflects the variance, or inconsistency, of the hazard. The optimal mitigation is the bottom of a U-shaped curve summing up the cost of mitigation and expected losses, a sensible balance. To determine the advantages and pitfalls of rebuilding after such disasters, the authors present a deterministic model. Here, outcomes are precisely determined by taking into account relationships between states and events. The authors use this model to determine if Japan should invest in nuclear power plant construction given the Fukushima Daiichi nuclear reactor meltdown during the 2011 tsunami. Taking into account the financial and societal benefits of reactors, and balancing them against risks -- both financial and natural -- the model determines the preferred outcome. Such models can also be applied toward other disaster situations, such as hurricanes and floods, and toward policies to diminish the effects of climate change. Stein gives an example: "Given the damage to New York City by the storm surge from Hurricane Sandy, options under consideration range from doing nothing, using intermediate strategies like providing doors to keep water out of vulnerable tunnels, to building up coastlines or installing barriers to keep the storm surge out of rivers. In this case, a major uncertainty is the effect of climate change, which is expected to make flooding worse because of the rise of sea levels and higher ferocity and frequency of major storms. Although the magnitude of these effects is uncertain, this formulation can be used to develop strategies by exploring the range of possible effects." | Earthquakes | 2,013 |
April 29, 2013 | https://www.sciencedaily.com/releases/2013/04/130429133705.htm | No Redoubt: Volcanic eruption forecasting improved | Forecasting volcanic eruptions with success is heavily dependent on recognizing well-established patterns of pre-eruption unrest in the monitoring data. But in order to develop better monitoring procedures, it is also crucial to understand volcanic eruptions that deviate from these patterns. | New research from a team led by Carnegie's Diana Roman retrospectively documented and analyzed the period immediately preceding the 2009 eruption of the Redoubt volcano in Alaska, which was characterized by an abnormally long period of pre-eruption seismic activity that's normally associated with short-term warnings of eruption. Their work is published today. Well-established pre-eruption patterns can include a gradual increase in the rate of seismic activity, a progressive alteration in the type of seismic activity, or a change in ratios of gas released. "But there are numerous cases of volcanic activity that in some way violated these common patterns of precursory unrest," Roman said. "That's why examining the unusual precursor behavior of the Redoubt eruption is so enlightening." About six to seven months before the March 2009 eruption, Redoubt began to experience long-period seismic events, as well as shallow volcanic tremors, which intensified into a sustained tremor over the next several months. Immediately following this last development, shallow, short-period earthquakes were observed at an increased rate below the summit. In the 48 hours prior to eruption both deep and shallow earthquakes were recorded. This behavior was unusual because precursor observations usually involve a transition from short-period to long-period seismic activity, not the other way around. What's more, seismic tremor is usually seen as a short-term warning, not something that happens months in advance. However, these same precursors were also observed during the 1989-90 Redoubt eruption, thus indicating that the unusual seismic pattern reflects some unique aspect of the volcano's magma system. Advanced analysis of the seismic activity taking place under the volcano allowed Roman and her team to understand the changes taking place before, during, and after eruption. Their results show that the eruption was likely preceded by a protracted period of slow magma ascent, followed by a short period of rapidly increasing pressure beneath Redoubt. Elucidating the magma processes causing these unusual precursor events could help scientists to hone their seismic forecasting, rather than just relying on the same forecasting tools they're currently using, ones that are not able to detect anomalies. For example, using current techniques, the forecasts prior to Redoubt's 2009 eruption wavered over a period of five months, back and forth between eruption being likely within a few weeks to within a few days. If the analytical techniques used by Roman and her team had been taken into consideration, the early risk escalations might not have been issued. "Our work shows the importance of clarifying the underlying processes driving anomalous volcanic activity. This will allow us to respond to subtle signals and increase confidence in making our forecasts," Roman said. | Earthquakes | 2,013 |
April 25, 2013 | https://www.sciencedaily.com/releases/2013/04/130425142355.htm | Earth's center is 1,000 degrees hotter than previously thought, synchrotron X-ray experiment shows | Scientists have determined the temperature near the Earth's centre to be 6000 degrees Celsius, 1000 degrees hotter than in a previous experiment run 20 years ago. These measurements confirm geophysical models that the temperature difference between the solid core and the mantle above must be at least 1500 degrees to explain why the Earth has a magnetic field. The scientists were even able to establish why the earlier experiment had produced a lower temperature figure. | The results are published on 26 April 2013. The research team was led by Agnès Dewaele from the French national technological research organization CEA, alongside members of the French National Center for Scientific Research CNRS and the European Synchrotron Radiation Facility ESRF in Grenoble (France). The Earth's core consists mainly of a sphere of liquid iron at temperatures above 4000 degrees and pressures of more than 1.3 million atmospheres. Under these conditions, iron is as liquid as the water in the oceans. It is only at the very centre of the Earth, where pressure and temperature rise even higher, that the liquid iron solidifies. Analysis of earthquake-triggered seismic waves passing through the Earth tells us the thickness of the solid and liquid cores, and even how the pressure in the Earth increases with depth. However these waves do not provide information on temperature, which has an important influence on the movement of material within the liquid core and the solid mantle above. Indeed the temperature difference between the mantle and the core is the main driver of large-scale thermal movements, which together with the Earth's rotation, act like a dynamo generating the Earth's magnetic field. The temperature profile through the Earth's interior also underpins geophysical models that explain the creation and intense activity of hot-spot volcanoes like the Hawaiian Islands or La Réunion. To generate an accurate picture of the temperature profile within the Earth's centre, scientists can look at the melting point of iron at different pressures in the laboratory, using a diamond anvil cell to compress speck-sized samples to pressures of several million atmospheres, and powerful laser beams to heat them to 4000 or even 5000 degrees Celsius. "In practice, many experimental challenges have to be met," explains Agnès Dewaele from CEA, "as the iron sample has to be insulated thermally and also must not be allowed to chemically react with its environment. Even if a sample reaches the extreme temperatures and pressures at the centre of the Earth, it will only do so for a matter of seconds. In this short timeframe it is extremely difficult to determine whether it has started to melt or is still solid." This is where X-rays come into play. "We have developed a new technique where an intense beam of X-rays from the synchrotron can probe a sample and deduce whether it is solid, liquid or partially molten within as little as a second, using a process known as diffraction," says Mohamed Mezouar from the ESRF, "and this is short enough to keep temperature and pressure constant, and at the same time avoid any chemical reactions." The scientists determined experimentally the melting point of iron up to 4800 degrees Celsius and 2.2 million atmospheres pressure, and then used an extrapolation method to determine that at 3.3 million atmospheres, the pressure at the border between liquid and solid core, the temperature would be 6000 +/- 500 degrees. This extrapolated value could slightly change if iron undergoes an unknown phase transition between the measured and the extrapolated values. When the scientists scanned across the area of pressures and temperatures, they observed why Reinhard Boehler, then at the MPI for Chemistry in Mainz (Germany), had in 1993 published values about 1000 degrees lower. Starting at 2400 degrees, recrystallization effects appear on the surface of the iron samples, leading to dynamic changes of the solid iron's crystalline structure. The experiment twenty years ago used an optical technique to determine whether the samples were solid or molten, and it is highly probable that the observation of recrystallization at the surface was interpreted as melting. "We are of course very satisfied that our experiment validated today's best theories on heat transfer from the Earth's core and the generation of the Earth's magnetic field. I am hopeful that in the not-so-distant future, we can reproduce in our laboratories, and investigate with synchrotron X-rays, every state of matter inside the Earth," concludes Agnès Dewaele. | Earthquakes | 2,013 |
April 19, 2013 | https://www.sciencedaily.com/releases/2013/04/130419160704.htm | Calculating tsunami risk for the US East Coast | The greatest threat of a tsunami for the U.S. east coast from a nearby offshore earthquake stretches from the coast of New England to New Jersey, according to John Ebel of Boston College, who presented his findings today at the Seismological Society of America 2013 Annual Meeting. | The potential for an East Coast tsunami has come under greater scrutiny after a 2012 earthquake swarm that occurred offshore about 280 kilometers (170 miles) east of Boston. The largest earthquake in the 15-earthquake swarm, most of which occurred on April 12, 2012, was magnitude (M) 4.0. In 2012 several other earthquakes were detected on the edge of the Atlantic continental shelf of North America, with magnitudes between 2 and 3.5. These quakes occurred off the coast of southern Newfoundland and south of Cape Cod, as well as in the area of the April swarm. All of these areas have experienced other earthquake activity in the past few decades prior to 2012. The setting for these earthquakes, at the edge of the continental shelf, is similar to that of the 1929 M7.3 Grand Banks earthquake, which triggered a 10-meter tsunami along southern Newfoundland and left tens of thousands of residents homeless. Ebel's preliminary findings suggest the possibility that an earthquake-triggered tsunami could affect the northeast coast of the U.S. The evidence he cites is the similarity in tectonic settings of the U.S. offshore earthquakes and the major Canadian earthquake in 1929. More research is necessary, says Ebel, to develop a more refined hazard assessment of the probability of a strong offshore earthquake along the northeastern U.S. coast. | Earthquakes | 2,013 |
April 19, 2013 | https://www.sciencedaily.com/releases/2013/04/130419132605.htm | After major earthquake: A global murmur, then unusual silence | In the global aftershock zone that followed the major April 2012 Indian Ocean earthquake, seismologists noticed an unusual pattern. The magnitude (M) 8.6 earthquake, a strike-slip event within an oceanic tectonic plate, caused global seismic rates of M≥4.5 to rise for several days, even at distances thousands of kilometers from the mainshock site. However, the rate of M≥6.5 seismic activity subsequently dropped to zero for the next 95 days. | This period of quiet, without a large quake, has been a rare event in the past century. So why did this period of quiet occur? In his research presentation, Fred Pollitz of the U.S. Geological Survey suggests that the Indian Ocean earthquake caused short-term dynamic stressing of a global faulting system. Across the planet, there are faults that are "close to failure" and ready to rupture. It may be, suggest Pollitz and his colleagues, that a large quake encourages short-term triggering of these close-to-failure faults but also relieves some of the stress that has built up along these faults. Large magnitude events would not occur until tectonic movement loads stress back on to the faults at the ready-to-fail levels they reached before the mainshock. Using a statistical model of global seismicity, Pollitz and his colleagues show that a transient seismic perturbation of the size of the April 2012 global aftershock would inhibit rupture in 88 percent of their possible M≥6.5 earthquake fault sources over the next 95 days, regardless of how close they were to failure beforehand. | Earthquakes | 2,013 |
April 19, 2013 | https://www.sciencedaily.com/releases/2013/04/130419132603.htm | Measuring the hazards of global aftershock | The entire world becomes an aftershock zone after a massive magnitude (M) 7 or larger earthquake -- but what hazard does this pose around the planet? Researchers are working to extend their earthquake risk estimates over a global scale, as they become better at forecasting the impact of aftershocks at a local and regional level. | There is little doubt that surface waves from a large, M≥7 earthquake can distort fault zones and volcanic centers as they pass through Earth's crust, and these waves could trigger seismic activity. According to Tom Parsons, a seismologist with the U.S. Geological Survey, global surveys suggest that there is a significant rate increase in global seismic activity during and in the 45 minutes after a M≥7 quake across all kinds of geologic settings. But it is difficult to find strong evidence that surface waves from these events immediately trigger M>5 earthquakes, and these events may be relatively rare. Nevertheless, seismologists would like to be able to predict the frequency of large triggered quakes in this global aftershock zone and associated hazard. Studies of hundreds of M≥7 mainshock earthquake effects in 21 different regions around the world have provided some initial insights into how likely a damaging global aftershock might be. Initial results show that remote triggering has occurred at least once in about half of the regions studied during the past 30 years. Larger (M>5) global aftershocks appear to be delayed by several hours as compared with their lower magnitude counterparts. Parsons suggests that local seismic networks can monitor the rate of seismic activity immediately after a global mainshock quake, with the idea that a vigorous uptick in activity could signal a possible large aftershock. Parsons presented his research at the annual meeting of the Seismological Society of America. | Earthquakes | 2,013 |
April 19, 2013 | https://www.sciencedaily.com/releases/2013/04/130419105158.htm | Mine disaster: Hundreds of aftershocks | A new University of Utah study has identified hundreds of previously unrecognized small aftershocks that happened after Utah's deadly Crandall Canyon mine collapse in 2007, and they suggest the collapse was as big -- and perhaps bigger -- than shown in another study by the university in 2008. | Mapping out the locations of the aftershocks "helps us better delineate the extent of the collapse at Crandall canyon. It's gotten bigger," says Tex Kubacki, a University of Utah master's student in mining engineering. "We can see now that, prior to the collapse, the seismicity was occurring where the mining was taking place, and that after the collapse, the seismicity migrated to both ends of the collapse zone," including the mine's west end, he adds. Kubacki was scheduled to present the findings Friday in Salt Lake City during the Seismological Society of America's 2013 annual meeting. Six coal miners died in the Aug. 6, 2007 mine collapse, and three rescuers died 10 days later. The mine's owner initially blamed the collapse on an earthquake, but the University of Utah Seismograph Stations said it was the collapse itself, not an earthquake, that registered on seismometers. A 2008 study by University of Utah seismologist Jim Pechmann found the epicenter of the collapse was near where the miners were working, and aftershocks showed the collapse area covered 50 acres, four times larger than originally thought, extending from crosscut 120 on the east to crosscut 143 on the west, where miners worked. A crosscut is a north-south tunnel intersecting the mine's main east-west tunnels. In the new study, the collapse area "looks like it goes farther west -- to the full extent of the western end of the mine," Kubacki says. Study co-author Michael "Kim" McCarter, a University of Utah professor of mining engineering, says the findings are tentative, but "might extend the collapse farther west." He is puzzled because "some of that is in an area where no mining had occurred." Kubacki says one theory is that the seismic events at the west end and some of those at the eastern end of the mine may be caused by "faulting forming along a cone of collapse" centered over the mine. Kubacki and McCarter conducted the new study with seismologists Keith Koper and Kris Pankow of the University of Utah Seismograph Stations. McCarter and Pankow also coauthored the 2008 study. Before the new study, researchers knew of about 55 seismic events -- down to magnitude 1.6 -- near the mine before and after the collapse, which measured 3.9 on the local magnitude scale and 4.1 on the "moment" magnitude scale that better reflects energy release, Kubacki says. The new study analyzed records of seismometers closest to the mine for evidence of tremors down to magnitudes minus-1, which Kubacki says is about one-tenth the energy released by a hand grenade. He found strong statistical evidence that there were at least 759 seismic events before the mine collapse and 569 aftershocks, and weaker evidence that there were as many as 1,022 seismic events before the collapse and 1,167 aftershocks. "We've discovered up to about 2,000 previously unknown events spanning from July 26 to Aug. 30, 2007," Kubacki says, although some of the weak-evidence events may turn out not to be real or to be unrelated to the collapse. The seismic events found in the new study show tremors clustered in three areas: the east end of the collapse area, the area where miners were working toward the mine's west end, and -- new in this study -- at the mine's west end, beyond where miners worked. "We have three clusters to look at and try to come up with an explanation of why there were three," McCarter says. "They are all related to the collapse." Some of the tremors in the eastern cluster are related to rescue attempts and a second collapse that killed three rescuers, but some remain unexplained, he adds. Kubacki says most of the seismic activity before the collapse was due to mining, although scientists want to investigate whether any of those small jolts might have been signs of the impending collapse. So far, however, "there is nothing measured that would have said, 'Here's an event [mine collapse] that's ready to happen,'" McCarter says. Kubacki came up with the new numbers of seismic events by analyzing the records of seismometers closest to Crandall Canyon (about 12 miles away). "We took the known seismic events already in the catalog and searched for events that looked the same," he adds. "These new events kept popping up. There are tiny events that may show up on one station but not network-wide." "Any understanding we can get toward learning how and why mine collapses happen is going to be of interest to the mining community," Kubacki says. McCarter adds: "We are looking at the Crandall Canyon event because we have accurate logs and very extensive seismic data, and that provides a way of investigating the data to see if anything could be applied to other mines to improve safety." | Earthquakes | 2,013 |
April 17, 2013 | https://www.sciencedaily.com/releases/2013/04/130417092130.htm | Helping to forecast earthquakes in Salt Lake Valley | Salt Lake Valley, home to the Salt Lake City segment of the Wasatch fault zone and the West Valley fault zone, has been the site of repeated surface-faulting earthquakes (of about magnitude 6.5 to 7). New research trenches in the area are helping geologists and seismologists untangle how this complex fault system ruptures and will aid in forecasting future earthquakes in the area. | At the annual meeting of the Seismological Society of America (SSA), Christopher DuRoss and Michael Hylland of the Utah Geological Survey will present research today that indicates geologically recent large earthquakes on the West Valley fault zone likely occurred with (or were triggered by) fault movement on the Salt Lake City segment. DuRoss and Hylland consider it less likely that West Valley fault movement happens completely independently from movement on the Salt Lake City segment. This likely pairing has implications for how the seismic hazard in Salt Lake Valley is modeled. The trenches have also helped the researchers revise the history of large earthquakes in the area, showing that the Salt Lake City segment has been more active than previously thought. Since about 14,000 years ago, eight quakes have occurred on the segment. Depending on the time period, these quakes have occurred roughly every 1300 to 1500 years on average. It has been 1400 years since the most recent large earthquake on the segment. The earthquake history of the West Valley fault zone had been largely unknown, but now four earthquakes have been well dated. This new fault research contributes to a broader goal of evaluating Utah's earthquake hazards and risk. For example, this type of information on prehistoric earthquakes will be used by the Working Group on Utah Earthquake Probabilities, formed under the auspices of the Utah Geological Survey and U.S. Geological Survey, to forecast probabilities for future earthquakes in the Wasatch Front region. | Earthquakes | 2,013 |
April 15, 2013 | https://www.sciencedaily.com/releases/2013/04/130415151436.htm | Research aims to settle debate over origin of Yellowstone volcano | A debate among scientists about the geologic formation of the supervolcano encompassing the region around Yellowstone National Park has taken a major step forward, thanks to new evidence provided by a team of international researchers led by University of Rhode Island Professor Christopher Kincaid. | In a publication appearing last week, using a state-of-the-art plate tectonic laboratory model, they showed that volcanism in the Yellowstone area was caused by severely deformed and defunct pieces of a former mantle plume. They further concluded that the plume was affected by circulation currents driven by the movement of tectonic plates at the Cascades subduction zone. Mantle plumes are hot buoyant upwellings of magma inside Earth. Subduction zones are regions where dense oceanic tectonic plates dive beneath buoyant continental plates. The origins of the Yellowstone supervolcano have been argued for years, with sides disagreeing about the role of mantle plumes. According to Kincaid, the simple view of mantle plumes is that they have a head and a tail, where the head rises to the surface, producing immense magma structures, and the trailing tail interacts with the drifting surface plates to create a chain of smaller volcanoes of progressively younger age. But Yellowstone doesn't fit this typical mold. Among its oddities, its eastward trail of smaller volcanoes called the Snake River Plain has a mirror-image volcanic chain, the High Lava Plain, that extends to the west. As a result, detractors say the two opposite trails of volcanoes and the curious north-south offset prove the plume model simply cannot work for this area, and that a plates-only model must be at work. To examine these competing hypotheses, Kincaid, former graduate student Kelsey Druken, and colleagues at the Australian National University built a laboratory model of Earth's interior using corn syrup to simulate fluid-like motion of Earth's mantle. The corn syrup has properties that allow researchers to examine complex time-changing, three-dimensional motions caused by the collisions of tectonic plates at subduction zones and their effect on unsuspecting buoyant plumes. By using the model to simulate a mantle plume in the Yellowstone region, the researchers found that it reproduced the characteristically odd patterns in volcanism that are recorded in the rocks of the Pacific Northwest. "Our model shows that a simple view of mantle plumes is not appropriate when they rise near subduction zones, and that these features get ripped apart in a way that seems to match the patterns in magma output in the northwestern U.S. over the past 20 million years," said Kincaid, a professor of geological oceanography at the URI Graduate School of Oceanography. "The sinking plate produces a flow field that dominates the interaction with the plume, making the plume passive in many ways and trapping much of the magma producing energy well below the surface. What you see at the surface doesn't look like what you'd expect from the simple models." The next step in Kincaid's research is to conduct a similar analysis of the geologic formations in the region around the Tonga subduction zone and the Samoan Islands in the South Pacific, another area where some scientists dispute the role of mantle plumes. According to Kincaid, "A goal of geological oceanography is to understand the relationship between Earth's convecting interior and our oceans over the entire spectrum of geologic time. This feeds directly into the very pressing need for understanding where Earth's ocean-climate system is headed, which clearly hinges on our understanding of how it has worked in the past." | Earthquakes | 2,013 |
April 15, 2013 | https://www.sciencedaily.com/releases/2013/04/130415094845.htm | The Fukushima Dai-ichi Nuclear Power Plant accident: Two years on, the fallout continues | More than two years after the earthquake and tsunami that devastated parts of Japan, scientists are still trying to quantify the extent of the damage. | Of particular importance is determining just how much hazardous material escaped into the atmosphere from the stricken Fukushima Dai-ichi nuclear power plant in the period following the disaster on 11 March 2011. Scientists estimate a 'source term' (the types and amounts of hazardous materials released following an accident) by running computerised atmospheric and oceanic dispersal simulations and collecting samples from seawater. Data from the Fukushima incident is unfortunately plentiful. Immediately after the accident some radionuclides were carried east by a strong jet stream and reached the west coast of North America in just four days; other airborne radionuclides were eventually deposited into the Pacific Ocean. Further releases of hazardous material occurred through accidental and intentional discharges of contaminated water from the plant into the ocean. Further research and modelling is needed to improve the new estimate, but this study is an important step in understanding the likely effects of the Fukushima incident on the marine environment by providing a clearer picture of how much hazardous material was actually released. | Earthquakes | 2,013 |
April 5, 2013 | https://www.sciencedaily.com/releases/2013/04/130405064400.htm | The resilience of the Chilean coast after the earthquake of 2010 | In February 2010, a violent earthquake struck Chile, causing a tsunami 10 m in height. Affecting millions of people, the earthquake and giant wave also transformed the appearance of the coastline: the dunes and sandbars were flattened, and the coast subsided in places by up to 1 m. But although the inhabitants are still affected for the long term, the shore system quickly rebuilt itself. A team from IRD and its Chilean partners(1) showed that in less than a year, the sedimentary structures had reformed. The Chilean coast therefore represented a unique "natural laboratory" for studying coastal formation processes. The subsidence of the coast also revealed the effects of rising sea levels on shores. | In addition to the material and human damage, the consequences of the earthquake and tsunami on the coastline biology and appearance were very severe. For lack of previous observations, it was the first time that a scientific team, bringing together IRD researchers and their Chilean partners(1), have been able to describe the geomorphological impact of such a catastrophe.Less than a week after the event, the international team had been formed and was making observations, at first on a spot basis, to evaluate the impact on 800 km of coast. The topographical and GPS surveys showed that the tsunami acted like a bulldozer, destroying existing structures: dunes, underwater sandbars, beaches, etc. This "reset" made the Chilean coast a unique case for the scientists to understand the formation of these geomorphological structures.A twice-monthly follow-up of the natural reconstruction of the coastline was then conducted by means of topographical surveys, satellite imaging and geo-referenced photos. They found that the shore had responded quickly to the disaster. After a few months, most of the sandy coastal structures had rebuilt themselves -- but with a different morphology. Unexpectedly, within one year the sediment system had found a new equilibrium(2), different to that preceding the earthquake.The earthquake lifted the offshore bar south of the epicentre, whereas for around 100 km to the north, it sunk by tens of centimetres to one metre. This subsidence reproduced within a few minutes the effects that the rise in sea level predicted over the coming decades would have(3). This makes the Chilean coastline a unique natural "laboratory" to better anticipate the impacts of global warming on coasts. Until now, models based their projections on a simple equation, known as the "Bruun equation"(4). Thanks to their observations, the researchers have just shown that reality appears to be more complex than predicted(5).In December 2012, a joint mission with the Chilean partners allowed a permanent observation system for continuously tracking the dynamic of the shore to be created. The recent creation of the The redistribution of the land masses due to this earthquake slightly reduced the planet's moment of inertia, i.e. its resistance to rotation, thereby shortening the length of the day by 1.26 millionths of a second.Worth knowing :On 27 February 2010, a mega-earthquake of a magnitude of 8.8 off the coast of Chile and the tsunami 10 m high which followed, caused more than 600 victims, and affected millions of Chileans. 
Collapsed buildings and bridges, power and telephone lines cut… the losses were evaluated at more than 15 billion dollars. It was one of the six most powerful earthquakes ever recorded on the planet. The Earth's crust broke over 500 km, along an ocean fault situated just 6 km off the Chilean coast. Notes: (2) (3) Global warming is melting ice and expanding surface water. The oceans will thus rise by around 1 m by 2100, according to the latest projections. (4) Bruun's equation says that the retreat of the coastline will be proportional to the rise in sea level. (5) The team compared two bays where the ground level had subsided by 80 cm. In Duao Bay, the beach | Earthquakes | 2,013
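A note on the "Bruun equation" cited in note (4): the article states only that shoreline retreat is proportional to sea-level rise. One commonly quoted form of the rule -- an assumption here, since the article does not spell the relation out -- is

R = \frac{L}{B + h_*}\, S

where R is the shoreline retreat, S the sea-level rise, L the cross-shore length of the active beach profile, B the berm height and h_* the closure depth. Because L is typically much larger than B + h_*, the rule predicts a retreat of very roughly 50 to 100 times the rise, which is why the sudden ~80 cm of coseismic subsidence mentioned in note (5) makes such a demanding natural test of the relation.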
April 3, 2013 | https://www.sciencedaily.com/releases/2013/04/130403141402.htm | Rocky mountains originated from previously unknown oceanic plate | The mountain ranges of the North American Cordillera are made up of dozens of distinct crustal blocks. A new study clarifies their mode of origin and identifies a previously unknown oceanic plate that contributed to their assembly. | The extensive area of elevated topography that dominates the Western reaches of North America is exceptionally broad, encompassing the coastal ranges, the Rocky Mountains and the high plateaus in between. In fact, this mountain belt consists of dozens of crustal blocks of varying age and origin, which have been welded onto the American continent over the past 200 million years. "How these blocks arrived in North America has long been a puzzle," says LMU geophysicist Karin Sigloch, who has now taken a closer look at the problem, in collaboration with the Canadian geologist Mitchell Mihalynuk.One popular model for the accretion process postulates that a huge oceanic plate -- the Farallon Plate -- acted as a conveyor belt to sweep crustal fragments eastwards to the margin of American Plate, to which they were attached as the denser Farallon Plate was subducted under it. However, this scenario is at variance with several geological findings, and does not explain why the same phenomenon is not observed on the west coast of South America, the classical case of subduction of oceanic crust beneath a continental plate. The precise source of the crustal blocks themselves has also remained enigmatic, although geological studies suggest that they derive from several groups of volcanic islands. "The geological strata in North America have been highly deformed over the course of time, and are extremely difficult to interpret, so these findings have not been followed up," says Sigloch.Sigloch and Mihalynuk have now succeeded in assembling a comprehensive picture of the accretion process by incorporating geophysical findings obtained by seismic tomography. This technique makes it possible to probe the geophysical structure of Earth's interior down to the level of the lower mantle by analyzing the propagation velocities of seismic waves. The method can image the remnants of ancient tectonic plates at great depths, ocean floor that subducted, i.e., disappeared from the surface and sank back into the mantle, long time ago.Most surprisingly, the new data suggest that the Farallon Plate was far smaller than had been assumed, and underwent subduction well to the west of what was then the continental margin of North America. Instead it collided with, and subducted under, an intervening and previously unrecognized oceanic plate. Sigloch and Mihalynuk were able to locate the remnants of several deep-sea trenches that mark subduction sites at which oceanic plates plunge at a steep angle into the mantle and are drawn almost vertically into its depths. "The volcanic activity that accompanies the subduction process will have generated lots of new crustal material, which emerged in the form of island arcs along the line of the trenches, and provided the material for the crustal blocks," Sigloch explains.As these events were going on, the American Plate was advancing steadily westwards, as indicated by striped patterns of magnetized seafloor in the North Atlantic. The first to get consumed was the previously unknown oceanic plate, which can be detected seismologically beneath today's east coast of North America. 
Only then did the continent begin to encounter the Farallon plate. On its westward journey, North America overrode one intervening island arc after another -- annexing ever more of them for the construction of its wide mountains of the West. | Earthquakes | 2,013 |
April 3, 2013 | https://www.sciencedaily.com/releases/2013/04/130403104248.htm | Earth is 'lazy' when forming faults like those near San Andreas | Geoscientist Michele Cooke and colleagues at the University of Massachusetts Amherst take an uncommon, "Earth is lazy" approach to modeling fault development in the crust that is providing new insights into how faults grow. In particular, they study irregularities along strike-slip faults, the active zones where plates slip past each other such as at the San Andreas Fault of southern California. | Until now there has been a great deal of uncertainty among geologists about the factors that govern how new faults grow in regions where one plate slides past or over another around a bend, says Cooke. In their study published in an early online edition of the Testing ideas about how Earth's crust behaves in real time is impossible because actions unfold over many thousands of years, and success in reconstructing events after the fact is limited. A good analog for laboratory experiments has been a goal for decades. "Geologists don't agree on how the earth's crust handles restraining bends along faults. There's just a lack of evidence. When researchers go out in the field to measure faults, they can't always tell which one came first, for example," Cooke says.Unlike most geoscience researchers, she takes a mechanical efficiency approach to study dynamic fault systems' effectiveness at transforming input energy into force and movement. For example, a straight fault is more efficient at accommodating strain than a bumpy fault. For this reason Cooke is very interested in how the efficiency of fault bends evolves with increasing deformation.Her data suggest that at restraining bends, the crust behaves in accord with "work minimization" principles, an idea she dubs the "Lazy Earth" hypothesis. "Our approach offers some of the first system-type evidence of how faults evolve around restraining bends," she says.Further, Cooke's UMass Amherst lab is one of only a handful worldwide to use a relatively new modeling technique that uses kaolin clay rather than sand to better understand the behavior of Earth's crust.For these experiments, she and colleagues Mariel Schottenfeld and Steve Buchanan, both undergraduates at the time, used a clay box or tray loaded with kaolin, also known as china clay, prepared very carefully so its viscosity scales to that of the earth's crust. When scaled properly, data from clay experiments conducted over several hours in a table-top device are useful in modeling restraining bend evolution over thousands of years and at the scale of tens of kilometers.Cooke says sand doesn't remember faults the way kaolin can. In an experiment of a bend in a fault, sand will just keep forming new faults. But clay will remember an old fault until it's so inefficient at accommodating the slip that a new fault will eventually form in a manner much more similar to what geologists see on the ground.Another innovation Cooke and colleagues use is a laser scan to map the clay's deformation over time and to collect quantitative data about the system's efficiency. "It's a different approach than the conventional one," Cooke acknowledges. "I think about fault evolution in terms of work and efficiency. With this experiment we now have compelling evidence from the clay box experiment that the development of new faults increases the efficiency of the system. 
There is good evidence to support the belief that faults grow to improve efficiency in the Earth's crust as well." "We're moving toward much more precision within laboratory experiments," she adds. "This whole field has been revolutionized in the past six years. It's an exciting time to be doing this sort of modeling. Our paper demonstrates the mastery we now can have over this method." The observation that a fault's active zone can shift location significantly over 10,000 years is very revealing, Cooke says, and has important implications for understanding seismic hazards. The more geologists understand fault development, the better they may be able to predict earthquake hazards and understand Earth's evolution, she points out. Funding for this work came from grants from the National Science Foundation and the Southern California Earthquake Center. | Earthquakes | 2,013
April 2, 2013 | https://www.sciencedaily.com/releases/2013/04/130402144525.htm | Seismic hazards: Seismic simulation code speeds up | A team of researchers at the San Diego Supercomputer Center (SDSC) and the Department of Electronic and Computer Engineering at the University of California, San Diego, has developed a highly scalable computer code that promises to dramatically cut both research times and energy costs in simulating seismic hazards throughout California and elsewhere. | The team, led by Yifeng Cui, a computational scientist at SDSC, developed the scalable GPU (graphics processing unit) accelerated code for use in earthquake engineering and disaster management through regional earthquake simulations at the petascale level as part of a larger computational effort coordinated by the Southern California Earthquake Center (SCEC). San Diego State University (SDSU) is also part of this collaborative effort in pushing the envelope toward extreme-scale earthquake computing. "The increased capability of GPUs, combined with the high-level GPU programming language CUDA, has provided the tremendous horsepower required for acceleration of numerically intensive 3D simulation of earthquake ground motions," said Cui, who recently presented the team's new development at the NVIDIA 2013 GPU Technology Conference (GTC) in San Jose, Calif. A technical paper based on this work will be presented June 5-7 at the 2013 International Conference on Computational Science in Barcelona, Spain. The accelerated code, which runs on GPUs rather than CPUs (central processing units), is based on a widely used wave propagation code called AWP-ODC, which stands for Anelastic Wave Propagation by Olsen, Day and Cui. It was named after Kim Olsen and Steven Day, geological science professors at SDSU, and SDSC's Cui. The research team restructured the code to exploit high performance and throughput, memory locality, and overlapping of computation and communication, which made it possible to scale the code linearly to more than 8,000 NVIDIA Kepler GPU accelerators. The team performed GPU-based benchmark simulations of the 5.4 magnitude earthquake that occurred in July 2008 below Chino Hills, near Los Angeles. By delivering a significantly higher level of computational power, the accelerated code allows researchers to provide more accurate earthquake predictions with increased physical reality and resolution, with the potential of saving lives and minimizing property damage. "This is an impressive achievement that has made petascale-level computing a reality for us, opening up some new and really interesting possibilities for earthquake research," said Thomas Jordan, director of SCEC, which has been collaborating with UC San Diego and SDSU researchers on this and other seismic research projects, such as the simulation of a magnitude 8.0 earthquake, the largest such simulation to date. "Substantially faster and more energy-efficient earthquake codes are urgently needed for improved seismic hazard evaluation," said Cui, citing the recent destructive earthquakes in China, Haiti, Chile, New Zealand, and Japan. While the GPU-based AWP-ODC code is already in research use, further enhancements are being planned for use on hybrid heterogeneous architectures. "One goal going forward is to use this code to calculate an improved probabilistic seismic hazard forecast for the California region under a collaborative effort coordinated by SCEC," said Cui.
"Our ultimate goal is to support development of a CyberShake model that can assimilate information during earthquake cascades so we can improve our operational forecasting and early warning systems."CyberShake is a SCEC project focused on developing new approaches to performing seismic hazard analyses using 3D waveform modeling. The GPU-based code has potential to save hundreds of millions of CPU-hours required to complete statewide seismic hazard map calculations in planning.Additional members on the UC San Diego research team include Jun Zhou and Efecan Poyraz, graduate students with the university's Department of Electrical and Computer Engineering (Zhou devoted his graduate research to this development work); SDSC researcher Dong Ju Choi; and Clark C. Guest, an associate professor of electrical and computer engineering at UC San Diego's Jacobs School of Engineering.Compute resources used for this research are supported by XSEDE under NSF grant number OCI-1053575, while additional funding for research was provided through XSEDE's Extended Collaborative Support Service (ECSS) program."ECSS exists for exactly this reason, to help a research team make significant performance gains and take their simulations to the next level," said Nancy Wilkins-Diehr, co-director of the ECSS program and SDSC's associate director. "We're very pleased with the results we were able to achieve for PI Thomas Jordan and his team. ECSS projects are typically conducted over several months to up to one year. This type of targeted support may be requested by anyone through the XSEDE allocations process."Additional funding came from the UC San Diego Graduate Program, Petascale Research in Earthquake System Science on Researchers acknowledge the following individuals for their contributions: Kim Olsen and Steven Day of SDSU; Amit Chourasia of SDSC (visualizations); Jeffrey Vetter of ORNL and his | Earthquakes | 2,013 |
March 31, 2013 | https://www.sciencedaily.com/releases/2013/03/130331165559.htm | Congestion in Earth's mantle: Mineralogists explain why plate tectonics stagnates in some places | Earth is dynamic. What we perceive as solid ground beneath our feet is in reality constantly changing. Each year Africa and America drift a few centimeters further apart along the Mid-Atlantic Ridge, while the floor of the Pacific Ocean is subducted underneath the South American continent. "In 100 million years' time Africa will be pulled apart and North Australia will be at the equator," says Prof. Dr. Falko Langenhorst from the Friedrich Schiller University Jena (Germany). Plate tectonics leads to a permanent renewal of the ocean floors, the mineralogist explains. The gaps between the drifting slabs are filled by rising melt, which solidifies into new oceanic crust. In other regions the slabs dive into the deep interior of Earth and mix with the surrounding mantle. | Earth is the only planet in our solar system that conducts such a 'facelift' on a regular basis. But this continuous up-and-down movement of Earth's crust doesn't run smoothly everywhere. "Seismic measurements show that in some mantle regions, where one slab is subducted underneath another one, the movement stagnates as soon as the rocks have reached a certain depth," says Prof. Langenhorst. The causes of this 'congestion' of the subducted plate were previously unknown. In a newly published study, the researchers offer an explanation: the rocks of the subducting ocean plate pond at a depth of 440 to 650 kilometers -- in the transition zone between the upper and lower mantle. "The reason for that can be found in the slow diffusion and transformation of mineral components," mineralogist Langenhorst explains. On the basis of high-pressure experiments the scientists were able to clarify the mechanism: under the pressures and temperatures at this depth, the exchange of elements between the main minerals of the subducted ocean plate -- pyroxene and garnet -- is slowed down to an extreme extent. "The diffusion of a pyroxene component in garnet is so slow that the subducting rocks don't become denser and heavier, and therefore stagnate," the Jena scientist says. Interestingly, there is congestion in Earth's mantle exactly where the ocean floor subducts particularly fast into the interior of Earth. "At the Tonga trench, for example, the speed of subduction is very high," Prof. Langenhorst states. As a result, the subducting rocks of the oceanic plate stay relatively cold down to great depth, which makes the exchange of elements between the mineral components exceptionally difficult. "It takes about 100 million years for pyroxene crystals which are only 1 mm in size to diffuse into the garnet. For this amount of time the subducting plate stagnates," Langenhorst says, describing the rock congestion. The congestion can probably only clear at the boundary of the lower mantle, because there pyroxene changes into the mineral akimotoite due to the higher pressure at a depth of 650 kilometers. "This could lead to an immediate rise in the rock density and would enable the plate to sink to greater depths." | Earthquakes | 2,013
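The 1 mm / 100-million-year figure quoted above implies an extremely small interdiffusion coefficient. As a rough order-of-magnitude check (an editorial back-of-the-envelope estimate using the standard diffusion-length relation, not a number taken from the study):

D \approx \frac{L^2}{t} = \frac{(10^{-3}\,\mathrm{m})^2}{3.2\times 10^{15}\,\mathrm{s}} \approx 3\times 10^{-22}\,\mathrm{m^2\,s^{-1}}

where L is the crystal size (1 mm) and t is 100 million years expressed in seconds. A coefficient of order 10^-22 m^2/s is what "slowed down to an extreme extent" means in practice: essentially no element exchange over plate-tectonic timescales until the pressure-induced transformation to akimotoite intervenes.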
March 27, 2013 | https://www.sciencedaily.com/releases/2013/03/130327144127.htm | Scientists image deep magma beneath Pacific seafloor volcano | Since the plate tectonics revolution of the 1960s, scientists have known that new seafloor is created throughout the major ocean basins at linear chains of volcanoes known as mid-ocean ridges. But where exactly does the erupted magma come from? | Researchers at Scripps Institution of Oceanography at UC San Diego now have a better idea after capturing a unique image of a site deep in the Earth where magma is generated.Using electromagnetic technology developed and advanced at Scripps, the researchers mapped a large area beneath the seafloor off Central America at the northern East Pacific Rise, a seafloor volcano located on a section of the global mid-ocean ridges that together form the largest and most active chain of volcanoes in the solar system. By comparison, the researchers say the cross-section area of the melting region they mapped would rival the size of San Diego County.Details of the image and the methods used to capture it are published in the March 28 issue of the journal "Our data show that mantle upwelling beneath the mid-ocean ridge creates a deeper and broader melting region than previously thought," said Kerry Key, lead author of the study and an associate research geophysicist at Scripps. "This was the largest project of its kind, enabling us to image the mantle with a level of detail not possible with previous studies."The northern East Pacific Rise is an area where two of the planet's tectonic plates are spreading apart from each another. Mantle rising between the plates melts to generate the magma that forms fresh seafloor when it erupts or freezes in the crust.Data for the study was obtained during a 2004 field study conducted aboard the research vessel The marine electromagnetic technology behind the study was originally developed in the 1960s by Charles "Chip" Cox, an emeritus professor of oceanography at Scripps, and his student Jean Filloux. In recent years the technology was further advanced by Steven Constable and Key. Since 1995 Scripps researchers have been working with the energy industry to apply this technology to map offshore geology as an aid to exploring for oil and gas reservoirs."We have been working on developing our instruments and interpretation software for decades, and it is really exciting to see it all come together to provide insights into the fundamental processes of plate tectonics," said Constable, a coauthor of the paper and a professor in the Cecil H. and Ida M. Green Institute of Geophysics and Planetary Physics at Scripps. "It was really a surprise to discover that melting started so deep in the mantle -- much deeper than was expected."Key believes the insights that electromagnetics provides will continue to grow as the technology matures and data analysis techniques improve (last week Key and his colleagues announced the use of electromagnetics in discovering a magma lubricant for the planet's tectonic plates)."Electromagnetics is really coming of age as a tool for imaging the earth," said Key. "Much of what we know about the crust and mantle is a result of using seismic techniques. 
Now electromagnetic technology is offering promise for further discoveries." Key also has future plans to apply electromagnetic technology to map subglacial lakes and groundwater in the polar regions. In addition to Key and Constable, coauthors of the paper include Lijun Liu of the University of Illinois and Anne Pommier of Arizona State University. The study was supported by the National Science Foundation and the Seafloor Electromagnetic Methods Consortium at Scripps. | Earthquakes | 2,013
March 26, 2013 | https://www.sciencedaily.com/releases/2013/03/130326151125.htm | 2011 Oklahoma temblor: Wastewater injection spurred biggest earthquake yet, study says | A new study in the journal | The recent boom in U.S. energy production has produced massive amounts of wastewater. The water is used both in hydrofracking, which cracks open rocks to release natural gas, and in coaxing petroleum out of conventional oil wells. In both cases, the brine and chemical-laced water has to be disposed of, often by injecting it back underground elsewhere, where it has the potential to trigger earthquakes. The water linked to the Prague quakes was a byproduct of oil extraction at one set of oil wells, and was pumped into another set of depleted oil wells targeted for waste storage.Scientists have linked a rising number of quakes in normally calm parts of Arkansas, Texas, Ohio and Colorado to below-ground injection. In the last four years, the number of quakes in the middle of the United States jumped 11-fold from the three decades prior, the authors of the Geology study estimate. Last year, a group at the U.S. Geological Survey also attributed a remarkable rise in small- to mid-size quakes in the region to humans. The risk is serious enough that the National Academy of Sciences, in a report last year called for further research to "understand, limit and respond" to induced seismic events. Despite these studies, wastewater injection continues near the Oklahoma earthquakes.The magnitude 5.7 quake near Prague was preceded by a 5.0 shock and followed by thousands of aftershocks. What made the swarm unusual is that wastewater had been pumped into abandoned oil wells nearby for 17 years without incident. In the study, researchers hypothesize that as wastewater replenished compartments once filled with oil, the pressure to keep the fluid going down had to be ratcheted up. As pressure built up, a known fault -- known to geologists as the Wilzetta fault--jumped. "When you overpressure the fault, you reduce the stress that's pinning the fault into place and that's when earthquakes happen," said study coauthor Heather Savage, a geophysicist at Columbia University's Lamont-Doherty Earth Observatory.The amount of wastewater injected into the well was relatively small, yet it triggered a cascading series of tremors that led to the main shock, said study co-author Geoffrey Abers, also a seismologist at Lamont-Doherty. "There's something important about getting unexpectedly large earthquakes out of small systems that we have discovered here," he said. The observations mean that "the risk of humans inducing large earthquakes from even small injection activities is probably higher" than previously thought, he said.Hours after the first magnitude 5.0 quake on Nov. 5, 2011, University of Oklahoma seismologist Katie Keranen rushed to install the first three of several dozen seismographs to record aftershocks. That night, on Nov. 6, the magnitude 5.7 main shock hit and Keranen watched as her house began to shake for what she said felt like 20 seconds. "It was clearly a significant event," said Keranen, the Geology study's lead author. "I gathered more equipment, more students, and headed to the field the next morning to deploy more stations."Keranen's recordings of the magnitude 5.7 quake, and the aftershocks that followed, showed that the first Wilzetta fault rupture was no more than 650 feet from active injection wells and perhaps much closer, in the same sedimentary rocks, the study says. 
Further, wellhead records showed that after 13 years of pumping at zero to low pressure, injection pressure rose more than 10-fold from 2001 to 2006, the study says.The Oklahoma Geological Survey has yet to issue an official account of the sequence, and wastewater injection at the site continues. In a statement responding to the paper, Survey seismologist Austin Holland said the study showed the earthquake sequence could have been triggered by the injections. But, he said, "it is still the opinion of those at the Oklahoma Geological Survey that these earthquakes could be naturally occurring. There remain many open questions, and more scientific investigations are underway on this sequence of earthquakes and many others within the state of Oklahoma."The risk of setting off earthquakes by injecting fluid underground has been known since at least the 1960s, when injection at the Rocky Mountain Arsenal near Denver was suspended after a quake estimated at magnitude 4.8 or greater struck nearby -- the largest tied to wastewater disposal until the one near Prague, Okla. A series of similar incidents have emerged recently. University of Memphis seismologist Stephen Horton in a study last year linked a rise in earthquakes in north-central Arkansas to nearby injection wells. University of Texas, Austin, seismologist Cliff Frohlich in a 2011 study tied earthquake swarms at the Dallas-Fort Worth Airport to a brine disposal well a third of a mile away. In Ohio, Lamont-Doherty seismologists Won-Young Kim and John Armbruster traced a series of 2011 earthquakes near Youngstown to a nearby disposal well. That well has since been shut down, and Ohio has tightened its waste-injection rules.Wastewater injection is not the only way that people can touch off quakes. Evidence suggests that geothermal drilling, impoundment of water behind dams, enhanced oil recovery, solution salt mining and rock quarrying also can trigger seismic events. (Hydrofracking itself is not implicated in significant earthquakes; the amount of water used is usually not enough to produce substantial shaking.) The largest known earthquakes attributed to humans may be the two magnitude 7.0 events that shook the Gazli gas fields of Soviet Uzbekistan in 1976, followed by a third magnitude 7.0 quake eight years later. In a 1985 study in the Bulletin of the Seismological Society of America, Lamont-Doherty researchers David Simpson and William Leith hypothesized that the quakes were human-induced but noted that a lack of information prevented them from linking the events to gas production or other triggers. In 2009, a geothermal energy project in Basel, Switzerland, was canceled after development activities apparently led to a series of quakes of up to magnitude 3.4 that caused some $8 million in damage to surrounding properties.In many of the wastewater injection cases documented so far, earthquakes followed within days or months of fluid injection starting. In contrast, the Oklahoma swarm happened years after injection began, similar to swarms at the Cogdell oil field in West Texas and the Fort St. John area of British Columbia.The Wilzetta fault system remains under stress, the study's authors say, yet regulators continue to allow injection into nearby wells. Ideally, injection should be kept away from known faults and companies should be required to provide detailed records of how much fluid they are pumping underground and at what pressure, said Keranen. 
The study authors also recommend sub-surface monitoring of fluid pressure for earthquake warning signs. Further research is needed but at a minimum, "there should be careful monitoring in regions where you have injection wells and protocols for stopping pumping even when small earthquakes are detected," said Abers. In a recent op-ed in the Albany (N.Y.) Times Union, Abers argued that New York should consider the risk of induced earthquakes from fluid injection in weighing whether to allow hydraulic fracturing to extract the state's shale gas reserves.The study was also coauthored by Elizabeth Cochran of the U.S. Geological Survey. | Earthquakes | 2,013 |
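Heather Savage's remark that overpressuring a fault "reduce[s] the stress that's pinning the fault into place" is conventionally expressed with the Coulomb failure criterion including pore pressure. The relation below is standard textbook background, not a formula quoted from the Geology paper:

\tau \geq C + \mu\,(\sigma_n - P)

Here τ is the shear stress acting along the fault, σ_n the normal stress clamping it shut, P the pore-fluid pressure, μ the friction coefficient and C the cohesion. Injecting wastewater raises P, lowers the effective normal stress (σ_n − P), and can therefore bring a fault that is already loaded close to failure -- like the Wilzetta fault -- to the point of slipping.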
March 20, 2013 | https://www.sciencedaily.com/releases/2013/03/130320155224.htm | Can intraplate earthquakes produce stronger shaking than at plate boundaries? | New information about the extent of the 1872 Owens Valley earthquake rupture, which occurs in an area with many small and discontinuous faults, may support a hypothesis proposed by other workers that these types of quakes could produce stronger ground shaking than plate boundary earthquakes underlain by oceanic crust, like many of those taking place along the San Andreas fault. | Published estimates of the 1872 Owens Valley earthquake in southeastern California put the quake at a magnitude 7.4-7.5 to 7.7-7.9. Early work indicates the Owens Valley fault is ~140 kilometers long, and ~113 kilometers ruptured in 1872. Recent work comparing magnitude estimates from reported shaking effects versus fault rupture parameters suggests that the Owens Valley surface rupture was either longer than previously suspected, or that there was unusually strong ground shaking during the event. Colin Amos of Western Washington University and colleagues tested the hypothesis that the 1872 rupture may have extended farther to the south in Owens Valley. They conclude that the 1872 Owens Valley earthquake did not trigger additional rupture in the Haiwee area, indicating that the 1872 rupture was not likely significantly longer than previously reported.Amos and colleagues dug trenches in the southwestern Owens Valley area to look at the prominent Sage Flat fault east of Haiwee Reservoir. The trench data, combined with dating of the exposed sediment, allowed them to preclude the southern extent of the 1872 rupture from the Sage Flat area and identify two other much older surface-rupturing earthquakes in the area 25,000 to 30,000 years ago. The evaluation of their trench site suggests that the only recent ground disturbance, possibly coincident with the 1872 earthquake, was mostly weak fracturing that may have resulted from ground shaking -- rather than triggered slip along a fault. Soil liquefaction -- the conversion of soil into a fluid-like mass during earthquakes -- likely occurred at other nearby saturated wetlands and meadows closer to the axis of the valley. | Earthquakes | 2,013 |
March 20, 2013 | https://www.sciencedaily.com/releases/2013/03/130320155222.htm | Roman mausoleum tested for ancient earthquake damage | Built under a sheer cliff, with a commanding view of the forum and castle in the ancient city of Pinara in Turkey, a Roman mausoleum has been knocked off-kilter, its massive building blocks shifted and part of its pediment collapsed. The likely cause is an earthquake, according to a new detailed model by Klaus-G. Hinzen and colleagues at the University of Cologne. They conclude that a 6.3 magnitude earthquake could have caused the damage, and their new finding gives seismologists a new data point to consider when they calculate the likely earthquake hazards for this southwestern region of Turkey. | Researchers have seen other signs of strong seismic activity in Pinara, most notably a raised edge to the ancient town's Roman theater that appears to be due to activity along a fault. But archaeologists and seismologists were not certain how the mausoleum sustained its damage. An earthquake seemed likely, but the mausoleum is also built under a cliff honeycombed with numerous other tombs, and damage from a rockfall seemed possible.Hinzen and colleagues mapped the position of each part of the mausoleum using laser scans, and transferred 90 million data points collected from the scans into a 3-D computer model of the tomb. They then ran several damage simulations on the 3-D model, concluding that rockfall was not a likely cause of damage, but that an earthquake with magnitude 6.3 would be sufficient to produce the observed damage pattern to the mausoleum's heavy stone blocks. | Earthquakes | 2,013 |
March 20, 2013 | https://www.sciencedaily.com/releases/2013/03/130320142705.htm | Scientists discover 'lubricant' for Earth's tectonic plates: Hidden magma layer could play role in earthquakes | Scientists at Scripps Institution of Oceanography at UC San Diego have found a layer of liquefied molten rock in Earth's mantle that may be acting as a lubricant for the sliding motions of the planet's massive tectonic plates. The discovery may carry far-reaching implications, from solving basic geological functions of the planet to a better understanding of volcanism and earthquakes. | The scientists discovered the magma layer at the Middle America trench offshore Nicaragua. Using advanced seafloor electromagnetic imaging technology pioneered at Scripps, the scientists imaged a 25-kilometer- (15.5-mile-) thick layer of partially melted mantle rock below the edge of the Cocos plate where it moves underneath Central America.The discovery is reported in the March 21 issue of the journal The new images of magma were captured during a 2010 expedition aboard the U.S. Navy-owned and Scripps-operated research vessel "This was completely unexpected," said Key, an associate research geophysicist in the Cecil H. and Ida M. Green Institute of Geophysics and Planetary Physics at Scripps. "We went out looking to get an idea of how fluids are interacting with plate subduction, but we discovered a melt layer we weren't expecting to find at all -- it was pretty surprising."For decades scientists have debated the forces and circumstances that allow the planet's tectonic plates to slide across Earth's mantle. Studies have shown that dissolved water in mantle minerals results in a more ductile mantle that would facilitate tectonic plate motions, but for many years clear images and data required to confirm or deny this idea were lacking."Our data tell us that water can't accommodate the features we are seeing," said Naif, a Scripps graduate student and lead author of the paper. "The information from the new images confirms the idea that there needs to be some amount of melt in the upper mantle and that's really what's creating this ductile behavior for plates to slide."The marine electromagnetic technology employed in the study was originated by Charles "Chip" Cox, an emeritus professor of oceanography at Scripps, and in recent years further advanced by Constable and Key. Since 2000 they have been working with the energy industry to apply this technology to map offshore oil and gas reservoirs.The researchers say their results will help geologists better understand the structure of the tectonic plate boundary and how that impacts earthquakes and volcanism."One of the longer-term implications of our results is that we are going to understand more about the plate boundary, which could lead to a better understanding of earthquakes," said Key.The researchers are now seeking to find the source that supplies the magma in the newly discovered layer.The National Science Foundation and the Seafloor Electromagnetic Methods Consortium at Scripps supported the research. | Earthquakes | 2,013 |
March 18, 2013 | https://www.sciencedaily.com/releases/2013/03/130318180438.htm | Slabs of ancient tectonic plate still lodged under California | The Isabella anomaly -- indications of a large mass of cool, dehydrated material about 100 kilometers beneath central California -- is in fact a surviving slab of the Farallon oceanic plate. Most of the Farallon plate was driven deep into the Earth's mantle as the Pacific and North American plates began converging about 100 million years ago, eventually coming together to form the San Andreas fault. | Large chunks of an ancient tectonic plate that slid under North America millions of years ago are still present under parts of central California and Mexico, according to new research led by Brown University geophysicists.Around 100 million years ago, the Farallon oceanic plate lay between the converging Pacific and North American plates, which eventually came together to form the San Andreas fault. As those plates converged, much of the Farallon was subducted underneath North America and eventually sank deep into the mantle. Off the west coast of North America, the Farallon plate fragmented, leaving a few small remnants at the surface that stopped subducting and became part of the Pacific plate.But this new research suggests that large slabs from Farallon remain attached to these unsubducted fragments. The researchers used seismic tomography and other data to show that part of the Baja region and part of central California near the Sierra Nevada mountains sit atop "fossil" slabs of the Farallon plate."Many had assumed that these pieces would have broken off quite close to the surface," said Brown geophysicist Donald Forsyth, who led the research with Yun Wang, a former Brown graduate student now at the University of Alaska. "We're suggesting that they actually broke off fairly deep, leaving these large slabs behind."The findings are published in the Geologists had known for years about a "high velocity anomaly" in seismic tomography data near the Sierra Nevada mountains in California. Seismic tomography measures the velocity of seismic waves deep underground. The speed of the waves provides information about the composition and temperature of the subsurface. Generally, slower waves mean softer and hotter material; faster waves mean stiffer and cooler material.The anomaly in California, known as the Isabella anomaly, indicated that a large mass of relatively cool and dehydrated material is present at a depth of 100 to 200 kilometers below the surface. Just what that mass was wasn't known, but there were a few theories. It was often explained by a process called delamination. The crust beneath the eastern part of the mountains is thin and the mantle hot, indicating that part of the lithospheric plate under the mountains had delaminated -- broken off. The anomaly, scientists thought, might be the signature of that sunken hunk of lithosphere, which would be cooler and dryer than the surrounding mantle.But a few years ago, scientists detected a new anomaly under the Mexico's Baja Peninsula, due east of one of the known coastal remains of the Farallon plate. Because of its proximity to the Farallon fragment, Forsyth and Wang thought it was very likely that the anomaly represented an underground extension of the fragment.A closer look at the region showed that there are high-magnesium andesite deposits on the surface near the eastern edge of the anomaly. 
These kinds of deposits are volcanic rocks usually associated with the melting of oceanic crust material. Their presence suggests that the eastern edge of the anomaly represents the spots where Farallon finally gave way and broke off, sending andesites to the surface as the crust at the end of the subducted plate melted. That led Forsyth and his colleagues to suspect that perhaps the Isabella anomaly in California might also represent a slab still connected to an unsubducted fragment of the Farallon plate. So they re-examined the tomography data along the entire West Coast. They compared the Baja and Isabella anomalies to anomalies associated with known Farallon slabs underneath Washington and Oregon. The study found that all of the anomalies are strongest at the same depth -- right around 100 kilometers. And all of them line up nearly due east of known fragments from Farallon. "The geometry was the kicker," Forsyth said. "The way they line up just makes sense." The findings could force scientists to re-examine the tectonic history of western North America, Forsyth said. In particular, they force a rethinking of the delamination of the Sierra Nevada, which had been used to explain the Isabella anomaly. "However the Sierra Nevada was delaminated," Forsyth said, "it's probably not in the way that many people had been thinking." His research colleague and co-author Brian Savage of the University of Rhode Island agrees. "This work has radically changed our understanding of the makeup of the west coast of North America," Savage said. "It will cause a thorough rethinking of the geological history of North America and undoubtedly many other continental margins." The work was supported by the National Science Foundation. Other authors on the paper were Brown graduate student Christina Rau, Brown undergraduate Nina Carriero, Brandon Schmandt from the University of Oregon, and James Gaherty from Columbia University. | Earthquakes | 2,013
March 7, 2013 | https://www.sciencedaily.com/releases/2013/03/130307124554.htm | Waves generated by Russian meteor recorded crossing the US | A network of seismographic stations recorded spectacular signals from the blast waves of the meteor that landed near Chelyabinsk, Russia, as the waves crossed the United States. | The National Science Foundation- (NSF) supported stations are used to study earthquakes and Earth's deep interior.While thousands of earthquakes around the globe are recorded by seismometers in these stations--part of the permanent Global Seismographic Network (GSN) and EarthScope's temporary Transportable Array (TA)--signals from large meteor impacts are far less common.The meteor explosion near Chelyabinsk on Feb. 15, 2013, generated ground motions and air pressure waves in the atmosphere. The stations picked up the signals with seismometers and air pressure sensors.The ground motions were recorded by the GSN and the TA. The pressure waves were detected by special sensors that are part of the TA."The NSF-supported Global Seismic Network and EarthScope Transportable Array made spectacular recordings of the Chelyabinsk meteor's impact," says Greg Anderson, program director in NSF's Division of Earth Sciences."These recordings of seismic waves through the Earth, and sound waves through the atmosphere, are good examples of how these facilities can help global organizations better monitor earthquakes, clandestine nuclear tests and other threats."The Chelyabinsk meteor exploded in the atmosphere at approximately 9.20 a.m. local time.The blast caused significant damage in the city, breaking thousands of windows and injuring more than 1,000 people.Energy from the blast created pressure waves in the atmosphere that moved rapidly outward and around the globe. The blast also spread within Earth as a seismic wave.The two wave types--seismic wave and pressure wave--travel at very different speeds.Waves in the ground travel quickly, at about 3.4 kilometers per second. Waves in the atmosphere are much slower, moving at about 0.3 kilometers per second, and can travel great distances.GSN stations in Russia and Kazakhstan show the ground-traveling wave as a strong, abrupt pulse with a duration of about 30 seconds.The atmospheric waves--referred to as infrasound--were detected across a range of inaudible frequencies and were observed at great distances on infrasound microphones.When the infrasound waves reached the eastern United States--after traveling 8.5 hours through the atmosphere across the Arctic from the impact site in Russia--they were recorded at TA stations at the Canadian border.The infrasound waves reached Florida three hours later, nearly 12 hours after the blast.Infrasound sensors at TA stations along the Pacific coast and in Alaska also recorded the blast, but with signatures that were shorter and simpler than those recorded by stations in the mid-continent and along the southeastern seaboard.The duration of the signals, and the differences between the waveforms in the east and west, scientists believe, are related to the way in which energy travels and bounces on its long path through the atmosphere.The Transportable Array is operated by the IRIS (Incorporated Research Institutions for Seismology) Consortium as part of NSF's EarthScope Project. 
It consists of 400 stations traversing the United States, recording at each site along the way for two years.Each of the TA stations was originally equipped with sensitive broadband seismometers for measuring ground motions, but in 2010, NSF awarded the University of California, San Diego, in cooperation with IRIS, funding to add pressure and infrasound sensors.These special sensors help scientists understand how changes in pressure affect ground motions recorded by the TA's seismometers and provide a view of regional pressure changes related to weather patterns.The sensors also record events such as tornadoes, derechos, rocket launches, chemical explosions--and meteor impacts.The Chelyabinsk meteor is the largest signal recorded to date.In 2013, the Transportable Array will reach states in the Northeast, completing its traverse of the contiguous United States and southern Canada.The GSN's primary mission is collecting data to monitor worldwide earthquakes and to study Earth's deep interior.It's funded jointly by NSF and the U.S. Geological Survey and is managed and operated by IRIS in collaboration with the U.S. Geological Survey's Albuquerque Seismological Laboratory and the University of California, San Diego.As part of a worldwide network of seismic stations, data from the GSN have contributed over the past three decades to the monitoring of nuclear explosions at test sites in the United States, the former Soviet Union, India, Pakistan and Korea. For example, GSN stations provided observations of the Korean nuclear test on Feb. 12, 2013. | Earthquakes | 2,013 |
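The travel times reported for the Chelyabinsk signals can be cross-checked against the two wave speeds the article gives (about 3.4 km/s for seismic waves in the ground and about 0.3 km/s for infrasound in the atmosphere). The arithmetic below is an editorial check; the path length is inferred from the quoted travel time rather than stated in the article:

d \approx v_{air} \times t = 0.3\,\mathrm{km/s} \times (8.5 \times 3600\,\mathrm{s}) \approx 9{,}200\,\mathrm{km}

t_{ground} \approx d / v_{ground} = 9{,}200\,\mathrm{km} / 3.4\,\mathrm{km/s} \approx 2{,}700\,\mathrm{s} \approx 45\,\mathrm{minutes}

so a wave traveling through the ground covers in under an hour a distance that the atmospheric pressure wave needs most of a day to cross.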
March 7, 2013 | https://www.sciencedaily.com/releases/2013/03/130307124800.htm | Sea floor earthquake zones can act like a 'magnifying lens' strengthening tsunamis beyond what was thought possible | The earthquake zones off certain coasts -- like those of Japan and Java -- make them especially vulnerable to tsunamis, according to a new study. They can produce a focusing point that creates massive and devastating tsunamis that break the rules for how scientists used to think tsunamis work. | Until now, it was largely believed that the maximum tsunami height onshore could not exceed the depth of the seafloor. But new research shows that when focusing occurs, that scaling relationship breaks down and flooding can be up to 50 percent deeper, with waves that do not lose height as they get closer to shore. "It is as if one used a giant magnifying lens to focus tsunami energy," said Utku Kanoglu, professor at the Middle East Technical University and senior author of the study. "Our results show that some shorelines with huge earthquake zones just offshore face a double whammy: not only are they exposed to the tsunamis, but under certain conditions, focusing amplifies these tsunamis far more than shoaling and produces devastating effects." The team observed this effect both in Northern Japan, which was struck by the Tohoku tsunami of 2011, and in Central Java, which was struck by a tsunami in 2006. "We are still trying to understand the implications," said Costas Synolakis, director of the Tsunami Research Center at the USC Viterbi School of Engineering and a co-author of the study. "But it is clear that our findings will make it easier to identify locales that are tsunami magnets, and thus help save lives in future events." During an earthquake, sections of the sea floor lift up while others sink. This creates tsunamis that propagate trough-first in one direction and crest-first in the other. The researchers discovered that on the side of the earthquake zone where the wave propagates trough-first, there is a location where focusing occurs -- strengthening the wave before it hits the coastline with an unusual amount of energy that is not seen in the crest-first wave. Based on the shape, location, and size of the earthquake zone, that focal point can concentrate the tsunami's power right onto the coastline. In addition, before this analysis it was thought that tsunamis usually decrease in height continuously as they move away from where they are created and then grow close to shore, just as wind waves do. The study's authors instead suggest that the crest of the tsunami remains fairly intact close to the source. Animation of the formation and focusing of a tsunami. | Earthquakes | 2,013
February 21, 2013 | https://www.sciencedaily.com/releases/2013/02/130221084714.htm | Earthquakes in small laboratory samples | Mechanical failure of materials is a complex phenomenon underlying many accidents and natural disasters ranging from the fracture of small devices to earthquakes. Despite the vast separation of spatial, temporal, energy, and strain-rate scales, and the differences in geometry, it has been proposed that laboratory experiments on brittle fracture in heterogeneous materials can be a model for earthquake occurrence. | A study led by researchers from the University of Barcelona (UB) and recently published in a scientific journal now provides support for this idea. The researcher Eduard Vives, from the Faculty of Physics of the UB, led the research, in which several researchers from the Faculty collaborated -- Xavier Illa, Antoni Planes and Jordi Baró (the main author) -- as well as Álvaro Corral, from the Centre for Mathematical Research (CERCA -- Government of Catalonia), and researchers from the University of Cambridge, the University of Vienna and the Institute for Scientific and Technological Research of San Luis Potosí (Mexico). The material, analyzed by means of a device developed by the Materials Technological Unit of the Scientific and Technological Centers of the UB, is a porous glass (40% porosity) designed for industrial applications. "The experiment carried out simulates the emergence of a new fault," explains the UB researcher Eduard Vives. "By this means," he continues, "we observed the time distribution, which in the laboratory corresponds to some hours and in earthquakes to thousands of years." Seismology, by contrast, studies statistical changes in space, comparing data obtained from areas of high seismic activity, such as California, with data from areas of low activity. According to the researcher, "this symmetry in space and time reveals that earthquake behavior probably corresponds to some kind of self-organized criticality -- as some theories state -- and if this could be proved, it would be a great advance for applying existing theories." Several works have previously tried to establish comparisons between earthquakes and laboratory fracture of materials, mainly using rocks, but the results were not completely reliable, as they do not reproduce all the properties of earthquakes. "This material allows us to carry out experiments that control several parameters, such as magnitude or speed," concludes Vives. The results of the experiments performed with this material fulfill the four fundamental laws of statistical seismology. First, the energy detected by acoustic emissions varies as the Gutenberg-Richter law affirms; this law states that the number of earthquakes as a function of their radiated energy decreases as a power law. To get a general idea of the different scales, it is important to remember that a big earthquake (magnitude 8) releases the energy of roughly 1,000 Hiroshima bombs, whereas the maximum energy measured in the laboratory equals the fission energy of one uranium atom. This difference in magnitude corresponds, approximately, to a very large power of 10. Another experiment made with this material studied the number of aftershocks produced after a big fracture; this number decays with time, so the tendency to follow Omori's law is clear.
"Laboratory maximum rate of aftershocks with time corresponds to some hours, whereas in earthquakes it last more than one hundred years," remarks the UB researcher.The third law of statistical seismology is the one related to waiting times, which relates the time between two consecutive earthquakes. In this case, laboratory results obtained were compared to the ones got from the earthquakes happened in Southern California, and "although different scales, similarity is higher," affirms Vives. Finally, the productivity law was also proved, which relates the rate of aftershocks triggered by a mainshock to its magnitude: larger-magnitude earthquakes produce on average more aftershocks. | Earthquakes | 2,013 |
February 13, 2013 | https://www.sciencedaily.com/releases/2013/02/130213114513.htm | Quake test: Can NYC's row houses handle an earthquake? | Researchers will conduct a rare -- if not unprecedented -- large-scale earthquake simulation to determine how vulnerable New York's unreinforced masonry buildings (row houses) are to temblors. | Designed to imitate the 2011 Virginia quake that rattled the East Coast, the test will occur at 11 a.m. Feb. 19 at the University at Buffalo's Multidisciplinary Center for Earthquake Engineering Research (MCEER). Two 14-foot-tall walls -- built with materials such as 100-year-old brick -- will replicate turn-of-the-century row houses (often called "brownstones") found in New York. Researchers will use an earthquake shake table within UB's earthquake simulation lab to mimic the Virginia temblor as if its epicenter were under the New York region. They will use the test results to calculate estimates for property loss and potential human casualties. While not common, earthquakes periodically hit the New York City region, including a 5.5 magnitude temblor in 1884, according to the U.S. Geological Survey. "New York City is not a high seismic zone, but the risk there is significant because of the existing infrastructure and large population," said Juan Aleman, PhD candidate and Fulbright scholar in UB's School of Engineering and Applied Sciences. "With this test, we hope to learn how buildings will react to a quake similar to the one that struck Virginia in 2011." Aleman is working with Andrew Whittaker, MCEER director and professor and chair of UB's Department of Civil, Structural and Environmental Engineering; and Gilberto Mosqueda, a former UB researcher who works as an associate professor in structural engineering at the University of California, San Diego. The upcoming test is a collaboration between UB and the International Masonry Institute. | Earthquakes | 2,013
February 7, 2013 | https://www.sciencedaily.com/releases/2013/02/130207141454.htm | Stress change during the 2011 Tohoku-Oki earthquake illuminated | The 11 March 2011 Tohoku-Oki earthquake (Mw9.0) produced the largest slip ever recorded in an earthquake, over 50 meters. Such huge fault movement on the shallow portion of the megathrust boundary came as a surprise to seismologists because this portion of the subduction zone was not thought to be accumulating stress prior to the earthquake. In a recently published study, scientists from the Integrated Ocean Drilling Program (IODP) shed light on the stress state on the fault that controls the very large slip. The unexpectedly large fault displacements resulted in the devastating tsunamis that caused tremendous damage and loss of lives along the coast of Japan. | The study was published on 8 February 2013. "The study investigated the stress change associated with the 2011 Tohoku-Oki earthquake and tested the hypothesis by determining the in-situ stress state of the frontal prism from the drilled holes," says lead author Weiren Lin of the Japan Agency for Marine-Earth Science and Technology (JAMSTEC). "We have established a new framework in which the large slips in this region are an indication of the coseismic fault zone, and nearly the total stress accumulated was released during the earthquake." JFAST was designed and undertaken by the international scientific community to better understand the 2011 Tohoku-Oki earthquake. The expedition was carried out aboard the scientific drilling vessel Chikyu from April to July 2012. JFAST drill sites were located approximately 220 km from the eastern coast of Honshu, Japan, in nearly 7000 m of water. "The project is looking at the stress and physical properties of the fault zone soon after a large earthquake," explains co-author James Mori of Kyoto University, the Co-Chief Scientist who led the JFAST expedition. It is the first time that "rapid-response drilling" (within 13 months after the earthquake) has been attempted to measure the temperature across a subduction fault zone. The fast mobilization is necessary to observe time-sensitive data, such as the temperature signal. JAMSTEC successfully mobilized a research expedition for IODP to investigate the large displacement by drilling from the ocean floor to the plate boundary, reaching a maximum depth of more than 850 m below seafloor (mbsf). "Understanding the stress conditions that control the very large slip of this shallow portion of the megathrust may be the most important seismological issue for this earthquake," Mori says. The research published this week determined the stress field from breakouts observed in a borehole around 820 mbsf, in a region thought to contain the main slip zone of the 2011 earthquake. Lin and his co-authors analyzed a suite of borehole-logging data collected while drilling with Logging-While-Drilling (LWD) tools during IODP Expedition 343. Local compressive failures (borehole breakouts) form in the borehole wall during drilling and are imaged with the LWD tools. The orientation and size of the breakouts are used to infer the present direction and magnitudes of the stress field. An important finding of the paper is that the present shear stress on the fault is nearly zero, indicating that there was a nearly complete stress change during the earthquake.
Usually, earthquakes are thought to release only a portion of the stress on the fault. "This was the first time such a nearly complete stress change has been recognized by direct measurement in drilling through the ruptured fault. This is the first time direct stress measurements have been reported, a little over a year after a great subduction zone earthquake," Lin says. The expedition set new milestones in scientific ocean drilling by drilling a borehole to 854.81 mbsf in water depths of 6897.5 meters. Deep core was obtained and analyzed from this depth. The Japan Trench plate boundary was sampled and a parallel borehole was instrumented with a borehole observatory system. The core samples and borehole observatory provide scientists with valuable opportunities to learn about residual heat, coseismic frictional stress, fluid and rock properties, and other factors related to megathrust earthquakes. "We will be able to address very fundamental and important questions about the physics of slip of the thrust near the trench, and how to identify past events in the rock record," says Frederick Chester of Texas A&M University, a co-author of the study. The expedition science party, comprising both ship-board and shore-based scientists, is conducting further investigations of core samples and borehole logging data. Data from the borehole observatory are expected to be retrieved later this month using the JAMSTEC ROV Kaiko7000II, and those data will be combined with the current results to continue to increase understanding of the processes involved in this large slip earthquake. "We anticipate that the results from the JFAST expedition will provide us with a better understanding of the faulting mechanisms for this critical location," says Mori. "Investigations and research findings from the expedition have obvious consequences for evaluating future tsunami hazards at other subduction zones around the world, such as the Nankai Trough in Japan and Cascadia in the Pacific Northwest of North America." | Earthquakes | 2,013
February 7, 2013 | https://www.sciencedaily.com/releases/2013/02/130207002002.htm | The deep roots of catastrophe: Partly molten, Florida-sized blob forms atop Earth's core | A University of Utah seismologist analyzed seismic waves that bombarded Earth's core, and believes he got a look at the earliest roots of Earth's most cataclysmic kind of volcanic eruption. But don't worry. He says it won't happen for perhaps 200 million years. | "What we may be detecting is the start of one of these large eruptive events that -- if it ever happens -- could cause very massive destruction on Earth," says seismologist Michael Thorne, the study's principal author and an assistant professor of geology and geophysics at the University of Utah. But disaster is "not imminent," he adds, "This is the type of mechanism that may generate massive plume eruptions, but on the timescale of 100 million to 200 million years from now. So don't cancel your cruises." The new study is set for publication this week. "These very large, massive eruptions may be tied to some extinction events," Thorne says. The Ontong Java Plateau eruptions have been blamed for oxygen loss in the oceans and a mass die-off of sea life. Since the early 1990s, scientists have known of the existence of two continent-sized "thermochemical piles" sitting atop Earth's core and beneath most of Earth's volcanic hotspots -- one under much of the South Pacific and extending up to 20 degrees north latitude, and the other under volcanically active Africa. Using the highest-resolution method yet to make seismic images of the core-mantle boundary, Thorne and colleagues found evidence the pile under the Pacific actually is the result of an ongoing collision between two or more piles. Where they are merging is a spongy blob of partly molten rock the size of Florida, Wisconsin or Missouri beneath the volcanically active Samoan hotspot. The study's computer simulations "show that when these piles merge together, they may trigger the earliest stages of a massive plume eruption," Thorne says. Thorne conducted the new study with Allen McNamara and Edward Garnero of Arizona State University, and Gunnar Jahnke and Heiner Igel of the University of Munich. The National Science Foundation funded the research. Seismic imaging uses earthquake waves to make images of Earth's interior somewhat like X-rays make CT scan pictures of the inside of the human body. The new study assembled the largest set of data ever used to map the lower mantle in the Pacific region by using 4,221 seismograms from hundreds of seismometers around the world that detected 51 deep earthquakes originating more than 60 miles under the surface. Thorne and colleagues looked for secondary earthquake shear waves known as S-waves that travel through much of Earth, hitting the core, and then convert to primary compressional waves or P-waves as they travel across the top of the core. Then they convert back to S-waves as they re-enter the mantle and then reach seismometers. Thorne says the short bursts of P-wave energy are very sensitive to detecting variations in the rock at the core-mantle boundary. Thorne performed 200 days of supercomputer simulations at the University of Utah's Center for High Performance Computing.
He simulated hundreds of possible shapes of the continent-sized piles and state-sized blobs until he found the shapes that could best explain the seismic wave patterns that were observed. The new study provided an unusual look at one of the most remote parts of Earth, located about 1,800 miles beneath the surface: the boundary between the planet's molten outer core and its warm mantle rock, which has convection movement that has been compared with a conveyor belt or slowly boiling tomato soup. (Tectonic plates of Earth's crust and uppermost mantle drift atop the warmer, convecting lower mantle.) "We did hundreds of simulations for lots of different variations of what Earth might look like at the core-mantle boundary -- the most simulations anybody has ever done to look at the core-mantle boundary structure," Thorne says. At some places where oceanic and continental tectonic plates collide -- such as offshore from the Pacific Northwest to Alaska -- the seafloor plate dives or "subducts" beneath the continent and plunges slowly into the mantle. Thorne suspects subducting plates ultimately fall deep enough to help push the piles around on Earth's core. Whether hotspots originate at the core-mantle boundary or at shallower depths has been debated for decades. But in the 1990s, geophysicists found evidence for the continent-size thermochemical piles beneath Africa and the Pacific. These are known technically as LLSVPs, or "large low shear velocity provinces," because seismic shear waves passing through them move 5 percent slower than through surrounding mantle rock. That suggests they have a different composition and/or temperature than the surrounding mantle. Previous studies also have observed smaller blobs of rock, measuring perhaps 60-by-60 miles, on the edges of the continent-sized masses. Seismic shear waves move as much as 45 percent slower through these blobs -- known technically as ULVZs or "ultra low velocity zones" -- indicating they may be spongy and partly molten. Thorne says his analysis of seismic waves passing through the core-mantle boundary reveals the Pacific pile really represents two or more continent-sized piles slowly sliding atop the core and colliding so that partly molten blobs on their edges are merging into the largest such blob or ULVZ ever observed -- roughly the size of Florida. "My study might be the first to show actual seismic evidence that the piles are moving," he says. "People who have done previous simulations have suggested this. They are sitting atop the core and getting pushed around by overlying mantle forces like subduction. They move around on the core somewhat like continental plates drift at Earth's surface." Thorne says the merging LLSVP piles are each about 1,800 miles in diameter, forming a single pile some 3,600 miles wide from east to west and stretching across Earth's core beneath an area from Australia almost to South America. Two blobs, or ULVZs, on the piles' edges merged to form a new blob that is perhaps 6 to 10 miles thick and covers an area about 500 miles long and 150 miles wide, about the area of Florida or "eight to 10 times larger than any ULVZs we observed before," Thorne says. Because the larger piles haven't fully merged, seismic imaging shows there is a depression or "hole" between them, and the Florida-sized blob is forming there as smaller ULVZs merge in the hole. "We are actually seeing that these piles are being shoved around," Thorne says.
"If hotspots actually are generated near the core-mantle boundary, where they are being generated seems related to where these piles and ULVZs are. So if we are pushing these piles around, we also are pushing around where hotspot volcanism may occur."Warmer rock is less dense than cooler rock. Thorne says that where the ULVZ blobs form seems to be related to where the hot rock starts convecting upward to begin the long, slow process of forming a plume that eventually causes massive eruptions. | Earthquakes | 2,013 |
February 5, 2013 | https://www.sciencedaily.com/releases/2013/02/130205102118.htm | Cargo container research to improve buildings' ability to withstand tsunamis | Anyone who has seen the movie "The Impossible" or watched footage from the Japanese tsunami has learned the terror that can strike with little warning. In those cases, when there is no time to flee, there may still be time to reach higher ground, called vertical evacuation. | But as you race to the third floor, how do you know if the building will hold up? Walls of water are not the only danger. Another potentially lethal challenge is water-driven debris -- such as 60,000-pound fully loaded cargo containers -- transformed into projectiles. Often pulled behind semi-trucks on highways, these containers, which line port areas, well exceed the telephone-pole-size 1,000-pound default log assumed by most U.S. building-design guidelines. A multi-university team led by Ronald Riggs, a structural engineer at the University of Hawaii, has determined just what the impact could be and will present findings at an international conference in June. The goal is to supply structural engineers with information to design buildings in areas vulnerable to tsunamis. Currently there are no scientifically tested guidelines. And, as those who survived the Japanese tsunami that swept thousands to their deaths can attest, no one had planned for such force. "Most structural systems are designed to defy gravity, not a side kick from a shipping container," Riggs says. "An engineer can build what it takes to withstand the karate chop, but first the engineer has to know what forces to expect." This knowledge is vital not only for the buildings into which people might flee, but also for coastline storage tanks that could spew chemicals or other pollutants if damaged. Riggs first began thinking about the problem as he examined damage to bridges and buildings following Hurricane Katrina. He noticed the cargo containers and barges that had been flung onto land in areas such as Biloxi, Miss. On another scientific excursion to Samoa, he says he saw a shipping container "whacked against a meeting hall -- and there was no port anywhere nearby." "These shipping containers are surprisingly ubiquitous," Riggs says. The point was further brought home on TVs across the world that played and replayed footage from Tohoku, Japan, as tsunami-fed waters dragged cars, trucks and shipping containers as much as six miles inland and then back out to sea in the drawdown. "They may have been moving only about 10 miles an hour, but given their weight, this is a significant load for a structure not made for it." He and his colleagues proposed research to analyze several pieces of the puzzle with the help of the George E. Brown Jr. Network for Earthquake Engineering Simulation (NEES), a distributed laboratory with 14 sites across the country funded by the National Science Foundation. The network, which also funded Riggs' research, provides access to highly specialized, sophisticated and expensive equipment. For Riggs, two NEES sites were needed. One is at Lehigh University in Bethlehem, Pa., which specializes in real-time multi-directional testing for earthquake simulation of large-scale structural systems. The other is a wave flume longer than a football field at the Tsunami Research Facility at Oregon State University. At the Lehigh site, they swung full-scale wooden poles and shipping containers through the air on a pendulum to determine the force of impact at various velocities.
At Oregon State, they ran similar tests at a 1:5 scale, but this time in its large wave flume to see if that made a difference. His basic assumptions held true, but there were two surprises. First, when the speed of the projectile was the same, the water did not have a significant impact. "We thought the fact that it was in water would increase the load, but it did not, at least not substantially," Riggs says. "The impact is so short, on the order of a few milliseconds, that in some ways the water doesn't have time to increase the force." The second surprise was that the weight of the shipping container's contents also did not matter as much as he would have expected. The container itself, which is roughly 20 feet long and weighs about 5,000 pounds empty, could weigh as much as 60,000 pounds when fully loaded. Yet, its load when striking a building was not significantly greater than that of the empty container. The reason is the same as for the water, Riggs says. "Unless the contents are rigidly attached to the frame of the container, which they usually are not, the contents also don't have time to increase the force during the very short duration of impact." The next step for Riggs and his team is to use the preliminary findings to better define building guidelines and policy. "It's especially important for areas like Japan and the Cascadia area on the West Coast of the United States where tsunamis are most likely to strike with little warning, making vertical evacuation essential," Riggs says. "Or in Waikiki where the population density would make horizontal evacuation (trying to outrun the tsunami) problematic." Riggs will present the team's findings at the 32nd International Conference on Ocean, Offshore and Arctic Engineering, sponsored by the American Society of Mechanical Engineers (ASME), to be held June 9-14 in Nantes, France. His colleagues are Clay Naito, associate professor at Lehigh University; Dan Cox, professor at Oregon State University; and Marcelo Kobayashi, associate professor at the University of Hawaii. | Earthquakes | 2,013
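The reasoning behind those two surprises can be sketched with a simple impulse-momentum estimate: the average force needed to stop a projectile is its momentum divided by the impact duration, and only mass that decelerates with the container frame during those few milliseconds contributes. The short Python sketch below is illustrative only; the mass, speed and duration are assumed round numbers, not values measured in the Riggs team's tests.

# Rough impulse-momentum estimate of debris impact force.
# All numbers below are assumed for illustration, not measured test values.

def average_impact_force(mass_kg, speed_ms, duration_s):
    """Average force required to stop a rigid projectile in the given time."""
    return mass_kg * speed_ms / duration_s

container_mass = 2300.0   # ~5,000 lb empty shipping container, in kilograms
flow_speed = 4.5          # ~10 mph debris speed, in meters per second
impact_duration = 0.005   # a few milliseconds, as described in the article

force = average_impact_force(container_mass, flow_speed, impact_duration)
print(f"Average stopping force: {force / 1000:.0f} kN")
# Contents that are not rigidly attached (and the surrounding water) do not
# decelerate with the frame within these few milliseconds, so they add little
# to this force -- consistent with the surprises described above.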
February 4, 2013 | https://www.sciencedaily.com/releases/2013/02/130204153701.htm | Hoodoos: Key to earthquakes? | In the absence of long-term instrumental data, fragile rock formations, called hoodoos, may be key to understanding seismic hazard risk. In a new study, Anooshehpoor and colleagues use two unbroken hoodoos near California's Garlock fault to place limits on the shaking produced by past large earthquakes. | Hoodoos can be found in desert regions and are highly susceptible to erosion, which makes their age uncertain. Despite that uncertainty, existing unfractured hoodoos, tall spires of sedimentary rock, may help put limits on the ground motion associated with recent events, because the minimum force needed to break their shafts of relatively soft sandstone can be estimated. The Garlock fault region features an active strike-slip fault. Anooshehpoor, et al., estimated the tensile strength of two hoodoos and considered previously published physical evidence of fault offsets that suggest at least one large earthquake, resulting in seven meters (23 feet) of slip, in the last 550 years. And yet, the hoodoos are still intact, suggesting a median or low level of ground motion associated with the large quakes in this region. While the age of the hoodoos cannot be exactly ascertained, the authors argue that these rocks can still serve as a valuable tool in constraining ground motion and thus contribute to the development of probabilistic seismic hazard assessments in the area. | Earthquakes | 2,013
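One way to see how an unbroken hoodoo limits past shaking is a crude quasi-static estimate: the horizontal acceleration at which the bending stress at the base of a uniform sandstone column reaches its tensile strength. The sketch below uses assumed dimensions, density and strength, and ignores the dynamic effects a full fragility analysis such as the published study would include.

# Crude quasi-static estimate of the ground acceleration needed to snap a
# hoodoo shaft in bending. Radius, height, density and tensile strength are
# assumed values for illustration, not measurements from the Garlock hoodoos.
import math

def breaking_acceleration(radius_m, height_m, tensile_strength_pa, density=2000.0):
    """Acceleration at which base bending stress equals the tensile strength.

    Model: uniform cylinder loaded by its own inertia.
    Bending moment M = (rho*pi*r^2*h) * a * h/2; section modulus Z = pi*r^3/4.
    """
    mass = density * math.pi * radius_m**2 * height_m
    moment_per_unit_a = mass * height_m / 2.0
    section_modulus = math.pi * radius_m**3 / 4.0
    return tensile_strength_pa * section_modulus / moment_per_unit_a

a = breaking_acceleration(radius_m=0.5, height_m=3.0, tensile_strength_pa=1.0e6)
print(f"Breaking acceleration ~ {a:.1f} m/s^2 ({a / 9.81:.2f} g)")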
January 30, 2013 | https://www.sciencedaily.com/releases/2013/01/130130101816.htm | Disasters prompt older children to be more giving, younger ones to be more selfish | A natural disaster can bring out the best in older children, prompting 9-year-olds to be more willing to share, while 6-year-olds become more selfish. Researchers at the University of Toronto, the University of Chicago, and Liaoning Normal University made this finding in a rare natural experiment in China around the time of a horrific earthquake. | A crucial difference between the two age groups emerged one month after the disaster. The 6-year-olds' willingness to share in a test measuring altruism dropped by a third, while among 9-year-olds, willingness to give to others nearly tripled. Three years later, children in the age groups returned to pre-earthquake levels of altruism. "The study provides the first evidence to suggest that experiencing a natural disaster affects children's altruistic giving significantly," said Kang Lee, university distinguished professor at the University of Toronto. "The immediate negative effect of the earthquake on 6-year-olds suggests that altruism at that age is still fragile," Lee said. "We think that empathy is the intervening variable," said Jean Decety, the Irving B. Harris Professor of Psychology and Psychiatry at the University of Chicago, a member of the research team and a study co-author. The study demonstrates the developmental differences in the growth of empathy, Decety explained. As children grow up, the prefrontal cortex matures with improved connections among the circuits involved with emotion. "As they grow older, children become able to better regulate their own vicarious emotions and understand better what they feel, and they are more inclined to act pro-socially," said Decety. "Even with the group of 9-year-olds, we show that not only are they more altruistic and give more than the 6-year-olds, but those 9-year-olds with higher empathy scores donated significantly more than 9-year-olds with lower scores," Decety added. In early 2008, the researchers were in Sichuan, China, working on a study on empathy and altruism among children and had completed the first portion of it. In May 2008, an earthquake struck the region and killed 87,000 people. The team immediately decided to change the course of their study and explore what the experience of a disaster might mean to the children's concern for others. In the study, the team tested children's altruism by having them individually pick 10 favorite stickers from a set of 100. Afterward, they were told some of their classmates were not included in the test and asked if they would give up some of the stickers for them to enjoy. Without the researcher watching, children would put stickers into an envelope and seal it if they wanted to share. The number of stickers they chose to give up was determined to be a measure of altruism. The children also were given a standard test of empathy, which gauged their reactions to seeing animated vignettes of people who are injured.
Nine-year-olds had significantly higher scores on empathy on the test than 6-year-olds. Although there was a significant impact on altruism one month after the disaster, the study showed that groups of 6-year-olds and 9-year-olds had similar levels of altruism in follow-up tests three years after the disaster -- equivalent to the levels observed among 6-year-olds and 9-year-olds immediately before the earthquake. "Experience with adversity, though generally having negative impacts on children, may in fact be beneficial, at least for older children, in evoking empathy toward others and in turn enhancing their altruistic giving, albeit temporarily," said Hong Li, also a lead author of the paper. The John Templeton Foundation, the Social Sciences and Humanities Research Council of Canada and the Chinese National Science Foundation supported this research. | Earthquakes | 2,013
January 23, 2013 | https://www.sciencedaily.com/releases/2013/01/130123133901.htm | Scientists underestimated potential for Tohoku earthquake: Now what? | The massive Tohoku, Japan, earthquake in 2011 and Sumatra-Andaman superquake in 2004 stunned scientists because neither region was thought to be capable of producing a megathrust earthquake with a magnitude exceeding 8.4. | Now earthquake scientists are going back to the proverbial drawing board and admitting that existing predictive models looking at maximum earthquake size are no longer valid. In a new analysis, Chris Goldfinger of Oregon State University (OSU) and his colleagues argue that those models must be rethought. "Once you start examining the paleoseismic and geodetic records, it becomes apparent that there had been the kind of long-term plate deformation required by a giant earthquake such as the one that struck Japan in 2011," Goldfinger said. "Paleoseismic work has confirmed several likely predecessors to Tohoku, at about 1,000-year intervals." The researchers also identified long-term "supercycles" of energy within plate boundary faults, which appear to store this energy like a battery for many thousands of years before yielding a giant earthquake and releasing the pressure. At the same time, smaller earthquakes occur that do not dissipate, to any great extent, the energy stored within the plates. The newly published analysis acknowledges that scientists historically may have underestimated the number of regions capable of producing major earthquakes on a scale of Tohoku. "Since the 1970s, scientists have divided the world into plate boundaries that can generate 9.0 earthquakes versus those that cannot," said Goldfinger, a professor in OSU's College of Earth, Ocean, and Atmospheric Sciences. "Those models were already being called into question when Sumatra drove one stake through their heart, and Tohoku drove the second one." "Now we have no models that work," he added, "and we may not have for decades. We have to assume, however, that the potential for 9.0 subduction zone earthquakes is much more widespread than originally thought." Both Tohoku and Sumatra were written off in the textbooks as not having the potential for a major earthquake, Goldfinger pointed out. "Their plate age was too old, and they didn't have a really large earthquake in their recent history," Goldfinger said. "In fact, if you look at a northern Japan seismic risk map from several years ago, it looks quite benign -- but this was an artifact of recent statistics." Paleoseismic evidence of subduction zone earthquakes is not yet plentiful in most cases, so little is known about the long-term earthquake potential of most major faults. Scientists can determine whether a fault has ruptured in the past -- when and to what extent -- but they cannot easily estimate how big a specific earthquake might have been. Most, Goldfinger says, fall into ranges -- say, 8.4 to 8.7. Nevertheless, that type of evidence can be more telling than historical records because it may take many thousands of years to capture the full range of earthquake behavior. In their analysis, the researchers point to several subduction zone areas that previously had been discounted as potential 9.0 earthquake producers -- but may be due for reconsideration. These include central Chile, Peru, New Zealand, the Kuriles fault between Japan and Russia, the western Aleutian Islands, the Philippines, Java, the Antilles Islands and Makran, Pakistan/Iran. Onshore faults such as the Himalayan Front may also be hiding outsized earthquakes, the researchers add.
Their work was supported by the National Science Foundation. Goldfinger, who directs the Active Tectonics and Seafloor Mapping Laboratory at Oregon State, is a leading expert on the Cascadia Subduction Zone off the Pacific Northwest coast of North America. His comparative studies have taken him to the Indian Ocean, Japan and Chile, and in 2007, he led the first American research ship into Sumatra waters in nearly 30 years to study similarities between the Indian Ocean subduction zone and Cascadia. Paleoseismic evidence abounds in the Cascadia Subduction Zone, Goldfinger pointed out. When a major offshore earthquake occurs, the disturbance causes mud and sand to begin streaming down the continental margins and into the undersea canyons. Coarse sediments called turbidites run out onto the abyssal plain; these sediments stand out distinctly from the fine particulate matter that accumulates on a regular basis between major tectonic events. By dating the fine particles through carbon-14 analysis and other methods, Goldfinger and colleagues can estimate with a great deal of accuracy when major earthquakes have occurred. Over the past 10,000 years, there have been 19 earthquakes that extended along most of the Cascadia Subduction Zone margin, stretching from southern Vancouver Island to the Oregon-California border. "These would typically be of a magnitude from about 8.7 to 9.2 -- really huge earthquakes," Goldfinger said. "We've also determined that there have been 22 additional earthquakes that involved just the southern end of the fault. We are assuming that these are slightly smaller -- more like 8.0 -- but not necessarily. They were still very large earthquakes that, if they happened today, could have a devastating impact." Other researchers on the analysis include Yasutaka Ikeda of the University of Tokyo, Robert S. Yeats of Oregon State University, and Junjie Ren of the Chinese Seismological Bureau. | Earthquakes | 2,012
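The carbon-14 dating step mentioned above follows a standard exponential-decay relation: a conventional radiocarbon age is computed from the measured fraction of modern carbon-14 using the Libby mean life of 8,033 years, and is then calibrated to calendar years. The fraction in this sketch is a made-up example, not a measurement from the Cascadia cores.

# Minimal sketch of converting a carbon-14 measurement to a conventional
# radiocarbon age. The measured fraction below is invented for illustration.
import math

LIBBY_MEAN_LIFE = 8033.0  # years, fixed by convention for radiocarbon ages

def radiocarbon_age(fraction_modern):
    """Conventional radiocarbon age (years BP) from the measured 14C fraction."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

print(f"Age ~ {radiocarbon_age(0.89):.0f} radiocarbon years before present")
# A calibration curve (e.g., from tree rings) is still needed to convert this
# conventional age into calendar years.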
January 18, 2013 | https://www.sciencedaily.com/releases/2013/01/130118130104.htm | Ancient Earth's geochemistry: Some tectonic processes driving volcanic activity occurred 3.8 billion years ago | Researchers still have much to learn about the volcanism that shaped our planet's early history. New evidence from a team led by Carnegie's Frances Jenner demonstrates that some of the tectonic processes driving volcanic activity, such as those taking place today, were occurring as early as 3.8 billion years ago. | Upwelling and melting of Earth's mantle at mid-ocean ridges, as well as the eruption of new magmas on the seafloor, drive the continual production of the oceanic crust. As the oceanic crust moves away from the mid-ocean ridges and cools, it becomes denser than the underlying mantle. Over time the majority of this oceanic crust sinks back into the mantle, which can trigger further volcanic eruptions. This process is known as subduction and it takes place at plate boundaries. Volcanic eruptions that are triggered by subduction of oceanic crust are chemically distinct from those erupting at mid-ocean ridges and oceanic island chains, such as Hawaii. The differences between the chemistry of magmas produced at each of these tectonic settings provide 'geochemical fingerprints' that can be used to try to identify the types of tectonic activity taking place early in Earth's history. Previous geochemical studies have used similarities between modern subduction zone magmas and those erupted about 3.8 billion years ago, during the Eoarchean era, to argue that subduction-style tectonic activity was taking place early in Earth's history. But no one was able to locate any suites of volcanic rocks with compositions comparable to modern mid-ocean ridge or oceanic island magmas that were older than 3 billion years and were also free from contamination by continental crust. Because of this missing piece of the puzzle, it has been ambiguous whether the subduction-like compositions of volcanic rocks erupted 3.8 billion years ago really were generated at subduction zones, or whether this magmatism should be attributed to other processes taking place early in Earth's history. Consequently, evidence for subduction-related tectonics earlier than 3 billion years ago has been highly debated in scientific literature. Jenner and her team collected 3.8-billion-year-old volcanic rocks from Innersuartuut, an island in southwest Greenland, and found the samples have compositions comparable to modern oceanic islands, such as Hawaii. "The Innersuartuut samples may represent the world's oldest recognized suite of oceanic island basalts, free from contamination by continental crust," Jenner said. "This evidence strengthens previous arguments that subduction of oceanic crust into the mantle has been taking place since at least 3.8 billion years ago." | Earthquakes | 2,013
January 9, 2013 | https://www.sciencedaily.com/releases/2013/01/130109151204.htm | Faulty behavior: New earthquake fault models show that 'stable' zones may contribute to the generation of massive earthquakes | In an earthquake, ground motion is the result of waves emitted when the two sides of a fault move -- or slip -- rapidly past each other, with an average relative speed of about three feet per second. Not all fault segments move so quickly, however -- some slip slowly, through a process called creep, and are considered to be "stable," i.e., not capable of hosting rapid earthquake-producing slip. One common hypothesis suggests that such creeping fault behavior is persistent over time, with currently stable segments acting as barriers to fast-slipping, shake-producing earthquake ruptures. But a new study by researchers at the California Institute of Technology (Caltech) and the Japan Agency for Marine-Earth Science and Technology (JAMSTEC) shows that this might not be true. | "What we have found, based on laboratory data about rock behavior, is that such supposedly stable segments can behave differently when an earthquake rupture penetrates into them. Instead of arresting the rupture as expected, they can actually join in and hence make earthquakes much larger than anticipated," says Nadia Lapusta, professor of mechanical engineering and geophysics at Caltech and coauthor of the study, published January 9 in the journal Nature. She and her coauthor, Hiroyuki Noda, a scientist at JAMSTEC and previously a postdoctoral scholar at Caltech, hypothesize that this is what occurred in the 2011 magnitude 9.0 Tohoku-Oki earthquake, which was unexpectedly large. Fault slip, whether fast or slow, results from the interaction between the stresses acting on the fault and friction, or the fault's resistance to slip. Both the local stress and the resistance to slip depend on a number of factors such as the behavior of fluids permeating the rocks in Earth's crust. So, the research team formulated fault models that incorporate laboratory-based knowledge of complex friction laws and fluid behavior, and developed computational procedures that allow the scientists to numerically simulate how those model faults will behave under stress. "The uniqueness of our approach is that we aim to reproduce the entire range of observed fault behaviors -- earthquake nucleation, dynamic rupture, postseismic slip, interseismic deformation, patterns of large earthquakes -- within the same physical model; other approaches typically focus only on some of these phenomena," says Lapusta. In addition to reproducing a range of behaviors in one model, the team also assigned realistic fault properties to the model faults, based on previous laboratory experiments on rock materials from an actual fault zone -- the site of the well-studied 1999 magnitude 7.6 Chi-Chi earthquake in Taiwan. "In that experimental work, rock materials from boreholes cutting through two different parts of the fault were studied, and their properties were found to be conceptually different," says Lapusta. "One of them had so-called velocity-weakening friction properties, characteristic of earthquake-producing fault segments, and the other one had velocity-strengthening friction, the kind that tends to produce stable creeping behavior under tectonic loading.
However, these 'stable' samples were found to be much more susceptible to dynamic weakening during rapid earthquake-type motions, due to shear heating." Lapusta and Noda used their modeling techniques to explore the consequences of having two fault segments with such lab-determined fault-property combinations. They found that the ostensibly stable area would indeed occasionally creep, and often stop seismic events, but not always. From time to time, dynamic rupture would penetrate that area in just the right way to activate dynamic weakening, resulting in massive slip. They believe that this is what happened in the Chi-Chi earthquake; indeed, the quake's largest slip occurred in what was believed to be the "stable" zone. "We find that the model qualitatively reproduces the behavior of the 2011 magnitude 9.0 Tohoku-Oki earthquake as well, with the largest slip occurring in a place that may have been creeping before the event," says Lapusta. "All of this suggests that the underlying physical model, although based on lab measurements from a different fault, may be qualitatively valid for the area of the great Tohoku-Oki earthquake, giving us a glimpse into the mechanics and physics of that extraordinary event." If creeping segments can participate in large earthquakes, it would mean that much larger events than seismologists currently anticipate in many areas of the world are possible. That means, Lapusta says, that the seismic hazard in those areas may need to be reevaluated. For example, a creeping segment separates the southern and northern parts of California's San Andreas Fault. Seismic hazard assessments assume that this segment would stop an earthquake from propagating from one region to the other, limiting the scope of a San Andreas quake. However, the team's findings imply that a much larger event may be possible than is now anticipated -- one that might involve both the Los Angeles and San Francisco metropolitan areas. "Lapusta and Noda's realistic earthquake fault models are critical to our understanding of earthquakes -- knowledge that is essential to reducing the potential catastrophic consequences of seismic hazards," says Ares Rosakis, chair of Caltech's division of engineering and applied science. "This work beautifully illustrates the way that fundamental, interdisciplinary research in the mechanics of seismology at Caltech is having a positive impact on society." Now that they've been proven to qualitatively reproduce the behavior of the Tohoku-Oki quake, the models may be useful for exploring future earthquake scenarios in a given region, "including extreme events," says Lapusta. Such realistic fault models, she adds, may also be used to study how earthquakes may be affected by additional factors such as human-made disturbances resulting from geothermal energy harvesting and CO2 sequestration. The paper, "Creeping fault segments can turn from stable to destructive due to dynamic weakening," appears in the January 9 issue of the journal Nature. | Earthquakes | 2,013
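The velocity-weakening versus velocity-strengthening distinction in the article is usually expressed through the steady-state rate-and-state friction law, in which the sign of (a - b) decides whether friction falls or rises as slip speeds up. The sketch below uses generic textbook-scale parameters, not the Chi-Chi borehole values, and leaves out the additional dynamic weakening from shear heating that the study highlights.

# Steady-state rate-and-state friction: mu_ss = mu0 + (a - b) * ln(V / V_ref).
# Parameter values are generic illustrations, not the measured fault properties.
import math

def steady_state_friction(slip_rate, mu0=0.6, a_minus_b=-0.004, v_ref=1e-6):
    """a - b < 0: velocity weakening (quake-prone); a - b > 0: strengthening (creep)."""
    return mu0 + a_minus_b * math.log(slip_rate / v_ref)

for v in (1e-9, 1e-6, 1e-3, 1.0):  # from slow creep up to seismic slip rates, in m/s
    print(f"V = {v:7.0e} m/s -> mu_ss = {steady_state_friction(v):.3f}")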
January 7, 2013 | https://www.sciencedaily.com/releases/2013/01/130107095734.htm | Seismic fabric coming on the market | In the case of earthquakes, only seconds may remain for a safe escape from buildings. Falling debris that obstructs the escape routes may aggravate the situation further. A product developed at Karlsruhe Institute of Technology (KIT) extends the time available for saving lives by reinforcing walls and holding back falling debris. An innovative building materials manufacturer has now launched the product on the market. | null | Earthquakes | 2,013
December 28, 2012 | https://www.sciencedaily.com/releases/2012/12/121228084026.htm | More serious earthquakes predicted in the Himalayas | A research team led by scientists from Nanyang Technological University (NTU) has discovered that massive earthquakes in the range of 8 to 8.5 magnitudes on the Richter scale have left clear ground scars in the central Himalayas. | This ground-breaking discovery has huge implications for the area along the front of the Himalayan Mountains, given that the region has a population density similar to that of New York City. NTU Professor Paul Tapponnier, who is recognised as a leading scientist in the field of neotectonics, said that the existence of such devastating quakes in the past means that quakes of the same magnitude could happen again in the region in future, especially in areas which have yet to have their surface broken by a temblor. Published recently in Nature Geoscience, the study by NTU's Earth Observatory of Singapore (EOS) and colleagues in Nepal and France showed that in 1255 and 1934, two great earthquakes ruptured the surface of the earth in the Himalayas. This runs contrary to what scientists had previously thought. Massive earthquakes are not unknown in the Himalayas, as quakes in 1897, 1905, 1934 and 1950 all had magnitudes between 7.8 and 8.9, each causing tremendous damage. But they were previously thought not to have broken the earth's surface -- classified as blind quakes -- which are much more difficult to track. However, Prof Tapponnier said that by combining new high resolution imagery and state of the art dating techniques, they could show that the 1934 earthquake did indeed rupture the surface, breaking the ground over a length of more than 150 kilometres, essentially south of the part of the range that harbours Mt Everest. This break formed along the main fault in Nepal that currently marks the boundary between the Indian and Asian tectonic plates -- also known as the Main Frontal Thrust (MFT) fault. Using radiocarbon dating of offset river sediments and collapsed hill-slope deposits, the research team managed to separate several episodes of tectonic movement on this major fault and pin the dates of the two quakes, about 7 centuries apart. "The significance of this finding is that earthquakes of magnitude 8 to 8.5 may return at most twice per millennium on this stretch of the fault, which allows for a better assessment of the risk they pose to the surrounding communities," said Prof Tapponnier. Prof Tapponnier warns that the long interval between the two recently discovered earthquake ruptures does not mean people should be complacent, thinking that there is still time before the next major earthquake happens in the region. "This does not imply that the next mega-earthquake in the Himalayas will occur many centuries from now because we still do not know enough about adjacent segments of the MFT Mega-thrust," Prof Tapponnier explains. "But it does suggest that areas west or east of the 1934 Nepal ground rupture are now at greater risk of a major earthquake, since there are little or no records of when the last earth-shattering temblor happened in those two areas." The next step for Prof Tapponnier and his EOS scientists is to uncover the full extent of such fault ruptures, which will then allow them to build a more comprehensive model of earthquake hazard along the Himalayan front.
| Earthquakes | 2,012
December 5, 2012 | https://www.sciencedaily.com/releases/2012/12/121205103003.htm | Great-earthquake hot spots pinpointed | The world's largest earthquakes occur at subduction zones -- locations where a tectonic plate slips under another. But where along these extended subduction areas are great earthquakes most likely to happen? Scientists have now found that regions where 'scars' on the seafloor, called fracture zones, meet subduction areas are at higher risk of generating powerful earthquakes. | The results were published December 5. "We find that 87% of the 15 largest (8.6 magnitude or higher) and half of the 50 largest (8.4 magnitude or higher) earthquakes of the past century are associated with intersection regions between oceanic fracture zones and subduction zones," says Dietmar Müller, a researcher at the University of Sydney in Australia and lead author of the study. The researchers considered about 1,500 earthquakes in their study. They used a database of significant post-1900 events, as well as geophysical data mapping fracture zones and subduction zones, among others. They analysed information from these databases by using a specific data mining method. "The method was originally developed for analysing online user data," says Thomas Landgrebe, also involved in the study. "The technique we apply is commonly used to find a few specific items which are expected to be most appealing to an Internet user. Instead, we use it to find which tectonic environment is most suitable for generating great earthquakes." Since earthquake generation is a very complex process, the scientists don't yet have a complete understanding of why great earthquakes prefer the intersection areas. They suggest that it is due to the physical properties of fracture zones, which result in "strong, persistent coupling in the subduction boundaries," Landgrebe explains. This means that the subduction fault area is locked and thus capable of accumulating stress over long periods of time. "The connection we have uncovered provides critical information for seismologists to, in the long run, pinpoint particular tectonic environments that are statistically more prone to strong seismic coupling and great earthquake supercycles," Müller says. An area with earthquake supercycles experiences recurring powerful earthquakes every few centuries or millennia. Regions that have long earthquake supercycles are usually not picked up as risk areas by seismic hazard maps as these are constructed mainly using data collected after 1900.
An example is the area of the 2011 Tohoku-Oki earthquake, which had no record of large earthquakes over the past century and was not predicted to be of significant risk by previous hazard maps. "The power of our new method is that it does pick up many of these regions and, hence, could contribute to much-needed improvements of long-term seismic hazard maps," Müller explains. "Even though we don't fully understand the physics of long earthquake cycles, any improvements that can be made using statistical data analysis should be considered as they can help reduce earthquake damage and loss of life." The floor of the Earth's oceans is crossed by underwater mountain systems, or ocean ridges, such as the mid-Atlantic ridge that runs from north to south between the Americas and Africa. These ridges divide two tectonic plates that move apart as lava emerges from the opening, spreading the sea floor. The mid-ocean ridge jogs back and forth at offsets known as transform faults, creating zig-zagged plate boundaries. Fracture zones are scars in the ocean floor left by these transform faults. | Earthquakes | 2,012
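The "not just chance" argument can be checked with a simple binomial calculation: if only about 25% of great subduction earthquakes should fall at fracture-zone intersections by chance, the probability that 13 of the 15 largest (87%) do so is vanishingly small. This back-of-the-envelope test is not the authors' data-mining method, only a sanity check on the quoted figures.

# Probability of seeing at least 13 of 15 great earthquakes at intersection
# regions if each had only a 25% chance of falling there at random.
from math import comb

def binomial_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(f"P(>= 13 of 15 | p = 0.25) = {binomial_tail(13, 15, 0.25):.1e}")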
December 5, 2012 | https://www.sciencedaily.com/releases/2012/12/121205091042.htm | Oceanography student uses crashing waves on shorelines to study Earth's interior | Scientists have long used the speed of seismic waves traveling through Earth as a means of learning about the geologic structure beneath Earth's surface, but the seismic waves they use have typically been generated by earthquakes or human-made explosions. A University of Rhode Island graduate student is using the tiny seismic waves created by ocean waves crashing on shorelines around the world to learn how an underwater plateau was formed 122 million years ago. | "There are any number of ways to create seismic waves, but most people only think about earthquakes and explosions," said Brian Covellone, a doctoral student at the URI Graduate School of Oceanography. "Using data from ocean waves allows you to sample a different region of the Earth's interior than you can from using earthquake data. And by combining multiple data sets, you can get a clearer picture of the geology of the Earth." Covellone, a native of Warwick, R.I., is investigating the origin of the Ontong Java plateau, an underwater feature north of New Guinea and the Solomon Islands, which formed when a massive volume of magma erupted from the seafloor over a short period of time. Scientists disagree about whether the eruption happened as the result of a plume of magma breaking through the seafloor, in the way that Hawaii was formed, or at a mid-ocean ridge where Earth's crust was spreading apart and new crust was forming. According to Covellone, the velocity of seismic waves depends on the temperature and the composition of the material through which they travel. By studying how quickly seismic waves travel near the Ontong Java plateau, scientists can infer how the structure was formed. By analyzing data from hundreds of seismometers around the Pacific Ocean, especially those between Hawaii and Australia, Covellone believes that coastal waves will help provide the answer. "Earthquake data stands out compared to the background noise of waves crashing on the shoreline," he said. "But that noise has valuable information in it. If you take that noise and sum it up over months or years, they grow in amplitude and you can pull out valuable data from it." "Waves on a beach excite the Earth in a non-random way, and when I stack all that data together, I get a seismogram that looks like an earthquake, with a large spike in it," he said. Covellone will present a summary of his research December 5 at a meeting of the American Geophysical Union. "By the time I graduate, I expect to have a high-resolution image of the velocity structure underneath the Ontong Java plateau," said Covellone. "And from that we'll have an idea of what caused the plateau to form. It will shed a lot of light on the subject, but it's still just one piece of a big puzzle." | Earthquakes | 2,012
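The stacking idea Covellone describes -- summing cross-correlations of coastal noise recorded at two stations until a coherent travel-time signal emerges from the random background -- can be illustrated with synthetic data. Everything in the sketch below (window length, number of days, the true lag) is invented; it only shows why the stacked correlation develops an earthquake-like spike.

# Toy ambient-noise stacking: correlate daily noise windows from two synthetic
# "stations" and sum them so the coherent inter-station lag rises above the noise.
import numpy as np

rng = np.random.default_rng(0)
n_samples, true_lag, n_days = 1024, 150, 100   # assumed values for the toy example

stack = np.zeros(2 * n_samples - 1)
for _ in range(n_days):
    source = rng.standard_normal(n_samples)              # shoreline "noise" source
    station_a = source + 0.5 * rng.standard_normal(n_samples)
    station_b = np.roll(source, true_lag) + 0.5 * rng.standard_normal(n_samples)
    stack += np.correlate(station_b, station_a, mode="full")  # one day's correlation

recovered_lag = int(stack.argmax()) - (n_samples - 1)
print(f"Lag recovered from the stack: {recovered_lag} samples (true lag {true_lag})")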
December 5, 2012 | https://www.sciencedaily.com/releases/2012/12/121205090919.htm | Seeing stars, finding nukes: Radio telescopes can spot clandestine nuclear tests | In the search for rogue nukes, researchers have discovered an unlikely tool: astronomical radio telescopes. | Ohio State University researchers previously demonstrated another unlikely tool, when they showed that South Korean GPS stations detected telltale atmospheric disturbances from North Korea's 2009 nuclear test. Both techniques were born out of the discovery that underground nuclear explosions leave their mark -- on the outer reaches of Earth's atmosphere. Now, working with astronomers at the U.S. Naval Research Laboratory (NRL), they have analyzed historical data from the Very Large Array (VLA), a constellation of 27 radio telescopes near Socorro, New Mexico -- and discovered that the VLA recorded a very similar pattern of disturbances during the last two American underground nuclear tests, which took place in Nevada in 1992. Dorota Grejner-Brzezinska, professor of geodetic and geoinformation engineering at Ohio State, said that the new findings help support the notion that GPS systems -- and their technological successors, global navigation satellite systems (GNSS) -- are viable tools for detecting clandestine nuclear tests around the globe. She added that now is a good time to begin developing the concept. "With a global availability of permanently tracking GPS networks now extending to GNSS, tremendous amounts of information are becoming available, and the infrastructure is growing," she said. "We have a great opportunity to develop these ideas, and make a tool that will aid the global community." Grejner-Brzezinska presented the findings in a press conference at the American Geophysical Union (AGU) meeting on Dec. 4 with study co-authors Jihye Park, a postdoctoral researcher in geodetic and geoinformation engineering at Ohio State, and Joseph Helmboldt, a radio astronomer at NRL. Park presented the research in a lecture at AGU on Dec. 3. While radio telescopes don't cover the entire globe as GPS systems do, Helmboldt said that the two technologies complement each other, with telescopes offering higher-resolution measurements over a smaller area. "The observations we make as radio astronomers are not so different from GPS," he said. "We may be looking up at a distant galaxy instead of down to the Earth, but either way, we're all looking at radio waves traveling through the ionosphere." The ionosphere is the outermost layer of the atmosphere, which begins approximately 50 miles above Earth's surface. It contains charged particles that can interfere with radio waves and cause measurement errors in GPS and radio telescopes. For that reason, both radio astronomers and geodetic scientists routinely monitor the ionosphere in order to detect these errors and compensate for them. "We're talking about taking the error patterns -- basically, the stuff we usually try to get rid of -- and making something useful out of it," Grejner-Brzezinska said. Park, who developed this analysis method to earn her doctoral degree at Ohio State, cited key similarities and differences between the GPS data from the 2009 North Korean nuclear test and the VLA data from the 1992 American tests: one on Sept. 18 named Hunters Trophy, and the other on Sept. 23, named Divider. The North Korean bomb is believed to have had a yield of about five kilotons.
According to the GPS data, the wave front of atmospheric disturbance spread outward from the test site in the village of P'unggye at approximately 540 miles per hour. It reached 11 GPS stations in South Korea, China, Japan, and Russia in that first hour. In contrast, Hunters Trophy and Divider each had yields of 20 kilotons. Each blast created a wave front that quickly covered the 700 miles from the Nevada Test Site to the VLA, with a top speed of approximately 1,500 miles per hour. "Clearly, the U.S. explosions were much bigger than the North Korean explosion," Park said. "The wave fronts traveled faster, and the amplitudes were higher. There are still details missing from the North Korean test, but we can learn a lot by comparing the two events." Park will continue this work while she takes a new position at the University of Nottingham starting in January. She's already found that GPS stations in the North Pacific recorded ionospheric disturbances during the deadly Japanese earthquake of 2011, and she will focus on how to differentiate between earthquake signals and nuclear test signals. Collaborators on this work include Ralph R. von Frese, professor in the School of Earth Sciences at Ohio State; Yu "Jade" Morton, professor in electrical engineering at Miami University in Oxford, Ohio; and Thomas Wilson, an astronomer at NRL. | Earthquakes | 2,012
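The propagation speeds quoted above come from a simple distance-over-arrival-time calculation at each station. The station names, distances and arrival times in the sketch below are invented round numbers chosen to be consistent with the roughly 540 mph figure, not the actual GPS picks.

# Apparent propagation speed of an ionospheric disturbance from arrival times.
# Station names, distances and times are hypothetical illustrations.
observations = [
    ("GPS-A", 270.0, 0.50),   # (name, distance in miles, arrival time in hours)
    ("GPS-B", 405.0, 0.75),
    ("GPS-C", 540.0, 1.00),
]

speeds = [distance / hours for _, distance, hours in observations]
for (name, distance, hours), speed in zip(observations, speeds):
    print(f"{name}: {distance:.0f} mi in {hours:.2f} h -> {speed:.0f} mph")
print(f"Mean apparent speed: {sum(speeds) / len(speeds):.0f} mph")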
December 4, 2012 | https://www.sciencedaily.com/releases/2012/12/121204112217.htm | Pacific Northwest and Himalayas could experience major earthquakes, geophysicists say | Research by Stanford scientists focuses on geologic features and activity in the Himalayas and Pacific Northwest that could mean those areas are primed for major earthquakes. | The Himalayan range was formed, and remains currently active, due to the collision of the Indian and Asian continental plates. Scientists have known for some time that India is subducting under Asia, and have recently begun studying the complexity of this volatile collision zone in greater detail, particularly the fault that separates the two plates, the Main Himalayan Thrust (MHT). Previous observations had indicated a relatively uniform fault plane that dipped a few degrees to the north. To produce a clearer picture of the fault, Warren Caldwell, a geophysics doctoral student at Stanford, has analyzed seismic data from 20 seismometers deployed for two years across the Himalayas by colleagues at the National Geophysical Research Institute of India. The data imaged a thrust dipping a gentle two to four degrees northward, as has been previously inferred, but also revealed a segment of the thrust that dips more steeply (15 degrees downward) for 20 kilometers. Such a ramp has been postulated to be a nucleation point for massive earthquakes in the Himalaya. Although Caldwell emphasized that his research focuses on imaging the fault, not on predicting earthquakes, he noted that the MHT has historically been responsible for a magnitude 8 to 9 earthquake every several hundred years. "What we're observing doesn't bear on where we are in the earthquake cycle, but it has implications in predicting earthquake magnitude," Caldwell said. "From our imaging, the ramp location is a bit farther north than has been previously observed, which would create a larger rupture width and a larger magnitude earthquake." Caldwell will present a poster detailing the research on Dec. 4 at the meeting of the American Geophysical Union in San Francisco. Caldwell's adviser, geophysics Professor Simon Klemperer, added that recent detections of magma and water around the MHT indicate which segments of the thrust will rupture during an earthquake. "We think that the big thrust fault will probably rupture southward to the Earth's surface, but we don't expect significant rupture north of there," Klemperer said. The findings are important for creating risk assessments and disaster plans for the heavily populated cities in the region. Klemperer spoke about the evolution of geophysical studies of the Himalayas Dec. 3 at the same meeting in San Francisco. The Cascadia subduction zone, which stretches from northern California to Vancouver Island, has not experienced a major seismic event since it ruptured in 1700, an 8.7-9.2 magnitude earthquake that shook the region and created a tsunami that reached Japan. And while many geophysicists believe the fault is due for a similar-scale event, the relative lack of any earthquake data in the Pacific Northwest makes it difficult to predict how ground motion from a future event would propagate in the Cascadia area, which runs through Seattle, Portland and Vancouver. Stanford postdoctoral scholar Annemarie Baltay will present research on how measurements of small seismic tremors in the region can be utilized to determine how ground motion from larger events might behave.
Baltay's research involves measuring low-amplitude tectonic tremor that occurs 30 kilometers below Earth's surface, at the intersections of tectonic plates, roughly over the course of a month each year. By analyzing how the tremor signal decays along and away from the Cascadia subduction zone, Baltay can calculate how ground motion activity from a larger earthquake will dissipate. An important application of the work will be to help inform new construction on how best to mitigate damage should a large earthquake strike. "We can't predict when an earthquake will occur, but we can try to be very prepared for them," Baltay said. "Looking at these episodic tremor events can help us constrain what the ground motion might be like in a certain place during an earthquake." Baltay will present a poster on the research on Dec. 5 in San Francisco. Though Baltay has focused on the Cascadia subduction zone, she said that the technique could be applied in areas of high earthquake risk around the world, such as Alaska and Japan. The slow slip and tremor events in Cascadia are also being studied by Stanford geophysics Professor Paul Segall, although in an entirely different manner. Segall's group uses computational models of the region to determine whether the cumulative effects of many small events can trigger a major earthquake. "You have these small events every 15 months or so, and a magnitude 9 earthquake every 500 years. We need to know whether you want to raise an alert every time one of these small events happens," Segall said. "We're doing sophisticated numerical calculations to simulate these slow events and see whether they do relate to big earthquakes over time. What our calculations have shown is that ultimately these slow events do evolve into the ultimate fast event, and it does this on a pretty short time scale." Unfortunately, so far Segall's group has not seen any obvious differences in the numerical simulations between the average slow slip event and those that directly precede a big earthquake. The work is still young, and Segall noted that the model needs refinement to better match actual observations and to possibly identify the signature of the event that triggers a large earthquake. "We're not so confident in our model that public policy should be based on the output of our calculations, but we're working in that direction," Segall said. One thing that makes Segall's work difficult is a lack of data from actual earthquakes in the Cascadia region. Earlier this year, however, earthquakes in Mexico and Costa Rica occurred in areas that experience slow slip events similar to those in Cascadia. Segall plans to speak with geophysicists who have studied the lead-up to those earthquakes to compare the data to his simulations. | Earthquakes | 2,012
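The decay measurement Baltay describes amounts to fitting how the logarithm of tremor amplitude falls off with distance and then reusing that decay rate to anticipate shaking from larger events. The sketch below fits a straight line to synthetic amplitudes; the decay rate and noise level are assumed, and real studies also correct for source size, site effects and geometric spreading.

# Fit ln(amplitude) = c - gamma * distance to synthetic tremor amplitudes and use
# the fitted decay rate gamma to compare shaking at two distances. Synthetic data only.
import numpy as np

rng = np.random.default_rng(1)
distance_km = np.linspace(20.0, 300.0, 30)
true_gamma = 0.012                                    # assumed decay rate, 1/km
log_amp = 2.0 - true_gamma * distance_km + 0.1 * rng.standard_normal(distance_km.size)

slope, intercept = np.polyfit(distance_km, log_amp, 1)  # least-squares line
print(f"Fitted decay rate: {-slope:.4f} per km (true value {true_gamma})")
print(f"Amplitude at 150 km is ~{np.exp(slope * (150 - 50)):.2f}x that at 50 km")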
December 3, 2012 | https://www.sciencedaily.com/releases/2012/12/121203145840.htm | Russian Far East holds seismic hazards: Potential to trigger tsunamis that pose risk to Pacific Basin | For decades, a source of powerful earthquakes and volcanic activity on the Pacific Rim was shrouded in secrecy, as the Soviet government kept outsiders away from what is now referred to as the Russian Far East. | But research in the last 20 years has shown that the Kamchatka Peninsula and Kuril Islands are a seismic and volcanic hotbed, with a potential to trigger tsunamis that pose a risk to the rest of the Pacific Basin.A magnitude 9 earthquake in that region in 1952 caused significant damage elsewhere on the Pacific Rim, and even less-powerful quakes have had effects throughout the Pacific Basin."There's not a large population in the Russian Far East, but it's obviously important to the people who live there. Thousands of people were killed in tsunamis because of the earthquake in 1952. And tsunamis don't stay home," said Jody Bourgeois, a University of Washington professor of Earth and space sciences.Bourgeois will discuss the seismic and volcanic threats in the Kamchatka-Kurils region Dec. 3 during the fall meeting of the American Geophysical Union in San Francisco.Earthquakes greater than magnitude 8 struck the central Kurils in 2006 and 2007, and both produced large local tsunamis, up to about 50 feet. Though the tsunamis that crossed the Pacific were much smaller, the one from the 2006 quake did more than $10 million in damage at Crescent City, Calif.In 2009, Sarychev Peak in the Kurils erupted spectacularly, disrupting air traffic over the North Pacific.Clearly, determining the frequency of such events is important to many people over a broad area, Bourgeois said."Let's say you decide to build a nuclear power plant in Crescent City. You have to consider local events, but you also have to consider non-local events, worst-case scenarios, which includes tsunamis coming across the Pacific," she said.But that is only possible by understanding the nature of the hazards, and the historic record for earthquakes, tsunamis and volcanic eruptions in Kamchatka and the Kurils is relatively short. In addition, because the region was closed off from much of the world for decades, much of the information has started becoming available only recently.Much has been learned in the last 10 years in the examination of tsunami deposits and other evidence of prehistoric events, Bourgeois said, but more field work in the Kamchatka-Kurils subduction zone is required to get a clearer picture."For hazard analysis, you should just assume that a subduction zone can produce a magnitude 9 earthquake," she said. So it is important to "pay attention to the prehistoric record" to know where, and how often, such major events occur.Bourgeois noted that in the last 25 years research in the Cascadia subduction zone off the coast of Washington, Oregon, northern California and British Columbia has demonstrated that the historic record does not provide a good characterization of the hazard. 
It was once assumed the risks in the Northwest were small, but the research has shown that, before there were any written records, Cascadia produced at least one magnitude 9 earthquake and a tsunami that struck Japan.Alaska's Aleutian Islands and the Komandorsky Islands, an extension of the Aleutians controlled by Russia, are another source of seismic and volcanic activity that need to be evaluated for their potential risk beyond what is known from the historical record."The Aleutians are under-studied," Bourgeois said. "The work in the Russian Far East is kind of a template for the Aleutians."Ideally, a dedicated boat could ferry researchers to a number of islands in the Aleutian chain, similar to how Bourgeois and other scientists from the United States, Japan and Russia have carried out a detailed research project in the Kuril Islands in the last decade."The problem is that during the (research) field season, boats are commonly in demand for fishing," she said. | Earthquakes | 2,012 |
November 30, 2012 | https://www.sciencedaily.com/releases/2012/11/121130222247.htm | Geoscientists cite 'critical need' for basic research to unleash promising energy resources | Developers of renewable energy and shale gas must overcome fundamental geological and environmental challenges if these promising energy sources are to reach their full potential, according to a trio of leading geoscientists. | Their findings will be presented on Dec. 4 at the fall meeting of the American Geophysical Union (AGU) in San Francisco. "There is a critical need for scientists to address basic questions that have hindered the development of emerging energy resources, including geothermal, wind, solar and natural gas, from underground shale formations," said Mark Zoback, a professor of geophysics at Stanford University. "In this talk we present, from a university perspective, a few examples of fundamental research needs related to improved energy and resource recovery." Zoback, an authority on shale gas development and hydraulic fracturing, served on the U.S. Secretary of Energy's Committee on Shale Gas Development. His remarks will be presented in collaboration with Jeff Tester, an expert on geothermal energy from Cornell University, and Murray Hitzman, a leader in the study of "energy critical elements" from the Colorado School of Mines. "One option for transitioning away from our current hydrocarbon-based energy system to non-carbon sources is geothermal energy -- from both conventional hydrothermal resources and enhanced geothermal systems," said Zoback, a senior fellow at the Precourt Institute for Energy at Stanford. Unlike conventional geothermal power, which typically depends on heat from geysers and hot springs near the surface, enhanced geothermal technology has been touted as a major source of clean energy for much of the planet. The idea is to pump water into a deep well at pressures strong enough to fracture hot granite and other high-temperature rock miles below the surface. These fractures enhance the permeability of the rock, allowing the water to circulate and become hot. A second well delivers steam back to the surface. The steam is used to drive a turbine that produces electricity with virtually no greenhouse gas emissions. The steam eventually cools and is re-injected underground and recycled to the surface. In 2006, Tester co-authored a major report on the subject, estimating that 2 percent of the enhanced geothermal resource available in the continental United States could deliver roughly 2,600 times more energy than the country consumes annually. But enhanced geothermal systems have faced many roadblocks, including small earthquakes that are triggered by hydraulic fracturing. In 2005, an enhanced geothermal project in Basel, Switzerland, was halted when frightened citizens were shaken by a magnitude 3.4 earthquake. That event put a damper on other projects around the world. Last year, Stanford graduate student Mark McClure developed a computer model to address the problem of induced seismicity. Instead of injecting water all at once and letting the pressure build underground, McClure proposed reducing the injection rate over time so that the fracture would slip more slowly, thus lowering the seismicity. This novel technique received a 2011 best paper award. Zoback will also discuss challenges facing the emerging shale gas industry.
"The shale gas revolution that has been under way in North America for the past few years has been of unprecedented scale and importance," he said. "As these resources are beginning to be developed globally, there is a critical need for fundamental research on such questions as how shale properties affect the success of hydraulic fracturing, and new methodologies that minimize the environmental impact of shale gas development."Approximately 30,000 shale gas wells have already been drilled in North America, he added, yet fundamental challenges have kept the industry from maximizing its full potential. "The fact is that only 25 percent of the gas is produced, and 75 percent is left behind," he said. "We need to do a better job of producing the gas and at the same time protecting the environment."Earlier this year, Zoback and McClure presented new evidence that in shale gas reservoirs with extremely low permeability, pervasive slow slip on pre-existing faults may be critical during hydraulic fracturing if it is to be effective in stimulating production.Even more progress is required in extracting petroleum, Zoback added. "The recovery of oil is only around 5 percent, so we need to do more fundamental research on how to get more hydrocarbons out of the ground," he said. "By doing this better we'll actually drill fewer wells and have less environmental impact. That will benefit all of the companies and the entire nation."Geology plays a surprising role in the development of renewable energy resources."It is not widely recognized that meeting domestic and worldwide energy needs with renewables, such as wind and solar, will be materials intensive," Zoback said. "However, elements like platinum and lithium will be needed in significant quantities, and a shortage of such 'energy critical elements' could significantly inhibit the adoption of these otherwise game-changing technologies."Historically, energy critical elements have been controlled by limited distribution channels, he said. A 2009 study co-authored by Hitzman found that China produced 71 percent of the world's supply of germanium, an element used in many photovoltaic cells. Germanium is typically a byproduct of zinc extraction, and China is the world's leading zinc producer.About 30 elements are considered energy critical, including neodymium, a key component of the magnets used in wind turbines and hybrid vehicles. In 2009, China also dominated the neodymium market."How these elements are used and where they're found are important issues, because the entire industrial world needs access to them," Zoback said. "Therefore, if we are to sustainably develop renewable energy technologies, it's imperative to better understand the geology, metallurgy and mining engineering of these critical mineral deposits."Unfortunately, he added, there is no consensus among federal and state agencies, the global mining industry, the public or the U.S. academic community regarding the importance of economic geology in securing a sufficient supply of energy critical elements. | Earthquakes | 2,012 |
November 27, 2012 | https://www.sciencedaily.com/releases/2012/11/121127191258.htm | NASA's TRMM satellite confirms 2010 landslides | A NASA study using TRMM satellite data revealed that the year 2010 was a particularly bad year for landslides around the world. | The work, published in October and led by Dalia Kirschbaum, a research physical scientist in the Hydrological Sciences Laboratory at NASA's Goddard Space Flight Center in Greenbelt, Md., is part of an ongoing effort to catalog worldwide rainfall-triggered landslides -- one of the world's lesser known but often catastrophic natural hazards. Locating them is a step in an effort to be able, one day, to predict and warn. Currently, Kirschbaum explains, no consistent regional or global scale warning system exists for landslide disasters. To create one, scientists need to understand more than the individual factors that may contribute to local landslides -- the intensity and total amount of rainfall over hours to days, slope angle, soil type and saturation, among others. "For other hazards like hurricanes, there's a clearly defined season," says Kirschbaum. "From satellite data and observations we know that hurricane season in the Atlantic spans from June 1 to Nov. 30. But we don't have that type of record for landslides around the world, and we want to know when and where to expect them in different regions." Scientists also need a systematic way to assess landslide hazards for a region, and one way to do that, says Kirschbaum, is to look at the distribution and intensity of rain from satellite data and see how that correlates with where and how often landslides are being reported. The first step to developing landslide hazard assessments is to improve record keeping, Kirschbaum says. For the past several years, she has been developing the first global database for landslides triggered by rain, called the Global Landslide Catalog (GLC). Landslides are often too small to pick out of satellite imagery, so news stories are currently the best sources of information. Daily, she and other research assistants examined media reports for possible landslides and dug out details such as where the landslide occurred, whether the landslide was triggered by a rain shower or a tropical cyclone, if there were fatalities, and if the type of landslide was characterized as a mudslide or a debris flow, among other characteristics. The GLC now has six complete years of data -- 2003, and 2007 through 2011 -- with new entries continually added. It contains more than 4,000 events that describe 20,600 reported fatalities among 60 different countries for 2007 through 2011. However, Kirschbaum notes, landslide events are not uniformly reported around the world, with some not making the news unless they cause deaths or property damage. After compiling the 2010 record, Kirschbaum noted that 2010 appeared to be a particularly devastating year for landslides in parts of China, Central America and the Himalayan Arc. Kirschbaum compared the GLC with satellite-based rainfall information from TRMM and other satellites to see if the higher numbers of landslide reports corresponded with a higher amount of rainfall for those areas. The TRMM Multisatellite Precipitation Analysis (TMPA) data includes rain data from 50 degrees north latitude to 50 degrees south latitude -- from southern Canada to the tip of South America -- providing three-hourly and daily coverage of precipitation.
Kirschbaum and her team relied on TMPA's 14-year record to provide a picture of global rainfall that could be related to landslides that occurred.The big advantage of TMPA data, compared to ground data such as that from rain gauges, is that it measures rainfall in the same, consistent way over large regions. This provides a broader perspective that helps researchers like Kirschbaum and eventually forecasters equipped with satellite rainfall data to detect the kinds of rainfall signatures that are likely to produce destructive landslides.Stating that extreme or prolonged rainfall can lead to landslides is not anything new, Kirschbaum says, but what is novel is that by examining the Global Landslide Catalog and TMPA precipitation record together, she can look at how reported landslide events compare with rainfall amounts. This in turn helps her find areas where landslides may be occurring but are not being reported.2010 was indeed an extraordinary year for landslides, says Kirschbaum, with three times as many reported events and twice as many reported fatal events as previous years within the Global Landslide Catalog.Zhouqu County, China, was hit by the deadliest landslides in decades, according to state media, which buried some areas under as much as 23 feet (7 meters) of suffocating sludge. 1,765 people died. Property damages totaled an estimated $759 million. The TMPA rain data showed that the region's susceptible, steeply sloped geography was pummeled by extreme rainfall from a local cloudburst."This is really the first time that you can see how rainfall varies with respect to rainfall-triggered landslides because no other database has this type of globally consistent view," says Kirschbaum.While most of the intense rainfall that triggers landslides is local in scale, global atmospheric circulation patterns can often affect these local conditions. As Kirschbaum explains: "What I think is unique about 2010 is there is a really strong El Niño going into a really strong La Niña, which consequently impacted rainfall patterns around the world."Eventually, as the database of both landslides and rainfall data grows, Kirschbaum will be able to see more defined patterns in landslide occurrence month-by-month, and if those patterns are changing."I think in some ways we can compare the situation with landslides with that of earthquakes 50 years ago," says David Petley, professor and co-director of the Institute of Hazard, Risk and Resilience at Durham University's Department of Geography. "Our understanding of earthquakes improved dramatically when a global seismic array was deployed that allowed us to map earthquakes in space and time, providing insight into plate tectonics and thus earthquake mechanisms. We have not been able to map landslides in the same way, which means that we lack that time and space understanding. Dalia's work is important because she is collating this data, which is providing new insights."Until there's enough data about where and when landslides have occurred in the past, there will not be enough information to issue useful warnings about future events at the global scale. Kirschbaum plans to expand the Global Landslide Catalog in order to provide a more complete dataset to evaluate landslide forecasting models. 
Adequate warning, one day, could help vulnerable people escape roaring slurries of slumping earth, to recover, rebuild or relocate. This work was funded by the upcoming Global Precipitation Measurement (GPM) mission, which will improve upon current rainfall datasets, with real-time assessment of rainfall accumulations that lead to landslide triggering. The GPM Core satellite is set to launch in 2014 and will extend coverage of precipitation measurements using a constellation of satellites to deliver a global rain dataset every three hours. | Earthquakes | 2,012
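As a rough illustration of how gridded rainfall estimates can be screened for landslide potential, the sketch below applies an intensity-duration threshold of the form I = alpha * D^-beta. This is not Kirschbaum's algorithm, and the alpha, beta and storm values are placeholders rather than a published calibration.

```python
# Flag a grid cell when the storm's mean rainfall intensity exceeds an
# empirical intensity-duration threshold. Values are illustrative only.

def exceeds_threshold(total_rain_mm: float, duration_hr: float,
                      alpha: float = 15.0, beta: float = 0.4) -> bool:
    intensity = total_rain_mm / duration_hr          # mm/hr averaged over the storm
    threshold = alpha * duration_hr ** (-beta)       # mm/hr threshold for this duration
    return intensity > threshold

# Example: a 36-hour storm delivering 220 mm of rain at a hypothetical grid cell.
storm_total_mm, storm_duration_hr = 220.0, 36.0
flagged = exceeds_threshold(storm_total_mm, storm_duration_hr)
print("cell flagged for possible landslide triggering:", flagged)
```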
November 26, 2012 | https://www.sciencedaily.com/releases/2012/11/121126130846.htm | Models for evacuation procedures in big cities after massive earthquakes | Tokyo Tech's Toshihiro Osaragi and colleagues report on models for evacuation procedures in big cities after massive earthquakes based on the behavior of people in Tokyo after the Tohoku-Pacific Ocean Earthquake on March 11, 2011. | The details are also described in the November issue of the journal. The Tohoku-Pacific Ocean Earthquake occurred on 11 March 2011. On this day, all rail services in the Tokyo Metropolitan area were paralyzed amid the unprecedented confusion that followed the tremor. Thousands of people were unable to contact families and friends, and in a state of uneasiness, many decided to return home on foot. Main roads were heavily congested with both cars and people, a state which severely obstructed the movement of emergency vehicles. Here, Toshihiro Osaragi at Tokyo Institute of Technology describes the construction of several models that describe decision-making and behavior of individuals attempting to reach home on foot in the wake of a devastating earthquake. He has simulated the movement of individuals who have decided to return home on foot, and demonstrates the spatiotemporal distribution of those who might be exposed to hazardous city fires on their way home in the aftermath of a massive earthquake, which has been predicted to occur in the Tokyo Metropolitan area in the near future. Osaragi's research underscores the importance of considering pedestrian flow under such extreme scenarios in order to establish emergency evacuation procedures. "Using the model proposed, we can assess not only the potential number of stranded individuals, but also their detailed attributes," says Osaragi. "Such information would undoubtedly prove helpful in actual planning for immediate post-disaster mitigation." | Earthquakes | 2,012
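The toy simulation below is a deliberately crude stand-in (not Osaragi's models) for the kind of question such work addresses: given assumed walking speeds, congestion slowdowns and a time cutoff, what fraction of people attempting to walk home would be stranded? Every parameter is an invented example.

```python
# Minimal stranded-commuter estimate: each person has a random distance home
# and walks at a speed that is reduced if they end up on congested main roads.
import random

random.seed(0)
FREE_SPEED_KMH = 4.0        # assumed unimpeded walking speed
CROWD_SLOWDOWN = 0.5        # assumed speed multiplier on congested roads
CUTOFF_HOURS = 6.0          # e.g. hours of daylight remaining

def simulate(num_people: int = 10_000, congested_fraction: float = 0.6) -> float:
    stranded = 0
    for _ in range(num_people):
        distance_home_km = random.uniform(1.0, 30.0)
        congested = random.random() < congested_fraction
        speed = FREE_SPEED_KMH * (CROWD_SLOWDOWN if congested else 1.0)
        if distance_home_km / speed > CUTOFF_HOURS:
            stranded += 1
    return stranded / num_people

print(f"estimated fraction stranded: {simulate():.1%}")
```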
November 14, 2012 | https://www.sciencedaily.com/releases/2012/11/121114134656.htm | What lies beneath? New survey technique offers detailed picture of our changing landscape | A new surveying technique developed at The University of Nottingham is giving geologists their first detailed picture of how ground movement associated with historical mining is changing the face of our landscape. | The new development by engineers at the University has revealed a more complete map of subsidence and uplift caused by the settlement of old mines in the East Midlands and other areas of the country and has shown that small movements in the landscape are bound by natural fault lines and mining blocks.It appears to support concerns that movement associated with historical mining is continuing far longer than previously anticipated.The research has been led by Dr Andrew Sowter in the University's Department of Civil Engineering. He said: "This method allows us to measure patterns of slow millimetre-scale movement across large regions of the landscape and, in the UK, almost everywhere we look is dominated by our industrial past. Large tracts of our land, including parts of our cities, towns and infrastructure as well as agricultural and woodland areas, are steadily creeping upwards over mines that were closed decades ago."The new development builds on existing technology that allows engineers to use satellite radar technology to measure points on the landscape over a length of time to assess whether they are moving up (uplifting) or sinking down (subsiding).Previously, this has relied on using fixed, unchanging objects like buildings that can be accurately re-measured and compared against previous measurements time after time. However, the technique has not been practical for use in the rural landscape meaning that geologists could only get half the picture.Now, Dr Sowter has developed a technique called the Intermittent Small Baseline Subset (ISBAS) method which adapts the same technology and extends it to rural areas by taking stacks of these radar images and identifying those more transient points in the rural landscape against which changes over time are able to be measured.The technique is now being used by the British Geological Survey (BGS), based in Keyworth in Nottinghamshire, which is the world's oldest national geological survey providing expert services and impartial advice on all areas of geosciences for both the public and private sectors.Principal geologist at the BGS Poul Strange said: "This new technique is going to allow us to refine our geological maps. Previously when surveying rural areas we were almost guessing and the final result was more of an interpretation. 
Now we are able to produce maps that far more accurately reflect what is happening with the geology below the surface and enable us to predict any potential risks posed by ground movement."Rural areas are particularly important because we need to know what is happening with the geology there and how movement or natural fault lines may affect future developments such as new housing or high speed rail links."The technique will assist BGS in the work it is doing looking at potential ground movement in former mining areas of South Derbyshire, Nottinghamshire and Leicestershire where most mines closed no later than the early 90s.The BGS has so far recorded geological evidence that movement in areas where deep coal mining has been in operation historically actually continues for up to 11 years, far more than any previous estimate such as the six-year limit set by the Subsidence Act of 1991.They believe the problem may be caused by ground water, which would have been pumped out while the mines were open, seeping back into the disused pits and causing a 'bounce back' effect on the surrounding landscape. However, they estimate that this uplift is only likely to offer around a 4% recovery on where the landscape would have been before mining began.In particular, they have been using the new technique to explore the rural areas surrounding locations like Swadlincote in Derbyshire and Oakthorpe near Measham in Leicestershire which have a long-standing history of problems with mining-related subsidence.They are able to see how this movement is interacting with natural fault lines, which could potentially cause other seismic activity such as the Market Rasen earthquake of 2008, which measured 5.2 on the Richter Scale and was felt as far away as Wales, Scotland and London.The research could also be of vital significance in assessing future issues with subsidence and uplift connected with other types of activity such as fracking, where geological shale rocks are drilled and injected with fluid to encourage them to fracture and release natural gases.Dr Sowter has previously spent time working at The University of Nottingham China Ningbo, where his research centred on China's 'sinking cities' problem, where some of the country's most densely populated communities, such as Shanghai, are sinking under the weight of towering skyscrapers.Funded by the National Natural Science Foundation of China, the work aimed to develop techniques to help Chinese authorities identify with far greater accuracy which areas are moving and by how much. | Earthquakes | 2,012 |
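The sketch below illustrates the general small-baseline inversion idea that methods such as ISBAS build on, not the Nottingham implementation: each interferogram constrains the displacement between two acquisition dates, and a least-squares inversion recovers a displacement history at a measurement point. All values are synthetic.

```python
# Recover a displacement time series at one coherent point from a set of
# interferograms, each spanning a pair of acquisition dates.
import numpy as np

dates = [0, 1, 2, 3, 4]                              # acquisition indices
pairs = [(0, 1), (0, 2), (1, 3), (2, 4), (3, 4)]     # interferogram date pairs

# Synthetic "true" cumulative displacement (mm) used to build observations.
true_disp = np.array([0.0, -1.5, -2.8, -4.0, -5.5])
obs = np.array([true_disp[j] - true_disp[i] for i, j in pairs])
obs += np.random.default_rng(1).normal(0, 0.1, obs.size)   # measurement noise

# Design matrix: each interferogram sums the incremental displacements
# between its two dates.
n_inc = len(dates) - 1
A = np.zeros((len(pairs), n_inc))
for row, (i, j) in enumerate(pairs):
    A[row, i:j] = 1.0

increments, *_ = np.linalg.lstsq(A, obs, rcond=None)
recovered = np.concatenate([[0.0], np.cumsum(increments)])
print("recovered displacement history (mm):", np.round(recovered, 2))
```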
November 12, 2012 | https://www.sciencedaily.com/releases/2012/11/121112090213.htm | At least six major earthquakes on the Alhama de Murcia fault in the last 300,000 years | An international group of researchers, with Spanish participation, has analysed the most recent history of the Alhama de Murcia fault. They discovered that it has experienced six major earthquakes above 7 on the Richter scale. According to the scientists, this provides "convincing evidence" that the maximum earthquake magnitudes in the area are higher than originally thought. | Since 2001, researchers from the Universities of Barcelona, Leon, Complutense de Madrid (UCM), Coimbra (Portugal), Aarhus (Denmark) and the National Autonomous University of Mexico have been working on the Alhama de Murcia fault in order to identify those high magnitude earthquakes that have occurred during the Quaternary period -- the most recent of geological ages. "Due to lack of information, up until just ten years ago there were no geological data on paleoseismic activity for the active faults in Spain and very little had been invested in studying their geology. The evidence used to understand active faults came merely from historical seismic records that hardly collected data on the largest of earthquakes related to these faults. As geology can go even further back in time, the earthquakes that we have found are bigger and of a greater magnitude," as explained by Jose J. Martínez Díaz, researcher at the UCM and coauthor of the study. This fault is a fracture plane of the land that crosses the entire earth's crust. Therefore, to identify the prehistoric earthquakes in its walls, scientists had to make surface excavations perpendicular to the fault (trenches of between 20 and 30 metres long and 4 metres deep). This allowed them to obtain an exceptionally extensive paleoseismic record. As Martínez adds, "when large earthquakes exceed magnitude 6, they usually break at the surface and as a result, we have been able to identify this in their walls." These tectonic deformations were dated using carbon-14 and infrared stimulated luminescence techniques. In order to understand the behavioural patterns of the Alhama de Murcia fault, the researchers had to reconstruct hundreds of thousands of years -- "much more than the Americans or the Japanese, who can understand their fault patterns by studying just 10 thousand years." This is because faults in Spain are slow-moving and there is therefore much more time between major earthquakes (to the tune of thousands of years) compared to much faster-moving faults like San Andreas in California. According to their estimations, the Alhama fault would have been created more than 9 million years ago and would have caused earthquakes from the outset, thus shaping the landscape of the region. "It was in our interest to detect the seismic activity from the Quaternary period, or, in other words, earthquakes that occurred less than 1.8 million years ago. In total, we have identified a minimum of six earthquakes of high magnitude during the period studied (more than 300,000 years) but we know that the real number is actually much higher. In some cases, sedimentary evidence could have disappeared or maybe they can be found in parts of the fault that have yet to be studied," outlines the researcher. Another revelation, according to the article, is that the area could suffer from a stronger earthquake than originally thought. "During earthquakes, the entire length of the fault does not break.
It does so in segments. We have proven that this fault could break at once at the two western segments, from Góñar (Almería) to Totana (Murcia), causing at the same time an earthquake of a magnitude above 7," explains Martínez. "This fault has already produced an earthquake of magnitude 6.5 or 7 thousands of years ago, and could do so again tomorrow. As a result, it is vital to bear in mind the earthquake risk calculations and building codes in the area," note the researchers. The seismic hazard map forming the basis of the Spanish Seismic Resistance Construction Standard assigns the area of Lorca a maximum acceleration for construction design of 0.19 g. However, the recent earthquake reached a magnitude of 5.2 and generated a much higher acceleration of 0.36 g. "The area's hazard level was underestimated because until now estimations have been based on the historical earthquake catalogue, which only records events from the last 2000 years," points out Martínez. The researcher believes that the fault activity parameters obtained through paleoseismic studies like this can help to improve risk calculations. But will they be capable of predicting the next high magnitude movement? The authors stress that it is indeed possible to determine the maximum magnitude as well as the location of the earthquake. However, at present there is still no way of predicting the moment it will strike, as this involves a complex geological phenomenon governed by non-linear physical processes. "Earthquakes like the one in Lorca and ones before produce fault stress changes which increase in certain points of the fault. We know this thanks to models and results that we have published from previous studies. The next earthquake is more likely to occur in these areas. However, estimating when is impossible," says the scientist. With regard to a separate study published recently: "There is much scientific discussion on the matter; we are part of various groups that have been working in the area for some time and I am not the only one who is sceptical of the idea. The 2011 earthquake in Lorca was similar to those that took place in 1674 and 1818 at a time when aquifer exploitation was not practiced. I believe that there is no need to search for any unusual reason behind the earthquake. It was down to the fault's natural tectonic evolution. It was a completely normal earthquake from a geological point of view -- the small magnitude kind which occurs on a fault every so often," concludes Martínez. | Earthquakes | 2,012
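For readers unfamiliar with the dating step, the sketch below shows the basic radiocarbon arithmetic behind trench studies like this one, plus the crude average recurrence interval implied by six events in roughly 300,000 years. Real dating additionally requires calibration curves and laboratory corrections, and the 55% value is an invented example.

```python
# Back-of-the-envelope radiocarbon age and recurrence-interval arithmetic.
import math

C14_HALF_LIFE_YR = 5730.0                      # conventional half-life of carbon-14

def radiocarbon_age(fraction_modern: float) -> float:
    """Age in years from the measured fraction of carbon-14 remaining."""
    decay_constant = math.log(2) / C14_HALF_LIFE_YR
    return -math.log(fraction_modern) / decay_constant

# A buried soil horizon retaining ~55% of its original carbon-14:
print(f"approximate age: {radiocarbon_age(0.55):,.0f} years")

# Six surface-rupturing earthquakes in roughly 300,000 years of sediment
# imply, on average, one major event every:
print(f"average recurrence: {300_000 / 6:,.0f} years")
```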
November 6, 2012 | https://www.sciencedaily.com/releases/2012/11/121106092721.htm | 2011 Virginia earthquake triggered landslides at extraordinary distances | The 2011 Mineral, Virginia M-5.8 earthquake was felt over an extraordinarily large area. A new study details landslides triggered by the earthquake at distances four times greater and over an area 20 times larger than previously documented for M-5.8 earthquakes worldwide. | The study is to be published in December. U.S. Geological Survey scientists Randall W. Jibson, who is scheduled to present his findings on Nov. 6 at the annual meeting of the Geological Society of America, and co-author Edwin L. Harp painstakingly mapped rock falls to determine the distance limits from the epicenter to compare with previously documented earthquakes. The 2011 Virginia earthquake was the largest earthquake in the eastern U.S. since 1897. Although it did not produce large, damaging landslides, it did trigger small landslides of rock and soil from steep slopes. Because landslides can occur without earthquake shaking, Jibson and Harp looked for evidence of recent physical disruption that could be attributed to the quake. From August 25 through September 3, Jibson and Harp made detailed observations by driving outward from the epicentral area in transects; they stopped frequently to get on their hands and knees to overturn rocks to see whether there was green grass underneath -- a sign that the rock could have fallen during the quake. As they drove, they inspected highly susceptible slopes until they had reached an apparent limit for a particular area. "We've been doing this for more than 30 years and have developed a consistent observational standard that isn't dependent on the observers," Jibson said. "We are confident we can accurately compare limits for different earthquakes." The authors noted that Hurricane Irene passed near the area a few days after the quake, and they identified small rainfall-triggered debris flows that were quite distinct from rock falls triggered by the quake. Landslide limits were documented along the Blue Ridge Parkway from near Harpers Ferry, West Virginia, southwestward to within 30 km of the Virginia-North Carolina border as well as on transects northwestward through the Appalachian Mountains into West Virginia. The limits to the east and south of the Blue Ridge are less well constrained owing to a lack of susceptible slopes. The authors propose an estimated elliptical limit that would apply had there been equally susceptible landscape yielding evidence in all directions. There is sufficient documented evidence, say Jibson and Harp, to suggest the need to revise the established distance limits for the occurrence of landslides in different tectonic environments. For the eastern U.S., the documented landslides from the 2011 Virginia earthquake suggest that ground motion is stronger and travels farther parallel to the Appalachian Mountains than perpendicular to them, which is consistent with other sources of intensity information such as the U.S. Geological Survey's Did You Feel It? map. Not all historical post-earthquake landslide investigations have been conducted at the same level of detail, and so they might not be directly comparable with the current study.
Also, very few earthquakes in stable continental interiors, where ground motion is known to travel farther than in plate-boundary regions, have had thorough documentation of triggered landslides, noted the authors. "Even taking differential landslide reporting into account," wrote the authors, "the landslide limits from the 2011 Virginia earthquake are extraordinary." The study is titled, "Extraordinary Distance Limits of Landslides Triggered by the 2011 Mineral, Virginia Earthquake." | Earthquakes | 2,012
November 5, 2012 | https://www.sciencedaily.com/releases/2012/11/121105100925.htm | Hydro-fracking: Fact vs. fiction | In communities across the U.S., people are hearing more and more about a controversial oil and gas extraction technique called hydraulic fracturing -- aka, hydro-fracking. Controversies pivot on some basic questions: Can hydro-fracking contaminate domestic wells? Does it cause earthquakes? How can we know? What can be done about these things if they are true? | "When people talk about contamination from hydraulic fracturing, for instance, they can mean a lot of different things," says hydrogeologist Harvey Cohen of S.S. Papadopulos & Associates in Bethesda, Maryland. "When it's what's happening near their homes, they can mean trucks, drilling machinery, noise." These activities can potentially lead to surface water or groundwater contamination if there are, for example, accidental fuel spills. People also worry about fracking fluids leaking into the aquifers they tap for domestic or municipal water. On the other hand, when petroleum companies talk about risks to groundwater from hydro-fracking, they are often specifically referring to the process of injecting fluids into geologic units deep underground and fracturing the rock to free the oil and gas it contains, says Cohen. This is a much smaller, much more isolated part of the whole hydraulic fracturing operation. It does not include the surface operations -- or the re-injection of the fracking waste fluids at depth in other wells, which is itself another source of concern for many. But all of these concerns can be addressed, says Cohen, who will be presenting his talk on groundwater contamination and fracking on the morning of Nov. 7. For instance, it has been proposed that drillers put non-toxic chemical tracers into their fracking fluids so that if a nearby domestic well is contaminated, that tracer will show up in the well water. That would sort out whether the well is contaminated from the hydro-fracking operations or perhaps from some other source, like a leaking underground storage tank or surface contaminants getting into the groundwater. "That would be the 100 percent confident solution," says Cohen of the tracers. Another important strategy is for concerned citizens, cities, and even oil companies to gather baseline data on water quality from wells before hydro-fracking begins. Baseline data would have been very helpful, for example, in the case of the Pavillion gas field in the Wind River Formation of Wyoming, according to Cohen, because there are multiple potential sources of contaminants that have been found in domestic wells there. The Pavillion field is just one of multiple sites now being studied by the U.S. Environmental Protection Agency (EPA) to learn about past and future effects of hydro-fracking on groundwater. The same pre-fracking science approach is being taken in some areas to evaluate the seismic effects of disposing of fracking fluids by injecting them deep underground. In Ohio and Texas, this disposal method has been the prime suspect in the recent activation of old, dormant faults that have generated clusters of low intensity earthquakes. So in North Carolina, as well as other places where fracking has been proposed, some scientists are scrambling to gather as much pre-fracking seismic data as possible. | Earthquakes | 2,012
October 31, 2012 | https://www.sciencedaily.com/releases/2012/10/121031141854.htm | Tabletop fault model reveals why some earthquakes result in faster shaking | The more time it takes for an earthquake fault to heal, the faster the shake it will produce when it finally ruptures, according to a new study by engineers at the University of California, Berkeley, who conducted their work using a tabletop model of a quake fault. | "The high frequency waves of an earthquake -- the kind that produces the rapid jolts -- are not well understood because they are more difficult to measure and more difficult to model," said study lead author Gregory McLaskey, a former UC Berkeley Ph.D. student in civil and environmental engineering. "But those high frequency waves are what matter most when it comes to bringing down buildings, roads and bridges, so it's important for us to understand them." The study is to be published Nov. 1. "The experiment in our lab allows us to consider how long a fault has healed and more accurately predict the type of shaking that would occur when it ruptures," said Steven Glaser, UC Berkeley professor of civil and environmental engineering and principal investigator of the study. "That's important in improving building designs and developing plans to mitigate for possible damage." To create a fault model, the researchers placed a Plexiglas slider block against a larger base plate and equipped the system with sensors. The design allowed the researchers to isolate the physical and mechanical factors, such as friction, that influence how the ground will shake when a fault ruptures. It would be impossible to do such a detailed study on faults that lie several miles below the surface of the ground, the authors said. And current instruments are generally unable to accurately measure waves at frequencies higher than approximately 100 Hertz because they get absorbed by the earth. "There are many people studying the properties of friction in the lab, and there are many others studying the ground motion of earthquakes in the field by measuring the waves generated when a fault ruptures," said McLaskey. "What this study does for the first time is link those two phenomena. It's the first clear comparison between real earthquakes and lab quakes." Noting that fault surfaces are not smooth, the researchers roughened the surface of the Plexiglas used in the lab's model. "It's like putting two mountain ranges together, and only the tallest peaks are touching," said McLaskey, who is now a postdoctoral researcher with the U.S. Geological Survey in Menlo Park. As the sides "heal" and press together, the researchers found that individual contact points slip and transfer the resulting energy to other contact points. "As the pressing continues and more contacts slip, the stress is transferred to other contact points in a chain reaction until even the strongest contacts fail, releasing the stored energy as an earthquake," said Glaser. "The longer the fault healed before rupture, the more rapidly the surface vibrated." "It is elegant work," said seismologist John Vidale, a professor at the University of Washington who was not associated with the study. "The point that more healed faults can be more destructive is dismaying.
It may not be enough to locate faults to assess danger, but rather knowing their history, which is often unknowable, that is key to fully assessing their threat."Glaser and McLaskey teamed up with Amanda Thomas, a UC Berkeley graduate student in earth and planetary sciences, and Robert Nadeau, a research scientist at the Berkeley Seismological Laboratory, to confirm that their lab scenarios played out in the field. The researchers used records of repeating earthquakes along the San Andreas fault that Nadeau developed and maintained. The data were from Parkfield, Calif., an area which has experienced a series of magnitude 6.0 earthquakes two to three decades apart over the past 150 years.Thomas and McLaskey explored the records of very small, otherwise identically repeating earthquakes at Parkfield to show that the quakes produced shaking patterns that changed depending on the time span since the last event, just as predicted by the lab experiments.In the years after a magnitude 6.0 earthquake hit Parkfield in 2004, the small repeating earthquakes recurred more frequently on the same fault patches."Immediately after the 2004 Parkfield earthquake, many nearby earthquakes that normally recurred months or years apart instead repeated once every few days before decaying back to their normal rates," said Thomas. "Measurements of the ground motion generated from each of the small earthquakes confirmed that the shaking is faster when the time from the last rupture increases. This provided an excellent opportunity to verify that ground motions observed on natural faults are similar to those observed in the laboratory, suggesting that a common underlying mechanism -- fault healing -- may be responsible for both."Understanding how forcefully the ground will move when an earthquake hits has been one of the biggest challenges in earthquake science."What makes this study special is the combination of lab work and observations in the field," added Roland Burgmann, a UC Berkeley professor of earth and planetary sciences who reviewed the study but did not participate in the research. "This study tells us something fundamental about how earthquake faults evolve. And the study suggests that, in fact, the lab setting is able to capture some of those processes correctly."Glaser said the next steps in his lab involve measuring the seismic energy that comes from the movement of the individual contact points in the model fault to more precisely map the distribution of stress and how it changes in the run-up to a laboratory earthquake event. | Earthquakes | 2,012 |
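A hypothetical illustration of the post-mainshock pattern described for the Parkfield repeaters is sketched below using an Omori-style decay, in which the recurrence rate jumps after a mainshock and relaxes back toward its background level. The constants are arbitrary placeholders, and this is not the authors' analysis.

```python
# Rate of small repeating earthquakes after a mainshock modeled as
# r(t) = background + K / (c + t)**p, with made-up constants.

def repeater_rate(days_after_mainshock: float,
                  background_per_day: float = 0.01,
                  K: float = 2.0, c: float = 1.0, p: float = 1.0) -> float:
    return background_per_day + K / (c + days_after_mainshock) ** p

for t in [1, 10, 30, 100, 365]:
    rate = repeater_rate(t)
    print(f"{t:4d} days after mainshock: ~{rate:.3f} repeats/day "
          f"(~{1 / rate:.0f} days between repeats)")
```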
October 25, 2012 | https://www.sciencedaily.com/releases/2012/10/121025150359.htm | Fishing for answers off Fukushima | Japan's "triple disaster," as it has become known, began on March 11, 2011, and remains unprecedented in its scope and complexity. To understand the lingering effects and potential public health implications of that chain of events, scientists are turning to a diverse and widespread sentinel in the world's ocean: fish. | Events on March 11 began with a magnitude 9.0 earthquake, the fourth largest ever recorded. The earthquake in turn spawned a massive 40-foot tsunami that inundated the northeast Japanese coast and resulted in an estimated 20,000 missing or dead. Finally, the wave caused catastrophic damage to the Fukushima Dai-ichi nuclear power plant, resulting in the largest accidental release of radiation to the ocean in history, 80 percent of which ended up in the Northwest Pacific Ocean. In a Perspectives article published October 26, 2012, Buesseler shows that the vast majority of fish caught off the northeast coast of Japan remain below limits for seafood consumption, even though the Japanese government tightened those limits in April 2012. Nevertheless, he also finds that the most highly contaminated fish continue to be caught off the coast of Fukushima Prefecture, as could be expected, and that demersal, or bottom-dwelling fish, consistently show the highest level of contamination by a radioactive isotope of cesium from the damaged nuclear power plant. He also points out that levels of contamination in almost all classifications of fish are not declining, although not all types of fish are showing the same levels, and some are not showing any appreciable contamination. As a result, Buesseler concludes that there may be a continuing source of radionuclides into the ocean, either in the form of low-level leaks from the reactor site itself or contaminated sediment on the seafloor. In addition, the varying levels of contamination across fish types point to complex methods of uptake and release by different species, making the task of regulation and of communicating the reasons behind decision-making to the fish-hungry Japanese public all the more difficult. "To predict how patterns of contamination will change over time will take more than just studies of fish," said Buesseler, who led an international research cruise in 2011 to study the spread of radionuclides from Fukushima. "What we really need is a better understanding of the sources and sinks of cesium and other radionuclides that continue to drive what we're seeing in the ocean off Fukushima." | Earthquakes | 2,012
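One way to see why steady cesium levels point to a continuing source is the effective-half-life argument sketched below: a fish's cesium burden falls with a half-life combining radioactive decay and biological elimination, so without new intake, concentrations should drop quickly. The biological half-life used here is an assumed round number for illustration, not a value from Buesseler's article.

```python
# Effective half-life: 1/T_eff = 1/T_physical + 1/T_biological.
T_PHYSICAL_CS137_DAYS = 30.1 * 365.25   # physical half-life of Cs-137 (~30 years)
T_BIOLOGICAL_DAYS = 75.0                # assumed elimination half-life in fish

t_eff = 1.0 / (1.0 / T_PHYSICAL_CS137_DAYS + 1.0 / T_BIOLOGICAL_DAYS)
print(f"effective half-life in the fish: {t_eff:.0f} days")

# Expected decline after one year with no further uptake:
fraction_left = 0.5 ** (365.0 / t_eff)
print(f"fraction remaining after one year without new intake: {fraction_left:.3f}")
```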
October 23, 2012 | https://www.sciencedaily.com/releases/2012/10/121023123958.htm | Self-powered sensors to monitor nuclear fuel rod status | Japan's Fukushima Dai'ichi nuclear disaster that occurred in 2011 -- a result of the strongest earthquake on record in the country and the powerful tsunami waves it triggered -- underscored the need for a method to monitor the status of nuclear fuel rods that doesn't rely on electrical power. | During the disaster, the electrical power connection to the nuclear reactor failed and rendered back-up electrical generators, coolant pumps, and sensor systems useless. The nuclear plant's operators were unable to monitor the fuel rods in the reactor and spent fuel in the storage ponds.To address this issue, Penn State researchers teamed with the Idaho National Laboratory to create a self-powered sensor capable of harnessing heat from nuclear reactors' harsh operating environments to transmit data without electronic networks. The team will present their research at the Acoustical Society of America's upcoming 164th Meeting, October 22-26, 2012, in Kansas City, Missouri."Thermoacoustics exploits the interaction between heat and sound waves," explains Randall A. Ali, a graduate student studying acoustics at Penn State. "Thermoacoustic sensors can operate without moving parts and don't require external power if a heat source, such as fuel in a nuclear reactor, is available."Thermoacoustic engines can be created from a closed cylindrical tube -- even a fuel rod -- and a passive structure called a "stack.""We used stacks made from a ceramic material with a regular array of parallel pores that's manufactured as the substrate for catalytic converters found in many automotive exhaust systems. These stacks facilitate the transfer of heat to the gas in a resonator, and heat is converted to sound when there's a temperature difference along the stack," Ali elaborates.When a thermoacoustic engine operates, an acoustically driven streaming gas jet circulates hot fluid away from the heat source -- nuclear fuel -- and along the walls of the engine and into the surrounding cooling fluid.Penn State and Idaho National Laboratory are also investigating using thermoacoustic sound to monitor microstructural changes in nuclear fuel, measure gas mixture composition, and to act as a failsafe device in emergency situations. | Earthquakes | 2,012 |
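As a rough sketch of the acoustics involved, the snippet below computes the fundamental resonance of a closed tube from the sound speed of the gas inside it, which rises with temperature; shifts in the emitted tone therefore track conditions in the rod. The tube length and gas properties are assumed for illustration and are not taken from the Penn State design.

```python
# Fundamental (half-wavelength) resonance of a tube closed at both ends.
import math

def sound_speed(temp_kelvin: float, gamma: float = 1.4, R: float = 287.0) -> float:
    """Speed of sound in an ideal gas (m/s); defaults approximate air/nitrogen."""
    return math.sqrt(gamma * R * temp_kelvin)

def fundamental_frequency(tube_length_m: float, temp_kelvin: float) -> float:
    return sound_speed(temp_kelvin) / (2.0 * tube_length_m)

for temp in (300.0, 600.0, 900.0):
    f = fundamental_frequency(tube_length_m=1.0, temp_kelvin=temp)
    print(f"gas at {temp:5.0f} K -> resonance near {f:6.1f} Hz")
```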
October 23, 2012 | https://www.sciencedaily.com/releases/2012/10/121023100911.htm | Researchers to shake a building to study its performance during earthquakes | The two-story building on West Commercial Avenue in El Centro, CA was built in the 1920s and has withstood four major earthquakes in 1940, 1979, 1987, and 2010 but it may not be standing for long. | That's because a research team that includes Babak Moaveni, assistant professor of civil and environmental engineering at Tufts University School of Engineering, plans to shake and rumble the structure until it's on the verge of collapsing into a heap of debris and dust.Moaveni is collaborating with Andreas Stavridis, assistant professor of civil engineering at the University of Texas-Arlington, on a National Science Foundation-funded study to assess how buildings made with reinforced concrete frames and masonry infill walls hold up during an earthquake. The data will also be used to refine existing analytical models and techniques that engineers use when evaluating seismic safety of similarly constructed buildings. The research team also includes engineers from the University of California, at Los Angeles (UCLA).Thousands of such buildings exist in earthquake-prone places like Los Angeles, San Francisco, the Mediterranean and Latin America, and they are vulnerable to serious damage. "These buildings were built and designed years ago according to building codes that have since become outdated," says Moaveni.Typically, after an earthquake, owners of a building like the one on West Commercial Avenue would have the structure repaired and maybe retrofitted so that it could endure the next quake. But damage from the 2010 earthquake was so severe that repair was not worth the cost. Owners and the city officials decided to have it demolished.That's when Moaveni and Stavridis came forward. In the first phase of the project, the engineers will record the building's existing condition. Then, the team will install a spinning device called an eccentric-mass shaker on the building's roof. This device will induce further damage by simulating the pulsing and vibration of an earthquake rattling the structure from the top down. This has not been done before with an entire structure with that degree of damage. "We are glad that the building owners realized that the building's misfortune has presented a unique research opportunity for us," Stavridis explains.The researchers will install cameras at critical locations of the structures to observe damage as the test progresses. At specific intervals, they will also halt the shaker to assess and document structural damage, through visual inspection. Computers will also record data from sensors inside the building. With the initial measurements as a baseline, the researchers will evaluate and quantify progressive damage sustained by the building as it is shaken apart.Field testing of full-scale structures using mechanical shakers plays an important role in this type of seismic research. In previous experiments, researchers have experimented on wall portions or sections of buildings using low-to-moderate levels of vibrations. "This is a very challenging project but a great research opportunity because we are working with an entire existing building," says Moaveni.In their project, Moaveni and Stavridis plan to exert large-amplitude forces on the building. "We don't know if we will shake the building until it collapses," Moaveni says. "But, at a minimum, we will shake it until it is on the verge of collapse." 
| Earthquakes | 2,012 |
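The eccentric-mass shaker mentioned above produces a rotating harmonic force that grows with the square of the spin rate, F = m·e·ω². The sketch below uses assumed mass, eccentricity and frequencies simply to show the scale of force such a device can deliver and why sweeping up in frequency ramps up the loading.

```python
# Peak force from a spinning eccentric mass: F = m * e * omega**2.
import math

def shaker_force_newtons(mass_kg: float, eccentricity_m: float, freq_hz: float) -> float:
    omega = 2.0 * math.pi * freq_hz
    return mass_kg * eccentricity_m * omega ** 2

for freq in (1.0, 2.0, 4.0, 8.0):
    force = shaker_force_newtons(mass_kg=50.0, eccentricity_m=0.25, freq_hz=freq)
    print(f"{freq:4.1f} Hz -> peak force ~{force / 1000:6.1f} kN")
```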
October 19, 2012 | https://www.sciencedaily.com/releases/2012/10/121019114821.htm | World's largest earthquake drill: JPL scientists participate in Great California ShakeOut exercises | On Thursday, Oct. 18, at 10:18 a.m. PDT, more than 9.3 million Californians, including employees at NASA's Jet Propulsion Laboratory, Pasadena, Calif., "dropped, covered and held on" during the 5th annual Great California ShakeOut, the world's largest earthquake drill. The purpose of the ShakeOut is to encourage people and organizations to be prepared to survive and recover when the next big earthquake happens. | In conjunction with the ShakeOut, scientists from two JPL-developed projects participated in a supporting exercise by the California Earthquake Clearinghouse to provide disaster response data that will allow decision makers to best focus their response efforts immediately following an actual earthquake.Here are some facts about the ShakeOut and the two JPL projects participating in the exercise:The Great California ShakeOut is an annual earthquake response drill organized by the Southern California Earthquake Center, the U.S. Geological Survey, the National Science Foundation, the Federal Emergency Management Agency, the California Emergency Management Agency and the California Earthquake Authority. It is designed to raise general public awareness of our ever-present earthquake hazard and allow emergency responders the opportunity to review and test their response plans.The California Earthquake Clearinghouse is a consortium of organizations and individuals that meets in the event of a major earthquake to facilitate the collection and dissemination of information and best apply available expertise and resources. It also serves as a centralized location and information repository and as a temporary umbrella organization to facilitate disaster response.Two JPL projects participated in the ShakeOut exercise on Oct 17. E-DECIDER (Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) and ARIA (Advanced Rapid Imaging and Analysis) are partnering with the California Earthquake Clearinghouse to provide disaster response data to decision makers to help them focus response efforts immediately following an earthquake. Personnel from both projects participated in a tabletop demonstration of a simulated earthquake. Clearinghouse participants used various decision-making tools and data products developed by projects like E-DECIDER and ARIA in real-time to test their capabilities in preparation for an actual earthquake. Although this exercise was part of the overall Oct. 18 ShakeOut activity, it was a separate tabletop exercise/demonstration.E-DECIDER (Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) is a NASA Applied Sciences-funded project developed by JPL in partnership with the U.S. Geological Survey, Indiana University and UC Davis. It provides tools for earthquake disaster management and response using remote sensing data and NASA earthquake modeling software. 
The project delivers mapped data products through Web and mobile applications for ease-of-use by decision makers, providing information for both long-term planning and identification of areas where emergency response should be focused in the event of an earthquake disaster.E-DECIDER tools and products help facilitate coordination of decision making and planning for emergency managers and responders by transforming and distributing NASA Earth science data, modeling results and other remote sensing data into a standardized, easy-to-use format.E-DECIDER has a co-investigator at the U.S. Geological Survey, and the project is partnering with the California Earthquake Clearinghouse. In the future, the project envisions that agencies such as the Department of Homeland Security, Federal Emergency Management Agency, California Emergency Management Agency, California Geological Survey and other decision-making agencies and responders may use E-DECIDER tools and products.ARIA (Advanced Rapid Imaging and Analysis) is a JPL- and NASA-funded project under development by JPL and Caltech in partnership with the U.S. Geological Survey. It is building an automated system for providing rapid and reliable GPS and satellite data to support the local, national and international hazard monitoring and response communities. By imaging disasters from space, ARIA data products can provide rapid assessments of the geographic region impacted by a disaster, as well as detailed imaging of the locations where damage occurs.ARIA will provide space-based imagery data products that can help with pre-disaster event monitoring and post-disaster event situational awareness. ARIA will also integrate these space-based products with ground-based seismic data and modeling to deliver science products for earthquakes and other natural hazards, such as volcanoes.ARIA is currently developing systems to deliver data products to the U.S. Geological Survey's earthquake and volcano hazard programs. The project foresees agencies such as the California Geological Survey, the Office of Foreign Disaster Assistance at USAID, the World Bank and the Department of Homeland Security using ARIA data products in the future.E-DECIDER and ARIA provided numerous information products to the California Earthquake Clearinghouse through the Department of Homeland Security Unified Incident Command and Decision Support system and its Web and mobile interfaces. E-DECIDER products included results from models of anticipated ground deformation that show the amount of displacement expected to occur as a result of the simulated earthquake. The model results are presented in KML (Google Earth) format, allowing the data to be overlaid on maps of critical infrastructure and census information that allow responders to quickly identify who and what might have been affected by the simulated earthquake. Additionally, E-DECIDER provided an aftershock forecast that displayed areas with high likelihoods of aftershocks.ARIA products include radar imagery of damage caused by the simulated earthquake, called a damage proxy map. This map was annotated by experts to identify locations for follow-up visits by responders in the field. ARIA also provided simulated GPS movement data from the simulated earthquake to improve the E-DECIDER model.NASA plans to make E-DECIDER available for operational use by its partners, hopefully within the next year. 
Its decision support products and mobile and Web interfaces would be used by organizations such as the California Geological Survey, California Emergency Management Agency, Federal Emergency Management Agency, the U.S. Geological Survey and other decision making agencies in the event of an earthquake disaster.ARIA is currently building a prototype data system and plans to provide earthquake response-related data products routinely within two years. Upcoming plans include adding data products for monitoring ground movements and for responding to volcanic eruptions. | Earthquakes | 2,012 |
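The E-DECIDER write-up above mentions that model results are delivered as KML so they can be draped over infrastructure and census maps in tools like Google Earth. As a rough illustration of that packaging step only -- the point coordinates, field names and displacement values below are invented, and E-DECIDER's real products are far richer -- a few lines of Python are enough to wrap modeled values in minimal KML:

```python
# Minimal sketch: wrap hypothetical modeled ground-displacement values in KML
# so they can be overlaid in Google Earth. All values and names are made up
# for illustration; this is not E-DECIDER's actual product format.
from xml.sax.saxutils import escape

# (longitude, latitude, modeled vertical displacement in meters) -- hypothetical
model_points = [
    (-118.20, 34.05, 0.42),
    (-118.10, 34.10, 0.31),
    (-118.00, 34.15, 0.12),
]

placemarks = []
for lon, lat, dz in model_points:
    name = escape(f"Modeled uplift {dz:.2f} m")
    placemarks.append(
        f"  <Placemark><name>{name}</name>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
    )

kml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
    " <Document>\n" + "\n".join(placemarks) + "\n </Document>\n</kml>\n"
)

with open("deformation_model.kml", "w") as f:
    f.write(kml)
```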
October 18, 2012 | https://www.sciencedaily.com/releases/2012/10/121018102956.htm | World's largest subwoofer: Earthquakes 'pump' ground to produce infrasound | Earthquakes sway buildings, buckle terrain, and rumble -- both audibly and in infrasound, frequencies below the threshold of human hearing. New computer modeling by a team of researchers indicates that most of the low-frequency infrasound comes from an unexpected source: the actual "pumping" of Earth's surface. The researchers confirmed their models by studying data from an actual earthquake. | "It's basically like a loudspeaker," said Stephen Arrowsmith, a researcher with the Geophysics Group at Los Alamos National Laboratory in Santa Fe, N.M., who presents his team's findings at the 164th meeting of the Acoustical Society of America (ASA), held Oct. 22 -- 26 in Kansas City, Missouri. "In much the same way that a subwoofer vibrates air to create deep and thunderous bass notes, earthquakes pump and vibrate the atmosphere producing sounds below the threshold of human hearing." Infrasound can reveal important details about an earthquake. In particular, it may be used to estimate the amount of ground shaking in the immediate region above the source, which would normally require an array of many seismometers to measure. There is therefore potential to use infrasound to assess damage in the immediate aftermath of an earthquake. To better understand the relationship between earthquakes and infrasound, the researchers used the basic idea that Earth's surface above the earthquake pumps the atmosphere like a piston. They were then able to apply the same modeling approach used on loudspeaker dynamics. The researchers tested their model by comparing its predictions to actual data collected from a magnitude-4.6 earthquake that occurred on January 3, 2011, in Circleville, Utah. The University of Utah maintains seismograph stations across the state supplemented with infrasound sensors, which recorded the infrasound produced during that event. Their predictions were in good agreement with the actual data, suggesting that earthquakes generate most of their sound by pumping the atmosphere like a loudspeaker. "This was very exciting because it is the first such clear agreement in infrasound predictions from an earthquake," said Arrowsmith. "Predicting infrasound is complex because winds can distort the signal, and our results also suggest we are getting better at correcting for wind effects." Until now, seismologists have not understood the relative importance of the simple pumping of the ground versus other mechanisms for generating infrasound. Additional members of the research team include Relu Burlacu and Kristine Pankow, University of Utah; Brian Stump and Chris Haward, Southern Methodist University; and Richard Stead and Rod Whitaker, Los Alamos National Laboratory. | Earthquakes | 2,012
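Arrowsmith's subwoofer analogy invites a back-of-the-envelope estimate. If the shaking patch of ground is treated as a baffled piston (an acoustic monopole radiating into a half-space), the far-field pressure amplitude is roughly p = rho * S * omega * V / (2 * pi * r). The sketch below evaluates that textbook relation with assumed source dimensions and ground velocity; it only shows the scaling and is not the Los Alamos team's model, which also accounts for winds and propagation effects:

```python
import math

# Illustrative (assumed) numbers -- not taken from the study.
rho_air = 1.2          # air density, kg/m^3
patch_area = 1.0e8     # vibrating ground patch, m^2 (~10 km x 10 km)
ground_vel = 1.0e-3    # peak vertical ground velocity, m/s
freq = 0.5             # dominant frequency of the ground motion, Hz
distance = 100.0e3     # receiver distance, m

omega = 2.0 * math.pi * freq
# Baffled-piston / monopole far-field pressure amplitude:
#   p = rho * S * omega * V / (2 * pi * r)   (radiation into a half-space)
pressure = rho_air * patch_area * omega * ground_vel / (2.0 * math.pi * distance)

print(f"Estimated infrasound pressure amplitude: {pressure:.3f} Pa "
      f"at {distance/1e3:.0f} km")
```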
October 3, 2012 | https://www.sciencedaily.com/releases/2012/10/121003094637.htm | City of Ottawa, Canada, sits atop soil, geologic features that amplify seismic waves | Engineers and city planners study surface geology in order to construct buildings that can respond safely to earthquakes. Soft soil amplifies seismic waves, resulting in stronger ground motion than for sites built over bedrock. This study examines the local site response for the city of Ottawa, and the results indicate local soils may amplify ground motion more than expected or referenced in the National Building Code of Canada. | Current knowledge of the earthquake activity in the Ottawa area is based on fewer than 200 years of reported felt events and approximately 100 years of instrumental recordings. While the area has experienced moderate shaking from earthquakes in the range of M 5.2 -- 6.2 during this time, historical accounts suggest certain parts of the city have experienced higher levels of ground motion than others during the larger earthquakes. There is also evidence of devastating prehistoric earthquakes, which caused widespread landslides, sediment deformation and liquefaction. The area's geological structure complicates site response analyses. Roughly 20 percent of the Ottawa area is built on bedrock, while the remaining area contains unconsolidated surface deposits. In this study, the authors reconfirmed the unusually large seismic amplification values for weak motion, prompting an extensive site response analysis as part of seismic microzonation studies for the entire city. | Earthquakes | 2,012
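Two standard quantities behind statements like the Ottawa result are the impedance-contrast amplification of a soft layer over bedrock and that layer's resonant frequency. The snippet below evaluates both for assumed, illustrative soil and rock properties; the actual amplification values for Ottawa come from the measurements and microzonation analysis described in the article, not from these numbers:

```python
import math

# Assumed, illustrative properties (not the study's measured values).
vs_soil = 150.0      # shear-wave velocity of the soft soil, m/s
rho_soil = 1600.0    # soil density, kg/m^3
vs_rock = 2500.0     # shear-wave velocity of bedrock, m/s
rho_rock = 2600.0    # rock density, kg/m^3
thickness = 30.0     # soft-layer thickness, m

# Quarter-wavelength (impedance-contrast) approximation for weak-motion amplification:
amplification = math.sqrt((rho_rock * vs_rock) / (rho_soil * vs_soil))

# Fundamental resonant frequency of the soft layer: f0 = Vs / (4 H)
f0 = vs_soil / (4.0 * thickness)

print(f"Impedance-contrast amplification: ~{amplification:.1f}x")
print(f"Fundamental site frequency:       ~{f0:.2f} Hz")
```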
October 2, 2012 | https://www.sciencedaily.com/releases/2012/10/121002091913.htm | Geofoam protects pipelines from earthquakes | Lightweight and stiff as a board, a plastic foam material is being used to protect Utah's natural gas pipelines from rupturing during earthquakes. | "If an earthquake occurs, high-pressure gas lines are one of the most important items to protect," says Steven Bartlett, associate professor of civil engineering at the University of Utah. "If they rupture and ignite, you essentially have a large blowtorch, which is catastrophic."Bartlett has partnered with natural-gas company Questar to use large expanded polystyrene blocks called "geofoam" as a compressible, protective cover for natural gas pipelines buried underground."This low-impact technology has an advantage in urban environments, particularly if you need to realign already buried structures such as gas lines or utilities without affecting adjacent buildings or other facilities," says Bartlett.Geofoam has been used for decades in Europe, North America and Asia to lighten loads under roads and reduce settlement. One-hundredth the weight of soil with similar strength, geofoam blocks reduce construction time and don't erode or deteriorate.Bartlett previously researched the design and use of geofoam as a lightweight road embankment in the Interstate-15 reconstruction project through the Salt Lake Valley a decade ago, and more recently in the TRAX light rail line that opened last year to serve West Valley City, Utah. Geofoam currently is being used in the TRAX extension to the airport.Questar -- which provides natural gas to almost 900,000 customers in Utah, southwestern Wyoming and southeastern Idaho -- is using geofoam in lightweight covers for minimizing damage to natural gas pipelines caused by severe earthquakes."Most pipelines are designed to withstand some ground shaking, but not several feet of sudden fault offset that may occur in a major earthquake," says Bartlett. "When a fault breaks, it occurs in milliseconds. It is an extreme event. The problem Questar faced was, how could a buried pipeline survive that offset?"Geologists expect that when a major earthquake strikes the Wasatch fault zone in the Salt Lake Valley, a fault rupture likely will make the valley drop down relative to the mountains. As the valley drops down, a buried pipeline would start to lift up. However, most buried pipelines lie under six to eight feet of compacted soil. This weight becomes too much for a pipe to bear, causing it to rupture, Bartlett says.Numerical simulations of earthquake fault ruptures performed by Bartlett and his students show a geofoam-protected pipeline on the valley side of the Salt Lake City segment of the Wasatch fault could withstand up to four times more vertical force than traditional soil cover.Based on Bartlett's experience with geofoam, Questar asked him to develop a strategy for protecting buried pipelines crossing earthquake faults in urban areas, such as 3300 South, an arterial street in the Salt Lake Valley."In this situation, we had to put the pipeline right down the center of the roadway. When we looked at what other countries did, they built a trapezoidal geometry above the pipe -- basically just a wedge," says Bartlett.Such a wedge would require many blocks of foam and would disrupt a large section of road, Bartlett says. "This would be a major problem in an urban area, as you might have to tear up 20 feet of lateral roadway. 
Try to do that for 3300 South -- you'd have to shut the whole road down."Rather than gut a major thoroughfare, Bartlett proposed a "slot trench" design in which a block of geofoam is placed in a narrow trench between a pipeline and the pavement above. In this design, if the pipeline begins to lift up, it will displace the geofoam block and compress it. Although geofoam is solid, it contains tiny air pockets that can compress without sacrificing the material's overall integrity. As the geofoam is compressed further, it will slide upward along the trench sidewalls and could eventually damage the pavement above. However, says Bartlett, the pipeline will remain intact and essentially undamaged.Since the 3300 South project, Questar has been installing geofoam to protect other natural gas pipelines in the valley.In addition, Bartlett and colleagues at the University of Memphis and University of Illinois at Urbana-Champaign are investigating geofoam to help new buildings withstand earthquakes. When a building shakes during an earthquake, says Bartlett, soil adjacent to the building puts additional pressures on its walls as it tries to move back and forth.By placing a geofoam buffer between a building's walls and neighboring soil, it can sway without experiencing additional pressures. The geofoam, which deforms in a controlled manner when placed against a structure, can reduce earthquake pressures by 30 to 50 percent, according to Bartlett's calculations. This also reduces the amount of steel and reinforcing concrete needed to protect the building from earthquake damage.Compared with compacted soil, geofoam is competitive when total construction costs are considered, Bartlett says. What's more, geofoam requires typical road embankment construction times of one month, compared with 12 to 15 months using traditional methods."When there are sensitive utilities involved, seismic stresses or time is a factor, this technology wins hands down," says Bartlett. | Earthquakes | 2,012 |
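A quick way to see why the slot-trench design helps is to compare the weight of cover a pipe would have to lift through soil versus geofoam. The numbers below are assumptions for illustration (typical unit weights and a guessed trench geometry), not inputs from Bartlett's simulations, which also treat fault-offset dynamics and sidewall friction:

```python
# Compare the dead load of cover material a buried pipe must lift upward,
# per meter of pipe length, for compacted soil versus EPS geofoam cover.
# All values are illustrative assumptions, not the study's inputs.

trench_width = 0.6        # m
cover_depth = 2.0         # m of material above the pipe
gamma_soil = 19.0         # unit weight of compacted soil, kN/m^3
gamma_geofoam = 0.2       # unit weight of EPS geofoam, kN/m^3 (~1/100 of soil)

def cover_load(gamma, width=trench_width, depth=cover_depth):
    """Weight of cover per meter of pipe (kN/m), ignoring sidewall friction."""
    return gamma * width * depth

soil_load = cover_load(gamma_soil)
foam_load = cover_load(gamma_geofoam)

print(f"Soil cover load:    {soil_load:.1f} kN per meter of pipe")
print(f"Geofoam cover load: {foam_load:.2f} kN per meter of pipe")
print(f"Reduction factor:   ~{soil_load / foam_load:.0f}x")
```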
September 26, 2012 | https://www.sciencedaily.com/releases/2012/09/120926133105.htm | Large 2012 earthquake triggered temblors worldwide for nearly a week | This year's largest earthquake, a magnitude 8.6 temblor on April 11 centered in the East Indian Ocean off Sumatra, did little damage, but it triggered quakes around the world for at least a week, according to a new analysis by seismologists from the University of California, Berkeley, and the U.S. Geological Survey (USGS). | The April 11 quake was unusually large -- the tenth largest in the last 100 years -- and, similar to a few other recent large quakes, it triggered small quakes during the three hours it took for seismic waves to travel through Earth's crust. The new study shows, however, that some faults weren't rattled enough by the seismic waves to fail immediately, but were primed to break up to six days later. The findings are a warning to those living in seismically active regions worldwide that the risk from a large earthquake could persist -- even on the opposite side of the globe -- for more than a few hours, the experts said. "Until now, we seismologists have always said, 'Don't worry about distant earthquakes triggering local quakes,'" said Roland Burgmann, professor of earth and planetary science at UC Berkeley and coauthor of the study. "This study now says that, while it is very rare -- it may only happen every few decades -- it is a real possibility if the right kind of earthquake happens." "We found a lot of big events around the world, including a 7.0 quake in Baja California and quakes in Indonesia and Japan, that created significant local shaking," Burgmann added. "If those quakes had been in an urban area, it could potentially have been disastrous." Burgmann and Fred F. Pollitz, Ross S. Stein and Volkan Sevilgen of the USGS will report their results online on Sept. 26 in advance of publication in the journal. Burgmann, Pollitz, a research seismologist, and their colleagues also analyzed earthquake occurrences after five other recent temblors larger than 8.5 -- including the deadly 9.2 Sumatra-Andaman quake in 2004 and the 9.0 Tohoku quake that killed thousands in Japan in 2011 -- but saw only a very modest increase in global earthquake activity after these quakes. They said this could be because the East Indian Ocean quake was a "strike-slip" quake that more effectively generates waves, called Love waves, that travel just under the surface and are energetic enough to affect distant fault zones. Burgmann explained that most large quakes take place at subduction zones, where the ocean bottom sinks below another tectonic plate. This was the origin of the Sumatra-Andaman quake, which produced a record tsunami that took more than 200,000 lives. The 2012 East Indian Ocean quake involved lateral movement -- referred to as strike-slip, the same type of movement that occurs along California's San Andreas Fault -- and was the largest strike-slip quake ever recorded. "This was one of the weirdest earthquakes we have ever seen," Burgmann said. "It was like the 1906 San Francisco earthquake, a strike-slip event, but it was huge -- 15 times more energetic.
This earthquake and an 8.3 that followed were in a very diffuse zone in an oceanic plate close to the Sumatra subduction zone, but it wasn't a single fault that produced the quake, it was a crisscrossing of three or four faults that all ruptured in sequence to make such a big earthquake, and they ruptured deep." The seismologists' analysis found five times the expected number of quakes during the six days following the April 11 quake and aftershock. An unusually low occurrence of quakes during the 6-12 days before that 8.6 quake may have accentuated the impact, possibly because there were many very-close-to-failure faults sensitive to a triggering shock wave, Pollitz said. One possible mechanism for the delayed action, Burgmann said, is that the East Indian Ocean quake triggered a cascade of smaller, undetectable quakes on these faults that led to larger ruptures later on. Alternatively, large quakes could trigger nearly undetectable tremors or microquakes that are a sign of slow slip underground. "One possibility is that the earthquake immediately triggers slow slip in some places, maybe accompanied by detectable tremor, and then that runs away into a bigger earthquake," Burgmann speculated. "Some slow slip events take days to a week or more to evolve." The work was supported by the USGS. | Earthquakes | 2,012
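The statement that the six days after the mainshock held "five times the expected number" of remote quakes is, at bottom, a comparison of an observed count against a Poisson expectation from the background rate. The toy calculation below shows the form of such a comparison with invented numbers (a hypothetical background of 10 qualifying quakes per six-day window); the study's actual magnitude thresholds and catalog are different:

```python
import math

# Hypothetical numbers for illustration only.
background_rate = 10    # expected qualifying quakes per 6-day window
observed = 50           # observed count in the 6 days after the mainshock

def poisson_tail(k, lam, extra_terms=200):
    """P(N >= k) for a Poisson(lam) count, summing the upper tail directly."""
    # First term P(N = k), then recur: P(N = i+1) = P(N = i) * lam / (i + 1).
    term = math.exp(-lam) * lam**k / math.factorial(k)
    total = 0.0
    for i in range(k, k + extra_terms):
        total += term
        term *= lam / (i + 1)
    return total

rate_ratio = observed / background_rate
p_value = poisson_tail(observed, background_rate)

print(f"Observed/expected rate ratio: {rate_ratio:.1f}x")
print(f"Chance of >= {observed} events if nothing changed: {p_value:.2e}")
```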
September 26, 2012 | https://www.sciencedaily.com/releases/2012/09/120926132610.htm | Magnitude-8.7 quake was part of crustal plate breakup | Seismologists have known for years that the Indo-Australian plate of Earth's crust is slowly breaking apart, but they saw it in action last April when at least four faults broke in a magnitude-8.7 earthquake that may be the largest of its type ever recorded. | The great Indian Ocean quake of April 11, 2012, previously was reported as magnitude 8.6, and the new estimate means the quake was 40 percent larger than had been believed, scientists from the University of Utah and the University of California, Santa Cruz, report in the Sept. 27 issue of the journal. The quake was caused by at least four undersea fault ruptures southwest of Sumatra, Indonesia, within a 2-minute, 40-second period. It killed at least two people, and eight others died from heart attacks. The quake was felt from India to Australia, including throughout South Asia and Southeast Asia. If the four ruptures were considered separate quakes, their magnitudes would have been 8.5, 7.9, 8.3 and 7.8 on the "moment magnitude" scale used to measure the largest quakes, the scientists report. The 8.7 main shock broke three faults that were parallel but offset from each other -- known as en echelon faults -- and a fourth fault that was perpendicular to and crossed the first fault. The new study concludes that the magnitude-8.7 quake and an 8.2 quake two hours later were part of the breakup of the Indian and Australian subplates along a yet-unclear boundary beneath the Indian Ocean west of Sumatra and southeast of India -- a process that started roughly 50 million years ago and that will continue for millions more. "We've never seen an earthquake like this," says study co-author Keith Koper, an associate professor of geophysics and director of the University of Utah Seismograph Stations. "This is part of the messy business of breaking up a plate. … This is a geologic process. It will take millions of years to form a new plate boundary and, most likely, it will take thousands of similar large quakes for that to happen." All four faults that broke in the 8.7 quake and the fifth fault that ruptured in the 8.2 quake were strike-slip faults, meaning ground on one side of the fault moves horizontally past ground on the other side. The great quake of last April 11 "is possibly the largest strike-slip earthquake ever seismically recorded," although a similar size quake in Tibet in 1950 was of an unknown type, according to the new study, which was led by two University of California, Santa Cruz, seismologists: graduate student Han Yue and Thorne Lay, a professor of Earth and planetary sciences. The National Science Foundation funded the study. The 8.7 jolt also "is probably the largest intraplate [within a single tectonic plate of Earth's crust] ever seismically recorded," Lay, Yue and Koper add. Most of Earth's earthquakes occur at existing plate boundaries. The researchers cannot be certain the April great quake was the largest intraplate quake or the largest strike-slip quake because "we are comparing it against historic earthquakes long before we had modern seismometers," says Koper. Koper says the 2012 quakes likely were triggered, at least in part, by changes in crustal stresses caused by the magnitude-9.1 Sumatra-Andaman earthquake of Dec.
26, 2004 -- a jolt that generated massive tsunamis that killed most of the 228,000 victims in the Indian Ocean region. The fact that the 8.7 and 8.2 quakes were generated by horizontal movements along seafloor strike-slip faults -- not by vertical motion along thrust faults -- explains why they didn't generate major tsunamis. The 8.7 quake caused small tsunamis, the largest of which measured about 12 inches in height at Meulaboh, Indonesia, according to the U.S. Geological Survey. Without major tsunamis, the great earthquake caused "very little damage and death, especially for this size of an earthquake, because it happened in the ocean and away from coastlines," and on strike-slip faults, says Koper. The researchers studied the quake using a variety of methods to analyze the seismic waves it generated. Because the same data can be interpreted in various ways, Koper says it is conceivable that more than four fault segments broke during the 8.7 quake -- conceivably five or even six -- although four fault ruptures is most likely. The Indo-Australian plate is breaking into two or perhaps three pieces (some believe a Capricorn subplate is separating from the west side of the Indian subplate). The magnitude-8.7 and 8.2 great quakes on April 11 occurred over a broad area where the Indian and Australian subplates are being sheared apart. "What we're seeing here is the Indo-Australian plate fragmenting into two separate plates," says Lay. The breakup of the northeast-moving Indo-Australian plate is happening because it is colliding with Asia in the northwest, which slows down the western part of the plate, while the eastern part of the plate continues moving more easily by diving or "subducting" under the island of Sumatra to the northeast. The subduction zone off Sumatra caused the catastrophic 2004 magnitude-9.1 quake and tsunami. Seismic analysis shows the April 11 quakes "involve rupture of a very complex network of faults, for which we have no documented precedent in recorded seismic history," the researchers write. The analysis revealed this sequence for the fault ruptures that generated the 8.7 quake, and the estimated fault rupture lengths and slippage amounts: -- The quake began with the 50-second rupture of a fault extending west-northwest to east-southeast, with an epicenter a few hundred miles southwest of Sumatra. The fault ruptured along a roughly 90-mile length, breaking "bilaterally" both west-northwestward and east-southeastward, and also at least 30 miles deep, "almost ripping through the whole plate," Koper says. The seafloor on one side of the fault slipped about 100 feet past the seafloor on the fault's other side. -- The second fault, which slipped about 25 feet, began to rupture 40 seconds after the quake began. This rupture extended an estimated 60 miles to 120 miles north-northeast to south-southwest -- perpendicular to the first fault and crossing it. -- The third fault was parallel to the first fault and about 90 miles to the southwest of it. It started breaking 70 seconds after the quake began and ruptured along a length of about 90 miles. This fault slipped about 70 feet. -- The fourth fault paralleled the first and third faults, but was to the northwest of both of them. It began to rupture 145 seconds after the quake began and continued to do so for 15 seconds until the quake ended after a total time of 2 minutes and 40 seconds. The fault rupture was roughly 30 miles to 60 miles long. The ground on one side of this fault slipped about 20 feet past ground on the other side.
| Earthquakes | 2,012 |
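The point in the article above that sub-events of magnitude 8.5, 7.9, 8.3 and 7.8 combine into a single magnitude-8.7 earthquake follows from moment magnitude being logarithmic in seismic moment: the moments add, not the magnitudes. The short check below uses the standard conversion log10(M0) = 1.5*Mw + 9.1 (M0 in newton-meters) and reproduces the composite value; it is only arithmetic, not the seismological analysis itself:

```python
import math

def moment_from_mw(mw):
    """Seismic moment M0 in N*m from moment magnitude Mw."""
    return 10.0 ** (1.5 * mw + 9.1)

def mw_from_moment(m0):
    """Moment magnitude Mw from seismic moment M0 in N*m."""
    return (math.log10(m0) - 9.1) / 1.5

sub_event_magnitudes = [8.5, 7.9, 8.3, 7.8]   # the four fault ruptures
total_moment = sum(moment_from_mw(mw) for mw in sub_event_magnitudes)

print(f"Total seismic moment: {total_moment:.2e} N*m")
print(f"Composite magnitude:  Mw {mw_from_moment(total_moment):.1f}")
```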
September 9, 2012 | https://www.sciencedaily.com/releases/2012/09/120909150348.htm | Giant 'balloon of magma' inflates under Santorini's volcano | The chamber of molten rock beneath Santorini's volcano expanded 10-20 million cubic metres -- up to 15 times the size of London's Olympic Stadium -- between January 2011 and April 2012, according to a new survey carried out by an international team led by Oxford University and including a scientist from the University of Bristol. | The research is reported in this week's issue of the journal. The growth of this 'balloon' of magma has seen the surface of the island rise 8-14 centimetres during this period, the researchers found. The results come from an expedition, funded by the UK's Natural Environment Research Council, which used satellite radar images and Global Positioning System (GPS) receivers that can detect movements of Earth's surface of just a few millimetres. The findings are helping scientists to understand more about the inner workings of the volcano, which had its last major explosive eruption 3,600 years ago, burying the islands of Santorini under metres of pumice. However, the work still does not provide an answer to the biggest question of all: 'When will the volcano next erupt?' In January 2011, a series of small earthquakes began beneath the islands of Santorini. Most were so small they could only be detected with sensitive seismometers, but it was the first sign of activity beneath the volcano to be detected for 25 years. Following the earthquakes, Michelle Parks, an Oxford University DPhil student, spotted signs of movement of Earth's surface on Santorini in satellite radar images. Oxford University undergraduate students then helped researchers complete a new survey of the island. Michelle Parks of Oxford University's Department of Earth Sciences, an author of the paper, said: "During my field visits to Santorini in 2011, it became apparent that many of the locals were aware of a change in the behaviour of their volcano. The tour guides, who visit the volcano several times a day, would update me on changes in the amount of strong-smelling gas being released from the summit, or changes in the colour of the water in some of the bays around the islands. "On one particular day in April 2011, two guides told me they had felt an earthquake while they were on the volcano and that the motion of the ground had actually made them jump. Locals working in restaurants on the main island of Thera became aware of the increase in earthquake activity due to the vibration and clinking of glasses in their bars…." Co-author Dr Juliet Biggs of the University of Bristol's School of Earth Sciences said: "People were obviously aware that something was happening to the volcano, but it wasn't until we saw the changes in the GPS, and the uplift on the radar images, that we really knew that molten rock was being injected at such a shallow level beneath the volcano. Many volcanologists study the rocks produced by old eruptions to understand what happened in the past, so it's exciting to use cutting-edge satellite technology to link that to what's going on in the volcanic plumbing system right now." Co-author Professor David Pyle of Oxford University's Department of Earth Sciences said: "For me, the challenge of this project is to understand how the information on how the volcano is behaving right now can be squared with what we thought we knew about the volcano, based on the studies of both recent and ancient eruptions.
There are very few volcanoes where we have such detailed information about their past history."The team calculate that the amount of molten rock that has arrived beneath Santorini in the past year is the equivalent of about 10-20 years growth of the volcano. But this does not mean that an eruption is about to happen: in fact the rate of earthquake activity has dropped off in the past few months. | Earthquakes | 2,012 |
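A common first-order way to relate a volume change at depth to uplift at the surface -- the kind of relation that GPS and radar surveys are inverted against -- is the Mogi point-source model for an elastic half-space. The sketch below evaluates it for the reported 10-20 million cubic metre volume change and an assumed source depth of about 5 km; the depth and Poisson's ratio here are illustrative guesses, not values derived by the Oxford-led team, whose actual modeling may differ:

```python
import math

def mogi_uplift(delta_v, depth, r=0.0, nu=0.25):
    """Vertical surface displacement (m) above a Mogi point source.

    delta_v: volume change of the source (m^3)
    depth:   source depth below the surface (m)
    r:       horizontal distance from the point above the source (m)
    nu:      Poisson's ratio of the half-space
    """
    return (1.0 - nu) * delta_v * depth / (math.pi * (depth**2 + r**2) ** 1.5)

depth = 5000.0  # assumed source depth, m (illustrative)
for delta_v in (10e6, 20e6):  # reported range of magma volume change, m^3
    uz = mogi_uplift(delta_v, depth)
    print(f"dV = {delta_v/1e6:.0f} million m^3  ->  peak uplift ~ {uz*100:.0f} cm")
```

With these assumed values the predicted peak uplift comes out near the 8-14 cm range reported in the article, which is the sense in which such a source can "explain" the observed deformation.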
September 7, 2012 | https://www.sciencedaily.com/releases/2012/09/120907095710.htm | Exploration drilling to monitor earthquakes in the Istanbul area | Today the drilling starts for a seismic monitoring network on the Marmara Sea near Istanbul. Specially designed seismic sensors in eight boreholes on the outskirts of Istanbul and around the eastern Marmara Sea will monitor the seismic activity of the region with high precision. In each of the 300-meter-deep holes, several borehole seismometers will be permanently installed at various depths. These detect even barely perceptible earthquakes with very small magnitudes at high resolution and can thus provide information about the earthquake rupture processes associated with them. | To determine and monitor the seismic hazard of the region and the processes occurring in the fault zone beneath the Marmara Sea off Istanbul with the latest earthquake monitoring technology, the GONAF plate boundary observatory (Geophysical Observatory at the North Anatolian Fault) was set up under the auspices of the GFZ German Research Centre for Geosciences. "Istanbul, with its more than 13 million inhabitants, is located in a region that is extremely vulnerable to earthquakes. A high probability of a strong earthquake of magnitude up to 7.4 is assumed for the region," explains Professor Georg Dresen from the GFZ, one of the organizers of the GONAF project. "Data from small earthquakes in the region that are measured in the boreholes can provide important information about the processes before a major earthquake." The data are continuously transmitted in real time to Potsdam and Ankara and evaluated there. A particular difficulty is that the earthquake zone to be monitored lies under the seabed of the Marmara Sea, about 20 kilometers off Istanbul. Only monitoring below ground in boreholes ensures the required precision of the measurements, due to the much lower noise level. "This means we have to get as close as possible to the quake source region," explains GFZ researcher Professor Marco Bohnhoff, director of the project. "With our new, specially developed borehole seismometers, the ratio of signal to background noise can be improved by at least a factor of 10, which yields a much higher resolution." The project involves close cooperation with the Disaster and Emergency Management Presidency of Turkey (AFAD). The drilling is implemented as part of the International Continental Scientific Drilling Program (ICDP). Engineers and scientists at the GFZ supervise the construction and installation activities. Upon successful completion and handover of the fully equipped pilot borehole on the Tuzla peninsula just off Istanbul, a first test phase will commence before the remaining seven wells are drilled. "An earthquake prediction is not the goal of the project," clarifies Marco Bohnhoff. "Earthquake prediction is still not possible. But the data our project gathers on the seismic activity before, during and after the expected strong quake will mean a great advance in the study of earthquakes." | Earthquakes | 2,012
September 4, 2012 | https://www.sciencedaily.com/releases/2012/09/120904100857.htm | Smoking and natural disasters: Christchurch residents increase tobacco consumption post-earthquake | The prevalence of smoking in Christchurch, New Zealand, increased following the 2010 earthquake, according to a new study. | The results of the study will be presented today (4 September 2012) at the European Respiratory Society's Annual Congress in Vienna.The 7.1-magnitude Christchurch earthquake, and subsequent aftershocks, have caused a huge amount of damage and dramatically changed the social, working and living conditions for residents in the city.To investigate the effects of the disaster on smoking levels, researchers from the Canterbury District Health Board, New Zealand, carried out interviews with 1,001 residents 15 months after the first earthquake. Participants were asked about their smoking habits before and after the earthquake.The results showed that prior to the earthquake in August 2010, 319 people were not smoking at this time. Of this group, 76 people had smoked at least once after the earthquake, with 29 people from this group having more than 100 cigarettes since September 2010.Of the 273 people who were smoking in August 2010, 93 had increased their consumption of tobacco. 53 people in this group attributed this increase to the earthquake and the subsequent changes in lifestyle.Professor Lutz Beckert, from the Canterbury District Health Board, said: "Increased levels of smoking were found in Christchurch residents after the earthquake. 28% of people who were not smoking prior to the earthquake picked up the habit following the quakes. This suggests that exposure to trauma, such as a natural disaster, can prompt people to start smoking as they believe it is a valid way to deal with their anxiety over their experiences and coping for changes in lifestyle."It is important for healthcare professionals to be aware of this increased risk in the aftermath of a disaster, such as the Christchurch earthquake, so that they can be ready to provide the necessary support to residents before they turn to cigarettes." | Earthquakes | 2,012 |
August 31, 2012 | https://www.sciencedaily.com/releases/2012/08/120831145222.htm | Earthquake hazards map study finds deadly flaws | Three of the largest and deadliest earthquakes in recent history occurred where earthquake hazard maps didn't predict massive quakes. A University of Missouri scientist and his colleagues recently studied the reasons for the maps' failure to forecast these quakes. They also explored ways to improve the maps. Developing better hazard maps and alerting people to their limitations could potentially save lives and money in areas such as the New Madrid, Missouri, fault zone. | "Forecasting earthquakes involves many uncertainties, so we should inform the public of these uncertainties," said Mian Liu, of MU's department of geological sciences. "The public is accustomed to the uncertainties of weather forecasting, but foreseeing where and when earthquakes may strike is far more difficult. Too much reliance on earthquake hazard maps can have serious consequences. Two suggestions may improve this situation. First, we recommend a better communication of the uncertainties, which would allow citizens to make more informed decisions about how to best use their resources. Second, seismic hazard maps must be empirically tested to find out how reliable they are and thus improve them." Liu and his colleagues suggest testing maps against what is called a null hypothesis, the possibility that the likelihood of an earthquake in a given area -- like Japan -- is uniform. Testing would show which mapping approaches were better at forecasting earthquakes and subsequently improve the maps. Liu and his colleagues at Northwestern University and the University of Tokyo detailed how hazard maps had failed in three major quakes that struck within a decade of each other. The researchers interpreted the shortcomings of hazard maps as the result of bad assumptions, bad data, bad physics and bad luck. Wenchuan, China -- In 2008, a quake struck China's Sichuan Province and cost more than 69,000 lives. Locals blamed the government and contractors for not making buildings in the area earthquake-proof, according to Liu, who says that hazard maps bear some of the blame as well, since the maps, based on bad assumptions, had designated the zone as an area of relatively low earthquake hazard. Léogâne, Haiti -- The 2010 earthquake that devastated Port-au-Prince and killed an estimated 316,000 people occurred along a fault that had not caused a major quake in hundreds of years. Using only the short history of earthquakes since seismometers were invented approximately one hundred years ago yielded hazard maps that didn't indicate the danger there. Tōhoku, Japan -- Scientists previously thought the faults off the northeast coast of Japan weren't capable of causing massive quakes and thus giant tsunamis like the one that destroyed the Fukushima nuclear reactor. This bad understanding of particular faults' capabilities led to a lack of adequate preparation. The area had been prepared for smaller quakes and the resulting tsunamis, but the Tōhoku quake overwhelmed the defenses. "If we limit our attention to the earthquake records in the past, we will be unprepared for the future," Liu said. "Hazard maps tend to underestimate the likelihood of quakes in areas where they haven't occurred previously. In most places, including the central and eastern U.S., seismologists don't have a long enough record of earthquake history to make predictions based on historical patterns.
Although bad luck can mean that quakes occur in places with a genuinely low probability, what we see are too many 'black swans,' or too many exceptions to the presumed patterns.""We're playing a complicated game against nature," said the study's first author, Seth Stein of Northwestern University. "It's a very high stakes game. We don't really understand all the rules very well. As a result, our ability to assess earthquake hazards often isn't very good, and the policies that we make to mitigate earthquake hazards sometimes aren't well thought out. For example, the billions of dollars the Japanese spent on tsunami defenses were largely wasted."We need to very carefully try to formulate the best strategies we can, given the limits of our knowledge," Stein said. "Understanding the uncertainties in earthquake hazard maps, testing them, and improving them is important if we want to do better than we've done so far."The study, "Why earthquake hazard maps often fail and what to do about it," was published by the journal | Earthquakes | 2,012 |
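One simple form of the empirical testing Liu and his colleagues call for is a likelihood comparison: score a map's forecast of where earthquakes should strike against the locations that actually ruptured, and compare that score with a spatially uniform null forecast. The toy example below does this for a handful of made-up zones and outcomes; it only illustrates the shape of such a test, not any real hazard map or the authors' specific method:

```python
import math

# Hypothetical zones: (map's forecast probability of at least one damaging
# quake during the evaluation period, whether one actually occurred).
zones = [
    (0.30, True),   # map says likely, quake happened
    (0.10, False),
    (0.05, True),   # map says unlikely, quake happened anyway ("black swan")
    (0.02, False),
    (0.02, False),
]

def log_likelihood(forecasts):
    """Sum of log P(outcome | forecast) over zones (higher is better)."""
    return sum(math.log(p if hit else 1.0 - p) for p, hit in forecasts)

# Null hypothesis: the same probability everywhere (the observed average rate).
uniform_p = sum(hit for _, hit in zones) / len(zones)
null_forecast = [(uniform_p, hit) for _, hit in zones]

print(f"Hazard-map log-likelihood:   {log_likelihood(zones):.2f}")
print(f"Uniform-null log-likelihood: {log_likelihood(null_forecast):.2f}")
# If the map does not beat the uniform null, it adds no forecasting skill.
```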
August 29, 2012 | https://www.sciencedaily.com/releases/2012/08/120829131625.htm | Hot spots pinpointed as earthquake trigger points: Small droplets of friction-generated melts can lead to 'megaquakes' | Scientists at Scripps Institution of Oceanography at UC San Diego have come a step closer to deciphering some of the basic mysteries and mechanisms behind earthquakes and how average-sized earthquakes may evolve into massive earthquakes. | In a paper published in the Aug. 30 issue of the journal, the researchers coined the term "melt welts" for such locations and describe the mechanism as akin to an ice skater's blade reducing friction by melting the ice surface. The mechanism may be similar to "hot spots" known in automobile brake-clutch components. "Melt welts appear to be working as part of a complicated feedback mechanism where complex dynamic weakening processes become further concentrated into initially highly stressed regions of a fault," said Brown, first author of the study and a professor in the Geosciences Research Division at Scripps. "The process allows highly stressed areas to rapidly break down, acting like the weakest links in the chain. Even initially stable regions of a fault can experience runaway slip by this process if they are pushed at velocities above a key tipping point." "This adds to the fundamental understanding of the earthquake process because it really addresses the question of how these ruptures become energetic and dynamic and run away for long distances," said Fialko, a paper coauthor and a professor in the Cecil H. and Ida M. Green Institute of Geophysics and Planetary Physics at Scripps. The study's results, supported by funding from the National Science Foundation, appear to help resolve a longstanding paradox in seismology. Key fault zones such as the San Andreas Fault produce far too little heat from friction compared with the size and magnitude of the earthquakes they produce. Laboratory experiments show that thermal energy normally released by friction during slip can become rapidly reduced, potentially helping to account for a "low heat flow paradox." The melt welts also may help explain certain questions in earthquake rupture dynamics, such as why some slowly slipping tremor-generating events can snowball into massive earthquakes if they pass a velocity tipping point. "This may be relevant for how you get from large earthquakes to giant earthquakes," said Brown, who used the example of last year's magnitude 9.0 earthquake off Japan. "We thought that large patches of the fault were just creeping along at a constant rate, then all of a sudden they were activated and slipped to produce a mega earthquake that produced a giant tsunami." Fialko says the melt welt finding could eventually lead to improved "shake" maps of ground-shaking intensities, as well as improvements in structural engineering plans. Future studies include investigations of why the melt welt weakening occurs and whether it applies to most or all common fault zone materials, as well as field research to locate melt welts along fault zones. The Scripps Marine Science Development Center provided the machinery used in the study's experiments. | Earthquakes | 2,012
August 27, 2012 | https://www.sciencedaily.com/releases/2012/08/120827142129.htm | Short- and mid-term cardiovascular effects of Japan's 2011 earthquake and tsunami: Incidence rises with the seismic peak | The Japanese earthquake and tsunami of 11 March 2011, which hit the north-east coast of Japan with a magnitude of 9.0 on the Richter scale, was one of the largest ocean-trench earthquakes ever recorded in Japan. The tsunami caused huge damage, including 15,861 dead and 3018 missing persons, and, as of 6 June 2012, 388,783 destroyed homes. | Following an investigation of the ambulance records made by doctors in the Miyagi prefecture, close to the epicentre of the earthquake and where the damage was greatest, cardiologist Dr Hiroaki Shimokawa and colleagues from the Tohoku University Graduate School of Medicine at Sendai, Japan, found that the weekly occurrences of five conditions -- heart failure, acute coronary syndrome (including unstable angina and acute MI), stroke, cardio-pulmonary arrest and pneumonia -- all increased sharply soon after the earthquake occurred. Such reactions -- in ACS, stroke and pulmonary embolism -- have been reported before, said Dr Shimokawa, in Japan, China and the USA. However, these studies reported only the short-term occurrence of individual CVD events, and the mid-term CVD effects of such great earthquakes remain to be elucidated. To this end, the study examined all ambulance transport records in the Miyagi prefecture from 11 February to 30 June for each year from 2008 to 2011 (i.e., four weeks before to 16 weeks after 11 March, a total of 124,152 records). Incidence records from before, during and after the earthquake disaster were compared, and the aftershocks were counted and recorded for a seismic intensity of 1 or greater. Aftershocks in the Miyagi prefecture were frequent during the six weeks after the earthquake, and a second peak was marked by a large aftershock on 7 April 2011 (magnitude 7.0). Compared with the previous three years, the significant increases in the occurrence of heart failure and pneumonia were steadily prolonged for more than six weeks after the tsunami struck. On the other hand, the increases in the incidence of stroke and cardio-pulmonary arrest followed the pattern of the first and aftershock seismic peaks. The rapid increases in the occurrence of acute coronary syndromes and cardio-pulmonary arrest were followed by a sharp and significant decline. Interestingly, said Dr Shimokawa, age, sex or residence area did not significantly affect the occurrences of CVD during or following the tsunami. "To the best of our knowledge," he added, "this is the first report to describe the mid-term course of major cardiovascular events and pneumonia after a great earthquake in a large population. In particular, our findings provide the first evidence that the incidence of heart failure was markedly increased over a long period afterwards." Prevalence of pneumonia, a well-known risk factor for deteriorating heart failure, was significantly increased. The Tohoku University study also found -- as reflected in self-monitoring measurements -- that blood pressure was significantly elevated after the earthquake. However, transport disruption following the tsunami interrupted delivery of regular medications, such as antihypertensive or antithrombotic drugs, and this may have contributed to the increased cardiovascular events. 
There was also an increase in the occurrences of ventricular tachyarrhythmias in patients with implantable cardiac defibrillators."Taken together," said Dr Shimokawa, "we consider that discontinuation of drugs, activated sympathetic nervous system, rising blood pressure, and the increased occurrence of tachyarrhythmia and infections were all involved in the increased occurrence of cardiovascular events after the Great Earthquake of Japan." | Earthquakes | 2,012 |
August 23, 2012 | https://www.sciencedaily.com/releases/2012/08/120823161922.htm | Antarctic ice sheet quakes shed light on ice movement and earthquakes | Analysis of small, repeating earthquakes in an Antarctic ice sheet may not only lead to an understanding of glacial movement, but may also shed light on stick-slip earthquakes like those on the San Andreas fault or in Haiti, according to Penn State geoscientists. | "No one has ever seen anything with such regularity," said Lucas K. Zoet, a recent Penn State Ph.D. recipient, now a postdoctoral fellow at Iowa State University. "An earthquake every 25 minutes for a year." The researchers looked at seismic activity recorded during the Transantarctic Mountains Seismic Experiment from 2002 to 2003 on the David Glacier in Antarctica, coupled with data from the Global Seismic Network station Vanda. They found that the local earthquakes on the David Glacier, about 20,000 identified, were predominantly the same and occurred every 25 minutes, give or take five minutes. The researchers report their findings in the current issue of the journal. "Our leading idea is that part of the bedrock is poking through the ductile till layer beneath the glacier," said Zoet. The researchers have determined that the asperity -- or hill -- is about a half mile in diameter. The glacier, passing over the hill, creates a stick-slip situation much like that on the San Andreas fault. The ice sticks on the hill and stress gradually builds until the energy behind the obstruction is high enough to move the ice forward. The ice moves in a step-by-step manner rather than smoothly. But motion toward the sea is not the only thing acting on the ice streaming from David Glacier. Like most glaciers near oceans, the edge of the ice floats out over the water, and the floating ice is subject to the action of tides. "When the tide comes in it pushes back on the ice, making the time between slips slightly longer," said Sridhar Anandakrishnan, professor of geoscience. "When the tide goes out, the time between slips decreases." However, the researchers note that the tides are acting at the grounding line, a long way from the location of the asperity, and therefore the effects that shorten or lengthen the stick-slip cycle are delayed. "This was something we didn't expect to see," said Richard B. Alley, Evan Pugh Professor of Geosciences. "Seeing it is making us reevaluate the basics." He also noted that these glacial earthquakes, besides helping glaciologists understand the way ice moves, can provide a simple model for the stick-slip earthquakes that occur between landmasses. "We have not completely explained how ice sheets flow unless we can reproduce this effect," said Alley. "We can use this as a probe and look into the physics so we better understand how glaciers move." Before 2002, this area of the David Glacier flowed smoothly, but then for nearly a year the regular earthquake intervals occurred and then stopped. Something occurred at the base of the ice to start and then stop these earthquakes. "The best idea we have is that during those 300 days, a dirty patch of ice was in contact with the mount, changing the way stress was transferred," said Zoet. "The glacier is experiencing earthquakes again, although at a different rate. It would be nice to study that." Unfortunately, the seismographic instruments that were on the glacier in 2002 no longer exist, and information is coming from only one source at the moment. | Earthquakes | 2,012
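The behaviour described in this article -- stress that builds steadily, fails at a threshold, and repeats on a schedule nudged by the tides -- can be caricatured with a one-variable stick-slip model. The script below is exactly that caricature: the loading rate, stress drop and tidal amplitude are arbitrary round numbers chosen to give roughly 25-minute repeats, and it is not the Penn State group's model of the David Glacier:

```python
import math

# Arbitrary illustrative parameters (chosen to give ~25-minute repeats).
load_rate = 1.0 / 25.0       # steady loading, stress units per minute
stress_drop = 1.0            # stress released by each slip event
tidal_amp = 0.05             # amplitude of the tidal stress perturbation
tidal_period = 12.42 * 60.0  # semidiurnal tide, minutes
threshold = 1.0              # failure stress

dt = 0.01                    # time step, minutes
stress = 0.0
t = 0.0
events = []

while t < 6 * 60.0:                        # simulate six hours
    tide = tidal_amp * math.sin(2.0 * math.pi * t / tidal_period)
    if stress + tide >= threshold:         # slip: release stress, log the event
        events.append(t)
        stress -= stress_drop
    stress += load_rate * dt
    t += dt

intervals = [b - a for a, b in zip(events, events[1:])]
print("Inter-event times (minutes):",
      ", ".join(f"{x:.1f}" for x in intervals))
```

Running it shows repeat times drifting a fraction of a minute above and below the unperturbed 25 minutes as the tidal term rises and falls, the qualitative pattern Anandakrishnan describes.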
August 20, 2012 | https://www.sciencedaily.com/releases/2012/08/120820132344.htm | Why do the Caribbean Islands arc? Movement of Earth modeled to 3,000 km depth | The Caribbean islands have been pushed east over the last 50 million years, driven by the movement of Earth's viscous mantle against the more rooted South American continent, reveals new research by geophysicists from USC. | The results were published August 19. "Studying the deep earth interior provides insights into how the Earth has evolved into its present form," said Meghan S. Miller, assistant professor of earth sciences in the USC Dornsife College of Letters, Arts and Sciences, and lead author of the paper. "We're interested in plate tectonics, and the southeastern Caribbean is interesting because it's right near a complex plate boundary." Miller and Thorsten W. Becker, associate professor of earth sciences at USC Dornsife College, studied the margin between the Caribbean plate and the South American plate, ringed by Haiti, the Dominican Republic, Puerto Rico and a crescent of smaller islands including Barbados and St. Lucia. But just like the First Law of Ecology (and time travel), when it comes to Earth, everything really is connected. So to study the motion of the South American continent and Caribbean plate, the researchers had to first model the entire planet -- 176 models to be exact, so large that they took several weeks to compute even at the USC High Performance Computing Center. For most of us, earthquakes are something to be dreaded. But for Miller and Becker they are a necessary source of data, providing seismic waves that can be tracked all over the world to provide an image of Earth's deep interior like a CAT scan. The earthquake waves move more slowly or more quickly depending on the temperature and composition of the rock, and also depending on how the crystals within the rocks align after millions of years of being pushed around in mantle convection. "If you can, you want to solve the whole system and then zoom in," Becker said. "What's cool about this paper is that we didn't just run one or two models. We ran a lot, and it allowed us to explore different possibilities for how mantle flow might work." Miller and Becker reconstructed the movement of Earth's mantle to a depth of almost 3,000 kilometers, upending previous hypotheses about the seismic activity beneath the Caribbean Sea and providing an important new look at the unique tectonic interactions that are causing the Caribbean plate to tear away from South America. In particular, Miller and Becker point to a part of the South American plate -- known as a "cratonic keel" -- that is roughly three times thicker than normal lithosphere and much stronger than typical mantle. The keel deflects and channels mantle flow, and provides an important snapshot of the strength of the continents compared to the rest of Earth's outer layers. "Oceanic plates are relatively simple, but if we want to understand how the Earth works as a system -- and how faults evolved and where the flow is going over millions of years -- we also have to understand continental plates," Becker said. In the southeastern Caribbean, the interaction of the subducted plate beneath the Antilles island arc with the stronger continental keel has created the El Pilar-San Sebastian Fault, and the researchers believe a similar series of interactions may have formed the San Andreas Fault. "We're studying the Caribbean, but our models are run for the entire globe," Miller said. 
"We can look at similar features in Japan, Southern California and the Mediterranean, anywhere we have instruments to record earthquakes." | Earthquakes | 2,012 |
August 16, 2012 | https://www.sciencedaily.com/releases/2012/08/120816101029.htm | Tibetan Plateau may be older than previously thought | The growth of high topography on the Tibetan Plateau in Sichuan, China, began much earlier than previously thought, according to an international team of geologists who looked at mountain ranges along the eastern edge of the plateau. | The Indian tectonic plate began its collision with Asia between 55 and 50 million years ago, but "significant topographic relief existed adjacent to the Sichuan Basin prior to the Indo-Asian collision," the researchers report online. "Most researchers have thought that high topography in eastern Tibet developed during the past 10 to 15 million years, as deep crust beneath the central Tibetan Plateau flowed to the plateau margin, thickening Earth's crust in this area and causing surface uplift," said Eric Kirby, associate professor of geoscience at Penn State. "Our study suggests that high topography began to develop as early as 30 million years ago, and perhaps was present even earlier." Kirby, working with Erchie Wang of the Institute of Geology and Geophysics at the Chinese Academy of Sciences in Beijing; Kevin Furlong, professor of geosciences at Penn State; and colleagues from Waikato University, New Zealand, and Arizona State University, looked at samples taken from the hanging wall of the Yingxiu-Beichuan fault, the primary fault responsible for the 2008 Wenchuan earthquake. The researchers used a variety of methods, including uranium-thorium/helium dating, which relies on the decay of uranium and thorium to helium in the minerals apatite and zircon, and fission-track dating, an analysis of the tracks or trails left by decaying uranium in those same minerals. "These methods allow us to investigate the thermal regime from about 250 degrees Celsius (482 degrees Fahrenheit) to about 60 degrees Celsius (140 degrees Fahrenheit)," said Kirby. "The results show that the rocks cooled relatively slowly during the early and mid-Cenozoic -- from 30 to 50 million years ago -- an indication that topography in the region was undergoing erosion." The results also suggest that gradual cooling during this time was followed by two episodes of rapid erosion, one beginning 30 to 25 million years ago and one beginning 15 to 10 million years ago that continues today. "These results challenge the idea that the topographic relief along the margin of the plateau developed entirely in the Late Miocene, 5 to 10 million years ago," said Kirby. "The period of rapid erosion between 25 to 30 million years ago could only be sustained if the mountains were not only present, but actively growing, at this time." The researchers also note that this implies that fault systems responsible for the 2008 earthquake were also probably active early in the history of the growth of the Tibetan Plateau. "We are still a long way from completely understanding when and how high topography in Asia developed in response to India-Asia collision," notes Kirby. "However, these results lend support to the idea that much of what we see today in the mountains of China may have developed earlier than we previously thought." The Chinese National Key Projects Program, the National Natural Science Foundation of China and the National Science Foundation funded this research. | Earthquakes | 2,012
August 15, 2012 | https://www.sciencedaily.com/releases/2012/08/120815202224.htm | Landslide fatalities are greater than previously thought | Landslides kill ten times more people across the world than was previously thought, according to research by Durham University, UK. | A new database of hazards shows that 32,300 people died in landslides between 2004 and 2010. Previous estimates ranged from 3,000 to 7,000 fatalities.The database, which provides the first detailed analysis of fatal landslides across the world, maps hotspots including China, Central and South America, and India.The researchers say that the new database, the Durham Fatal Landslide Database (DFLD), can help policymakers to prioritise areas for action to manage hazards and to lessen the risks to human populations living in hotspot regions.The findings are published in the journal Lead researcher, Professor David Petley, a Geographer at the International Landslide Centre, and Co-Director of The Institute of Hazard, Risk and Resilience, Durham University, said: "The environmental effects of landslides are often devastating for nearby human populations."We need to recognise the extent of the problem and take steps to manage what is a major environmental risk to people across the world. Our database will enable us to do this by identifying areas most at risk and could help to save thousands of lives."The DFLD includes only fatal landslides and is compiled using a number of search tools and analysis of government statistics, aid agency reports, and research papers.It is still likely that the database underestimates the number of landslides and deaths. The database excludes data from landslides caused by earthquakes due to the high level of uncertainty associated with these events. Following an earthquake, where there is a fatal landslide, the deaths are attributed to the earthquake trigger itself, rather than the landslide.The researchers say that weather patterns, deforestation, melting permafrost in high mountainous areas, and high and increasing human population densities are important factors in the cause, distribution, number, extent and effects of landslides.More fatal landslide events are recorded in May to October and the dominant global trigger is rain from the monsoon. Tropical cyclones also generate extreme rainfall events that trigger landslides in Asia, and hurricanes have the same effect on regions in the Caribbean and Central America.In some areas with a history of fatal landslides, such as Hong Kong, programmes to mitigate the risks of landslides have been successful.Professor David Petley said: "Areas with a combination of high relief, intense rainfall, and a high population density are most likely to experience high numbers of fatal landslides. Landslides are a global hazard requiring a major change in perception and policy."There are things that we can do to manage and mitigate landslide risks such as controlling land use, proactive forest management, and guiding development away from vulnerable areas."Global landslide hotspots: | Earthquakes | 2,012 |
August 14, 2012 | https://www.sciencedaily.com/releases/2012/08/120814110923.htm | Nearly 1,000 earthquakes recorded in Arizona over three years | Earthquakes are among the most destructive and common of geologic phenomena. Several million earthquakes are estimated to occur worldwide each year (the vast majority are too small to feel, but their motions can be measured by arrays of seismometers). Historically, most of Arizona has experienced low levels of recorded seismicity, with infrequent moderate and large earthquakes in the state. Comprehensive analyses of seismicity within Arizona have not been previously possible due to a lack of seismic stations in most regions, contributing to the perception that widespread earthquakes in Arizona are rare. Debunking that myth, a new study published by Arizona State University researchers found nearly 1,000 earthquakes rattling the state over a three-year period. | Jeffrey Lockridge, a graduate student in ASU's School of Earth and Space Exploration and the project's lead researcher, used new seismic data collected as part of the EarthScope project to develop methods to detect and locate small-magnitude earthquakes across the entire state of Arizona. EarthScope's USArray Transportable Array was deployed within Arizona from April 2006 to March 2009 and provided the first opportunity to examine seismicity on a statewide scale. Its increased sensitivity allowed Lockridge to find almost 1,000 earthquakes during the three-year period, including many in regions of Arizona that were previously thought to be seismically inactive."It is significant that we found events in areas where none had been detected before, but not necessarily surprising given the fact that many parts of the state had never been sampled by seismometers prior to the deployment of the EarthScope USArray," says Lockridge. "I expected to find some earthquakes outside of north-central Arizona, where the most and largest events had previously been recorded, just not quite so many in other areas of the state."One-thousand earthquakes over three years may sound alarmingly high, but the large number of earthquakes detected in the study is a direct result of the improved volume and quality of seismic data provided by EarthScope. Ninety-one percent of the earthquakes Lockridge detected in Arizona were "microquakes" with a magnitude of 2.0 or smaller, which are not usually felt by humans. Detecting small-magnitude earthquakes is not only important because some regions experiencing small earthquakes may produce larger earthquakes, but also because geologists use small magnitude earthquakes to map otherwise hidden faults beneath the surface.Historically, the largest earthquakes and the majority of seismicity recorded within Arizona have been located in an area of north-central Arizona. More recently, a pair of magnitude 4.9 and 5.3 earthquakes occurred in the Cataract Creek area outside of Flagstaff. Earthquakes of magnitude 4.0 or larger also have occurred in other areas of the state, including a magnitude 4.2 earthquake in December 2003 in eastern Arizona and a magnitude 4.9 earthquake near Chino Valley in 1976."The wealth of data provided by the EarthScope project is an unprecedented opportunity to detect and locate small-magnitude earthquakes in regions where seismic monitoring (i.e. seismic stations) has historically been sparse," explains Lockridge. 
"Our study is the first to use EarthScope data to build a regional catalog that detects all earthquakes magnitude 1.2 or larger."His results appear in a paper titled, "Seismicity within Arizona during the Deployment of the EarthScope USArray Transportable Array," published in the August 2012 issue of the "The most surprising result was the degree to which the EarthScope data were able to improve upon existing catalogs generated by regional and national networks. From April 2007 through November 2008, other networks detected only 80 earthquakes within the state, yet over that same time we found 884 earthquakes, or 11 times as many, which is really quite staggering," says Lockridge. "It's one of countless examples of how powerful the EarthScope project is and how much it is improving our ability to study Earth."Lockridge is also lead author on a study that focuses on a cluster of earthquakes located east of Phoenix, near Theodore Roosevelt Lake. The results from this study will be published in Seismological Research Letters later this year. In his current studies as doctoral student, Lockridge is using the same methods used for Arizona to develop a comprehensive earthquake catalog for the Great Basin region in Nevada and western Utah. | Earthquakes | 2,012 |
August 10, 2012 | https://www.sciencedaily.com/releases/2012/08/120810133110.htm | Earthquake risk in Europe detailed | For the first time, scientists of the GFZ German Research Centre for Geosciences have succeeded in setting up a harmonized catalogue of earthquakes for Europe and the Mediterranean for the last thousand years. This catalogue consists of about 45,000 earthquakes. | How strong can earthquakes in Germany be? Where in Europe are the earthquake activities concentrated? These questions are the basis for risk assessments and become relevant when it comes to the safety of buildings or the generation of tsunamis. For the first time, scientists of the GFZ German Research Centre for Geosciences have succeeded in setting up a harmonized catalogue of earthquakes for Europe and the Mediterranean for the last thousand years. This catalogue consists of about 45,000 earthquakes and is reported in the latest issue of the journal. Earthquakes occur with different frequencies of occurrence, meaning that in some regions, strong earthquakes happen with time gaps of hundreds of years. Such rare events can cause a false feeling of security which belies the true risk. This is compounded by the fact that instrumental measurements only go back for roughly a century, and reliable data for smaller quakes for about half that long. Thus, the only way to assess the actual risk is the analysis of historical earthquake accounts. "The catalogue that we present here covers the earthquakes of the last thousand years with magnitudes of M ..." In bringing together all these sources with their considerable differences in magnitude or intensity scales, particular care was taken to harmonize the latter in the form of moment magnitude Mw. "Part of the EMEC-publication is also a list of so-called fake-quakes, i.e. quakes that have been reported erroneously due to errors of the chroniclers, errors in dates or other mistakes," explains GFZ scientist Gottfried Grünthal. Importance was also attached to a user-friendly web page: "On the GFZ-EMEC-website, data can be downloaded according to the wishes of the users, and epicentre maps can be generated, stored and printed in customized scales of time and space and map projections." | Earthquakes | 2,012
August 9, 2012 | https://www.sciencedaily.com/releases/2012/08/120809155831.htm | Scientist discovers plate tectonics on Mars | For years, many scientists had thought that plate tectonics existed nowhere in our solar system but on Earth. Now, a UCLA scientist has discovered that the geological phenomenon, which involves the movement of huge crustal plates beneath a planet's surface, also exists on Mars. | "Mars is at a primitive stage of plate tectonics. It gives us a glimpse of how the early Earth may have looked and may help us understand how plate tectonics began on Earth," said An Yin, a UCLA professor of Earth and space sciences and the sole author of the new research. Yin made the discovery during his analysis of satellite images from the THEMIS (Thermal Emission Imaging System) instrument aboard NASA's Mars Odyssey spacecraft and from the HiRISE (High Resolution Imaging Science Experiment) camera on NASA's Mars Reconnaissance Orbiter. He analyzed about 100 satellite images -- approximately a dozen were revealing of plate tectonics. Yin has conducted geologic research in the Himalayas and Tibet, where two of Earth's seven major plates divide. "When I studied the satellite images from Mars, many of the features looked very much like fault systems I have seen in the Himalayas and Tibet, and in California as well, including the geomorphology," said Yin, a planetary geologist. For example, he saw a very smooth, flat side of a canyon wall, which can be generated only by a fault, and a steep cliff, comparable to cliffs in California's Death Valley, which also are generated by a fault. Mars has a linear volcanic zone, which Yin said is a typical product of plate tectonics. "You don't see these features anywhere else on other planets in our solar system, other than Earth and Mars," said Yin, whose research is featured as the cover story in the August issue of the journal. The surface of Mars contains the longest and deepest system of canyons in our solar system, known as Valles Marineris (Latin for Mariner Valleys and named for the Mariner 9 Mars orbiter of 1971-72, which discovered it). It is nearly 2,500 miles long -- about nine times longer than Earth's Grand Canyon. Scientists have wondered for four decades how it formed. Was it a big crack in Mars' shell that opened up? "In the beginning, I did not expect plate tectonics, but the more I studied it, the more I realized Mars is so different from what other scientists anticipated," Yin said. "I saw that the idea that it is just a big crack that opened up is incorrect. It is really a plate boundary, with horizontal motion. That is kind of shocking, but the evidence is quite clear." "The shell is broken and is moving horizontally over a long distance. It is very similar to the Earth's Dead Sea fault system, which has also opened up and is moving horizontally." The two plates divided by Mars' Valles Marineris have moved approximately 93 miles horizontally relative to each other, Yin said.
California's San Andreas Fault, which is over the intersection of two plates, has moved about twice as much -- but Earth is about twice the size of Mars, so Yin said they are comparable.Yin, whose research is partly funded by the National Science Foundation, calls the two plates on Mars the Valles Marineris North and the Valles Marineris South."Earth has a very broken 'egg shell,' so its surface has many plates; Mars' is slightly broken and may be on the way to becoming very broken, except its pace is very slow due to its small size and, thus, less thermal energy to drive it," Yin said. "This may be the reason Mars has fewer plates than on Earth."Mars has landslides, and Yin said a fault is shifting the landslides, moving them from their source.Does Yin think there are Mars-quakes?"I think so," he said. "I think the fault is probably still active, but not every day. It wakes up every once in a while, over a very long duration -- perhaps every million years or more."Yin is very confident in his findings, but mysteries remain, he said, including how far beneath the surface the plates are located."I don't quite understand why the plates are moving with such a large magnitude or what the rate of movement is; maybe Mars has a different form of plate tectonics," Yin said. "The rate is much slower than on Earth."Earth has a broken shell with seven major plates; pieces of the shell move, and one plate may move over another. Yin is doubtful that Mars has more than two plates."We have been able to identify only the two plates," he said. "For the other areas on Mars, I think the chances are very, very small. I don't see any other major crack."Did the movement of Valles Marineris North and Valles Marineris South create the enormous canyons on Mars? What led to the creation of plate tectonics on Earth?Yin, who will continue to study plate tectonics on Mars, will answer those questions in a follow-up paper that he also plans to publish in the journal Lithosphere. | Earthquakes | 2,012 |
August 6, 2012 | https://www.sciencedaily.com/releases/2012/08/120806151250.htm | Correlation between injection wells and small earthquakes discovered | Most earthquakes in the Barnett Shale region of North Texas occur within a few miles of one or more injection wells used to dispose of wastes associated with petroleum production such as hydraulic fracturing fluids, according to new research from The University of Texas at Austin. None of the quakes identified in the two-year study were strong enough to pose a danger to the public. | The study by Cliff Frohlich, senior research scientist at the university's Institute for Geophysics, appears this week in the "You can't prove that any one earthquake was caused by an injection well," says Frohlich. "But it's obvious that wells are enhancing the probability that earthquakes will occur."Frohlich analyzed seismic data collected between November 2009 and September 2011 by the EarthScope USArray Program, a National Science Foundation-funded network of broadband seismometers from the Canadian border to the Gulf of Mexico. Because of the high density of instruments (25 in or near the Barnett Shale), Frohlich was able to detect earthquakes down to magnitude 1.5, far too weak for people to feel at the surface.He found that the most reliably located earthquakes -- those that are accurate to within about 0.9 miles (1.5 kilometers) -- occurred in eight groups, all within 2 miles (3.2 kilometers) of one or more injection wells. Before this study, the National Earthquake Information Center had only identified two earthquake groups in the area strongly associated with specific injection wells. This suggests injection-triggered earthquakes are far more common than is generally recognized.The Barnett Shale is a geological formation in North Texas bearing a large amount of natural gas that was difficult to recover prior to recent technological advances such as hydraulic fracturing. The formation lies beneath Dallas and Fort Worth and extends over several counties, mostly to the west of those cities. Development of the Barnett Shale and other unconventional plays -- such as the Bakken Shale in North Dakota and the Marcellus Shale in Pennsylvania, New York and West Virginia -- have spurred dramatic growth in domestic natural gas production.This study comes as some policymakers and members of the public are expressing concern about possible environmental and health effects of hydraulic fracturing. Most earthquakes identified in the study ranged in magnitude from 1.5 to 2.5, meaning they posed no danger to the public.All the wells nearest to the eight earthquake groups reported high injection rates (maximum monthly injection rates exceeding 150,000 barrels of water). Yet in many other areas where wells had similarly high injection rates, there were no earthquakes. Frohlich tried to address those differences."It might be that an injection can only trigger an earthquake if injected fluids reach and relieve friction on a nearby fault that is already ready to slip," says Frohlich. "That just isn't the situation in many places."Hydraulic fracturing is an industrial process in which water and various chemicals are pumped deep underground in order to fracture rock, allowing oil or gas to more easily flow to a well. As petroleum is produced at the surface, most hydraulic fracturing fluids return to the surface too. 
Frohlich is careful to point out that he did not evaluate the possible correlation of earthquakes with the actual hydraulic fracturing process, but rather the effects of disposing of fracturing fluids and other wastes in these injection wells.Support for this study came from the U.S. Geological Survey and the Jackson School of Geosciences at The University of Texas at Austin. The author has no financial ties to the hydraulic fracturing industry. Frohlich has consulted for the construction industry on seismic risks for projects including dams, power plants and pipelines. He plans to participate in a future study relating to hydraulic fracturing in the Barnett Shale by the university's Energy Institute. | Earthquakes | 2,012 |
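The study's central observation is a spatial association: the reliably located epicenters fell within about 2 miles (3.2 km) of one or more injection wells. A minimal sketch of that kind of proximity test, using the haversine distance between epicenters and well locations, might look like the following; the coordinates and the 3.2 km threshold below are illustrative, not data from the paper.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))

# Hypothetical epicenters and injection-well locations (lat, lon):
epicenters = [(32.80, -97.45), (32.62, -97.90), (33.01, -97.30)]
wells      = [(32.81, -97.47), (33.00, -97.33)]

THRESHOLD_KM = 3.2  # ~2 miles, the association distance reported in the study

for i, (elat, elon) in enumerate(epicenters, start=1):
    d_min = min(haversine_km(elat, elon, wlat, wlon) for wlat, wlon in wells)
    tag = "near a well" if d_min <= THRESHOLD_KM else "no well nearby"
    print(f"Event {i}: nearest well {d_min:5.1f} km -> {tag}")
```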
August 3, 2012 | https://www.sciencedaily.com/releases/2012/08/120803131932.htm | Ancient records shed light on Italian earthquakes (Aquila area) | When a damaging earthquake struck the area of L'Aquila in central Italy in 2009, it was the latest in the region's long history of strong and persistent quakes. The rich recorded history of settlement in the area, along with oral traditions, archaeological excavations, inscriptions and medieval texts, offers insight into how often the region might expect destructive earthquakes. | But according to a new study by Emanuela Guidoboni and colleagues, the historical record on ancient and medieval earthquakes comes with its own shortcomings that must be addressed before the seismic history of L'Aquila can be useful in assessing the current seismic hazard in this area. To illustrate some of these potential gaps in knowledge, the researchers combed through written records and information from archaeological excavations, covering the period from ancient Roman occupation in the first century A.D. to the late Middle Ages of the 15th century A.D. The authors say researchers must piece together information ranging from collapsed roofs within an ancient Roman city to the evidence of rebuilding damaged baths and cisterns. In later years, better written records offer more detail on the specific location and size of earthquakes occurring in 1349, 1456, and 1461 (a long seismic sequence). The authors say that the early to middle Middle Ages, in particular, have a dearth of information that needs to be addressed to have a more complete picture of the region's seismic history. Overall, the records confirm that the region appears to have been host to a high number of strong earthquakes. The authors also point out a tendency in the area to produce multiple and nearly simultaneous seismic events that vary considerably in their impact. As Guidoboni and her colleagues note, the earthquakes have had a strong influence on the region's economy and culture. It is an impact that can be seen clearly in the historical records, such as a written account of a large earthquake in 1315. During that quake, warring factions in the town came together after they "were struck with fear at the strong shaking when a frightening earthquake soon afterwards struck that place in a terrible way," the official account says, "and they abandoned their wrongdoing and returned to the narrow path of their conscience." | Earthquakes | 2,012
August 2, 2012 | https://www.sciencedaily.com/releases/2012/08/120802122611.htm | Major recent earthquakes across the globe probably not linked | The past decade has been plagued with what seems to be a cluster of large earthquakes, with massive quakes striking Sumatra, Chile, Haiti and Japan since 2004. Some researchers have suggested that this cluster has occurred because the earthquakes may be "communicating" across large distances, possibly triggering each other. But a new analysis by Tom Parsons and Eric Geist of the US Geological Survey concludes that the cluster could just as well be the result of random chance. | Each of the devastating quakes in the 2000s drew huge media coverage and required extensive rebuilding and economic restoration. The intense interest in the earthquakes has led some to wonder if we are living in the middle of an "age of great quakes," similar to a global cluster of quakes in the 1960s. It's important to know whether these clusters occur because big earthquakes trigger others across the world, Parsons and Geist say, in order to predict whether more severely destructive quakes might be on the way.To determine if the quake clusters in the 1960s and 2000s could be attributed to random chance, the researchers looked at the timing between the world's largest earthquakes--magnitude 8.3 and above--at one-year intervals during the past 100 years. They compared simulated lists of large quakes and the list of real quakes during this time with the between-quake intervals expected from a random process. The intervals between the real-life large quakes are similar to what would be expected from a random process, they found. In other words, the global hazard of large earthquakes is constant in time. Except in the case of local aftershocks, the probability of a new large quake occurring isn't related to past global quakes.This could be disappointing news for researchers who thought global communication between quakes might offer a way to predict the most severe seismic activity. But there also may be some good news after a decade of destruction. If global great earthquakes are occurring at random, the authors say, then a specific number of quakes that cluster together within a short time is unlikely to be repeated in a similar way over a 100-year span. | Earthquakes | 2,012 |
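Parsons and Geist's test amounts to asking whether the observed intervals between the largest earthquakes look any different from those produced by a purely random (Poisson) process. The sketch below is not the authors' calculation; it uses made-up numbers to show the generic Monte Carlo form of such a test, simulating many random 100-year histories with a fixed number of great quakes and asking how often a cluster as tight as an "observed" one appears by chance.

```python
import numpy as np

rng = np.random.default_rng(42)

CATALOG_YEARS = 100
N_QUAKES = 7           # assumed number of great quakes in the window (illustrative)
OBSERVED_MIN_SPAN = 8  # assumed: three observed events fell within an 8-year span

def tightest_span(times, k=3):
    """Shortest time window containing k of the event times."""
    t = np.sort(times)
    return np.min(t[k - 1:] - t[:len(t) - k + 1])

# Monte Carlo: place N_QUAKES events uniformly at random in the 100-year window
n_sim = 20_000
spans = np.array([
    tightest_span(rng.uniform(0, CATALOG_YEARS, N_QUAKES)) for _ in range(n_sim)
])

p_value = np.mean(spans <= OBSERVED_MIN_SPAN)
print(f"Fraction of random catalogs with 3 events within {OBSERVED_MIN_SPAN} yr: "
      f"{p_value:.2f}")
# A large fraction means the observed "cluster" is unsurprising under randomness.
```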
August 2, 2012 | https://www.sciencedaily.com/releases/2012/08/120802122514.htm | Homing in on a potential pre-quake signal | Changes in seismic velocity--changes in the speeds at which seismic waves move through Earth's crust--have been identified during and after many earthquakes. But do these changes also happen before an earthquake, and could they be measured as a way to predict a quake on the way? The search for a clear and measurable pre-quake signal has been called "the holy grail of seismology." | In a new analysis of the 2004 magnitude 6.0 Parkfield earthquake in California, David Schaff suggests some limits on how changes measured by ambient seismic noise could be used as a pre-earthquake signal. Ambient seismic noise refers to the "background hum" of Earth--the surface waves found all over the planet's crust that are caused mostly by wind and ocean waves. Changes in seismic velocity can be measured using seismic noise observations, which are often recorded continuously at seismic stations and therefore can provide a detailed record of a pre-earthquake time period.Using a complete set of noise data from the Parkfield earthquake, Schaff was able to search for a pre-seismic signal to the quake. He was unable to detect any pre-seismic velocity change for Parkfield using the noise data, but he notes that any pre-seismic signal may have been too small, too short in duration, or in a different area outside of the network of seismic monitors. The analysis did allow Schaff to place an upper limit on how large such a signal might be, depending on how many days it might be observed before the main quake.The paper, "Placing an Upper Bound on Preseismic Velocity Changes Measured by Ambient Noise Monitoring for the 2004 Mw 6.0 Parkfield Earthquake (California)" will appear in the August issue of the | Earthquakes | 2,012 |
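Velocity changes are commonly estimated from noise cross-correlations with a "stretching" measurement: a small relative velocity change dv/v shows up as a uniform stretch or compression of the correlation waveform's time axis. Schaff's actual processing is not reproduced here; the snippet below is a generic, self-contained illustration that recovers a known stretch imposed on a synthetic coda waveform.

```python
import numpy as np

def measure_stretch(reference, current, t, trial_dvv):
    """Grid-search the dv/v value whose time-stretched version of `current`
    best matches `reference` (the classic stretching measurement)."""
    best_cc, best_dvv = -np.inf, 0.0
    for dvv in trial_dvv:
        # Resample the current waveform on the axis t*(1 - dv/v): the trial
        # value that undoes the apparent stretch maximizes the correlation.
        stretched = np.interp(t * (1.0 - dvv), t, current)
        cc = np.corrcoef(reference, stretched)[0, 1]
        if cc > best_cc:
            best_cc, best_dvv = cc, dvv
    return best_dvv, best_cc

# Synthetic test: a decaying oscillatory "coda" and a copy slowed by 0.5%
t = np.linspace(0, 20, 4001)
reference = np.exp(-0.1 * t) * np.sin(2 * np.pi * t)
true_dvv = -0.005   # a 0.5% velocity decrease
current = np.exp(-0.1 * t * (1 + true_dvv)) * np.sin(2 * np.pi * t * (1 + true_dvv))

trials = np.linspace(-0.02, 0.02, 401)
dvv, cc = measure_stretch(reference, current, t, trials)
print(f"Recovered dv/v = {dvv:+.4f} (true {true_dvv:+.4f}), correlation = {cc:.3f}")
```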
August 1, 2012 | https://www.sciencedaily.com/releases/2012/08/120801132717.htm | Northwest earthquake risk in U.S. looms large: 40% chance of major earthquake within 50 years | A comprehensive analysis of the Cascadia Subduction Zone off the Pacific Northwest coast confirms that the region has had numerous earthquakes over the past 10,000 years, and suggests that the southern Oregon coast may be most vulnerable based on recurrence frequency. | Written by researchers at Oregon State University, and published online by the U.S. Geological Survey, the study concludes that there is a 40 percent chance of a major earthquake in the Coos Bay, Ore., region during the next 50 years. And that earthquake could approach the intensity of the Tohoku quake that devastated Japan in March of 2011."The southern margin of Cascadia has a much higher recurrence level for major earthquakes than the northern end and, frankly, it is overdue for a rupture," said Chris Goldfinger, a professor in OSU's College of Earth, Ocean, and Atmospheric Sciences and lead author of the study. "That doesn't mean that an earthquake couldn't strike first along the northern half, from Newport, Ore., to Vancouver Island."But major earthquakes tend to strike more frequently along the southern end -- every 240 years or so -- and it has been longer than that since it last happened," Goldfinger added. "The probability for an earthquake on the southern part of the fault is more than double that of the northern end."The publication of the peer-reviewed analysis may do more than raise awareness of earthquake hazards and risks, experts say. The actuarial table and history of earthquake strength and frequency may eventually lead to an update in the state's building codes."We are considering the work of Goldfinger, et al, in the update of the National Seismic Hazard Maps, which are the basis for seismic design provisions in building codes and other earthquake risk-mitigation measures," said Art Frankel, who has dual appointments with the U.S. Geological Survey and the University of Washington.The Goldfinger-led study took four years to complete and is based on 13 years of research. At 184 pages, it is the most comprehensive overview ever written of the Cascadia Subduction Zone, a region off the Northwest coast where the Juan de Fuca tectonic plate is being subducted beneath the continent. Once thought to be a continuous fault line, Cascadia is now known to be at least partially segmented.This segmentation is reflected in the region's earthquake history, Goldfinger noted."Over the past 10,000 years, there have been 19 earthquakes that extended along most of the margin, stretching from southern Vancouver Island to the Oregon-California border," Goldfinger noted. "These would typically be of a magnitude from about 8.7 to 9.2 -- really huge earthquakes."We've also determined that there have been 22 additional earthquakes that involved just the southern end of the fault," he added. "We are assuming that these are slightly smaller -- more like 8.0 -- but not necessarily. They were still very large earthquakes that if they happened today could have a devastating impact."The clock is ticking on when a major earthquake will next strike, said Jay Patton, an OSU doctoral student who is a co-author on the study."By the year 2060, if we have not had an earthquake, we will have exceeded 85 percent of all the known intervals of earthquake recurrence in 10,000 years," Patton said. 
"The interval between earthquakes ranges from a few decades to thousands of years. But we already have exceeded about three-fourths of them."The last mega-earthquake to strike the Pacific Northwest occurred on Jan. 26, 1700. Researchers know this, Goldfinger said, because written records in Japan document how an ensuing tsunami destroyed that year's rice crop stored in warehouses.How scientists document the earthquake history of the Cascadia Subduction Zone is fascinating. When a major offshore earthquake occurs, Goldfinger says, the disturbance causes mud and sand to begin streaming down the continental margins and into the undersea canyons. Coarse sediments called turbidites run out onto the abyssal plain; these sediments stand out distinctly from the fine particulate matter that accumulates on a regular basis between major tectonic events.By dating the fine particles through carbon-14 analysis and other methods, Goldfinger and colleagues can estimate with a great deal of accuracy when major earthquakes have occurred over the past 10,000 years.Going back further than 10,000 years has been difficult because the sea level used to be lower and West Coast rivers emptied directly into offshore canyons. Because of that, it is difficult to distinguish between storm debris and earthquake turbidites."The turbidite data matches up almost perfectly with the tsunami record that goes back about 3,500 years," Goldfinger said. "Tsunamis don't always leave a signature, but those that do through coastal subsidence or marsh deposits coincide quite well with the earthquake history."With the likelihood of a major earthquake and possible tsunami looming, coastal leaders and residents face the unenviable task of how to prepare for such events. Patrick Corcoran, a hazards outreach specialist with OSU's Sea Grant Extension program, says West Coast residents need to align their behavior with this kind of research."Now that we understand our vulnerability to mega-quakes and tsunamis, we need to develop a culture that is prepared at a level commensurate with the risk," Corcoran said. "Unlike Japan, which has frequent earthquakes and thus is more culturally prepared for them, we in the Pacific Northwest have not had a mega-quake since European settlement. And since we have no culture of earthquakes, we have no culture of preparedness."The research, though, is compelling," he added. "It clearly shows that our region has a long history of these events, and the single most important thing we can do is begin 'expecting' a mega-quake, then we can't help but start preparing for it." | Earthquakes | 2,012 |
July 29, 2012 | https://www.sciencedaily.com/releases/2012/07/120729142249.htm | Giant ice avalanches on Saturn's moon Iapetus provide clue to extreme slippage elsewhere in the solar system | "We see landslides everywhere in the solar system," says Kelsi Singer, graduate student in earth and planetary sciences in Arts & Sciences at Washington University in St. Louis, "but Saturn's icy moon Iapetus has more giant landslides than any body other than Mars." | The reason, says William McKinnon, PhD, professor of earth and planetary sciences, is Iapetus' spectacular topography. "Not only is the moon out-of-round, but the giant impact basins are very deep, and there's this great mountain ridge that's 20 kilometers (12 miles) high, far higher than Mount Everest."So there's a lot of topography and it's just sitting around, and then, from time to time, it gives way," McKinnon says.Falling from such heights, the ice reaches high speeds -- and then something odd happens.Somehow, its coefficient of friction drops, and it begins to flow rather than tumble, traveling many miles before it dissipates the energy of the fall and finally comes to rest.In the July 29 issue of They challenge experimental physicists to measure friction when ice is sliding, and suggest a mechanism that might make ice or rocks slippery, not just during avalanches or landslides, but also during earthquakes or icy moonquakes.The ice avalanches on Iapetus aren't just large; they're larger than they should be given the forces scientists think set them in motion and bring them to a halt.The counterpart to the Iapetian ice avalanche on Earth is a long-runout rock landslide, or sturzstrom (German for "fallstream"). Most landslides travel a horizontal distance that is less than twice the distance the rocks have fallen.On rare occasions, however, a landslide will travel 20 or 30 times farther than it fell, traveling for long distances horizontally or even surging uphill. These extraordinarily mobile landslides, which seem to spill like a fluid rather than tumble like rocks, have long mystified scientists.The mechanics of a normal runout are straightforward. The debris travels outward until friction within the debris mass and with the ground dissipates the energy the rock gained by falling, and the rock mass comes to rest.But to explain the exceptionally long runouts, some other mechanism must be invoked as well. Something must be acting to reduce friction during the runout, Singer says.The trouble is, there is no agreement about what this something might be. Proposals have included a cushion of air, lubrication by water or by rock flour or a thin melted layer. "There are more mechanisms proposed for fiction reduction than I can put on a PowerPoint slide," McKinnon jokes."The landslides on Iapetus are a planet-scale experiment that we cannot do in a laboratory or observe on Earth," Singer says. "They give us examples of giant landslides in ice, instead of rock, with a different gravity, and no atmosphere. So any theory of long runout landslides on Earth must also work for avalanches on Iapetus."McKinnon, whose research focuses on the icy satellites of the outer solar system planets, has been studying Iapetus since the Cassini spacecraft flew by it in December 2004 and September 2007 and streamed images of the ice moon to Earth.Almost everything about Iapetus is odd. It should be spherical, but it's fatter at the equator than at the poles, probably because it froze in place when it was spinning faster than it is now. 
And it has an extremely tall, razor-straight mountain range of mysterious origin that wraps most of the way around its equator. Because of its stoutness and giant ridge, the moon looks like an oversized walnut. If the Iapetian surface locked in place before it could spin down to a sphere, there must be stresses in its surface, McKinnon reasoned. So he suggested Singer check the Cassini images for stress fractures in the ice. She looked carefully at every Cassini image and didn't find much evidence of fracturing. Instead, she kept finding giant avalanches. Singer eventually identified 30 massive ice avalanches in the Cassini images -- 17 that had plunged down crater walls and another 13 that had swept down the sides of the equatorial mountain range. Careful measurements of the heights from which the ice had fallen and the avalanche runout did not find trends consistent with some of the most popular theories for the extraordinary mobility of long-runout landslides. The scientists say the data can't exclude them, however. "We don't have the same range of measurements for the Iapetian avalanches that is available for landslides on Earth and Mars," Singer explains. But it is nonetheless clear that the coefficient of friction of the avalanches (as measured by a proxy, the ratio between the drop height and the runout) is not consistent with the coefficients of friction of very cold ice measured in the laboratory. Coefficients of friction can range from near zero to greater than one. Laboratory measurements of the coefficients for really cold ice lie between 0.55 and 0.7. "Really cold ice debris is as frictional as beach sand," McKinnon says. The coefficients for the Iapetus avalanches, however, scatter between 0.1 and 0.3. Something is off here. In a typical laboratory experiment to measure the frictional coefficient of ice, cylinders of ice are rotated against one another and their resistance to rotation is measured. If ice is moving slowly, it is very frictional. But if it were moving faster, the friction might be lower. Would rapid motion make even super-cold ice slippery? That's a testable hypothesis, the scientists point out, and one they hope experimental physicists soon will take for a spin. If ice becomes less frictional when traveling at speed, what about rock? "If you had some kind of quick movement, whether it was a landslide or the slip along a fault, the same kind of thing could happen," Singer says. Geologists now realize that major faults are weaker during earthquakes than laboratory measurements of rocks' coefficients of friction suggest they should be, she says. But in this case, higher velocity experiments already have been done. At slow slip rates, the friction coefficient of rocks ranges from 0.6 to 0.85. But when the rocks are sliding past one another fast enough, the friction coefficient is near 0.2. That's in the same range as the Iapetian ice avalanche's coefficients. Nobody is sure what lubricates the faults when they are jolted into motion by an earthquake, but one of the simplest hypotheses is something called flash heating, Singer says. The idea is that as the rocks slide past one another, asperities (tiny contact points) on their surfaces are heated by friction. Above a critical speed, the heat would not have time to escape the contact points, which would be flash-heated to temperatures high enough to weaken or even melt the rock.
This weakening might explain high slip rates and large sliding displacements characteristic of earthquakes. The case for flash heating is buttressed by the discovery of rocks that seem to have undergone frictional melting, generically called frictionites, or pseudotachylites, along faults and associated with some rock slides, Singer says. "You might think friction is trivial," McKinnon says, "but it's not. And that goes for friction between ices and friction between rocks. It's really important not just for landslides, but also for earthquakes and even for the stability of the land. And that's why these observations on an ice moon are interesting and thought-provoking." | Earthquakes | 2,012
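The mobility proxy used in the study is simple: the effective coefficient of friction is approximated by the vertical drop H divided by the horizontal runout L. The numbers below are made up for illustration, not measurements from the Cassini analysis; they only show how a long runout drives the apparent friction well below the 0.55-0.7 measured for cold ice in the laboratory.

```python
# Effective friction proxy for a landslide or ice avalanche: mu_eff ~ H / L,
# where H is the vertical drop and L the horizontal runout distance.

def effective_friction(drop_height_km: float, runout_km: float) -> float:
    return drop_height_km / runout_km

# Hypothetical events (not data from the Iapetus study):
events = [
    ("ordinary rockfall",         1.0,  1.8),   # runout < 2x the drop
    ("long-runout ice avalanche", 3.0, 20.0),   # runout ~7x the drop
    ("extreme long runout",       2.0, 25.0),
]

for name, h, l in events:
    mu = effective_friction(h, l)
    print(f"{name:26s} H = {h:4.1f} km, L = {l:5.1f} km -> mu_eff ~ {mu:.2f}")

# Laboratory values for very cold ice sit near 0.55-0.7; apparent values of
# ~0.1-0.3 like those above are the puzzle the Iapetus avalanches pose.
```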
July 25, 2012 | https://www.sciencedaily.com/releases/2012/07/120725162446.htm | New evidence for regular magnitude 8 earthquakes, study of New Zealand fault shows | A new study published in the journal | The Alpine Fault is the most hazardous fault on the South Island of New Zealand, and about 80 miles northwest of the South Island's main city of Christchurch.The team developed evidence for 22 earthquakes at the Hokuri Creek site, which, with two additional from nearby, led to the longest continuous earthquake record in the world for a major plate boundary fault. The team established that the Alpine Fault causes, on average, earthquakes of around a magnitude 8 every 330 years. Previous data put the intervals at about 485 years.Relative motion of Australian and Pacific plates across the Alpine Fault averages almost an inch per year. This motion builds up, and then is released suddenly in large earthquakes. The 530-mile-long fault is among the longest, straightest and fastest moving plate boundary faults in the world. More than 23 feet of potential slip has accumulated in the 295 years since the most recent event in A.D. 1717.Biasi, working with the GNS Science team led by Kelvin Berryman, used paleoseismology to extend the known seismic record from 1000 years ago to 8,000 years ago. They estimated earthquake dates by combining radiocarbon dating leaves, small twigs and marsh plants with geologic and other field techniques."Our study sheds new light on the frequency and size of earthquakes on the Alpine Fault. Earthquakes have been relatively periodic, suggesting that this may be a more general property of simple plate boundary faults worldwide," Biasi, of the Nevada Seismological Laboratory said. "By comparison, large earthquakes on California's San Andreas Fault have been less regular in size and timing.""Knowing the average rate of earthquakes is useful, but is only part of the seismic hazard equation," he said. "If they are random in time, then the hazard does not depend on how long it has been since the most recent event. Alpine Fault earthquakes are more regular in their timing, allowing us to use the time since the last earthquake to adjust the hazard estimate. We estimate the 50-year probability of a large Alpine fault earthquake to be about 27 percent."A magnitude 7.1 earthquake centered near Christchurch, the largest city in the South Island of New Zealand, caused extensive damage to buildings on Sept. 2, 2010, and no deaths. On Feb. 22, 2011, a triggered aftershock measuring magnitude 6.3, with one of the strongest ground motions ever recorded worldwide in an urban area, struck the city killing 185 people. | Earthquakes | 2,012 |
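Radiocarbon dating of leaves, twigs, and marsh plants underpins the 8,000-year event chronology. The conversion from a measured carbon-14 fraction to an age follows the standard decay law; the sketch below uses the conventional 5,730-year half-life and an assumed measured fraction, and it ignores the calibration step a real paleoseismic study would apply.

```python
from math import log

HALF_LIFE_C14 = 5730.0                 # years
DECAY_CONST = log(2) / HALF_LIFE_C14   # per year

def radiocarbon_age(fraction_modern: float) -> float:
    """Age in years from the ratio of remaining 14C to the modern standard."""
    return -log(fraction_modern) / DECAY_CONST

# Assumed measurement for illustration: the sample retains 96.1% of modern 14C.
fraction = 0.961
age = radiocarbon_age(fraction)
print(f"Measured 14C fraction {fraction:.3f} -> uncalibrated age ~{age:,.0f} years")
# Roughly 330 years, the order of the average recurrence interval inferred
# for the Alpine Fault.
```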
July 19, 2012 | https://www.sciencedaily.com/releases/2012/07/120719141808.htm | An earthquake in a maze: Highest-resolution observations yet of the complex 2012 Sumatra earthquake | The powerful magnitude-8.6 earthquake that shook Sumatra on April 11, 2012, was a seismic standout for many reasons, not the least of which is that it was larger than scientists thought an earthquake of its type -- an intraplate strike-slip quake -- could ever be. Now, as Caltech researchers report on their findings from the first high-resolution observations of the underwater temblor, they point out that the earthquake was also unusually complex -- rupturing along multiple faults that lie at nearly right angles to one another, as though racing through a maze. | The new details provide fresh insights into the possibility of ruptures involving multiple faults occurring elsewhere -- something that could be important for earthquake-hazard assessment along California's San Andreas fault, which itself is made up of many different segments and is intersected by a number of other faults at right angles."Our results indicate that the earthquake rupture followed an exceptionally tortuous path, breaking multiple segments of a previously unrecognized network of perpendicular faults," says Jean-Paul Ampuero, an assistant professor of seismology at Caltech and one of the authors of the report, which appears online July 19 in Most mega-earthquakes occur at the boundaries between tectonic plates, as one plate sinks beneath another. The 2012 Sumatra earthquake is the largest earthquake ever documented that occurred away from such a boundary -- a so-called intraplate quake. It is also the largest that has taken place on a strike-slip fault -- the type of fault where the land on either side is pushing horizontally past the other.The earthquake happened far offshore, beneath the Indian Ocean, where there are no geophysical monitoring sensors in place. Therefore, the researchers used ground-motion recordings gathered by networks of sensors in Europe and Japan, and an advanced source-imaging technique developed in Caltech's Seismological Laboratory as well as the Tectonics Observatory to piece together a picture of the earthquake's rupture process.Lingsen Meng, the paper's lead author and a graduate student in Ampuero's group, explains that technique by comparing it with how, when standing in a room with your eyes closed, you can often still sense when someone speaking is walking across the room. "That's because your ears measure the delays between arriving sounds," Meng says. "Our technique uses a similar idea. We measure the delays between different seismic sensors that are recording the seismic movements at set locations." Researchers can then use that information to determine the location of a rupture at different times during an earthquake. 
Recent developments of the method are akin to tracking multiple moving speakers in a cocktail party.Using this technique, the researchers determined that the three-minute-long Sumatra earthquake involved at least three different fault planes, with a rupture propagating in both directions, jumping to a perpendicular fault plane, and then branching to another."Based on our previous understanding, you wouldn't predict that the rupture would take these bends, which were almost right angles," says Victor Tsai, an assistant professor of geophysics at Caltech and a coauthor on the new paper.The team also determined that the rupture reached unusual depths for this type of earthquake -- diving as deep as 60 kilometers in places and delving beneath the Earth's crust into the upper mantle. This is surprising given that, at such depths, pressure and temperature increase, making the rock more ductile and less apt to fail. It has therefore been thought that if a stress were applied to such rocks, they would not react as abruptly as more brittle materials in the crust would. However, given the maze-like rupture pattern of the earthquake, the researchers believe another mechanism might be in play."One possible explanation for the complicated rupture is there might have been reduced friction as a result of interactions between water and the deep oceanic rocks," says Tsai. "And," he says, "if there wasn't much friction on these faults, then it's possible that they would slip this way under certain stress conditions."Adding to the list of the quake's surprising qualities, the researchers pinpointed the rupture to a region of the seafloor where seismologists had previously considered such large earthquakes unlikely based on the geometry of identified faults. When they compared the location they had determined using source-imaging with high-resolution sonar data of the topography of the seafloor, the team found that the earthquake did not involve what they call "the usual suspect faults.""This part of the oceanic plate has fracture zones and other structures inherited from when the seafloor formed here, over 50 million years ago," says Joann Stock, professor of geology at Caltech and another coauthor on the paper. "However, surprisingly, this earthquake just ruptured across these features, as if the older structure didn't matter at all."Meng emphasizes that it is important to learn such details from previous earthquakes in order to improve earthquake-hazard assessment. After all, he says, "If other earthquake ruptures are able to go this deep or to connect as many fault segments as this earthquake did, they might also be very large and cause significant damage."Along with Meng, Ampuero, Tsai, and Stock, additional Caltech coauthors on the paper, "An earthquake in a maze: compressional rupture branching during the April 11 2012 M8.6 Sumatra earthquake," are postdoctoral scholar Zacharie Duputel and graduate student Yingdi Luo. The work was supported by the National Science Foundation, the Gordon and Betty Moore Foundation, and the Southern California Earthquake Center, which is funded by the National Science Foundation and the United States Geological Survey. | Earthquakes | 2,012 |
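The source-imaging idea Meng describes works by measuring arrival-time delays across a distant array and "back-projecting" them to candidate source locations; the grid point whose predicted delays best align the waveforms is taken as the active radiator at that moment. The snippet below is a deliberately simplified, constant-velocity, 2-D delay-and-sum version run on synthetic data; it is not the Caltech implementation, and the station geometry, wave speed, and source location are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
V = 3.0  # assumed constant wave speed in km/s (real studies use a 1-D Earth model)

# Small 2-D array of stations (km) and a true source location just outside it
stations = np.array([[0, 0], [40, 5], [10, 45], [50, 50], [25, 20]], float)
true_src = np.array([60.0, 70.0])

# Synthetic records: a Gaussian pulse arriving at each station after the
# travel time from the true source, plus background noise
dt, nt = 0.05, 2000
t = np.arange(nt) * dt
arrivals = np.linalg.norm(stations - true_src, axis=1) / V
records = np.array([
    np.exp(-((t - a) ** 2) / (2 * 0.2 ** 2)) + 0.05 * rng.standard_normal(nt)
    for a in arrivals
])

# Delay-and-sum back-projection: shift each record by the travel time predicted
# for a trial grid point and stack; the stack peaks where the pulses align.
xs, ys = np.arange(40, 81, 2.0), np.arange(50, 91, 2.0)
power = np.zeros((len(ys), len(xs)))
for iy, y in enumerate(ys):
    for ix, x in enumerate(xs):
        tt = np.linalg.norm(stations - [x, y], axis=1) / V
        shifts = np.round(tt / dt).astype(int)
        stack = sum(np.roll(rec, -s) for rec, s in zip(records, shifts))
        power[iy, ix] = np.max(stack ** 2)

iy, ix = np.unravel_index(np.argmax(power), power.shape)
# Note: a single compact array resolves direction better than distance, so the
# peak can smear along the line from the array toward the source.
print(f"True source: {tuple(true_src)}, back-projected peak: ({xs[ix]:.0f}, {ys[iy]:.0f})")
```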
July 17, 2012 | https://www.sciencedaily.com/releases/2012/07/120717084900.htm | Global health impacts of the Fukushima nuclear disaster | Radiation from Japan's Fukushima Daiichi nuclear disaster may eventually cause anywhere from 15 to 1,300 deaths and from 24 to 2,500 cases of cancer, mostly in Japan, Stanford researchers have calculated. | The estimates have large uncertainty ranges, but contrast with previous claims that the radioactive release would likely cause no severe health effects.The numbers are in addition to the roughly 600 deaths caused by the evacuation of the area surrounding the nuclear plant directly after the March 2011 earthquake, tsunami and meltdown.Recent PhD graduate John Ten Hoeve and Stanford civil engineering Professor Mark Z. Jacobson, a senior fellow at the Precourt Institute for Energy and the Stanford Woods Institute for the Environment, are set to publish their findings on July 17 in the journal The Fukushima Daiichi meltdown was the most extensive nuclear disaster since Chernobyl. Radiation release critically contaminated a "dead zone" of several hundred square kilometers around the plant, and low levels of radioactive material were found as far as North America and Europe.But most of the radioactivity was dumped in the Pacific -- only 19 percent of the released material was deposited over land -- keeping the exposed population relatively small."There are groups of people who have said there would be no effects," said Jacobson.A month after the disaster, the head of the United Nations Science Committee on the Effects of Atomic Radiation, for example, predicted that there would be no serious public health consequences resulting from the radiation.Evaluating the claim, Ten Hoeve and Jacobson used a 3-D global atmospheric model, developed over 20 years of research, to predict the transport of radioactive material. A standard health-effects model was used to estimate human exposure to radioactivity.Because of inherent uncertainties in the emissions and the health-effects model, the researchers found a range of possible death tolls, with a best estimate of 130. A wide span of cancer morbidities was also predicted, with a best estimate of 180.Those affected according to the model were overwhelmingly in Japan, with extremely small effects noticeable in mainland Asia and North America. The United States was predicted to suffer between 0 and 12 deaths and 0 and 30 cancer morbidities, although the methods used were less precise for areas that saw only low radionuclide concentrations."These worldwide values are relatively low," said Ten Hoeve. He explained they should "serve to manage the fear in other countries that the disaster had an extensive global reach."The Japanese government's response was much more rapid and coordinated than that of the Soviets in Chernobyl, which may have mitigated some of the cancer risk.Japanese government agencies, for example, evacuated a 20-kilometer radius around the plant, distributed iodine tablets to prevent radioiodine uptake and prohibited cultivation of crops above a radiation threshold -- steps that Ten Hoeve said "people have applauded."But the paper also notes that nearly 600 deaths were reported as a result of the evacuation process itself, mostly due to fatigue and exposure among the elderly and chronically ill. 
According to the model, the evacuation prevented at most 245 radiation-related deaths -- meaning the evacuation process may have cost more lives than it saved.Still, the researchers cautioned against drawing conclusions about evacuation policy."You still have an obligation to evacuate people according to the worst-case scenario," said Jacobson.To test the effects of varying weather patterns and geography on the reach of a nuclear incident, the two researchers also analyzed a hypothetical scenario: an identical meltdown at the Diablo Canyon Power Plant, near San Luis Obispo, Calif.Despite California's population density being about one-fourth that of Japan's, the researchers found the magnitude of the projected health effects to be about 25 percent larger.The model showed that rather than being whisked toward the ocean, as with Fukushima, a larger percentage of the Diablo Canyon radioactivity deposited over land, including population centers such as San Diego and Los Angeles.Jacobson stressed, however, that none of the calculations expressed the full scope of a nuclear disaster."There's a lot more to the issue than what we examined, which were the cancer-related health effects," he said. "Fukushima was just such a large disaster in terms of soil and water contamination, displacement of lives, confidence in government oversight, cost and anguish." | Earthquakes | 2,012 |
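The health-effects side of an estimate like this typically follows a linear no-threshold (LNT) dose-response, in which expected cancer fatalities scale with collective dose times a risk coefficient. The sketch below is not Ten Hoeve and Jacobson's model: the risk coefficient is the commonly cited ICRP-style nominal value, and the populations and doses are entirely made up; it is meant only to show the structure of the calculation.

```python
# Linear no-threshold (LNT) estimate: expected fatalities ~ collective dose x risk.
RISK_PER_PERSON_SV = 0.05   # nominal fatal-cancer risk per person-sievert (ICRP-style)

# Hypothetical exposed groups: (label, population, mean individual dose in mSv)
groups = [
    ("near-field population",     2_000_000, 1.0),
    ("rest of the country",     120_000_000, 0.01),
    ("distant populations",   1_000_000_000, 0.0005),
]

total = 0.0
for label, population, dose_msv in groups:
    collective_dose_person_sv = population * dose_msv / 1000.0
    deaths = collective_dose_person_sv * RISK_PER_PERSON_SV
    total += deaths
    print(f"{label:24s} collective dose {collective_dose_person_sv:8,.0f} person-Sv "
          f"-> ~{deaths:5.0f} excess deaths")

print(f"Total (illustrative numbers only): ~{total:.0f}")
```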
July 17, 2012 | https://www.sciencedaily.com/releases/2012/07/120717084806.htm | New inexpensive earthquake resistant houses | Researchers at the Universidad Politécnica de Madrid have successfully tested a new system to build earthquake resistant houses of high interest to developing countries prone to earthquakes. | As a result of research carried out at the E.T.S. of Architecture of the UPM, researchers have developed and tested a new construction system, the Integral Masonry System (IMS). The tests showed that a house built with this stable, permanent system withstands severe earthquakes without significant cracking, and that once any cracks are repaired the building can continue to resist further severe earthquakes. Many existing houses in low-income seismic areas are built with adobe, hollow brick or concrete block, and none of these materials by itself has proved suitable for building earthquake resistant houses. From this starting point, the researchers developed a project for developing countries aimed at providing an alternative construction solution that offers reliable performance against natural disasters at low cost. The IMS uses prefabricated trusses made of steel rods -- very light and easy to install by hand -- that intersect in the three directions of space to form the walls and floors, which are then filled with debris, mud, brick or block. On the floor slab, the system needs to incorporate only a plank to provide rigidity. To verify the safety of the new system for construction in seismic zones, the researchers tested three prototypes built with the IMS: two at half scale, one filled with adobe and one filled with hollow brick, and a third at full scale. The tests were conducted at the Antiseismic Structures Laboratory of the Pontificia Universidad Católica del Perú (PUCP) in Lima and coordinated by the Universidad Politécnica de Madrid (UPM) researchers Belén Orta, Rosa Bustamente and José María Adell. The results demonstrated the high potential of the proposed construction system. The IMS provides an easy way to build houses at minimal cost, with typologies adapted to the local way of life wherever it is applied. It is an accessible construction option because it does not require concrete and it uses local materials, which makes it affordable for developing countries. | Earthquakes | 2,012
July 5, 2012 | https://www.sciencedaily.com/releases/2012/07/120705133716.htm | Toward a Better Understanding of Earthquakes | Earth is shaken daily by strong earthquakes recorded by a number of seismic stations worldwide. Tectonic tremor, however, is a new type of seismic signal that seismologist started studying only within the last few years. Tremor is less hazardous than earthquakes and occurs at greater depth. The link between tremor and earthquakes may provide clues about the more destructive earthquakes that occur at shallower depths. Geophysicists of Karlsruhe Institute of Technology (KIT) collected seismic data of tectonic tremor in California. These data are now being evaluated in order to better understand this new seismic phenomenon. | About a decade ago, researchers discovered a previously unknown seismic signal, now referred to as tectonic tremor. Contrary to earthquakes, tectonic tremor causes relatively weak ground shaking. While tremor may last longer than earthquakes, it does not cause any direct danger. "Both earthquakes and tremor have the same cause. They result from the relative movement on fault surfaces, a result of the motion of the tectonic plates," explains seismologist Dr. Rebecca Harrington, who heads a research group at KIT. "While earthquakes at our research site in California typically occur at depths of up to 15 km below the surface, tectonic tremor signals are generated at depths ranging from approximately 15 to 35 km."Tectonic tremor was first detected a decade ago in subduction zones in Japan and in the Pacific Northwest in North America. Since then, seismologists have discovered that tremor occurs in many other places, including the San Andreas fault in California. The San Andreas fault marks the boundary where the Pacific Plate and the North American plate drift past each other, generating many earthquakes in the process. KIT researchers have collected new seismic data recording tremor closer to where it occurs than the seismic stations currently installed near Cholame. In mid-2010, KIT researchers, together with scientists of the University of California, Riverside, and the US Geological Survey, Pasadena, installed 13 seismic stations near Cholame, located approximately halfway between San Francisco and Los Angeles. Each seismic station was equipped with a broadband seismometer in a thermally insulated hole in the ground, a small computer, and a solar panel for power. Broadband seismometers are extremely sensitive to small ground motions, are therefore ideal for detecting tremor and small earthquakes. The data recorded over a period of 14 months are presently being analyzed at KIT.Tectonic tremor signals have a unique character that differs from earthquakes, making them more difficult to detect using automated techniques. In order to address the detection problem, the KIT researchers first developed a new algorithm for the automatic isolation of tectonic tremor. Using their new technique, they found over 2600 tremor events that are now being studied in detail. "In addition to detecting tremor, we will determine their size or magnitude of the individual events. In order to do so, each of the tremor events must be precisely located," says Rebecca Harrington. Additionally, KIT geophysicists compare the tremor and earthquake recordings in California with earthquake recordings at Mount St. Helens volcano, located in the Cascadia subduction zone, located to north of California, in the US state of Washington. 
A volcano eruption from 2004-2008 produced a series of earthquakes on newly formed faults, where the scientists of the US Geological Survey collect data that are also made available to Rebecca Harrington.Seismology is still a long way from being able to predict earthquakes. However, seismologists can better estimate the danger posed by earthquakes by understanding what happens on a fault during a seismic event. According to Rebecca Harrington, research of tectonic tremor may play an important role understanding fault behavior. "We understand very little about what happens on a fault when it ruptures. The tectonic tremor generated on the deep part of a fault may provide clues about the behavior on the more shallow parts of a fault where more damaging earthquakes occur." | Earthquakes | 2,012 |
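Tremor lacks the sharp, impulsive arrivals that automated earthquake detectors key on, so detection schemes commonly work on smoothed waveform envelopes in roughly the 2-8 Hz band, flagging sustained intervals where the envelope stays elevated. The exact KIT algorithm is not described in the article; the following is a generic single-station sketch of that envelope approach run on synthetic data, with all parameter values chosen for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 50.0                                   # sampling rate, Hz
t = np.arange(0, 600, 1 / fs)               # ten minutes of synthetic data
rng = np.random.default_rng(1)

# Synthetic record: background noise plus a 2-minute "tremor-like" emergent burst
signal = 0.2 * rng.standard_normal(t.size)
burst = (t > 200) & (t < 320)
signal[burst] += 0.6 * rng.standard_normal(burst.sum())

# 1) Band-pass to the 2-8 Hz band where tremor energy is typically concentrated
b, a = butter(4, [2 / (fs / 2), 8 / (fs / 2)], btype="band")
filtered = filtfilt(b, a, signal)

# 2) Envelope via the Hilbert transform, then heavy smoothing (~30 s window)
envelope = np.abs(hilbert(filtered))
win = int(30 * fs)
smooth = np.convolve(envelope, np.ones(win) / win, mode="same")

# 3) Flag sustained intervals above a threshold relative to the median level
threshold = 2.0 * np.median(smooth)
detected = smooth > threshold
print(f"Flagged tremor-like signal during {detected.mean() * 100:.0f}% of the record "
      f"(true burst occupies {burst.mean() * 100:.0f}%)")
```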
June 27, 2012 | https://www.sciencedaily.com/releases/2012/06/120627142520.htm | Potential for tsunamis in northwestern California documented | Using studies that span the last three decades, scientists at UC Santa Barbara have compiled the first evidence-based comprehensive study of the potential for tsunamis in Northwestern California. | The paper was co-written by professors Edward Keller and Alexander Simms from UCSB's Department of Earth Science, and published in a recent issue of the journal. The paper is based on the Ph.D. dissertation of David Valentine, a research programmer at the Spatial Information Systems Laboratory at UC San Diego. Valentine, Keller's former student, completed his doctorate at UCSB in 2002 and is first author of the paper. The region has long been known to experience large earthquakes, and scientific studies of seismic activity in the southern end of the Cascadia Subduction Zone (CSZ) -- which stretches northward from the area of Mendocino, Calif. -- have previously appeared in grey literature and in guidebooks. However, comprehensive, reviewed evidence-based work has been lacking, according to Keller. "Science goes on evidence," he said, adding that in light of the recent earthquakes in Japan and Chile, the study of the same potential closer to home is "timely." The authors studied sedimentation patterns in salt marshes, floodplains, and estuaries in the northwestern corner of California for signs of seismic events that could lead to tsunami activity. They combined this with information gathered from numerous studies conducted over nearly 30 years by researchers at Humboldt State University. During an earthquake, the researchers say, there is a tendency for the coastal wetlands to become submerged, with coastal sediments depositing over plants and animals that live there. These become a fossilized record of sea-level change in the area. The process has preserved a sequence of marsh surfaces and forest soils. Analysis of structure, texture, and organic content, as well as the use of radiocarbon dating to identify the age of the materials, revealed evidence of smaller strong-to-major earthquakes in the area (magnitude 6.5 to 7.2). Larger quakes (greater than magnitude 8.2) that involved the regional subduction zone were also in evidence. According to the study, the local California section has experienced three major earthquakes over the last 2000 years, and accompanying local sea-level changes at roughly 300- to 400-year intervals, with the last one occurring 500 to 600 years ago. The researchers also found that the entire CSZ ruptured, causing local submergence, at least three times at roughly 500- to 600-year intervals, the last activity taking place in 1700 AD. "It's not a matter of if, but when," said Keller, of the potential for the next major earthquake/tsunami event in the region -- a great earthquake that would impact not only the Northwest, but also send waves to Japan and Hawaii. The evidence, he said, is leading to far more foresight and planning along the impact areas in the region to avoid catastrophes on a level with the Japan earthquake of 2011 or the Indian Ocean quake of 2004. Other researchers contributing to the study include Gary Carver, a professor emeritus at Humboldt State University; Wen Hao Li from Northrop Grumman Corp. in Redondo Beach; and Christine Manhart from Environmental Services and Consulting in Blacksburg, Va. | Earthquakes | 2,012
June 10, 2012 | https://www.sciencedaily.com/releases/2012/06/120610151453.htm | Undersea volcano gave off signals before eruption in 2011 | A team of scientists that last year created waves by correctly forecasting the 2011 eruption of Axial Seamount years in advance now says that the undersea volcano located some 250 miles off the Oregon coast gave off clear signals just hours before its impending eruption. | The researchers' documentation of inflation of the undersea volcano from gradual magma intrusion over a period of years led to the long-term eruption forecast. But new analyses using data from underwater hydrophones also show an abrupt spike in seismic energy about 2.6 hours before the eruption started, which the scientists say could lead to short-term forecasting of undersea volcanoes in the future. They also say that Axial could erupt again -- as soon as 2018 -- based on the cyclic pattern of ground deformation measurements from bottom pressure recorders. Results of the research, which was funded by the National Science Foundation, the National Oceanic and Atmospheric Administration, and the Monterey Bay Aquarium Research Institute (MBARI), are being published this week in three separate articles in the journal Nature Geoscience. Bill Chadwick, an Oregon State University geologist and lead author on one of the papers, said the link between seismicity, seafloor deformation and the intrusion of magma has never been demonstrated at a submarine volcano, and the multiple methods of observation provide fascinating new insights. "Axial Seamount is unique in that it is one of the few places in the world where a long-term monitoring record exists at an undersea volcano -- and we can now make sense of its patterns," said Chadwick, who works out of Oregon State's Hatfield Marine Science Center in Newport, Ore. "We've been studying the site for years and the uplift of the seafloor has been gradual and steady beginning in about 2000, two years after it last erupted." "But the rate of inflation from magma went from gradual to rapid about 4-5 months before the eruption," added Chadwick. "It expanded at roughly triple the rate, giving a clue that the next eruption was coming." Bob Dziak, an Oregon State University marine geologist, had previously deployed hydrophones on Axial that monitor sound waves for seismic activity. During a four-year period prior to the 2011 eruption, there was a gradual buildup in the number of small earthquakes (roughly magnitude 2.0), but little increase in the overall "seismic energy" resulting from those earthquakes. That began to change a few hours before the April 6, 2011, eruption, said Dziak, who also is lead author on one of the Nature Geoscience articles. "The hydrophones picked up the signal of literally thousands of small earthquakes within a few minutes, which we traced to magma rising from within the volcano and breaking through the crust," Dziak said. "As the magma ascends, it forces its way through cracks and creates a burst of earthquake activity that intensifies as it gets closer to the surface." "Using seismic analysis, we were able to clearly see how the magma ascends within the volcano about two hours before the eruption," Dziak said.
"Whether the seismic energy signal preceding the eruption is unique to Axial or may be replicated at other volcanoes isn't yet clear -- but it gives scientists an excellent base from which to begin." The researchers also used a one-of-a-kind robotic submersible to bounce sound waves off the seafloor from an altitude of 50 meters, mapping the topography of Axial Seamount both before and after the 2011 eruption at a one-meter horizontal resolution. These before-and-after surveys allowed geologists to clearly distinguish the 2011 lava flows from the many previous flows in the area. MBARI researchers used three kinds of sonar to map the seafloor around Axial, and the detailed images show lava flows as thin as eight inches and as thick as 450 feet. "These autonomous underwater vehicle-generated maps allowed us, for the first time, to comprehensively map the thickness and extent of lava flows from a deep-ocean submarine in high resolution," said David Caress, an MBARI engineer and lead author on one of the Nature Geoscience articles. "These new observations allow us to unambiguously differentiate between old and new lava flows, locate fissures from which these flows emerged, and identify fine-scale features formed as the lava flowed and cooled." The researchers also used shipboard sonar data to map a second, thicker lava flow about 30 kilometers south of the main flow -- also a likely result of the 2011 eruption. Knowing the events leading up to the eruption -- and the extent of the lava flows -- is important because over the next few years researchers will be installing many new instruments and underwater cables around Axial Seamount as part of the Ocean Observatories Initiative. These new instruments will greatly increase scientists' ability to monitor the ocean and seafloor off of the Pacific Northwest. "Now that we know some of the long-term and short-term signals that precede eruptions at Axial, we can monitor the seamount for accelerated seismicity and inflation," said OSU's Dziak. "The entire suite of instruments will be deployed as part of the Ocean Observatories Initiative in the next few years -- including new sensors, samplers and cameras -- and next time they will be able to catch the volcano in the act." The scientists also observed and documented newly formed hydrothermal vents with associated biological activity, Chadwick said. "We saw snowblower vents that were spewing out nutrients so fast that the microbes were going crazy," he pointed out. "Combining these biological observations with our knowledge of the ground deformation, seismicity and lava distribution from the 2011 eruption will further help us connect underwater volcanic activity with the life it supports." Scientists from Columbia University, the University of Washington, North Carolina State University, and the University of California at Santa Cruz also participated in the project and were co-authors on the Nature Geoscience articles. | Earthquakes | 2,012
June 1, 2012 | https://www.sciencedaily.com/releases/2012/06/120601120606.htm | Plate tectonics cannot explain dynamics of Earth and crust formation more than three billion years ago | The current theory of continental drift provides a good model for understanding terrestrial processes through history. However, while plate tectonics is able to successfully shed light on processes up to 3 billion years ago, the theory isn't sufficient to explain the dynamics of Earth and crust formation before that point and back to the earliest formation of the planet, some 4.6 billion years ago. This is the conclusion of Tomas Næraa of the Nordic Center for Earth Evolution at the Natural History Museum of Denmark, a part of the University of Copenhagen. His new doctoral dissertation has just been published in a scientific journal. | "Using radiometric dating, one can observe that Earth's oldest continents were created in geodynamic environments which were markedly different from current environments characterised by plate tectonics. Therefore, plate tectonics as we know it today is not a good model for understanding the processes at play during the earliest episodes of Earth's history, those beyond 3 billion years ago. There was another crust dynamic and crust formation that occurred under other processes," explains Tomas Næraa, who has been a PhD student at the Natural History Museum of Denmark and the Geological Survey of Denmark and Greenland -- GEUS. Plate tectonics is a theory of continental drift and sea floor spreading. A wide range of phenomena, from volcanism, earthquakes and undersea earthquakes (and pursuant tsunamis) to variations in climate and species development on Earth, can be explained by the plate tectonics model, globally recognized during the 1960s. Tomas Næraa can now demonstrate that the half-century-old model no longer suffices. "Plate tectonics theory can be applied to about 3 billion years of the Earth's history. However, the Earth is older, up to 4.567 billion years old. We can now demonstrate that there has been a significant shift in the Earth's dynamics. Thus, the Earth, during the first third of its history, developed under conditions other than what can be explained using the plate tectonics model," explains Tomas Næraa. Tomas is currently employed as a project researcher at GEUS. Since 2006, the 40-year-old Tomas Næraa has conducted studies of rocks sourced in the 3.85-billion-year-old bedrock of the Nuuk region in West Greenland. Using isotopes of the element hafnium (Hf), he has managed to shed light upon a research topic that has puzzled geologists around the world for 30 years. Næraa's supervisor, Professor Minik Rosing of the Natural History Museum of Denmark, considers Næraa's dissertation a seminal work: "We have come to understand the context of the Earth's and the continents' origins in an entirely new way. Climate and nutrient cycles which nourish all terrestrial organisms are driven by plate tectonics. So, if the Earth's crust formation was controlled and initiated by other factors, we need to find out what controlled climate and the environments in which life began and evolved 4 billion years ago. This fundamental understanding can be of great significance for the understanding of future climate change," says Minik Rosing, who adds: "An enormous job waits ahead, and Næraa's dissertation is an epochal step." | Earthquakes | 2,012
May 31, 2012 | https://www.sciencedaily.com/releases/2012/05/120531101905.htm | 'Like a jet through solid rock:' Volcanic arc fed by rapid fluid pulses | The depths of Earth are anything but peaceful: large quantities of liquids carve their way through the rock as fluids, causing magma to form. A research team led by the University of Münster has shown that these fluids flow a lot faster through solid rock than previously assumed. In the Chinese Tian Shan Mountains, fluids pushed their way to Earth's mantle from great depths in just 200 years rather than over tens or even hundreds of thousands of years. | The researchers from Münster, Kiel, Bochum, Erlangen, Bethlehem (USA) and Lausanne (Switzerland) present their findings, based on an innovative combination of fieldwork, geochemical analysis and numerical calculations, in the current issue of a scientific journal. When tectonic plates move towards each other and push over each other at the edges, so-called subduction zones are formed. The descending plate is heated and continuously releases the water stored in its rocks as fluid. The fluid penetrates Earth's mantle, which is located above the descending plate. The fluids thus lower the melting point of the mantle rocks, and the liquid rock formed rises to the volcanoes as magma. This magma feeds the many volcanoes throughout the world that occur along convergent plate boundaries and form the "Ring of Fire," a volcanic belt that encircles the Pacific Ocean. The fluids are commonly assumed to flow through the rock in a defined flow system; geologists call these structures veins. During field work in the Chinese part of the Tian Shan Mountains (Celestial Mountains), the research team found structures in the rocks they were studying that can be ascribed to massive fluid flows at great depth. "Our investigation has shown that a great deal of fluid must have flowed through a rock vein at about 70 km depth and that this fluid has obviously already covered a distance of several hundred meters or more -- the transport of such large quantities of fluid over such a great distance has not been demonstrated by anyone before us," explains Timm John from the Institute for Mineralogy, University of Münster. "And the most exciting thing is that this amount of fluid flowed through the rock in what is, for geological processes, a very short time, only about two hundred years," adds Nikolaus Gussone of the same institute. The release of fluids from minerals in the descending plates is a large-scale and continuous process that takes place at depths of up to two hundred kilometres and takes millions of years. During this time, the fluids first accumulate. As the researchers have now shown for the first time, the released fluids then flowed through the plate on their way to the mantle in pulses, in a relatively short time, along defined flow paths. "It's like a reservoir that continuously fills and then empties in a surge through defined channels," Timm John points out. "The fluid release is focused in space and time, and is much faster than expected -- almost like a jet through solid rock." The researchers hope to be able to show the spatial and temporal correlations between such fluid pulses and volcanic activity in future studies. It is also possible that such focused fluid releases are associated with the occurrence of earthquake events in subduction zones. To be able to demonstrate such relations, however, intensive research is still needed. The petrologists from Ruhr-Universität Bochum (RUB) were involved in modelling the chemical data.
This enabled the research team to determine the time it took the fluids to make their way to the mantle. Determining the time scales of various geological processes is a particular expertise of Bochum's petrologists. Among other things, they use minerals and rocks with zones that exhibit a different chemical composition. | Earthquakes | 2,012 |
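The record above says the timescale was recovered from chemically zoned minerals but does not spell out the method. A common approach in such work is diffusion chronometry, in which the time available for a compositional step to relax scales roughly as t ≈ x²/D. The sketch below uses hypothetical values for the profile width x and the diffusivity D purely to show the arithmetic, not the study's actual parameters.

    # Order-of-magnitude diffusion-chronometry sketch: t ~ x^2 / D
    x = 50e-6   # hypothetical diffusion-profile width: 50 micrometres, in metres
    D = 4e-19   # hypothetical diffusivity at the relevant temperature, in m^2/s

    t_seconds = x**2 / D
    t_years = t_seconds / (3600 * 24 * 365.25)
    print(f"~{t_years:.0f} years")  # roughly 200 years for these illustrative values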
May 30, 2012 | https://www.sciencedaily.com/releases/2012/05/120530133740.htm | San Andreas Fault in Santa Cruz Mountains: Large earthquakes more frequent than previously thought | Recent paleoseismic work has documented four surface-rupturing earthquakes that occurred across the Santa Cruz Mountains section of the San Andreas Fault (SAF) in the past 500 years. The research, conducted by the U.S. Geological Survey with assistance from the California Geological Survey, suggests an average recurrence interval of 125 years, indicating the seismic hazard for the area may be significantly higher than currently recognized. | The observations help fill a gap in data on the seismic activity of the SAF in northern California, particularly south of San Francisco. Geologists Thomas Fumal and Tim Dawson conducted paleoseismic studies at Mill Canyon, near Watsonville, California. They documented evidence for four earthquakes, the most recent being the 1906 M 7.8 San Francisco event. They conclude that each of the three earthquakes prior to the 1906 quake was a large-magnitude event that likely ruptured most, or all, of the Santa Cruz Mountains segment, producing physical deformation similar to that of the 1906 quake. In addition to filling in a data gap about the SAF in this region, this research adds to the understanding of how the SAF behaves -- in particular, whether individual segments of the fault system can produce destructive earthquakes and how often. This study joins a growing body of work suggesting that the SAF produces a wider array of magnitudes than is appreciated in current seismic hazard models. | Earthquakes | 2,012
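A 125-year average recurrence (four surface ruptures in roughly 500 years) translates into a rough probability only under strong assumptions. The sketch below uses a time-independent Poisson model -- not the methodology behind official hazard forecasts -- simply to show why such an average implies a substantial chance of rupture over ordinary planning horizons.

    import math

    MEAN_RECURRENCE_YEARS = 125.0

    def rupture_probability(horizon_years: float) -> float:
        """P(at least one rupture within the horizon) under a simple Poisson model."""
        return 1.0 - math.exp(-horizon_years / MEAN_RECURRENCE_YEARS)

    for horizon in (30, 50, 100):
        print(f"{horizon} years: {rupture_probability(horizon):.0%}")
    # ~21% in 30 years, ~33% in 50 years, ~55% in 100 years under these assumptions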
May 24, 2012 | https://www.sciencedaily.com/releases/2012/05/120524123236.htm | Seismic hazard: Faults discovered near Lake Tahoe could generate earthquakes ranging from 6.3 to 6.9 | A new U.S. Geological Survey study concludes that faults west of Lake Tahoe, Calif., referred to as the Tahoe-Sierra frontal fault zone, substantially increase the assessed seismic hazard for the Lake Tahoe region of California and Nevada and could potentially generate earthquakes with magnitudes ranging from 6.3 to 6.9. A close association of landslide deposits and active faults also suggests that there is an earthquake-induced landslide hazard along the steep fault-formed range front west of Lake Tahoe. | Using a new high-resolution imaging technology, known as bare-earth airborne LiDAR (Light Detection And Ranging), combined with field observations and modern geochronology, USGS scientists and their colleagues from the University of Nevada, Reno; the University of California, Berkeley; and the U.S. Army Corps of Engineers have confirmed the existence of previously suspected faults. LiDAR imagery allows scientists to "see" through dense forest cover and recognize earthquake faults that are not detectable with conventional aerial photography. "This study is yet one more stunning example of how the availability of LiDAR information to precisely and accurately map the shape of the solid Earth surface beneath vegetation is revolutionizing the geosciences," said USGS Director Marcia McNutt. "From investigations of geologic hazards to calculations of carbon stored in the forest canopy to simply making the most accurate maps possible, LiDAR returns its investment many times over." Motion on the faults has offset linear moraines (the boulders, cobbles, gravel, and sand deposited by an advancing glacier), providing a record of tectonic deformation since the moraines were deposited. The authors developed new three-dimensional techniques to measure the amount of tectonic displacement of moraine crests caused by repeated earthquakes. Dating of the moraines from the last two glaciations in the Tahoe basin, around 21 thousand and 70 thousand years ago, allowed the study authors to calculate the rates of tectonic displacement. "Although the Tahoe-Sierra frontal fault zone has long been recognized as forming the tectonic boundary between the Sierra Nevada to the west and the Basin and Range Province to the east, its level of activity and hence seismic hazard was not fully recognized because dense vegetation obscured the surface expressions of the faults," said USGS scientist and lead author James Howle. "Using the new LiDAR technology has improved and clarified previous field mapping, has provided visualization of the surface expressions of the faults, and has allowed for accurate measurement of the amount of motion that has occurred on the faults. The results of the study demonstrate that the Tahoe-Sierra frontal fault zone is an important seismic source for the region." | Earthquakes | 2,012
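The slip-rate calculation described in the Tahoe record is, at its core, offset divided by age. The numbers below are hypothetical placeholders (the study's measured offsets are not quoted in this summary), and a real analysis would propagate dating and measurement uncertainties rather than report a single value.

    # Hypothetical vertical slip-rate estimate from an offset moraine crest
    offset_m = 15.0              # hypothetical cumulative vertical offset, in metres
    moraine_age_years = 21_000   # age of the younger glaciation cited in the article

    slip_rate_mm_per_year = offset_m * 1000.0 / moraine_age_years
    print(f"~{slip_rate_mm_per_year:.2f} mm/yr")  # ~0.71 mm/yr for these placeholder values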
May 16, 2012 | https://www.sciencedaily.com/releases/2012/05/120516140105.htm | Sumatra faces yet another risk: Major volcanic eruptions | The early April earthquake of magnitude 8.6 that shook Sumatra was a grim reminder of the devastating earthquakes and tsunami that killed tens of thousands of people in 2004 and 2005. | Now a new study, funded by the National Science Foundation, shows that the residents of that region are at risk from yet another potentially deadly natural phenomenon -- major volcanic eruptions. Researchers from Oregon State University working with colleagues in Indonesia have documented six major volcanic eruptions in Sumatra over the past 35,000 years -- most equaling or surpassing in explosive intensity the eruption of Washington's Mount St. Helens in 1980. Results of the research have just been published in a peer-reviewed journal. "Sumatra has a number of active and potentially explosive volcanoes and many show evidence of recent activity," said Morgan Salisbury, lead author on the study, who recently completed his doctoral studies in OSU's College of Earth, Ocean, and Atmospheric Sciences. "Most of the eruptions are small, so little attention has been paid to the potential for a catastrophic eruption." "But our study found some of the first evidence that the region has a much more explosive history than perhaps has been appreciated," he added. Until this study, little was known about Sumatra's volcanic history -- in part because few western scientists have been allowed access to the region. The most visible evidence of recent volcanic activity among the estimated 33-35 potentially active volcanoes is their steep-sided cones and lack of vegetation, indicating at least some minor eruptive processes. But in 2007, an expedition led by OSU's Chris Goldfinger was permitted into the region, and the Oregon State researchers and their Indonesian colleagues set out to explore the earthquake history of the region by studying sediment cores from the Indian Ocean. Funded by the National Science Foundation, it was the first research ship from the United States allowed into Indonesian/Sumatran waters in nearly 30 years. While searching the deep-sea sediment cores for "turbidites" -- coarse gravel deposits that can act as a signature for earthquakes -- they noticed unmistakable evidence of volcanic ash and began conducting a parallel investigation into the region's volcanic history. "The ash was located only in certain cores, so the activity was localized," said Adam Kent, a professor of geosciences at OSU and an author on the study. "Yet the eruptions still were capable of spreading the ash for 300 kilometers or more, which gave us an indication of how powerful the explosive activity might have been." Salisbury and his colleagues found evidence of six major eruptions and estimated them to range from at least 3.0 to 5.0 on the Volcanic Explosivity Index. Mount St. Helens, by comparison, was 5.0. The Indian Ocean region is certainly known to have a violent volcanic history. The 1883 eruption of Krakatoa between Sumatra and Java is perhaps the most violent volcanic explosion in recorded history, measuring 6.0 on the VEI and generating what many scientists believe to have been one of the loudest noises ever heard on Earth. Sumatra's own Toba volcano exploded about 74,000 years ago, generating a major lake -- not unlike Oregon's own Crater Lake, but much larger.
"It looks like a giant doughnut in the middle of Sumatra," said Jason "Jay" Patton, another OSU doctoral student and author on the study. Sumatra's volcanoes occasionally belch some ash and smoke, and produce comparatively minor eruptions, but residents there may not be fully aware of the potentially catastrophic nature of some of the island's volcanoes, Goldfinger said. "Prior to 2004, the risk from a major earthquake was not widely appreciated except, perhaps, in some of the more rural areas," Goldfinger said. "And earthquakes happen more frequently than major volcanic eruptions. If it hasn't happened in recent memory…" Kent said the next step in the research is to work with scientists from the region to collect ash and volcanic rock from the island's volcanoes, and then match their chemical signature to the ash they discovered in the sediment cores. "Each volcano has a subtly different fingerprint," Kent said, "so if we can get the terrestrial data, we should be able to link the six major eruptions to individual volcanoes to determine the ones that provide the greatest risk factors." In addition to the Oregon State University scientists, two Indonesian researchers were authors on the journal article: Yusuf Djadjadihardja and Udrekh Hanif, of the Agency for the Assessment and Application of Technology in Jakarta. | Earthquakes | 2,012
May 10, 2012 | https://www.sciencedaily.com/releases/2012/05/120510142003.htm | Greater insight into earthquake cycles | For those who study earthquakes, one major challenge has been trying to understand all the physics of a fault -- both during an earthquake and at times of "rest" -- in order to know more about how a particular region may behave in the future. Now, researchers at the California Institute of Technology (Caltech) have developed the first computer model of an earthquake-producing fault segment that reproduces, in a single physical framework, the available observations of both the fault's seismic (fast) and aseismic (slow) behavior. | "Our study describes a methodology to assimilate geologic, seismologic, and geodetic data surrounding a seismic fault to form a physical model of the cycle of earthquakes that has predictive power," says Sylvain Barbot, a postdoctoral scholar in geology at Caltech and lead author of the study. A paper describing their model -- the result of a Caltech Tectonics Observatory (TO) collaborative study by geologists and geophysicists from the Institute's Division of Geological and Planetary Sciences and engineers from the Division of Engineering and Applied Science -- appears in the May 11 edition of a scientific journal. "Previous research has mostly either concentrated on the dynamic rupture that produces ground shaking or on the long periods between earthquakes, which are characterized by slow tectonic loading and associated slow motions -- but not on both at the same time," explains study coauthor Nadia Lapusta, professor of mechanical engineering and geophysics at Caltech. Her research group developed the numerical methods used in making the new model. "In our study, we model the entire history of an earthquake-producing fault and the interaction between the fast and slow deformation phases." Using previous observations and laboratory findings, the team -- which also included coauthor Jean-Philippe Avouac, director of the TO -- modeled an active region of the San Andreas Fault called the Parkfield segment. Located in central California, Parkfield produces magnitude-6 earthquakes every 20 years on average. They successfully created a series of earthquakes (ranging from magnitude 2 to 6) within the computer model, producing fault slip before, during, and after the earthquakes that closely matched the behavior observed in the past fifty years. "Our model explains some aspects of the seismic cycle at Parkfield that had eluded us, such as what causes changes in the amount of time between significant earthquakes and the jump in location where earthquakes nucleate, or begin," says Barbot. The paper also demonstrates that a physical model of fault-slip evolution, based on laboratory experiments that measure how rock materials deform in the fault core, can explain many aspects of the earthquake cycle -- and does so on a range of time scales. "Earthquake science is on the verge of building models that are based on the actual response of the rock materials as measured in the lab -- models that can be tailored to reproduce a broad range of available observations for a given region," says Lapusta. "This implies we are getting closer to understanding the physical laws that govern how earthquakes nucleate, propagate, and arrest." She says that they may be able to use models much like the one described in the paper to assess seismic hazard. Avouac agrees. "Currently, seismic hazard studies rely on what is known about past earthquakes," he says.
"However, the relatively short recorded history may not be representative of all possibilities, especially rare extreme events. This gap can be filled with physical models that can be continuously improved as we learn more about earthquakes and laws that govern them." "As computational resources and methods improve, dynamic simulations of even more realistic earthquake scenarios, with full account for dynamic interactions among faults, will be possible," adds Barbot. | Earthquakes | 2,012
May 3, 2012 | https://www.sciencedaily.com/releases/2012/05/120503162021.htm | Rapid Sierra Nevada uplift tracked by scientists | From the highest peak in the continental United States, Mt. Whitney at 14,000 feet in elevation, to the 10,000-foot peaks near Lake Tahoe, scientific evidence from the University of Nevada, Reno shows the entire Sierra Nevada mountain range is rising at the relatively fast rate of 1 to 2 millimeters every year. | "The exciting thing is we can watch the range growing in real time," said University of Nevada, Reno's Bill Hammond, lead researcher on the multi-year project to track the rising range. "Using data back to before 2000 we can see it with accuracy better than 1 millimeter per year. Perhaps even more amazing is that these minuscule changes are measured using satellites in space." Minuscule as they may be, the data indicate that long-term trends in crustal uplift suggest the modern Sierra could be formed in less than 3 million years, which is relatively quick when compared to estimates using some geological techniques. Hammond and his colleagues in the University's Nevada Geodetic Laboratory and at the University of Glasgow use satellite-based GPS data and InSAR (space-based radar) data to calculate the movements to this unprecedented accuracy. The calculations show that the crust moves upward compared to Earth's center of mass and compared to relatively stable eastern Nevada. The data may help resolve an active debate regarding the age of the modern Sierra Nevada of California and Nevada in the western United States. The history of elevation is complex, exhibiting features of both ancient (40-60 million years) and relatively young (less than 3 million years) elevation. The "young" elevation is the uplift Hammond and colleagues have tracked. "The Sierra Nevada uplift process is fairly unique on Earth and not well understood," Hammond said. "Our data indicate that uplift is distributed along the entire length of the 400-mile-long range, between 35 and 40 degrees north latitude, that it is active, and that it could have generated the entire range in less than 3 million years, which is young compared to estimates based on some other techniques. It basically means that the latest pulse of uplift is still ongoing." Possibly contributing to the rapid uplift are tectonic extension in Nevada and a response to flow in the mantle. Seismologists indicate the mountain range may have risen when a fragment of lower plate peeled off the bottom of the lithosphere, allowing the "speedy" uplift, like a ship that has lost its keel. In comparison, other ranges, such as the Alps or Andes, are being formed by an entirely different process, caused by contraction as two plates collide. "We've integrated GPS and InSAR measurement techniques, drawing from experience we developed in the past five years in our work with tectonic deformation, to see how the Sierra is gradually being pushed upwards," Hammond said. "Combined with more GPS stations, and more radar data, detecting motions in the Earth is becoming more precise and ubiquitous.
We can see the steady and constant motion of the Sierra in addition to episodic events such as earthquakes." Hammond's team includes Geoff Blewitt, Hans-Peter Plag and Corné Kreemer from the University of Nevada, Reno's College of Science, and Zhenhong Li of the Centre for the Observation and Modeling of Earthquakes, Volcanoes and Tectonics, School of Geographical and Earth Sciences, University of Glasgow in the UK. GPS data for Hammond and his team's research is collected through the team's MAGNET GPS Network based at the University of Nevada, Reno, plus more than 1,200 stations from the NSF EarthScope Plate Boundary Observatory and more than 10,000 stations from around the entire planet. These stations include hundreds that cover Nevada, California, Oregon, and Washington. The space-based radar data comes from the European Space Agency with support from NASA. This research was funded in the United States by the National Science Foundation and NASA and in the United Kingdom by the Natural Environment Research Council. | Earthquakes | 2,012
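The link between the measured 1-2 mm per year and the "less than 3 million years" figure in this record is straightforward arithmetic, sketched below as a rough check; it ignores erosion and any variation of the uplift rate through time.

    # Back-of-the-envelope check: how long does 1-2 mm/yr take to build ~3 km of relief?
    target_relief_km = 3.0

    for rate_mm_per_year in (1.0, 2.0):
        km_per_million_years = rate_mm_per_year   # 1 mm/yr equals 1 km per million years
        million_years_needed = target_relief_km / km_per_million_years
        print(f"{rate_mm_per_year} mm/yr -> ~{million_years_needed:.1f} million years")
    # 1 mm/yr gives ~3 km in 3 Myr; 2 mm/yr gives the same relief in ~1.5 Myr,
    # consistent with the "less than 3 million years" statement.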
May 2, 2012 | https://www.sciencedaily.com/releases/2012/05/120502184712.htm | First-of-its-kind study reveals surprising ecological effects of earthquake and tsunami | The reappearance of long-forgotten habitats and the resurgence of species unseen for years may not be among the expected effects of a natural disaster. Yet that's exactly what researchers have found on the sandy beaches of south central Chile, after an 8.8-magnitude earthquake and devastating tsunami in 2010. Their study also revealed a preview of the problems wrought by sea level rise -- a major symptom of climate change. | In a scientific first, researchers from Universidad Austral de Chile and UC Santa Barbara's Marine Science Institute (MSI) were able to document the before-and-after ecological impacts of such cataclysmic occurrences. Their findings appear in a newly published scientific paper. "So often you think of earthquakes as causing total devastation, and adding a tsunami on top of that is a major catastrophe for coastal ecosystems. As expected, we saw high mortality of intertidal life on beaches and rocky shores, but the ecological recovery at some of our sandy beach sites was remarkable," said Jenifer Dugan, an associate research biologist at MSI. "Dune plants are coming back in places where there haven't been plants, as far as we know, for a very long time. The earthquake created sandy beach habitat where it had been lost. This is not the initial ecological response you might expect from a major earthquake and tsunami." Their findings owe a debt to serendipity. With joint support from Chile's Fondo Nacional de Desarrollo Científico y Tecnológico and the U.S. National Science Foundation's Long Term Ecological Research program, the scientists were already knee-deep in a collaborative study of how sandy beaches in Santa Barbara and south central Chile respond, ecologically, to human-made armoring such as seawalls and rocky revetments. As part of that project, the Chilean team surveyed nine sandy beaches along the coasts of Maule and Bíobío in late January 2010. The earthquake hit in February. Realizing their unique opportunity, the scientists quickly changed gears and within days were back on the beaches to reassess their study sites in the catastrophe's aftermath. They have returned many times since, diligently documenting the ecological recovery and long-term effects of the earthquake and tsunami on these coastlines, in both natural and human-altered settings. The magnitude and direction of land-level change brought the greatest impact, drowning beaches especially where the tsunami exacerbated earthquake-induced subsidence -- and widening and flattening beaches where the earthquake brought uplift. The drowned beach areas suffered mortality of intertidal life; the widened beaches quickly saw the return of plants and animals that had vanished due to the effects of coastal armoring. "With the study in California and our study here, we knew that building coastal defense structures, such as seawalls, decreases beach area, and that a seawall results in the decline of intertidal diversity," said lead author Eduardo Jaramillo, of Universidad Austral de Chile. "But after the earthquake, where significant continental uplift occurred, the beach area that had been lost due to coastal armoring has now been restored.
And the re-colonization of the mobile beach fauna was under way just weeks after." With responses varying so widely depending on land-level changes, mobility of flora and fauna, and shore type, the findings show not only that the interactions of extreme events with armored beaches can produce surprising ecological outcomes -- but also suggest that landscape alteration, including armoring, can leave lasting footprints in coastal ecosystems. "When someone builds a seawall, not only is beach habitat covered up with the wall itself, but, over time, sand is lost in front of the wall until the beach eventually drowns," Dugan said. "The semi-dry and damp sand zones of the upper and mid intertidal are lost first, leaving only the wet lower beach zones. This causes the beach to lose diversity, including birds, and to lose ecological function. This is an underappreciated human impact on coastlines around the world, and with climate change squeezing beaches further, it's a very serious issue to consider." Jaramillo elaborated, "This is very important because sandy beaches represent about 80 percent of the open coastlines globally. Also, sandy beaches are very good barriers against the sea level rise we are seeing around the world. It is essential to take care of sandy beaches. They are not only important for recreation, but also for conservation." The study is said to be the first-ever quantification of earthquake and tsunami effects on sandy beach ecosystems along a tectonically active coastal zone. | Earthquakes | 2,012
April 25, 2012 | https://www.sciencedaily.com/releases/2012/04/120425193050.htm | Researchers give long look at who benefits from nature tourism | Using nature's beauty as a tourist draw can boost conservation in China's valued panda preserves, but it isn't an automatic ticket out of poverty for the human inhabitants, a unique long-term study shows. | The policy hitch: Often those who benefit most from nature-based tourism endeavors are people who already have resources. The truly impoverished have a harder time breaking into the tourism business. Wei Liu's study was published in the April 25 edition of a scientific journal. Until now, no one had taken a close look at the long-term economic implications for local people. Liu is a PhD student in the Center for Systems Integration and Sustainability at Michigan State University. He and his colleagues took advantage of the center's 15-year history of work in Wolong -- which they call the "excellent laboratory" -- to study the complex interactions of humans and nature. "This study shows the power of having comprehensive long-term data to understand how everything works together," Liu said. "This is the first time we've been able to put it together to understand how changes are being made." "Drivers and Socioeconomic Impacts of Tourism Participation in Protected Areas" is co-authored by Christine Vogt, professor of community, agriculture, recreation and resource studies; research associate Junyan Luo; Guangming He, research assistant; Kenneth Frank, professor of environmental and natural resources economics and fisheries and wildlife; and Jianguo "Jack" Liu, Rachel Carson Chair in Sustainability. All but Vogt are members of CSIS, of which Jack Liu is director. Wei Liu and his colleagues followed 220 families in Wolong from 1999 to 2007 as they rode the wave of change in an area shifting from farming to bringing in tourists who want to see both the land of the giant pandas and its ecological beauty. That wave was abruptly stopped in 2008 by the massive earthquake whose damage to roads and buildings still impedes business development. China, Wei Liu said, has a political structure that heavily favors social connections. The group studied the impact of having resources in Wolong. Already having money, being educated, and having relationships with governmental officials greatly increased a person's chances of being successful in nature-based tourism. Lacking these resources made it harder, which is significant since many of China's programs and initiatives aim to lift people out of poverty. "The policies haven't yet reached their full potential," Wei Liu said. "But now we have the data to show what's happening." An interesting piece of the research was learning that people who are engaged in tourism are more likely to acknowledge the tradeoffs between tourism development and conservation. Wei Liu said they acknowledged that tourism did increase noise, traffic congestion and disturbance to wildlife. "People in tourism have a lot of knowledge of where tourists go in the wild and what they bring," Wei Liu said. Wei Liu said this research can help China -- and others around the world -- with the next steps of policy to balance tourism with habitat management. The area is working hard to rebuild from the earthquake, just as other developing tourism areas are challenged by natural disasters. The study, he and his colleagues say, can point to opportunities to improve policies. The research was funded by the National Science Foundation and National Institutes of Health.
Jack Liu's research also is supported by MSU AgBioResearch. | Earthquakes | 2,012 |
May 3, 2021 | https://www.sciencedaily.com/releases/2021/05/210503135638.htm | Human behavior must be factored into climate change analyses | A new Cornell University-led study examines how temperature affects fishing behavior and catches among inland fisher households in Cambodia, with important implications for understanding climate change. | The research, which used household surveys, temperature data and statistical models, revealed that when temperatures rise, people fish less often. At the same time, the study's authors indirectly found that stocks of fish and other aquatic foods also rise with temperatures, leading to slightly larger catches each time people fished. Without careful analysis, overall fish catches would appear unchanged from year to year when, in fact, more nuanced dynamics are at play. The study highlights why it's necessary, when studying changing environmental conditions, to include human behavior along with ecosystem responses; both are key variables when considering how climate change affects rural livelihoods, food production and food access. The paper, "Fishers' Response to Temperature Change Reveals the Importance of Integrating Human Behavior in Climate Change Analysis," was published April 30 in a scientific journal. "This study underscores the importance of pulling human behavior into climate change modeling," said Kathryn Fiorella, an assistant professor in the Department of Population Medicine and Diagnostic Sciences and Master of Public Health Program in the College of Veterinary Medicine. "To accurately predict the impacts of climate change, we need to know about the effects on ecological systems, and also the effects on people who use them." In the study, Fiorella and colleagues used data provided by partner organization WorldFish, which collected survey data every two months over three years for households in Cambodia, which has the world's highest per-capita consumption of inland fish. WorldFish collected information on how often people fished, how much time they spent when they fished, and what method they used. The researchers used remotely sensed temperature data over the same three-year period, which revealed a range of 24 to 31 degrees Celsius (75 to 88 degrees Fahrenheit). The researchers also controlled for rainfall and flooding. "The temperatures in the range of the study compare to regional climate projections in the area, which suggest around a 1.5 to 2.5 degrees Celsius [2.7 to 4.5 F] temperature rise above the average of 28 degrees Celsius [82.4 F]," Fiorella said. "What we observed is in range for what we might expect under climate change scenarios." The researchers found that time spent fishing per outing and gear choices were not affected by temperature, but fewer people fished as temperatures rose. They also analyzed fish catch. It turns out that, with effort held constant, fish catch per outing went up as temperatures rose, which meant the ecosystem became a little more productive when it was warmer. The same pattern was true for other aquatic animals, like frogs or snakes, and aquatic plants. However, without factoring in effects of temperature on human behavior, it might have looked like temperature had no effect. The researchers suspect that fishing frequency declined as temperatures rose due to competing interests. "These households have a suite of different activities they are engaged in at the same time," Fiorella said, noting many of them are rice farmers or run small businesses.
At the same time, heat may also be a factor, she added. Fiorella added that large swaths of the population migrate to cities or nearby countries for work, and these dynamics could be pulling them away from fishing. "Ultimately," she said, "understanding both ecosystem responses and people's responses to temperature is going to be fundamental to understanding how climate change affects people who are directly reliant on the natural resources for their food and income." The study was funded by the Cornell Atkinson Center for Sustainability and WorldFish's Rice Field Fisheries Enhancement project, which is supported by the U.S. Agency for International Development. | Climate | 2,021
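The "statistical models" in this record are described only in outline (temperature effects on fishing participation and catch, controlling for rainfall and flooding). The snippet below is a deliberately simplified sketch of that kind of analysis -- a linear probability model fit by ordinary least squares on a made-up household-survey table. All column names and values are placeholders, and the published study used more careful panel methods.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical long-format survey table: one row per household per survey round
    df = pd.DataFrame({
        "fished":      [1, 0, 1, 1, 0, 1, 0, 1],   # did the household fish this round?
        "temp_c":      [25.1, 30.2, 26.4, 24.8, 30.9, 27.3, 29.5, 25.7],
        "rainfall_mm": [120, 40, 95, 130, 35, 80, 50, 110],
        "flood_index": [0.4, 0.1, 0.3, 0.5, 0.1, 0.2, 0.1, 0.4],
    })

    # Fishing participation vs. temperature, controlling for rainfall and flooding
    model = smf.ols("fished ~ temp_c + rainfall_mm + flood_index", data=df).fit()
    print(model.params)  # a negative temp_c coefficient would mirror the reported pattern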
May 3, 2021 | https://www.sciencedaily.com/releases/2021/05/210503135618.htm | Equipping crop plants for climate change | Biologists at Ludwig-Maximilians-Universitaet (LMU) in Munich have significantly enhanced the tolerance of blue-green algae to high light levels -- with the aid of artificial evolution in the laboratory. | Sunlight, air and water are all that cyanobacteria (more commonly known as blue-green algae), true algae and plants need for the production of organic (i.e. carbon-based) compounds and molecular oxygen by means of photosynthesis. Photosynthesis is the major source of building blocks for organisms on Earth. However, too much sunlight reduces the efficiency of photosynthesis because it damages the 'solar panels', i.e. the photosynthetic machineries of cyanobacteria, algae and plants. A team of researchers led by LMU biologist Dario Leister has now used "artificial laboratory evolution" to identify mutations that enable unicellular cyanobacteria to tolerate high levels of light. The long-term aim of the project is to find ways of endowing crop plants with the ability to cope with the effects of climate change. The cyanobacteria used in the study were derived from a strain of cells accustomed to growing at low levels of light. "To enable them to emerge from the shadows, so to speak, we exposed these cells to successively higher light intensities," says Leister. In an evolutionary process based on mutation and selection, the cells adapted to the progressive alteration in lighting conditions -- and because each cell divides every few hours, the adaptation process proceeded at a far higher rate than would have been possible with green plants. To help the process along, the researchers increased the natural mutation rate by treating cells with mutagenic chemicals and irradiating them with UV light. By the end of the experiment, the surviving blue-green algae were capable of tolerating light intensities that were higher than the maximal levels that can occur on Earth under natural conditions. To the team's surprise, most of the over 100 mutations that could be linked to increased tolerance to bright light resulted in localized changes in the structures of single proteins. "In other words, the mutations involved primarily affect the properties of specific proteins rather than altering the regulatory mechanisms that determine how much of any given protein is produced," Leister explains. As a control, the team then introduced the genes for two of the altered proteins, which affect photosynthesis in different ways, into non-adapted strains -- and in each case, they found that the change indeed enabled the altered cells to tolerate higher light intensities than the progenitor strain. Enhancing the tolerance of crop plants to higher or fluctuating light intensities potentially provides a means of increasing productivity, and is of particular interest against the background of ongoing global climate change. "Application of genetic engineering techniques to plant breeding has so far concentrated on quantitative change -- on making more or less of a specific protein," says Leister. "Our strategy makes qualitative change possible, allowing us to identify new protein variants with novel functions. Insofar as these variants retain their function in multicellular organisms, it should be possible to introduce them into plants." | Climate | 2,021
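The selection scheme described in this record -- grow, mutate, raise the light level, keep the survivors -- can be caricatured with a toy simulation. Everything below is an invented illustration of the logic of laboratory evolution, not a model of cyanobacterial physiology: "tolerance" is an abstract score, and the mutation and selection rules are made up.

    import random

    random.seed(1)
    POP_SIZE = 500
    population = [100.0] * POP_SIZE   # arbitrary starting light-tolerance score per cell

    for generation in range(30):
        # mutation: small random changes per cell; mutagens or UV would widen this spread
        population = [t + random.gauss(0.0, 3.0) for t in population]
        # selection: raise the light level and keep only the most tolerant ~10% of cells
        population.sort(reverse=True)
        survivors = population[: POP_SIZE // 10]
        # regrowth: survivors divide until the culture is back to full size
        population = [random.choice(survivors) for _ in range(POP_SIZE)]

    print(f"mean tolerance after selection: ~{sum(population) / POP_SIZE:.0f} (started at 100)")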