September 24, 1998
https://www.sciencedaily.com/releases/1998/09/980924074732.htm
Preparation Pays Off In Puerto Rico -- USGS River Data Keeps Flowing
Electricity may be out and communication lines cut off by the tremendous winds and torrential rains of Hurricane Georges, but thanks to the U.S. Geological Survey's foresight in "hardening" its monitoring systems, real-time streamflow data in Puerto Rico continued to flow to reservoir operators, emergency officials, and others who need streamflow information and need it fast.
The USGS maintains a network of 123 real-time gaging stations in Puerto Rico, part of a nationwide network of more than 4,000 real-time stations (within a larger national network of nearly 7,000 stations) that keep a close eye on what is happening with the country's rivers and streams. A preliminary assessment of the streamflow network in Puerto Rico shows that about 12 stations were severely damaged. Because Puerto Rico is so often in the path of destructive hurricanes, USGS hydrologists had developed contingency operations to ensure that information on the effect of hurricane rains on local rivers would be available to those who need it. The streamflow gaging stations in Puerto Rico have been outfitted with satellite-linked data collection platforms that transmit streamflow in real time to the main computer in the USGS Puerto Rico office in San Juan. The entire computer and data relay system in Puerto Rico was backed up with a diesel-powered generator to ensure that information would continue to flow no matter what Mother Nature might do. Throughout the hurricane's passage over the island, data were received into USGS computers from the backup system and provided on a continuous basis to key cooperators. The streamflow data for Puerto Rico and the other real-time stations throughout the United States are available to the public from the USGS via the World Wide Web. As an example of the enormous rise in streamflow that can occur as a result of such torrential hurricane rains, the Rio de la Plata at Highway 2 at Toa Alta, downstream from the Puerto Rico Aqueduct and Sewer Authority reservoir, rose from about 10 cubic feet per second (cfs) to a maximum discharge of more than 130,000 cfs in less than 24 hours. Based on historic records available for this river, this may be the greatest discharge recorded at this site. The USGS commitment to keep information flowing had a very human side during Hurricane Georges. Dianne Lopez-Trujillo, a computer specialist in the San Juan office who is also a graduate student at the University of Puerto Rico at Mayaguez, stayed throughout the night to keep the computers operating and to keep information going to the Puerto Rico Aqueduct and Sewer Authority. Information on streamflow stage (river height) and discharge (the volume of water flowing past a given point per unit time) is critical to reservoir operators in ensuring that reservoirs are not overtopped and that releases can be made to minimize flooding. Preliminary reports from the island were that reservoir gates were open. While the USGS district office remains closed in the wake of Hurricane Georges, hydrologists continue as best they can to meet their flood response commitments. Teams of hydrologists will be going out to note high-water marks in order to make "indirect" measurements of the discharge of rivers, to delineate the height of storm surges, to map inundated areas and to document the height and severity of flooding. Such post-flood information is critical in developing models of patterns of flooding, or recurrence intervals, that are essential in preparing for future flood events. As the nation's largest water, earth and biological science, and civilian mapping agency, the USGS works in cooperation with more than 2,000 organizations across the country to provide reliable, impartial scientific information to resource managers, planners, and other customers.
This information is gathered in every state by USGS scientists to minimize the loss of life and property from natural disasters, contribute to the sound conservation and the economic and physical development of the nation's natural resources, and enhance the quality of life by monitoring water, biological, energy, and mineral resources.
Hurricanes Cyclones
1998
August 27, 1998
https://www.sciencedaily.com/releases/1998/08/980827075105.htm
Researchers Plot Hurricane Damage Zipcode By Zipcode
Long before Hurricane Bonnie ever makes landfall, wind engineering researchers at Clemson University will be trying to predict - on a zip code by zip code basis - the amount of damage she's likely to cause.
"Our research is still ongoing, but we are trying to use this season's storms, such as Bonnie, to put our computer model through some 'real-world' tests," said David Rosowsky, an associate professor of civil engineering at Clemson. Clemson researchers hope to have the computer program available to emergency management personnel in time for next year's hurricane season. "This tool gives our state emergency preparedness officials a way to mobilize their resources and manpower to the areas that will most need help. It's often difficult to mobilize help once the hurricane has already come ashore and caused damage. This way, we can be ready to lend assistance the moment it's needed," said Rosowky. "Emergency managers and planners are often inundated with information and misinformation as a storm approaches. A wide variety of wind speeds is reported by the media, and it is difficult - if not impossible - for them to sort these out and rank them in terms of both credibility and applicability to their particular situations and locations." The Clemson computer modeling system can be used both for short-term planning as a storm approaches or for long-term studies and 'what-if' scenarios by emergency management personnel, emergency planners and the insurance industry. Researchers hope to be able to predict not only damage to residential structures but also maximum wind speed and estimated time of occurrence. All results will be displayed by zip code using a Geographic Information System. The predictions are based on hurricane tracking information supplied by reconnaissance aircraft that is integrated into a computer model based on information gathered from previous storms. The computer modeling system is part of a larger project funded by the S.C. Sea Grant Consortium to develop a hurricane hazard assessment system for the State of South Carolina. Rosowsky is the lead investigator of the multi-year project. Coastal areas along South Carolina were left devastated in the wake of Hurricane Hugo, which in September 1989 caused billions of dollars in damage. Through its research, Clemson is finding new ways to save lives and property in peril from potentially devastating wind storms by providing engineering data for new and existing construction. Clemson's Wind Load Test Facility is one of the nation's top laboratories for testing the effects of wind on low-rise structures such as homes and schools.
Hurricanes Cyclones
1998
August 26, 1998
https://www.sciencedaily.com/releases/1998/08/980826083909.htm
Eye-To-Eye, And Bonnie Winks: NASA/NOAA Team Makes First Sortie Into Hurricane
August 24, 1998 -- A converted DC-8 jet airliner, outfitted as a remote sensing laboratory, took weather researchers on an historic ride Sunday into the eye of Hurricane Bonnie as she churned in the Atlantic near the Bahama Islands.
And while looking Bonnie in the eye, she winked. Ocean waves, whipped by Bonnie to 2.4 to 3.6 meters (8-12 ft) high, crashed ashore a few hundred meters from the runway at Patrick Air Force Base, Fla., where a DC-8 prepared for the first-ever NASA jet flight into the eye of an Atlantic hurricane on Sunday afternoon. The jetliner, flying at 11 km (37,000 ft), was joined at the storm by a NASA ER-2 jet overhead at 19.8 km (65,000 ft), and a NOAA WP-3D Orion turboprop at 4.6 km (15,000 ft). The NASA planes took off at 1:34 p.m. EDT on their seven-hour mission. "This is a significant achievement for this hurricane study," said Robbie Hood, mission scientist from NASA's Global Hydrology and Climate Center in Huntsville, Ala. "We achieved our number one objective, that we could accomplish the tricky maneuver of placing all three NASA and NOAA aircraft in the study of the structure of the same storm at the same time." The research program, called CAMEX-3, is a combined study effort including eight NASA Centers, NOAA, and a contingent of scientists from universities across the nation. The aircraft performed four passes over the eye of the then-Category-2 storm, centered at 24.5 N, 71.4 W. Two of the passes were coordinated with a NOAA Orion passing below. Researchers could not see into the eye on two passes due to cloud cover, but recorded infrared images on each pass. The location of the eye was obtained from information passed along by scientists stationed at Patrick Air Force Base or aboard NOAA's Orion aircraft. Once the aircraft reached the first hurricane of the 1998 season, the researchers encountered an unusual phenomenon: As the three aircraft flew in a stacked pattern, the eye wall turned from an oval to an oblong shape. "This reshaping of the eye wall is characteristic of a hurricane that has stalled, and is preparing for a dramatic shift, either stronger or dying," said Dr. Ed Zipser, a weather expert from Texas A&M University. Another impressive step was taken when NASA researchers gave Bonnie some eye drops. Ten small tubes containing miniature weather stations were dropped into Bonnie's shifting eye to check her vital signs: wind speeds, barometric pressure, and humidity levels. The tiny weather stations dropped into the middle of the eye verified the readings that the DC-8 remote sensing instruments were taking at 11 km (37,000 ft). Dropsondes can measure temperature, horizontal wind speed, pressure, and humidity from altitudes as great as 24 km (15 mi) until landing. The sondes themselves are marvels of miniaturization, only 7 cm (2.75 in) in diameter and 40.6 cm (16 in) long, and weighing just 400 grams (less than a pound). The RSS903 dropsondes used in CAMEX-3 and other campaigns were developed jointly by the National Center for Atmospheric Research and the German Space Agency (DLR) to use advanced sensors and to incorporate Global Positioning System (GPS) receivers. This last feature gives scientists precise measurements of the sonde's location - including altitude - as it is carried along by a storm. With Bonnie pushing towards the coast, wrote forecaster R. Wohlman, the Eastern U.S. is dominated by an intense high-pressure region. This is causing any shortwaves from the west to ride far north into Minnesota, Michigan, and Illinois. Otherwise, the Ohio Valley through Colorado is clear.
Remnants of Tropical Storm Charlie, which charged ashore in the Texas gulf region, have slowed, filled and dropped lots of much-needed rain over the southern half of the dry Lone Star state. I would expect that this moisture, which shows up well in the satellite water vapor imagery, would continue its westward movement. A large region of cloudiness and associated moisture is moving through the New England states, and is forecast to slowly drift offshore. If there is a weakness in the extensive anti-cyclonic area over the U.S., it might develop just offshore, between that high and the one located in the mid-Atlantic. Meanwhile, Bonnie continues to develop nicely. Sustained winds of up to 167 km/h (90 knots) were observed in the morning reconnaissance, but much to the consternation of the forecasters at the National Hurricane Center, forward motion has all but ceased. At 11 a.m. (15Z), Bonnie was centered at 24.2N, 71.6W and forecast to start moving northwest, then gradually shift northward. Bonnie's recalcitrance is causing the various forecast programs, which had once seemed to be converging on a fairly uniform track, to appear to be diverging again.
Hurricanes Cyclones
1998
August 18, 1998
https://www.sciencedaily.com/releases/1998/08/980818072806.htm
High-Altitude Hurricane Study Could Save Lives And Money
With an aim to better understand and improve ground-based predictions of hurricanes, two specially equipped NASA aircraft soon will take to the skies -- collecting high-altitude information about Atlantic hurricanes and tropical storms.
The Convection and Moisture Experiment (CAMEX) mission is scheduled for August and September. Results from the mission may increase warning time -- saving lives and property -- and decrease the size of evacuation areas -- saving money -- while giving scientists a better understanding of these dramatic weather phenomena. CAMEX will yield high-resolution information on hurricane structure, dynamics and motion, leading to improved hurricane prediction. Results also will be used to validate existing measurements of hurricanes and tropical storms from the Tropical Rainfall Measuring Mission and to develop mathematical models for future Earth science missions. Led by the Atmospheric Dynamics and Remote Sensing program at NASA Headquarters, Washington, DC, the experiment unites eight NASA centers, other government weather researchers and the university community for a coordinated, multi-agency and -university Atlantic hurricane and tropical storm study. "We only know what goes on in the bottom half of a hurricane -- from sea level to 27,000 feet," said atmospheric expert Robbie Hood of the Global Hydrology and Climate Center at NASA's Marshall Space Flight Center in Huntsville, Ala. "With all of the agencies and the university community working together, we now can learn about these storms from top to bottom -- and hopefully improve hurricane prediction." When a hurricane or tropical storm erupts in the Atlantic, a NASA Dryden Flight Research Center DC-8 -- equipped with instruments to measure the storm's structure, environment and changes in intensity and tracking -- will fly into the storm at 35,000-40,000 feet. At the same time, a specially equipped Dryden ER-2 -- a high-altitude research plane -- will soar above the storm at 65,000 feet. The modified U-2 will measure the storm's structure and the surrounding atmosphere that steers the storm's movement. On the ground, the storm research team will launch weather balloons and monitor land-based sensors to validate the high-altitude measurements taken by instruments aboard the planes. Hood and her team plan to fly the NASA planes in conjunction with scheduled storm flights of the National Oceanic and Atmospheric Administration (NOAA) that will take off from MacDill Air Force Base, Tampa, Fla., and the "Hurricane Hunters" -- the U.S. Air Force's 53rd Weather Reconnaissance Squadron from Keesler Air Force Base, Miss. The Air Force's Hurricane Hunters and NOAA routinely fly into tropical storms and hurricanes to determine their location, motion, strength and size. Information gathered by the two organizations is used to predict the potential strength and size of the storms as well as landfall. In addition to providing Doppler radars on each research plane, NASA for the first time will bring state-of-the-art airborne instruments to measure moisture and wind fields around the hurricanes under observation. NOAA flies a WP-3 "Orion" -- a four-engine turboprop plane -- into storms at altitudes below 27,000 feet. And the Hurricane Hunters fly a WC-130 "Hercules" -- also a four-engine turboprop craft -- at 5,000-10,000 feet. "We will analyze the high-altitude storm information within the context of more traditional low-level aircraft observations, and satellite and ground-based radar observations," said Hood. "This new information should provide insight to hurricane modelers -- forecasters who continually strive to improve hurricane predictions."
Scientific instruments provided by Marshall to be flown on the Dryden aircraft will be augmented by instruments from NASA's Goddard Space Flight Center, Greenbelt, Md.; Jet Propulsion Laboratory, Pasadena, Calif.; Langley Research Center, Hampton, Va.; and Ames Research Center, Moffett Field, Calif. The hurricane study is part of NASA's Earth Science enterprise to better understand the total Earth system and the effects of natural and human-induced changes on the global environment.
Hurricanes Cyclones
1998
August 14, 1998
https://www.sciencedaily.com/releases/1998/08/980814070429.htm
Study Suggests Weekly Hurricane And Rainfall Patterns Are Linked To East Coast Pollution
Who hasn't felt as if going back to work on Monday somehow affects the weather? It turns out the weather itself may indeed be controlled by the weekly calendar, and that even mighty Atlantic hurricanes may feel the punch of the workweek, according to a study by two Arizona State University researchers appearing in the journal Nature.
Examining some basic data sets in a way that has never been tried before, ASU climatologists Randall Cerveny and Robert Balling, Jr. have found proof for what many a weekend boater has secretly suspected: rain is most likely to occur along the Atlantic coast on the weekend, and the weather is most likely to be better on a Monday, Tuesday or Wednesday. The most obvious culprit is the "natural" cloud-seeding effect created by the massive drift of East Coast pollution, which also follows a well-defined weekly cycle. The gray, smelly cloud of pollution has a strange silver lining, however. While pollution makes for more rainy weekends, it also apparently reduces the intensity of hurricanes that hit over the weekend, such that weekend hurricanes tend to be much weaker than, say, Tuesday storms. "Hurricanes are the biggest storms that we have on this planet, in terms of energy and precipitation," noted Cerveny. "And what we've found is that we're having an impact on them. It's a little daunting, when you start to think about it." Cerveny and Balling examined and compared three different data sets -- daily carbon monoxide and ozone measurements from a Canadian monitoring station on Sable Island off the coast of Nova Scotia, daily satellite-derived rainfall data for the Atlantic Ocean, and databases of coastal Atlantic hurricane measurements. In each case, when the two ASU scientists examined the data by day of the week, they found significant differences between days and similar patterns of variation, with pronounced differences between the beginnings and ends of weeks. All three sets of climate data revealed a seven-day cycle. "The human week is not a natural time period," said Balling. "Human effect on weather is the only explanation." "If you're going to go out boating in the Atlantic, you're going to get wet if it's a weekend," Cerveny said. "And what we suggest is that this is probably linked to the pollution cycle." In examining precipitation in the Atlantic, they found no daily variation when looking at the ocean as a whole, but a pronounced sine-wave pattern of variation for just the coastal areas, with average daily precipitation rising on Thursday and into the weekend and then dipping from Sunday through the middle of the week. Balling notes that when the team analyzed satellite data grid cells for an area a little farther from the coast, they found the same pattern, time-shifted in accordance with the rate of pollution drift. Though the study does not directly address causation, a comparable fluctuation in the levels of East Coast air pollution points to an obvious connection. The fact that coastal hurricane intensity data taken from 1945 to 1996 follow a similar pattern (rather than being statistically uniform for each day of the week, as one would expect) supports this hypothesis. "The fact that pollution can affect rainfall is actually well understood," said Balling. "We just had to look for the evidence in the right place. The hurricane data, though, surprised the heck out of me." "We knew that cities have an effect on local weather with urban heat islands and so forth, and people are pretty sure that we're having a general global effect with carbon dioxide," said Cerveny. "But nobody had ever looked at the in-between area of large-scale regional weather. We appear to be affecting global weather on a scale that is comparable to El Nino." The hypothesis is particularly important when applied to hurricanes, because of the destructive potential of the storms.
Cerveny and Balling looked at 50 years' worth of hurricane records, which include observations taken every six hours, and found surprising statistical differences with important implications. "Storms are substantially weaker during the first part of the week and stronger in the last part of the week," said Cerveny. "Pollution's thermal changes on the storm are apparently helping hurricanes blow themselves out. The difference is as high as 10 miles per hour of wind speed, downgrading the storm almost as much as a Saffir-Simpson Scale category, meaning that if a hurricane were to hit on Tuesday or Thursday, it might be a Category 3, but if it were to hit on a Saturday or Sunday, it might only be a Category 2." Cerveny notes that the effect is similar to a weather-control method once attempted in a military experiment: "Back in the 1960s, the military had a project called Project Storm Fury that was developed to cloud-seed hurricanes. What we're suggesting here is that they were on the right track, but they just didn't do it on a large enough scale. We're looking at the combined pollution from the entire eastern seaboard -- that's what it takes to influence a hurricane." Though the study has interesting implications, what most surprised the ASU scientists was the fact that no other researcher had ever attempted to analyze these major data sets in such a basic way. "Interestingly, no one had ever looked at this pollution data from a daily standpoint before and, curiously, nobody has bothered to look at seven-day cycles in the weather data," said Cerveny. "Oftentimes the most fundamental research is that way -- you say to yourself, 'why didn't anybody look at this?' When we were putting this together, we went through every journal we could find, saying 'somebody has got to have done this before!' Luckily for us, no one had." Cerveny and Balling's study can be found in the August 6 issue of Nature. A simplified version can also be found on the World Wide Web at
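As a rough illustration of the kind of day-of-the-week grouping described above, here is a minimal Python (pandas) sketch. It is not the authors' code: the input file "coastal_rainfall_daily.csv" and the column names "date" and "rain_mm" are hypothetical stand-ins for daily coastal rainfall data.

```python
# Minimal sketch (not the study's code): average daily rainfall by weekday
# to look for the seven-day cycle described in the article.
# The CSV file and column names are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("coastal_rainfall_daily.csv", parse_dates=["date"])
df["weekday"] = df["date"].dt.day_name()

# Mean rainfall for each day of the week, ordered Monday through Sunday.
order = ["Monday", "Tuesday", "Wednesday", "Thursday",
         "Friday", "Saturday", "Sunday"]
weekly_cycle = (df.groupby("weekday")["rain_mm"]
                  .mean()
                  .reindex(order))
print(weekly_cycle)
```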
Hurricanes Cyclones
1998
July 31, 1998
https://www.sciencedaily.com/releases/1998/07/980731082758.htm
Study 'Gone With The Wind' Provides Stellar Ecological Example
ST. LOUIS, Mo., July 31, 1998 -- A biologist at Washington University in St. Louis and colleagues at the University of California, Davis were literal recipients of a natural windfall in October 1996 during Hurricane Lili.
Through a quirk of fate, the biologists saw one study metamorphose into a completely different one that graphically reveals how natural forces periodically play with an ecosystem's populations and tip the so-called "balance of nature." Jonathan Losos, Ph.D., associate professor of biology at Washington University in St. Louis, and biologists David A. Spiller and Thomas W. Schoener at the University of California, Davis, had just finished censusing lizard and spider populations on 19 tiny islands in the Bahamas when Hurricane Lili hit the area on October 19. The trio had introduced lizards to the islands in 1993 to conduct an experiment, "the effect of predators on island ecosystems." The day after the hurricane blew through the large island of Great Exuma, where they were staying, the biologists quickly took to their boats to re-examine the islands for a suddenly different study, "the effect of natural catastrophe on island organisms." Fate had handed them a rare chance to record results that previously had only been hypothesized. The scientists published the results of their study in the July 31, 1998 issue of Science magazine. Eleven of the islands -- all about one-third the size of an American football field -- experienced 110-mph winds; eight other islands on the northeast of Great Exuma also were directly hit by Lili after it had passed over Great Exuma. Location made a difference in the fate of organisms. Spiders and lizards were completely wiped out and vegetation greatly damaged on the 11 southwest, or catastrophically hit, islands, whereas populations of lizards were reduced by approximately one-third and those of spiders by nearly 80 percent on the northeast, or moderately damaged, islands. Vegetation was affected, but to a much smaller degree. The group found proof of several ecological principles. One is that the recovery rate of different organisms increases strongly with their ability to disperse. For instance, spiders, which produce a silk string to which they cling and get blown into areas courtesy of wind (a phenomenon called "ballooning"), rebounded quickly on islands where they had been wiped out, unlike lizards, which don't have such "high-tech" dispersal abilities. Another is that larger organisms, lizards in this case, are more resistant to the immediate impact of moderate disturbance than smaller organisms. On the moderately disturbed islands, lizard populations were less affected by the hurricane than were spider populations, but the spiders rebounded much more quickly because of their more prolific breeding capability. A third is that the risk of extinction is related to population size when disturbance is moderate but not when it is catastrophic. In relation to this, the biologists uncovered perhaps the first concrete evidence of how hurricanes wreak devastation on low-lying island organisms. It's not the wind so much as the water. The biologists found a starfish on top of one southwest island, and sand deposits on many of the islands, which were bereft of spiders and lizards. This indicated that a tidal surge as high as 15 to 20 feet -- a response to the lower air pressure caused by the hurricane -- inundated the islands, which are about five feet above sea level. "All of the study islands are within several miles of Great Exuma, which is at most one mile wide," explained Losos. "While hurricanes slow down over land, a mile width is not enough to substantially slow down a hurricane.
Thus, the wind speed of the hurricane probably was the same for all of the islands. However, the indications are that the southwest islands were immersed in water for a while. The northeast ones weren't because the size of Great Exuma is substantial enough to halt the impetus of a storm surge. The effect is that with the whole ocean at a higher level for several hours, everything that wasn't stripped away when the surge hit the island was drowned or carried off by water." Losos said there are several unique aspects to the study. "We had data on the island ecosystems for the three years preceding the storm," he said. "Many times scientists go into an ecosystem and study the effects in the aftermath of a disturbance, but they don't know the situation beforehand. Moreover, we had information not just on past populations, but on populations immediately before the event and immediately afterward. We know exactly what effect the hurricane had on the islands because we had been there just days before and then we repeatedly visited the sites in the following months to see how the ecosystem recovered." The investigators came back to the islands six weeks after the hurricane and at regular intervals up to one year to census populations and observe vegetation regrowth. "Moreover, it has long been a hypothesis that the reason you don't find these common lizards on the small islands is that hurricanes keep coming in and wiping them out," Losos said. "And because lizards don't get from one island to another very readily, once they're wiped out, they don't come back. Well, now that hypothesis is documented." Over the past 20 years there has been increasing discussion and debate over the role of natural disturbances -- flood, fire, winds -- in structuring ecosystems. The question is: do they play a major role, or a transient one? The results of the Spiller, Losos, Schoener study lean toward a conclusion that rare catastrophic events may play an important role and have a long-lasting effect on an ecosystem's content. "To my knowledge, this is one of the best documented studies of the effect of catastrophic disturbances," said Losos. "We have before-and-after data, a set of islands that were devastated as opposed to others that were moderately damaged, and multiple islands from which we can deduce general principles. We were very happy that there were no serious injuries on Great Exuma during the hurricane, but we also know how lucky we were to be there when it happened to come up with a study such as this."
Hurricanes Cyclones
1998
April 15, 1998
https://www.sciencedaily.com/releases/1998/04/980415075634.htm
Wind Hazards Like Tornadoes Receive Only Fraction Of Research Funding
CLEMSON, S.C. -- Wind hazards like hurricanes and tornadoes result in a greater dollar loss than floods and earthquakes in the United States but receive only a fraction of the research funding, according to Clemson University "wind researcher" Ben Sill.
Wind accounts for $1,000 in loss for every $1 in research funding, while flooding accounts for only $70 in loss for every $1 spent and seismic activity accounts for $45 for every $1 spent, Sill said. "Although some tornadoes or hurricanes will be so strong that extensive damage would be expected, it is not unrealistic to expect that most buildings should withstand even severe storms. Very frequently, though, that's not the case," said Sill, who heads Clemson's wind-load testing facility, where research into making homes and other structures safer from the destructive forces of high wind is conducted. "In many cases, better design and construction could have saved houses or lives," Sill said. The Clemson facility is the only one in the nation able to give a complete picture of the effects of wind on so-called "low-rise structures" like homes, schools and churches. That's because it tests not only the wind load on structures - i.e., how strong the wind is - but also the corresponding resistance of the building itself - i.e., how strong the building is. "Studying only one side of the equation gives an incomplete picture," said Sill, an Alumni Distinguished Professor of Civil Engineering at Clemson. Dr. Sill, along with other Clemson researchers, was part of the team that made recommendations on construction practices in the wake of Hurricane Andrew in 1992. He was co-chairman of the American Society of Civil Engineers conference "Hugo: One Year Later."
Hurricanes Cyclones
1998
March 27, 1998
https://www.sciencedaily.com/releases/1998/03/980327074151.htm
El Niño Not The Driving Force Behind North Pacific Hurricanes
COLUMBUS, Ohio -- El Niño may be responsible for severe weather conditions across North America, but an Ohio State University study has revealed that El Niño weather systems don’t always spawn severe hurricanes in the North Pacific.
The current El Niño is different from most, however, and this difference may help researchers solve the mystery behind some curious atmospheric trends, like why the average intensity of North Pacific hurricanes is increasing over time. Jay Hobgood, associate professor of geography at Ohio State, and graduate student Luke Whitney reviewed 31 years of weather data and found that hurricanes in the 7 years during which a severe El Niño visited the North Pacific were not much more intense than hurricanes during the other 24 years. Hurricanes during El Niño years averaged 49.3 percent of the maximum possible intensity, versus 48.6 percent in non-El Niño years. The work appeared in a recent issue of the Journal of Climate. Hobgood explained that the warm surface water caused by El Niño off the west coast of South America heats the air and gives rise to stormy weather over the Pacific as far north as Southern California. The researchers expected that El Niño would intensify hurricanes in that area, but it didn't. “We were surprised,” said Hobgood. “But then we realized that El Niño also may generate more wind shear. In the Atlantic, wind shear keeps the storms down. We think that in the Pacific the warm water and the wind shear may cancel each other out.” In fact, wind shear is the major reason that most hurricanes only reach 40 to 80 percent of their maximum possible intensity. “But this El Niño is different,” said Hobgood. “Most El Niños start around Christmas, but this El Niño started in March of 1997. The warming was also greater than anything we’ve seen this century, and during the hurricane season it didn’t create much wind shear to weaken storms over the Eastern North Pacific. On the average, El Niños aren’t going to have much of an effect on hurricanes, but this one has.” The continued warm surface temperatures and low winds aloft from this El Niño nurtured Hurricane Linda in September of 1997, allowing it to become the strongest storm ever observed in the East Pacific, with winds reaching 190 miles per hour. Hobgood said this work represents a natural next step in the research that’s been going on for decades at the National Hurricane Center (NHC), part of the National Centers for Environmental Prediction in Miami. The work may one day help scientists predict hurricane strength. “For the last 30 years most hurricane research has focused on prediction of where hurricanes are going to hit, and we’ve gotten pretty good at it,” said Hobgood. “We can warn people about 2 days in advance that a storm may hit their area. Now we’re trying to learn how to predict the strength of hurricanes.” NHC scientists were already investigating hurricane strength in the Atlantic, so Hobgood and Whitney decided to look at the Pacific. When they did, they noticed that hurricanes in the North Pacific have on average grown more intense between 1963 and 1993. While the average intensity fluctuated year to year, it gradually increased over the 31 years. The yearly average intensity hit a minimum of 32.6 percent in 1964, grew to about 50 percent by 1973, remained relatively stable until 1987 when it began to increase again, and reached a maximum of 63.7 percent in 1990. Hobgood admits that the gradual increase in intensity may be at least partially due to an overall increase in ocean surface temperature, which may be due to global warming. “I’m not convinced that the change is entirely due to global warming,” he said.
“It’s probably a combination of things, possibly including global warming.” Hobgood was quick to point out that as satellite technology has improved over the years, analytical techniques have become more systematic, which could account for some of the increase in recorded intensity. Now that Hobgood and Whitney have investigated ocean surface temperature, they plan to examine the influences of other atmospheric conditions on hurricanes. Currently they are studying data collected at Socorro, a tiny volcanic island off the west coast of Mexico, an area of much hurricane activity. They are examining weather balloon information such as wind speed, humidity, and temperature, all of which contribute to hurricane incidence and strength. Hobgood presented his preliminary analyses of the Socorro data at the March 1998 meeting of the Association of American Geographers in Boston. This work was supported by the Air Force Institute of Technology at Wright-Patterson Air Force Base in Ohio.
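As a rough illustration of the comparison reported above (average relative hurricane intensity in El Niño versus non-El Niño years), here is a minimal Python sketch. The storm values and the set of El Niño years are made-up placeholders, not the study's data.

```python
# Minimal sketch with invented numbers: compare mean relative intensity
# (peak intensity as a fraction of maximum possible intensity) between
# El Niño and non-El Niño years.
import pandas as pd

storms = pd.DataFrame({
    "year":               [1963, 1964, 1972, 1973, 1990, 1993],
    "relative_intensity": [0.45, 0.33, 0.52, 0.50, 0.64, 0.48],
})
el_nino_years = {1972, 1982, 1997}  # illustrative only

storms["el_nino"] = storms["year"].isin(el_nino_years)
print(storms.groupby("el_nino")["relative_intensity"].mean())
```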
Hurricanes Cyclones
1998
March 4, 1998
https://www.sciencedaily.com/releases/1998/03/980304073730.htm
Wind Expert Cites Poor Building Practices In Connection With Storm Deaths And Destruction
CLEMSON, S.C. -- National wind expert Dr. Peter Sparks, a professor of civil engineering and engineering mechanics at Clemson University, cites poor building practices in connection with this week's tornado-related death and destruction.
Mobile homes, which were particularly hard hit in the Florida tornadoes, seemed especially at risk. But mobile homes aren't as much of a problem as are their foundations -- or lack of foundations, according to Sparks. "The standards are quite reasonable now on mobile homes -- unfortunately, not as much care is put into actually putting the home onto a proper concrete foundation," said Sparks. Sparks, who testified before the U.S. House Subcommittee on Housing and Community Development in connection with mobile home standards, is a leading expert on wind damage to structures, as well as the relationship between insurance losses and wind conditions. His engineering recommendations on mobile homes helped lead to tougher industry standards. "But there's no way that such light-weight structures can be engineered to survive these sorts of winds. People need to know that and make their decisions accordingly," said Sparks. Clemson has the nation's only wind-load test facility built solely for the study of wind on low-rise structures such as homes and schools. Civil engineers from Clemson were part of the team that made recommendations on how to build back safer, stronger homes after Hurricane Andrew hit Florida and Louisiana in August 1992. Most hurricanes continue to be "man-made disasters," said Sparks, who cites construction practices and the continued proximity of houses to the ocean.
Hurricanes Cyclones
1998
December 2, 1997
https://www.sciencedaily.com/releases/1997/12/971202073207.htm
Strongest El Niño In History Dampens '97 Hurricane Season; Colorado State's Gray Says Still Most Active Three-Year Period
FORT COLLINS -- Despite the strongest summer El Niño event on record, 1997 hurricane activity in the Atlantic Basin was 54 percent of the long-term average, though less than predicted by Colorado State University's noted team of hurricane forecasters.
The team, led by Professor William Gray, issued a report Nov. 26 that outlined why the El Niño of 1997 flattened the team's August prediction of 11 named storms, six hurricanes and two intense hurricanes for the season. Instead, the Atlantic Basin saw seven named storms, three hurricanes and one intense hurricane during the season, which ends Nov. 30. On average, 9.3 tropical storms, 5.8 hurricanes and 2.2 intense hurricanes form annually. Although the hurricane season was below average, Gray's statistics show that the period from 1995 to 1997 was still the busiest three-year period for hurricane activity on record. The three-year span generated 39 named storms, 23 hurricanes (13 of which were intense) and 116 hurricane days. "We knew going into the hurricane season that this would be an extremely difficult year to forecast," Gray said. "The El Niño proved to be twice as strong as any other previous record El Niño event in history for this time of year. No one guessed that it would grow to be so intense. And yet, despite this very extreme weather event, we still saw hurricane activity--more than was to be expected." El Niño is a weather phenomenon marked by warmer-than-normal water temperatures in the eastern Pacific Ocean off the coast of Peru and along the equator. This rise in ocean temperatures causes strong upper tropospheric winds to blow in a westerly direction from the Pacific Ocean to the tropical Atlantic Ocean. These winds typically act to shear off developing hurricanes. Gray said that in other years with strong El Niño events, such as 1957, 1972 and 1982, waters warmed only 2 or 3 degrees centigrade above normal. But the El Niño of 1997 actually warmed waters 4 or 5 degrees centigrade above normal--nearly twice as much as the previous record El Niño of 1982-83. This rare and extreme rise in ocean temperatures helped produce even more intense westerly upper tropospheric winds in the Atlantic Basin, which caused strong wind shear and prevented most easterly waves coming off Africa from developing. Gray and his team of researchers are investigating the possibility that the extreme El Niño this year may have been the result of a long period of warm water accumulating in the western Pacific, possibly left over from smaller El Niño events in 1991-1993. The team contends this kind of warm water build-up could only have produced the type of El Niño that emerged this year. Despite El Niño's extreme influence over the Colorado State team's 1997 hurricane forecast, Gray points out that factors in the Atlantic favorable for hurricane activity were still enough to produce seven named storms this year. These factors included warmer sea surface temperatures in the north and tropical Atlantic and colder sea surface temperatures in the South Atlantic, as well as colder than normal air temperatures 54,000 feet above Singapore. Also present was the Quasi-Biennial Oscillation, equatorial stratospheric winds at 68,000-75,000 feet that tend to promote hurricane formation when they blow from the west--as they did this year. And, as predicted in the team's August forecast, El Niño pushed many of the storms that did form in 1997 to higher latitudes--some of them closer to the United States. Of the seven named storms that formed in the Atlantic, six originated above 25 degrees north latitude, higher latitudes than those at which hurricanes typically form.
Gray attributes this to the fact that while El Niño produces strong upper-level westerly winds at lower latitudes that block African-origin storms, it also creates weaker upper-level westerly winds at higher latitudes that are less able to thwart hurricane development. Using atmospheric models, Gray and his colleagues have shown that if the El Niño of 1997 had only been as intense as previous record El Niño events in 1957, 1972 and 1982, those positive factors for hurricane formation would have generated 10 named storms, six hurricanes and three intense hurricanes--virtually on target with the team's prediction. "The 1997 El Niño was truly in a class by itself," Gray said. "But I don't think it will be around to influence the 1998 hurricane season to any significant degree." The Colorado State team's historical data shows that nine out of the past 30 years have actually produced less hurricane activity than in 1997. Of the nine years that were less active, seven occurred during El Niño events. When Gray's team issues the first forecast for the 1998 season on Dec. 5, the statistical model will now include the extreme 1997 El Niño conditions. The team's hurricane forecasts--issued in early December, April, June and August--do not predict landfall and apply only to the Atlantic Basin, which encompasses the Atlantic Ocean, Caribbean Sea and Gulf of Mexico. In addition to Gray, the hurricane research team includes John Knaff, Paul Mielke and Kenneth Berry from Colorado State; and Chris Landsea, a Colorado State graduate and a researcher at NOAA's Hurricane Research Laboratory in Miami, Fla.
Hurricanes Cyclones
1997
November 7, 1997
https://www.sciencedaily.com/releases/1997/11/971107070528.htm
Great Lakes Intensify Ferocity Of Passing Storms, Scientists Say
CHAMPAIGN, Ill. -- The Great Lakes exert a significant influence on passing cyclones, causing storms to speed up and grow in strength, say researchers at the University of Illinois and the Illinois State Water Survey. Also, the number of potentially dangerous storms is on the rise, they report.
"Cyclones that traverse the Great Lakes have important impacts on the physical environment and human habitation in the region," said James Angel, a climatologist with the Survey. "There is a lot of development along the lakes, and when the water level is high -- as it is now -- the area becomes extremely vulnerable to shoreline damage from these storms. A better understanding of how the Great Lakes affect passing cyclones may allow better forecasting of these storms and their potential effects."Cyclones are low-pressure storm centers, "often accompanied by high winds and heavy precipitation," said Scott Isard, a U. of I. professor of geography. "The ensuing storms can be huge, ranging in size from 800 to 1,500 miles in diameter."To study the effect the Great Lakes have on passing cyclones, Angel and Isard examined the rates of movement and the changes in intensity for 583 cyclones that passed over the region between the years 1965 to 1990. The researchers' findings, published in the September issue of Monthly Weather Review, identify several important features regarding the lakes' influence on these storm systems."In general, we found that cyclones accelerated as they approached the Great Lakes region and increased in intensity over the lakes," Angel said. "This effect was most pronounced from September to November, when the surface waters of the lakes are warmer than the surrounding air and can provide a major source of both moisture and heat that energizes passing storms."From January to March, when broken ice cover is generally present on the lakes, cyclones accelerated less and did not intensify, Angel said. However, cyclones that traversed the region during May and June did speed up and grow in strength."This surprised us, because the lakes are usually cooler than the overriding air mass during spring and summer, and have not generally been considered as an important energy source for cyclones at that time," Angel said. "We don't yet have a satisfactory explanation for this phenomenon."In another study (to appear in the journal Climate), Angel and Isard analyzed trends in storm strength for the years 1900 to 1990. "We are seeing evidence of an increase in the number of stronger storms, particularly in the months of November and December," Angel said.Historically, some of these cyclones have produced hurricane-force winds and caused extensive damage to shipping. The "great storm of 1913," for example, sank a dozen ships and claimed more than 250 lives. More recently, the ore carrier Edmund Fitzgerald -- popularized in a ballad by Canadian singer and songwriter Gordon Lightfoot -- sank in Lake Superior during a major storm on Nov. 10, 1975. All hands were lost.
Hurricanes Cyclones
1997
May 22, 1997
https://www.sciencedaily.com/releases/1997/05/970522143805.htm
New Findings Blame Jump In Hurricane Toll On Coastal Growth, Not Climate Change
BOULDER--In the past eight years, three U.S. hurricanes--Andrew
A draft of the referenced paper will be available on the Web after
Hurricanes Cyclones
1997
April 23, 2021
https://www.sciencedaily.com/releases/2021/04/210423130234.htm
Fiber optic cable monitors microseismicity in Antarctica
At the Seismological Society of America's 2021 Annual Meeting, researchers shared how they are using fiber optic cable to detect the small earthquakes that occur in ice in Antarctica.
The results could be used to better understand the movement and deformation of the ice under changing climate conditions, as well as improve future monitoring of carbon capture and storage projects, said Anna Stork, a geophysicist at Silixa Ltd. Stork discussed how she and her colleagues are refining their methods of distributed acoustic sensing, or DAS, for microseismicity -- earthquakes too small to be felt. DAS works by using the tiny internal flaws within an optical fiber as thousands of seismic sensors. An instrument at one end sends laser pulses down the cable and measures the "echo" of each pulse as it is reflected off the fiber's internal flaws. When the fiber is disturbed by earthquakes or icequakes, there are changes in the size, frequency and phase of laser light scattered back to the DAS receiver that can be used to characterize the seismic event. Michael Kendall of the University of Oxford said the Antarctic research demonstrates how DAS can be used to monitor underground carbon capture and storage at other sites in the world. For instance, the layout of the Antarctic network offers a good example of how a similar network could be configured to best detect microseismicity that could be triggered by carbon storage. "Our work also demonstrates a method of using DAS fiber arrays to investigate microseismic earthquake source mechanisms in more detail than conventional geophones," said Tom Hudson of the University of Oxford. "If we can analyze the source mechanism -- how an earthquake fails or fractures -- then we may be able to attribute the earthquake to the movement of fluids like carbon dioxide in a reservoir." The Antarctic microseismic icequakes recorded by DAS "are approximately magnitude -1, corresponding to approximately the size of a book falling off a table," Hudson explained, "so they are very small earthquakes." The study by Hudson and colleagues is the first to use DAS to look at icequakes in Antarctica. The fiber optic cable was deployed in a linear and triangular configuration on the ice surface at the Rutford Ice Stream. Kendall said there are a number of challenges to using fiber optic sensors in the harsh Antarctic environment. The equipment had to travel in pieces by boat and several planes to the study site. The researchers had to bury the fiber to reduce wind noise contaminating the seismic signal, as well as remove the signal of a generator that powered the DAS instrument. "We housed the instrument in a mountaineering tent, which basically served as a tiny office," Stork explained. "Keeping temperatures within the recommended operating limits was a challenge. The radiative heating from the sun warmed the tent to well into the 30s [degrees Celsius], even though it was -10 degrees Celsius outside." The researchers share their analyses of icequake data with climatologists and other researchers studying the slip of glaciers and other ice movements in Antarctica, Kendall said. "Hopefully in the future we will interact more with scientists drilling ice cores too, as they use fiber as distributed temperature sensors, but these fibers that they put down boreholes could also be used for seismic studies like ours," he noted.
Earthquakes
2021
April 22, 2021
https://www.sciencedaily.com/releases/2021/04/210422181851.htm
Machine learning model generates realistic seismic waveforms
A new machine-learning model that generates realistic seismic waveforms will reduce manual labor and improve earthquake detection, according to a study published recently in
"To verify the e?cacy of our generative model, we applied it to seismic ?eld data collected in Oklahoma," said Youzuo Lin, a computational scientist in Los Alamos National Laboratory's Geophysics group and principal investigator of the project. "Through a sequence of qualitative and quantitative tests and benchmarks, we saw that our model can generate high-quality synthetic waveforms and improve machine learning-based earthquake detection algorithms."Quickly and accurately detecting earthquakes can be a challenging task. Visual detection done by people has long been considered the gold standard, but requires intensive manual labor that scales poorly to large data sets. In recent years, automatic detection methods based on machine learning have improved the accuracy and efficiency of data collection; however, the accuracy of those methods relies on access to a large amount of high?quality, labeled training data, often tens of thousands of records or more.To resolve this data dilemma, the research team developed SeismoGen based on a generative adversarial network (GAN), which is a type of deep generative model that can generate high?quality synthetic samples in multiple domains. In other words, deep generative models train machines to do things and create new data that could pass as real.Once trained, the SeismoGen model is capable of producing realistic seismic waveforms of multiple labels. When applied to real Earth seismic datasets in Oklahoma, the team saw that data augmentation from SeismoGen?generated synthetic waveforms could be used to improve earthquake detection algorithms in instances when only small amounts of labeled training data are available.
Earthquakes
2021
April 22, 2021
https://www.sciencedaily.com/releases/2021/04/210422123649.htm
Ground and satellite observations map building damage after Beirut explosion
Days after the 4 August 2020 massive explosion at the port of Beirut in Lebanon, researchers were on the ground mapping the impacts of the explosion in the port and surrounding city.
The goal was to document and preserve data on structural and façade damage before rebuilding, said University of California, Los Angeles civil and environmental engineer Jonathan Stewart, who spoke about the effort at the Seismological Society of America (SSA)'s 2021 Annual Meeting. The effort also provided an opportunity to compare NASA Jet Propulsion Laboratory satellite surveys of the blast effects with data collected from the ground surveys. Stewart and his colleagues concluded that satellite-based Damage Proxy Maps were effective at identifying severely damaged buildings and undamaged buildings, but were less effective for assessing intermediate levels of structural or façade damage. "The main takeaway is that the Damage Proxy Maps can definitely distinguish severe damage from lack of damage" for both structural and façade assessments, Stewart said, "but they are not as good at finer tuning." "If what you're interested in is a fairly detailed picture of what has happened, it's not able to replace a person who actually knows what they're doing looking at the structure, particularly from the inside," he added. The reconnaissance of the Beirut blast was organized through the National Science Foundation-sponsored Geotechnical Extreme Events Reconnaissance Association (GEER). In addition to Stewart and his colleagues at the American University of Beirut, the team included members from the University of Illinois and the University of Calabria in Italy. The information analyzed by the GEER team can help engineers learn more about how to build safely against similarly destructive events, including earthquakes, in the future. Their findings, detailed in a GEER report, also "make some recommendations about how you can optimize human resources when doing these inspections," Stewart said. On that August day, a fire at the port detonated an estimated 2.75 kilotons TNT equivalent of ammonium nitrate and fuel, an event about the size of a magnitude 3.3 earthquake. Within days, engineers at the American University of Beirut "had set up a hotline where people could call in who were concerned with the stability of damaged structures," Stewart said. Professors and students made visits to inspect and assess the stability of these structures and others, but the in-person visits were scaled back in September due to COVID-19. After that, the researchers depended on street view surveys, using GoPro 360-degree cameras mounted on cars driven around the city. The damage was ranked using scales adapted from those used for post-earthquake events, said Stewart. For instance, structural damage was ranked on a scale that began with minor damage to non-load-bearing elements and extended up to the complete collapse of a structure. Façade damage was ranked using a scale that begins with cracked windows and extends to complete blowout of windows and doors. The spatial patterns of damage from an explosion differ from those seen in an earthquake. Site conditions such as underlying soil matter much more when it comes to the structural impact of an earthquake, while explosion damage depends "on how much are you feeling that blast," Stewart explained. "With an explosion, the damage decreases with distance and with the number of buildings between you and the blast that can deflect its effects." Stewart isn't an expert in explosion seismology, but he has experience in assessing structural damage after earthquakes from his work in post-earthquake zones with GEER.
He reached out to a colleague at the American University of Beirut after the disaster to offer his help in collecting observations that could be useful to future researchers and engineers. "We felt that it was important to gather perishable data that we anticipate will be useful to people who study blast effects in an urban setting, and to learn something from this disaster to improve our resilience to future such disasters," he said.
Earthquakes
2021
April 20, 2021
https://www.sciencedaily.com/releases/2021/04/210420160912.htm
Was Cascadia's 1700 earthquake part of a sequence of earthquakes?
The famous 1700 Cascadia earthquake that altered the coastline of western North America and sent a tsunami across the Pacific Ocean to Japan may have been one of a sequence of earthquakes, according to new research presented at the Seismological Society of America (SSA)'s 2021 Annual Meeting.
Evidence from coastlines, tree rings and historical documents confirms that there was a massive earthquake in the U.S. Cascadia Subduction Zone on 26 January 1700. The prevailing hypothesis is that one megathrust earthquake, estimated at magnitude 8.7 to 9.2 and involving the entire tectonic plate boundary in the region, was responsible for the impacts recorded on both sides of the Pacific. But after simulating more than 30,000 earthquake ruptures within that magnitude range using software that models the 3D tectonic geometry of the region, Diego Melgar, the Ann and Lew Williams Chair of Earth Sciences at the University of Oregon, concluded that those same impacts could have been produced by a series of earthquakes. Melgar's analysis suggests that a partial rupture of as little as 40% of the megathrust boundary in one magnitude 8.7 or larger earthquake could explain some of the North American coastal subsidence and the 26 January 1700 Japan tsunami. But there could have also been as many as four more earthquakes, each magnitude 8 or smaller, that could have produced the rest of the subsidence without causing a tsunami large enough to be recorded in Japan. His findings do not rule out the possibility that the 1700 Cascadia earthquake was a stand-alone event, but "the January 26, 1700 event, as part of a longer-lived sequence of earthquakes potentially spanning many decades, needs to be considered as a hypothesis that is at least equally likely," he said. Knowing whether the 1700 earthquake is one in a sequence has implications for how earthquake hazard maps are created for the region. For instance, calculations for the U.S. Geological Survey hazard maps are based on the Cascadia fault zone fully rupturing about half the time and partially rupturing the other half of the time, Melgar noted. "But are we really sure that that's real, or maybe it's time to revisit that issue?" said Melgar. "Whether there was a partial or full rupture fundamentally drives everything we put on the hazard maps, so we really need to work on that." Since the first analyses of the 1700 earthquake, there have been more data from the field, repeated earthquake modeling of the Cascadia Subduction Zone and a better understanding of the physics of megathrust earthquakes -- all of which allowed Melgar to revisit the possibilities behind the 1700 earthquake. Researchers also have been writing code for years now to simulate earthquakes and tsunamis in the region, in part to inform earthquake early warning systems like ShakeAlert. If there was a sequence of earthquakes instead of one earthquake, this might help explain why there is little good geologic evidence of the 1700 event in places such as the Olympic Mountains in Washington State and in southern Oregon, Melgar said. He noted, however, that these specific areas are difficult to work in, "and may not necessarily be good recorders of the geological signals that paleoseismologists look for." Melgar's models show that even a smaller Cascadia earthquake could cause a tsunami energetic enough to reach Japan. These smaller earthquakes could still pose a significant tsunami risk to North America as well, he cautioned.
"They might be less catastrophic, because they don't affect such a wide area because the rupture is more compact, but we'd still be talking a mega-tsunami."He suggested that it could be valuable to revisit and re-do old paleoseismological analyses of the 1700 event, to gain an even clearer picture of how it fits into the overall earthquake history of the region."Cascadia actually records earthquake geology much better than many other parts of the world," Melgar said, "so I think that just going back with modern methods would probably yield a lot of new results."
Earthquakes
2021
April 20, 2021
https://www.sciencedaily.com/releases/2021/04/210420131045.htm
Fixed network of smartphones provides earthquake early warning in Costa Rica
Earthquake early warnings can be delivered successfully using a small network of off-the-shelf smartphones attached to building baseboards, according to a study conducted in Costa Rica last year.
In his presentation at the Seismological Society of America (SSA)'s 2021 Annual Meeting, Ben Brooks of the U.S. Geological Survey said the ASTUTI (Alerta Sismica Temprana Utilizando Teléfonos Inteligentes) network of more than 80 stations performed comparably to scientific-grade warning systems. During six months of ASTUTI operation, there were 13 earthquakes that caused noticeable shaking in Costa Rica, including in the city of San Jose where the network was deployed. The system was able to detect and alert on five of these earthquakes, Brooks and his colleagues determined when they "replayed" the seismic events to test their network. Alerts for the system are triggered when shaking exceeds a certain threshold, equivalent to slightly less than what would be expected for a magnitude 5 earthquake, as measured by the accelerometers that are already built into the phones, Brooks said. In simulations of the magnitude 7.6 Nicoya earthquake that took place in 2012 in Costa Rica, ASTUTI would have delivered its first alerts on average nine to 13 seconds after the event. "The performance level over the six months is encouraging," Brooks said. "Cascadia events in the Pacific Northwest are similar to the Costa Rican subduction zone, and latencies for ShakeAlert in Cascadia are about 10 seconds, so it's comparable." ASTUTI demonstrates the possibilities of lower-cost earthquake early warning for regions that lack the scientific-grade network stations such as those behind ShakeAlert, he noted. "I would imagine that would be attractive for countries with less resources to dedicate to earthquake early warning," Brooks said, "but the performance is also at a level that I imagine would interest even wealthier countries." The accelerometers within smartphones are ideal for a low-cost network, Brooks and his colleagues suggested. "If you were to build your own sensors, there's a lot of cost in maintaining them, and the sensors will quickly become obsolete," Brooks said. By using commercial technology, "we let the big telecommunications companies do the research and development and we just deploy it," he added. The phones were deployed in the homes of researchers and staff at the Observatorio Vulcanologico y Sismologico de Costa Rica (OVSICORI). The phones were enclosed in a plastic protective box before being mounted to the baseboard with 3M adhesive tape. As a result, the phones sense an earthquake through the filter of the structures, rather than directly as is the case for buried seismometers. During testing, the system did not issue any false alerts. The researchers' analysis suggests the system could deliver alerts to 15% to 75% of Costa Rica's population in enough time to carry out drop-cover-hold-on responses, if it alerted the entire country after detecting an earthquake. Country-wide alerting might be a successful strategy for the network, since recent studies show that people tend to be more tolerant than previously suspected of receiving warnings even if they didn't feel shaking, especially if they live in hazard-prone regions such as Costa Rica, said Brooks. The work was funded by the U.S. Agency for International Development Bureau for Humanitarian Assistance. Next steps for ASTUTI will be to develop an app to make it more user-friendly and to deploy it more widely throughout the country. Low-cost networks like ASTUTI will be a growing trend in earthquake early warning over the next decade, Brooks predicted. "When your objective is warning, you don't necessarily need the fanciest equipment."
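As a rough illustration of the threshold logic described above, here is a minimal sketch that flags a station when its acceleration exceeds a fixed level and declares a network alert when several stations trigger within a short window. The 0.1 g threshold, 100 Hz sampling rate, station count and coincidence window are all assumptions made for the example, not details of the ASTUTI implementation.

```python
# Minimal sketch of threshold-based triggering on accelerometer traces.
# Threshold, sampling rate, station count and coincidence window are
# illustrative assumptions, not details of the ASTUTI system.
import numpy as np

SAMPLE_RATE_HZ = 100
TRIGGER_THRESHOLD_G = 0.1     # assumed shaking level that triggers a station
MIN_STATIONS = 3              # assumed number of stations that must agree
COINCIDENCE_WINDOW_S = 5.0    # assumed window within which stations must agree

def first_trigger_time(trace_g):
    """Return the time in seconds when |acceleration| first exceeds the threshold."""
    exceed = np.abs(trace_g) > TRIGGER_THRESHOLD_G
    if not exceed.any():
        return None
    return int(np.argmax(exceed)) / SAMPLE_RATE_HZ

def network_alert(traces):
    """Declare an alert if enough stations trigger within the coincidence window."""
    times = sorted(t for t in (first_trigger_time(tr) for tr in traces.values())
                   if t is not None)
    for i in range(len(times) - MIN_STATIONS + 1):
        if times[i + MIN_STATIONS - 1] - times[i] <= COINCIDENCE_WINDOW_S:
            return True
    return False

# Synthetic demo: two quiet stations and three recording a brief strong pulse.
rng = np.random.default_rng(0)
n = 30 * SAMPLE_RATE_HZ
def quiet():
    return 0.01 * rng.standard_normal(n)
def shaking():
    tr = quiet()
    tr[1000:1500] += 0.3 * rng.standard_normal(500)
    return tr
stations = {"A": quiet(), "B": quiet(), "C": shaking(), "D": shaking(), "E": shaking()}
print("Alert issued:", network_alert(stations))
```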
Earthquakes
2021
April 19, 2021
https://www.sciencedaily.com/releases/2021/04/210419135652.htm
GPS data reveal possible earthquake, tsunami hazard in Northwestern Colombia
Data from a GPS network in Colombia have revealed a shallow and fully locked part on the Caribbean subduction zone in the country that suggests a possible large earthquake and tsunami risk for the northwest region.
The locked patch south of Cartagena city is capable of generating a magnitude 8.0 earthquake every 600 years, said Sindy Lizarazo of Nagoya University in Japan, who presented the study at the Seismological Society of America (SSA)'s 2021 Annual Meeting. Colombia lies in the middle of a complex tectonic zone, where the Caribbean, Nazca and South American tectonic plates and other smaller tectonic blocks converge. The Caribbean plate is very slowly converging with the northern part of Colombia -- moving at 7 millimeters per year -- which may in part be the reason for the long time between large earthquakes in northwest Colombia. "The only recent historical record of a disastrous [magnitude 6.4] earthquake in the Colombian Caribbean region was on May 22, 1834 close to Santa Marta," said Lizarazo. "However, there is no seismic event that meets the magnitude estimated by our study, nor tsunamis in the historical record on the northern part of Colombia." To better understand the complex movements and crustal deformation taking place in the region, Lizarazo and colleagues analyzed data from the nationwide GPS network called GeoRED (GEOdesia: Red de Estudios de Deformación in Spanish). The network has been operated by the Geological Survey of Colombia since 2007 and has 150 permanent stations in continuous operation. GPS data can be used to estimate the movements and crustal deformation of the tectonic plates interacting against each other. The data analyzed by Lizarazo and colleagues revealed motion of the northern part of the North Andean Block -- a "microplate" squeezed between the Nazca and South American plates -- that causes it to interact with the subducting Caribbean plate. Using these data along with a realistic slab configuration, the researchers estimated the extent and degree of interplate locking along this boundary, where stresses may build without being released in an earthquake. The study "provides the first evidence of a shallow locked region south of Cartagena," Lizarazo said. "This indicates that this segment of the Caribbean-South America plate boundary in northwestern Colombia can be the locus of significant earthquake and tsunami hazard." To fully evaluate this hazard potential, researchers need to conduct large-scale geological mapping, and look for evidence of past tsunamis and large earthquakes in the region, among other studies. "It is also necessary to continue with the densification of the GPS network in the country, increasing its coverage and operation in real time," Lizarazo said.
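A back-of-the-envelope version of this moment-deficit reasoning is sketched below. The 7 millimeters per year convergence rate and the roughly 600-year interval come from the article; the locked-patch dimensions and the crustal rigidity are assumptions chosen only to show how such numbers combine into a magnitude estimate, not values from Lizarazo's study.

```python
# Rough moment-deficit estimate for a locked subduction patch.
# Convergence rate and recurrence interval are from the article; patch
# dimensions and rigidity are illustrative assumptions.
import math

convergence_m_per_yr = 0.007       # 7 mm/yr (from the article)
interval_yr = 600                  # approximate recurrence (from the article)
patch_length_m = 150e3             # assumed along-strike length
patch_width_m = 70e3               # assumed down-dip width
rigidity_pa = 3.0e10               # typical crustal rigidity (assumed)

slip_deficit_m = convergence_m_per_yr * interval_yr          # stored slip, ~4.2 m
moment_nm = rigidity_pa * patch_length_m * patch_width_m * slip_deficit_m
mw = (math.log10(moment_nm) - 9.1) / 1.5                      # Hanks & Kanamori

print(f"Accumulated slip deficit: {slip_deficit_m:.1f} m")
print(f"Seismic moment:           {moment_nm:.2e} N·m")
print(f"Equivalent magnitude:     Mw {mw:.1f}")
```

With these assumed dimensions the accumulated slip deficit of about 4.2 meters corresponds to roughly a magnitude 8.0 event, in line with the estimate quoted above.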
Earthquakes
2021
April 13, 2021
https://www.sciencedaily.com/releases/2021/04/210413121012.htm
Why are there relatively few aftershocks for certain Cascadia earthquakes?
In the Cascadia subduction zone, medium and large-sized "intraslab" earthquakes, which take place at greater than crustal depths within the subducting plate, will likely produce only a few detectable aftershocks, according to a new study.
The findings could have implications for forecasting aftershock seismic hazard in the Pacific Northwest, say Joan Gomberg of the U.S. Geological Survey and Paul Bodin of the University of Washington in Seattle, in their paper published in the Researchers now calculate aftershock forecasts in the region based in part on data from subduction zones around the world. But Cascadia intraslab earthquakes produce fewer aftershocks compared to others in subduction zones around the world. In Cascadia, these aftershock rates are less than half the global average, Gomberg and Bodin concluded. They also suggest that aftershock rates for Cascadia earthquakes generally appear consistent with a "clock-advance" model, in which the mainshock causes tectonically loaded fault patches to slip earlier than they would have under the normal background seismicity of the region. Gomberg and Bodin decided to study the phenomenon further after recent intraslab earthquakes in Mexico and Alaska produced robust aftershock sequences. "This was startling because the lore in Cascadia was that intraslab earthquakes had puny aftershock sequences," Gomberg explained, noting that in Cascadia three magnitude 6.5 to 6.8 intraslab earthquakes in 1949, 1965 and 2001 produced few to no aftershocks. "Additionally, the USGS has begun to generate quantitatively estimated aftershock forecasts based initially on global patterns," she added, "and given these contrasting experiences, it seemed time to generate some objective numbers to base Cascadia's forecasts on." The researchers analyzed earthquake catalogs produced by the Geological Survey of Canada and the Pacific Northwest Seismic Network from January 1985 to January 2018. Mainshocks that took place in the upper plate produced the most aftershocks, they found. Aftershock productivity was lowest for intraplate earthquakes in the Puget Lowlands portion of the subduction zone (which contains the Seattle metropolitan area), while aftershock rates were variable at the northern end of the zone near Vancouver Island and within the expected range for the southern end near Cape Mendocino. The tectonic environment at each end of the subduction zone could help explain why aftershock production is higher there, the researchers said. Multiple plate boundaries meet in these areas, which could "concentrate stress, so more faults exist and are closer to failure than in other areas," they noted. The reasons why Cascadia aftershock production is so low compared to global rates are still unclear, but "one strong possibility would seem to be that temperature for the deeper slab earthquakes is a dominant controlling parameter," said Bodin, noting that "the young, hot Juan de Fuca plate is being jammed beneath North America" in Cascadia. The deeper the earthquake, the higher the temperatures, and the researchers did find that aftershock productivity decreases with depth, Bodin explained. "However, this is not so different than southern Mexico, where, as we noted, recent intraslab mainshocks have supported vigorous aftershock sequences." Gomberg and Bodin said their analysis was limited by the fact that seismicity rates in Cascadia are generally low and there are sparse data to constrain the location and depth of most earthquakes in the region. Methods that help researchers detect and locate smaller earthquakes could provide a better sense of overall aftershock rates and the physical processes that control them, they suggested.
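The kind of forecast being compared here can be illustrated with a generic Reasenberg-Jones-style aftershock rate scaled by a regional productivity factor. The parameter values below are textbook-style placeholders rather than the USGS operational model, and the 60% reduction simply mimics a productivity "lower by more than half."

```python
# Illustrative Reasenberg-Jones-style aftershock rate, showing how a regional
# productivity factor scales expected aftershock counts. Parameter values are
# generic placeholders, not the USGS operational forecast model.
import numpy as np

def aftershock_rate(t_days, mainshock_mw, m_min, a=-1.67, b=1.0, c=0.05, p=1.08):
    """Expected aftershocks per day above m_min at time t_days after the mainshock."""
    return 10.0 ** (a + b * (mainshock_mw - m_min)) / (t_days + c) ** p

def expected_count(days, mainshock_mw, m_min, productivity_scale=1.0):
    t = np.linspace(0.01, days, 100_000)
    rate = aftershock_rate(t, mainshock_mw, m_min)
    return productivity_scale * np.sum(rate) * (t[1] - t[0])   # simple numerical integral

mw, m_min = 6.8, 3.0
generic = expected_count(30, mw, m_min)                          # generic productivity
reduced = expected_count(30, mw, m_min, productivity_scale=0.4)  # "more than half" lower

print(f"Expected M>={m_min} aftershocks in 30 days (generic productivity): {generic:.0f}")
print(f"Same forecast with productivity reduced by 60%:                   {reduced:.0f}")
```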
Earthquakes
2021
April 13, 2021
https://www.sciencedaily.com/releases/2021/04/210413114117.htm
Tremors triggered by Typhoon Talas tell tales of tumbling terrain
Tropical cyclones like typhoons may invoke imagery of violent winds and storm surges flooding coastal areas, but with the heavy rainfall these storms may bring, another major hazard they can cause is landslides -- sometimes a whole series of landslides across an affected area over a short time. Detecting these landslides is often difficult during the hazardous weather conditions that trigger them. New methods to rapidly detect and respond to these events can help mitigate their harm, as well as better understand the physical processes themselves.
In a new study published in As study first author Professor Ryo Okuwaki explains, "Our surface-wave detection to locate seismic events was based on the AELUMA method, short for Automated Event Location Using a Mesh of Arrays. One hundred and three seismic stations across Japan were divided into triangular subarrays, and data from the days of the typhoon were analyzed to differentiate earthquake-related events from landslide signals." Using this method, multiple landslides that occurred during Typhoon Talas were identified, including one in the Tenryu Ward of Shizuoka Prefecture, about 400 km east of the typhoon's track. In 2011, it took 3 days for this landslide to be detected, after the storm had cleared and conventional observation methods became possible. This shows the potential usefulness of applying this new method to more rapidly identify landslide events. The Tenryu landslide was much smaller than landslides previously identified based on globally recorded surface waves, and it was detected as far as 3,000 km from the epicenter using the new approach. According to Professor Okuwaki, "We found that both small and large landslides may follow the same empirical scaling relationships. This allows previous research based on larger landslides to be applied to better understand the behavior of smaller landslides detected by using our novel method, which will have important implications in further research." This new method, based on data from a sparse seismic network, is a promising step forward for monitoring landslide occurrences down to a scale of about 100 meters across a broad region in real time, which may help with development of emergency alert technology in the future.
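The first step described for AELUMA, grouping stations into triangular subarrays, can be sketched with a triangulation of station coordinates. The station locations below are synthetic and the choice of a Delaunay triangulation is an assumption made for illustration; the method's internal detection and location steps are not reproduced here.

```python
# Sketch of grouping seismic stations into triangular subarrays, the first step
# described for the AELUMA approach. Station coordinates are synthetic and the
# use of a Delaunay triangulation is an illustrative assumption.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(42)
# Synthetic station locations (longitude, latitude) roughly spanning Japan.
stations = np.column_stack([
    rng.uniform(129.0, 146.0, size=103),   # longitude
    rng.uniform(30.0, 46.0, size=103),     # latitude
])

tri = Delaunay(stations)                   # triangles defined by station indices

print(f"{len(stations)} stations grouped into {len(tri.simplices)} triangular subarrays")
for k, simplex in enumerate(tri.simplices[:3]):
    centroid = stations[simplex].mean(axis=0)
    print(f"subarray {k}: stations {simplex.tolist()}, centroid "
          f"({centroid[0]:.2f}E, {centroid[1]:.2f}N)")
# Each subarray would then estimate an arrival time and direction for the
# surface wave; combining estimates from many subarrays locates the source.
```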
Earthquakes
2021
April 9, 2021
https://www.sciencedaily.com/releases/2021/04/210409104459.htm
Preseismic atmospheric radon anomaly associated with 2018 northern Osaka earthquake
The concentration of the radioactive element radon is known to change in the ground before and after earthquakes. Previous studies have shown elevated radon levels in the atmosphere before the mainshock of a large inland earthquake due to foreshock activity and slow slip.
But now, researchers from Tohoku University have revealed an anomaly in this phenomenon. Through the analysis of data before and after the 2018 Northern Osaka Earthquake, they discovered that the atmospheric radon concentration decreased. The results of their research were published online in "For the first time, we found a decrease in the atmospheric radon associated with seismic quiescence before the mainshock of an inland earthquake," said professor Jun Muto of the Graduate School of Science at Tohoku University. Muto, who led the research group with professor Hiroyuki Nagahama, collaborated with Osaka Medical and Pharmaceutical University and Kobe Pharmaceutical University. Observations of atmospheric radon concentration levels from the Osaka Medical and Pharmaceutical University showed a decline about one year before the earthquake, a trend that continued until June 2020. Seismic activity in the vicinity of the monitoring site also declined prior to the earthquake, along with reduced seismic activity after the mainshock for the entire Kansai region, excluding the aftershock area. All of this suggests that radon levels did not increase after the earthquake. It also paints a more complicated picture of the radon and fluid movements that cause radon exhalation in the subsurface when large earthquakes occur. "The discovery revealed that more processes are occurring before earthquakes than previously thought," said Muto. "Further analysis of other earthquakes will lead to a better understanding of the physiochemical processes at play and help us use atmospheric radon concentrations to clarify various crustal movements correlated with major earthquakes." The research group has established atmospheric radon concentration measurement monitoring networks at radioisotope facilities across Japan. Monitoring is also expected to be carried out at nuclear power plants. Expansion of the monitoring network will help clarify the area and timing of radon anomalies, the geological characteristics that cause such anomalies, and the relationship with the different earthquake types. All of this will help realize a radon-based earthquake prediction system.
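A minimal sketch of how a sustained decline might be flagged in a radon time series is shown below; the synthetic monthly data, the baseline period and the two-sigma criterion are illustrative assumptions, not the Tohoku group's analysis.

```python
# Minimal sketch of flagging a sustained decline in atmospheric radon by
# comparing recent yearly means against earlier baseline years. Synthetic data
# and the two-sigma criterion are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(1)
monthly = 20.0 + rng.normal(0.0, 1.0, 8 * 12)   # eight years of monthly means (Bq/m^3, assumed)
monthly[-24:] -= 3.0                             # impose a decline over the final two years

yearly = monthly.reshape(8, 12).mean(axis=1)
baseline, recent = yearly[:6], yearly[6:]
mu, sigma = baseline.mean(), baseline.std(ddof=1)

for year_offset, value in enumerate(recent, start=7):
    z = (value - mu) / sigma
    verdict = "anomalous decline" if z < -2.0 else "within baseline scatter"
    print(f"year {year_offset}: mean {value:.1f} Bq/m^3, z = {z:+.1f} -> {verdict}")
```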
Earthquakes
2021
April 8, 2021
https://www.sciencedaily.com/releases/2021/04/210408152314.htm
Mountain growth influences greenhouse effect
Taiwan is an island of extremes: severe earthquakes and typhoons repeatedly strike the region and change the landscape, sometimes catastrophically. This makes Taiwan a fantastic laboratory for geosciences. Erosion processes, for example, occur up to a thousand times faster in the center of the island than in its far south. This difference in erosion rates influences the chemical weathering of rocks and yields insights into the carbon cycle of our planet on a scale of millions of years. A group of researchers led by Aaron Bufe and Niels Hovius of the German Research Center for Geosciences (GFZ) has now taken advantage of the different erosion rates and investigated how uplift and erosion of rocks determine the balance of carbon emissions and uptake. The surprising result: at high erosion rates, weathering processes release carbon dioxide; at low erosion rates, they sequester carbon from the atmosphere. The study will be published in
Behind all this are tectonic and chemical processes. In rapidly growing mountains in particular, tectonic uplift and erosion constantly bring fresh rock material up from underground. There it is exposed to circulating acidic water which dissolves or alters the rock. Depending on the type of rock, this weathering has very different effects on Earth's climate. For example, if carbonic acid from the soil comes into contact with silicate minerals, limestone (calcium-carbonate or CaCO3) precipitates, in which the carbon is then bound for a very long time. In the case of a combination of sulfurous mineral, such as pyrite, and limestone, the opposite happens. The sulfuric acid that forms when pyrite comes into contact with water and oxygen dissolves carbonate minerals, thus producing CO2. This question can be answered in southern Taiwan. Taiwan is located at a subduction zone, where an ocean plate slides under the Asian continent. This subduction causes rapid mountain growth. While the center of the island has been standing tall for several million years, the southern tip has just emerged from the sea. There, the mountains have low relief and they erode relatively slowly. Farther north, where the mountains are steep and tall, fresh rock is quickly brought to the Earth's surface to weather. Usefully, the rocks of southern Taiwan are typical of many young mountain ranges around the world, containing mostly silicate minerals with some carbonate and pyrite. In their study, the researchers sampled rivers that collect water from these mountains at different erosion rates. From the material dissolved in the rivers, the researchers estimated the proportion of sulfide, carbonate, and silicate minerals in the weathering. These results allowed them to estimate both the amount of CO2 released and the amount sequestered by the weathering reactions. So, does weathering of mountain ranges increase CO2 in the atmosphere or remove it?
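The budget logic can be made concrete with a toy calculation in which carbonic-acid weathering of silicate counts as a CO2 sink and sulfuric-acid weathering of carbonate counts as a CO2 source. The fractions and fluxes below are invented for illustration and are not the values derived from the Taiwanese rivers.

```python
# Toy carbon budget for rock weathering: carbonic-acid weathering of silicate
# counts as a CO2 sink, sulfuric-acid (pyrite-derived) weathering of carbonate
# counts as a CO2 source. All fractions and fluxes are illustrative assumptions,
# not values derived from the Taiwanese river samples.
def net_co2_flux(total_weathering_flux, frac_silicate_by_carbonic, frac_carbonate_by_sulfuric):
    """Signed CO2 flux in arbitrary units: positive = release, negative = drawdown."""
    sink = frac_silicate_by_carbonic * total_weathering_flux
    source = frac_carbonate_by_sulfuric * total_weathering_flux
    return source - sink

def describe(flux):
    return "net CO2 release" if flux > 0 else "net CO2 drawdown"

slow = net_co2_flux(total_weathering_flux=1.0, frac_silicate_by_carbonic=0.6,
                    frac_carbonate_by_sulfuric=0.2)
fast = net_co2_flux(total_weathering_flux=5.0, frac_silicate_by_carbonic=0.3,
                    frac_carbonate_by_sulfuric=0.5)

print(f"Slowly eroding catchment:  {slow:+.1f} ({describe(slow)})")
print(f"Rapidly eroding catchment: {fast:+.1f} ({describe(fast)})")
```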
Earthquakes
2021
April 6, 2021
https://www.sciencedaily.com/releases/2021/04/210406120738.htm
Silencing vibrations in the ground and sounds underwater
Metamaterials that can control the refractive direction of light or absorb it to enable invisible cloaks are gaining attention. Recently, a research team at POSTECH has designed a metasurface that can control acoustic or elastic waves. It is gaining attention as it can be used to escape from threatening earthquakes or to build submarines untraceable by SONAR.
Professor Junsuk Rho of POSTECH's departments of mechanical engineering and chemical engineering and Ph.D. candidate Dongwoo Lee of the Department of Mechanical Engineering in collaboration with Professor Jensen Li of HKUST have designed an artificial structure that can control not only the domain of underwater sound but also of vibration. The research team has presented an underwater stealth metasurface independent from SONAR by controlling the acoustic resonance to absorb the wave. They also confirmed that the wave propagation through a curved plate, such as vibrations, can be drastically altered; and have presented a methodology that can actually achieve the cloaking effect with singularity of infinite refractive index, which has been considered impossible to demonstrate until now. These research findings were recently published in When light encounters a substance in nature, it generally refracts in the positive (+) direction. Metamaterials can design this refraction of light as a negative (-) direction, a zero refractive index (0) that allows complete transmission, or a complete absorber. This is the reason why things look transparent when they encounter metamaterials. The research team theoretically confirmed a metasurface that can significantly absorb sound waves without reflecting them by tailoring the resonance of sound. Through layers of split-orifice-conduit (SOC) hybrid resonators, a thin metasurface was designed to absorb sound waves in broadband (14 kHz to 17 kHz). The metasurface designed this way can achieve underwater stealth capability untraceable by SONAR, which detects objects using the information between transmitted and reflected waves. The research team confirmed that it is possible to transmit or change the direction of elastic waves -- like seismic waves -- according to the design of curved plates. Applying Albert Einstein's general theory of relativity, which states that the path of light changes in the warping of space-time due to the change in the gravitational field caused by mass, the research team proposed a platform capable of extreme control of elastic waves on a curved plate. As an example, a refractive index singularity lens, which is a metasurface lens that approaches near-zero thickness, is implemented to demonstrate an elastic version of Eaton lenses that can bend waves by 90 and 180 degrees in the broad frequency range (15 kHz to 18 kHz) on thin curved plates. In addition, by proposing a methodology that can actually implement the cloaking effect in which the singularity exists in theory, it is expected that extreme celestial phenomena such as black holes enabled by gravitational fields can be explored as a test bed in the elastic platform in the near future. Based on this understanding of the singularity of refractive index, it is anticipated to enable technologies that protect nuclear power plants or buildings from earthquakes or to control wave energy generated when tectonic plates collide or split. "Until now, metamaterial research has focused on light and electromagnetic waves, but we have confirmed that it can be applied to sound waves and seismic waves," remarked Professor Junsuk Rho, renowned worldwide for his research on metamaterials. "We anticipate that it will be applicable to building untraceable submarines or nuclear power plants that can stay intact even during earthquakes."
Earthquakes
2021
April 6, 2021
https://www.sciencedaily.com/releases/2021/04/210406120704.htm
Seismic coda used to locate and define damage from explosions
Comparison of coda waves, the scattered waves that arrive after the direct waves of a seismic event, can be used to determine the relative locations of two underground explosions, according to a new study published in the open-access journal
The technique, called coda wave interferometry, was tested on explosions conducted as part of the Source Physics Experiment (SPE). Lawrence Livermore National Laboratory researchers Sean Ford and Bill Walter report that coda wave interferometry can also put a limit on the extent of damage caused by an explosion. The findings suggest the technique could be used to improve the estimates of the relative locations of larger explosions, such as the series of announced nuclear tests conducted by the Democratic People's Republic of Korea over the past two decades. "Based on the size and frequency scaling that we were able to employ in the paper and successes at SPE," said Ford, "a conclusion point is that this technique could be used for larger explosions at larger separations recorded at more distant stations" such as those used to monitor North Korean testing. Unlike the direct and strong P- and S-seismic waves produced by an earthquake or explosion event, coda waves arrive later and are more sensitive to scattering by the rock that they pass through. Any changes in the scattering structure -- from rocks pushed or crushed by an explosion, for example -- "will show up in how these later arriving waves have bounced around in that medium over a greater duration," Ford explained. In the new study, Ford and Walter used data from the SPE, an ongoing multi-institutional project involving Lawrence Livermore, Los Alamos and Sandia National Laboratories at the former Nevada nuclear test site. The SPE conducts chemical explosions to better understand the seismic waves they produce and to refine explosion detection techniques, using the analyses to improve monitoring of global nuclear explosions. The researchers used coda wave interferometry to determine the known relative location of two chemical explosions that took place during Phase I of the SPE. The first explosion, SPE-1, was equivalent to 87.9 kilograms of TNT. The second explosion, SPE-2, was equivalent to 997 kilograms of TNT. Their analysis concluded that the two explosions were located between 6 and 18 meters apart, and most likely 9.2 meters apart. The known separation between the two explosions is about 9.4 meters. Previous research by seismologists David Robinson and coworkers showed that coda wave interferometry could precisely locate earthquakes separated by hundreds of meters. "We were confident that the approach would work for chemical explosions, but the question for us was whether it could work for such small and closely located explosions," said Ford. Ford and Walter also used the technique to better characterize the underground damage caused by SPE-2, comparing its coda waves with those produced by the 905-kilogram TNT equivalent SPE-3 that was later detonated in the same spot as SPE-2. The details of the damage "can't be seen from the direct waves arriving at the [1-kilometer or more] distant stations that we're used to, so we thought perhaps we could see it in these more sensitive scattered waves, the coda waves," Ford explained. Based on the analysis, the damage caused by SPE-2 must have been confined to a spherical region with a radius less than 10 meters, the researchers concluded. "We thought there would be much more damage, or at least more of an effect on the outgoing waves, but now there is evidence against that hypothesis, so this points us in other directions to explain the observed P- and S-waves," Ford said. The study is the first research paper published in
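The measurement at the heart of coda wave interferometry can be sketched as a windowed correlation of the late-arriving waves from two events recorded at the same station. The waveforms below are synthetic, and converting the correlation into a separation distance or a damage estimate requires a scattering model that is not shown.

```python
# Sketch of the core measurement in coda wave interferometry: windowed
# correlation of late-arriving (coda) waves from two events at one station.
# Waveforms are synthetic; mapping correlation to source separation needs a
# scattering model not included here.
import numpy as np

rng = np.random.default_rng(7)
fs = 200.0                                   # sampling rate, Hz (assumed)
n = int(60 * fs)                             # one minute of record
common = rng.standard_normal(n)              # shared scattered wavefield
perturb = rng.standard_normal(n)

event_a = common
event_b = 0.9 * common + np.sqrt(1 - 0.9**2) * perturb   # slightly different source position

def coda_correlation(x, y, t_start, t_len, fs):
    """Zero-lag normalized correlation of a coda window starting at t_start seconds."""
    i0, i1 = int(t_start * fs), int((t_start + t_len) * fs)
    xs, ys = x[i0:i1], y[i0:i1]
    return float(np.dot(xs, ys) / np.sqrt(np.dot(xs, xs) * np.dot(ys, ys)))

for t0 in (10, 20, 30, 40):
    cc = coda_correlation(event_a, event_b, t_start=t0, t_len=5, fs=fs)
    print(f"coda window {t0:>2d}-{t0 + 5:<2d} s: correlation {cc:.3f}")
# Closely co-located sources keep a high coda correlation; greater separation,
# or damage to the medium between shots, lowers it.
```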
Earthquakes
2021
March 25, 2021
https://www.sciencedaily.com/releases/2021/03/210325101233.htm
Researchers reveal cost of key climate solution
Perhaps the best hope for slowing climate change -- capturing and storing carbon dioxide emissions underground -- has remained elusive due in part to uncertainty about its economic feasibility.
In an effort to provide clarity on this point, researchers at Stanford University and Carnegie Mellon University have estimated the energy demands involved with a critical stage of the process. Their findings, published April 8 in "Designing massive new infrastructure systems for geological carbon storage with an appreciation for how they intersect with other engineering challenges -- in this case the difficulty of managing high salinity brines -- will be critical to maximizing the carbon benefits and reducing the system costs," said study senior author Meagan Mauter, an associate professor of Civil and Environmental Engineering at Stanford University. Getting to a clean, renewable energy future won't happen overnight. One of the bridges on that path will involve dealing with carbon dioxide emissions -- the dominant greenhouse gas warming the Earth -- as fossil fuel use winds down. That's where carbon sequestration comes in. While most climate scientists agree on the need for such an approach, there has been little clarity about the full lifecycle costs of carbon storage infrastructure. An important aspect of that analysis is understanding how we will manage brines, highly concentrated salt water that is extracted from underground reservoirs to increase carbon dioxide storage capacity and minimize earthquake risk. Saline reservoirs are the most likely storage places for captured carbon dioxide because they are large and ubiquitous, but the extracted brines have an average salt concentration that is nearly three times higher than seawater. These brines will either need to be disposed of via deep well injection or desalinated for beneficial reuse. Pumping them underground -- an approach that has been used for oil and gas industry wastewater -- has been linked to increased earthquake frequency and has led to significant public backlash. But desalinating the brines is significantly more costly and energy intensive due, in part, to the efficiency limits of thermal desalination technologies. It's an essential, complex step with a potentially huge price tag. The new study is the first to comprehensively assess the energy penalties and carbon dioxide emissions involved with brine management as a function of various carbon transport, reservoir management and brine treatment scenarios in the U.S. The researchers focused on brine treatment associated with storing carbon from coal-fired power plants because they are the country's largest sources of carbon dioxide, the most cost-effective targets for carbon capture and their locations are generally representative of the location of carbon dioxide point sources. Perhaps unsurprisingly, the study found higher energy penalties for brine management scenarios that prioritize treatment for reuse. In fact, brine management will impose the largest post-capture and compression energy penalty on a per-tonne of carbon dioxide basis, up to an order of magnitude greater than carbon transport, according to the study. "There is no free lunch," said study lead author Timothy Bartholomew, a former civil and environmental engineering graduate student at Carnegie Mellon University who now works for KeyLogic Systems, a contractor for the Department of Energy's National Energy Technology Laboratory. "Even engineered solutions to carbon storage will impose energy penalties and result in some carbon emissions.
As a result, we need to design these systems as efficiently as possible to maximize their carbon reduction benefits." Solutions may be at hand. The energy penalty of brine management can be reduced by prioritizing storage in low salinity reservoirs, minimizing the brine extraction ratio and limiting the extent of brine recovery, according to the researchers. They warn, however, that these approaches bring their own tradeoffs for transportation costs, energy penalties, reservoir storage capacity and safe rates of carbon dioxide injection into underground reservoirs. Evaluating the tradeoffs will be critical to maximizing carbon dioxide emission mitigation, minimizing financial costs and limiting environmental externalities. "There are water-related implications for most deep decarbonization pathways," said Mauter, who is also a fellow at the Stanford Woods Institute for the Environment. "The key is understanding these constraints in sufficient detail to design around them or develop engineering solutions that mitigate their impact." Funding for this research was provided by the National Science Foundation, the ARCS Foundation and the U.S. Department of Energy.
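A toy comparison of what an energy penalty per tonne of stored CO2 might look like under different brine-management choices is sketched below. The brine volumes and specific-energy figures are hypothetical placeholders, not numbers from the Stanford and Carnegie Mellon analysis; they only illustrate why treatment for reuse carries a much larger penalty than disposal.

```python
# Toy comparison of brine-management energy penalties per tonne of stored CO2.
# The brine-extraction ratio and specific-energy figures are hypothetical
# placeholders, not values from the study discussed above.
def energy_penalty_kwh_per_tonne_co2(brine_m3_per_tonne_co2, specific_energy_kwh_per_m3):
    """Energy spent on brine handling per tonne of CO2 injected."""
    return brine_m3_per_tonne_co2 * specific_energy_kwh_per_m3

scenarios = {
    "deep-well disposal (pumping only)": dict(brine_m3_per_tonne_co2=0.8,
                                              specific_energy_kwh_per_m3=2.0),
    "desalination for reuse (high-salinity brine)": dict(brine_m3_per_tonne_co2=0.8,
                                                         specific_energy_kwh_per_m3=25.0),
}

for name, params in scenarios.items():
    penalty = energy_penalty_kwh_per_tonne_co2(**params)
    print(f"{name}: ~{penalty:.0f} kWh per tonne CO2")
```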
Earthquakes
2021
March 24, 2021
https://www.sciencedaily.com/releases/2021/03/210324142845.htm
Weird earthquake reveals hidden mechanism
The wrong type of earthquake in an area where there should not have been an earthquake led researchers to uncover the cause for this unexpected strike-slip earthquake -- where two pieces of crust slide past each other on a fault -- in places where subduction zone earthquakes -- one geologic plate slipping beneath another -- are common.
"The first earthquake that occurred in the Shumagin Islands region of Alaska was the right type," said Kevin P. Furlong, professor of geoscience, Penn State. "The second one was a strike-slip earthquake and that made no sense. That was the part that got us thinking."In subduction zones, where two tectonic plates meet, one plate slides beneath the other. If the plates slide smoothly, they are considered unlocked or uncoupled. If the plates hang-up on each other for a time until the forces overcome the friction holding them and then release, causing an earthquake, they are considered locked or coupled. Some portions of a subduction zone can be locked while other parts may be unlocked.In 2020, a subduction zone interface earthquake, the type expected at a subduction zone, occurred on the Alaska-Aleutian subduction zone to the east of an area called the Shumagin Gap. This first earthquake happened where other, similar earthquakes occurred in the past.The Shumagin Gap is an area of the subduction zone considered to be unlocked and where geoscientists assume no earthquakes will occur. However, where the first earthquake occurred, just on the edge of the Shumagin Gap, the subduction zone is locked, and earthquakes have occurred there.In October 2020, a strike-slip earthquake occurred in the Pacific Plate right in the middle of the gap, which was unexpected."There must be a fault in the subducting Pacific Plate, and we can't see it," said Furlong.He explained that in the oceanic crust there are strike-slip faults that develop at the mid-oceanic ridges. This fault in the Shumagin Gap could be a relic of a fault from the mid-ocean ridge, activated in a different way. It appears to be in the correct direction, Furlong added.To investigate this event, Furlong and Matthew Herman, assistant professor of geology, California State University, Bakersfield, modeled the earthquakes. They also included data from the small tsunami that occurred from the second earthquake. They found that, with the presence of a fault in the subducting plate, the uncoupled nature of the Shumagin Gap made an earthquake there more likely than if the area was coupled. The researchers report their result today (Mar. 24) in "The potential for unusual earthquakes in these regions makes sense from our computational models," said Herman. "But it is still pretty counterintuitive that making the expected kind of earthquakes less likely actually makes other types of big earthquakes more likely."The researchers found that tsunami data was helpful, especially in areas where GPS data were not available. Tsunamis also allow paleoseismologists to look at past events through any deposits left by previous earthquakes. Previously there was no evidence of large earthquakes in this area from tsunami data."There are probably other areas that are uncoupled where we assume they are safe from earthquakes, but they aren't," said Furlong. "They are unlikely to have a big subduction earthquake, but they could have a strike-slip earthquake. If there are people in the area, it could do damage with shaking and a small tsunami."Furlong suggested there is an increased recognition that there are other ways of generating earthquakes on plate boundaries, and that we need to be a little more forward-thinking when we consider earthquakes on these boundaries.The National Science Foundation supported this research.
Earthquakes
2021
March 22, 2021
https://www.sciencedaily.com/releases/2021/03/210322143313.htm
New basalt type discovered beneath the ocean
A new type of rock created during large and exceptionally hot volcanic eruptions has been discovered beneath the Pacific Ocean.
An international team of researchers including the University of Leeds unearthed the previously unknown form of basalt after drilling through the Pacific Ocean floor. The discovery suggests that ocean floor eruptions sourced in the Earth's mantle were even hotter and more voluminous than previously thought. The report's co-author is Dr Ivan Savov, of Leeds' Institute of Geophysics and Tectonics, in the university's School of Earth and Environment. He said: "In an era when we rightly admire discoveries made through space exploration, our findings show there are still many discoveries still to make on our own planet." "The rocks that we recovered are distinctly different to rocks of this type that we already know about. In fact, they may be as different to Earth's known ocean floor basalts as Earth's basalts are to the Moon's basalts." "Now that we know where and how this rock type is formed, we anticipate that many other rocks that we know were originally formed by ocean floor eruptions will be re-examined and potentially alter our wider understanding of the basalt formation." The newly-discovered basalt is distinct from known rocks in both its chemical and mineral makeup. Its existence was previously not known because no new examples have been formed in millions of years. As a result, the new basalt type lay buried deep beneath sediment at the bottom of the ocean. To find the new rock, the research team, aboard the Research Vessel (RV) JOIDES "Resolution," sank their drilling equipment 6km down to the ocean floor of the Amami Sankaku Basin -- about 1,000km southwest of Japan's Mount Fuji volcano. They then drilled a further 1.5km into the ocean floor, extracting samples that had never before been examined by scientists. The research area was part of the birth of the "Ring of Fire" -- a horseshoe-shaped belt known for regular volcanic eruptions and earthquakes. It stretches about 40,000 km around the Pacific, and is thought to have begun forming at least 50 million years ago. Dr Savov explained: "This was one of the deepest waters ever to be considered for drilling, using a research vessel specifically designed for such challenging deep sea environments." "Basalt is among the most common types of rock on Earth. We were looking for basalt that was formed during early Ring of Fire volcanic eruptions." The eruptions that created the newly-discovered basalt were very widespread (covering areas the size of western Europe) and occurred in a relatively short geological timescale of between 1-2 million years. The research team's findings have been published in
Earthquakes
2021
March 22, 2021
https://www.sciencedaily.com/releases/2021/03/210322093727.htm
Virtues of modeling many faults: New method illuminates shape of Alaskan quake
An earthquake is generally viewed to be caused by a rupture along a fault that is transmitted outward from its point of origin in a uniform, predictable pattern. Of course, given the complexity of the environments where these ruptures typically occur, the reality is often much more complicated.
In a new study published in As study co-author Professor Yuji Yagi explains, "Our method uses a flexible finite-fault inversion framework with improved smoothness constraints. This approach allows us to analyze seismic P waves and estimate the focal mechanisms and rupture evolution of geometrically complex earthquakes involving rupture of multiple fault segments." Based on the distribution of aftershocks within one week of the main shock of the Gulf of Alaska earthquake, this method was applied to represent slip along a horizontal plane at a depth of 33.6 km. The main rupture stage of the earthquake, which lasted for 27 seconds, affected fault segments oriented both north-south and east-west. "Our results confirm previous reports that this earthquake ruptured a conjugate fault system in a multi-shock sequence," says study first author Shinji Yamashita. "Our model further suggests that this rupture tended to occur along weak zones in the sea floor: fracture zones that extend east-west, as well as plate-bending faults that run parallel to north-south-oriented magnetic lineaments." These features caused discontinuities in the fault geometry that led to irregular rupture behavior. "Our findings show that irregular rupture stagnation 20 kilometers north of the earthquake's epicenter may have been promoted by a fault step across the seafloor fracture zone," explains co-author Assistant Professor Ryo Okuwaki. "They also indicate a causal link between rupture evolution and pre-existing bathymetric features in the Gulf of Alaska." This method represents a promising step forward in modeling earthquake rupture processes in complex fault systems based only on seismic body waves, which may improve modeling of seismic wave propagation and mapping of complex fault networks in tectonically active areas.
Earthquakes
2021
March 18, 2021
https://www.sciencedaily.com/releases/2021/03/210318170319.htm
Melting glaciers contribute to Alaska earthquakes
In 1958, a magnitude 7.8 earthquake triggered a rockslide into Southeast Alaska's Lituya Bay, creating a tsunami that ran 1,700 feet up a mountainside before racing out to sea.
Researchers now think the region's widespread loss of glacier ice helped set the stage for the quake. In a recently published research article, scientists with the University of Alaska Fairbanks Geophysical Institute found that ice loss near Glacier Bay National Park has influenced the timing and location of earthquakes with a magnitude of 5.0 or greater in the area during the past century. Scientists have known for decades that melting glaciers have caused earthquakes in otherwise tectonically stable regions, such as Canada's interior and Scandinavia. In Alaska, this pattern has been harder to detect, as earthquakes are common in the southern part of the state. Alaska has some of the world's largest glaciers, which can be thousands of feet thick and cover hundreds of square miles. The ice's weight causes the land beneath it to sink, and, when a glacier melts, the ground springs back like a sponge. "There are two components to the uplift," said Chris Rollins, the study's lead author who conducted the research while at the Geophysical Institute. "There's what's called the 'elastic effect,' which is when the earth instantly springs back up after an ice mass is removed. Then there's the prolonged effect from the mantle flowing back upwards under the vacated space." In the study, researchers link the expanding movement of the mantle with large earthquakes across Southeast Alaska, where glaciers have been melting for over 200 years. More than 1,200 cubic miles of ice have been lost. Southern Alaska sits at the boundary between the continental North American plate and the Pacific Plate. They grind past each other at about two inches per year -- roughly twice the rate of the San Andreas fault in California -- resulting in frequent earthquakes. The disappearance of glaciers, however, has also caused Southeast Alaska's land to rise at about 1.5 inches per year. Rollins ran models of earth movement and ice loss since 1770, finding a subtle but unmistakable correlation between earthquakes and earth rebound. When they combined their maps of ice loss and shear stress with seismic records back to 1920, they found that most large quakes were correlated with the stress from long-term earth rebound. Unexpectedly, the greatest amount of stress from ice loss occurred near the exact epicenter of the 1958 quake that caused the Lituya Bay tsunami. While the melting of glaciers is not the direct cause of earthquakes, it likely modulates both the timing and severity of seismic events. When the earth rebounds following a glacier's retreat, it does so much like bread rising in an oven, spreading in all directions. This effectively unclamps strike-slip faults, such as the Fairweather in Southeast Alaska, and makes it easier for the two sides to slip past one another. In the case of the 1958 quake, the postglacial rebound torqued the crust around the fault in a way that increased stress near the epicenter as well. Both this and the unclamping effect brought the fault closer to failure. "The movement of plates is the main driver of seismicity, uplift and deformation in the area," said Rollins. "But postglacial rebound adds to it, sort of like the de-icing on the cake. It makes it more likely for faults that are in the red zone to hit their stress limit and slip in an earthquake."
Earthquakes
2021
March 17, 2021
https://www.sciencedaily.com/releases/2021/03/210317141719.htm
A new view on plate tectonics
Forces acting inside the Earth have been constantly reshaping the continents and ocean basins over millions of years. What Alfred Wegener published as an idea in 1915 has finally been accepted since the 1960s, providing a unifying view about our planet. The fact that the theory of plate tectonics took so long to gain acceptance had two simple reasons. First, the geological formations that are most important for its understanding lie at the bottom of the oceans. Secondly, forces controlling the processes act below the seafloor and are hence hidden from our view. Many details of plate tectonics are therefore still unclear today.
Today, five scientists from GEOMAR Helmholtz Centre for Ocean Research Kiel, the Southern University of Science and Technology (Shenzhen, China) and GeoModelling Solutions GmbH (Switzerland) publish a study in the international scientific journal A look at a global overview map of the ocean floors helps to understand the study. Even at low resolution, the mid-ocean ridges, several tens of thousands of kilometres long, can be recognised on such maps. They mark the boundaries of the Earth's plates. In between, hot material from the Earth's interior reaches the surface, cools down, forms new ocean floor and pushes the older ocean floor apart. "This is the engine that keeps the plates moving," explains Prof. Grevemeyer. However, the mid-ocean ridges do not form unbroken lines. They are cut by transverse valleys at almost regular intervals. The individual segments of the ridges each begin or end in an offset at these incisions. "These are the transform faults. Because the Earth is a sphere, plate movements repeatedly cause faults that produce these ridge offsets," explains Prof. Lars Rüpke from GEOMAR, co-author of the study. Earthquakes can occur at the transform faults and they leave long scars, so-called fracture zones, on oceanic plates. Until now, however, research assumed that the two plates only slide past each other at transform faults, but that seafloor is neither formed nor destroyed in the process. The authors of the current study have now looked at available maps of 40 transform faults in all ocean basins. "In all examples, we could see that the transform valleys are significantly deeper than the adjacent fracture zones, which were previously thought to be simple continuations of the transform valleys," says co-author Prof. Colin Devey from GEOMAR. The team also detected traces of extensive magmatism at the outer corners of the intersections between transform valleys and the mid-ocean ridges. Using sophisticated numerical models, the team found an explanation for the phenomenon. According to this, the plate boundary along the transform fault is increasingly tilted at depth, so that shearing occurs. This causes extension of the seafloor, forming the deep transform valleys. Magmatism at the outer corners to the mid-ocean ridges then fills up the valleys, so that the fracture zones become much shallower. Oceanic crust that forms at the corners is therefore the only crust in the ocean that is formed by two-stage volcanism. What effects this has on its composition or, for example, the distribution of metals in the crust is still unknown. Since transform faults are a fundamental type of plate boundary and a frequent phenomenon along active plate boundaries in the oceans, this new finding is an important addition to the theory of plate tectonics and thus to understanding our planet. "Actually, the observation was obvious. But there are simply not enough high-resolution maps of the seafloor yet, so no one has noticed it until now," says Prof. Grevemeyer.
Earthquakes
2021
March 17, 2021
https://www.sciencedaily.com/releases/2021/03/210317141713.htm
Icy ocean worlds seismometer passes further testing in Greenland
The NASA-funded Seismometer to Investigate Ice and Ocean Structure (SIIOS) performed well in seismic experiments conducted in snowy summer Greenland, according to a new study by the SIIOS team led by the University of Arizona published this week in
SIIOS could be a part of proposed NASA spacecraft missions to the surface of Europa or Enceladus. These moons of Jupiter and Saturn are encrusted by an icy shell over subsurface liquid oceans, and seismic data could be used to better define the thickness and depth of these layers. Other seismic points of interest on these worlds could include ice volcanoes, drainage events below the ice shell and possibly even a timely glimpse of the reverberations from a meteorite impact. To better mimic mission conditions, the SIIOS team attached flight candidate seismometers to the platform and legs of a buried and aluminum-shielded mock spacecraft lander on the Greenland Ice Sheet. Angela Marusiak of NASA's Jet Propulsion Laboratory and colleagues found that the lander's recordings of seismic waves from passive and active seismic sources were comparable to recordings made by other ground seismometers and geophones up to a kilometer away. Although the attached seismometers did pick up some of the shaking of the lander itself, Marusiak said the lander and ground-based seismometers "performed very similar to each other, which is definitely promising," in detecting earthquakes and ice cracking. The experimental array was placed over a subglacial lake (a new feature in Greenland that had not yet been studied with seismic approaches) and the lander-coupled seismometers were also able to detect the ice-water interface, which would be one of the instrument's primary tasks on the icy ocean worlds. The scientists buried the lander and nearby seismometers a meter deep in granular snow, and covered the lander with an aluminum box, to reduce the effects of wind and temperature variation on the instruments. This brought the experiment closer to the atmospheric conditions that might be expected on an airless moon like Europa. During an icy ocean world mission, however, the seismometer would likely only be deployed to the surface and may not be buried. "What we're hoping for is if we are able to go to Europa or Enceladus or one of these icy worlds that doesn't have huge temperature fluctuations or a very thick atmosphere and we're taking away that wind noise, essentially you're taking away what's going to cause a lot of shaking of the lander," explained Marusiak, who conducted the research while she was a Ph.D. student at the University of Maryland. And unlike on Earth, researchers for these missions wouldn't be able to deploy a large array of seismometers and gather data for months at a time to build a picture of the moon's interior. The available solar energy to power the devices would be 25 times less than that on Earth, and devastating radiation would be likely to destroy the instruments within a couple weeks on a moon like Europa, she said. After taking an Air Greenland helicopter ride to the site in the summer of 2018, the SIIOS deployment team set up the experimental lander and array on the ice sheet about 80 kilometers north of Qaanaaq. For the active source experiment, the instruments recorded seismic signals created by the team members striking aluminum plates with a sledgehammer at locations up to 100 meters from the array's center. The array then made passive recordings of local and regional seismic events and the ice sheet's ambient creaking and cracking noises for about 12 days, until an unusual summer snow buried the solar panels powering the array. Marusiak was proud to be a member of an all-female demobilization team, and grateful for the warm reception that the scientists received at Thule AFB.
The work would not have been possible without the logistics support provided by the National Science Foundation, Polar Field Services, and local guides. The team plans to return to Greenland this summer to test a prototype seismometer that has been designed to account for more mission-ready conditions of radiation, vacuum and launch vibration, she said.
Earthquakes
2021
March 16, 2021
https://www.sciencedaily.com/releases/2021/03/210316183643.htm
The potential economic impact of volcano alerts
The Volcano Alert Level (VAL) system, standardized by the United States Geological Survey (USGS) in 2006, is meant to save lives and keep citizens living in the shadow of an active volcano informed of their current level of risk.
A new study published in A team of geoscientists and statistical experts examined the historical relationship between volcano alerts issued by the United States Geological Survey (USGS) and regional economic growth for three of the country's most dangerous volcanoes: Washington State's Mount St. Helens, Hawaii's Kīlauea, and California's Long Valley Caldera. They analyzed the effect of VALs and their predecessors (such as hazard alerts and volcano alerts) on local housing prices and business patterns over a 42-year period, from 1974 to 2016. The economic indicators used in the analysis included annual housing price, number of business establishments per 1,000 square kilometers, the number of employees per 1,000 inhabitants, and payroll per employee. The team used econometric models to observe economic indicator trends during times when an increase in volcanic activity above "normal" led to a public alert. "Signs of volcanic unrest include ground deformation, rising CO2..." Both lower and higher alert level notifications were shown to have short-term effects on housing prices and business indicators in all three regions. The most significant negative impacts were seen for California's Long Valley Caldera area from 1982-83 and 1991-97. Home to Mono Lake, Mammoth Mountain, and the very popular Mammoth Lakes ski area, this complex volcanic region has experienced prolonged episodic unrest. Not all of the volcanic regions experienced a significant long-term economic impact from an elevated VAL. The greatest exception was Mount St. Helens. Peers suggests this could be due to "volcano tourism and close proximity to the major tech hub of Portland, Oregon." Despite catastrophic volcanic potential, the regional economy in the footprint of Mount St. Helens has benefited from tourism to the volcano -- accelerated by the establishment of Mount St. Helens National Volcanic Monument in 1982. The study's findings are consistent with those from other natural hazards studies that have documented temporary declines in housing prices following successive hurricanes, floods, and wildfires. With natural hazards, the mere presence of information about hazard potential in the form of a public alert level notification may have an adverse effect on local economies. This sheds light on a systemic issue in disaster resilience, the authors argue. The federal government currently provides disaster relief for direct impacts of volcanic eruptions and other natural disasters, but limited or no assistance for the indirect effects experienced from long periods of volcanic unrest. Durations of volcanic unrest are often protracted in comparison to precursory periods for other hazardous events (such as earthquakes, hurricanes, and floods). As Peers points out, this makes the issue of disaster relief for indirect effects particularly important in high-risk volcanic regions. For experts who study the risks of natural hazards, the team suggests they have developed a repeatable and reliable methodology to test hazard alert effects on local economies using publicly available federal U.S. business statistics. "This could be utilized to examine the impacts of all hazard alerts, such as those for wildfires or earthquakes," the authors write. And for citizens, "we hope this research will help people better understand that the risks involved with living around a volcano are not entirely from the physical hazards associated with volcanism. It's more financially complicated than that," says Peers.
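The econometric approach described here can be sketched, in very simplified form, as a panel regression of an economic indicator on an elevated-alert indicator plus region and year controls. The data below are synthetic and the specification is an assumption made for illustration; it is not the authors' exact model.

```python
# Sketch of a panel regression relating hazard-alert periods to a local economic
# indicator. Data are synthetic and the specification (log housing price on an
# elevated-alert dummy plus region and year controls) is an illustrative
# assumption, not the authors' econometric model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
regions = ["st_helens", "kilauea", "long_valley"]
years = np.arange(1974, 2017)

rows = []
for r_idx, region in enumerate(regions):
    elevated = rng.random(len(years)) < 0.15          # synthetic alert years
    trend = 0.02 * (years - years[0])                 # common growth trend
    log_price = 11.0 + 0.1 * r_idx + trend - 0.05 * elevated + rng.normal(0, 0.03, len(years))
    for y, e, p in zip(years, elevated, log_price):
        rows.append(dict(region=region, year=int(y), elevated_alert=int(e), log_price=float(p)))

panel = pd.DataFrame(rows)
model = smf.ols("log_price ~ elevated_alert + C(region) + year", data=panel).fit()
print("alert effect on log price:", round(model.params["elevated_alert"], 4),
      "p =", round(model.pvalues["elevated_alert"], 4))
```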
Earthquakes
2021
March 11, 2021
https://www.sciencedaily.com/releases/2021/03/210311142024.htm
Geologists discover powerful 'river of rocks' below Caribbean
Geologists have long thought tectonic plates move because they are pulled by the weight of their sinking portions and that an underlying, hot, softer layer called asthenosphere serves as a passive lubricant. But a team of geologists at the University of Houston has found that layer is actually flowing vigorously, moving fast enough to drive plate motions.
"Without the extra support generated by this flow in the asthenosphere, portions of Central America would still be below sea level. The Atlantic and the Pacific Oceans would be connected without a need for the Panama Canal," said study co-author Lorenzo Colli, assistant professor of geophysics, geodynamics and mantle structure in the Department of Earth and Atmospheric Sciences. The findings have implications for understanding the shape of the Earth's surface and its evolution over time through the appearance and disappearance of shallow seas and low-lying land bridges, as well as the forces that move tectonic plates and cause earthquakes. Another fascinating discovery, according to the researchers, is that the asthenosphere is moving six inches per year, which is three times faster than an average plate. It can move independently from the overlying plates and drag them in a different direction. "This challenges the top-down notion that subduction is always the driver," explained Jonny Wu, study co-author and assistant professor of structural geology, tectonics and mantle structure. "Think of the plates moving like an air hockey puck and being lubricated from below. Instead, what we found is the air hockey table is imposing its own currents on the puck that's moving around, creating a bottom-up movement that has not been well recognized, and that's being quantified here."
Earthquakes
2021
March 10, 2021
https://www.sciencedaily.com/releases/2021/03/210310150426.htm
Catching energy-exploration caused earthquakes before they happen
Geoscientists at Sandia National Laboratories used 3D-printed rocks and an advanced, large-scale computer model of past earthquakes to understand and prevent earthquakes triggered by energy exploration.
Injecting water underground after unconventional oil and gas extraction, commonly known as fracking, geothermal energy stimulation and carbon dioxide sequestration all can trigger earthquakes. Of course, energy companies do their due diligence to check for faults -- breaks in the earth's upper crust that are prone to earthquakes -- but sometimes earthquakes, even swarms of earthquakes, strike unexpectedly. Sandia geoscientists studied how pressure and stress from injecting water can transfer through pores in rocks down to fault lines, including previously hidden ones. They also crushed rocks with specially engineered weak points to hear the sound of different types of fault failures, which will aid in early detection of an induced earthquake. To study different types of fault failures, and their warning signs, Sandia geoscientist Hongkyu Yoon needed a bunch of rocks that would fracture the same way each time he applied pressure -- pressure not unlike the pressure caused by injecting water underground. Natural rocks collected from the same location can have vastly different mineral orientation and layering, causing different weak points and fracture types. Several years ago, Yoon started using additive manufacturing, commonly known as 3D printing, to make rocks from a gypsum-based mineral under controlled conditions, believing that these rocks would be more uniform. To print the rocks, Yoon and his team sprayed gypsum in thin layers, forming 1-by-3-by-0.5 inch rectangular blocks and cylinders. However, as he studied the 3D-printed rocks, Yoon realized that the printing process also generated minute structural differences that affected how the rocks fractured. This piqued his interest, leading him to study how the mineral texture in 3D-printed rocks influences how they fracture. "It turns out we can use that variability of mechanical and seismic responses of a 3D-printed fracture to our advantage to help us understand the fundamental processes of fracturing and its impact on fluid flow in rocks," Yoon said. This fluid flow and pore pressure can trigger earthquakes. For these experiments, Yoon and collaborators at Purdue University, a university with which Sandia has a strong partnership, made a mineral ink using calcium sulfate powder and water. The researchers, including Purdue professors Antonio Bobet and Laura Pyrak-Nolte, printed a layer of hydrated calcium sulfate, about half as thick as a sheet of paper, and then applied a water-based binder to glue the next layer to the first. The binder recrystallized some of the calcium sulfate into gypsum, the same mineral used in construction drywall. The researchers printed the same rectangular and cylindrical gypsum-based rocks. Some rocks had the gypsum mineral layers running horizontally, while others had vertical mineral layers. The researchers also varied the direction in which they sprayed the binder, to create more variation in mineral layering. The research team squeezed the samples until they broke. The team examined the fracture surfaces using lasers and an X-ray microscope. They noticed the fracture path depended on the direction of the mineral layers. Yoon and colleagues described this fundamental study in a published paper. Also, working with his collaborators at Purdue University, Yoon monitored acoustic waves coming from the printed samples as they fractured. These sound waves are signs of rapid microcracks.
Then the team combined the sound data with machine-learning techniques, a type of advanced data analysis that can identify patterns in seemingly unrelated data, to detect signals of minute seismic events. First, Yoon and his colleagues used a machine-learning technique known as a random forest algorithm to cluster the microseismic events into groups that were caused by the same types of microstructures and identify about 25 important features in the microcrack sound data. They ranked these features by significance. Using the significant features as a guide, they created a multilayered "deep" learning algorithm -- like the algorithms that allow digital assistants to function -- and applied it to archived data collected from real-world events. The deep-learning algorithm was able to identify signals of seismic events faster and more accurately than conventional monitoring systems. Yoon said that within five years they hope to apply many different machine-learning algorithms, like these and those with embedded geoscience principles, to detect induced earthquakes related to fossil fuel activities in oil or gas fields. The algorithms can also be applied to detect hidden faults that might become unstable due to carbon sequestration or geothermal energy stimulation, he said. "One of the nice things about machine learning is the scalability," Yoon said. "We always try to apply certain concepts that were developed under laboratory conditions to large-scale problems -- that's why we do laboratory work. Once we proved those machine-learning concepts developed at the laboratory scale on archived data, it's very easy to scale it up to large-scale problems, compared to traditional methods." A hidden fault was the cause of a surprise earthquake at a geothermal stimulation site in Pohang, South Korea. In 2017, two months after the final geothermal stimulation experiment ended, a magnitude 5.5 earthquake shook the area, the second strongest quake in South Korea's recent history. After the earthquake, geoscientists discovered a fault hidden deep between two injection wells. To understand how stresses from water injection traveled to the fault and caused the quake, Kyung Won Chang, a geoscientist at Sandia, realized he needed to consider more than the stress of water pressing on the rocks. In addition to that deformation stress, he also needed to account in his complex, large-scale computational model for how that stress transferred to the rock as the water flowed through pores in the rock itself. Chang and his colleagues described the stress transfer in a published paper. However, understanding deformation stress and transfer of stress through rock pores is not enough to understand and predict some earthquakes induced by energy-exploration activities. The architecture of different faults also needs to be considered. Using his model, Chang analyzed a cube 6 miles long, 6 miles wide and 6 miles deep where a swarm of more than 500 earthquakes took place in Azle, Texas, from November 2013 to May 2014. The earthquakes occurred along two intersecting faults, one less than 2 miles beneath the surface and another longer and deeper. While the shallow fault was closer to the sites of wastewater injection, the first earthquakes occurred along the longer, deeper fault. In his model, Chang found that the water injections increased the pressure on the shallow fault. At the same time, injection-induced stress transferred through the rock down to the deep fault.
Because the deep fault was under more stress initially, the earthquake swarm began there. He and Yoon shared the advanced computational model and their description of the Azle earthquakes in a recently published paper. "In general, we need multiphysics models that couple different forms of stress beyond just pore pressure and the deformation of rocks, to understand induced earthquakes and correlate them with energy activities, such as hydraulic stimulation and wastewater injection," Chang said. Chang said he and Yoon are working together to apply and scale up machine-learning algorithms to detect previously hidden faults and identify signatures of geologic stress that could predict the magnitude of a triggered earthquake. In the future, Chang hopes to use those stress signatures to create a map of potential hazards for induced earthquakes around the United States.
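The passage above describes ranking features of microcrack acoustic signals with a random forest before training a deeper model. The sketch below is an illustrative stand-in for that first step, not the Sandia pipeline: it ranks synthetic acoustic-emission features with scikit-learn, and the feature names, labels, and data are invented for the example.

# Illustrative feature ranking with a random forest (synthetic stand-in data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n_events = 500
features = {
    "peak_amplitude": rng.lognormal(0.0, 0.5, n_events),
    "dominant_freq_khz": rng.normal(150, 30, n_events),
    "rise_time_us": rng.exponential(20, n_events),
    "energy": rng.lognormal(1.0, 0.8, n_events),
}
X = np.column_stack(list(features.values()))
# hypothetical labels: 0 = tensile microcrack, 1 = shear microcrack
y = (features["dominant_freq_khz"] + 10 * rng.standard_normal(n_events) > 150).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
for name, score in sorted(zip(features, model.feature_importances_), key=lambda kv: -kv[1]):
    print(f"{name:>18s}  importance={score:.3f}")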
Earthquakes
2021
March 5, 2021
https://www.sciencedaily.com/releases/2021/03/210305123805.htm
Making sense of commotion under the ocean to locate tremors near deep-sea faults
Researchers from Japan and Indonesia have pioneered a new method for more accurately estimating the source of weak ground vibrations in areas where one tectonic plate is sliding under another in the sea. Applying the approach to Japan's Nankai Trough, the researchers were able to estimate previously unknown properties in the region, demonstrating the method's promise to help probe properties needed for better monitoring and understanding larger earthquakes along this and other plate interfaces.
Episodes of small, often imperceptible seismic events known as tremors occur around the world and are particularly common in areas near volcanoes and subduction zones -- regions where one of the massive plates forming Earth's outer layers slides under another. Though these vibrations may be weak, studying them is important for estimating features of the associated tectonic plate boundaries and for detecting slipping among the plates, which can be used to warn of larger earthquake events and tsunamis. "Tremor episodes occur frequently in subduction zones, but their point of origin can be difficult to determine as they have no clear onset features like the sudden, strong shaking seen with ordinary earthquakes," explains Takeshi Tsuji, leader of the study's research team from Kyushu University's International Institute for Carbon-Neutral Energy Research (I2CNER). "Current techniques to identify their source rely on waveform readings from nearby seismic stations together with a modelling system, but complex geological structures can greatly influence the results, leading to inaccurate travel times." The I2CNER team developed the new methodology to take into account some of the complexities of subduction zones such as the Nankai Trough and estimate more accurate travel times from source to station. The novel approach involves a model that does not rely on a constant waveform and also considers the relationships between tremors detected at all possible pairs of monitoring stations. "Applying this method to the Nankai Trough, we found that most tremors occurred in areas of high fluid pressure called the shear zone on the plate boundary," says study lead author Andri Hendriyana. "The thickness of the shear zone was found to be a major controlling factor for the tremor epicentre, with the tremor sequence initiating at regions where fluid pressures within the rocks are the greatest." Having better determined the locations of several tremors, the researchers could also more accurately estimate the speed of tremor propagation. Using this information, the team was then able to estimate how easily liquids can move through the deep fault. Known as permeability, this property is important for evaluating earthquake rupture processes and had never before been reported for the deep plate interface of the Nankai Trough. "Accurately determining tremor source and related geophysical properties is crucial in the monitoring and modelling of larger earthquakes along the plate interface," comments Tsuji. "Our method can also be applied in other regions where tremor location estimation is difficult because of a complex geography to better obtain this vital information."
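The method relies on the relative timing of tremor signals between pairs of stations rather than on clear onsets. The following sketch shows the basic station-pair idea, estimating a differential arrival time by cross-correlating two synthetic waveforms; the sampling rate, noise level, and 1.7-second delay are assumptions, and real tremor analysis is considerably more involved.

# Station-pair differential travel time from cross-correlation (synthetic waveforms).
import numpy as np
from scipy.signal import correlate, correlation_lags

fs = 100.0                      # sampling rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(2)
tremor = np.exp(-((t - 30) / 3) ** 2) * np.sin(2 * np.pi * 2.0 * t)  # 2 Hz wave packet

true_lag_s = 1.7                # station B records the packet 1.7 s later (assumed)
station_a = tremor + 0.2 * rng.standard_normal(t.size)
station_b = np.interp(t - true_lag_s, t, tremor, left=0, right=0) + 0.2 * rng.standard_normal(t.size)

xcorr = correlate(station_b, station_a, mode="full")
lags = correlation_lags(station_b.size, station_a.size, mode="full")
estimated_lag_s = lags[np.argmax(xcorr)] / fs
print(f"estimated differential travel time: {estimated_lag_s:.2f} s")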
Earthquakes
2021
March 4, 2021
https://www.sciencedaily.com/releases/2021/03/210304100429.htm
Revisiting the Kobe earthquake and the variations of atmospheric radon concentration
Tohoku University researchers have unearthed further details about radon concentration in the atmosphere before and after earthquakes, moving us closer to being able to anticipate when large earthquakes may hit.
The results of their research have been published. Radon is a radioactive noble gas derived from the radioactive decay of radium-226 in the ground. Radon bubbles up to the surface and is expelled into the atmosphere. It has long been known that elevated levels of radon underneath the ground can be detected before and after earthquakes. But the relationship between the mechanisms that cause abnormal changes in radon concentration and the occurrence of earthquakes requires greater understanding in order to predict earthquakes accurately. Professor Hiroyuki Nagahama and associate professor Jun Muto of the Graduate School of Science at Tohoku University, in collaboration with Fukushima Medical University and Kobe Pharmaceutical University, analyzed radon concentration data observed before the 1995 Kobe Earthquake. "We found that there were changes in radon concentration data that originated from tides," said Muto. "This caused periodic loading on the Earth's crust." They also noticed that crustal compression rates on faults near the radon observation point had decreased, and this may have triggered the periodic change in radon concentration. Radioisotope facilities, which measure atmospheric radon, exist across Japan. Muto hopes that their research leads to an increase in radon monitoring across the globe. "We believe that further examination of seismological and geological conditions that differ from Japan will lead to a better understanding of the physical and chemical processes that cause radon concentration variations preceding earthquakes."
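One simple way to look for a tidal signature like the one described above is to inspect the spectrum of a radon time series for power near known tidal periods. The sketch below does this on a synthetic hourly series with an injected 12.42-hour (M2 tide) component; it illustrates the principle only and is not the authors' analysis.

# Spot a tidal periodicity in a (synthetic) radon time series via the FFT.
import numpy as np

hours = np.arange(0, 24 * 90, 1.0)           # 90 days of hourly samples
rng = np.random.default_rng(3)
radon = 10 + 0.5 * np.sin(2 * np.pi * hours / 12.42) + 0.3 * rng.standard_normal(hours.size)

spectrum = np.abs(np.fft.rfft(radon - radon.mean()))
freqs = np.fft.rfftfreq(hours.size, d=1.0)   # cycles per hour
periods = 1.0 / freqs[1:]                    # skip the zero-frequency bin
peak_period = periods[np.argmax(spectrum[1:])]
print(f"dominant period ~ {peak_period:.2f} hours (the M2 tide is 12.42 h)")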
Earthquakes
2021
March 2, 2021
https://www.sciencedaily.com/releases/2021/03/210302150036.htm
Unusual earthquakes highlight central Utah volcanoes
If you drive south through central Utah on Interstate 15 and look west somewhere around Fillmore, you'll see smooth hills and fields of black rock. The area is, aptly, named the Black Rock Desert. It may not look like much, but you're looking at some of Utah's volcanoes.
A pair of earthquake sequences, in September 2018 and April 2019, focused scientists' attention on the Black Rock Desert. The sequences, which included the main quakes and their aftershocks, were very different from the Magna earthquake that shook the Wasatch Front in 2020 and other Utah earthquakes. The Black Rock sequences were captured by the Utah Regional Seismic Network and by a nearby temporary seismic equipment deployment that was monitoring a geothermal well. Earthquakes in the Black Rock Desert are rare, and capturing the seismic recordings from these earthquakes provides a glimpse into the volcanic system of the Black Rock Desert that, while not showing any signs of erupting, is still active. A study of the earthquake sequences has now been published. "The results showed us that we should give more attention to the Black Rock area," says Maria Mesimeri, a postdoctoral research associate with the University of Utah Seismograph Stations. "We need to improve seismic and volcanic monitoring in this area, so that we are aware of small changes that may occur." The earthquake sequences, with main shocks of magnitude 4.0 and 4.1 respectively, were picked up by both the Utah Regional Seismic Network and a dense temporary network of seismometers deployed as part of Utah FORGE, an experimental geothermal project funded by the U.S. Department of Energy and operated by the University of Utah, located about 19 miles south of the Black Rock Desert near Milford, Utah. The temporary network allowed researchers to detect more aftershocks than usual. For example, the regional network detected 19 earthquakes as part of the April 2019 sequence. But the dense temporary network detected an additional 35 quakes. Each additional aftershock provided a bit more information for seismologists studying the sequence. The Black Rock sequences showed some interesting features that set them apart from the 2020 Magna sequence and other Utah earthquake sequences. While the initial Magna quake occurred at a depth of about six miles below the surface, a typical depth for Utah earthquakes, the Black Rock quakes were much shallower -- around 1.5 miles below the surface. "Because these earthquakes were so shallow," Mesimeri says, "we could measure surface deformation [due to the quakes] using satellites, which is very unusual for earthquakes this small." Also, Mesimeri and her colleagues found, the quakes produced much lower-frequency seismic energy than usually seen in Utah quakes. And one of the main types of seismic waves, shear waves or S-waves, wasn't detected in the Black Rock sequences. All of these signs point to the Black Rock sequences having a very different origin than the Magna sequence, which was generated by movement of the Wasatch Fault. The Black Rock quakes, on the other hand, may have been generated by ongoing activity in the Black Rock volcanic field. What are volcanoes doing in the middle of Utah? The Wasatch Mountains (and Wasatch Fault) form the eastern margin of a region called the Basin and Range province that stretches west to the Sierra Nevada. The province is being stretched apart by plate tectonics, and that stretching thins the crust, allowing more heat to rise up from the Earth's interior.
In the Black Rock area, that heat resulted in the eruption of basalt lava up until around 9,000 to 12,000 years ago. So what do these earthquake sequences mean for the volcanoes of the Black Rock Desert? "Our findings suggest that the system is still active and that the earthquakes were probably the result of fluid-related movement in the general area," Mesimeri says, referring to what could be magma or heated water. "The earthquakes could be the result of the fluid squeezing through rock or the result of deformation from fluid movement that stressed the surface faults." Activity in a volcanic field does not mean eruption, and Mesimeri says that there's no evidence that any eruption is imminent in the Black Rock Desert. But, she says, it's an area that geoscientists may want to monitor a little more closely.
Earthquakes
2021
March 2, 2021
https://www.sciencedaily.com/releases/2021/03/210302075346.htm
Lead up to volcanic eruption in Galapagos captured in rare detail
Hours before the 2018 eruption of Sierra Negra, the Galápagos Islands' largest volcano, an earthquake rumbled and raised the ground more than 6 feet in an instant. The event, which triggered the eruption, was captured in rare detail by an international team of scientists, who said it offers new insights into one of the world's most active volcanoes.
"The power of this study is that it's one of the first times we've been able to see a full eruptive cycle in this detail at almost any volcano," said Peter La Femina, associate professor of geosciences at Penn State. "We've monitored Sierra Negra from when it last erupted in 2005 through the 2018 eruption and beyond, and we have this beautiful record that's a rarity in itself."For nearly two months in 2018, lava erupted from the volcano, covering about 19 square miles of Isabela Island, the largest island in the Galápagos and home to about 2,000 people and endangered animal species like the Galápagos giant tortoise."The 2018 eruption of Sierra Negra was a really spectacular volcanic event, occurring in the 'living laboratory' of the Galápagos Islands," said Andrew Bell, a volcanologist at the University of Edinburgh. "Great teamwork, and a bit of luck, allowed us to capture this unique dataset that provide us with important new understanding as to how these volcanoes behave, and how we might be able to better forecast future eruptions."While Sierra Negra is among the world's most active volcanos, its remote location previously made monitoring difficult. Scientists now use networks of ground-based seismic and GPS monitoring stations and satellite observations to observe the volcano."Based on constant monitoring of activity of Galapagos volcanoes, we detected a dramatic increase of seismicity and a steady uplift of crater floor at Sierra Negra," said Mario Ruiz, director of the Ecuador Geophysical Institute, the country's national monitoring agency. "Soon we contacted colleagues from the United Kingdom, United States and Ireland and proposed them to work together to investigate the mechanisms leading to an impending eruption of this volcano. This research is an example of international collaboration and partnership."The scientists captured data over 13 years as the volcano's magma chamber gradually refilled following the 2005 eruption, stressing the surrounding crust and creating earthquakes. This continued until June 2018, when an earthquake occurred on the calderas fault system and triggered the subsequent eruption, the scientists said."We have this story of magma coming in and stressing the system to the point of failure and the whole system draining again through the eruption of lava flows," La Femina said. "This is the first time anyone's seen that in the Galápagos to this detail. This is the first time we've had the data to say, 'okay, this is what happened here.'"Often during volcanic eruptions, as magma chambers empty the ground above them sinks and forms a bowl-like depression, or a caldera. But Sierra Negra experienced a caldera resurgence, leaving this area higher in elevation than it was before the eruption, the scientists said.Inside the Sierra Negra caldera is a "trap-door fault," which is hinged at one end while the other can be uplifted by rising magma. The scientists found the fault caused hills inside of the six-mile-wide caldera to lift vertically by more than 6 feet during the earthquake that triggered the eruption.Caldera resurgence, important to better understanding eruptions, had not been previously observed in such detail, the scientists reported in the journal "Resurgence is typical of explosive calderas at volcanoes like Yellowstone, not the kind of shield volcanoes we see in the Galápagos or Hawaii," La Femina said. 
"This gives us the ability to look at other volcanoes in the Galápagos and say, 'well that's what could have happened to form that caldera or that resurgent ridge.'"The scientists said the findings could help their counterparts in Ecuador better track unrest and warn of future eruptions."There are people who live on Isabella Island, so studying and understanding how these eruptions occur is important to manage the hazards and risks to local populations," La Femina said.Scientists from the Dublin Institute for Advanced Studies, Trinity College Dublin, the University of Cambridge, the University of Miami, Tulane University, NASA Goddard Space Flight Center and NASA Jet Propulsion Laboratory also contributed to the study.The National Science Foundation, NASA and the Nature Environment Research Council funded this research.
Earthquakes
2021
February 25, 2021
https://www.sciencedaily.com/releases/2021/02/210225143819.htm
Post-wildfire landslides becoming more frequent in southern California
Southern California can now expect to see post-wildfire landslides occurring almost every year, with major events expected roughly every ten years, a new study finds. The results show Californians are now facing a double whammy of increased wildfire and landslide risk caused by climate change-induced shifts in the state's wet and dry seasons, according to researchers who mapped landslide vulnerability in the southern half of the state.
"This is our attempt to get people thinking about where these hazards are going to be before there's even a fire," said Jason Kean, a hydrologist at the U.S. Geological Survey in Denver and lead author of the new study in Wildfires make the landscape more susceptible to landslides when rainstorms pass through, as the water liquefies unstable, dry soil and burned vegetation. Geologists routinely conduct landslide hazard assessments after wildfires occur, but there is often not enough time between a fire and a rainstorm to implement an effective emergency response plan, Kean said.In the new study, Kean and his colleague combined historical fire, rainfall and landslide data with computer simulations to forecast where post-wildfire landslides are likely to occur in southern California, how big those landslides might be and how often they can be expected to happen. Their goal was to map which regions of the state are most vulnerable to landslides before they happen, in a manner similar to how geologists map earthquake hazards.Their results show small landslides can now be expected to occur almost every year in southern California. Major landslides capable of damaging 40 or more structures can be expected every 10 to 13 years -- about as frequently as magnitude 6.7 earthquakes occur in California, according to the study. The results also suggest more intense rainfall, which is likely to happen in the coming decades, could make landslides much more frequent.Combined with recent research showing California's wildfire season is getting longer and the rainy season is getting shorter and more intense, the new findings suggest Californians face a higher risk of wildfires and post-wildfire landslides that can damage property and endanger people's lives."We're going to have a longer season to burn and then when it does rain, it's going to come down harder. And that's a bad recipe for these post-fire debris flows," Kean said. "The reason you can expect one just about every year is because it doesn't take very much rain to cause one. The rainstorms that can trigger debris flows -- they're kind of garden-variety storms."California's central coast has already seen a significant landslide this year. A portion of Highway 1 near Big Sur was washed out in a landslide in late January after a severe rainstorm. Kean hopes the new study's results can help emergency managers plan out evacuation zones for landslides before they happen."We'll still always do hazard assessments after fires because we really want to know the details of the actual fire, but these wildfires scenarios and storm scenarios are useful because we can start looking ahead and have the luxury of time to make a better plan," he said.
Earthquakes
2021
February 16, 2021
https://www.sciencedaily.com/releases/2021/02/210216185904.htm
Slow motion precursors give earthquakes the fast slip
At a glacier near the South Pole, earth scientists have found evidence of a quiet, slow-motion fault slip that triggers strong, fast-slip earthquakes many miles away, according to Cornell University research.
During an earthquake, a fast slip happens when energy builds up underground and is released quickly along a fault. Blocks of earth rapidly slide against one another.However, at an Antarctic glacier called Whillans Ice Plain, the earth scientists show that "slow slips" precede dozens of large magnitude 7 earthquakes. "We found that there is almost always a precursory 'slow slip' before an earthquake," said lead author Grace Barcheck, research associate in Earth and Atmospheric Sciences at Cornell University.Barcheck said that these slow-slip precursors -- occurring as far as 20 miles away from the epicenter -- are directly involved in starting the earthquake. "These slow slips are remarkably common," she said, "and they migrate toward where the fast earthquake slip starts."Observations before several large tsunami-generating magnitude 8 and 9 earthquakes on subduction zone faults suggest a similar process may have occurred, according to Patrick Fulton, assistant professor and Croll Sesquicentennial Fellow in the Department of Earth and Atmospheric Sciences.As these faults are mostly offshore and deep underwater, and because it is difficult to know when or where a large earthquake will occur, the start of large earthquakes is generally hard to observe.To overcome these challenges, the scientists placed GPS sensors above an icy glacial fault at Whillans Ice Plain, where large magnitude 7 earthquakes occur nearly twice a day over a 60-mile-wide area of the glacier.Within a period of two months in 2014, the group captured 75 earthquakes at the bottom of the Antarctic glacier. Data from GPS stations indicated that 73 -- or 96% -- of the 75 earthquakes showed a period of precursory slow motion.The data from the GPS tracking stations and surface seismometers allowed the team to identify how the slow precursory slip triggers the fast earthquake slip."Our group was a little surprised to see so many precursors," Barcheck said."In some cases, we can actually see the migration of the earthquake precursor towards where the earthquake begins.""Before we pored over the data, I thought that if we saw any precursors before the earthquakes, they would be rare and in the same place as the earthquake epicenter," she said. "Instead, we found many slow-slip precursors -- starting miles from the epicenters and migrating across the fault."
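A toy version of flagging a precursory slow slip in a GPS displacement record is sketched below: the slip rate over a short trailing window is compared against a threshold. The time series, window length, and threshold are synthetic assumptions for illustration and do not reproduce the study's GPS analysis.

# Detect a slow-slip rate increase in a synthetic GPS displacement record.
import numpy as np

minutes = np.arange(0, 600)                   # 10 hours of 1-minute samples
rng = np.random.default_rng(4)
disp = 0.001 * minutes + 0.5 * rng.standard_normal(minutes.size)  # mm: background creep + noise
disp[500:540] += np.linspace(0, 20, 40)       # slow precursory slip starting at t = 500 min
disp[540:] += 140                             # fast earthquake slip at t = 540 min

window = 30                                   # minutes
threshold = 0.2                               # mm/min, chosen well above background and noise
for i in range(window, minutes.size):
    rate = (disp[i] - disp[i - window]) / window
    if rate > threshold:
        print(f"slow-slip alert at t = {i} min (rate ~ {rate:.2f} mm/min)")
        break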
Earthquakes
2021
February 16, 2021
https://www.sciencedaily.com/releases/2021/02/210216133402.htm
Past earthquakes triggered large rockslides in the Eastern Alps
Geologists shed new light on a long-lasting debate about the trigger mechanism of large rockslides. Lake mud in two Alpine lakes in Tyrol reveals that rare strong earthquakes are the final cause of multiple, prehistoric rockslides in the Eastern Alps. The steep rock slopes were degraded by a series of prehistoric earthquakes, larger than any of the historically documented events in the region over the past ~1000 years.
Many steep valleys in the European Alps show the relicts of large rockslides, during which several hundred million cubic metres of rock become unstable, collapse and impact everything in their path. "For most of these, we still do not know how they are caused, because these rockslides occurred long before the start of written history in the region about 1000 years ago," says Patrick Oswald, PhD student at the Department of Geology of the University of Innsbruck and lead author of the study. "Curiously, many of these ancient rockslides occurred together in clusters, meaning they are found in small regions and have a rather comparable age." This enigmatic pattern has puzzled researchers over the last decades and fuelled some intense debates. Some experts propose that abrupt climate shifts can degrade rock slopes towards unstable conditions, whereas others think that strong earthquake shaking is the main driver. Such problems are notoriously difficult to solve, as the object of study -- the rock slope -- has collapsed and cannot be investigated anymore. Therefore, the research team decided to turn the perspective around and searched underwater for answers to these questions. "The different sediment layers that are deposited year after year on the bottom of our lakes and oceans provide long-term information on climatic and ecological conditions, but also record the disturbances induced by strong earthquake shaking that happened long ago, in the prehistoric past," says Michael Strasser, head of the Sedimentary Geology working group at the Department of Geology and the Austrian Core Facility for scientific core analysis at the University of Innsbruck. The geologists focused on two of the most massive rockslides of Tyrol, the Tschirgant and the Fernpass. "Instead of investigating the remnants of these rockslides in the landscape, we drilled into the muddy sedimentary archives on the bottom of the lakes Piburgersee and Plansee in the region and searched for specific traces that could tell us when strong earthquakes took place," explains Jasper Moernaut from the Department of Geology, head of the project this study is based on, "Tyrol on Shaky Slopes." "By comparing the earthquake and rockslide reconstructions for the past 10,000 years, we can evaluate whether these relate to each other, or not." By applying state-of-the-art techniques, such as hydroacoustic profiling of the lake's subsurface or computer-tomography scans of the 8 m long sediment cores, the researchers found two different types of earthquake traces in the sediments: Strong seismic shaking has deformed surficial sediments on the bottom of the lakes and also triggered numerous underwater mud avalanches. By radiocarbon dating organic matter in the cores, the researchers identified ten prehistoric earthquakes during the past 10,000 years. The ground shaking associated with these earthquakes was stronger than for those that have struck the region in the past ~1000 years. "Through a painstaking evaluation of historical earthquake reports and comparison with the sedimentary imprints in the lakes, we estimated the earthquake magnitudes to be M5.5 to 6.5," says Christa Hammerl, a historical seismologist at the Austrian Central Institute for Meteorology and Geodynamics.
"As the earthquakes in the Eastern Alps occur at only a few kilometres depth, such earthquakes can produce considerable damage on infrastructure and the natural landscape."Strikingly, the ages of two extraordinary strong earthquakes coincide very well with those of multiple large rockslides around 3,000 and 4,100 years ago. This age coincidence let the geologists deduce that extreme seismic shaking ultimately triggered the rockslides at these times, solving the debate on the cause of rockslide clusters in the region. Since then, no large rockslides or such extraordinary strong earthquakes struck the area. The results also indicate that a close succession of at least five severe earthquakes preceded the rock slope collapses at about 3,000 years ago. "Therefore, we propose that seismic shaking cannot only trigger rockslides, but can also gradually degrade the rock slopes towards their critical tipping point," explains Michael Strasser. "With all this new information, the challenge now lies in its implementation for better assessing future earthquake and rockslide hazards in the densely populated Alpine valleys. An adequate mitigating of such low probability but high impact events forms a big challenge. Knowledge about these past events can help better understand earthquake occurrence and provide key information for assessing future earthquake- and rockslide hazard."
Earthquakes
2021
February 16, 2021
https://www.sciencedaily.com/releases/2021/02/210216114917.htm
A groundbreaking solution? Polymers can protect buildings from large fault ruptures
Surface rupturing during earthquakes is a significant risk to any structure that is built across a fault zone that may be active, in addition to any risk from ground shaking. Surface rupture can affect large areas of land, and it can damage all structures in the vicinity of the fracture. Although current seismic codes restrict construction in the vicinity of active tectonic faults, finding the exact location of a fault outcrop is often difficult.
In many regions around the world, engineering structures such as earth dams, buildings, pipelines, landfills, bridges, roads and railroads have been built in areas very close to active fault segments. Strike-slip fault rupture occurs when the rock masses slip past each other parallel to the strike. A team of researchers led by Associate Professor Behzad Fatahi and supported by PhD candidate Habib Rasouli in the School of Civil and Environmental Engineering at the University of Technology Sydney (UTS) has recently found a groundbreaking solution to protect buildings sitting on deep foundations subjected to large ground deformations due to strike-slip fault rupture. "The strike-slip fault rupture can significantly damage structures such as buildings and infrastructure such as bridges," Associate Professor Fatahi said. "The unacceptable performance of conventional deep foundations under strike-slip fault rupture is due to a high level of shear forces in the raft and the large deformation and bending moment in the piles supporting the structures." Associate Professor Fatahi and his team have proposed a new composite foundation system using inexpensive polymeric materials to protect structures sitting on deep foundations. "In this novel mitigation technique, the piles are disconnected from the building using an interposed layer of soil which is reinforced using geotextile layers," Associate Professor Fatahi said. "Geotextiles are polymeric materials made of polypropylene or polyethylene, which are manufactured in large sheets that can be easily transported to construction sites. The geotextiles embedded in the compacted sand and gravel act as an isolator and reduce the impact of large ground deformations due to fault rupture." Associate Professor Fatahi and his team have developed an advanced three-dimensional computer model to evaluate the performance of commonly used connected piles and the proposed composite foundation as a novel mitigation technique. Their findings were recently published in the official journal of the International Geosynthetics Society. "Considering an increasing world population and a need to construct more infrastructure such as bridges and buildings, this novel foundation system can significantly improve the safety of infrastructure and substantially decrease fatality and damage due to large ground deformations," Associate Professor Fatahi said. The team is now looking at extending the solution to protect structures affected by ground subsidence due to mining and tunnelling activities.
Earthquakes
2021
February 11, 2021
https://www.sciencedaily.com/releases/2021/02/210211171111.htm
The songs of fin whales offer new avenue for seismic studies of the oceanic crust
The songs of fin whales can be used for seismic imaging of the oceanic crust, providing scientists with a novel alternative to conventional surveying, according to a new study published this week.
Fin whale songs contain signals that are reflected and refracted within the crust, including the sediment and the solid rock layers beneath. These signals, recorded on seismometers on the ocean bottom, can be used to determine the thickness of the layers as well as other information relevant to seismic research, said John Nabelek, a professor in Oregon State University's College of Earth, Ocean, and Atmospheric Sciences and a co-author of the paper."People in the past have used whale calls to track whales and study whale behavior. We thought maybe we can study the Earth using those calls," Nabelek said. "What we discovered is that whale calls may serve as a complement to traditional passive seismic research methods."The paper serves as a proof of concept that could provide new avenues for using data from whale calls in research, Nabelek said."This expands the use of data that is already being collected," he said. "It shows these animal vocalizations are useful not just for understanding the animals, but also understanding their environment."The study's lead author is Vaclav M. Kuna, who worked on the project as a doctoral student at Oregon State and has since completed his Ph.D.Kuna and Nabelek were studying earthquakes from a network of 54 ocean-bottom seismometers placed along the Blanco transform fault, which at its closest is about 100 miles off Cape Blanco on the Oregon Coast.They noted strong signals on the seismometers that correlated with whales' presence in the area."After each whale call, if you look closely at the seismometer data, there is a response from the Earth," Nabelek said.Whale calls bounce between the ocean surface and the ocean bottom. Part of the energy from the calls transmits through the ground as a seismic wave. The wave travels through the oceanic crust, where it is reflected and refracted by the ocean sediment, the basalt layer underneath it and the gabbroic lower crust below that.When the waves are recorded at the seismometer, they can provide information that allows researchers to estimate and map the structure of the crust.Using a series of whale songs that were recorded by three seismometers, the researchers were able to pinpoint the whale's location and use the vibrations from the songs to create images of the Earth's crust layers.Researchers use information from these layers to learn more about the physics of earthquakes in the region, including how sediment behaves and the relationship between its thickness and velocity. Earthquakes shake up the sediment, expelling water and speeding up the settlement of the sediment.The current traditional method for imaging of the crust can be expensive and permits can be difficult to obtain because the work involves deploying air guns, Nabelek said. The imaging created using the whale songs is less invasive, though overall it is of lower resolution.Future research could include using machine learning to automate the process of identifying whale songs and developing images of their surroundings, Nabelek said."The data from the whale songs is useful but it doesn't completely replace the standard methods," he said. "This method is useful for investigating the Earth's oceanic crust where standard science survey methods are not available."
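The underlying geometry is simple: a reflection's two-way travel time, combined with an assumed wave speed in the layer, gives the layer's thickness. The numbers in the example below are illustrative, not values from the study.

# Convert a reflection's two-way travel time into layer thickness.
def layer_thickness_m(two_way_time_s: float, velocity_m_s: float) -> float:
    """Thickness = velocity * one-way travel time."""
    return velocity_m_s * two_way_time_s / 2.0

# e.g. a sediment reflection arriving 0.4 s after the direct arrival,
# with an assumed sediment P-wave speed of 1,700 m/s
print(layer_thickness_m(0.4, 1700.0))   # -> 340.0 m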
Earthquakes
2021
January 27, 2021
https://www.sciencedaily.com/releases/2021/01/210127152528.htm
New report charts path toward superior earthquake recovery
For the last century, seismic building codes and practices have primarily focused on saving lives by reducing the likelihood of significant damage or structural collapse. Recovery of critical functions provided by buildings and infrastructure has received less attention, however. As a result, many remain vulnerable to being knocked out of service by an earthquake for months, years or for good.
A committee of experts, formed by the National Institute of Standards and Technology (NIST) and the Federal Emergency Management Agency (FEMA) under the direction of Congress, has urged officials at all levels of government to support research and policies that could help get the buildings and services society depends on up and running quickly after an earthquake. In a report delivered to Congress, the committee outlines seven recommendations that, if acted upon, may greatly improve the resilience of communities across the nation."As structural engineers we feel confident that the current building codes can deliver life safety design objectives. Now, it's time to go beyond that and think about recovery of function," said Siamak Sattar, a NIST structural engineer and co-author of the report.In 2011, a magnitude 6.3 earthquake struck Christchurch, New Zealand. Over 180 lives were lost as a result, but many more were likely saved by modern building codes. However, the city's economy and quality of life were not spared.The quake damaged the city's central business district to the point that hundreds of buildings were closed or demolished, displacing thousands of workers. Lifeline infrastructure systems -- including power, clean water and roads -- sustained heavy damage, further crippling the community's ability to bounce back. In total, the estimated costs of rebuilding the city amounted to 40 billion New Zealand dollars ($26.6 billion).The toll taken by the Christchurch earthquake and other damaging events can in part be attributed to limitations in seismic codes and standards, as most offer little guidance on designing buildings or lifelines to recover in a timely manner in the wake of extreme events.To prevent major earthquakes from leaving such lasting impressions in the future, Congress entrusted NIST and FEMA -- both member agencies of the National Earthquake Hazards Reduction Program (NEHRP), which NIST leads -- with the responsibility of mapping a path to greater community resilience.Drawing expertise from both public and private sectors, NIST and FEMA assembled a committee of more than 30 engineers, architects, building owners, code officials and social scientists -- including several of their own researchers -- to devise options for addressing gaps in codes, standards and practices, which are described in their report to Congress.The first recommendation summarizes the core of the report. The authors call for members of the government, codes and standards organizations and industry to work together in developing a national framework for setting and achieving goals based on recovery time. To produce this framework, experts must first identify what level of function provided by buildings and lifelines should be maintained after an earthquake, and then determine an acceptable time for them to be out of commission."There are different metrics that we can use to help guide this framework. For example, a building may need to recover within a predefined number of days, weeks or months. If it is a hospital or emergency center then you may not want it to go down at all," said Steve McCabe, director of NEHRP.The authors also highlight the need for new recovery-based design criteria for buildings and lifelines. If developed with recovery in mind, these criteria could steer design parameters -- such as increasing a school's structural strength to limit damage or designing an electrical power supply to return to service faster -- toward improving community resilience. 
A critical phase of this process would be identifying the level of ground shaking that designs should be tailored to for recovery goals, which may vary by region. Other recommendations seek to help leaders meet recovery goals aligned with the first recommendation, offering guidance on implementing new design requirements for buildings and lifelines. They also provide direction for pre-disaster planning -- a key step in preparing authorities to make timely decisions in the immediate aftermath of a disaster. The authors seek to empower communities as well by recommending the launch of an education campaign on earthquake risk and recovery, which could reach the public through social media, streaming services or other media. "Informed citizens are an important resource needed to develop the kind of vision required for this effort, which may well represent the largest change in building codes in 75 years," McCabe said. In the report, the authors encourage officials to consider adopting functional recovery approaches that go beyond the current requirements. They assert that the initial investments of adopting new recovery-focused codes and upgrading older buildings and lifelines could likely be offset by the reduction of future losses. They also suggest that increased access to financial resources through mechanisms such as grant programs, incentive systems and public financing would help local governments scale the upfront costs. "The immediate aim of the report is to spark a national conversation about developing a consensus for recovery goals and timelines. This approach may eventually be reflected in building codes, but first, a considerable amount of research must be tackled," Sattar said. New policies could make use of the NEHRP agencies, such as NIST and FEMA, whose expertise may enable them to provide the necessary science for sound public policy. The road toward this goal could take years to traverse, but it is critical. In the meantime, the authors encourage early action by leaders at state and local levels, as each community may have needs that national guidelines cannot fully address. Their experiences with functional recovery planning and design could also make for valuable feedback at the national level, speeding up progress toward widespread earthquake resilience that preserves quality of life in addition to life itself.
Earthquakes
2021
January 27, 2021
https://www.sciencedaily.com/releases/2021/01/210127122426.htm
Geological phenomenon widening the Atlantic Ocean
An upsurge of matter from deep beneath the Earth's crust could be pushing the continents of North and South America further apart from Europe and Africa, new research has found.
The plates attached to the Americas are moving apart from those attached to Europe and Africa by four centimetres per year. In between these continents lies the Mid-Atlantic Ridge, a site where new plates are formed and a dividing line between plates moving to the west and those moving to the east; beneath this ridge, material rises to replace the space left by the plates as they move apart. Conventional wisdom is that this process is normally driven by distant gravity forces as denser parts of the plates sink back into the Earth. However, the driving force behind the separation of the Atlantic plates has remained a mystery because the Atlantic Ocean is not surrounded by dense, sinking plates. Now a team of seismologists, led by the University of Southampton, has found evidence of an upwelling in the mantle -- the material between the Earth's crust and its core -- from depths of more than 600 kilometres beneath the Mid-Atlantic Ridge, which could be pushing the plates from below, causing the continents to move further apart. Upwellings beneath ridges are typically thought to originate from much shallower depths of around 60 km. The findings have been published. Over two research cruises on the RV Langseth and RRV Discovery, the team deployed 39 seismometers at the bottom of the Atlantic as part of the PI-LAB (Passive Imaging of the Lithosphere-Asthenosphere Boundary) experiment and EURO-LAB (Experiment to Unearth the Rheological Oceanic Lithosphere-Asthenosphere Boundary). The data provides the first large-scale and high-resolution imaging of the mantle beneath the Mid-Atlantic Ridge. This is one of only a few experiments of this scale ever conducted in the oceans and allowed the team to image variations in the structure of the Earth's mantle near depths of 410 km and 660 km -- depths that are associated with abrupt changes in mineral phases. The observed signal was indicative of a deep, sluggish and unexpected upwelling from the deeper mantle. Lead author Matthew Agius, a former post-doctoral fellow at the University of Southampton and currently at Università degli studi Roma Tre, said: "This was a memorable mission that took us a total of 10 weeks at sea in the middle of the Atlantic Ocean. The incredible results shed new light on our understanding of how the Earth's interior is connected with plate tectonics, with observations not seen before." Dr Kate Rychert and Dr Nick Harmon from the University of Southampton and Professor Mike Kendall from the University of Oxford led the experiment and were the chief scientists on the cruises. The experiment was funded by NERC (Natural Environment Research Council, UK) and the ERC (European Research Council). Dr Harmon said: "There is a growing distance between North America and Europe, and it is not driven by political or philosophical differences -- it is caused by mantle convection!" As well as helping scientists to develop better models and warning systems for natural disasters, plate tectonics also has an impact on sea levels, and therefore affects climate change estimates over geologic time scales. Dr Rychert said: "This was completely unexpected. It has broad implications for our understanding of Earth's evolution and habitability. It also demonstrates how crucial it is to gather new data from the oceans. There is so much more to explore!" Professor Mike Kendall added: "This work is exciting in that it refutes long-held assumptions that mid-ocean ridges might play a passive role in plate tectonics.
It suggests that in places such as the Mid-Atlantic, forces at the ridge play an important role in driving newly-formed plates apart."
Earthquakes
2021
January 25, 2021
https://www.sciencedaily.com/releases/2021/01/210125144550.htm
Simulating 800,000 years of California earthquake history to pinpoint risks
Massive earthquakes are, fortunately, rare events. But that scarcity of information blinds us in some ways to their risks, especially when it comes to determining the risk for a specific location or structure.
"We haven't observed most of the possible events that could cause large damage," explained Kevin Milner, a computer scientist and seismology researcher at the Southern California Earthquake Center (SCEC) at the University of Southern California. "Using Southern California as an example, we haven't had a truly big earthquake since 1857 -- that was the last time the southern San Andreas broke into a massive magnitude 7.9 earthquake. A San Andreas earthquake could impact a much larger area than the 1994 Northridge earthquake, and other large earthquakes can occur too. That's what we're worried about."The traditional way of getting around this lack of data involves digging trenches to learn more about past ruptures, collating information from lots of earthquakes all around the world and creating a statistical model of hazard, or using supercomputers to simulate a specific earthquake in a specific place with a high degree of fidelity.However, a new framework for predicting the likelihood and impact of earthquakes over an entire region, developed by a team of researchers associated with SCEC over the past decade, has found a middle ground and perhaps a better way to ascertain risk.A new study led by Milner and Bruce Shaw of Columbia University, published in the According to the developers, the new approach improves the ability to pinpoint how big an earthquake might occur in a given location, allowing building code developers, architects, and structural engineers to design more resilient buildings that can survive earthquakes at a specific site."For the first time, we have a whole pipeline from start to finish where earthquake occurrence and ground-motion simulation are physics-based," Milner said. "It can simulate up to 100,000s of years on a really complicated fault system."RSQSim transforms mathematical representations of the geophysical forces at play in earthquakes -- the standard model of how ruptures nucleate and propagate -- into algorithms, and then solves them on some of the most powerful supercomputers on the planet. The computationally-intensive research was enabled over several years by government-sponsored supercomputers at the Texas Advanced Computing Center, including Frontera -- the most powerful system at any university in the world -- Blue Waters at the National Center for Supercomputing Applications, and Summit at the Oak Ridge Leadership Computing Facility."One way we might be able to do better in predicting risk is through physics-based modeling, by harnessing the power of systems like Frontera to run simulations," said Milner. "Instead of an empirical statistical distribution, we simulate the occurrence of earthquakes and the propagation of its waves.""We've made a lot of progress on Frontera in determining what kind of earthquakes we can expect, on which fault, and how often," said Christine Goulet, Executive Director for Applied Science at SCEC, also involved in the work. "We don't prescribe or tell the code when the earthquakes are going to happen. We launch a simulation of hundreds of thousands of years, and just let the code transfer the stress from one fault to another."The simulations began with the geological topography of California and simulated over 800,000 virtual years how stresses form and dissipate as tectonic forces act on the Earth. From these simulations, the framework generated a catalogue -- a record that an earthquake occurred at a certain place with a certain magnitude and attributes at a given time. 
The catalog that the SCEC team produced on Frontera and Blue Waters was among the largest ever made, Goulet said. The outputs of RSQSim were then fed into CyberShake, which again used computer models of geophysics to predict how much shaking (in terms of ground acceleration, or velocity, and duration) would occur as a result of each quake. "The framework outputs a full slip-time history: where a rupture occurs and how it grew," Milner explained. "We found it produces realistic ground motions, which tells us that the physics implemented in the model is working as intended." They have more work planned for validation of the results, which is critical before acceptance for design applications. The researchers found that the RSQSim framework produces rich, variable earthquakes overall -- a sign it is producing reasonable results -- while also generating repeatable source and path effects. "For lots of sites, the shaking hazard goes down, relative to state-of-practice estimates," Milner said. "But for a couple of sites that have special configurations of nearby faults or local geological features, like near San Bernardino, the hazard went up. We are working to better understand these results and to define approaches to verify them." The work is helping to determine the probability of an earthquake occurring along any of California's hundreds of earthquake-producing faults, the scale of earthquake that could be expected, and how it may trigger other quakes. Support for the project comes from the U.S. Geological Survey (USGS), National Science Foundation (NSF), and the W.M. Keck Foundation. Frontera is NSF's leadership-class national resource. Compute time on Frontera was provided through a Large-Scale Community Partnership (LSCP) award to SCEC that allows hundreds of U.S. scholars access to the machine to study many aspects of earthquake science. LSCP awards provide extended allocations of up to three years to support long-lived research efforts. SCEC -- which was founded in 1991 and has computed on TACC systems for over a decade -- is a premier example of such an effort. The creation of the catalog required eight days of continuous computing on Frontera and used more than 3,500 processors in parallel. Simulating the ground shaking at 10 sites across California required a comparable amount of computing on Summit, the second fastest supercomputer in the world. "Adoption by the broader community will be understandably slow," said Milner. "Because such results will impact safety, it is part of our due diligence to make sure these results are technically defensible by the broader community," added Goulet. But research results such as these are important in order to move beyond generalized building codes that in some cases may be inadequately representing the risk a region faces while in other cases being too conservative. "The hope is that these types of models will help us better characterize seismic hazard so we're spending our resources to build strong, safe, resilient buildings where they are needed the most," Milner said.
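Conceptually, turning a long synthetic catalog into site hazard numbers amounts to counting how often a ground-motion level is exceeded and dividing by the catalog length. The sketch below does this with random stand-in data rather than RSQSim/CyberShake output; the event count and ground-motion distribution are assumptions made for illustration.

# Exceedance rates and return periods from a synthetic ground-motion catalog.
import numpy as np

rng = np.random.default_rng(5)
catalog_years = 800_000
n_events = 20_000                                   # events shaking the site, illustrative
pga_g = rng.lognormal(mean=np.log(0.05), sigma=1.0, size=n_events)  # peak ground accel. (g)

for level in (0.1, 0.2, 0.4):
    annual_rate = np.sum(pga_g >= level) / catalog_years
    return_period = np.inf if annual_rate == 0 else 1.0 / annual_rate
    print(f"PGA >= {level:.1f} g: rate = {annual_rate:.2e}/yr, "
          f"return period ~ {return_period:,.0f} yr")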
Earthquakes
2021
January 25, 2021
https://www.sciencedaily.com/releases/2021/01/210125112305.htm
GEFS: Searching beyond seismology for earthquake precursors
To predict when earthquakes are likely to occur, seismologists often use statistics to monitor how clusters of seismic activity evolve over time. However, this approach often fails to anticipate the time and magnitude of large-scale earthquakes, leading to dangerous oversights in current early-warning systems. For decades, studies outside the seismology field have proposed that these major, potentially devastating seismic events are connected to a range of non-seismic phenomena -- which can be observed days or even weeks before these large earthquakes occur. So far, however, this idea hasn't caught on in the wider scientific community. In this special issue, researchers make the case for the Global Earthquake Forecasting System (GEFS), an initiative to search for such precursors beyond traditional seismology.
By promoting the integration of these ideas with existing theories in seismology, GEFS could lead to significant improvements in earthquake early warning systems, potentially saving lives and protecting critical infrastructure when future disasters hit. The initiative is rationalised via a subtle atomic-level, defect-based mechanism for explaining a variety of earthquake precursors, building on decades of laboratory experiments in physical chemistry and solid-state physics. The theory suggests that, as stresses build up in tectonic plates prior to seismic activity, electron-hole pairs are generated in the Earth's crust. The electrons are confined to the stressed rocks, but the positively charged holes flow out into the surrounding, less stressed rocks, producing electrical currents that can travel over large distances. These currents in turn can trigger wide-ranging secondary effects, from unusual low-frequency and ultralow-frequency electromagnetic radiation, to emissions of spectroscopically distinct thermal infrared from the Earth's surface, to changes in the atmosphere and ionosphere.

This special issue documents the findings of researchers around the world who have used both ground- and space-based observations to link these non-seismic patterns to the occurrence of subsequent large earthquakes. The work creates a strong rationale for global efforts to continually monitor the Earth for key signs of these precursors, which are often intermittent and weak. If its aims are realised, GEFS could be the first step towards a widespread collaboration between different scientific communities, each with the shared goal of improving our ability to forecast large earthquakes in the future.
Earthquakes
2021
January 21, 2021
https://www.sciencedaily.com/releases/2021/01/210121132222.htm
When it comes to eyewitness accounts of earthquake shaking, representation matters
As scientists increasingly rely on eyewitness accounts of earthquake shaking reported through online systems, they should consider whether those accounts are societally and spatially representative for an event, according to a new paper published in
Socioeconomic factors can play a significant if complex role in limiting who uses systems such as the U.S. Geological Survey's "Did You Feel It?" (DYFI) to report earthquake shaking. In California, for instance, researchers concluded that DYFI appears to gather data across a wide socioeconomic range, albeit with some intriguing differences related to neighborhood income levels during earthquakes such as the 1989 Loma Prieta, the 1994 Northridge and 2019 Ridgecrest earthquakes.

In India, by contrast, stark gaps in literacy and between urban and rural communities can lead to gaps in self-reported earthquake accounts through DYFI, write Susan Hough of the USGS and Stacey Martin of Australian National University.

Previous studies have looked at the reasons why people respond to DYFI, including a 2016 publication by Sum Mak and Danijel Schorlemmer. But socioeconomic differences in who reports earthquake shaking "is a factor we haven't thought enough about, even though it is shaping the data sets that are available, especially outside of the United States," said Hough.

Intensity data gleaned from DYFI are used to develop ShakeMap representations of ground motion in places with sparse instrumentation. ShakeMap in turn informs the Prompt Assessment of Global Earthquakes for Response (PAGER) system that provides crucial rapid information for earthquake response.

"The end result is that we are relying on unrepresentative [DYFI] data to flesh out ShakeMaps for large global earthquakes," Hough noted. "If the data are limited and unrepresentative, PAGER may not give emergency managers a good indication of where to direct their resources."

"I know many who take the DYFI observations from outside the United States at face value without any scrutiny and make the incorrect assumption that that's all there is to the story," Martin added. "As we've shown in this study, that would be a really inappropriate assumption."

Representation can also come into play when scientists rely on archival accounts to study historic earthquakes. Hough described the potential impact of unrepresentative earthquake reports in an earlier study when she and her colleague Morgan Page found a letter published in an Arkansas newspaper that helped to re-locate an 1882 earthquake within the Choctaw Nation in southeastern Oklahoma. The single chance account has helped seismologists better understand historical seismicity in Oklahoma, but there are still many "unknown unknowns" about earthquakes in the region during and after the 1882 event because Native American accounts are unavailable, Hough said.

When Hough and Martin compared DYFI responses with ZIP code average household income for the three California earthquakes, the researchers uncovered some complex and intriguing trends. For the Northridge earthquake, for instance, relatively affluent areas were more likely to contribute strong shaking reports, and strong shaking levels from poorer areas may be underrepresented in the DYFI data.

The researchers found that in India, DYFI reports skewed heavily toward urban individuals and depended strongly on a region's literacy rates. In some cases, the difference between DYFI self-reports and accounts gathered through traditional means such as local press accounts was significant. For the 2015 Gorkha earthquake, for instance, 74% of DYFI responses were from urban areas, while only 34% of traditional accounts were from urban centers.

"Being Indian, I know firsthand that there are disparities on numerous fronts in my country," said Martin.
"Nonetheless the stark contrast in urban and rural DYFI reports from India for the three earthquakes that were analyzed for this study was still surprising to me. I did not anticipate that the social disparities would show up in something as seemingly far removed as earthquake felt reports."Further development of online systems will potentially make them more inclusive; for example, including online surveys in multiple languages, and designing easy-to-use apps. It also remains important, the researchers said, to survey earthquake effects using media reports, which the study showed tend to be more inclusive in India.Hough noted that the geoscience community is grappling with how underrepresentation affects its workforce, but studies like this show how underrepresentation "is actually an issue for science itself.""You can connect the dots, I think," she said. "If you don't have a diverse community of scientists, you don't have people who are asking the right questions."
Earthquakes
2021
December 23, 2020
https://www.sciencedaily.com/releases/2020/12/201223142433.htm
Evidence for a massive paleo-tsunami at ancient Tel Dor
Underwater excavation, borehole drilling, and modelling suggests a massive paleo-tsunami struck near the ancient settlement of Tel Dor between 9,910 to 9,290 years ago, according to a study published December 23, 2020 in the open-access journal
Tsunamis are a relatively common event along the eastern Mediterranean coastline, with historical records and geographic data showing one tsunami occurring per century for the last six thousand years. The record for earlier tsunami events, however, is less defined. In this study, Shtienberg and colleagues describe a large early Holocene tsunami deposit (dating to between 9,910 and 9,290 years ago) in coastal sediments at Tel Dor in northwest Israel, a maritime city-mound occupied from the Middle Bronze II period (2000-1550 BCE) through the Crusader period.

To conduct their analysis, the authors used photogrammetric remote sensing techniques to create a digital model of the Tel Dor site, combined with underwater excavation and terrestrial borehole drilling to a depth of nine meters.

Along the coast of the study area, the authors found an abrupt marine shell and sand layer, age-constrained to 9,910 to 9,290 years ago, in the middle of a large ancient wetland layer spanning from 15,000 to 7,800 years ago. The authors estimate the wave capable of depositing seashells and sand in the middle of what was at the time fresh to brackish wetland must have travelled 1.5 to 3.5 km inland, with a coastal wave height of 16 to 40 m. For comparison, previously documented tsunami events in the eastern Mediterranean have travelled inland only around 300 m -- suggesting the tsunami at Dor was generated by a far stronger mechanism. Local tsunamis tend to arise due to earthquakes in the Dead Sea Fault system and submarine landslides; the authors note that an earthquake contemporary with the Dor paleo-tsunami (dating to around 10,000 years ago) has already been identified from cave damage in the nearby Carmel ridge, suggesting this specific earthquake could have triggered an underwater landslide that caused the massive tsunami at Dor.

This paleo-tsunami would have occurred during the Early to Middle Pre-Pottery Neolithic B cultural period of the region (10,700-9,250 years ago; 11,700-10,500 cal BP), and potentially wiped out evidence of previous Natufian (12,500-12,000 years ago) and Pre-Pottery Neolithic coastal villages (previous surveys and excavations show a near absence of low-lying coastal villages in this region). The re-appearance of abundant Late Neolithic archaeological sites (ca. 6,000 BCE) along the coast in the years after the Dor tsunami coincides with the resumption of wetland deposition in the Dor core samples and indicates resettlement followed the event -- highlighting residents' resilience in the face of massive disruption.

According to Gilad Shtienberg, a postdoc at the Scripps Center for Marine Archaeology at UC San Diego who is studying the sediment cores, "Our project focuses on reconstructing ancient climate and environmental change over the past 12,000 years along the Israeli coast; and we never dreamed of finding evidence of a prehistoric tsunami in Israel. Scholars know that at the beginning of the Neolithic, around 10,000 years ago, the seashore was 4 kilometers from where it is today. When we cut the cores open in San Diego and started seeing a marine shell layer embedded in the dry Neolithic landscape, we knew we hit the jackpot."
Earthquakes
2020
December 21, 2020
https://www.sciencedaily.com/releases/2020/12/201221173131.htm
Deep, slow-slip action may direct largest earthquakes and their tsunamis
Megathrust earthquakes and subsequent tsunamis that originate in subduction zones like Cascadia -- Vancouver Island, Canada, to northern California -- are some of the most severe natural disasters in the world. Now a team of geoscientists thinks the key to understanding some of these destructive events may lie in the deep, gradual slow-slip behaviors beneath the subduction zones. This information might help in planning for future earthquakes in the area.
"What we found was pretty unexpected," said Kirsty A. McKenzie, doctoral candidate in geoscience, Penn State.Unlike the bigger, shallower megathrust earthquakes that move and put out energy in the same direction as the plates move, the slow-slip earthquakes' energy may move in other directions, primarily down.Subduction zones occur when two of the Earth's plates meet and one moves beneath the other. This typically creates a fault line and some distance away, a line of volcanoes. Cascadia is typical in that the tectonic plates meet near the Pacific coast and the Cascade Mountains, a volcanic range containing Mount St. Helens, Mount Hood and Mount Rainier, forms to the east.According to the researchers, a megathrust earthquake of magnitude 9 occurred in Cascadia in 1700 and there has not been a large earthquake there since then. Rather, slow-slip earthquakes, events that happen deeper and move very short distances at a very slow rate, happen continuously."Usually, when an earthquake occurs we find that the motion is in the direction opposite to how the plates have moved, accumulating that slip deficit," said Kevin P. Furlong, professor of geosciences, Penn State. "For these slow-slip earthquakes, the direction of movement is directly downward in the direction of gravity instead of in the plate motion directions."The researchers have found that areas in New Zealand, identified by other geologists, slow slip the same way Cascadia does."But there are subduction zones that don't have these slow-slip events, so we don't have direct measurements of how the deeper part of the subducting plate is moving," said Furlong. "In Sumatra, the shallower seismic zone, as expected, moves in the plate-motion direction, but even though there are no slow-slip events, the deeper plate movement still appears to be primarily controlled by gravity."Slow-slip earthquakes occur at a deeper depth than the earthquakes that cause major damage and earth-shaking events, and the researchers have analyzed how this deep slip may affect the timing and behavior of the larger, damaging megathrust earthquakes."Slow-slip earthquakes rupture over several weeks, so they are not just one event," said McKenzie. "It's like a swarm of events."According to the researchers, in southern Cascadia, the overall plate motion is about an inch of movement per year and in the north by Vancouver Island, it is about 1.5 inches."We don't know how much of that 30 millimeters (1 inch) per year is accumulating to be released in the next big earthquake or if some movement is taken up by some non-observable process," said McKenzie. "These slow-slip events put out signals we can see. We can observe the slow-slip events going east to west and not in the plate motion direction."Slow-slip events in Cascadia occur every one to two years, but geologists wonder if one of them will be the one that will trigger the next megathrust earthquake.The researchers measure surface movement using permanent, high-resolution GPS stations on the surface. The result is a stair step pattern of loading and slipping during slow-slip events. The events are visible on the surface even though geologists know they are about 22 miles beneath the surface. They report their results in "The reason we don't know all that much about slow-slip earthquakes is they were only discovered about 20 years ago," said Furlong. "It took five years to figure out what they were and then we needed precise enough GPS to actually measure the motion on the Earth's surface. 
Then we had to use modeling to convert the slip on the surface to the slip beneath the surface on the plate boundary itself, which is bigger."

The researchers believe that understanding the effects of slow-slip earthquakes in the region at these greater depths will allow them to understand what might trigger the next megathrust earthquake in the area. Engineers want to know how strong the shaking in an earthquake will be, but they also want to know what direction the forces will act in. If the difference in direction of slow-slip events indicates a potential change in behavior in a large event, that information would be helpful in planning.

"More fundamentally, we don't know what triggers the big earthquake in this situation," said McKenzie. "Every time we add new data about the physics of the problem, it becomes an important component. In the past, everyone thought that the events were unidirectional, but they can be different by 40 or 50 degrees."

While the slow-slip events in Cascadia are shedding light on potential megathrust earthquakes in the area and the tsunamis they can trigger, Furlong thinks that other subduction zones may also have similar patterns.

"I would argue that it (differences in direction of motion) is happening in Alaska, Chile, Sumatra," said Furlong. "It is only in a few that we see the evidence of it, but it may be a universal process that has been missed. Cascadia exhibits it because of the slow-slip events, but it may be fundamental to subduction zones."

Also working on this project was Matthew W. Herman, assistant professor of geology, California State University, Bakersfield. The National Science Foundation supported this work.
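The directional argument above comes down to projecting a measured surface displacement onto the plate-convergence direction and its perpendicular. The sketch below illustrates that decomposition with invented numbers (the azimuth and displacements are hypothetical, not the study's GPS values), chosen so the offset comes out near the 40-50 degrees quoted in the article.

```python
# Sketch: split a GPS surface-displacement vector into components parallel and
# perpendicular to the plate-convergence azimuth, the kind of decomposition used
# to spot slow-slip motion rotated away from plate motion. Values are invented.
import numpy as np

def decompose(disp_east_mm, disp_north_mm, convergence_azimuth_deg):
    az = np.radians(convergence_azimuth_deg)
    unit_parallel = np.array([np.sin(az), np.cos(az)])   # east, north components
    unit_perp = np.array([np.cos(az), -np.sin(az)])
    d = np.array([disp_east_mm, disp_north_mm])
    return d @ unit_parallel, d @ unit_perp

par, perp = decompose(disp_east_mm=9.7, disp_north_mm=-2.6, convergence_azimuth_deg=60.0)
offset_deg = np.degrees(np.arctan2(perp, par))
print(f"parallel: {par:.1f} mm, perpendicular: {perp:.1f} mm, offset: {offset_deg:.0f} deg")
```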
Earthquakes
2020
December 21, 2020
https://www.sciencedaily.com/releases/2020/12/201221121754.htm
New model reveals previously unrecognized complexity of oceanic earthquake zones
Researchers from the University of Tsukuba applied seismic data from around the world to build a model of the 2020 Caribbean earthquake. Oceanic transform faults are generally considered to be linear and simple and have been widely used in studies of earthquake dynamics. However, the research team found that high complexity in rupture speed and direction can occur even in a supposedly simple linear fault system.
On 28 January 2020, a large oceanic earthquake with magnitude 7.7 occurred at the Oriente transform fault in the Caribbean Sea, between Jamaica and Cuba. It caused a minor tsunami of 0.11 m height and was felt as far afield as Florida.A research team at the University of Tsukuba have developed a new finite-fault inversion method for building models based on teleseismic waveform data from earthquake monitoring stations. This new approach to using the data takes a more flexible approach to resolving the fault geometry. Rather than relying on prior assumptions, the faulting components are separately evaluated in a wider model in both time and space, allowing all possible rupture evolutions to be considered. The team were keen to use the Caribbean earthquake to help to understand the faulting processes that occur during these shallow oceanic quakes."Some cases of complex rupture dynamics have recently been reported in previous earthquake studies, raising the question of whether or not we are correctly modeling these even in supposedly simple fault systems," says study author Professor Yuji Yagi. "The initial monitoring of this January 2020 event suggested variations in the waveform shape between two stations at similar distances from the epicenter, suggesting that there remains complexity to be explored at this fault."This was an excellent opportunity to test the new method developed by the team, which used data from 52 seismic stations to construct a detailed model of the geophysical processes within the fault that gave rise to the earthquake."The results revealed complex rupture during the earthquake, caused by a bend in the fault that led to the changes in rupture speed and direction detected in the monitoring data," explains author Professor Ryo Okuwaki. "These variations triggered several successive rupture episodes that occurred along the 300-km-long fault." The modeling approach also allows some suggestions to be made about the possible occurrence of subsidence and the shape of the surrounding seabed following the earthquake event.These findings reveal that oceanic transform faults, considered to be simple and linear, may be much more complicated than previously accepted, and therefore require a more comprehensive approach to earthquake modeling. This work will shed light on a possible interaction between the earthquake-fault motion and the evolution of the ocean floor around the transform boundary.
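At its core, finite-fault inversion treats observed waveforms as a linear combination of the ground motion each fault patch would produce for a unit of slip. The toy sketch below shows only that linear-algebra core with synthetic data; the study's actual method is far more flexible (it frees up the fault geometry and resolves slip in both time and space), and every array here is invented.

```python
# Toy sketch of the idea behind finite-fault inversion: data d = G @ m, where the
# columns of G are synthetic Green's functions for unit slip on each fault patch
# and m is the slip to recover. Solved with damped least squares. Synthetic only.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_patches = 200, 10
G = rng.normal(size=(n_samples, n_patches))            # stand-in Green's functions
true_slip = np.zeros(n_patches)
true_slip[3:6] = [0.5, 1.0, 0.7]                       # a compact rupture
d = G @ true_slip + 0.05 * rng.normal(size=n_samples)  # noisy "waveform" data

damping = 0.1                                          # regularization weight
m = np.linalg.solve(G.T @ G + damping * np.eye(n_patches), G.T @ d)
print(np.round(m, 2))                                  # recovered slip per patch
```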
Earthquakes
2020
December 16, 2020
https://www.sciencedaily.com/releases/2020/12/201216155217.htm
Secret of Australia's volcanoes revealed
Australia's east coast is littered with the remnants of hundreds of volcanoes -- the most recent just a few thousand years old -- and scientists have been at a loss to explain why so many eruptions have occurred over the past 80 million years.
Now, geoscientists at the University of Sydney have discovered why part of a stable continent like Australia is such a hotbed of volcanic activity. And the findings suggest there could be more volcanic activity in the future.

"We aren't on the famous Pacific 'Ring of Fire' that produces so many volcanoes and earthquakes," said Dr Ben Mather from the School of Geosciences and the EarthByte group at the University of Sydney. "So, we needed another explanation why there have been so many volcanoes on Australia's east coast."

Many of the volcanoes that form in Australia are one-off events, he said.

"Rather than huge explosions like Krakatoa or Vesuvius, or iconic volcanoes like Mount Fuji, the effect is more like the bubbles emerging as you heat your pancake mix," Dr Mather said.

Their remnants can look like regular hills or notable structures like Cradle Mountain in Tasmania, the Organ Pipes in Victoria, the Undara Lava Tubes in Queensland and Sawn Rocks, near Narrabri, in NSW. Many are yet to be identified, Dr Mather said.

"Under our east coast we find a special volatile mix of molten rock that bubbles up to the surface through the younger, thinner east coast Australian crust," he said.

The study is published today in the journal

Dr Mather and his team looked at how hundreds of eruptions have occurred along the east coast from North Queensland to Tasmania and across the Tasman to the largely submerged continent Zealandia. They were particularly interested in 'recent' peaks of volcanic activity 20 million and 2 million years ago.

"Most of these eruptions are not caused by Australia's tectonic plate moving over hot plumes in the mantle under the Earth's crust. Instead, there is a fairly consistent pattern of activity, with a few notable peaks," said co-author Dr Maria Seton, from the School of Geosciences and EarthByte group.

What tipped them off was that these peaks were happening at the same time there was an increased volume of sea-floor material being pushed under the continent from the east by the Pacific plate.

"The peaks of volcanic activity correlate nicely with the amount of seafloor being recycled at the Tonga-Kermadec trench east of New Zealand," Dr Mather said.

Taking this evidence, Dr Mather and his team have built a new model that unifies the observations of so many eruptions occurring over millions of years along Australia's east coast.

"The most recent event was at Mount Gambier in Victoria just a few thousand years ago," he said.

While the model explains the consistent volcanic activity, it can't predict when the next volcano will emerge.

The sea floor of the Pacific plate to the east is being pushed under the Australian plate. This process is called subduction. The material is literally being pushed under the Australian continental shelf, starting at the Tonga-Kermadec Trench east and north of New Zealand.

"From there it is being slammed into the transition zone between the crust and the magma at depths of about 400 to 500 kilometres. This material is then re-emerging as a series of volcanic eruptions along Australia's east coast, which is thinner and younger than the centre and west of the continent," Dr Mather said.

This subduction process is not unique to the Australian east coast.

"What sets the east Australia-Zealandia region apart is that the sea-floor being pushed under the continent from the western Pacific is highly concentrated with hydrous materials and carbon-rich rocks.
This creates a transition zone right under the east coast of Australia that is enriched with volatile materials."

The new explanation improves on previous models that have suggested volcanoes in Victoria were due to convection eddies in the mantle from being near the trailing edge of the tectonic plate, or models that relied on the plate passing over hot spots in the mantle.

"Neither of these gave us the full picture," Dr Mather said. "But our new approach can explain the volcanic pattern up and down the Australian east coast."

Dr Mather said this model could also explain other intraplate volcanic regions in the Western USA, Eastern China and around Bermuda.

Co-author Professor Dietmar Müller, Joint Coordinator of the EarthByte group in the School of Geosciences, said: "We now need to apply this research to other corners of the Earth to help us understand how other examples of enigmatic volcanism have occurred."
Earthquakes
2020
December 9, 2020
https://www.sciencedaily.com/releases/2020/12/201209124939.htm
Hawai'i researchers kept the data flowing during crisis response on Kilauea
The summer 2018 eruption of Kīlauea Volcano on the Island of Hawai'i was one of the most significant in the volcano's history, collapsing a large portion of the summit caldera, erupting massively from its flank and triggering a magnitude 6.9 earthquake in the process. Through it all, scientists at the Hawaiian Volcano Observatory were installing new geophysical stations, processing data and making real-time reports to local authorities and neighborhoods.
In the journal

The researchers had been monitoring signs of imminent eruption before the 30 April 2018 collapse of the Pu'u 'Ō'ō vent, which had been erupting continually since 1983. Within a few hours, an unexpected magma intrusion began migrating through the volcano's East Rift Zone, and "we knew this would not be the next episode of the Pu'u 'Ō'ō eruption," said Shiro. "Within a day or so, when the intrusion was approaching the populated area of Leilani Estates, we knew this could potentially be devastating."

When the 2018 event was finished, lava had covered a 35-square-kilometer area and 716 structures had been destroyed by the flow, displacing more than 2,500 people. Sulfur dioxide emission rates were among the highest measured on the island and more than 60,000 earthquakes were recorded.

Shiro and others at the observatory sprang into action to continue monitoring the volcano even as lava, fires, ashfall and collapsing cliffs destroyed geophysical monitoring stations. They also deployed new temporary stations to the sparsely instrumented lower East Rift Zone. The team was able to respond quickly by leveraging capacities that had been established earlier, they write.

Since 2014, the observatory had been building portable solar power and electronic systems for monitoring stations that could be delivered by pickup truck and helicopter slings. They had also designed the island's network structure so that data could be quickly rerouted in case of station or network relay failure. And only four months before the eruption, the observatory had rebuilt and migrated all its seismic data processing systems to live on virtual machines that could be backed up to cloud servers.

"To monitor a hazard, we need both the instruments in the field to collect the data and a way to get the data back to the scientists and decision makers to make use of it. Having a pre-assembled set of stations ready for rapid deployment can be a key capability to help monitor an emerging, changeable hazard," Shiro explained.

In the wake of multiple earthquakes during May 2018, the researchers had to permanently evacuate their facilities at the volcano's summit, setting up two temporary facilities in Hilo. "This is where the virtual machines or VMs came in handy for HVO," said Shiro. "Since we had to evacuate our facility, threatening our data center, we were able to easily move those VMs elsewhere and assure no downtime with data processing."

Shiro recalls discussing the dangers with the scientist in charge at HVO in mid-May 2018. The flank eruption was devastating, he said, but he thought the bigger problems might come at the summit. "Within a few days from that conversation she made the call to abandon the facility as the seismic shaking only continued to worsen," he said. "It was the right call given the emerging evidence of structural damage that had begun to show and worsened over the next three months."

The scientists divided into three main teams: to collect data in the field, to analyze and interpret the data, and to communicate and coordinate with government officials and communities. The observatory's full-time staff of 29 grew to 90, with people joining from other USGS offices and universities, along with volunteers.

"In a sense, the staff members of all five USGS volcano observatories acted as one for the Kilauea response, providing valuable cross-training for everyone and helping us all get to know one another so that we will be even better prepared for the next crisis," Shiro said.
Earthquakes
2020
December 1, 2020
https://www.sciencedaily.com/releases/2020/12/201201144036.htm
Seismic activity of New Zealand's alpine fault more complex than suspected
A rupture along the full length of the fast-slipping Alpine Fault on New Zealand's South Island poses the largest potential seismic threat to the southern and central parts of the country. But new evidence of a 19th century earthquake indicates that in at least one portion of the fault, smaller earthquakes may occur in between such large rupture events.
The findings published in the

The best paleoseismic evidence to date suggests the southern and central sections of the Alpine Fault, at the boundary separating the Australian and Pacific tectonic plates, typically rupture during very large full-section earthquakes of magnitude 7.7 or larger. The last such earthquake took place in 1717.

After trenching along the fault at the Staples site near the Toaroha River, however, Robert Langridge of GNS Science and colleagues uncovered evidence of a more recent earthquake along the northeastern end of the fault's central portion. Radiocarbon dating places this earthquake between 1813 and 1848.

"One of the real challenges with the Alpine Fault -- because it is so bush-covered -- is actually finding sites that have been cleared and therefore can be studied," said Langridge. "Once we started working there [at the Staples site] the story really grew, in large part because of the richness of dateable organic material in the trenches."

The four most recent earthquakes uncovered by the researchers at the site range in date from 1084 to 1848. The events were confirmed by data collected from other nearby trenching sites and from geological deposits called turbidites -- sediments shaken loose into a body of water by seismic activity -- in lakes along the central section of the Alpine Fault.

The most recent earthquake could represent a "partial-section" rupture of only the central portion of the Alpine Fault, a rupture of the fault's northern section that continued southwest into the central segment, or even triggered slip from a rupture along the nearby Marlborough Fault System. Langridge and colleagues said there isn't enough evidence yet to favor one of these scenarios over the others.

However, the findings do suggest that seismic activity on the Alpine Fault is more complex than suspected, particularly along its northern reaches where the plate boundary transitions into another fault zone.

"One of the outcomes of this study is that you should expect a shorter recurrence interval of strong shaking at fault section ends," Langridge said. "Because of the recurrence times of earthquakes, though, you obviously have to wait a long time to see the effects of such fault behavior."

"That's why paleoseismology is a vital tool in understanding faults," he added, "because otherwise we'd have only short insights into the past."

The Alpine Fault is sometimes compared with California's San Andreas Fault, another fast-moving strike-slip fault near a plate boundary. Langridge said researchers in California and New Zealand have a long history of earthquake science collaboration and are learning from each other about the treatment of active faults and fault segmentation for seismic hazard models.

"The San Andreas Fault, being on the opposite side of the Pacific plate, is like our distant brother or whanau -- family," said Langridge.
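A short paleoseismic record like the one above is often summarized as a mean recurrence interval and an "open interval" since the last event. The back-of-the-envelope sketch below does that arithmetic; note that only the 1084 event, the 1717 full-section rupture, and the 1813-1848 event (taken at its midpoint) come from the article, and the remaining event date is a placeholder, not a dated earthquake.

```python
# Back-of-the-envelope recurrence arithmetic for a four-event paleoseismic record.
# 1084, 1717 and ~1830 (midpoint of 1813-1848) come from the article; 1450 is a
# placeholder for the undated second event, used only to make the list complete.
event_years = [1084, 1450, 1717, 1830]
intervals = [b - a for a, b in zip(event_years, event_years[1:])]
mean_recurrence = sum(intervals) / len(intervals)
open_interval = 2020 - event_years[-1]          # years elapsed as of the article
print(f"intervals between events: {intervals}")
print(f"mean recurrence ~{mean_recurrence:.0f} yr; {open_interval} yr since last event")
```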
Earthquakes
2020
November 30, 2020
https://www.sciencedaily.com/releases/2020/11/201130131417.htm
Seismic guidelines underestimate impact of 'The Big One' on metro Vancouver buildings
Scientists examining the effects of a megathrust earthquake in the Pacific Northwest say tall buildings across Metro Vancouver will experience greater shaking than currently accounted for by Canada's national seismic hazard model.
The region lies above the Georgia sedimentary basin, which is made up of layers of glacial and river sediments sitting on top of sedimentary rock. In the event of an earthquake, it would jiggle and amplify the seismic waves, causing more intense and longer-lasting tremors. However, the amplification caused by the sedimentary basin is not explicitly accounted for in the 2015 seismic hazard model, which informs Canada's national building code.

The latest U.S. national seismic hazard model now explicitly accounts for sedimentary basin amplification, but Canada's latest seismic hazard model, released this October, still doesn't, says lead researcher Carlos Molina Hutt, a structural and earthquake engineering professor at UBC.

"As a result, we're underestimating the seismic hazard of a magnitude-9 earthquake in Metro Vancouver, particularly at long periods. This means we're under-predicting the shaking that our tall buildings will experience," he warned. "Fortunately, Natural Resources Canada, responsible for the development of our national seismic hazard model, recognizes the potential importance of basin effects in certain parts of Vancouver and is actively reviewing and participating in research on the topic. They intend to address basin effects in the next seismic hazard model."

Using physics-based computer simulations, the researchers found that regions where the Georgia Basin is deepest will have the greatest seismic amplification. Delta and Richmond will experience the most amplification, followed by Surrey, New Westminster, Burnaby, Vancouver and North Vancouver. West Vancouver, which sits just outside the basin, will have the least.

The researchers also evaluated the impact of the magnitude-9 simulations on tall reinforced concrete shear-wall buildings, of which there are more than 3,000 located in the Lower Mainland. They found that those built to building codes from the 1980s and earlier are at the greatest risk of severe damage or even collapse, with buildings in the 10- to 20-storey range experiencing the worst impacts.

"We have these pockets of tall buildings within the Georgia Basin -- in Vancouver, Burnaby, Surrey and New Westminster. In general, based on a comparison of the code requirements in the past versus the code requirements now, many of our older buildings are vulnerable to these large earthquakes, particularly if we consider the amplification effect of the Georgia Basin," said Molina Hutt. The differences in expected performance between new buildings and older constructions reflect continuous improvements in seismic hazard estimates and engineering design provisions.

"When we build a structure, it only needs to meet the code of the time when it was built. If there is a future change in the code, you don't have to go back and upgrade your building. To address vulnerable existing buildings, jurisdictions must explore different seismic risk reduction policy options and adopt the most effective mitigation strategies," Molina Hutt added.

The study, published recently in

"Typically, people think that, if we have a magnitude-9 Cascadia subduction zone earthquake, it will be worse in Victoria, because they're closer to the seismic source. But the reality is that, for tall buildings, we're going to be worse off in Vancouver, because this basin amplifies the shaking in taller structures," Molina Hutt noted.
The probability of a magnitude 8 or 9 Cascadia earthquake is estimated to be 14 per cent in the next 50 years.

"We're collaborating closely with our neighbours to the south, who are taking active steps to account for these basin amplification effects," said Molina Hutt. "Our work attempts to assess the impacts of neglecting these effects so we can appreciate their significance and take action."
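The quoted "14 per cent in 50 years" can be translated into an equivalent annual rate and return period if the earthquake occurrence is treated as a Poisson process -- a common simplification, and an assumption of this sketch rather than a statement from the study.

```python
# Sketch: convert a 14% probability in 50 years into an annual rate and return
# period, assuming Poisson (time-independent) occurrence.
import math

p_50 = 0.14
rate = -math.log(1.0 - p_50) / 50.0        # events per year
return_period = 1.0 / rate
p_10 = 1.0 - math.exp(-rate * 10.0)        # implied chance in the next decade

print(f"annual rate ~{rate:.4f}/yr, return period ~{return_period:.0f} yr")
print(f"implied chance in 10 years ~{p_10:.1%}")
```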
Earthquakes
2020
November 30, 2020
https://www.sciencedaily.com/releases/2020/11/201130131410.htm
Earthquake scenario for large German city
What if there is a major earthquake near Cologne? This scenario is the subject of the "Risk Analysis in Civil Protection 2019," whose report was recently submitted to the German Bundestag (document: Bundestag Drucksache 19/23825). In the 125-page document, a group of experts has listed in detail, on the basis of extensive research work, what effects can be expected in the event of strong ground movements. What Germans usually only know from TV and media reports from other countries is the result of a modeling of a strong earthquake near the megacity of Cologne: ground shaking, damaged and destroyed houses, blocked roads, many injured and dead.
The German Research Center for Geosciences GFZ and its researchers played a central role in this analysis. The GFZ had the task of modeling the ground movements caused by such an earthquake and quantifying possible damage to the city's buildings. In particular, new geophysical models for the Lower Rhine Bay were developed to estimate the influence of the near-surface layers of the subsoil on ground movements. The researchers created a "building-by-building" model of the city in order to quantify the number and vulnerability of buildings that could be affected by the earthquake.

A massive earthquake in the Lower Rhine Bay with a magnitude of 6.5, as assumed for the underlying scenario, is quite possible. The GFZ expert for historical earthquakes, Gottfried Grünthal, says: "Statistical analyses show that an earthquake with a magnitude of 5.5 is to be expected in the Lower Rhine Bay approximately every hundred to three hundred years. A quake with a magnitude of 6.5 is to be expected approximately every 1000 to 3000 years."

Marco Pilz, a scientist in the GFZ section for earthquake hazard and dynamic risks, describes the fictitious initial situation: "At a depth of only a few kilometers, a tectonic fault ruptures in the Lower Rhine Bay. Only seconds later the shock waves reach the surface and the nearby city of Cologne. The ground starts to shake, buildings creak and sometimes collapse, streets are blocked by falling debris. Good knowledge of the local underground conditions has shown us that these conditions must be taken into account for an accurate modeling of the shaking."

Based on this, a building-related damage assessment suggests that major impacts can be expected in the city of Cologne. "Old buildings are likely to be particularly affected, so that the distribution of damage in the city area could be quite heterogeneous," adds Cecilia Nievas, a researcher from the same section. "Of the estimated 170,000 residential buildings in the city, more than 10,000 could suffer moderate to severe damage according to our calculations."

The further effects, for example on utilities, are more difficult to assess and require detailed investigations: How many hospitals are affected, what capacities remain for the treatment of the injured, and how well do emergency services reach affected regions? GFZ researcher Pilz: "Although we at GFZ contributed a large part of this risk analysis, what was remarkable about the cooperation was the involvement of many experts from federal and state authorities, the district government, the affected districts, the cities and their immediately affected services such as the fire department, THW, railroads and energy suppliers. Everyone worked together, from the very top down to the local level."

Section head Fabrice Cotton adds: "It was a very productive exchange of information. The elaboration of such scenarios is important because they provide an effective tool for dialogue with the authorities and for understanding their needs when planning relief operations. Such exercises can also help to gain a complete overview of the entire seismic risk chain (from the physics of the earthquake to its effects) and to work at the interface between different scientific disciplines (e.g. here between seismology and civil engineering)."
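The two recurrence intervals quoted by Grünthal can be checked against the standard Gutenberg-Richter scaling, log10(N) = a - b*M, where a tenfold drop in rate per magnitude unit corresponds to b ~ 1. The sketch below uses the midpoints of the quoted intervals, which is an assumption for illustration, not a calculation from the report.

```python
# Sketch: do the quoted recurrence intervals imply a typical Gutenberg-Richter
# b-value? Interval midpoints are taken as representative (an assumption).
import math

interval_m55 = 200.0     # midpoint of "every 100 to 300 years" for M >= 5.5
interval_m65 = 2000.0    # midpoint of "every 1000 to 3000 years" for M >= 6.5

rate_m55, rate_m65 = 1.0 / interval_m55, 1.0 / interval_m65
b = (math.log10(rate_m55) - math.log10(rate_m65)) / (6.5 - 5.5)
print(f"implied b-value ~{b:.1f}")   # ~1.0, typical for tectonic regions
```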
Earthquakes
2020
November 17, 2020
https://www.sciencedaily.com/releases/2020/11/201117192555.htm
Piecing together the Alaska coastline's fractured volcanic activity
Among seismologists, the geology of Alaska's earthquake- and volcano-rich coast from the Aleutian Islands to the southeast is fascinating, but not well understood. Now, with more sophisticated tools than before, a University of Massachusetts Amherst team reports unexpected new details about the area's tectonic plates and their relationships to volcanoes.
Plate tectonics -- the constant underground movement of continental and ocean shelves -- is often characterized by "subduction zones" where plates clash, one usually sliding under another. Many are prime earthquake- and volcano-prone regions.

Lead author Xiaotao Yang says, "For a long time, the whole central Alaska region was thought to have one simple subduction plate. What we discovered is that there are actually two major subduction slabs. It's a surprise that we see differences between these two slabs and the associated mantle materials." Overall, Yang says the new research shows, "there are many more subtleties and variations that we had not seen before."

Yang, who did this work at UMass Amherst with co-author Haiying Gao, is now on the faculty at Purdue University. Writing in the

Yang says their study highlights how complex a subduction zone can be and how this complexity may control volcano distribution. It also helps to clarify a long-standing question in seismology: what determines whether volcanoes are present and whether they are in a linear arc or in clusters. Yang says it depends in part on whether rocks deep in the mantle above the subducting slab melt into magma, and how magma is stored in the crust.

For their investigations, Yang and Gao used a powerful seismic imaging technique that Yang says is similar to a medical CAT scan of the Earth. With it, they constructed a detailed seismic velocity model of the Aleutian-Alaska margin from the crust to the uppermost mantle. Seismic velocity refers to the rate at which a seismic wave travels through a material such as magma or crust. Waves travel more slowly through low-density, low-velocity material than through surrounding rocks, for example, he says.

The researchers' new model reveals multiple downgoing slabs with various seismic velocities, thicknesses and dip angles, they write. Yang adds, "Once we got to look at the two central Alaska volcanoes for the first time in a really precise way, what we see is a much more complicated subduction system than we knew before. This new information about the complexity helps us to understand the distribution of volcanoes in Alaska. It's all more complicated than the tools could show us before."

Their findings help to explain why there is a break in the arc of volcanoes called the Denali Volcanic Gap, Yang says. Below it is a wedge-shaped region of high seismic velocity material above the subduction plate but below the mantle. It is relatively cold and dry, with no melting, which explains why there is no volcano in the region.

By contrast, the cluster of volcanoes in the Wrangell Volcanic Field does not have the same signature, he adds. The Wrangell volcanoes have distinctly low seismic velocity material in the crust. It's a rather large magma reservoir that may explain why they're in a cluster instead of an arc, Yang says, though "the fact that it's there helps to explain where the magma came from for past eruptions."

This study was made possible by the National Science Foundation's (NSF) array of seismic sensors in Alaska, part of its EarthScope Transportable Array program, Yang notes. His co-author Gao had startup funding from UMass Amherst and an NSF CAREER grant. They also used computational resources at the Massachusetts Green High Performance Computing Center in Holyoke.

Yang says that their work adds to seismologists' understanding of volcano distribution in the Cascades in the Pacific Northwest, South America and the south Pacific.
He hopes to follow up with more detailed analyses of magma reservoirs in the crust, how volcanoes are fed and particularly, whether Aleutian volcanoes have magma in the crust.
Earthquakes
2020
November 16, 2020
https://www.sciencedaily.com/releases/2020/11/201116075717.htm
Former piece of Pacific Ocean floor imaged deep beneath China
In a study that gives new meaning to the term "rock bottom," seismic researchers have discovered the underside of a rocky slab of Earth's surface layer, or lithosphere, that has been pulled more than 400 miles beneath northeastern China by the process of tectonic subduction.
The study, published by a team of Chinese and U.S. researchers in

Rice University seismologist Fenglin Niu, a co-corresponding author, said the study provides the first high-resolution seismic images of the top and bottom boundaries of a rocky, or lithospheric, tectonic plate within a key region known as the mantle transition zone, which starts about 254 miles (410 kilometers) below Earth's surface and extends to about 410 miles (660 kilometers).

"A lot of studies suggest that the slab actually deforms a lot in the mantle transition zone, that it becomes soft, so it's easily deformed," Niu said. How much the slab deforms or retains its shape is important for explaining whether and how it mixes with the mantle and what kind of cooling effect it has.

Earth's mantle convects like heat in an oven. Heat from Earth's core rises through the mantle at the center of oceans, where tectonic plates form. From there, heat flows through the mantle, cooling as it moves toward continents, where it drops back toward the core to collect more heat, rise and complete the convective circle.

Previous studies have probed the boundaries of subducting slabs in the mantle, but few have looked deeper than 125 miles (200 kilometers), and none with the resolution of the current study, which used more than 67,000 measurements collected from 313 regional seismic stations in northeastern China. That work, which was done in collaboration with the China Earthquake Administration, was led by co-corresponding author Qi-Fu Chen from the Chinese Academy of Sciences.

The research probes fundamental questions about the processes that shaped Earth's surface over billions of years. Mantle convection drives the movements of Earth's tectonic plates, rigid interlocked pieces of Earth's surface that are in constant motion as they float atop the asthenosphere, the topmost mantle layer and the most fluid part of the inner planet.

Where tectonic plates meet, they jostle and grind together, releasing seismic energy. In extreme cases, this can cause destructive earthquakes and tsunamis, but most seismic motion is too faint for humans to feel without instruments. Using seismometers, scientists can measure the magnitude and location of seismic disturbances. And because seismic waves speed up in some kinds of rock and slow in others, scientists can use them to create images of Earth's interior, in much the same way a doctor might use ultrasound to image what's inside a patient.

Niu, a professor of Earth, environmental and planetary sciences at Rice, has been at the forefront of seismic imaging for more than two decades. When he did his Ph.D. training in Japan more than 20 years ago, researchers were using dense networks of seismic stations to gather some of the first detailed images of the submerged slab boundaries of the Pacific plate, the same plate that was imaged in the study published this week.

"Japan is located about where the Pacific plate reaches around 100-kilometer depths," Niu said. "There is a lot of water in this slab, and it produces a lot of partial melt. That produces arc volcanoes that helped create Japan. But we are still debating whether this water is totally released at that depth. There is increasing evidence that a portion of the water stays inside the plate to go much, much deeper."

Northeastern China offers one of the best vantage points to investigate whether this is true. The region is about 1,000 kilometers from the Japan trench, where the Pacific plate begins its plunge back into the planet's interior.
In 2009, with funding from the National Science Foundation and others, Niu and scientists from the University of Texas at Austin, the China Earthquake Administration, the Earthquake Research Institute of Tokyo University and the Research Center for Prediction of Earthquakes and Volcanic Eruptions at Japan's Tohoku University began installing broadband seismometers in the region.

"We put 140 stations there, and of course the more stations the better for resolution," Niu said. "The Chinese Academy of Sciences put in additional stations so they can get a finer, more detailed image."

In the new study, data from the stations revealed both the upper and lower boundaries of the Pacific plate, dipping down at a 25-degree angle within the mantle transition zone. The placement within this zone is important for the study of mantle convection because the transition zone lies below the asthenosphere, at depths where increased pressure causes specific mantle minerals to undergo dramatic phase changes. These phases of the minerals behave very differently in seismic profiles, just as liquid water and solid ice behave very differently even though they are made of identical molecules. Because phase changes in the mantle transition zone happen at specific pressures and temperatures, geoscientists can use them like a thermometer to measure the temperature in the mantle.

Niu said the fact that both the top and bottom of the slab are visible is evidence that the slab hasn't completely mixed with the surrounding mantle. He said heat signatures of partially melted portions of the mantle beneath the slab also provide indirect evidence that the slab transported some of its water into the transition zone.

"The problem is explaining how these hot materials can be dropped into the deeper part of the mantle," Niu said. "It's still a question. Because they are hot, they are buoyant."

That buoyancy should act like a life preserver, pushing upward on the underside of the sinking slab. Niu said the answer to this question could be that holes have appeared in the deforming slab, allowing the hot melt to rise while the slab sinks.

"If you have a hole, the melt will come out," he said. "That's why we think the slab can go deeper."

Holes could also explain the appearance of volcanoes like Changbaishan on the border between China and North Korea.

"It's 1,000 kilometers away from the plate boundary," Niu said. "We don't really understand the mechanism of this kind of volcano. But melt rising from holes in the slab could be a possible explanation."
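A quick geometry check ties the numbers in this article together: a slab dipping at 25 degrees gains roughly 0.47 km of depth per kilometer of horizontal travel, so ~1,000 km from the Japan trench it would sit near the 410-660 km transition zone. The sketch below does that arithmetic; assuming a single uniform dip along the whole path is a deliberate simplification (real slabs change dip with depth), so this is a plausibility check, not the study's result.

```python
# Geometry check: depth of a slab dipping at 25 degrees after ~1,000 km of
# horizontal travel from the trench (uniform dip assumed for simplicity).
import math

dip_deg = 25.0
horizontal_km = 1000.0
depth_km = horizontal_km * math.tan(math.radians(dip_deg))
print(f"~{depth_km:.0f} km depth")   # ~466 km, within the 410-660 km transition zone
```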
Earthquakes
2020
November 13, 2020
https://www.sciencedaily.com/releases/2020/11/201113124038.htm
East African Rift System is slowly breaking away, with Madagascar splitting into pieces
The African continent is slowly separating into several large and small tectonic blocks along the diverging East African Rift System, continuing to Madagascar -- the long island just off the coast of Southeast Africa -- that itself will also break apart into smaller islands.
These developments will redefine Africa and the Indian Ocean. The finding comes in a new study by D. Sarah Stamps of the Department of Geosciences for the journal

Rest assured, though, this isn't happening anytime soon.

"The rate of present-day break-up is millimeters per year, so it will be millions of years before new oceans start to form," said Stamps, an assistant professor in the Virginia Tech College of Science. "The rate of extension is fastest in the north, so we'll see new oceans forming there first."

"Most previous studies suggested that the extension is localized in narrow zones around microplates that move independently of surrounding larger tectonic plates," Stamps said. The new GPS dataset of very precise surface motions in Eastern Africa, Madagascar, and several islands in the Indian Ocean reveals that the break-up process is more complex and more distributed than previously thought, according to the study, completed by Stamps with researchers from the University of Nevada-Reno, the University of Beira Interior in Portugal, and the Institute and Observatory of Geophysics of Antananarivo at the University of Antananarivo in Madagascar itself.

In one region, the researchers found that extension is distributed across a wide area. The region of distributed extension is about 600 kilometers (372 miles) wide, spanning from Eastern Africa to whole parts of Madagascar. More precisely, Madagascar is actively breaking up, with southern Madagascar moving with the Lwandle microplate -- a small tectonic block -- and a piece of central Madagascar moving with the Somalian plate. The rest of the island is found to be deforming nonrigidly, Stamps added.

Also working on the paper was geosciences Ph.D. student Tahiry Rajaonarison, who previously was a master's student at Madagascar's University of Antananarivo. He assisted Stamps in 2012 in collecting GPS data that was used in this study. He joined Virginia Tech in 2015 and returned to Madagascar later to collect more data as the lead on a National Geographic Society grant. "Leading a team to collect GPS data in Madagascar in summer 2017 was an amazing field experience," Rajaonarison said.

The team used new surface motion data and additional geologic data to test various configurations of tectonic blocks in the region using computer models. Through a comprehensive suite of statistical tests, the researchers defined new boundaries for the Lwandle microplate and Somalian plate. This approach allowed them to test whether surface motion data are consistent with rigid plate motion.

"Accurately defining plate boundaries and assessing if continents diverge along narrowly deforming zones or through wide zones of diffuse deformation is crucial to unraveling the nature of continental break-up," Stamps said. "In this work, we have redefined how the world's largest continental rift is extending using a new GPS velocity solution."

The discovery of the broad deforming zone helps geoscientists understand recent and ongoing seismic and volcanic activity happening in the Comoros Islands, located in the Indian Ocean between East Africa and Madagascar. The study also provides a framework for future studies of global plate motions and investigations of the forces driving plate tectonics for Stamps and her team.
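The rigid-plate test described above rests on a simple prediction: a point on a rigid plate rotating about an Euler pole moves with velocity v = omega x r, so measured GPS velocities can be compared against that prediction. The sketch below shows the calculation with an entirely hypothetical pole and rotation rate (not the study's estimate for the Lwandle or Somalian plates).

```python
# Sketch of the rigid-plate prediction: surface velocity = Euler vector x position.
# The pole location and rotation rate below are hypothetical, for illustration only.
import numpy as np

R_EARTH_M = 6.371e6

def surface_velocity_mm_yr(lat_deg, lon_deg, pole_lat, pole_lon, rate_deg_per_myr):
    def unit(lat, lon):
        lat, lon = np.radians([lat, lon])
        return np.array([np.cos(lat) * np.cos(lon), np.cos(lat) * np.sin(lon), np.sin(lat)])
    omega = unit(pole_lat, pole_lon) * np.radians(rate_deg_per_myr) / 1e6   # rad per year
    r = unit(lat_deg, lon_deg) * R_EARTH_M                                  # position, meters
    v = np.cross(omega, r)                                                  # m per year
    return np.linalg.norm(v) * 1000.0                                       # mm per year

v = surface_velocity_mm_yr(lat_deg=-20.0, lon_deg=47.0,        # a point in Madagascar
                           pole_lat=-30.0, pole_lon=40.0, rate_deg_per_myr=0.1)
print(f"predicted rigid-plate speed ~{v:.1f} mm/yr")           # millimeter-per-year scale
```

Sites whose observed velocities deviate systematically from such predictions are the ones flagged as deforming nonrigidly.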
Earthquakes
2020
November 13, 2020
https://www.sciencedaily.com/releases/2020/11/201113103730.htm
Love waves from the ocean floor
Vibrations travel through our planet in waves, like chords ringing out from a strummed guitar. Earthquakes, volcanoes and the bustle of human activity excite some of these seismic waves. Many more reverberate from wind-driven ocean storms.
As storms churn the world's seas, wind-whipped waves at the surface interact in a unique way that produces piston-like thumps of pressure on the seafloor, generating a stream of faint tremors that undulate through Earth to every corner of the globe.

"There is an imprint of those three Earth systems in this ambient seismic data: atmosphere, Earth's rocky outer layers and ocean," said Stanford University geophysicist Lucia Gualtieri, lead author of a paper in

Known as secondary microseisms, the small seismic waves excited by rumbling oceans are so ubiquitous and chaotic that seismologists have long set the data aside. "When you record these waves, the seismic record looks like random noise because there are so many sources, one close to the other across the extended area of a storm. They're all acting at the same time, and the resulting wavefields interfere with each other," Gualtieri said. "You want to just discard it."

Yet over the last 15 years, researchers have found a way to extract meaning from this noisy data. By analyzing how quickly pairs of waves travel from one seismic station to another, they have begun to glean insights about the materials they're moving through. "We use seismic waves like X-rays in medical imaging for scanning the Earth," said Gualtieri, who is an assistant professor of geophysics in Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth).

Unlike a single ocean wave rolling across the surface, which dies out before it reaches the deep sea, the chaotic interactions of waves traveling in opposite directions during a storm can create an up-and-down bobbing motion at the surface that pulses all the way to the solid Earth below. Vibrations known as Rayleigh waves then travel outward from the pulse, moving the ground up and down as they go.

For decades, scientists have understood the vertical component of ocean-storm microseisms, where Rayleigh waves dominate. But there is another set of vibrations recorded during ocean storms that are inexplicable under the accepted theories for how stormy seas generate movements in the solid Earth. These vibrations, named Love waves after their 20th-century discoverer, jostle underground rock particles side to side -- perpendicular to their path forward -- like a slithering snake. "These waves shouldn't be there at all," Gualtieri said. "We didn't know where they were coming from."

Scientists have presented two plausible explanations. One idea is that when the vertical force pumping down from colliding ocean waves encounters a slope on the seafloor, it splits and forms the two different surface wave types: Rayleigh and Love. "In that case, the source of Love waves would be very close to the source of Rayleigh waves, if not the same location," Gualtieri said.

But Gualtieri's research, co-authored with geoscientists from Princeton University, finds the slopes and inclines of the seafloor are not steep enough to generate the strong horizontal force necessary to produce the Love waves picked up by seismic recorders. Their results, published Nov. 9, support an alternative theory, in which Love waves originate within the Earth itself.
It turns out that when windswept seas throttle pressure down to the seafloor, the patchwork structure of the solid Earth underneath answers with a thrum all its own."We understand how earthquakes create Love waves, but we've never exactly figured out how ocean waves create them," said ambient seismic noise expert Keith Koper, a professor of geology and geophysics and director of seismograph stations at the University of Utah, who was not involved with the study. "This is a little embarrassing because ocean-generated Love waves have been observed for over 50 years." The paper led by Gualtieri, he said, "provides conclusive evidence" for how ocean waves generate this particular kind of vibration in the Earth.Using the Summit supercomputer at Oak Ridge National Laboratory, the researchers simulated the complex interactions that occur between storms, ocean waves and the solid Earth over three-hour periods. Accurate down to four seconds, each simulation included 230,400 pressure sources scattered across the entire globe. "We're using the computer as a lab, to let seismic waves propagate from realistic sources all over the world's oceans based on known physics about how and where seismic waves are generated by ocean storms, as well as how they move through the Earth," Gualtieri said.One version of the model Earth represented the planet as a simplistic stratified world, where properties vary only with depth, like a layer cake. The other, more true-to-life model captured more of the three-dimensional variation in its underground terrain, like a chocolate chip cookie. For each version, the researchers switched underwater depth data on and off to test whether seafloor features like canyons, ravines and mountains -- as opposed to the deeper structure -- could produce Love waves.The results show that Love waves are poorly generated in the layer-cake-like, one-dimensional Earth. Given about 30 minutes and a rumbling ocean, however, Love waves emanated from below the seafloor in the three-dimensional model. When Rayleigh waves and other seismic waves generated by ocean storms encounter hotter or cooler zones and different materials in their lateral journey through Earth, the study suggests their energy scatters and refocuses. In the process, a portion of the wavefield converts to Love waves. "If you apply those pressure sources from interfering ocean waves and you wait, the Earth will give you the entire wavefield," Gualtieri said. "It's the Earth itself that will generate the Love waves."According to Gualtieri, better understanding of how these vibrations arise and propagate through Earth could help to fill in gaps in knowledge of not only our planet's interior but also its changing climate. Analog seismic recordings date back to before the satellite era, and high-quality digital data has been logged for several decades."This database holds information about environmental processes, and it's virtually untapped," she said.Computational resources were provided by the U.S. Department of Energy's Oak Ridge Leadership Computing Facility and the Princeton Institute for Computational Science & Engineering (PICSciE).
Earthquakes
2020
November 12, 2020
https://www.sciencedaily.com/releases/2020/11/201112123827.htm
Landslide along Alaskan fjord could trigger tsunami
A glacier that had held an Alaskan slope in place for centuries is melting, releasing the soil beneath in what can be described as a slow-motion landslide, researchers say. But there's also the possibility of a real landslide that could cause a devastating tsunami.
In a study published last week, scientists noted that the slope on Barry Arm fjord on Prince William Sound in southeastern Alaska slid some 120 meters from 2010 to 2017. These are some of the first measurements to quantify how the slope is falling there. "We are measuring this loss of land before the tsunami occurs," said Chunli Dai, lead author of the paper and a research scientist at The Ohio State University's Byrd Polar and Climate Research Center. Landslides on slopes near glaciers generally occur when glacial ice melts, a phenomenon occurring more rapidly around the world because of climate change. Landslides can prompt tsunamis by sending massive amounts of dirt and rocks into nearby bodies of water. Such a landslide happened in 2017 in western Greenland, prompting a tsunami that killed four people. Scientists estimate that a landslide at Barry Arm fjord could be about eight times larger than that Greenland landslide. If the entire slope collapsed at once, the researchers found, tsunami waves could reach communities throughout the sound, which are home to hundreds of people and visitors, including fishermen, tourists and members of an indigenous Alaskan group called the Chugach. For this study, researchers used satellite data to measure and monitor the size of the glacier that had covered the Barry Arm slope, and to measure the amount of land that had already been displaced, which they found to be directly linked to the Barry Arm glacier's melting. Then, they built models to identify the potential landslide risk. The data showed that, from 1954 to 2006, Barry Glacier thinned by less than a meter per year. But after 2006, the melt rapidly increased, so that the glacier was thinning by about 40 meters per year. The glacier retreated rapidly from 2010 to 2017, the researchers found. The land's "toe" -- the bottom point of the falling slope -- had butted against the glacier in 2010. By 2017, that toe was exposed, and butted up against the water in Prince William Sound. The researchers modeled potential tsunami scenarios, and found that, if the land along that slope collapsed at once, the resulting tsunami would generate currents of between 25 and 40 meters per second -- enough to cause significant damage to large cruise and cargo ships and fishing boats, as well as overwhelming kayakers, all of which frequent Prince William Sound. Waves could reach 10 meters in the nearby town of Whittier. The tsunami could disrupt fiber optic service to parts of Alaska, the researchers noted -- two of the five submarine fiber optic lines to Alaska run below Prince William Sound. And oil from the 1989 Exxon Valdez oil spill still lingers in sediment in Prince William Sound, meaning it is possible that a tsunami could send that oil back into the environment. "If the slope fails at once, it would be catastrophic," said Dr. Bretwood Higman, a geologist with Ground Truth Alaska and co-author of the study. When and if that massive landslide occurs depends on geology, climate and luck. An earthquake, prolonged rains, thawing permafrost or snowmelt could trigger one, the researchers said. (A 2018 earthquake in Alaska did not trigger a landslide, the researchers noted.) "People are working on early-detection warnings, so if a landslide happens, people in nearby communities might at least get a warning," said Anna Liljedahl, an Alaska-based hydrologist with Woodwell Climate Research Center, and another co-author. "This kind of research might help with building those early-warning systems."
Earthquakes
2020
November 12, 2020
https://www.sciencedaily.com/releases/2020/11/201112100855.htm
The connectivity of multicomponent fluids in subduction zones
A team of researchers has discovered more about the grain-scale fluid connectivity beneath the earth's surface, shedding new light on fluid circulation and seismic velocity anomalies in subduction zones.
Lithospheric plates collide at convergent boundaries. Here, the denser oceanic lithosphere subducts below the continental plate and releases an abundance of water through progressive metamorphic reactions at high pressure and high temperature. The released water can infiltrate the mantle wedge, which lies between the subducting oceanic lithosphere and the continental crust. Fluids that circulate in subduction zones have a significant effect on magma genesis, global material exchange between the Earth's interior and surface, and seismicity. The dihedral angle (θ) -- the angle between two intersecting planes -- holds the key to revealing the fluid connectivity and migration regime for a fluid-bearing, deep-seated rock in the Earth's interior known as pyrolite -- a rock mainly composed of olivine. Subduction-zone fluids are not pure water: they also carry dissolved salt (NaCl) and non-polarized gases such as CO2, and the competing effects of NaCl and CO2 on grain-scale fluid connectivity had remained unclear. To clarify those effects, doctoral student Yongsheng Huang, professor Michihiko Nakamura, and postdoctoral researcher Takayuki Nakatani from Tohoku University worked alongside professor Catherine McCammon from the University of Bayreuth. The research team sought to constrain θ in olivine + H2O fluid systems containing NaCl and CO2, and ran additional experiments on olivine-magnesite + H2O systems. The contrasting effects of aqueous fluid and silicate melt on the seismic wave velocity may allow for mapping partial melt in the mantle wedge.
Earthquakes
2020
November 4, 2020
https://www.sciencedaily.com/releases/2020/11/201104121438.htm
Smaller earthquakes with 'ambition' produce the most ground shaking
An earthquake of magnitude 8.0 or larger will almost always cause strong shaking, but a new study suggests that smaller earthquakes -- those around magnitude 5.5 or so -- are the cause of most occurrences of strong shaking at a 60-kilometer (37-mile) distance.
Small earthquakes are expected to produce relatively weak shaking, and for the most part that's true, said Sarah Minson of the U.S. Geological Survey. However, ground motion is highly variable, and there are always outlier earthquakes at every size that generate more shaking than expected. Combine that with the fact that there are more smaller magnitude earthquakes than large magnitude earthquakes, and most shaking comes from these "little earthquakes with ambition," Minson and her colleagues report. The researchers found that for all distances and for all levels of shaking, "the earthquakes that cause that level of shaking are systematically smaller magnitude than the earthquakes that should cause that level of shaking," said Minson, noting that this makes these smaller earthquakes a significant source of earthquake damage. The findings could change how people think about and prepare for the "Big One," the large magnitude earthquakes that loom large in the imaginations of people from California to Chile, said Minson. A future magnitude 8.0 San Andreas Fault earthquake will cause more total damage across the Los Angeles Basin than a smaller, local earthquake like the 1933 magnitude 6.4 Long Beach earthquake, simply because the larger earthquake causes shaking over a wider area. But that just means that there will be more overall shaking, not that the shaking will necessarily be stronger in any particular locality, she explained. While waiting for the Big One, places like Long Beach are likely to have multiple damaging medium earthquakes, "and thus most damage at any location is probably coming from smaller earthquakes with ambition," Minson added. The 1969 Santa Rosa, California earthquakes, around magnitude 6, caused about $50 million in damage in today's dollars, while the magnitude 5.7 Magna, Utah earthquake earlier this year caused similar amounts of damage just to 100 government buildings, the researchers noted. "For a lot of us, if we do look back over our personal experiences, the earthquake that we had the greatest amount of damage from is not the largest magnitude earthquake that we've felt at all," Minson said. It's a "sharks versus cows" concept, she added. "Sharks are scary, and cows are not, but cows kill more people every year than sharks do." The researchers began with calculations of the variation in expected ground acceleration from an earthquake of a certain magnitude and distance away from the shaking, along with the well-known Gutenberg-Richter magnitude-frequency relationship. The relationship demonstrates how the frequency of earthquakes decreases as the magnitude grows, so that for each magnitude 8 earthquake that occurs within a given region and time period, there will be 10 magnitude 7 earthquakes, 100 magnitude 6 earthquakes, and so on. Together, these two factors suggest that most shaking should come from smaller earthquakes that are "ambitious outliers" in terms of the amount of ground acceleration they cause. "The probability of any of these small earthquakes producing shaking is tiny, but there are many of them," Minson said. Minson and colleagues confirmed this hypothesis after examining three data sets of earthquakes from across the globe, ranging from magnitude 0.5 to 8.3. Ambitious little earthquakes may cause difficulty for some earthquake early warning systems, which alert users to potential damaging shaking after an earthquake begins, the researchers write.
The closer users are to the earthquake source, the less likely it is that the alert arrives before they feel the shaking. "If it turns out that most of our shaking is coming from smaller magnitude earthquakes, well, smaller magnitude earthquakes are spatially more compact," said Minson, who noted that some systems also may not send out an alert at all for small earthquakes that aren't expected to produce damaging shaking. The findings do not change the total amount of earthquake hazard calculated for a region, Minson stressed. "All we did is say, ok, when that shaking comes, what is it likely to come as? And it's much more likely to come as little earthquakes with ambition than a big earthquake doing what big earthquakes do." This means, as Minson's USGS co-author Sara McBride says, that "it's time to talk about the medium ones." Surveys and studies show that people are often demotivated by efforts to prepare for the Big One, overwhelmed by fatalism in the face of such an event. Focusing on smaller but significant events could encourage people to devote more time and effort to earthquake preparedness, the researchers suggest. "If we talk about earthquakes like Loma Prieta and Northridge, and ask people to be prepared for that, it's more tractable," Minson said. "Those are the earthquakes that people have experienced and know how to prepare for and survive."
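The Gutenberg-Richter magnitude-frequency relationship quoted above (roughly ten times more earthquakes for each unit drop in magnitude) is usually written as log10 N = a − b·M. The short sketch below uses placeholder a and b values rather than any region's fitted constants; it only illustrates how quickly the counts of smaller events pile up.

```python
import numpy as np

def gutenberg_richter_counts(magnitudes, a=8.0, b=1.0):
    """Expected number of events at or above each magnitude: log10 N = a - b*M.

    a and b are placeholder constants; b ~ 1 reproduces the tenfold increase
    per unit magnitude described in the article.
    """
    return 10.0 ** (a - b * np.asarray(magnitudes))

mags = np.array([5.5, 6.0, 7.0, 8.0])
for m, n in zip(mags, gutenberg_richter_counts(mags)):
    print(f"M >= {m}: ~{n:,.0f} events")
# With b = 1, every M8 is accompanied by ~10 M7s, ~100 M6s and ~316 M5.5s,
# which is why rare strong shaking from 'ambitious' small quakes adds up.
```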
Earthquakes
2,020
October 30, 2020
https://www.sciencedaily.com/releases/2020/10/201030111823.htm
New fault zone measurements could help us to understand subduction earthquakes
A research team from the University of Tsukuba has conducted detailed structural analyses of a fault zone located in central Japan, with the aim to help identify the specific conditions that lead to earthquake faulting, a hazard that can cause enormous social damage. Subduction is a geological process that takes place in areas where two tectonic plates meet, such as the Japan Trench, in which one plate moves under another and is forced to sink.
Regions in which this process occurs are known as subduction zones and the seismic activity that they produce causes devastating damage through ground shaking and tsunamis. However, understanding these seismic processes can be difficult because of the problems associated with taking measurements from their deepest sections, where much of the activity occurs."To overcome this problem, we examined fault rocks exhumed from source depths of subduction earthquakes, which are now exposed at the land surface at the Jurassic accretionary complex in central Japan," explains study lead author Professor Kohtaro Ujiie. "At this complex, we were able to examine pseudotachylyte, a solidified frictional melt produced during subduction earthquakes, to help us to infer what may occur in the subduction zones deep beneath the oceans."The exposed fault zone was characterized through a range of measurements such as scanning electron microscope and Raman spectroscopy to provide a detailed picture of the pseudotachylytes and make some constraints about the heating conditions at the time of formation."The pseudotachylyte at the site derived from the frictional melting of black carbonaceous mudstone together with chert, which accumulated under low-oxygen conditions," says Ujiie. "Thermal fracturing tends to occur along slip zones flanked by rocks with high thermal diffusivities such as chert, and may happen during seismic slip within the Jurassic accretionary complex. This thermal fracturing could lead to a fluid pressure drop in the slip zone and reduction in stiffness of surrounding rocks, potentially contributing to the generation of frictional melt and acceleration of seismic slip."The seismic slip processes recorded in the studied complex may be applicable to other fault zones with similar rock layers, such as the Japan Trench subduction zone. Therefore, the data gathered from this area could be useful in future attempts to describe or model the subduction earthquakes that lead to ground shaking and tsunami risk.
Earthquakes
2020
October 26, 2020
https://www.sciencedaily.com/releases/2020/10/201026153938.htm
Ancient lake contributed to past San Andreas fault ruptures
The San Andreas fault, which runs along the western coast of North America and crosses dense population centers like Los Angeles, California, is one of the most-studied faults in North America because of its significant hazard risk. Based on its roughly 150-year recurrence interval for magnitude 7.5 earthquakes and the fact that it's been over 300 years since that's happened, the southern San Andreas fault has long been called "overdue" for such an earthquake. For decades, geologists have been wondering why it has been so long since a major rupture has occurred. Now, some geophysicists think the "earthquake drought" could be partially explained by lakes -- or a lack thereof.
Today, at the Geological Society of America's 2020 Annual Meeting, Ph.D. student Ryley Hill will present new work using geophysical modeling to quantify how the presence of a large lake overlying the fault could have affected rupture timing on the southern San Andreas in the past. Hundreds of years ago, a giant lake -- Lake Cahuilla -- in southern California and northern Mexico covered swathes of the Mexicali, Imperial, and Coachella Valleys, through which the southern San Andreas cuts. The lake served as a key point for multiple Native American populations in the area, as evidenced by archaeological remains of fish traps and campsites. It has been slowly drying out since its most recent high water mark (between 1000 and 1500 CE). If the lake over the San Andreas has dried up and the weight of its water was removed, could that help explain why the San Andreas fault is in an earthquake drought?Some researchers have already found a correlation between high water levels on Lake Cahuilla and fault ruptures by studying a 1,000-year record of earthquakes, written in disrupted layers of soils that are exposed in deeply dug trenches in the Coachella Valley. Hill's research builds on an existing body of modeling but expands to incorporate this unique 1,000-year record and focuses on improving one key factor: the complexity of water pressures in rocks under the lake.Hill is exploring the effects of a lake on a fault's rupture timing, known as lake loading. Lake loading on a fault is the cumulative effect of two forces: the weight of the lake's water and the way in which that water creeps, or diffuses, into the ground under the lake. The weight of the lake's water pressing down on the ground increases the stress put on the rocks underneath it, weakening them -- including any faults that are present. The deeper the lake, the more stress those rocks are under, and the more likely the fault is to slip.What's more complicated is how the pressure of water in empty spaces in soils and bedrock (porewater) changes over both time and space. "It's not that [water] lubricates the fault," Hill explains. It's more about one force balancing another, making it easier or harder for the fault to give way. "Imagine your hands stuck together, pressing in. If you try to slip them side by side, they don't want to slip very easily. But if you imagine water between them, there's a pressure that pushes [your hands] out -- that's basically reducing the stress [on your hands], and they slip really easily." Together, these two forces create an overall amount of stress on the fault. Once that stress builds up to a critical threshold, the fault ruptures, and Los Angeles experiences "the Big One."Where previous modeling work focused on a fully drained state, with all of the lake water having diffused straight down (and at a single time), Hill's model is more complex, incorporating different levels of porewater pressure in the sediments and rocks underneath the lake and allowing pore pressures to be directly affected by the stresses from the water mass. That, in turn, affects the overall fault behavior.While the work is ongoing, Hill says they've found two key responses. When lake water is at its highest, it increases the stresses enough to push the timeline for the fault reaching that critical stress point just over 25% sooner. "The lake could modulate this [fault slip] rate just a little bit," Hill says. 
"That's what we think maybe tipped the scales to cause the [fault] failure."The overall effect of Lake Cahuilla drying up makes it harder for a fault to rupture in his model, pointing to its potential relevance for the recent quiet on the fault. But, Hill stresses, this influence pales in comparison to continent-scale tectonic forces. "As pore pressures decrease, technically, the bedrock gets stronger," he says. "But how strong it's getting is all relevant to tectonically driven slip rates. They're much, much stronger."
Earthquakes
2020
October 26, 2020
https://www.sciencedaily.com/releases/2020/10/201026135802.htm
Bridges with limb-inspired architecture can withstand earthquakes, cut repair costs
Structural damage to any of the nation's ailing bridges can come with a hefty price of billions of dollars in repairs. New bridge designs promise more damage-resistant structures and, consequently, lower restoration costs. But if these designs haven't been implemented in the real world, predicting how they can be damaged and what repair strategies should be implemented remain unresolved.
In a newly published study, researchers set out to answer those questions for a novel bridge design. "Bridges, particularly those in high-seismic regions, are vulnerable to damage and will need repairs at some point. But now the question is what kind of repairs should be used for different types and levels of damage, what will be the cost of these repairs and how long will the repairs take -- these are all unknowns for new bridge designs," said Dr. Petros Sideris, assistant professor in the Zachry Department of Civil and Environmental Engineering. "We have answered these questions for a novel bridge design using an approach that is seldomly used in structural engineering." Most bridges are monolithic systems made of concrete poured over forms that give the bridges their shape. These bridges are strong enough to support their own weight and other loads, such as traffic. However, Sideris said if there is an unexpected occurrence of seismic activity, these structures could crack, and remedying the damage would be exorbitantly expensive. To overcome these shortcomings, Sideris and his team have developed a new design called a hybrid sliding-rocking bridge. Instead of a monolithic design, these bridges are made of columns containing limb-inspired joints and segments. Hence, in the event of an earthquake, the joints allow some of the energy from the ground motion to diffuse while the segments move slightly, sliding over one another rather than bending or cracking. Despite the overall appeal of the hybrid sliding-rocking bridge design, little is known about how the bridges will behave in real-world situations. "To find the correct repair strategy, we need to know what the damages look like," said Sideris. "Our bridge design is relatively new and so there is little scientific literature that we could refer to. And so, we took an unconventional approach to fill our gap in knowledge by recruiting a panel of experts in bridge damage and repair." For their study, Sideris, Dr. Abbie Liel, professor at the University of Colorado, Boulder, and their team recruited a panel of eight experts from industry and academia to determine the damage states in experimentally tested hybrid sliding-rocking segment-designed columns. Based on their evaluations of the observed damage, the panel provided repair strategies and estimated costs for repair. The researchers then used that information to fix the broken columns, retested the columns under the same initial damage-causing conditions and compared the repaired column's behavior to that of the original column through computational investigations. The panel found that columns built with their design sustained less damage overall compared to bridges built with conventional designs. In fact, the columns showed very little damage even when subjected to motions reminiscent of a powerful once-in-a-few-thousand-years earthquake. Furthermore, the damage could be repaired relatively quickly with grout and carbon fibers, suggesting that no special strategy was required for restoration. "Fixing bridges is a slow process and costs a significant amount of money, which then indirectly affects the community," said Sideris. "Novel bridge designs that may have a bigger initial cost for construction can be more beneficial in the long run because they are sturdier. The money saved can then be used for helping the community rather than repairing infrastructure."
Earthquakes
2020
October 23, 2020
https://www.sciencedaily.com/releases/2020/10/201023095852.htm
A new technique predicts how earthquakes would affect a city's hospitals
In an increasingly urbanized world, population density often leads to more deaths and injuries when floods, typhoons, landslides and other disasters strike cities.
But the risks to life and limb are compounded when earthquakes are the agent of destruction, because they not only kill and maim but can also cripple the hospitals needed to treat survivors. Now, an international research team led by the Stanford Blume Center for Earthquake Engineering has developed a methodology to help disaster preparedness officials in large cities make contingency plans on a region-wide basis to make sure that emergency responders can get patients to the hospital facilities that are likeliest to remain in commission after a quake. "Previously, most hospital preparedness plans could only look at smaller areas because they focused on single hospitals," said Anne Kiremidjian, professor of civil and environmental engineering at Stanford and a co-author, with colleague Greg Deierlein, of the newly published paper. Regional response to quakes is not completely new in quake-prone California, where after the 1994 Northridge earthquake the Los Angeles County Emergency Medical Services Agency used shortwave and ham radios to coordinate the movement of patients among 76 hospitals in the damage zone. To assure hospital survivability, the California state legislature has mandated that all acute care hospitals be brought up to current seismic standards by 2030. "We need to ensure that hospitals remain operational to treat patients and avoid greater loss of life," Deierlein said. The new research provides disaster response officials in seismically active countries like Turkey, Chile, Indonesia or Peru with an effective but relatively simple way to create regional contingency plans: Start by using statistical risk analysis models to estimate where deaths and injuries are likeliest to occur in populous metropolitan areas; apply building-specific performance assessment techniques to project how much damage different hospitals might suffer; and map out the best routes between hospitals should the need arise to move injured patients to less damaged facilities with available capacity. The new regional planning methodology comes at a time when the world is awakening to the consequences of population growth and dense urbanization. When the researchers looked at 21,000 disasters that have occurred worldwide since 1900, half of those with the largest injury totals occurred during the last 20 years. For example, the 7.6 magnitude earthquake that struck Izmit, Turkey, caused approximately 50,000 injuries and disrupted 10 major hospitals. Luis Ceferino, who coordinated the research as a PhD candidate in civil engineering at Stanford, said the paper focused on what happened in 2007, after an 8.0 magnitude earthquake struck the city of Pisco, about 150 miles from Lima, Peru. Pisco lost more than half its total number of hospital beds in a few minutes. Ceferino also worked with two other experts in hospital responses, professor Celso Bambarén from Universidad Peruana Cayetano Heredia in Peru and Judith Mitrani-Reiser of the U.S. National Institute of Standards and Technology, who gathered post-quake damage assessments from the Pisco temblor and the 8.8 magnitude temblor that occurred in 2010 near Maule on the central coast of Chile, data that also helps to inform the new methodology. "Hospital systems are at the core of disaster resilience," said Ceferino, who will become an assistant professor of civil and urban engineering at New York University in 2021. "Cities need regional contingency plans to ensure that hospitals, doctors and medical teams are ready to care for our most vulnerable populations."
Earthquakes
2020
October 22, 2020
https://www.sciencedaily.com/releases/2020/10/201022143939.htm
AI detects hidden earthquakes
Measures of Earth's vibrations zigged and zagged across Mostafa Mousavi's screen one morning in Memphis, Tenn. As part of his PhD studies in geophysics, he sat scanning earthquake signals recorded the night before, verifying that decades-old algorithms had detected true earthquakes rather than tremors generated by ordinary things like crashing waves, passing trucks or stomping football fans.
"I did all this tedious work for six months, looking at continuous data," Mousavi, now a research scientist at Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth), recalled recently. "That was the point I thought, 'There has to be a much better way to do this stuff.'"This was in 2013. Handheld smartphones were already loaded with algorithms that could break down speech into sound waves and come up with the most likely words in those patterns. Using artificial intelligence, they could even learn from past recordings to become more accurate over time.Seismic waves and sound waves aren't so different. One moves through rock and fluid, the other through air. Yet while machine learning had transformed the way personal computers process and interact with voice and sound, the algorithms used to detect earthquakes in streams of seismic data have hardly changed since the 1980s.That has left a lot of earthquakes undetected.Big quakes are hard to miss, but they're rare. Meanwhile, imperceptibly small quakes happen all the time. Occurring on the same faults as bigger earthquakes -- and involving the same physics and the same mechanisms -- these "microquakes" represent a cache of untapped information about how earthquakes evolve -- but only if scientists can find them.In a recent paper published in Mousavi began working on technology to automate earthquake detection soon after his stint examining daily seismograms in Memphis, but his models struggled to tune out the noise inherent to seismic data. A few years later, after joining Beroza's lab at Stanford in 2017, he started to think about how to solve this problem using machine learning.The group has produced a series of increasingly powerful detectors. A 2018 model called PhaseNet, developed by Beroza and graduate student Weiqiang Zhu, adapted algorithms from medical image processing to excel at phase-picking, which involves identifying the precise start of two different types of seismic waves. Another machine learning model, released in 2019 and dubbed CRED, was inspired by voice-trigger algorithms in virtual assistant systems and proved effective at detection. Both models learned the fundamental patterns of earthquake sequences from a relatively small set of seismograms recorded only in northern California.In the According to Mousavi, the model builds on PhaseNet and CRED, and "embeds those insights I got from the time I was doing all of this manually." Specifically, Earthquake Transformer mimics the way human analysts look at the set of wiggles as a whole and then hone in on a small section of interest.People do this intuitively in daily life -- tuning out less important details to focus more intently on what matters. Computer scientists call it an "attention mechanism" and frequently use it to improve text translations. But it's new to the field of automated earthquake detection, Mousavi said. 
"I envision that this new generation of detectors and phase-pickers will be the norm for earthquake monitoring within the next year or two," he said.The technology could allow analysts to focus on extracting insights from a more complete catalog of earthquakes, freeing up their time to think more about what the pattern of earthquakes means, said Beroza, the Wayne Loel Professor of Earth Science at Stanford Earth.Understanding patterns in the accumulation of small tremors over decades or centuries could be key to minimizing surprises -- and damage -- when a larger quake strikes.The 1989 Loma Prieta quake ranks as one of the most destructive earthquake disasters in U.S. history, and as one of the largest to hit northern California in the past century. It's a distinction that speaks less to extraordinary power in the case of Loma Prieta than to gaps in earthquake preparedness, hazard mapping and building codes -- and to the extreme rarity of large earthquakes.Only about one in five of the approximately 500,000 earthquakes detected globally by seismic sensors every year produce shaking strong enough for people to notice. In a typical year, perhaps 100 quakes will cause damage.In the late 1980s, computers were already at work analyzing digitally recorded seismic data, and they determined the occurrence and location of earthquakes like Loma Prieta within minutes. Limitations in both the computers and the waveform data, however, left many small earthquakes undetected and many larger earthquakes only partially measured.After the harsh lesson of Loma Prieta, many California communities have come to rely on maps showing fault zones and the areas where quakes are likely to do the most damage. Fleshing out the record of past earthquakes with Earthquake Transformer and other tools could make those maps more accurate and help to reveal faults that might otherwise come to light only in the wake of destruction from a larger quake, as happened with Loma Prieta in 1989, and with the magnitude-6.7 Northridge earthquake in Los Angeles five years later."The more information we can get on the deep, three-dimensional fault structure through improved monitoring of small earthquakes, the better we can anticipate earthquakes that lurk in the future," Beroza said.To determine an earthquake's location and magnitude, existing algorithms and human experts alike look for the arrival time of two types of waves. The first set, known as primary or P waves, advance quickly -- pushing, pulling and compressing the ground like a Slinky as they move through it. Next come shear or S waves, which travel more slowly but can be more destructive as they move the Earth side to side or up and down.To test Earthquake Transformer, the team wanted to see how it worked with earthquakes not included in training data that are used to teach the algorithms what a true earthquake and its seismic phases look like. The training data included one million hand-labeled seismograms recorded mostly over the past two decades where earthquakes happen globally, excluding Japan. For the test, they selected five weeks of continuous data recorded in the region of Japan shaken 20 years ago by the magnitude-6.6 Tottori earthquake and its aftershocks.The model detected and located 21,092 events -- more than two and a half times the number of earthquakes picked out by hand, using data from only 18 of the 57 stations that Japanese scientists originally used to study the sequence. 
Earthquake Transformer proved particularly effective for the tiny earthquakes that are harder for humans to pick out and being recorded in overwhelming numbers as seismic sensors multiply."Previously, people had designed algorithms to say, find the P wave. That's a relatively simple problem," explained co-author William Ellsworth, a research professor in geophysics at Stanford. Pinpointing the start of the S wave is more difficult, he said, because it emerges from the erratic last gasps of the fast-moving P waves. Other algorithms have been able to produce extremely detailed earthquake catalogs, including huge numbers of small earthquakes missed by analysts -- but their pattern-matching algorithms work only in the region supplying the training data.With Earthquake Transformer running on a simple computer, analysis that would ordinarily take months of expert labor was completed within 20 minutes. That speed is made possible by algorithms that search for the existence of an earthquake and the timing of the seismic phases in tandem, using information gleaned from each search to narrow down the solution for the others."Earthquake Transformer gets many more earthquakes than other methods, whether it's people sitting and trying to analyze things by looking at the waveforms, or older computer methods," Ellsworth said. "We're getting a much deeper look at the earthquake process, and we're doing it more efficiently and accurately."The researchers trained and tested Earthquake Transformer on historic data, but the technology is ready to flag tiny earthquakes almost as soon as they happen. According to Beroza, "Earthquake monitoring using machine learning in near real-time is coming very soon."
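The "attention mechanism" mentioned earlier in this article is, at its core, a weighted-averaging step in which a network learns which parts of a signal to focus on. The sketch below shows generic scaled dot-product self-attention applied to a toy waveform embedding; it is only an illustration of the concept, not Earthquake Transformer's actual architecture or code.

```python
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Generic scaled dot-product attention, the building block behind
    'attention mechanisms' (not the published model's network).

    Each output row is a weighted average of `values`, with weights that
    concentrate on the time steps most relevant to each query.
    """
    d_k = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d_k)            # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # softmax over time steps
    return weights @ values, weights

rng = np.random.default_rng(0)
t_steps, d_model = 6, 8                      # a toy 6-sample 'waveform' embedding
x = rng.normal(size=(t_steps, d_model))
out, attn = scaled_dot_product_attention(x, x, x)   # self-attention
print(attn.round(2))   # each row sums to 1; large entries mark the samples being focused on
```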
Earthquakes
2020
October 21, 2020
https://www.sciencedaily.com/releases/2020/10/201021112348.htm
Deep magma facilitates the movement of tectonic plates
A small amount of molten rock located under tectonic plates encourages them to move. This is what scientists from the Laboratoire de géologie de Lyon: Terre, planètes et environnement (CNRS/ENS de Lyon/Université Claude Bernard Lyon 1) have recently discovered. Their new model takes into account not only the velocity of seismic waves but also the way in which they are attenuated by the medium they pass through. The velocity of tectonic plates near the surface is thus directly correlated with the quantity of magma present. This research was published on October 21, 2020.
The lithosphere, the outer part of the Earth, is made up of the crust and part of the upper mantle. It is subdivided into rigid plates, known as tectonic or lithospheric plates. These move on a more fluid layer of the mantle, the asthenosphere. The lower viscosity of the asthenosphere allows the tectonic plates to move around on the underlying mantle, but until today the origin of this low viscosity remained unknown.Seismic tomography produces three-dimensional images of the Earth's interior by analysing millions of seismic waves recorded at seismological stations spread across the surface of the globe. Since the 1970s, seismologists have analysed these waves with a view to identifying a single parameter: their propagation speed. This parameter varies with temperature (the colder the medium, the faster the waves arrive), composition, and the possible presence of molten rocks in the medium the waves pass through. Seismologists from the Laboratoire de géologie de Lyon: Terre, planètes et environnement (CNRS/ENS de Lyon/Université Claude Bernard Lyon 1) instead studied another parameter, wave attenuation, alongside the variation in wave propagation speeds. This analysis, which provides new information on the temperature of the medium traversed by the waves, makes it possible to ascertain the quantity of molten rock in the medium the waves pass through.Their new model made it possible, for the first time, to map the amount of molten rock under tectonic plates. This work reveals that a small amount of molten rock (less than 0.7% by volume) is present in the asthenosphere under the oceans, not only where this was expected, i.e. under ocean ridges and some volcanoes such as Tahiti, Hawaii or Reunion, but also under all oceanic plates. The low percentage of molten rock observed is enough to reduce the viscosity by one or two orders of magnitude underneath the tectonic plates, thus "decoupling" them from the underlying mantle. Moreover, the seismologists from Lyon observed that the amount of molten rock is higher under the fastest-moving plates, such as the Pacific plate. This suggests that the melting of the rocks encourages the plates to move and the deformation at their bases. This research improves our understanding of plate tectonics and how it works.
Earthquakes
2020
October 20, 2020
https://www.sciencedaily.com/releases/2020/10/201020150514.htm
New evidence for geologically recent earthquakes near Portland, Oregon metro area
A paleoseismic trench dug across the Gales Creek fault, located about 35 kilometers (roughly 22 miles) west of Portland, Oregon, documents evidence for three surface-rupturing earthquakes that took place about 8,800, 4,200 and 1,000 years ago.
The findings suggest the fault is capable of producing damaging earthquakes close to the Portland metropolitan area, the researchers noted. By comparison, the 1993 Scotts Mills earthquake about 50 kilometers (31 miles) south of Portland was a magnitude 5.7 earthquake, and caused damage totaling about $30 million. The region is part of the seismically active Cascadia subduction zone, where the Juan de Fuca tectonic plate bends beneath the North American plate. The Gales Creek fault lies within the Cascadia forearc, the land wedged between the oceanic trench where the Juan de Fuca begins its bend and the line of Cascadia volcanoes in Washington State and Oregon that are fueled by the subducting plate. "In general, little paleoseismic work has been done on forearc faults in Oregon, but many faults in the region are of interest based on their proximity to population centers," said Horst, a paleoseismologist formerly at Portland State University and now at the Washington State Department of Natural Resources. Mapping and analyzing faults in the Pacific Northwest can be difficult, since fault surface traces are often covered by urban development and thick forests, or are difficult to reach in mountainous areas. To learn more about possible recent seismic activity along these forearc faults, Horst and her colleagues dug a trench across the Gales Creek fault, which had been mapped previously and is being investigated by the U.S. Bureau of Reclamation and the U.S. Geological Survey along part of the fault that projected through Scoggins Dam in Oregon's Washington County. The Reclamation project had turned up evidence of surface deformation along the fault in sediments from the most recent geological time period, called the Holocene. After digging a trench across the fault -- first by hand and later by backhoe -- the researchers looked for evidence of past earthquakes in the rock layers, assigning an estimated date for each earthquake using radiocarbon analysis of charcoal contained in the layers. The trenching turned up strong evidence for at least three Holocene-age surface-rupturing earthquakes along the fault, with some weaker signs of one potential earthquake occurring after 1,000 years ago, and one earthquake occurring before 8,800 years ago. The researchers also estimated the magnitude of an earthquake that would rupture the entire mapped length of the Gales Creek fault, assuming that the full length ruptured at once and that the rupture event did not extend across multiple faults. "The linkage between rupture on the Gales Creek fault and neighboring faults is still unknown, as there are no other paleoseismic studies with earthquake ages for neighboring faults and as a result no indication of paleo-earthquakes with overlapping age estimates on neighboring faults," Horst explained. "Future work on faults in the region could allow us to improve our understanding of the connectivity of rupture on these low slip, long recurrence forearc faults." The findings suggest that other faults within the Oregon portion of the Cascadia forearc should be studied for signs of Holocene earthquakes, the researchers concluded.
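A common way to make the kind of magnitude estimate described above is an empirical rupture-length scaling relation such as Wells and Coppersmith (1994), M = 5.08 + 1.16·log10(L) for surface rupture length L in kilometers. The article does not say which relation the authors applied, and the lengths below are placeholders rather than the mapped length of the Gales Creek fault; the sketch only illustrates the approach.

```python
import math

def magnitude_from_rupture_length_km(surface_rupture_km, a=5.08, b=1.16):
    """Moment magnitude from surface rupture length via the widely cited
    Wells & Coppersmith (1994) all-fault-type regression, M = a + b*log10(L).

    Illustrative only: the study's chosen relation and fault length are not
    reproduced here, and real estimates carry substantial uncertainty.
    """
    return a + b * math.log10(surface_rupture_km)

for length_km in (20, 40, 60):   # placeholder rupture lengths
    print(f"{length_km:>3} km rupture -> ~M{magnitude_from_rupture_length_km(length_km):.1f}")
```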
Earthquakes
2020
October 20, 2020
https://www.sciencedaily.com/releases/2020/10/201020131403.htm
Lost and found: Geologists 'resurrect' missing tectonic plate
The existence of a tectonic plate called Resurrection has long been a topic of debate among geologists, with some arguing it was never real. Others say it subducted -- moved sideways and downward -- into the earth's mantle somewhere in the Pacific Margin between 40 and 60 million years ago.
A team of geologists at the University of Houston College of Natural Sciences and Mathematics believes they have found the lost plate in northern Canada by using existing mantle tomography images -- similar to a CT scan of the earth's interior. The findings, published in Geological Society of America Bulletin, could help geologists better predict volcanic hazards as well as mineral and hydrocarbon deposits. "Volcanoes form at plate boundaries, and the more plates you have, the more volcanoes you have," said Jonny Wu, assistant professor of geology in the Department of Earth and Atmospheric Sciences. "Volcanoes also affect climate change. So, when you are trying to model the earth and understand how climate has changed through time, you really want to know how many volcanoes there have been on earth." Wu and Spencer Fuston, a third-year geology doctoral student, applied a technique developed by the UH Center for Tectonics and Tomography called slab unfolding to reconstruct what tectonic plates in the Pacific Ocean looked like during the early Cenozoic Era. The rigid outermost shell of Earth, or lithosphere, is broken into tectonic plates, and geologists have always known there were two plates in the Pacific Ocean at that time, called Kula and Farallon. But there has been discussion about a potential third plate, Resurrection, having formed a special type of volcanic belt along Alaska and Washington State. "We believe we have direct evidence that the Resurrection plate existed. We are also trying to solve a debate and advocate for which side our data supports," Fuston said. Using 3D mapping technology, Fuston applied the slab unfolding technique to the mantle tomography images to pull out the subducted plates before unfolding and stretching them to their original shapes. "When 'raised' back to the earth's surface and reconstructed, the boundaries of this ancient Resurrection tectonic plate match well with the ancient volcanic belts in Washington State and Alaska, providing a much sought after link between the ancient Pacific Ocean and the North American geologic record," explained Wu. This study is funded by a five-year, $568,309 National Science Foundation CAREER Award led by Wu.
Earthquakes
2020
October 15, 2020
https://www.sciencedaily.com/releases/2020/10/201015134215.htm
Deep learning artificial intelligence keeps an eye on volcano movements
RADAR satellites can collect massive amounts of remote sensing data that can detect ground movements -- surface deformations -- at volcanoes in near real time. These ground movements could signal impending volcanic activity and unrest; however, clouds and other atmospheric and instrumental disturbances can introduce significant errors in those ground movement measurements.
Now, Penn State researchers have used artificial intelligence (AI) to clear up that noise, drastically facilitating and improving near real-time observation of volcanic movements and the detection of volcanic activity and unrest. "The shape of volcanoes is constantly changing and much of that change is due to underground magma movements in the magma plumbing system made of magma reservoirs and conduits," said Christelle Wauthier, associate professor of geosciences and Institute for Data and Computational Sciences (ICDS) faculty fellow. "Much of this movement is subtle and cannot be picked up by the naked eye." Geoscientists have used several methods to measure the ground changes around volcanoes and other areas of seismic activity, but all have limitations, said Jian Sun, lead author of the paper and a postdoctoral scholar in geosciences, funded by a Dean's Postdoc-Facilitated Innovation through Collaboration Award from the College of Earth and Mineral Sciences. He added that, for example, scientists can use ground stations, such as GPS or tiltmeters, to monitor possible ground movement due to volcanic activity. However, there are a few problems with these ground-based methods. First, the instruments can be expensive and need to be installed and maintained on site. "So, it's hard to put a lot of ground-based stations in a specific area in the first place, but, let's say there actually is a volcanic explosion or an earthquake, that would probably damage a lot of these very expensive instruments," said Sun. "Second, those instruments will only give you ground movement measurements at specific locations where they are installed, therefore those measurements will have a very limited spatial coverage." On the other hand, satellites and other forms of remote sensing can gather a lot of important data about volcanic activity for geoscientists. These devices are also, for the most part, out of harm's way from an eruption, and the satellite images offer very extended spatial coverage of ground movement. However, even this method has its drawbacks, according to Sun. "We can monitor the movement of the ground caused by earthquakes or volcanoes using RADAR remote sensors, but while we have access to a lot of remote sensing data, the RADAR waves must go through the atmosphere to get recorded at the sensor," he said. "And the propagation path will likely be affected by that atmosphere, especially if the climate is tropical with a lot of water vapor and clouds variations in time and space." Using this deep learning method, which the researchers describe in a recently published paper, scientists could gain valuable insights into the movement of the ground, particularly in areas with active volcanoes or earthquake zones and faults, said Sun. The program may be able to spot potential warning signs, such as sudden land shifts that might be a portent of an oncoming volcanic eruption, or earthquake. "It's really important for areas close to active volcanoes, or near where there have been earthquakes, to have as early warning as possible that something might happen," said Sun. Deep learning, as its name suggests, uses training data to teach the system to recognize features that the programmers want to study. In this case, the researchers trained the system with synthetic data that was similar to satellite surface deformation data.
The data included signals of volcanic deformation, both spatially and topographically correlated atmospheric features and errors in the estimation of satellite orbits.Future research will focus on refining and expanding our deep learning algorithm, according to Wauthier."We wish to be able to identify earthquake and fault movements as well as magmatic sources and include several underground sources generating surface deformation," she said. "We will apply this new groundbreaking method to other active volcanoes thanks to support from NASA."
Earthquakes
2020
October 13, 2020
https://www.sciencedaily.com/releases/2020/10/201013124143.htm
Magnitude comparison distinguishes small earthquakes from chemical explosions in US west
By comparing two magnitude measurements for seismic events recorded locally, researchers can tell whether the event was a small earthquake or a single-fire buried chemical explosion.
The findings were published in the Bulletin of the Seismological Society of America (BSSA). Seismologists use a variety of methods to distinguish earthquakes from explosions, such as analyzing the ratio of P waves (which compress rock in the same direction as a wave's movement) to S waves (which move rock perpendicular to the wave direction). However, methods like the P/S-wave ratio do not work as well for events of magnitude 3 or smaller, making it essential to develop other discrimination techniques, said University of Utah seismologist Keith Koper. Scientists have debated, for instance, whether a small seismic event that took place on 12 May 2010 in North Korea was a natural earthquake or an earthquake induced by a low-yield nuclear explosion. The new study looks at the difference between local magnitude (ML) and coda duration magnitude (MC) measurements. Local magnitude, sometimes referred to as Richter magnitude, estimates magnitude based on the maximum amplitude of seismic waves detected. Coda duration magnitude is based on the duration of a seismic wave train and the resulting length of the seismogram it produces. Koper and his students stumbled across the potential usefulness of this comparison in one of his graduate seminars about four years ago, as the students practiced programming and comparing different types of magnitudes. "It turned out that when you looked at these magnitude differences, there was a pattern," he said. "All these earthquakes in Utah that are associated with coal mining have a bigger coda magnitude, with seismograms longer than normal." Compared to naturally occurring earthquakes, seismic events caused by human activity tend to have a larger MC than ML, the researchers concluded in a 2016 paper. Very shallow seismic events have a larger MC than deeper buried events, they found, while noting that most human activities that would induce earthquakes take place at shallow depths in the crust, compared to the deeper origins of natural earthquakes. The findings suggested that the ML-MC difference could be useful in detecting nuclear explosions at a local level, but the multiple detonations in a coal mining operation, scattered in space and time, produce a different seismic signature than the compact single shot of a nuclear explosion. To further test the discrimination method, the researchers searched for "explosions that were better proxies, compact, and not your typical industrial explosions," Koper said. In the BSSA study, Koper and colleagues applied the ML-MC difference to three experiments in the U.S. West that recorded data on local networks from buried single-fire explosions as well as natural earthquakes: the 2010 Bighorn Arch Seismic Experiment (BASE) in northern Wyoming, the imaging Magma Under St. Helens (iMUSH) experiment in Washington State from 2014 to 2016, and the Phase I explosions of the Source Physics Experiment (SPE) in Nevada from 2011 to 2016. The method was able to successfully separate explosions from natural earthquakes in the data from all three sites, the researchers found, confirming that it would be potentially useful for identifying small underground nuclear explosions in places that are only covered by a local seismic network. Beyond explosion seismology, the method might also help identify and analyze other earthquakes that have shallow sources, including some earthquakes induced by human activities such as oil and gas recovery, Koper said.
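The ML-MC discriminant lends itself to a compact sketch. Both magnitude formulas below use generic, placeholder calibrations (a classic duration-magnitude form and a simplified amplitude-distance form), and the 0.5 decision threshold is invented for illustration; the study's actual calibrations and cutoff are not reproduced here.

```python
import math

def local_magnitude(max_amp_mm, distance_km, a=2.76, c=-2.48):
    """Simplified Richter-style local magnitude from peak amplitude:
    ML ~ log10(A) + a*log10(dist) + c. Coefficients are placeholders;
    real networks use calibrated distance-correction curves."""
    return math.log10(max_amp_mm) + a * math.log10(distance_km) + c

def coda_duration_magnitude(duration_s, distance_km, a=2.0, b=0.0035, c=-0.87):
    """Duration ('coda') magnitude, MC = a*log10(duration) + b*distance + c.
    These coefficients follow a classic regional calibration and are shown
    only to illustrate the idea; they are region-specific in practice."""
    return a * math.log10(duration_s) + b * distance_km + c

def looks_like_shallow_explosion(max_amp_mm, duration_s, distance_km, threshold=0.5):
    """Flag events whose coda magnitude exceeds local magnitude by more than a
    chosen threshold -- the ML-MC discriminant described in the article.
    The threshold here is a placeholder, not the study's calibrated value."""
    ml = local_magnitude(max_amp_mm, distance_km)
    mc = coda_duration_magnitude(duration_s, distance_km)
    return (mc - ml) > threshold, ml, mc

flagged, ml, mc = looks_like_shallow_explosion(max_amp_mm=0.8, duration_s=45.0, distance_km=30.0)
print(f"ML={ml:.2f}  MC={mc:.2f}  explosion-like: {flagged}")
```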
Earthquakes
2020
October 13, 2020
https://www.sciencedaily.com/releases/2020/10/201013101630.htm
Scientist gains fresh insight into the origins of earthquakes
Sometimes barely noticeable, and at other times devastating, earthquakes are a major geological phenomenon that provides a stark reminder that our planet is constantly evolving. Scientists have made significant progress in understanding these events over the past 50 years thanks to sensors set up around the world. And while we know that earthquakes are caused by shifts in tectonic plates, a lot remains to be learned about how and why they occur.
François Passelègue, a scientist at ENAC's Laboratory of Experimental Rock Mechanics (LEMR), has been studying the dynamics of faults -- or the areas between tectonic plates, where most earthquakes occur -- for the past ten years. He recently made a breakthrough in understanding the rupture mechanisms that eventually lead to seismic shifts along fault lines. His findings were recently published. "We know that rupture speeds can vary from a few millimeters per second to a few kilometers per second once nucleation occurs [the process by which a slip expands exponentially]. But we don't know why some ruptures propagate very slowly and others move quickly," says Passelègue. "However, that's important to know because the faster the propagation, the quicker the energy that accumulates along the fault is released." An earthquake will generally release the same amount of energy whether it moves slowly or quickly. The difference is that if it moves slowly, its seismic waves can be absorbed by the surrounding earth. These types of slow earthquakes are just as frequent as regular ones; it's just that we can't feel them. In extremely fast earthquakes -- which occur much less often -- the energy is released in just a few seconds through potentially devastating high-frequency waves. That's what sometimes occurs in Italy, for example. The country is located in a friction zone between two tectonic plates. While most of its earthquakes aren't (or are barely) noticeable, some of them can be deadly -- like the one on 24 August 2016 that left 298 people dead. In his study, Passelègue developed an experimental fault with the same temperature and pressure conditions as an actual fault running 8 km deep. He installed sensors along the fault to identify the factors causing slow vs. fast rupture propagation. "There are lots of hypotheses out there -- most scientists think it's related to the kind of rock. They believe that limestone and clay tend to result in slow propagation, whereas harder rocks like granite are conducive to fast propagation," he says. Passelègue's model uses a complex rock similar to granite. He was able to replicate various types of slip on his test device, and found that "the difference isn't necessarily due to the properties of the surrounding rock. A single fault can demonstrate all kinds of seismic mechanisms." Passelègue's experiments showed that the amount of energy released during a slip, and the length of time over which it's released, depend on the initial strain exerted along the fault; that is, the force applied on the fault line, generally from shifting tectonic plates. By applying forces of different magnitudes to his model, he found that higher strains triggered faster ruptures and lower strains triggered slower ruptures. "We believe that what we observed in the lab would apply under real-world conditions too," he says. Using the results of his model, Passelègue developed equations that factor in the initial strain on a fault and not just the amount of energy accumulated immediately before a slip, which was the approach used in other equations until now. "François is one of the first scientists to measure rupture speeds in rocks under the same temperature and pressure conditions that you find out in nature. He developed a way to model the mechanisms physically -- something that had never been done before. And he showed that all earthquakes follow the same laws of physics," says Marie Violay, head of LEMR. Passelègue warns that his model cannot be used to determine when or where an earthquake will occur.
Because faults run so deep, scientists still aren't able to continuously measure the strain on rock right along a fault. "We can identify how much strain there needs to be to cause a rupture, but since we don't know how much a fault is 'loaded up' with energy deep underground, we can't predict the rupture speed." One implication of Passelègue's research is that earthquakes may not be as random as we thought. "Most people think that faults that have been stable for a long time will never cause a serious earthquake. But we found that any kind of fault can trigger many different types of seismic events. That means a seemingly benign fault could suddenly rupture, resulting in a fast and dangerous wave propagation."
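The practical difference between slow and fast energy release can be illustrated with a simple spectral argument: if the same seismic moment is released over a much shorter time, far more of it ends up in the damaging high-frequency band. The sketch below is only an illustrative toy under assumed durations and moment, not Passelègue's equations or data.

```python
# Illustrative toy (not Passelègue's equations): two ruptures releasing the same
# seismic moment over very different durations, and how much of that release
# ends up at damaging high frequencies. Durations and moment are assumptions.
import numpy as np

def moment_rate(duration, total_moment, t):
    """Triangular moment-rate pulse with the given duration and total area."""
    peak = 2.0 * total_moment / duration          # triangle area equals total_moment
    return np.where(t < duration, peak * (1.0 - np.abs(2.0 * t / duration - 1.0)), 0.0)

t = np.linspace(0.0, 200.0, 200001)               # seconds; dt = 1 ms
dt = t[1] - t[0]
m0 = 1.0e18                                        # N*m, roughly a magnitude ~6 event

for label, duration in [("fast rupture (5 s)", 5.0), ("slow slip (100 s)", 100.0)]:
    rate = moment_rate(duration, m0, t)
    spectrum = np.abs(np.fft.rfft(rate)) * dt      # amplitude spectrum of moment rate
    freqs = np.fft.rfftfreq(t.size, dt)
    high_freq = spectrum[freqs >= 1.0].max()       # strongest component above 1 Hz
    print(f"{label}: moment released = {np.sum(rate) * dt:.2e} N*m, "
          f"peak spectral amplitude above 1 Hz = {high_freq:.2e}")
```

Run as written, the two cases release the same moment, but the short pulse carries orders of magnitude more amplitude above 1 Hz, which is the hedged intuition behind why fast ruptures are the dangerous ones.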
Earthquakes
2020
October 9, 2020
https://www.sciencedaily.com/releases/2020/10/201009162422.htm
'Universal law of touch' will enable new advances in virtual reality
Seismic waves, commonly associated with earthquakes, have been used by scientists to develop a universal scaling law for the sense of touch. A team, led by researchers at the University of Birmingham, used Rayleigh waves to create the first scaling law for touch sensitivity. The results are published in
The researchers are part of a European consortium (H-Reality) that is already using the theory to develop new Virtual Reality technologies that incorporate the sense of touch. Rayleigh waves are created by impact between objects and are commonly thought to travel only along surfaces. The team discovered that, when it comes to touch, the waves also travel through layers of skin and bone and are picked up by the body's touch receptor cells. Using mathematical modelling of these touch receptors, the researchers showed how the receptors were located at depths that allowed them to respond to Rayleigh waves. The interaction of these receptors with the Rayleigh waves will vary across species, but the ratio of receptor depth vs wavelength remains the same, enabling the universal law to be defined. The mathematics used by the researchers to develop the law is based on approaches first developed over a hundred years ago to model earthquakes. The law supports predictions made by the Nobel-Prize-winning physicist Georg von Békésy, who first suggested the mathematics of earthquakes could be used to explore connections between Rayleigh waves and touch. The team also found that the interaction of the waves and receptors remained even when the stiffness of the outermost layer of skin changed. The ability of the receptors to respond to Rayleigh waves remained unchanged despite the many variations in this outer layer caused by age, gender, profession, or even hydration. Dr Tom Montenegro-Johnson, of the University of Birmingham's School of Mathematics, led the research. He explains: "Touch is a primordial sense, as important to our ancient ancestors as it is to modern day mammals, but it's also one of the most complex and therefore least understood. While we have universal laws to explain sight and hearing, for example, this is the first time that we've been able to explain touch in this way." James Andrews, co-author of the study at the University of Birmingham, adds: "The principles we've defined enable us to better understand the different experiences of touch among a wide range of species. For example, if you indent the skin of a rhinoceros by 5mm, they would have the same sensation as a human with a similar indentation -- it's just that the forces required to produce the indentation would be different. This makes a lot of sense in evolutionary terms, since it's connected to relative danger and potential damage." The work was funded by the European Union's Horizon 2020 research and innovation programme, under collaborative project "H-Reality." The other institutions involved in the project are Ultraleap Ltd. (UK), Actronika (France), TU Delft (The Netherlands), and CNRS (France).
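The "universal law" described above boils down to a dimensionless quantity: the depth of the touch receptors divided by the Rayleigh wavelength in skin. The sketch below simply computes that ratio for illustrative numbers; the depths and wavelengths are placeholder assumptions, not values from the Birmingham study.

```python
# Hedged illustration of the quantity behind the reported scaling law: the ratio
# of touch-receptor depth to Rayleigh wavelength in skin. All depths and
# wavelengths below are placeholder assumptions chosen for illustration only --
# they are not values from the Birmingham study.
def depth_to_wavelength_ratio(receptor_depth_mm, rayleigh_wavelength_mm):
    return receptor_depth_mm / rayleigh_wavelength_mm

# Assumed (depth, wavelength) pairs: a larger animal has deeper receptors, but
# the relevant Rayleigh wavelength in its thicker skin is longer too.
species = {
    "human (assumed)":      (2.0, 20.0),
    "rhinoceros (assumed)": (5.0, 50.0),
}
for name, (depth_mm, wavelength_mm) in species.items():
    ratio = depth_to_wavelength_ratio(depth_mm, wavelength_mm)
    print(f"{name}: receptor depth / Rayleigh wavelength = {ratio:.2f}")
# The claim, loosely paraphrased: this dimensionless ratio is what stays roughly
# constant across species, which is what makes the law 'universal'.
```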
Earthquakes
2020
October 7, 2020
https://www.sciencedaily.com/releases/2020/10/201007123027.htm
Unusually shallow earthquake ruptures in Chinese fracking field
An unusually shallow earthquake triggered by hydraulic fracturing in a Chinese shale gas field could change how experts view the risks of fracking for faults that lie very near the Earth's surface.
In the journal The earthquake, along with two foreshocks with magnitudes larger than 4, appears to be related to activity at nearby hydraulic fracturing wells. Although earthquakes induced by human activity such as fracking are typically shallower than natural earthquakes, it is rare for any earthquake of this size to take place at such a shallow depth. "Earthquakes with much smaller magnitudes, for example magnitude 2, have been reported at such shallow depths. They are understood by having small scale fractures in such depths that can slip fast," said Yang. "However, the dimensions of earthquakes are scale-dependent. Magnitude 4 is way bigger than magnitude 2 in terms of rupture length and width, and thus needs a sizeable fault as the host." "The results here certainly changed our view in that a shallow fault can indeed slip seismically," he added. "Therefore, we should reconsider our strategies of evaluating seismic risk for shallow faults." Two people died and twelve were injured in the 25 February earthquake, and the economic loss due to the event has been estimated at 14 million RMB, or about $2 million. There have been few historic earthquakes in the region, and before 2019 there had been no earthquakes larger than magnitude 3 on the fault where the main earthquake took place. Since 2018, there have been at least 48 horizontal fracking wells drilled from 13 well pads in the region, with three well pads less than two kilometers (1.2 miles) from the Molin fault, where the main earthquake took place. Yang and his colleagues located the earthquakes and were able to calculate the length of the main rupture using local and regional seismic network data, as well as InSAR satellite data. It is unusual to see clear satellite data for a small earthquake like this, Yang said. "InSAR data are critical to determine the depth and accurate location of the mainshock, because the ground deformation was clearly captured by satellite images," he noted. "Given the relatively small size of the mainshock, it would not be able to cause deformation above the 'noise' level of satellite data if it were deeper than about two kilometers." The two foreshocks took place on a previously unmapped fault in the area, the researchers found, underscoring how difficult it can be to prevent fracking-induced earthquakes in an area where fault mapping is incomplete. The researchers note that the Molin fault is separated from the geologic formation where fracking took place by a layer of shale about 800 meters (2625 feet) thick. The separating layer sealed off the fault from fracking fluids, so it is unlikely that the pressures of fluid injected into rock pores around the fault caused the fault to slip. Instead, Yang and colleagues suggest that changes in elastic stress in rock may have triggered the main earthquake on the Molin fault, which was presumed to be stable. "The results here certainly pose a significant concern: we cannot ignore a shallow fault that was commonly thought to be aseismic," said Yang, who added that more public information on fracking injection volume, rate and duration could help calculate safe distances for well placement in the future.
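Yang's point that a magnitude 4 event needs a far larger fault patch than a magnitude 2 event can be made concrete with standard textbook scaling: convert magnitude to seismic moment, then estimate the radius of an equivalent circular rupture for an assumed stress drop. The sketch below uses those generic relations, not calculations from the Sichuan study.

```python
# Back-of-the-envelope sketch of why a magnitude 4 event needs a much larger
# fault patch than a magnitude 2 event. Uses the standard moment-magnitude
# relation and a circular-crack model with an assumed stress drop; these are
# textbook relations, not numbers from the study of the Sichuan sequence.
import math

def seismic_moment(mw):
    """Seismic moment in N*m from moment magnitude (Hanks-Kanamori relation)."""
    return 10.0 ** (1.5 * mw + 9.1)

def rupture_radius(moment, stress_drop_pa=3.0e6):
    """Radius of an equivalent circular rupture for an assumed stress drop."""
    return (7.0 * moment / (16.0 * stress_drop_pa)) ** (1.0 / 3.0)

for mw in (2.0, 4.0):
    m0 = seismic_moment(mw)
    radius = rupture_radius(m0)
    print(f"Mw {mw}: moment ~ {m0:.1e} N*m, rupture radius ~ {radius:.0f} m "
          f"(assuming a 3 MPa stress drop)")
```

Under these assumptions the rupture radius grows from tens of meters for a magnitude 2 to more than half a kilometer for a magnitude 4, which is why a "sizeable fault as the host" is needed.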
Earthquakes
2020
October 1, 2020
https://www.sciencedaily.com/releases/2020/10/201001200242.htm
Research may curb economic losses to power plants after earthquakes
Sitting atop power transformers are wavy shaped bushing systems that play a critical role in supplying communities with electricity. However, these objects are also susceptible to breaking during earthquakes. Once damaged, bushings can cause widespread outages and burden the state with expensive repairs.
In a recent study, Texas A&M University researchers have shown that during high seismic activity, the structural integrity of bushing systems can be better maintained by reinforcing their bases with steel stiffeners. Also, by using probability-based loss assessment studies, they found that the economic burden due to damage to bushing systems from earthquakes is up to 10 times lower for steel-reinforced transformer bushing systems compared to other bushing configurations. "Transformer bushing systems are vital to electrical substation networks, and these components are especially vulnerable in high-seismic regions, like in California or parts of the northeast," said Dr. Maria Koliou, assistant professor in the Zachry Department of Civil and Environmental Engineering. "We have conducted a full risk and loss assessment of the impact of damaged bushings in terms of cost and time to recovery for electrical power networks." The details of the study are provided in the July issue of the journal An electrical bushing is a sleeve-like covering that surrounds a conductor carrying a high-voltage electrical current. Generally found in close proximity to transformers or circuit breakers, these systems ensure that electric currents do not leak out of metal wires. Thus, bushings are made of insulators, porcelain in particular, and are filled with mineral oil. Despite their ability to withstand strong electric fields, bushings are brittle and can crack easily in the event of high seismic activity. Consequently, any damage to them is an electrical hazard. More extensive structural damage to the bushing system can cause widespread power outages and high replacement costs. One possible way to mitigate damage, and thus repair costs, is by strengthening the bushing with steel plates. Just like a strong foundation can improve a building's stability, steel flexural stiffeners placed as close as possible to the bushing base have been shown to improve bushing stability during earthquakes. However, Koliou said a more comprehensive analysis of the impact of seismic vulnerability on bushing systems in terms of recovery costs has been lacking. To address this gap, Koliou and her graduate student, Andrew Brennan, conducted a probabilistic analysis to compare the economic losses incurred from the damage of bushings for different intensities of ground motions. They investigated bushings of different geometries representative of medium- and high-voltage scenarios. More importantly, some bushings had steel plate stiffeners and others did not in their original designs. Koliou and Brennan found that the economic losses for the earthquake intensities considered in the study were 33-55% lower when the bushings' bases were reinforced with steel plates. In fact, the expected annual losses for bushings without the steel stiffeners were at least 2.5-10 times larger when subjected to different ground motions. "Our results show that steel stiffeners are effective at preventing bushings from damage, but what 'effective' means for a structural engineer can have little meaning for someone who is not. We wanted to generalize our findings in more practical terms for stakeholders other than engineers," said Koliou. "And so, we quantified the benefit of using steel stiffeners in terms of a dollar value and the time it would take to recover for a variety of earthquake scenarios, which is more easily interpretable."
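A probability-based loss assessment of the kind described combines how often each level of shaking occurs, how likely a bushing is to be damaged at that level, and the cost of repair. The sketch below shows that bookkeeping with invented hazard rates, fragility parameters and costs; none of the numbers come from the Texas A&M study.

```python
# Minimal sketch of a probability-based loss assessment of the kind described:
# expected annual loss = sum over shaking levels of
#   (annual rate of that shaking) x (probability of bushing damage) x (repair cost).
# The hazard rates, fragility parameters and cost are invented placeholders,
# not values from the Texas A&M study.
import math

def lognormal_fragility(pga, median, beta):
    """P(damage | shaking) using a lognormal fragility curve, common in practice."""
    return 0.5 * (1.0 + math.erf(math.log(pga / median) / (beta * math.sqrt(2.0))))

# Assumed annual rates of exceeding each peak ground acceleration level (1/yr).
hazard = [(0.1, 0.05), (0.2, 0.02), (0.4, 0.005), (0.8, 0.001)]  # (PGA in g, rate)
repair_cost = 50_000  # assumed cost per damaged bushing, arbitrary currency units

def expected_annual_loss(median_capacity_g, beta=0.5):
    eal, previous_rate = 0.0, 0.0
    # Work from the strongest shaking down, so each bin's occurrence rate is the
    # difference between successive exceedance rates.
    for pga, rate in sorted(hazard, reverse=True):
        occurrence = rate - previous_rate
        eal += occurrence * lognormal_fragility(pga, median_capacity_g, beta) * repair_cost
        previous_rate = rate
    return eal

print("unreinforced bushing :", round(expected_annual_loss(median_capacity_g=0.3)))
print("steel-stiffened base :", round(expected_annual_loss(median_capacity_g=0.6)))
```

With these placeholder inputs, doubling the median shaking capacity cuts the expected annual loss by roughly a factor of four, the same kind of comparison the study reports in dollar terms.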
Earthquakes
2020
October 1, 2020
https://www.sciencedaily.com/releases/2020/10/201001113628.htm
Earthquake forecasting clues unearthed in strange precariously balanced rocks
Precariously balanced rocks (PBRs) are formations found throughout the world where a slender boulder is balanced precariously on a pedestal boulder. They form as blocks preserved on cliffs, or when softer rocks erode and leave the harder rocks behind. They can also form when landslides or retreating glaciers deposit them in strange positions.
Despite their delicate balancing act, many PBRs -- like the Brimham Rocks in Yorkshire, or Chiricahua National Monument in Arizona -- have survived earthquake shaking over thousands of years. They can therefore tell us the upper limit of earthquake shaking that has occurred since they were first formed -- shaking that, were it strong enough, would have caused them to topple.By tapping into ancient geological data locked within Californian PBRs, Imperial College London researchers have broken ground on a new technique to boost the precision of hazard estimates for large earthquakes by up to 49 per cent.Earthquake hazard models estimate the likelihood of future earthquakes in a given location. They help engineers decide where bridges, dams, and buildings should be built and how robust they should be -- as well as informing earthquake insurance prices in high-risk areas.The findings are published today in Lead author Anna Rood, from Imperial's Department of Civil and Environmental Engineering, said: "This new approach could help us work out which areas are most likely to experience a major earthquake. PBRs act like inverse seismometers by capturing regional seismic history that we weren't around to see, and tell us the upper limit of past earthquake shakes simply by not toppling. By tapping into this, we provide uniquely valuable data on the rates of rare, large-magnitude earthquakes."Current earthquake hazard estimates rely largely on observations like proximity to fault lines and how seismically active a region has been in the past. However, estimates for rarer earthquakes that have occurred over periods of 10,000 to 1,000,000 years are extremely uncertain due to the lack of seismic data spanning those timescales and subsequent reliance on rocky assumptions.By counting rare cosmic ray-generated atoms in PBRs and digitally modelling PBR-earthquake interactions, Imperial researchers have created a new method of earthquake hazard validation that could be built into existing models to finetune their precision.To tap into the seismology of the past, the researchers set out to determine the fragility (likelihood of toppling due to ground shaking) and age of PBRs at a site near to the Diablo Canyon Nuclear Power Plant in coastal California.They used a technique called cosmogenic surface exposure dating -- counting the number of rare beryllium atoms formed within rocks by long-term exposure to cosmic rays -- to determine how long PBRs had existed in their current formation.They then used 3D modelling software to digitally recreate the PBRs and calculate how much earthquake ground shaking they could withstand before toppling.Both the age and fragility of the PBRs were then compared with current hazard estimates to help boost their certainty.They found that combining their calculations with existing models reduced the uncertainty of earthquake hazard estimates at the site by 49 per cent, and, by removing the 'worst-case-scenario' estimates, reduced the average size of earthquakes estimated to happen once every 10,000 years by 27 per cent. 
They also found that PBRs can be preserved in the landscape for twice as long as previously thought.They conclude that this new method reduces the amount of assumptions, and therefore the uncertainty, used in estimating and extrapolating historic earthquake data for estimates of future risk.Study co-author Dr Dylan Rood, of Imperial's Department of Earth Science and Engineering, said: "We're teetering on the edge of a breakthrough in the science of earthquake forecasting. Our 'rock clock' techniques have the potential to save huge costs in seismic engineering, and we see them being used broadly to test and update site-specific hazard estimates for earthquake-prone areas -- specifically in coastal regions where the controlling seismic sources are offshore faults whose movements are inherently more difficult to investigate."The team are now using their techniques to validate hazard estimates for southern California -- one of the most hazardous and densely populated regions of the United States.Anna said: "We're now looking at PBRs near major earthquake faults like the San Andreas fault near Los Angeles. We're also looking at how to pinpoint which data -- whether it be fault slip rates or choice of ground shaking equations -- are skewing the results in the original hazard models. This way we can improve scientists' understanding of big earthquakes even more."
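The core "rock clock" logic described above can be summarised in a few lines: a rock that has stood for its cosmogenic exposure age makes hazard models implying frequent toppling-level shaking implausible. The sketch below evaluates that survival probability under a simple Poisson assumption; the age, fragility and candidate rates are placeholders, not the study's values.

```python
# Hedged sketch of the "rock clock" logic: a precariously balanced rock that has
# survived for its cosmogenic exposure age makes high estimates of the rate of
# toppling-level shaking implausible. Age, fragility and candidate rates below
# are placeholders, not values from the Imperial study.
import math

pbr_age_years = 20_000                 # assumed exposure age of the balanced rock
p_topple_given_strong_shaking = 0.8    # assumed fragility at the shaking level of interest

def survival_probability(annual_rate_of_strong_shaking):
    """P(rock still standing) under a simple Poisson model of damaging shaking."""
    effective_rate = annual_rate_of_strong_shaking * p_topple_given_strong_shaking
    return math.exp(-effective_rate * pbr_age_years)

for rate in (1 / 500, 1 / 10_000, 1 / 50_000):     # candidate hazard-model rates (1/yr)
    print(f"toppling-level shaking every ~{int(1 / rate):>6} yr "
          f"-> P(rock survives {pbr_age_years} yr) = {survival_probability(rate):.3f}")
# A hazard model that implies a near-zero survival probability is inconsistent
# with the rock still standing, so it can be down-weighted when refining estimates.
```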
Earthquakes
2020
September 25, 2020
https://www.sciencedaily.com/releases/2020/09/200925113640.htm
How earthquake swarms arise
Earthquakes can be abrupt bursts of home-crumbling, ground-buckling energy when slices of the planet's crust long held in place by friction suddenly slip and lurch.
"We typically think of the plates on either side of a fault moving, deforming, building up stresses and then: Boom, an earthquake happens," said Stanford University geophysicist Eric Dunham.But deeper down, these blocks of rock can slide steadily past one another, creeping along cracks in Earth's crust at about the rate that your fingernails grow.A boundary exists between the lower, creeping part of the fault, and the upper portion that may stand locked for centuries at a stretch. For decades, scientists have puzzled over what controls this boundary, its movements and its relationship with big earthquakes. Chief among the unknowns is how fluid and pressure migrate along faults, and how that causes faults to slip.A new physics-based fault simulator developed by Dunham and colleagues provides some answers. The model shows how fluids ascending by fits and starts gradually weaken the fault. In the decades leading up to big earthquakes, they seem to propel the boundary, or locking depth, a mile or two upward.The research, published Sept. 24 in Each of the earthquakes in a swarm has its own aftershock sequence, as opposed to one large mainshock followed by many aftershocks. "An earthquake swarm often involves migration of these events along a fault in some direction, horizontally or vertically," explained Dunham, senior author of the paper and an associate professor of geophysics at Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth).The simulator maps out how this migration works. Whereas much of the advanced earthquake modeling of the last 20 years has focused on the role of friction in unlocking faults, the new work accounts for interactions between fluid and pressure in the fault zone using a simplified, two-dimensional model of a fault that cuts vertically through Earth's entire crust, similar to the San Andreas Fault in California."Through computational modeling, we were able to tease out some of the root causes for fault behavior," said lead author Weiqiang Zhu, a graduate student in geophysics at Stanford. "We found the ebb and flow of pressure around a fault may play an even bigger role than friction in dictating its strength."Faults in Earth's crust are always saturated with fluids -- mostly water, but water in a state that blurs distinctions between liquid and gas. Some of these fluids originate in Earth's belly and migrate upwards; some come from above when rainfall seeps in or energy developers inject fluids as part of oil, gas or geothermal projects. "Increases in the pressure of that fluid can push out on the walls of the fault, and make it easier for the fault to slide," Dunham said. "Or, if the pressure decreases, that creates a suction that pulls the walls together and inhibits sliding."For decades, studies of rocks unearthed from fault zones have revealed telltale cracks, mineral-filled veins and other signs that pressure can fluctuate wildly during and between big quakes, leading geologists to theorize that water and other fluids play an important role in triggering earthquakes and influencing when the biggest temblors strike. "The rocks themselves are telling us this is an important process," Dunham said.More recently, scientists have documented that fluid injection related to energy operations can lead to earthquake swarms. Seismologists have linked oil and gas wastewater disposal wells, for example, to a dramatic increase in earthquakes in parts of Oklahoma starting around 2009. 
And they've found that earthquake swarms migrate along faults faster or slower in different environments, whether it's underneath a volcano, around a geothermal operation or within oil and gas reservoirs, possibly because of wide variation in fluid production rates, Dunham explained. But modeling had yet to untangle the web of physical mechanisms behind the observed patterns.Dunham and Zhu's work builds on a concept of faults as valves, which geologists first put forth in the 1990s. "The idea is that fluids ascend along faults intermittently, even if those fluids are being released or injected at a steady, constant rate," Dunham explained. In the decades to thousands of years between large earthquakes, mineral deposition and other chemical processes seal the fault zone.With the fault valve closed, fluid accumulates and pressure builds, weakening the fault and forcing it to slip. Sometimes this movement is too slight to generate ground shaking, but it's enough to fracture the rock and open the valve, allowing fluids to resume their ascent.The new modeling shows for the first time that as these pulses travel upward along the fault, they can create earthquake swarms. "The concept of a fault valve, and intermittent release of fluids, is an old idea," Dunham said. "But the occurrence of earthquake swarms in our simulations of fault valving was completely unexpected."The model makes quantitative predictions about how quickly a pulse of high-pressure fluids migrates along the fault, opens up pores, causes the fault to slip and triggers certain phenomena: changes in the locking depth, in some cases, and imperceptibly slow fault movements or clusters of small earthquakes in others. Those predictions can then be tested against the actual seismicity along a fault -- in other words, when and where small or slow-motion earthquakes end up occurring.For instance, one set of simulations, in which the fault was set to seal up and halt fluid migration within three or four months, predicted a little more than an inch of slip along the fault right around the locking depth over the course of a year, with the cycle repeating every few years. This particular simulation closely matches patterns of so-called slow-slip events observed in New Zealand and Japan -- a sign that the underlying processes and mathematical relationships built into the algorithm are on target. Meanwhile, simulations with sealing dragged out over years caused the locking depth to rise as pressure pulses climbed upward.Changes in the locking depth can be estimated from GPS measurements of the deformation of Earth's surface. Yet the technology is not an earthquake predictor, Dunham said. That would require more complete knowledge of the processes that influence fault slip, as well as information about the particular fault's geometry, stress, rock composition and fluid pressure, he explained, "at a level of detail that is simply impossible, given that most of the action is happening many miles underground."Rather, the model offers a way to understand processes: how changes in fluid pressure cause faults to slip; how sliding and slip of a fault breaks up the rock and makes it more permeable; and how that increased porosity allows fluids to flow more easily.In the future, this understanding could help to inform assessments of risk related to injecting fluids into the Earth. 
According to Dunham, "The lessons that we learn about how fluid flow couples with frictional sliding are applicable to naturally occurring earthquakes as well as induced earthquakes that are happening in oil and gas reservoirs."This research was supported by the National Science Foundation and the Southern California Earthquake Center.
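The fault-valve behaviour at the heart of these simulations can be caricatured in one dimension: fluid pressure diffuses up a fault, and wherever it exceeds a slip threshold the fault "opens" and becomes more permeable, letting the pressure pulse climb further. The toy sketch below does only that; it omits the friction and elastic-stress coupling in the actual Stanford simulator, and every number is a placeholder.

```python
# Crude one-dimensional caricature of the fault-valve idea: fluid overpressure
# diffuses up a fault, and wherever it exceeds a slip threshold the fault "opens"
# and its diffusivity jumps, letting the pulse climb further. A toy stand-in for
# the Stanford simulator (which couples friction, elastic stress and fluid flow);
# every number here is a placeholder.
import numpy as np

n_cells, dz = 200, 50.0              # 200 cells of 50 m: a 10 km fault segment
dt = 1.0e4                           # time step in seconds (keeps the scheme stable)
pressure = np.zeros(n_cells)         # overpressure (Pa) above hydrostatic
pressure[0] = 5.0e6                  # steady fluid source at the bottom of the fault
diffusivity = np.full(n_cells, 0.01) # m^2/s while the fault is sealed
slip_threshold = 2.0e6               # overpressure at which the "valve" opens
opened = np.zeros(n_cells, dtype=bool)

for _ in range(20_000):              # roughly six years of model time
    # explicit finite-difference step of dp/dt = d/dz (c * dp/dz)
    flux = diffusivity[:-1] * np.diff(pressure) / dz
    pressure[1:-1] += dt * np.diff(flux) / dz
    pressure[0] = 5.0e6              # keep the source pressurised
    newly_open = (pressure > slip_threshold) & ~opened
    diffusivity[newly_open] = 0.1    # "slip" opens the valve: tenfold diffusivity
    opened |= newly_open

front_m = opened.nonzero()[0].max() * dz if opened.any() else 0.0
print(f"valve-opening front has climbed ~{front_m / 1000:.1f} km up the fault")
```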
Earthquakes
2020
September 23, 2020
https://www.sciencedaily.com/releases/2020/09/200923124736.htm
Flood risks: More accurate data due to COVID-19
Emerging use of the Global Navigation Satellite System (GNSS) makes it possible to continuously measure shallow changes in the elevation of the Earth's surface. A study by the University of Bonn now shows that the quality of these measurements may have improved significantly during the pandemic, at least at some stations. The results show which factors should be considered in the future when installing GPS antennas. More precise geodetic data are important for assessing flood risks and for improving earthquake early warning systems. The journal
A number of countries went into a politically decreed hibernation at the onset of the Covid-19 pandemic. Many of those affected by the lockdown suffered negative economic and social consequences. Geodesy, the branch of Earth science that studies Earth's gravity field and shape, on the other hand, has benefited from the drastic reduction in human activity. At least, that is what the newly published study suggests. GNSS receivers can determine their positions to an accuracy of a few mm. They do this using the US GPS satellites and their Russian counterparts, GLONASS. For some years now, it has also been possible to measure the distance between the antenna and the ground surface using a new method. "This has recently allowed our research group to measure elevation changes in the uppermost soil layers, without installing additional equipment," explains Dr. Makan Karegar from the Institute of Geodesy and Geoinformation at the University of Bonn. Researchers, for instance, can measure the wave-like propagation of an earthquake and the rise or fall of a coastal area. The measuring method is based on the fact that the antenna does not only pick up the direct satellite signal. Part of the signal is reflected by the nearby environment and objects and reaches the GNSS antenna with some delay. This reflected part therefore travels a longer path to the antenna. When superimposed on the directly received signal, it forms certain patterns called interference. These patterns can be used to calculate the distance between the antenna and the ground surface, which can change over time. To calculate the risk of flooding in low-elevation coastal areas, it is important to know this change -- and thus the subsidence of the Earth's surface -- precisely. This method works well if the surrounding ground is flat, like the surface of a mirror. "But many GNSS receivers are mounted on buildings in cities or in industrial zones," explains Prof. Dr. Jürgen Kusche. "And they are often surrounded by large parking lots -- as is the case with the antenna we investigated in Boston." In their analysis, the researchers were able to show that parked cars significantly reduce the quality of the elevation data: Parked vehicles scatter the satellite signal and cause it to be reflected several times before it reaches the antenna, like a cracked mirror. This not only reduces the signal intensity, but also the information that can be extracted from it: It's "noisy." In addition, because the "pattern" of parked cars changes from day to day, these data cannot be easily corrected. "Before the pandemic, measurements of antenna height had an average accuracy of about four centimeters due to the higher level of noise," says Karegar. "During the lockdown, however, there were almost no vehicles parked in the vicinity of the antenna; this improved the accuracy to about two centimeters." A decisive leap: The more reliable the values, the smaller the elevation fluctuations that can be detected in the upper soil layers. In the past, GNSS stations were preferably installed in sparsely populated regions, but this has changed in recent years. "Precise GNSS sensors are often installed in urban areas to support positioning services for engineering and surveying applications, and eventually for scientific applications such as deformation studies and natural hazards assessment," says Karegar. "Our study recommends that we should try to avoid installation of GNSS sensors next to parking lots."
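The reflection method described here is commonly known as GNSS interferometric reflectometry: the receiver's signal-to-noise ratio oscillates with the sine of the satellite elevation angle, and the oscillation rate is set by the antenna's height above the reflecting ground. The synthetic sketch below recovers that height from noisy data; it is a simplified toy with invented values, not the Bonn group's processing chain.

```python
# Simplified synthetic example of the reflection method described above (GNSS
# interferometric reflectometry): the signal-to-noise ratio oscillates with the
# sine of the satellite elevation angle at a rate set by the antenna height above
# the reflecting ground, so a spectral peak recovers that height. A toy with
# invented noise, not the Bonn group's processing chain.
import numpy as np

wavelength = 0.19          # m, roughly the GPS L1 carrier wavelength
true_height = 2.3          # m, antenna height above the ground (to be recovered)

elevation = np.radians(np.linspace(5.0, 25.0, 400))   # a low-elevation satellite arc
x = np.sin(elevation)
# Direct and reflected signals interfere with phase 4*pi*h*sin(elevation)/lambda.
snr = np.cos(4.0 * np.pi * true_height * x / wavelength)
snr += 0.3 * np.random.default_rng(0).normal(size=x.size)   # measurement noise

# Scan trial heights and keep the one whose interference sinusoid best matches
# the data (a simple periodogram in the sin(elevation) domain).
trial_heights = np.linspace(0.5, 5.0, 901)
power = [abs(np.sum(snr * np.exp(-1j * 4.0 * np.pi * h * x / wavelength)))
         for h in trial_heights]
estimate = trial_heights[int(np.argmax(power))]
print(f"estimated antenna height: {estimate:.2f} m (true value {true_height} m)")
```

In this toy, clutter such as parked cars would correspond to adding extra, changing reflections to the synthetic SNR, which broadens and weakens the spectral peak and degrades the height estimate.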
Earthquakes
2020
September 22, 2020
https://www.sciencedaily.com/releases/2020/09/200922144312.htm
Seismic data explains continental collision beneath Tibet
In addition to being the last horizon for adventurers and spiritual seekers, the Himalaya region is a prime location for understanding geological processes. It hosts world-class mineral deposits of copper, lead, zinc, gold and silver, as well as rarer elements like lithium, antimony and chrome, that are essential to modern technology. The uplift of the Tibetan plateau even affects global climate by influencing atmospheric circulation and the development of seasonal monsoons.
Yet despite its importance, scientists still don't fully understand the geological processes contributing to the region's formation. "The physical and political inaccessibility of Tibet has limited scientific study, so most field experiments have either been too localized to understand the big picture or they've lacked sufficient resolution at depths to properly understand the processes," said Simon Klemperer, a geophysics professor at Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth).Now, new seismic data gathered by Klemperer and his colleagues provides the first west-to-east view of the subsurface where India and Asia collide. The research contributes to an ongoing debate over the structure of the Himalaya collision zone, the only place on Earth where continental plates continue crashing today -- and the source of catastrophes like the 2015 Gorkha earthquake that killed about 9,000 people and injured thousands more.The new seismic images suggest that two competing processes are simultaneously operating beneath the collision zone: movement of one tectonic plate under another, as well as thinning and collapse of the crust. The research, conducted by scientists at Stanford University and the Chinese Academy of Geological Sciences, was published in The study marks the first time that scientists have collected truly credible images of what's called an along-strike, or longitudinal, variation in the Himalaya collision zone, co-author Klemperer said.As the Indian plate collides with Asia it forms Tibet, the highest and largest mountain plateau on the planet. This process started very recently in geological history, about 57 million years ago. Researchers have proposed various explanations for its formation, such as a thickening of the Earth's crust caused by the Indian plate forcing its way beneath the Tibetan Plateau.To test these hypotheses, researchers began the major logistical effort of installing new seismic recorders in 2011 in order to resolve details that might have been previously overlooked. Importantly, the new recorders were installed from east to west across Tibet; traditionally, they had only been deployed from north to south because that is the direction the country's valleys are oriented and thus the direction that roads have historically been built.The final images, pieced together from recordings by 159 new seismometers closely spaced along two 620-mile long profiles, reveal where the Indian crust has deep tears associated with the curvature of the Himalayan arc."We're seeing at a much finer scale what we never saw before," Klemperer said. "It took a heroic effort to install closely spaced seismometers across the mountains, instead of along the valleys, to collect data in the west-east direction and make this research possible."As the Indian tectonic plate moves from the south, the mantle, the thickest and strongest part of the plate, is dipping beneath the Tibetan plateau. The new analyses reveal that this process is causing small parts of the Indian plate to break off beneath two of the surface rifts, likely creating tears in the plate -- similar to how a truck barreling through a narrow gap between two trees might chip off pieces of tree trunk. 
The location of such tears can be critical for understanding how far a major earthquake like Gorkha will spread."These transitions, these jumps between the faults, are so important and they're at a scale that we don't normally notice until after an earthquake has happened," Klemperer said.An unusual aspect of Tibet involves the occurrence of very deep earthquakes, more than 40 miles below the surface. Using their seismic data, the researchers found associations between the plate tears and the occurrence of those deep quakes.The research also explains why the strength of gravity varies in different parts of the collision zone. The co-authors hypothesized that after the small pieces dropped off of the Indian plate, softer material from underneath bubbled up, creating mass imbalances in the India-Tibet collision zone.The India-Tibet region also provides insight into how parts of the eastern U.S. could have been formed through continental collisions about a billion years ago."The only way to understand what might have happened in eastern North America today is to come to Tibet," Klemperer said. "For geologists, this is the one big continental collision that is taking place on Earth today -- it's this natural laboratory where we can study these processes."
Earthquakes
2020
September 22, 2020
https://www.sciencedaily.com/releases/2020/09/200922144309.htm
Can ripples on the sun help predict solar flares?
Solar flares are violent explosions on the sun that fling out high-energy charged particles, sometimes toward Earth, where they disrupt communications and endanger satellites and astronauts.
But as scientists discovered in 1996, flares can also create seismic activity -- sunquakes -- releasing impulsive acoustic waves that penetrate deep into the sun's interior.While the relationship between solar flares and sunquakes is still a mystery, new findings suggest that these "acoustic transients" -- and the surface ripples they generate -- can tell us a lot about flares and may someday help us forecast their size and severity.A team of physicists from the United States, Colombia and Australia has found that part of the acoustic energy released from a flare in 2011 emanated from about 1,000 kilometers beneath the solar surface -- the photosphere -- and, thus, far beneath the solar flare that triggered the quake.The results, published Sept. 21 in The Helioseismic holography allows scientists to analyze acoustic waves triggered by flares to probe their sources, much as seismic waves from megaquakes on Earth allow seismologists to locate their epicenters. The technique was first applied to acoustic transients released from flares by a graduate student in Romania, Alina-Catalina Donea, under the supervision of Lindsey and Braun. Donea is now at Monash University in Melbourne, Australia."It's the first helioseismic diagnostic specifically designed to directly discriminate the depths of the sources it reconstructs, as well as their horizontal locations," Braun said."We can't see the sun's inside directly. It is opaque to the photons that show us the sun's outer atmosphere, from where they can escape to reach our telescopes," said co-author Juan Camilo Buitrago-Casas, a University of California, Berkeley, doctoral student in physics from Colombia. "The way we can know what happens inside of the sun is via seismic waves that make ripples on the solar surface similar to those caused by earthquakes on our planet. A big explosion, such as a flare, can inject a powerful acoustic pulse into the sun, whose subsequent signature we can use to map its source in some detail. The big message of this paper is that the source of at least some of this noise is deeply submerged. We are reporting the deepest source of acoustic waves so far known in the sun."The acoustic explosions that cause sunquakes in some flares radiate acoustic waves in all directions, primarily downward. As the downward-traveling waves move through regions of ever-increasing temperature, their paths are bent by refraction, ultimately heading back up to the surface, where they create ripples like those seen after throwing a pebble in a pond. The time between the explosion and the arrival of the ripples is about 20 minutes."The ripples, then, are not just a surface phenomenon, but the surface signature of waves that have gone deep beneath the active region and then back up to the outlying surface in the succeeding hour," Lindsey said. Analyzing the surface ripples can pinpoint the source of the explosion."It has been widely supposed that the waves released by acoustically active flares are injected into the solar interior from above. What we are finding is the strong indication that some of the source is far beneath the photosphere," said Juan Carlos Martínez Oliveros, a solar physics researcher at UC Berkeley's Space Sciences Laboratory and a native of Colombia. "It seems like the flares are the precursor, or trigger, of the acoustic transient released. 
There is something else happening inside the sun that is generating at least some part of the seismic waves.""Using an analogy from medicine, what we (solar physicists) were doing before is like using X-rays to look at one snapshot of the interior of the sun. Now, we are trying to do a CAT scan, to view the solar interior in three dimensions," added Martínez Oliveros.The Colombians, including students Ángel Martínez and Valeria Quintero Ortega at Universidad Nacional de Colombia, in Bogotá, are co-authors of the ApJ Letters paper with their supervisor, Benjamín Calvo-Mozo, associate professor of astronomy."We have known about acoustic waves from flares for a little over 20 years now, and we have been imaging their sources horizontally since that time. But we have only recently discovered that some of those sources are submerged below the solar surface," said Lindsey. "This may help explain a great mystery: Some of these acoustic waves have emanated from locations that are devoid of local surface disturbances that we can directly see in electromagnetic radiation. We have wondered for a long time how this can happen."For more than 50 years, astronomers have known that the sun reverberates with seismic waves, much like the Earth and its steady hum of seismic activity. This activity, which can be detected by the Doppler shift of light emanating from the surface, is understood to be driven by convective storms that form a patchwork of granules about the size of Texas, covering the sun's surface and continually rumbling.Amid this background noise, magnetic regions can set off violent explosions releasing waves that make the spectacular ripples that then appear on the sun's surface in the succeeding hour, as discovered 24 years ago by astronomers Valentina Zharkova and Alexander Kosovichev.As more sunquakes have been discovered, flare seismology has blossomed, as have the techniques to explore their mechanics and their possible relationship to the architecture of magnetic flux underlying active regions.Among the open questions: Which flares do and don't produce sunquakes? Can sunquakes occur without a flare? Why do sunquakes emanate primarily from the edges of sunspots, or penumbrae? Do the weakest flares produce quakes? What is the lower limit?Until now, most solar flares have been studied as one-offs, since strong flares, even during times of maximum solar activity, may occur only a few times a year. The initial focus was on the largest, or X-class, flares, classified by the intensity of the soft X-rays they emit. Buitrago-Casas, who obtained his bachelor's and master's degrees from Universidad Nacional de Colombia, teamed up with Lindsey and Martínez Oliveros to conduct a systematic survey of relatively weak solar flares to increase their database, for a better understanding of the mechanics of sunquakes.Of the 75 flares captured between 2010 and 2015 by the RHESSI satellite -- a NASA X-ray satellite designed, built and operated by the Space Sciences Laboratory and retired in 2018 -- 18 produced sunquakes. One of Buitrago-Casas's acoustic transients, the one released by the flare of July 30, 2011, caught the eyes of undergraduate students Martínez, now a graduate student, and Quintero Ortega."We gave our student collaborators at the National University the list of flares from our survey. They were the first ones who said, 'Look at this one. It's different! What happened here?'" Buitrago-Casas said. "And so, we found out. 
It was super exciting!"Martínez and Quintero Ortega are the first authors on a paper describing the extreme impulsivity of the waves released by that flare of July 30, 2011, that appeared in the May 20, 2020, issue of The Astrophysical Journal Letters. These waves had spectral components that gave the researchers unprecedented spatial resolution of their source distributions.Thanks to superb data from NASA's Solar Dynamics Observatory satellite, the team was able to pinpoint the source of the explosion that generated the seismic waves 1,000 kilometers below the photosphere. This is shallow, relative to the sun's radius of nearly 700,000 kilometers, but deeper than any previously known acoustic source in the sun.A source submerged below the sun's photosphere with its own morphology and no conspicuous directly overlying disturbance in the outer atmosphere suggests that the mechanism that drives the acoustic transient is itself submerged."It may work by triggering a compact explosion with its own energy source, like a remotely triggered earthquake," Lindsey said. "The flare above shakes something beneath the surface, and then a very compact unit of submerged energy gets released as acoustic sound," he said. "There is no doubt that the flare is involved, it's just that the existence of this deep compact source suggests the possibility of a separate, distinctive, compact, submerged energy source driving the emission."About half of the medium-sized solar flares that Buitrago-Casas and Martínez Oliveros have catalogued have been associated with sunquakes, showing that they commonly occur together. The team has since found other submerged sources associated with even weaker flares.The discovery of submerged acoustic sources opens the question of whether there are instances of acoustic transients being released spontaneously, with no surface disturbance, or no flare, at all."If sunquakes can be generated spontaneously in the sun, this might lead us to a forecasting tool, if the transient can come from magnetic flux that has yet to break the sun's surface," Martínez Oliveros said. "We could then anticipate the inevitable subsequent emergence of that magnetic flux. We may even forecast some details about how large an active region is about to appear and what type -- even, possibly, what kinds of flares -- it might produce. This is a long shot, but well worth looking into."
Earthquakes
2020
September 18, 2020
https://www.sciencedaily.com/releases/2020/09/200918185050.htm
Undersea earthquakes shake up climate science
Despite climate change being most obvious to people as unseasonably warm winter days or melting glaciers, as much as 95 percent of the extra heat trapped on Earth by greenhouse gases is held in the world's oceans. For that reason, monitoring the temperature of ocean waters has been a priority for climate scientists, and now Caltech researchers have discovered that seismic rumblings on the seafloor can provide them with another tool for doing that.
In a new paper publishing in They do this by listening for the sounds from the many earthquakes that regularly occur under the ocean, says Jörn Callies, assistant professor of environmental science and engineering at Caltech and study co-author. Callies says these earthquake sounds are powerful and travel long distances through the ocean without significantly weakening, which makes them easy to monitor. Wenbo Wu, postdoctoral scholar in geophysics and lead author of the paper, explains that when an earthquake happens under the ocean, most of its energy travels through the earth, but a portion of that energy is transmitted into the water as sound. These sound waves propagate outward from the quake's epicenter just like seismic waves that travel through the ground, but the sound waves move at a much slower speed. As a result, ground waves will arrive at a seismic monitoring station first, followed by the sound waves, which will appear as a secondary signal of the same event. The effect is roughly similar to how you can often see the flash from lightning seconds before you hear its thunder. "These sound waves in the ocean can be clearly recorded by seismometers at a much longer distance than thunder -- from thousands of kilometers away," Wu says. "Interestingly, they are even 'louder' than the vibrations traveling deep in the solid Earth, which are more widely used by seismologists." The speed of sound in water increases as the water's temperature rises, so, the team realized, the length of time it takes a sound to travel a given distance in the ocean can be used to deduce the water's temperature. "The key is that we use repeating earthquakes -- earthquakes that happen again and again in the same place," he says. "In this example we're looking at earthquakes that occur off Sumatra in Indonesia, and we measure when they arrive in the central Indian Ocean. It takes about a half hour for them to travel that distance, with water temperature causing about a one-tenth-of-a-second difference. It's a very small fractional change, but we can measure it." Wu adds that because they are using a seismometer that has been in the same location in the central Indian Ocean since 2004, they can look back at the data it collected each time an earthquake occurred in Sumatra, for example, and thus determine the temperature of the ocean at that same time. "We are using small earthquakes that are too small to cause any damage or even be felt by humans at all," Wu says. "But the seismometer can detect them from great distances, thus allowing us to monitor large-scale ocean temperature changes on a particular path in one measurement." Callies says the data they have analyzed confirm that the Indian Ocean has been warming, as other data collected through other methods have indicated, but that it might be warming even faster than previously estimated. "The ocean plays a key role in the rate that the climate is changing," he says. "The ocean is the main reservoir of energy in the climate system, and the deep ocean in particular is important to monitor. One advantage of our method is that the sound waves sample depths below 2,000 meters, where there are very few conventional measurements." Depending on which set of previous data they compare their results to, ocean warming appears to be as much as 69 percent greater than had been believed. 
However, Callies cautions against drawing any immediate conclusions, as more data need to be collected and analyzed.Because undersea earthquakes happen all over the world, Callies says it should be possible to expand the system he and his fellow researchers developed so that it can monitor water temperatures in all of the oceans. Wu adds that because the technique makes use of existing infrastructure and equipment, it is relatively low-cost."We think we can do this in a lot of other regions," Callies says. "And by doing this, we hope to contribute to the data about how our oceans are warming."
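The numbers quoted above -- a roughly half-hour travel time shifting by about a tenth of a second -- translate into a remarkably small average temperature change along the path, which is exactly why precise timing matters. The sketch below works through that arithmetic with a typical textbook sensitivity of seawater sound speed to temperature; it is an order-of-magnitude illustration, not a calculation from the Caltech paper.

```python
# Worked sketch of the numbers quoted above: a ~0.1 s change in a ~30-minute
# acoustic travel time implies only a tiny average warming along the path.
# The sound-speed sensitivity (~4 m/s per deg C) is a typical textbook value
# for seawater, not a number from the Caltech study.
travel_time_s = 30.0 * 60.0      # roughly half an hour, Sumatra to the central Indian Ocean
time_shift_s = 0.1               # observed-order change in arrival time
sound_speed_m_per_s = 1500.0     # typical speed of sound in seawater
speed_per_deg_c = 4.0            # assumed m/s of sound-speed increase per deg C

# Warmer water means faster sound and a shorter travel time: dt/t = -dc/c.
speed_change = sound_speed_m_per_s * time_shift_s / travel_time_s
temperature_change = speed_change / speed_per_deg_c
print(f"implied change in sound speed: {speed_change:.3f} m/s")
print(f"implied average warming along the path: {temperature_change:.4f} deg C")
```

Under these assumptions a tenth-of-a-second shift corresponds to only a few hundredths of a degree of average warming along the whole path, illustrating the sensitivity the method needs to achieve.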
Earthquakes
2,020
September 17, 2020
https://www.sciencedaily.com/releases/2020/09/200917105413.htm
Formation of the Alps: Detaching and uplifting, not bulldozing
For a long time, geoscientists have assumed that the Alps were formed when the Adriatic plate from the south collided with the Eurasian plate in the north. According to the textbooks, the Adriatic plate behaved like a bulldozer, thrusting rock material up in front of it into piles that formed the mountains. Supposedly, their weight subsequently pushed the underlying continental plate downwards, resulting in the formation of a sedimentary basin in the north adjacent to the mountains -- the Swiss Molasse Plateau. Over time, while the mountains grew higher the basin floor sank deeper and deeper with the rest of the plate.
A few years ago, however, new geophysical and geological data led ETH geophysicist Edi Kissling and Fritz Schlunegger, a sediment specialist from the University of Bern, to express doubts about this theory. In light of the new information, the researchers postulated an alternative mechanism for the formation of the Alps. Kissling and Schlunegger pointed out that the topography and altitude of the Alps have barely changed over the past 30 million years, and yet the trench at the site of the Swiss Plateau has continued to sink and the basin extended further north. This led the researchers to believe that the formation of the Central Alps and the sinking of the trench are not connected as previously assumed. They argue that if the Alps and the trench indeed had formed from the impact of two plates pressing together, there would be clear indications that the Alps were steadily growing. That's because, based on the earlier understanding of how the Alps formed, the collision of the plates, the formation of the trench and the height of the mountain range are all linked. Furthermore, seismicity observed during the past 40 years within the Swiss Alps and their northern foreland clearly documents extension across the mountain ranges rather than the compression expected for the bulldozing Adria model. The behaviour of the Eurasian plate provides a possible new explanation. Starting about 60 Ma ago, the oceanic part of the Eurasian plate began sinking beneath the continental Adriatic microplate in the south. By about 30 Ma ago, this process of subduction was so far advanced that all the oceanic lithosphere had been consumed and the continental part of the Eurasian plate entered the subduction zone. This marks the beginning of the so-called continent-continent collision with the Adriatic microplate, during which the upper, lighter European crust separates from the heavier, underlying lithospheric mantle. Because it weighs less, the Earth's crust surges upwards, literally creating the Alps for the first time around 30 Ma ago. While this is happening, the lithospheric mantle sinks further into the Earth's mantle, thus pulling the adjacent part of the plate downwards. This theory is plausible because the Alps are mainly made up of gneiss and granite and their sedimentary cover rocks like limestone. These crustal rocks are significantly lighter than the Earth's mantle -- into which the lower layer of the plate, the lithospheric mantle, plunges after the detachment of the two layers that form the continental plate. "In turn, this creates strong upward forces that lift the Alps out of the ground," Kissling explains. "It was these upward forces that caused the Alps to form, not the bulldozer effect as a result of two continental plates colliding," he says. To investigate the lift hypothesis, Luca Dal Zilio, a former doctoral student in ETH geophysics professor Taras Gerya's group, has now teamed up with Kissling and other ETH researchers to develop a new model. Dal Zilio simulated the subduction zone under the Alps: the plate tectonic processes, which took place over millions of years, and the associated earthquakes. "The big challenge with this model was bridging the time scales. 
It takes into account lightning-fast shifts that manifest themselves in the form of earthquakes, as well as deformations of the crust and lithospheric mantle over thousands of years," says Dal Zilio, lead author of the study recently published in the journal According to Kissling, the model is an excellent way to simulate the uplifting processes that he and his colleague are postulating. "Our model is dynamic, which gives it a huge advantage," he says, explaining that previous models took a rather rigid or mechanical approach that did not take into account changes in plate behaviour. "All of our previous observations agree with this model," he says.The model is based on physical laws. For instance, the Eurasian plate would appear to subduct southwards. In contrast to the normal model of subduction, however, it doesn't actually move in this direction because the position of the continent remains stable. This forces the subducting lithosphere to retreat northwards, causing the Eurasian plate to exert a suction effect on the relatively small Adriatic plate.Kissling likens the action to a sinking ship. The resulting suction effect is very strong, he explains. Strong enough to draw in the smaller Adriatic microplate so that it collides with the crust of the Eurasian plate. "So, the mechanism that sets the plates in motion is not in fact a pushing effect but a pulling one," he says, concluding that the driving force behind it is simply the pull of gravity on the subducting plate.In addition, the model simulates the occurrence of earthquakes, or seismicity, in the Central Alps, the Swiss Plateau and below the Po Valley. "Our model is the first earthquake simulator for the Swiss Central Alps," says Dal Zilio. The advantage of this earthquake simulator is that it covers a very long period of time, meaning that it can also simulate very strong earthquakes that occur extremely rarely."Current seismic models are based on statistics," Dal Zilio says, "whereas our model uses geophysical laws and therefore also takes into account earthquakes that occur only once every few hundreds of years." Current earthquake statistics tend to underestimate such earthquakes. The new simulations therefore improve the assessment of earthquake risk in Switzerland.
Earthquakes
2020
September 11, 2020
https://www.sciencedaily.com/releases/2020/09/200911200008.htm
Ancient earthquake may have caused destruction of Canaanite palace at Tel Kabri
A team of Israeli and American researchers funded by grants from the National Geographic Society and the Israel Science Foundation has uncovered new evidence that an earthquake may have caused the destruction and abandonment of a flourishing Canaanite palatial site about 3,700 years ago.
The group made the discovery at the 75-acre site of Tel Kabri in Israel, which contains the ruins of a Canaanite palace and city that dates back to approximately 1900-1700 B.C. The excavations, located on land belonging to Kibbutz Kabri in the western Galilee region, are co-directed by Assaf Yasur-Landau, a professor of Mediterranean archaeology at the University of Haifa, and Eric Cline, a professor of classics and anthropology at the George Washington University."We wondered for several years what had caused the sudden destruction and abandonment of the palace and the site, after centuries of flourishing occupation," Yasur-Landau said. "A few seasons ago, we began to uncover a trench which runs through part of the palace, but initial indications suggested that it was modern, perhaps dug within the past few decades or a century or two at most. But then, in 2019, we opened up a new area and found that the trench continued for at least 30 meters, with an entire section of a wall that had fallen into it in antiquity, and with other walls and floors tipping into it on either side."According to Michael Lazar, the lead author of the study, recognizing past earthquakes can be extremely challenging in the archaeological record, especially at sites where there isn't much stone masonry and where degradable construction materials like sun-dried mud bricks and wattle-and-daub were used instead. At Tel Kabri, however, the team found both stone foundations for the bottom part of the walls and mud-brick superstructures above."Our studies show the importance of combining macro- and micro-archaeological methods for the identification of ancient earthquakes," he said. "We also needed to evaluate alternative scenarios, including climatic, environmental and economic collapse, as well as warfare, before we were confident in proposing a seismic event scenario."The researchers could see areas where the plaster floors appeared warped, walls had tilted or been displaced, and mud bricks from the walls and ceilings had collapsed into the rooms, in some cases rapidly burying dozens of large jars."It really looks like the earth simply opened up and everything on either side of it fell in," Cline said. "It's unlikely that the destruction was caused by violent human activity because there are no visible signs of fire, no weapons such as arrows that would indicate a battle, nor any unburied bodies related to combat. We could also see some unexpected things in other rooms of the palace, including in and around the wine cellar that we excavated a few years ago."In 2013, the team discovered 40 jars within a single storage room of the palace during an expedition also supported by a National Geographic Society grant. An organic residue analysis conducted on the jars indicated that they held wine; it was described at the time as the oldest and largest wine cellar yet discovered in the Near East. Since then, the team has found four more such storage rooms and at least 70 more jars, all buried by the collapse of the building."The floor deposits imply a rapid collapse rather than a slow accumulation of degraded mud bricks from standing walls or ceilings of an abandoned structure," Ruth Shahack-Gross, a professor of geoarchaeology at the University of Haifa and a co-author on the study, said. 
"The rapid collapse, and the quick burial, combined with the geological setting of Tel Kabri, raises the possibility that one or more earthquakes could have destroyed the walls and the roof of the palace without setting it on fire."The investigators are hopeful that their methodological approach can be applied at other archaeological sites, where it can serve to test or strengthen cases of possible earthquake damage and destruction.
Earthquakes
2020
September 10, 2020
https://www.sciencedaily.com/releases/2020/09/200910120112.htm
Understanding Earth's 'deep-carbon cycle'
New geologic findings about the makeup of the Earth's mantle are helping scientists better understand long-term climate stability and even how seismic waves move through the planet's layers.
The research by a team including Case Western Reserve University scientists focused on the "deep carbon cycle," part of the overall cycle by which carbon moves through the Earth's various systems. In simplest terms, the deep carbon cycle involves two steps: carbon is carried down into the mantle, chiefly by the subduction of carbon-bearing oceanic plates, and is eventually returned to the surface, mainly through volcanism and outgassing. Scientists have long suspected that partially melted chunks of this carbon are broadly distributed throughout the Earth's solid mantle. What they haven't fully understood is how far down into the mantle they might be found, or how the geologically slow movement of the material contributes to the carbon cycle at the surface, which is necessary for life itself. "Cycling of carbon between the surface and deep interior is critical to maintaining Earth's climate in the habitable zone over the long term -- meaning hundreds of millions of years," said James Van Orman, a professor of geochemistry and mineral physics in the College of Arts and Sciences at Case Western Reserve and an author on the recently published study. "Right now, we have a good understanding of the surface reservoirs of carbon, but know much less about carbon storage in the deep interior, which is also critical to its cycling." Van Orman said this new research showed -- based on experimental measurements of the acoustic properties of carbonate melts, and comparison of these results to seismological data -- that a small fraction (less than one-tenth of 1%) of carbonate melt is likely to be present throughout the mantle at depths of about 180-330 km. "Based on this inference, we can now estimate the carbon concentration in the deep upper mantle and infer that this reservoir holds a large mass of carbon, more than 10,000 times the mass of carbon in Earth's atmosphere," Van Orman said. That's important, Van Orman said, because gradual changes in the amount of carbon stored in this large reservoir, due to exchange with the atmosphere, could have a corresponding effect on atmospheric CO2 and, in turn, on long-term climate. The first author of the article is Man Xu, who did much of the work as a PhD student at Case Western Reserve and is now a postdoctoral scholar at the University of Chicago. Others on the project were from Florida State University, the University of Chicago and Southern University of Science and Technology (SUSTech) in Shenzhen, China. The research also sheds light on seismology, especially deep earth research. One way geologists better understand the deep interior is by measuring how seismic waves generated by earthquakes -- fast-moving compressional waves and slower shear waves -- move through the Earth's layers. Scientists have long wondered why the speed difference between the two types of seismic waves -- P-waves and S-waves -- peaks at depths of around 180 to 330 kilometers into the Earth. Carbon-rich melts seem to answer that question: small quantities of these melts could be dispersed throughout the deep upper mantle and would explain the speed change, as the waves move differently through the melts.
Earthquakes
2020
September 4, 2020
https://www.sciencedaily.com/releases/2020/09/200904163326.htm
Superheated rocks deep underground help explain earthquake patterns
Rock-melting forces occurring much deeper in the Earth than previously understood appear to drive tremors along a notorious segment of California's San Andreas Fault, according to new USC research that helps explain how quakes happen.
The study from the emergent field of earthquake physics looks at temblor mechanics from the bottom up, rather than from the top down, with a focus on underground rocks, friction and fluids. On the segment of the San Andreas Fault near Parkfield, Calif., underground excitations -- beyond the depths where quakes are typically monitored -- lead to instability that ruptures in a quake."Most of California seismicity originates from the first 10 miles of the crust, but some tremors on the San Andreas Fault take place much deeper," said Sylvain Barbot, assistant professor of Earth sciences at the USC Dornsife College of Letters, Arts and Sciences. "Why and how this happens is largely unknown. We show that a deep section of the San Andreas Fault breaks frequently and melts the host rocks, generating these anomalous seismic waves." The newly published study appears in The findings are significant because they help advance the long-term goal of understanding how and where earthquakes are likely to occur, along with the forces that trigger temblors. Better scientific understanding helps inform building codes, public policy and emergency preparedness in quake-ridden areas like California. The findings may also be important in engineering applications where the temperature of rocks is changed rapidly, such as by hydraulic fracturing.Parkfield was chosen because it is one of the most intensively monitored epicenters in the world. The San Andreas Fault slices past the town, and it's regularly ruptured with significant quakes. Quakes of magnitude 6 have shaken the Parkfield section of the fault at fairly regular intervals in 1857, 1881, 1901, 1922, 1934, 1966 and 2004, according to the U.S. Geological Survey. At greater depths, smaller temblors occur every few months. So what's happening deep in the Earth to explain the rapid quake recurrence?Using mathematical models and laboratory experiments with rocks, the scientists conducted simulations based on evidence gathered from the section of the San Andreas Fault extending up to 36 miles north of -- and 16 miles beneath -- Parkfield. They simulated the dynamics of fault activity in the deep Earth spanning 300 years to study a wide range of rupture sizes and behaviors.The researchers observed that, after a big quake ends, the tectonic plates that meet at the fault boundary settle into a go-along, get-along phase. For a spell, they glide past each other, a slow slip that causes little disturbance to the surface.But this harmony belies trouble brewing. Gradually, motion across chunks of granite and quartz, the Earth's bedrock, generates heat due to friction. As the heat intensifies, the blocks of rock begin to change. When friction pushes temperatures above 650 degrees Fahrenheit, the rock blocks grow less solid and more fluid-like. They start to slide more, generating more friction, more heat and more fluids until they slip past each other rapidly -- triggering an earthquake."Just like rubbing our hands together in cold weather to heat them up, faults heat up when they slide. The fault movements can be caused by large changes in temperature," Barbot said. "This can create a positive feedback that makes them slide even faster, eventually generating an earthquake."It's a different way of looking at the San Andreas Fault. Scientists typically focus on movement in the top of Earth's crust, anticipating that its motion in turn rejiggers the rocks deep below. 
For this study, the scientists looked at the problem from the bottom up."It's difficult to make predictions," Barbot added, "so instead of predicting just earthquakes, we're trying to explain all of the different types of motion seen in the ground."The study was supported by grants from the National Natural Science Foundation of China (NSFC-41674067 and NSFC-U1839211) and the U.S. National Science Foundation (EAR-1848192).
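To make the feedback loop described above more concrete, the following toy calculation is a minimal sketch, not the authors' published model: it integrates frictional heating on a slowly slipping fault patch and lets friction weaken once the rock passes roughly 650 degrees Fahrenheit (about 343 C), the threshold mentioned in the study. Every parameter value is an assumed round number chosen only so the runaway appears within the simulated window.

```python
# Toy illustration of the shear-heating feedback described above: sliding
# generates frictional heat, and once the shear zone passes a weakening
# temperature (~650 F / ~343 C in the article), friction drops, slip
# accelerates, and heating runs away. All values below are assumptions.

mu_strong, mu_weak = 0.6, 0.2      # friction coefficient below/above weakening T (assumed)
sigma_n = 100e6                    # normal stress on the fault, Pa (assumed)
rho, c = 2700.0, 1000.0            # rock density (kg/m^3) and heat capacity (J/kg/K)
w = 0.01                           # shear-zone thickness, m (assumed)
T_weaken = 343.0                   # weakening temperature, deg C (~650 F, from the article)
T = 100.0                          # starting temperature of the patch, deg C (assumed)
v = 1e-8                           # initial creep rate, m/s (assumed)

dt = 3600.0                        # time step: one hour
for step in range(24 * 365):       # integrate up to one year of slow slip
    mu = mu_strong if T < T_weaken else mu_weak
    q = mu * sigma_n * v           # frictional heating rate per unit area, W/m^2
    T += q * dt / (rho * c * w)    # temperature rise of the shear zone (no heat loss)
    if T >= T_weaken:
        v *= 1.5                   # weakened fault slips faster -> positive feedback
    if v > 1e-3:                   # slip rate comparable to seismic slip: "earthquake"
        print(f"runaway slip after ~{step / 24:.0f} days, T = {T:.0f} C")
        break
```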
Earthquakes
2020
August 27, 2020
https://www.sciencedaily.com/releases/2020/08/200827141252.htm
The Le Teil earthquake provides new insights on seismic risk in France and Western Europe
On 11 November 2019, a magnitude 5 earthquake occurred near the village of Le Teil in the Rhône River Valley in southern France producing an unexpected surface rupture with ground displacement.
For the first time in France, the CNRS, IRSN, IRD, Université de Montpellier, Université Côte d'Azur and Terradue (1) had the opportunity to use all modern seismological, geodetical (2), and geological techniques available to study this historically unprecedented seismic event. The data, published on 27 August 2020 in Communications Earth & Environment, reveals that the earthquake was caused by the reactivation of the ancient La Rouvière fault. The fault formed during an extensional tectonic period some 20-30 million years ago during the Oligocene epoch, and was no longer considered to be active.During the Le Teil earthquake, the fault experienced a reverse faulting movement (compression) with an average surface displacement of about 10cm both vertically and horizontally. Scientists estimate that the event nucleated at a shallow focal depth of approximately 1km, which explains why the rupture along the fault was able to reach the surface and cause considerable damage despite the moderate-magnitude (3) (the accurate position of the earthquake's focus is presently being studied by another research team).The results raise the possibility that other faults could be reactivated in France and Western Europe and produce surface displacements, whereas the risk of earthquakes with surface rupture was until now considered as highly improbable. To better assess the probability of such events, several teams of scientists in France are performing palaeoseismological investigations looking for evidence of past earthquakes along such faults.(1) Members of Géosciences Montpellier (CNRS/Université de Montpellier/Université des Antilles), Géoazur (CNRS/Observatoire de la Côte d'Azur/IRD/Université Côte d'Azur), Isterre (CNRS/IRD/Université Grenoble Alpes/Université Savoie Mont Blanc/Université Gustave Eiffel) laboratories participated in this study, along with IRSN (France) and the company Terradue (Italy).(2) Geodesy is the study, usually with the aid of satellite observations, of the shape and deformations of the surface of the Earth.(3) Only 10% of earthquakes of this magnitude cause surface rupture.
Earthquakes
2020
August 25, 2020
https://www.sciencedaily.com/releases/2020/08/200825110608.htm
Small quake clusters can't hide from AI
Researchers at Rice University's Brown School of Engineering are using data gathered before a deadly 2017 landslide in Greenland to show how deep learning may someday help predict seismic events like earthquakes and volcanic eruptions.
Seismic data collected before the massive landslide at a Greenland fjord shows the subtle signals of the impending event were there, but no human analyst could possibly have put the clues together in time to make a prediction. The resulting tsunami devastated the village of Nuugaatsiaq, killing four people, injuring nine and washing 11 buildings into the sea. A study led by former Rice visiting scholar Léonard Seydoux, now an assistant professor at the University of Grenoble-Alpes, employs techniques developed by Rice engineers and co-authors Maarten de Hoop and Richard Baraniuk; their findings appear in an open-access report. De Hoop, who specializes in mathematical analysis of inverse problems and deep learning in connection with Rice's Department of Earth, Environmental and Planetary Sciences, said advances in artificial intelligence (AI) are well-suited to independently monitor large and growing amounts of seismic data. AI has the ability to identify clusters of events and detect background noise to make connections that human experts might not recognize due to biases in their models, not to mention sheer volume, he said. Hours before the Nuugaatsiaq event, those small signals began to appear in data collected by a nearby seismic station. The researchers analyzed data from midnight on June 17, 2017, until one minute before the slide at 11:39 p.m. that released up to 51 million cubic meters of material. The Rice algorithm revealed weak but repetitive rumblings -- undetectable in raw seismic records -- that began about nine hours before the event and accelerated over time, leading to the landslide. "There was a precursor paper to this one by our co-author, Piero Poli at Grenoble, that studied the event without AI," de Hoop said. "They discovered something in the data they thought we should look at, and because the area is isolated from a lot of other noise and tectonic activity, it was the purest data we could work with to try our ideas." De Hoop is continuing to test the algorithm to analyze volcanic activity in Costa Rica and is also involved with NASA's InSight lander, which delivered a seismic detector to the surface of Mars nearly two years ago. Constant monitoring that delivers such warnings in real time will save lives, de Hoop said. "People ask me if this study is significant -- and yes, it is a major step forward -- and then if we can predict earthquakes. We're not quite ready to do that, but this direction is, I think, one of the most promising at the moment." When de Hoop joined Rice five years ago, he brought expertise in solving inverse problems that involve working backwards from data to find a cause. Baraniuk is a leading expert in machine learning and compressive sensing, which help extract useful data from sparse samples. Together, they're a formidable team. "The most exciting thing about this work is not the current result, but the fact that the approach represents a new research direction for machine learning as applied to geophysics," Baraniuk said. "I come from the mathematics of deep learning and Rich comes from signal processing, which are at opposite ends of the discipline," de Hoop said. "But here we meet in the middle. And now we have a tremendous opportunity for Rice to build upon its expertise as a hub for seismologists to gather and put these pieces together. There's just so much data now that it's becoming impossible to handle any other way." De Hoop is helping to grow Rice's reputation for seismic expertise with the Simons Foundation Math+X Symposia, which have already featured events on space exploration and mitigating natural hazards like volcanoes and earthquakes. A third event, dates to be announced, will study deep learning applications for solar giants and exoplanets.
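The general idea of letting an algorithm find weak, repetitive precursory signals can be illustrated with a short sketch. The following is not the authors' published method; it is a minimal example of unsupervised clustering of seismic windows, with the file name, sampling rate and number of clusters all chosen as placeholders.

```python
# Minimal sketch: scan a continuous seismic trace with short windows, describe
# each window by its spectral content, cluster the windows without labels, and
# track how often each cluster occurs through time. A cluster whose occurrence
# rate climbs steadily toward the end of the record would behave like the weak,
# repetitive precursory rumblings described in the article.

import numpy as np
from scipy.signal import welch
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

fs = 100.0                                        # sampling rate, Hz (assumed)
trace = np.load("station_NUUG_2017-06-17.npy")    # hypothetical 1-D seismic trace

win = int(60 * fs)                                # 60-second analysis windows
windows = [trace[i:i + win] for i in range(0, len(trace) - win, win)]

# Describe each window by its power spectral density on a log scale.
features = []
for w in windows:
    f, pxx = welch(w, fs=fs, nperseg=1024)
    features.append(np.log10(pxx + 1e-20))
features = StandardScaler().fit_transform(np.array(features))

# Group windows into a few signal classes (background noise, local events, ...).
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(features)

# Count how often each class occurs per hour of the record.
hours_per_window = win / fs / 3600.0
for k in range(5):
    hour_index = (np.flatnonzero(labels == k) * hours_per_window).astype(int)
    print(k, np.bincount(hour_index))
```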
Earthquakes
2020
August 18, 2020
https://www.sciencedaily.com/releases/2020/08/200818160949.htm
Machine learning unearths signature of slow-slip quake origins in seismic data
Combing through historical seismic data, researchers using a machine learning model have unearthed distinct statistical features marking the formative stage of slow-slip ruptures in the earth's crust months before tremor or GPS data detected a slip in the tectonic plates. Given the similarity between slow-slip events and classic earthquakes, these distinct signatures may help geophysicists understand the timing of the devastating faster quakes as well.
"The machine learning model found that, close to the end of the slow slip cycle, a snapshot of the data is imprinted with fundamental information regarding the upcoming failure of the system," said Claudia Hulbert, a computational geophysicist at ENS and the Los Alamos National Laboratory and lead author of the study, published today in Slow-slip events are earthquakes that gently rattle the ground for days, months, or even years, do not radiate large-amplitude seismic waves, and often go unnoticed by the average person. The classic quakes most people are familiar with rupture the ground in minutes. In a given area they also happen less frequently, making the bigger quakes harder to study with the data-hungry machine learning techniques.The team looked at continuous seismic waves covering the period 2009 to 2018 from the Pacific Northwest Seismic Network, which tracks earth movements in the Cascadia region. In this subduction zone, during a slow slip event, the North American plate lurches southwesterly over the Juan de Fuca plate approximately every 14 months. The data set lent itself well to the supervised-machine learning approach developed in laboratory earthquake experiments by the Los Alamos team collaborators and used for this study.The team computed a number of statistical features linked to signal energy in low-amplitude signals, frequency bands their previous work identified as the most informative about the behavior of the geologic system. The most important feature for predicting slow slip in the Cascadia data is seismic power, which corresponds to seismic energy, in particular frequency bands associated to slow slip events. According to the paper, slow slip often begins with an exponential acceleration on the fault, a force so small it eludes detection by seismic sensors."For most events, we can see the signatures of impending rupture from weeks to months before the rupture," Hulbert said. "They are similar enough from one event cycle to the next so that a model trained on past data can recognize the signatures in data from several years later. But it's still an open question whether this holds over long periods of time."The research team's hypothesis about the signal indicating the formation of a slow-slip event aligns with other recent work by Los Alamos and others detecting small-amplitude foreshocks in California. That work found that foreshocks can be observed in average two weeks before most earthquakes of magnitude greater than 4.Hulbert and her collaborators' supervised machine learning algorithms train on the seismic features calculated from the first half of the seismic data and attempts to find the best model that maps these features to the time remaining before the next slow slip event. Then they apply it to the second half of data, which it hasn't seen.The algorithms are transparent, meaning the team can see which features the machine learning uses to predict when the fault would slip. It also allows the researchers to compare these features with those that were most important in laboratory experiments to estimate failure times. These algorithms can be probed to identify which statistical features of the data are important in the model predictions, and why."By identifying the important statistical features, we can compare the findings to those from laboratory experiments, which gives us a window into the underlying physics," Hulbert said. 
"Given the similarities between the statistical features in the data from Cascadia and from laboratory experiments, there appear to be commonalities across the frictional physics underlying slow slip rupture and nucleation. The same causes may scale from the small laboratory system to the vast scale of the Cascadia subduction zone."The Los Alamos seismology team, led by Paul Johnson, has published several papers in the past few years pioneering the use of machine learning to unpack the physics underlying earthquakes in laboratory experiments and real-world seismic data.
Earthquakes
2020
August 11, 2020
https://www.sciencedaily.com/releases/2020/08/200811153918.htm
Rare 'boomerang' earthquake observed along Atlantic Ocean fault line
Scientists have tracked a 'boomerang' earthquake in the ocean for the first time, providing clues about how they could cause devastation on land.
Earthquakes occur when rocks suddenly break on a fault -- a boundary between two blocks or plates. During large earthquakes, the breaking of rock can spread down the fault line. Now, an international team of researchers have recorded a 'boomerang' earthquake, where the rupture initially spreads away from the initial break but then turns and runs back the other way at higher speeds. The strength and duration of rupture along a fault influence the amount of ground shaking at the surface, which can damage buildings or create tsunamis. Ultimately, knowing the mechanisms of how faults rupture and the physics involved will help researchers make better models and predictions of future earthquakes, and could inform earthquake early-warning systems. The team, led by scientists from the University of Southampton and Imperial College London, report their results in a newly published study. While large (magnitude 7 or higher) earthquakes occur on land and have been measured by nearby networks of monitors (seismometers), these earthquakes often trigger movement along complex networks of faults, like a series of dominoes. This makes it difficult to track the underlying mechanisms of how this 'seismic slip' occurs. Under the ocean, many types of fault have simple shapes, so they provide the possibility to get under the bonnet of the 'earthquake engine'. However, they are far from large networks of seismometers on land. The team made use of a new network of underwater seismometers to monitor the Romanche fracture zone, a fault line stretching 900km under the Atlantic near the equator. In 2016, they recorded a magnitude 7.1 earthquake along the Romanche fracture zone and tracked the rupture along the fault. This revealed that initially the rupture travelled in one direction before turning around midway through the earthquake and breaking the 'seismic sound barrier', becoming an ultra-fast earthquake. Only a handful of such earthquakes have been recorded globally. The team believe that the first phase of the rupture was crucial in causing the second, rapidly slipping phase. First author of the study Dr Stephen Hicks, from the Department of Earth Sciences and Engineering at Imperial, said: "Whilst scientists have found that such a reversing rupture mechanism is possible from theoretical models, our new study provides some of the clearest evidence for this enigmatic mechanism occurring in a real fault." "Even though the fault structure seems simple, the way the earthquake grew was not, and this was completely opposite to how we expected the earthquake to look before we started to analyse the data." However, the team say that if similar types of reversing or boomerang earthquakes can occur on land, a seismic rupture turning around mid-way through an earthquake could dramatically affect the amount of ground shaking caused. Given the lack of observational evidence before now, this mechanism has been unaccounted for in earthquake scenario modelling and assessments of the hazards from such earthquakes. The detailed tracking of the boomerang earthquake could allow researchers to find similar patterns in other earthquakes and to add new scenarios into their modelling and improve earthquake impact forecasts. The ocean bottom seismometer network used was part of the PI-LAB and EUROLAB projects, a million-dollar experiment funded by the Natural Environment Research Council in the UK, the European Research Council, and the National Science Foundation in the US.
Earthquakes
2020
August 6, 2020
https://www.sciencedaily.com/releases/2020/08/200806122818.htm
What's in oilfield wastewater matters for injection-induced earthquakes
A team of geoscience researchers in the Virginia Tech College of Science has developed a new theory to explain how and why injection-induced earthquakes continue to occur even when injection rates decline.
Experts have known since the 1960s that when oilfield wastewater is pumped into the ground with deep injection wells, earthquakes can occur. Over the past decade, injection-induced earthquakes have become regular occurrences throughout oil and gas basins worldwide, particularly in the central United States, and potentially in China and Canada, as well.Oil and gas production are often accompanied by highly brackish groundwater, also known as oilfield brine. These fluids can be five to six times saltier than seawater, so they are toxic to terrestrial ecosystems and have little beneficial use. As a result, oilfield brine is considered to be a waste product that is disposed of by pumping it back into deep geologic formations.When fluids are pumped into deep injection wells, they alter the naturally occurring fluid pressure in deep geologic formations. These fluid pressure changes can destabilize faults, leading to earthquakes, such as the damaging magnitude-5.8 event in Pawnee, Oklahoma, in September 2016.Among the more vexing scientific questions about injection-induced earthquakes is why they seem to be getting deeper in such places as Oklahoma and Kansas, where injection rates have been declining due to a combination of earthquake mitigation measures and declining oil and gas production.In a study published Aug. 5 in "We know that earthquakes are getting deeper in Oklahoma," said Pollyea, who directs the Computational Geofluids Lab at Virginia Tech, "so we're trying to figure out what conditions make this possible. Our research suggests that it's caused by combination of the geology, natural fluids in the basement rocks, and the wastewater itself."Although researchers have known for decades that deep fluid injections can trigger earthquakes, Pollyea said previous research misses some consequential details about how they occur. Specifically, he pointed out that oilfield brine has much different properties, like density and viscosity, than pure water, and these differences affect the processes that cause fluid pressure to trigger earthquakes."The basic idea is that oilfield brine has a lot of dissolved solid material, which makes the wastewater heavier than naturally occurring fluids in deep geologic formations," said Richard S. Jayne, a co-author of the study and former Ph.D. student at Virginia Tech who is now a research hydrogeologist at Sandia National Laboratory, "so the dense wastewater sinks, increases fluid pressure, and causes deeper earthquakes than would be predicted if the fluids have the same material properties."Using supercomputers at Virginia Tech's Advanced Research Computing division, Pollyea and his team tested their idea by producing more than 100 models of oilfield wastewater disposal using various combinations of geologic properties, wastewater temperature, and wastewater density. With this computational approach, the team isolated both the conditions and physical processes that alter fluid pressure in the geologic formations."We found that there are really two different processes that drive fluid pressure deep into the basement, where earthquakes occur," saids Pollyea. "The first is called pressure diffusion, which occurs when wastewater is forced into geologic formations that are already full of water. 
This process has been known for a long time, but the second process occurs when high-density wastewater sinks and pushes lower density fluids out of the way."According to this new theory, the density difference between wastewater and deep basement fluids is much more important for induced earthquake occurrence than was previously known. "This is one of the areas that has been neglected in induced-seismicity research," said Megan Brown, an assistant professor of geology who specializes in fluid triggered seismicity at Northern Illinois University and was not involved in this study. "Density-driven pressure transients are an intuitive consequence of a density differential between injected fluids and formation fluids."Although earthquake occurrence has been decreasing in the central U.S. since the peak years of 2014 and 2015, this new theory not only explains why earthquakes are getting deeper in Oklahoma, but it also explains why several magnitude-5+ earthquakes struck Oklahoma in 2016, when injection rates were decreasing state wide."One fascinating aspect of our study is that sinking wastewater plumes do not require pumping to migrate deeper underground," said Pollyea, "in fact, they'll continue sinking under their own weight for decades after injections cease, and our study shows that the wastewater doesn't have to be much heavier for this to occur."In terms of earthquake mitigation and regulatory practices, this study has far-reaching implications: The research team pointed out that high-density brines occur throughout many oil and gas basins in the U.S. But they also argued that using this study in practice requires much more information about the fluids. "This study emphasizes the need for site-specific data and increased sampling," said Brown, because "density differences as a driving factor of near-field pressure transients may also lead to pre-injection mitigation actions."Pollyea said that his research team is continuing to work on their new theory for the hydrogeologic processes that cause induced earthquakes. "We're really interested to know how our ideas about fluid chemistry affect regionally expansive injection operations in places like Oklahoma and Texas," said Pollyea. "And one of our recent M.S. graduates, Graydon Konzen (a study co-author), has done some exciting new work in this area."
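The physical intuition behind the density-driven mechanism can be illustrated with a back-of-the-envelope calculation (not the study's numerical simulations): a column of dense brine exerts more pressure at its base than a column of native formation water of the same height, so a sinking brine plume carries a pressure perturbation downward with it. The densities and column height below are assumed round numbers.

```python
# Excess pressure at the base of a brine column relative to native formation fluid.

g = 9.81             # gravitational acceleration, m/s^2
rho_brine = 1150.0   # oilfield brine density, kg/m^3 (assumed; much saltier than seawater)
rho_native = 1020.0  # native formation fluid density, kg/m^3 (assumed)
h = 1000.0           # height of the fluid column considered, m (assumed)

excess_pressure = (rho_brine - rho_native) * g * h   # Pa
print(f"excess pressure at the column base: {excess_pressure / 1e6:.2f} MPa")
# Roughly 1.3 MPa for these assumed values, a pressure change large enough to be
# mechanically significant on critically stressed faults.
```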
Earthquakes
2020
August 5, 2020
https://www.sciencedaily.com/releases/2020/08/200805124018.htm
Optical seismometer survives 'hellish' summit of Caribbean volcano
The heights of La Soufrière de Guadeloupe volcano can be hellish, sweltering at more than 48 degrees Celsius (120 degrees Fahrenheit) and swathed in billows of acidic gas. Researchers would like to monitor gas and steam eruptions at its summit, to learn more about the volcano's explosive potential, but conventional seismometers are destroyed quickly in the hostile environment.
An instrument called an optical seismometer appears to be up to the challenge, however. In the journal Seismological Research Letters, the researchers describe how the motion of the optical seismometer (and therefore of the ground) is estimated using an interference phenomenon, which occurs when an infrared laser beam is reflected by the mirrored surface of the seismometer's mobile mass. This laser beam is carried between the seismometer at the summit and a remote and safe optoelectronic station through a long fiber optic cable climbing the volcano's slope. The station calculates the ground displacement and sends the records in real time to the French Volcanological and Seismological Observatory of Guadeloupe. The seismometer operates purely mechanically, and requires no electronics or power supply that would be vulnerable to the summit conditions, said Romain Feron, the paper's lead author from the ESEO Group and the LAUM laboratory at the Université du Mans. The instrument is encased in Teflon to protect it from the sulfuric gases released by the fumarole. "It is, to our knowledge, the first high-resolution optical seismometer ever installed on an active volcano or other hazardous zone," Feron and colleagues write in SRL. The success of the seismometer, after ten years of development, suggests that it could be a good seismic solution in other challenging environments, they noted, including oil and gas production fields, nuclear power plants and high-temperature geothermal reservoirs. Now in operation on the volcano for nine months, the instrument is collecting data that will be combined with other observations from the Guadeloupe observatory to better monitor La Soufrière. The volcano's last significant eruption of gas and steam, in 1976, caused evacuations in Basse Terre, Guadeloupe's capital city. Since 2018, the volcano's dome and summit fumaroles have become increasingly active. Seismic monitoring at volcanoes can help researchers understand the movement and pressurization of underground fluids. The new optical seismometer could provide better locations for microseismic events under the dome, and offers a more detailed glimpse of "the fumarole signature, which helps to constrain the geometry and activity of the plumbing system of the dome," Feron said. The instrument has recorded seismic waves from a regional earthquake, an earthquake in Chile, and small seismic events within the volcano less than 2.5 kilometers (1.6 miles) below the summit, the researchers reported. Feron and colleagues made an arduous climb to La Soufrière's 1,467-meter (4,813-foot) summit in September 2019 to install the seismometer, using gas masks to protect themselves from the toxic gases spewing from active fumaroles. In addition to the gases and high temperatures, the team needed to keep a close eye on the weather during the installation, Feron said. "It could be beautiful at the bottom of the volcano, but hellish at the top at the same time," he recalled. "It becomes very risky to climb the steep and slippery slopes of the volcano with heavy equipment on the back, not to mention lightning."
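For a sense of how an interference measurement becomes a ground-motion record, the sketch below converts an interference phase record to displacement. It assumes a reflective (round-trip) optical layout, an unwrapped phase already available in a file, and a generic infrared wavelength; none of these details are taken from the instrument's documented signal chain.

```python
# Illustrative phase-to-displacement conversion: in a reflective interferometer
# the optical path changes by twice the mirror displacement, so an unwrapped
# interference phase phi maps to displacement d = phi * lambda / (4 * pi).

import numpy as np

wavelength = 1550e-9                              # infrared laser wavelength, m (assumed)
phase = np.unwrap(np.load("fringe_phase.npy"))    # hypothetical interference phase, rad

displacement = phase * wavelength / (4 * np.pi)   # displacement of the mobile mass, m
print("peak-to-peak displacement (nm):",
      (displacement.max() - displacement.min()) * 1e9)
```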
Earthquakes
2020
July 29, 2020
https://www.sciencedaily.com/releases/2020/07/200729114733.htm
Is the Earth's transition zone deforming like the upper mantle?
In a recently published paper, researchers examine whether the Earth's mantle transition zone deforms in the same way as the upper mantle, a question that bears on how material is exchanged between the upper and lower mantle.
Despite being composed of solid rocks, the Earth's mantle, which extends to a depth of ~2890 km below the crust, undergoes convective flow by removing heat from the Earth's interior. This process involves mass transfer by subduction of cold tectonic plates from and the ascent of hot plumes towards the Earth's surface, responsible for many large-scale geological features, such as Earthquakes and volcanism. Through a combination of previous seismological and mineral physics studies, it is well known that the Earth's mantle is divided (mineralogically) into two major regimes: the upper and the lower mantle, separated by the "transition zone," a boundary layer between ~410 and ~660 km depth. This transition zone influences the extent of whole mantle convection by controlling mass transfer between the upper and lower mantle. Seismic tomography studies (CT scan imaging of the Earth's interior using seismic waves) have previously revealed that while some slabs penetrate through the transition zone, others seem to stagnate either within or just below. The reason is unclear and the dynamics of the Earth's mantle across the transition zone remains poorly constrained due to the lack of understanding of its mechanical properties.These mechanical properties depend on the ability of minerals to undergo slow plastic deformation in response to a low mechanical stress, called "creep," typically described by a parameter known as "viscosity." The dynamics of the upper mantle relies on plastic deformation of its main constituent, Mg2SiO4 olivine. The first ~300 km of the upper mantle is characterized by a strong directional dependence of the velocity of seismic waves, known as "seismic anisotropy." Therefore, it is generally believed that "dislocation creep" -- a deformation mechanism inducing lattice rotation and crystallographic preferred orientations (CPO) in elastically anisotropic minerals as olivine -- contributes to the overall deformation of the upper mantle. Dislocation creep is an intracrystalline deformation mechanism responsible for the transport of crystal shear, mediated by linear defects called "dislocations." It is a composite deformation mechanism that may involve both glide of dislocations along some specific crystal directions and planes and diffusion-mediated climb out of their glide planes. Indeed, recent numerical simulations of Boioli et al. (2015) have shown that deformation of Mg2SiO4 olivine crystals is accommodated by the Weertman type of dislocation creep under relevant upper mantle conditions, where climb of dislocations enables the recovery of dislocation junctions, allowing plastic strain to be efficiently produced by dislocation glide.Entering the mantle transition zone beyond ~410 km depth with increasing pressure (P) and temperature (T), olivine transforms first into its high-P polymorph wadsleyite and at ~520 km into ringwoodite. It remains unclear if deformation processes of these more compact structures of the high-P polymorphs of olivine are similar to those of olivine (Ritterbex et al. 2015; Ritterbex et al. 2016). 
To address this question, researchers from the plasticity group at the University of Lille and the Geodynamics Research Center of Ehime University combined numerical simulations of thermally activated dislocation glide mobilities together with results from experimental diffusion data, and demonstrate that, in contrast to olivine at upper mantle conditions, dislocation climb velocities are exceeding those of glide in the high-P polymorphs of olivine, inducing a transition of deformation mechanism in the dislocation creep regime from Weertman creep to pure climb creep at geologic relevant stresses (Image 1). Based on plasticity modeling and constrained by diffusion data from experiments, the current investigation quantifies steady-state deformation of the main transition zone minerals wadsleyite, ringwoodite and majorite garnet as a function of grain size (Image 2).These modelings are able to explain a number of key features associated with the mantle transition zone. It is shown that intracrystalline plasticity of wadsleyite, ringwoodite and majorite garnet by pure climb creep at geological stresses leads to an equiviscous transition zone of 10^(21±1) Pa.s if the grain size is ~0.1 mm or larger (Image 3), matching well the available inverted surface geophysical data which are typically used to constrain rheological properties of the Earth's mantle. Since pure climb creep does not induce lattice rotation and cannot produce CPO, deformation of the transition zone by this mechanism is compatible with its relative seismic isotropy compared to the upper mantle. The researchers also found that CPO is able to develop along with stress concentrations by the activation of Weertman creep (Image 3), for example in corner flows around cold subducting slabs, something that could induce an increase in subduction resistance, explaining why some slabs stall at the base of the transition zone. On the other hand, viscosity reductions are predicted if grains are smaller than ~0.1 mm when the transition zone silicates are deforming by pure atomic diffusion, commonly referred to as "diffusion creep," which might potentially influence flow dynamics in the interior of cold subducting slabs or across phase transitions (Image 3).Future incorporation of these deformation mechanisms as a function of grain size in geodynamic convection models should enhance our understanding of the interaction between the upper and lower mantle and is expected to be helpful in constraining the geochemical evolution of the Earth.
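For reference, the grain-size and stress dependence discussed above is often summarized with a steady-state creep law of the following generic form (a standard mineral-physics expression, not the specific calibration used in this study), where strain rate depends on stress, grain size, pressure and temperature, and an effective viscosity follows from the ratio of stress to strain rate.

```latex
\dot{\varepsilon} \;=\; A\,\frac{\sigma^{\,n}}{d^{\,m}}
  \exp\!\left(-\frac{E^{*} + P\,V^{*}}{R\,T}\right),
\qquad
\eta \;\sim\; \frac{\sigma}{\dot{\varepsilon}}
% Dislocation creep (Weertman or pure climb): n > 1, m ~ 0, so viscosity is
% grain-size independent. Diffusion creep: n = 1, m = 2-3, so viscosity falls
% rapidly once grains shrink below roughly 0.1 mm, consistent with the
% grain-size threshold discussed above.
```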
Earthquakes
2020
July 27, 2020
https://www.sciencedaily.com/releases/2020/07/200727114716.htm
'Inchworm' pattern of Indonesian earthquake rupture powered seismic 'boom'
Earthquakes are often imagined as originating from a single point where the seismic waves are strongest, the hypocenter underground or the epicenter at the Earth's surface, with seismic energy radiating outward in a circular pattern. But this simplified model fails to account for the complex geometry of the actual fault systems where earthquakes occur. The real situation may be much more complex -- and more interesting. In some remarkable cases, a phenomenon called "supershear" rupture can occur, where the earthquake rupture propagates along the fault at a speed faster than the seismic waves themselves can travel -- a process analogous to a sonic boom.
In a newly published study, researchers reconstructed how the rupture of the 2018 Palu, Indonesia, earthquake propagated along the Palu-Koro fault. Study co-author Professor Yuji Yagi explains, "We used globally observed teleseismic wave data and performed finite-fault inversion to simultaneously resolve the spatiotemporal evolution of slip and the complex fault geometry." The results of this analysis showed that the propagation of supershear rupture of the Palu-Koro fault southward from the earthquake's epicenter was sustained by a pattern of repeated delay and advancement of slip along the fault, associated with the fault system's complex geometry. Areas with particularly high slip rates, referred to as "slipping patches," were identified near the epicenter as well as 60, 100, and 135 km south of the epicenter. In addition, three distinct episodes of rupture after the process initiated were distinguished, with delays in the advancement of the slipping patches between them. Tracing the surface rupture of the earthquake showed two major bends in the earthquake fault, 10-25 km south of the epicenter and 100-110 km south of the epicenter. Supershear rupture persisted along this geometrically complex fault. As lead author Professor Ryo Okuwaki describes, "Our study shows that the geometric complexity of a fault can significantly influence the velocity of rupture propagation. Our model of the 2018 Palu earthquake shows a zigzag pattern of slip deceleration and acceleration associated with bends in the fault, which we have named inchworm-like slip evolution. We propose that the geometric complexity of a fault system can promote persistent supershear rupture, enhanced by repeated inchworm-like slip evolution." These findings may have significant implications regarding assessment of future earthquake impacts and related disasters. For example, the authors suggest that the slipping patch they detected beneath Palu Bay may have contributed to generation of the 2018 Palu tsunami, which added to the devastation of the earthquake.
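The "sonic boom" analogy above can be made quantitative with a small check: a rupture is supershear when its propagation speed exceeds the shear-wave speed of the surrounding rock, and the resulting Mach front has a half-angle set by their ratio. The numbers below are generic crustal values chosen for illustration, not measurements from the Palu study.

```python
# Supershear check and Mach half-angle: sin(theta) = v_s / v_rupture.

import math

v_s = 3500.0        # shear-wave speed in crustal rock, m/s (assumed)
v_rupture = 4500.0  # assumed supershear rupture speed, m/s

if v_rupture > v_s:
    theta = math.degrees(math.asin(v_s / v_rupture))
    print(f"supershear rupture, Mach half-angle ~{theta:.0f} degrees")
else:
    print("sub-shear rupture (no Mach front)")
```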
Earthquakes
2020
July 22, 2020
https://www.sciencedaily.com/releases/2020/07/200722112718.htm
What factors influence the likelihood of fracking-related seismicity in Oklahoma?
The depth of a hydraulic fracturing well in Oklahoma, among other factors, increases the probability that fracking will lead to earthquake activity, according to a new report in the Bulletin of the Seismological Society of America (BSSA).
The researchers hope their findings, published as part of an upcoming BSSA special issue on observations, mechanisms and hazards of induced seismicity, will help oil and gas operators and regulators in the state refine drilling strategies to avoid damaging earthquakes.During hydraulic fracturing, well operators inject a pressurized liquid into a rock layer after drilling vertically and often horizontally through the rock. The liquid breaks apart -- fractures -- the rock layer and allows natural gas or petroleum to flow more freely. A growing number of studies suggest that this process can induce seismic activity large enough for people to feel, possibly by increasing fluid pressures within the rock that relieve stress on faults and allow them to slip.In one rock layer examined in the BSSA study, the likelihood that hydraulic fracturing triggered seismic activity increased from 5 to 50 percent as well operations moved from 1.5 to 5.5 kilometers (0.9 to 3.4 miles) deep, the researchers found.Although the exact mechanisms linking well depth and seismic probability are still being examined, Michael Brudzinski and colleagues suggest that the overpressure of fluids trapped inside the rock may be important."The deeper the rock layers are, the more rock that is sitting on top of a well, and that is going to potentially increase the fluid pressures at depth," said Brudzinski, the study's corresponding author from Miami University in Ohio.Oklahoma has been at the center of a dramatic increase in earthquake activity over the past decade, mostly caused by oil and gas companies injecting wastewater produced by drilling back into deeper rock layers. However, a 2018 study identified places in the state where significant amounts of seismic activity were linked to nearly 300 hydraulic fracture wells.Hydraulic fracturing is associated with a magnitude 4.6 earthquake in Canada and a magnitude 5.7 earthquake in China, although fracking-induced earthquakes tend to be smaller in magnitude than those caused by wastewater disposal. As a result, oil and gas operators and regulators would like to know more about why some wells trigger seismic activity, and how to adjust their operations to prevent damaging earthquakes.Brudzinski and colleagues found the link between depth and seismic probability in their examination of data from 929 horizontal and 463 vertical hydraulic fracturing wells in Oklahoma. The scientists used publicly available data on injected volume at well sites, the number of wells on a drilling pad, what kind of fluid was injected, and the vertical depth of the well, among other features.The total volume of injected liquid at the Oklahoma wells did not affect the probability of seismic activity near the wells -- a surprising finding that differs from other studies of induced seismicity. Some previous hydraulic fracturing (and wastewater disposal) studies show an increase in seismic activity with increasing volume.Most of the wells in the current study are single wells, however, and not multiple wells clustered on a drilling pad, Brudzinski noted. In some places in western Canada and Texas, where there is a link between the injected volume and seismicity, multiple wells on a pad are more common."So that's where we started to think that perhaps that's the difference between what we're seeing in our study versus other studies," Brudzinski said. 
"We're proposing that multiple wells injecting next to each other may be why volume does matter in those cases, although we need to study it more.""It could be that volume does still matter, but more so in a cumulative way than for any given well," he added. "An isolated well with a large volume may not have nearly as much of a [seismic] risk as a large volume well that is in close proximity to other large volume wells."The researchers also compared the probability of seismic activity in wells where the injected liquid was a gel versus "slickwater" -- water with chemicals added to increase flow. They found a lower level of seismicity in gel operations compared to slickwater, although the difference wasn't as statistically significant as the other trends.Simulation studies suggest that the more viscous gel may not flow as far as the slickwater, limiting its effects on faults, Brudzinski said.
Earthquakes
2020
July 20, 2020
https://www.sciencedaily.com/releases/2020/07/200720112228.htm
Geophysics: A first for a unique instrument
Geophysicists at Ludwig-Maximilians Universitaet (LMU) in Munich have measured Earth's spin and axis orientation with a novel ring laser, and provided the most precise determination of these parameters yet achieved by a ground-based instrument without the need for stellar range finding.
Buried amid the pastures and cropland near the town of Fürstenfeldbruck to the west of Munich is a scientific instrument that is 'one of a kind'. It's a ring laser named ROMY, which is essentially a rotation sensor. On its completion three years ago, the research journal Science hailed ROMY as "the most sophisticated instrument of its type in the world." The acronym refers to one of its uses -- detecting rotational motions in seismology. But in addition to quantifying ground rotation caused by earthquakes, ROMY can sense minute alterations in the Earth's rotational velocity as well as changes in its axis of orientation. These fluctuations are caused not only by seismic events but also by ocean currents and shifts in the distribution of ice masses, among other factors. Now a group of geophysicists led by Professors Heiner Igel (LMU) and Ulrich Schreiber (Technical University of Munich) report the results of the first continuous high-precision measurements of the Earth's rotational parameters in a newly published study. With the aid of a grant from the European Research Council (ERC), Igel and Schreiber developed the concept for the ROMY ring laser. The construction of the observatory, which was largely financed by LMU Munich, was an extremely challenging undertaking. Even the concrete structure in which ROMY is housed had to be erected with millimeter precision. ROMY is made up of a set of four ring lasers that form the faces of an inverted tetrahedron, each side of which is 12 m long. Two laser beams circulate in opposite directions around each face of the instrument. The beam traveling in the direction of rotation takes longer than its counterpart to complete each lap. This in turn causes its wavelength to be stretched, while the other is compressed. The difference in wavelength depends on the precise orientation of each face with respect to the direction and orientation of Earth's rotation. Data from three of the four rings suffice to determine all the parameters of planetary rotation. The fact that the ring laser has more than met its design criteria is naturally a relief -- and a source of great satisfaction -- for Igel. "We are able to measure not only the orientation of the Earth's axis of rotation, but also its rate of spin," he explains. The method so far employed to measure these parameters with high accuracy relies on very long baseline interferometry (VLBI). This requires the use of a worldwide network of radio telescopes, which use changes in the relative timing of pulsed emissions from distant quasars to determine their own positions. Owing to the involvement of multiple observatories, the VLBI data can only be analyzed after several hours. ROMY has some considerable advantages over this approach. It outputs data virtually in real time, which allows it to monitor short-term changes in rotation parameters. Thus, the new study is based on continuous observations over a period of more than 6 weeks. During this time, ROMY detected changes in the mean orientation of the Earth's axis of less than 1 arc second. In future, and with further improvements, ROMY's high-precision measurements will complement the data obtained by the VLBI strategy, and will serve as standard values for geodesy and seismology. The measurements are also of potential scientific interest in fields such as the physics of earthquakes and seismic tomography, says Igel. "In the context of seismology, we have already obtained very valuable data from earthquakes and from seismic waves caused by ocean currents," he adds.
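The measurement principle described above is the Sagnac effect: the beat frequency between the two counter-propagating beams is proportional to the ring area, the component of Earth's rotation along the ring normal, and the inverse of the wavelength and perimeter. The worked example below uses the 12 m triangular side length from the article; the helium-neon wavelength and the perfect alignment of the ring normal with Earth's rotation axis are illustrative assumptions.

```python
# Sagnac beat frequency for one ring face:
# delta_f = (4 * A / (lambda * P)) * (n . Omega)

import numpy as np

side = 12.0                             # side length of one triangular ring face, m (from article)
A = np.sqrt(3) / 4 * side**2            # area of the equilateral triangle, m^2
P = 3 * side                            # perimeter of the ring, m
lam = 632.8e-9                          # He-Ne laser wavelength, m (assumed)
omega_earth = 7.2921150e-5              # Earth's rotation rate, rad/s

scale_factor = 4 * A / (lam * P)
beat_freq = scale_factor * omega_earth  # assumes ring normal parallel to Earth's axis
print(f"maximum Sagnac beat frequency: {beat_freq:.0f} Hz")
# Tilting the ring reduces this by the cosine of the angle between the ring
# normal and the rotation axis, which is how changes in the orientation of
# Earth's axis show up in the measured frequencies.
```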
Earthquakes
2020
July 20, 2020
https://www.sciencedaily.com/releases/2020/07/200720102101.htm
A new idea on how Earth's outer shell first broke into tectonic plates
The activity of the solid Earth -- for example, volcanoes in Java, earthquakes in Japan, etc. -- is well understood within the context of the ~50-year-old theory of plate tectonics. This theory posits that Earth's outer shell (Earth's "lithosphere") is subdivided into plates that move relative to each other, concentrating most activity along the boundaries between plates. It may be surprising, then, that the scientific community has no firm concept of how plate tectonics got started. This month, a new answer has been put forward by Dr. Alexander Webb of the Division of Earth and Planetary Science & Laboratory for Space Research at the University of Hong Kong, in collaboration with an international team, in a newly published paper.
Dr. Webb and his team proposed that early Earth's shell heated up, which caused expansion that generated cracks. These cracks grew and coalesced into a global network, subdividing early Earth's shell into plates. They illustrated this idea via a series of numerical simulations, using a fracture mechanics code developed by the paper's first author, Professor Chunan Tang of the Dalian University of Technology. Each simulation tracks the stress and deformation experienced by a thermally-expanding shell. The shells can generally withstand about 1 km of thermal expansion (Earth's radius is ~6371 km), but additional expansion leads to fracture initiation and the rapid establishment of the global fracture network.Although this new model is simple enough -- Earth's early shell warmed up, expanded, and cracked -- superficially this model resembles long-discredited ideas and contrasts with basic physical precepts of Earth science. Before the plate tectonic revolution of the 1960's, Earth's activities and the distribution of oceans and continents were explained by a variety of hypotheses, including the so-called expanding Earth hypothesis. Luminaries such as Charles Darwin posited that major earthquakes, mountain-building, and the distribution of land-masses were thought to result from the expansion of the Earth. However, because Earth's major internal heat source is radioactivity, and the continuous decay of radioactive elements means that there is less available heat as time moves forward, thermal expansion might be considered far less likely than its opposite: thermal contraction. Why, then, do Dr. Webb and his colleagues think that early Earth's lithosphere experienced thermal expansion?"The answer lies in consideration of major heat-loss mechanisms that could have occurred during Earth's early periods," said Dr. Webb. "If volcanic advection, carrying hot material from depth to the surface, was the major mode of early heat-loss, that changes everything." Dominance of volcanism would have an unexpectedly chilling effect on the Earth's outer shell, as documented in Dr. Webb and co-author Dr. William Moore's earlier work (published in Nature in 2013). This is because new hot volcanic material taken from Earth's depths would have been deposited as cold material at the surface -- the heat would be lost to space. The evacuation at depth and piling up at the surface would have eventually required that the surface material sank, bringing cold material downwards. This continual downward motion of cold surface material would have had a chilling effect on the early lithosphere. Because Earth was cooling overall, the heat production and corresponding volcanism would have slowed down. Correspondingly, the downwards motion of lithosphere would have slowed with time, and thus even as the overall planet cooled, the chilled lithosphere would have been increasingly warmed via conduction from hot deep material below. This warming would have been the source of the thermal expansion invoked in the new model. The new modeling illustrates that if Earth's solid lithosphere is sufficiently thermally expanded, it would fracture, and the rapid growth of a fracture network would divide the Earth's lithosphere into plates.Dr. Webb and his colleagues continue to explore the early development of our planet, and of the other planets and moons in the solar system, via integrated field-based, analytical, and theoretical studies. 
Their field-based explorations bring them to far-flung sites in Australia, Greenland, and South Africa; their analytical research probes the chemistry of ancient rocks and their mineral components; and their theoretical studies simulate various proposed geodynamic processes. Together, these studies chip away at one of Earth and planetary science's greatest remaining mysteries: how and why did Earth go from a molten ball to our plate tectonic planet?
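The roughly 1 km of radial expansion that the simulated shells can withstand before fracturing (against Earth's ~6371 km radius, both figures from the article) can be translated into a strain and an equivalent warming with a quick calculation. The thermal-expansion coefficient used below is a generic value for rock assumed for illustration, not a parameter taken from the paper.

```python
# Convert the ~1 km of tolerable radial expansion into a strain and the
# temperature rise that would produce it for a representative rock.

radius = 6371e3   # Earth's radius, m (from the article)
dR = 1e3          # radial expansion the shell can withstand, m (from the article)
alpha = 1e-5      # linear thermal expansion coefficient of rock, 1/K (assumed)

strain = dR / radius
dT = strain / alpha
print(f"shell strain ~ {strain:.1e}; equivalent warming ~ {dT:.0f} K")
```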
Earthquakes
2020
July 16, 2020
https://www.sciencedaily.com/releases/2020/07/200716101559.htm
Using the past to predict the future: The case of Typhoon Hagibis
The past is often the window to our future, especially when it comes to natural disasters. Using data from the 2018 floods that struck southwestern Japan to calibrate a machine learning model, researchers from the International Research Institute of Disaster Science (IRIDeS) at Tohoku University and the Japan-Peru Center for Earthquake Engineering Research and Disaster Mitigation (CISMID, in Spanish), have successfully identified the flooding caused by Typhoon Hagibis.
Typhoon Hagibis devastated Japan in October 2019, killing 91 people, damaging 85,000 homes, and causing approximately $15 billion in damage. Flooding across the affected regions was profound. In natural disaster rescue and recovery efforts, real-time flood mapping is crucial. It allows governments to direct relief to the areas that need it most. To aid this effort, satellite images analyzed with artificial intelligence (AI) are often employed. Crucial to this is training data. Training data allow the algorithms to learn and produce outputs when new inputs arise, in a process known as machine learning. However, the amount of training data is limited in many cases. Collecting training data immediately following a disaster is costly, time-consuming and many times impossible. The study's authors at IRIDeS and CISMID evaluated the ability of a model to learn from the 2018 floods that struck southwestern Japan and to identify the floods induced by the 2019 Typhoon Hagibis. The resulting flood maps were consistent with the actual flooding maps released by local governments and public institutions. The authors note that "Our model successfully identifies the inundated areas and verified that AI can learn from past disasters, ultimately allowing us to better predict flooding in future events." They add that "Our next step in this project would be to incorporate data from the unknown event into a second stage of training for a more accurate estimation."
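The transfer-learning idea described above, training on a past flood and applying the model to a new one, can be sketched briefly. The array file names, the per-pixel feature choice and the classifier are illustrative assumptions; this is not the IRIDeS/CISMID model itself.

```python
# Minimal sketch: train a per-pixel flood classifier on labeled imagery from the
# 2018 floods, then apply it to imagery from Typhoon Hagibis (2019).

import numpy as np
from sklearn.ensemble import RandomForestClassifier

# 2018 training event: per-pixel features (e.g. pre/post-event satellite
# backscatter) and flood / no-flood labels. File names are hypothetical.
X_2018 = np.load("features_2018_floods.npy")   # shape (n_pixels, n_features)
y_2018 = np.load("labels_2018_floods.npy")     # 1 = inundated, 0 = dry

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_2018, y_2018)

# 2019 target event: the same features computed for Typhoon Hagibis imagery.
X_hagibis = np.load("features_2019_hagibis.npy")
flood_map = clf.predict(X_hagibis)             # per-pixel inundation estimate
print("estimated inundated fraction:", flood_map.mean())
```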
Earthquakes
2020
July 9, 2020
https://www.sciencedaily.com/releases/2020/07/200709172834.htm
New evidence of long-term volcanic, seismic risks in northern Europe
An ancient European volcanic region may pose both a greater long-term volcanic risk and a greater seismic risk to northwestern Europe than scientists had realized, geophysicists report in a new study.
The scientists are not predicting that a volcanic eruption or earthquake is imminent in the densely populated area, which is centered in the Eifel region of Germany, and covers parts of Belgium, the Netherlands, France and Luxembourg. But the study revealed activity that is uncommon for the region."Our findings suggest this region is an active volcanic system, and much more seismically active than many of the faults in Europe between the Eifel volcanic region and the Alps," said Paul Davis, a UCLA research professor of geophysics and a senior author of the study.Davis and his co-authors report subtle, unusual movements in the surface of the Earth, from which they conclude the Eifel volcanic region remains seismically active. The region has a long history of volcanic activity, but it has been dormant for a long time; scientists think the last volcanic eruption there was some 11,000 years ago.The geophysicists report that the land surface in that region is lifting up and stretching apart, both of which are unusual in Europe. Although the uplift is only a fraction of an inch per year, it is significant in geological terms, Davis said.The geophysicists analyzed global positioning system data from across Western Europe that showed subtle movements in the Earth's surface. That enabled them to map out how the ground is moving vertically and horizontally as the Earth's crust is pushed, stretched and sheared.The dome-like uplift they observed suggests those movements are generated by a rising subsurface mantle plume, which occurs when extremely hot rock in the Earth's mantle becomes buoyant and rises up, sending extremely hot material to the Earth's surface, causing the deformation and volcanic activity. The mantle is the geological layer of rock between the Earth's crust and its outer core.Corné Kreemer, the study's lead author, is a research professor at the University of Nevada, Reno's Nevada Bureau of Mines and Geology. He said many scientists had assumed that volcanic activity in the Eifel was a thing of the past, but the study indicates that no longer seems to be the case."It seems clear that something is brewing underneath the heart of northwest Europe," he said.The Eifel volcanic region houses many ancient volcanic features, including circular lakes known as maars -- which are remnants of violent volcanic eruptions, such as the one that created Laacher See, the largest lake in the area. The explosion that created Laacher See is believed to have occurred approximately 13,000 years ago, with an explosive power similar to that of the spectacular 1991 Mount Pinatubo eruption in the Philippines.The researchers plan to continue monitoring the area using a variety of geophysical and geochemical techniques to better understand potential risks.The research was supported by the Royal Dutch Academy of Sciences, the United States Geological Survey, the National Earthquake Hazard Reduction Program and NASA.
Earthquakes
2020