2002 Research Projects

Foreshock-Centric Foreshock Probabilities: Methodology and Application to Parkfield

Project Description: Agnew and Jones (JGR) proposed a method for calculating the conditional probability of a mainshock, given a potential foreshock. Their model takes as input the rate of background earthquakes, the rate of foreshocks before mainshocks, and the long-term probability of a mainshock in the absence of a foreshock. They applied their model using the concept of characteristic earthquakes on predefined fault segments. The region of possible mainshock nucleation defined by their model therefore relies on specifying a segment-specific background seismicity rate, and it assigns equal potential for mainshock nucleation to a foreshock located anywhere along the fault segment. This approach yields inconsistent results when a potential foreshock occurs in a region of closely spaced or overlapping faults, since different faults have different background rates and different long-term probabilities. Our model abandons this mainshock-centric view in favor of a foreshock-centric approach. The region of interest is redefined as the region within 10 km of the potential foreshock, and the probability of a mainshock is the integrated probability of earthquakes above a threshold magnitude nucleating anywhere within that region. The background rate is likewise calculated only in the area surrounding the potential foreshock, so the model yields a single total probability that the earthquake is a foreshock to any possible mainshock within 10 km. A preliminary test of this model was performed using hypothetical earthquakes near the Parkfield segment of the San Andreas fault, with the annual probability of a Parkfield mainshock in the absence of a foreshock taken from Michael and Jones (BSSA). The resulting conditional three-day probability of a mainshock, given a magnitude 5.0 potential foreshock, ranged from 0.0002 to 0.08 when the potential foreshock is near the southern end of the segment, and from 0.13 to 0.16 when it is near the northern end. These results are lower than the corresponding probabilities obtained with the Agnew and Jones (JGR) calculation, which ranged from 0.02 to 0.19 (Michael and Jones, BSSA).
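The combination such a calculation performs can be sketched as a generic Bayesian update over the alert window. The function below is an illustrative stand-in, not the Agnew and Jones formulation; all parameter names and the Poisson treatment of the background rate are assumptions for the sketch.

```python
import math

def p_main_given_foreshock(p_main, p_fore_given_main, bg_rate, window_days=3.0):
    """Sketch of a Bayesian foreshock-probability combination.

    p_main            -- long-term probability of the target mainshock
                         occurring within the alert window
    p_fore_given_main -- probability that such a mainshock is preceded by a
                         qualifying foreshock in this space/magnitude window
    bg_rate           -- background rate (events/day) of candidate-sized
                         earthquakes near the potential foreshock
    """
    # Poisson chance that the candidate is merely a background event
    p_bg = 1.0 - math.exp(-bg_rate * window_days)
    num = p_fore_given_main * p_main
    return num / (num + p_bg * (1.0 - p_main))
```

In a foreshock-centric version, p_main and bg_rate would be evaluated over the 10 km region around the candidate event rather than per fault segment, which is what removes the segment-assignment ambiguity described above.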
Intern(s): Anna Holt

Andrew Michael, United States Geological Survey, Menlo Park

The Feb. 2002 Simi Valley, the Oct. 2001 Compton, and the Oct. 1999 Hector Mine Earthquake Sequences

Project Description: Several recent earthquake sequences were captured by the dense, continuously recording TriNet network. Visual analysis of the continuous data reveals 50-150 events where the catalog reported only 15-50. For an undergraduate research project this summer, I am examining continuous data for the Feb. 2002 Simi Valley, the Oct. 2001 Compton, and the Oct. 1999 Hector Mine earthquake sequences. The Simi Valley and Compton sequences are unusual because they have the magnitude and quantity of aftershocks associated with a magnitude 5 mainshock but lack this mainshock. The Hector Mine earthquake has an unusually active foreshock sequence. I have compared these swarms with more typical foreshock-mainshock-aftershock sequences to better understand the physical processes active during bursts of seismicity. Using the Gutenberg-Richter magnitude-frequency distribution and the Omori law, I computed the b-value and p-value for each of these sequences and for other recent sequences within a 30 km radius. For each earthquake sequence, I focused on the first day after the mainshock, anticipating that the most distinctive aftershock behavior occurs early on. Our next step will be to use cross-correlation methods to find more accurate aftershock locations and gather further clues to the processes that are active during each sequence.
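Both statistics can be computed directly from a catalog. The sketch below uses the standard Aki (1965) maximum-likelihood b-value (with Utsu's binning correction) and a crude least-squares estimate of the Omori p-value; the binning choices and the c-value are illustrative assumptions, not necessarily the method used in the project.

```python
import numpy as np

def b_value_mle(mags, mc, dm=0.0):
    # Aki (1965) maximum-likelihood b-value, with Utsu's correction dm/2
    # for magnitudes reported in bins of width dm.
    m = np.asarray(mags, dtype=float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

def omori_p(times, c=0.05, nbins=12):
    # Crude p-value estimate for the modified Omori law n(t) = K/(t + c)^p:
    # least-squares slope of log rate versus log(t + c) in logarithmic time
    # bins.  (A sketch; maximum-likelihood fitting is the usual practice.)
    t = np.sort(np.asarray(times, dtype=float)) + c
    edges = np.logspace(np.log10(t[0]), np.log10(t[-1]), nbins)
    counts, _ = np.histogram(t, bins=edges)
    rates = counts / np.diff(edges)
    mids = np.sqrt(edges[:-1] * edges[1:])
    good = counts > 0
    slope, _ = np.polyfit(np.log10(mids[good]), np.log10(rates[good]), 1)
    return -slope
```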
Intern(s): Katy Tschann-Grimm

John Vidale, University of California, Los Angeles

SOPAC Internship

Project Description: I am currently setting up a real-time data (RTD) GPS program on a Windows 2000 system and testing its accuracy against the previous methods used to analyze GPS data. Right now I am focusing on the tropospheric delay the satellite signal accumulates as it travels through the atmosphere. It is important to know the accuracy of the real-time system so this information can be used in scientific and commercial applications in the future. This is my first week working with the RTD program, so I am basically learning as I go.
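One widely used way to model the hydrostatic part of this delay is the Saastamoinen zenith-delay formula. The sketch below assumes that model (not necessarily what the RTD software uses), together with a simple cosecant mapping to non-zenith elevations.

```python
import math

def zenith_hydrostatic_delay(pressure_hpa, lat_deg, height_m):
    # Saastamoinen model for the zenith hydrostatic (dry) tropospheric
    # delay in metres, from surface pressure (hPa), geodetic latitude
    # (degrees), and station height (m).
    lat = math.radians(lat_deg)
    return (0.0022768 * pressure_hpa
            / (1.0 - 0.00266 * math.cos(2.0 * lat) - 2.8e-7 * height_m))

def slant_delay(zenith_delay_m, elevation_deg):
    # Simple cosecant mapping: the slant delay grows as 1/sin(elevation).
    # Real processing uses more accurate mapping functions at low angles.
    return zenith_delay_m / math.sin(math.radians(elevation_deg))
```

At sea level the dry delay is roughly 2.3 m at zenith, which is why it matters for centimetre-level real-time positioning.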
Intern(s): Chris Helmer

Yehuda Bock, University of California, San Diego

HAZUS Internship

Project Description:
Intern(s): Jason Masters

Mark Benthien, University of Southern California

Topographically Induced Stresses on the San Andreas Fault

Project Description: Variations in surface topography must be associated with variations in supporting stress in the underlying crust. Here, we assume that the crust behaves as a linear elastic half-space to model the behavior of the associated stress. Understanding this stress can help with the study of faults that are close to or beneath major mountain ranges. One method for calculating the stress on a fault beneath such topography is to use the Boussinesq solution for the stress due to a point load. By convolving this solution with observed topography, it is possible to find the stress at every point on nearby faults. This study focuses on the San Andreas fault in the area of Southern California where it passes the San Gabriel Mountains. The model assumed the San Andreas fault to be a strong fault, so it did not shed the stress induced upon it by the regional topography. The compressive stress varies along the strike of the fault, from -2 MPa to -5 MPa. However, the shear stress caused by the topography is uniform along the chosen segment of the fault, at about 0.5 MPa in the right-lateral direction. Areas for future research include examining the stress created on dipping faults, such as the Sierra Madre fault; how the stress is distributed if the faults are assumed to be relatively weak; and the effects of non-elastic half-space models of the crust.
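The convolution described above can be sketched by treating each topography grid cell as a Boussinesq point load and summing the vertical-stress responses. Grid size, density, and depth below are illustrative assumptions; the full stress tensor resolved on a fault plane follows from the analogous kernels for the other components.

```python
import numpy as np

def boussinesq_szz(x, y, z):
    # Vertical normal stress at depth z from a unit vertical point load at
    # the origin of an elastic half-space: s_zz = 3 z^3 / (2 pi R^5).
    r5 = (x**2 + y**2 + z**2) ** 2.5
    return 3.0 * z**3 / (2.0 * np.pi * r5)

def topographic_szz(topo, dx, depth, rho=2700.0, g=9.81):
    # Treat each cell of the topography grid (elevations in metres) as a
    # point load P = rho*g*h*dx^2 and sum the Boussinesq responses to get
    # the vertical stress (Pa) at the given depth below every grid node.
    ny, nx = topo.shape
    loads = rho * g * topo * dx * dx
    ys = np.arange(ny)[:, None] * dx
    xs = np.arange(nx)[None, :] * dx
    szz = np.empty_like(topo, dtype=float)
    for j in range(ny):
        for i in range(nx):
            szz[j, i] = np.sum(
                loads * boussinesq_szz(xs - xs[0, i], ys - ys[j, 0], depth))
    return szz
```

A quick sanity check on the kernel: beneath a broad uniform load the vertical stress should approach the lithostatic value rho*g*h, since the Boussinesq vertical-stress influence integrates to the applied load.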
Intern(s): Erik Olson

Mark Simons, California Institute of Technology

Forecasting Earthquakes Using Accelerating Moment Release

Project Description: An algorithm recently developed by Rundle et al. (2002) to find regions of anomalous seismic activity associated with large earthquakes identified the location of an Mw=5.6 earthquake near Calexico, Mexico. In this paper we analyze the regional seismicity before this event, and a nearby Mw=5.7 event, using time-to-failure algorithms developed by Bowman et al. (1998) and Bowman and King (2001). The former finds the radius of a circular region surrounding the epicenter that optimizes the time-to-failure acceleration of seismic release. The latter optimizes acceleration based on the expected stress accumulation pattern for a dislocation source. Both methods found a period of accelerating seismicity in an optimal region, the size of which agrees with previously proposed scaling relations. This positive result suggests that the algorithms of Rundle et al. (2002), Bowman et al. (1998), and Bowman and King (2001) provide complementary techniques for true predictive analysis of seismicity sequences. We further show a preliminary assessment of the potential for a large event on selected major faults in southern California based on the stress accumulation technique.
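The optimization at the heart of these time-to-failure methods amounts to fitting a power-law acceleration of cumulative seismic release. A minimal sketch, using a grid search over failure time and exponent with linear least squares for the amplitude terms (the published algorithms additionally optimize the region geometry, which is omitted here):

```python
import numpy as np

def fit_time_to_failure(t, strain, tf_grid, m_grid):
    # Grid search for the power-law time-to-failure model
    #     strain(t) = A + B * (tf - t)**m     (B < 0 gives acceleration),
    # solving for A and B by linear least squares at each (tf, m) pair.
    best = (np.inf, None, None, None)
    for tf in tf_grid:
        if tf <= t.max():
            continue  # the failure time must postdate the data window
        for m in m_grid:
            x = (tf - t) ** m
            G = np.column_stack([np.ones_like(x), x])
            coef, *_ = np.linalg.lstsq(G, strain, rcond=None)
            misfit = np.sum((G @ coef - strain) ** 2)
            if misfit < best[0]:
                best = (misfit, tf, m, coef)
    _, tf, m, coef = best
    return tf, m, coef  # coef = (A, B)
```

In practice the fitted curve is compared against a linear trend (a curvature parameter) to decide whether the acceleration is significant.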
Intern(s): Julia Clark

David Bowman, California State University, Fullerton

Enhancing SCEC CEO Efforts with Innovative Web Technologies: Application to the California Seismic Safety Commission

Project Description: Web-based technologies are used to share information from the Southern California Earthquake Center (SCEC) with the California Seismic Safety Commission (CSSC) in an automatic and near real-time manner. The purpose of this project is to automatically extract subsets of data from SCEC databases and integrate this information with other available online databases to provide a quick and efficient summary of data to the CSSC. A near real-time Earthquake List has been developed as a pilot program. The CSSC is primarily interested in larger earthquakes that may damage persons or property. The Earthquake List is a working model, tested for magnitudes greater than 3.0, that will be able to provide useful information in the infrequent occurrence of a larger event. SCEC online data include comprehensive and rapidly updated earthquake catalogs. A subset of this information is extracted, converted to XML, and combined with information from the US Geological Survey (USGS) and the Census Bureau TIGER database using Extensible Stylesheet Language Transformations (XSLT), Java servlets, and Java applets; the result is rendered as a two-dimensional map. The Census Bureau TIGER database provides mapping capabilities that are combined with feature data acquired from the USGS to automatically generate an online map showing the earthquake epicenter and nearby cities, as well as bridges, tunnels, hospitals and airports within a 10-mile radius of the event. This information is automatically generated and available online within minutes of an earthquake. We are also experimenting with using data from the Electronic Encyclopedia of Earthquakes to enhance CSSC website content.
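The extract-and-convert step can be sketched in a few lines. The field names below are hypothetical stand-ins for the actual catalog schema, and the project itself performs this work with XSLT and Java rather than Python.

```python
import xml.etree.ElementTree as ET

def catalog_subset_to_xml(events, min_mag=3.0):
    # events: list of dicts with hypothetical keys id, time, mag, lat, lon.
    # Keep only events at or above min_mag and serialize them as a small
    # XML document ready for downstream XSLT transformation.
    root = ET.Element("earthquakes")
    for ev in events:
        if ev["mag"] < min_mag:
            continue
        node = ET.SubElement(root, "event", id=str(ev["id"]))
        for field in ("time", "mag", "lat", "lon"):
            ET.SubElement(node, field).text = str(ev[field])
    return ET.tostring(root, encoding="unicode")
```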
Intern(s): Sara Whipple

Rob Mellors, San Diego State University

Electronic Encyclopedia of Earthquakes Developer

Project Description:
Intern(s): Shawn Shapiro, Connie Hartling, Rebecca Hunt

Gerry Simila, California State University, Northridge
Sally McGill, California State University, San Bernardino
Jan Vermilye, Whittier College


How Far Southeast did the Surface Rupture of the 1812 Earthquake Extend? Implications of Paleoseismic Observations at the Plunge Creek Site, near San Bernardino, Southern California

Project Description: We have documented the stratigraphy and structure of several trenches across the San Bernardino strand of the San Andreas fault at the Plunge Creek site, near San Bernardino, southern California. The most recent faulting event exposed in the trenches (event W) appears to have occurred between about AD 1440 and AD 1660, if the radiocarbon dates are taken at face value. Two of the trenches reveal suggestive evidence for an older faulting event (event R), which post-dates AD 1220. Because the age control at Plunge Creek is based on radiocarbon dating of detrital-charcoal samples, we must consider all of the radiocarbon ages as maximum estimates of the depositional ages of the layers from which the samples were collected. Thus, event W is not strictly constrained to predate AD 1660. We use ecological arguments to infer that the detrital-charcoal samples at the Plunge Creek site probably overestimate the depositional ages of the sedimentary layers by about 1 +/- 1 fire cycle (i.e., by about 70 +/- 70 years). An independent estimate, based on extrapolation of sedimentation rates to the ground surface, suggests a similar value (0-95 years) for the lag time between the calibrated radiocarbon date of a sample and the depositional date of the layer from which it was collected. After applying an estimated correction (70 years) for the inherited ages of the detrital-charcoal samples, the date of event W is most likely between AD 1510 and AD 1730, with a preferred date of about AD 1630. The preferred date for event R is about AD 1450. It is unlikely that event W represents the southeastern continuation of the AD 1812 earthquake. For this to be true, the 16 dated samples that most closely overlie event W would all have to have had inherited ages larger than 245 years, if we use the median dates of the samples. Alternatively, all 16 of these samples would have to have had inherited ages larger than 150 years, if the true dates for all 16 samples were at the younger ends of their two-sigma error bars. It is also unlikely that any earthquake younger than event W has ruptured the ground surface at Plunge Creek. Although ambiguities exist in some of the trenches, relationships in trench 10 argue against any surface rupture subsequent to event W. Stratified deposits extend for the entire length of trench 10 and are unfaulted above the level of event W.
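The inherited-age correction amounts to shifting the raw radiocarbon window later in time by the assumed lag. A trivial sketch, using the 70-year preferred correction from the description:

```python
def correct_event_window(raw_window, inherited_age_yr=70):
    # Detrital charcoal predates deposition by roughly one fire cycle
    # (about 70 +/- 70 yr at this site), so the radiocarbon-based event
    # window (oldest AD date, youngest AD date) shifts later by that lag.
    oldest, youngest = raw_window
    return (oldest + inherited_age_yr, youngest + inherited_age_yr)
```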
Intern(s): Kathy Barton

Sally McGill, California State University, San Bernardino
