SCEC CSEP Workshop on Testing External Forecasts and Predictions
Conveners: Tom Jordan (SCEC/USC), Tom Bleier (Quakefinder), Andrew Michael (USGS), and Max Werner (Princeton)
Dates: May 7-8, 2013
Location: Boardroom, Davidson Conference Center, University of Southern California, Los Angeles, CA
Participation: 41 total
SYNOPSIS: The Collaboratory for the Study of Earthquake Predictability (CSEP), operated by the Southern California Earthquake Center (SCEC), provides an automated infrastructure for the blind prospective testing of earthquake forecasts. The only time-dependent forecasting procedures currently accommodated by CSEP are those that run automatically on CSEP computers and use seismicity data for updating the forecasts. To extend the range of forecasting models and CSEP testing capabilities, SCEC has undertaken a project, funded by the Department of Homeland Security, to develop a facility for registering and testing external forecasting and prediction (EFP) procedures; i.e., procedures run outside the existing CSEP infrastructure. This facility will allow investigators to document their methods and submit forecasts and predictions for retrospective and prospective testing in accordance with collaboratory standards.
The purposes of this workshop are threefold: (a) assess community needs for CSEP-based testing of EFP procedures; (b) gather community input on the design of an EFP registration and testing system; and (c) encourage modeling groups to participate in specific experiments to test their own external forecasts and predictions.
The workshop will bring together CSEP personnel, scientists interested in conducting formal evaluations of their forecasts, scientists interested in EFP testing procedures, and agency representatives. The agenda for the two-day workshop will include sessions on the following topics:
- Status and requirements for operational earthquake forecasting (OEF)
- Current CSEP capabilities
- Review of earthquake forecasts currently under CSEP testing
- Reports by EFP developers on the status of forecasting/prediction experiments
- Requirements for EFP registration into CSEP; acceptance testing; testing against reference models
- Break-out group discussions of prototype EFP experiments
- Recommendations from break-out groups for prototype EFP experiments
- Standardization of EFP metadata, submission formats, and acceptance tests; model classes and reference models
Presentation slides may be downloaded by clicking the links following the title. PLEASE NOTE: Files are the author’s property. They may contain unpublished or preliminary information and should only be used while viewing the talk.
TUESDAY, MAY 7, 2013
07:30-08:30 | Breakfast / Registration & Check-In | |
08:30-08:40 | Welcome and Introductions | T. Jordan / B. Davis |
Session 1: Overview of OEF and CSEP (Moderator: M. Blanpied; Reporter: L. Jones)
08:40-08:50 | OEF Status and Requirements: USGS Perspective | A. Michael |
08:50-09:00 | OEF Status and Requirements: ICEF Perspective | T. Jordan |
09:00-09:20 | Current CSEP Capabilities | M. Werner |
09:20-09:40 | Assessing earthquake predictions and seismicity forecasts: You have a model. So what? | J. Zechar |
09:40-10:00 | How CSEP Really Works | M. Liukis |
10:00-10:15 | Break | |
Session 2: Status of Forecasting / Prediction Experiments (Moderator: M. Gerstenberger; Reporter: J. Hardebeck)
10:15-10:30 | The Operationalization of UCERF3 - Implementation and Testing | E. Field |
10:30-10:45 | A time-dependent ensemble forecast model for Canterbury, New Zealand | M. Gerstenberger |
10:45-11:00 | Operational Earthquake Forecasting in Italy: perspectives and the role of CSEP activities | W. Marzocchi |
11:00-11:15 | Japanese earthquake predictability experiment with multiple runs before and after the 2011 Tohoku-oki earthquake | N. Hirata |
11:15-11:30 | History-dependent magnitude forecast by statistical discrimination of foreshocks | Y. Ogata |
11:30-11:45 | The Global Test of the M8-MSc predictions of the great and significant earthquakes: 20 years of experience | V. Kossobokov |
11:45-12:00 | CSEP forecasts on six oceanic transform fault segments | M. Boettcher |
12:00-13:00 | Lunch | |
13:00-13:15 | Example Prototype: Earthquake Early Warning | P. Maechling |
13:15-13:30 | Example Prototype: Geodetic Transient Detection | R. Lohman |
Session 3: Status of Forecasting / Prediction Experiments, continued (Moderator: T. Bleier; Reporter: C. Dunson)
13:30-13:38 | Overview of EM Forecasting | T. Bleier |
13:38-13:53 | EM Signals coming from deep below: Sources and Expressions | F. Freund |
13:53-14:00 | On the possible origins of Pre-seismic TIR anomalies | V. Tramutoli |
14:00-14:15 | The QuakeFinder Earthquake Forecasting Process | T. Bleier |
14:15-14:30 | Observations of EM precursors of earthquakes from magnetometers and EQLs | J. Heraud |
14:30-14:45 | Temporal and spatial anomalies of seismo-ionospheric GPS TEC | J. Liu |
14:45-15:00 | On the potential of satellite TIR surveys for a Dynamic Assessment of (short-term) Seismic Risk: some examples from the EU-FP7 PRE-EARTHQUAKES Project | V. Tramutoli |
15:00-15:15 | Multi-parameter observations of atmospheric pre-earthquake signals and their validation | M. Kafatos |
15:15-15:30 | Radio-Tomography Predictions, Results, Prospects and Problems (Supplemental Handout) | D. Rekenthaler |
15:30-15:45 | Break | |
Session 4: Developing Requirements for EFP Registration Into CSEP (Moderator: A. Michael; Reporter: M. Werner)
15:45-17:30 | Group Discussion: CSEP Template Requirements | |
17:30 | Adjourn | |
18:30-21:00 | Reception Dinner at The University Club | |
WEDNESDAY, MAY 8, 2013
07:30-08:30 | Breakfast | |
08:30-10:30 | Session 5: Recommendations for Prototype EFP Experiments - Establishing EFP Standards; Metadata, Submission Formats, and Acceptance Tests; Model Classes and Reference Models | Moderator: D. Schorlemmer; Reporter: P. Maechling |
Notes from T. Bleier and V. Tramutoli
10:30-10:45 | Break | |
10:45-12:00 | Session 6: Recommendations for Prototype EFP Experiments - Establishing EFP Standards; Metadata, Submission Formats, and Acceptance Tests; Model Classes and Reference Models (continued) | Moderator: D. Rhoades; Reporter: M. Liukis |
Notes from D. Rekenthaler, V. Kossobokov, J. Liu, and F. Freund
12:00-13:00 | Lunch | |
13:00-14:00 | Session 7: Recommendations and Wrap-Up Session | Moderators: T. Jordan, T. Bleier, A. Michael, and M. Werner |
- Perspectives from the EFP Developers (M. Werner)
- Perspectives from the CSEPers (P. Maechling)
- Perspectives from the International Collaborators / Testing Centers (Japan - N. Hirata; New Zealand - D. Rhoades; Europe - W. Marzocchi)
- Perspectives from DHS (B. Davis)
- Perspectives from NEPEC (T. Tullis)
- Perspectives from CEPEC (L. Jones)
- Perspectives from the USGS (A. Michael)
- Perspectives from SCEC (T. Jordan)
14:00 | Adjourn |
This workshop will be followed by the SCEC CSEP Technical Planning Meeting: Priorities, Technical Implementation and Next Steps on May 8-9, 2013.
PARTICIPANTS
Jeffrey O. Adjei (DHS Science and Technology Directorate), Harley Benz (USGS), Michael Blanpied (USGS Earthquake Hazards Program), Thomas Bleier (QuakeFinder), Margaret Boettcher (New Hampshire), Robert Dahlgren (SETI Institute), Bruce Davis (DHS), Clark Dunson (QuakeFinder), Karen Felzer (USGS), Ned Field (USGS), Friedemann Freund (SETI/SJSU/NASA Ames Research Center), Matt Gerstenberger (GNS Science), Jeanne Hardebeck (USGS), Ruth Harris (USGS), Jorge Heraud (Pontificia U Catolica del Peru), Naoshi Hirata (ERI, the University of Tokyo), Tran Huynh (USC/SCEC), Dave Jackson (UCLA), Lucile Jones (USGS), Thomas Jordan (USC/SCEC), Menas Kafatos (CEESMO, Chapman University), Vladimir Kossobokov (Russian Academy of Sciences), Jann-Yenq (Tiger) Liu (National Central University, Taiwan), Masha Liukis (SCEC), Rowena Lohman (Cornell), Jeffrey Love (USGS), Philip Maechling (SCEC), Warner Marzocchi (INGV Rome), Jeff McGuire (WHOI), John McRaney (USC/SCEC), Andy Michael (USGS), Kevin Milner (USC/SCEC), Yosi Ogata (ISM/U Tokyo), Doug Rekenthaler (RHP LLC & EQ Warnings, Inc.), David Rhoades (GNS Science), Danijel Schorlemmer (GFZ Potsdam), Valerio Tramutoli (University of Basilicata), Terry Tullis (Brown), Max Werner (Princeton), John Yu (USC/SCEC), Jeremy Zechar (ETH Zurich)
SCEC CSEP Technical Planning Meeting: Priorities, Technical Implementation and Next Steps
Conveners: Tom Jordan (USC/SCEC), Phil Maechling (USC/SCEC), Andy Michael (USGS), Danijel Schorlemmer (GFZ), Max Werner (Princeton)
Dates: May 8-9, 2013
Location: Boardroom, Davidson Conference Center, University of Southern California, Los Angeles, CA
SYNOPSIS: The CSEP Workshop on Testing External Forecasts and Predictions (EFPs), taking place immediately prior to this technical planning meeting, is expected to result in recommendations for prototype EFP experiments. Given the scope of and broad participation in the workshop, this smaller follow-up meeting among CSEP personnel and scientists will help establish the associated technical requirements for prototype EFP experiments, identify priorities and required resources, and determine the next steps for CSEP EFP activities. In addition, this meeting will leverage workshop participation by global CSEP scientists to recommend CSEP priorities on prospective and retrospective short-term forecasting experiments.
The purposes of this meeting are to (a) assess CSEP requirements and priorities for EFP prototype experiments; (b) develop an EFP prototype implementation plan; (c) assess designs for new short-term and retrospective experiments; and (d) recommend priorities for CSEP activities.
This meeting will bring together CSEP personnel (scientists and developers), scientists committed to implementing EFP and OEF testing procedures, and agency representatives. The expected outcomes include:
- A document defining recommended prototype EFP experiments (submission formats, acceptance tests, model classes, reference models, evaluation procedures, metadata)
- A list of tasks to implement the prototype EFP experiments
- Recommendations for the design of new short-term experiments
- Recommendations for retrospective experiments
- A recommended list of priorities for EFP-related and other CSEP activities, judged by scientific merit, technical challenges, and available resources
WEDNESDAY, MAY 8, 2013
14:30-14:40 | Meeting Goals and Objectives | M. Werner |
Session 1: Technical Planning for Implementing the Recommended Prototype EFPs (Moderator: P. Maechling; Reporter: M. Liukis)
14:40-15:10 | Recommended Prototype EFPs: Technical Requirements and Next Steps | P. Maechling / M. Werner |
15:10-15:30 | Discussion | |
Session 2: CSEP Experiments to Support OEF/EFP Activities (Moderator: W. Marzocchi; Reporter: J. Zechar)
15:30-15:45 | Retrospective CSEP Experiments in New Zealand | D. Rhoades / M. Gerstenberger |
15:45-16:00 | Short-Term Experiments to Support OEF in California | A. Michael / M. Werner |
16:00-17:30 | Discussion | |
17:30 | Adjourn | |
18:30-21:00 | Group Dinner at The Lab Gastropub | |
THURSDAY, MAY 9, 2013
07:30-08:30 | Breakfast | |
Session 3: Experiment Design: Assumptions, Issues, Solutions
08:30-08:40 | Weaknesses and their Solutions in Current CSEP Experiments | W. Marzocchi |
08:40-08:50 | Causality analysis by point process models | Y. Ogata |
08:50-09:00 | Event-Based (Point-Wise) Experiments: Theory, Merits, Challenges | M. Werner |
09:00-09:30 | Discussion | |
Session 4: Data Uncertainties in CSEP Experiments
09:30-09:45 | Handling Data Uncertainties in Evaluations | J. Zechar |
09:45-10:00 | Discussion | |
10:00-10:15 | Break | |
Session 5: Strategies for Retrospective Experiments
10:15-10:30 | The Canterbury/Christchurch Retrospective Experiment | W. Marzocchi |
10:30-11:00 | Discussion: CSEP Strategies for Retrospective Experiments | |
Session 6: Wrap-Up and Recommended Priorities for CSEP
11:00-12:30 | Summary and Recommended Priorities for CSEP | |
12:30 | Adjourn |