# Comments on the Global Earthquake Forecast (Technical)

One of the advantages of hindsight is that what seemed obscure in the past becomes obvious (here I recommend Duncan Watts's NY Times bestseller, "Everything Is Obvious: Once You Know the Answer"). We can see an example of this in the earthquake forecast that we run on this website. The NTW method, upon which the global forecast is based, was first developed and published in the peer-reviewed literature [1] as a regional forecast over areas the size of California and Nevada combined. An assumption implicit in this approach is that earthquakes over the entire region are correlated in space (infinite correlation length).

To select the few parameters in the model, we backtested it using statistical testing procedures including Reliability and ROC tests [1]. The more critical problem, however, is to compute forecasts in geographic regions of arbitrary size without assuming infinite correlation lengths. As a first approximation, the assumptions and parameters of the large regional domains were assumed to hold for smaller domains as well. On this basis, we presented a series of blog posts here showing forecasts for regions of 100- and 150-mile radius around cities such as Los Angeles, San Francisco, and Tokyo.

After working for two years to develop a globally consistent method that does not assume an infinite correlation length, we arrived at the forecast method shown as the current earthquake forecast on the earthquake viewer. Previously, the forecast shown on the earthquake viewer was a version of the Poisson-BASS forecast method that has been discussed in the literature, but whose assumptions have also been called into question.

Our new method, to be published in the peer-reviewed literature, involves considering each active 0.1-degree square on the planet and analyzing the earthquakes in a radially growing region around that location. As there are 3600 x 1800 = 6.48 million such squares on the planet, this turns out to be a rather large computational job, particularly if you want to present the results in a short period of time.
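To illustrate the kind of computation involved, here is a minimal sketch of selecting the earthquakes in a radially growing region around one grid cell. The function names and the stopping rule (grow the radius until a minimum number of events is enclosed) are illustrative assumptions for this post, not the published NTW algorithm:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    R = 6371.0  # mean Earth radius, km
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def growing_region_events(center, catalog, min_events):
    """Return the `min_events` catalog entries nearest to `center`,
    i.e. the contents of the smallest circle that encloses that many events.
    (Hypothetical selection rule, for illustration only.)"""
    lat0, lon0 = center
    by_distance = sorted(catalog, key=lambda q: haversine_km(lat0, lon0, q[0], q[1]))
    return by_distance[:min_events]

# Tiny synthetic catalog of (lat, lon) epicenters around a grid cell at (0, 0)
catalog = [(0.1, 0.1), (5.0, 5.0), (0.2, 0.0), (10.0, 10.0)]
nearest = growing_region_events((0.0, 0.0), catalog, min_events=2)
```

Repeating a selection like this for millions of grid cells against a global catalog is what makes the problem computationally demanding, and why efficient algorithms and precomputed storage matter.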

Happily, we were able to develop the necessary computational algorithms, and our new forecast is in fact a testament to tenacity. As will be discussed in a forthcoming publication [2], this new method has the following advantages:

1. It is computationally extremely efficient, and forecasts for the entire planet can be computed in a time scale of a few tens of minutes, then stored efficiently so that results can be presented to the user in a matter of seconds.

2. The hazard rates generally obey the Gutenberg-Richter relation. That means that when the probabilities are small, the probability of a magnitude 6 earthquake is approximately 10x the probability of a magnitude 7 earthquake.

3. Correlation lengths are finite, so that a small event near Los Angeles, for example, does not affect the probability of a large event near San Francisco.

4. The computed probabilities can be seen to fluctuate about the Poisson rates for a given region of arbitrary size. Thus the local NTW forecasts can be considered to represent perturbations around the local Poisson rates. The Poisson rates for large earthquakes in a given region can either be computed directly from the occurrence of the large events, or extrapolated from the rate of small-magnitude events using the Gutenberg-Richter relation. The latter approach is relevant in Japan, for example, where the large number of magnitude > 5 earthquakes since March 2011 would seem to imply the imminent occurrence of a magnitude 8+ earthquake.
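The Gutenberg-Richter scaling in point 2 and the Poisson extrapolation in point 4 can be sketched in a few lines. The a- and b-values below are placeholders chosen for illustration, not fitted parameters from any real catalog:

```python
import math

def gr_rate(a, b, magnitude):
    """Gutenberg-Richter relation: annual rate of events with
    magnitude >= `magnitude` is N = 10**(a - b*magnitude)."""
    return 10 ** (a - b * magnitude)

def poisson_probability(rate, years):
    """Probability of at least one event in `years`, assuming a
    Poisson process with the given annual rate."""
    return 1.0 - math.exp(-rate * years)

a, b = 4.0, 1.0  # illustrative GR parameters (b ~ 1 is typical globally)
rate_m6 = gr_rate(a, b, 6.0)   # 0.01 events/yr
rate_m7 = gr_rate(a, b, 7.0)   # 0.001 events/yr
p_m6 = poisson_probability(rate_m6, 1.0)
p_m7 = poisson_probability(rate_m7, 1.0)
# For small probabilities, 1 - exp(-x) ~ x, so p_m6/p_m7 ~ rate_m6/rate_m7 = 10**b
```

With b near 1, the rate ratio between successive magnitude units is a factor of 10, which is why a magnitude 6 is roughly ten times as probable as a magnitude 7 when both probabilities are small.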

The net result is that computed probabilities in some regions such as California have gone down, sometimes considerably, while computed probabilities in other areas such as Japan have gone up, again sometimes by relatively large factors.

In summary, computed values for the new NTW probabilities now seem "obvious", compared to the values computed in the previous Four Cities Forecasts.

[1] J.B. Rundle, J.R. Holliday, W.R. Graves, D.L. Turcotte, K.F. Tiampo and W. Klein, Probabilities for large events in driven threshold systems, Phys. Rev. E, 86, 021106 (2012)

[2] J.R. Holliday, W.R. Graves and J.B. Rundle, Localizing earthquake probabilities using the NTW method, to be submitted (2013)

## About OpenHazards Bloggers

**Steven Ward** is a Research Geophysicist at the Institute of Geophysics and Planetary Physics, UC Santa Cruz. He specializes in the quantification and simulation of natural hazards. **Read Steve's blog.**

**John Rundle** is a Distinguished Professor of Physics and Geology at UC Davis and the Executive Director of the APEC Collaboration for Earthquake Simulations. He chaired the Board of Advisors for the Southern California Earthquake Center from 1994 to 1996. **Read John's blog.**