Stuart Russell -- Global seismic monitoring for the Comprehensive Nuclear-Test-Ban Treaty


The interpretation of sensor data from multiple, geographically dispersed sensors is a ubiquitous challenge in science and engineering. Problems of noise, sensor imperfections, and signal propagation uncertainty, as well as the complexities of data association, can make the reliable detection of events extremely difficult. This is particularly true in global monitoring for the Comprehensive Nuclear-Test-Ban Treaty (CTBT).

In our work, we use real-time data from the UN's International Monitoring System (IMS) to detect events in the atmosphere, oceans, and underground that might be nuclear explosions. The IMS is the world's primary global-scale, continuous, real-time system for seismic event monitoring. Data from over 240 IMS stations (seismic, hydroacoustic, and infrasound) are transmitted via satellite in real time to the International Data Center (IDC) in Vienna, where event bulletins are issued daily. Perfect performance remains well beyond the reach of current technology: the final (SEL3) bulletin from IDC's automated system, a highly complex and well-tuned piece of software, misses nearly one third of all seismic events in the magnitude range of interest, and about half of the reported events are spurious. A large team of expert analysts post-processes the automatic bulletins to improve their accuracy to acceptable levels.
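As a rough illustration of how those two error rates are computed, the toy sketch below scores a hypothetical automatic bulletin against an analyst-reviewed reference bulletin. The association tolerances and event tuples are invented for illustration only; real bulletin comparison is considerably more involved.

```python
# Toy bulletin scoring: miss rate and false-alarm rate.
# Tolerances and event values are invented for illustration.

TIME_TOL = 50.0   # seconds (assumed association tolerance)
DIST_TOL = 2.0    # degrees (assumed association tolerance)

def matches(auto_ev, ref_ev):
    """Crude association: close enough in origin time and location."""
    return (abs(auto_ev["time"] - ref_ev["time"]) < TIME_TOL
            and abs(auto_ev["lat"] - ref_ev["lat"]) < DIST_TOL
            and abs(auto_ev["lon"] - ref_ev["lon"]) < DIST_TOL)

def score(automatic, reference):
    """Miss rate over the reference; false-alarm rate over the automatic list."""
    missed = sum(not any(matches(a, r) for a in automatic) for r in reference)
    spurious = sum(not any(matches(a, r) for r in reference) for a in automatic)
    return {"miss_rate": missed / len(reference),
            "false_alarm_rate": spurious / len(automatic)}

reference = [  # analyst-reviewed "ground truth" (invented numbers)
    {"time": 0.0,    "lat": 10.0, "lon":  20.0},
    {"time": 900.0,  "lat": -5.0, "lon": 140.0},
    {"time": 1800.0, "lat": 35.0, "lon":  70.0},
]
automatic = [  # automatic bulletin: two real events, two spurious
    {"time": 12.0,   "lat": 10.3, "lon":  19.8},
    {"time": 905.0,  "lat": -4.8, "lon": 139.7},
    {"time": 3000.0, "lat":  0.0, "lon":   0.0},
    {"time": 4000.0, "lat": 50.0, "lon": -30.0},
]
print(score(automatic, reference))  # miss_rate ~1/3, false_alarm_rate ~1/2
```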

Our approach is based on generative Bayesian modeling and inference. The first-generation model, NET-VISA (NETwork processing via Vertically Integrated Seismic Analysis), incorporates submodels for event occurrence, signal generation, signal propagation, signal detection, the characteristics of detected signals, and local noise at seismic stations. A "detected signal" is a blip with an arrival time and amplitude, as estimated by the UN's existing station processing software. (More than 90% of all such blips are in fact just local station noise.) Given the model and the observed blips, probabilistic inference produces a hypothesized bulletin of events -- with locations, times, depths, and magnitudes -- that best explains the observations. NET-VISA inference involves a dynamically constructed graphical model of unbounded size and time-varying structure; during large events with many aftershocks, the graphical model may contain as many as 500,000 variables.
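The forward-sampling sketch below conveys the *structure* of such a generative model: events occur as a Poisson process, each station detects each event with some probability and records a noisy arrival, and stations also generate false blips from local noise. Every distribution, constant, and name here is an illustrative placeholder, not the actual NET-VISA submodel.

```python
import math
import random

random.seed(0)

STATIONS = ["STA1", "STA2", "STA3"]   # hypothetical station codes
EVENT_RATE = 2.0       # expected events per window (assumed)
NOISE_RATE = 10.0      # expected false blips per station per window (assumed)
DETECT_PROB = 0.6      # per-station detection probability (assumed)
WINDOW = 3600.0        # window length in seconds

def poisson(lam):
    """Sample from a Poisson distribution (Knuth's method; fine for small rates)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def sample_events():
    """Event-occurrence submodel: Poisson process over the window."""
    return [{"time": random.uniform(0.0, WINDOW),
             "mag": 3.0 + random.expovariate(1.0)}
            for _ in range(poisson(EVENT_RATE))]

def sample_blips(events):
    """Detection, propagation, and station-noise submodels."""
    blips = []
    for sta in STATIONS:
        for ev in events:
            if random.random() < DETECT_PROB:        # detection submodel
                travel = random.gauss(600.0, 5.0)    # propagation + timing error (s)
                blips.append({"sta": sta,
                              "arrival": ev["time"] + travel,
                              "amp": ev["mag"] + random.gauss(0.0, 0.3)})
        for _ in range(poisson(NOISE_RATE)):         # local station noise
            blips.append({"sta": sta,
                          "arrival": random.uniform(0.0, WINDOW),
                          "amp": random.expovariate(1.5)})
    return blips

events = sample_events()
blips = sample_blips(events)
print(f"{len(events)} events generated {len(blips)} blips across {len(STATIONS)} stations")
```

Inference runs this model in reverse: given only the blips, it searches over candidate bulletins (sets of hypothesized events and their associated detections) for the hypothesis with the highest posterior probability.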

Compared to the automated SEL3 bulletin, NET-VISA reduces the rate of detection failure by a factor of 2 to 3 while maintaining the same false-alarm rate. NET-VISA also detects numerous events that were previously missed by the human analysts. In November 2014, the UN announced that NET-VISA would become the new monitoring algorithm for the CTBT. NET-VISA then ran continuously at the IDC in "development" mode, where it was in frequent use by analysts. On January 1, 2018, NET-VISA became an official part of the verification regime.

The second-generation SIG-VISA model extends the generative model all the way to detailed waveforms, rather than stopping at blips. Of particular interest is the fact that the SIG-VISA model expresses, via Gaussian processes, the smooth local dependence of waveform structure on the path taken by the signal through the Earth. This enables SIG-VISA to derive automatically the benefits of seismological techniques such as waveform matching, and it results in dramatically improved sensitivity: recent experiments show a tenfold improvement in low-magnitude event detection compared to bulletins produced by expert human analysts. The sketch below conveys the core idea.
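Under strong simplifying assumptions, the idea can be illustrated by treating a single scalar waveform feature as a Gaussian process over source location, so that nearby historical events constrain the expected waveform of a new candidate event. The kernel, length scale, and data below are invented for illustration and are not the actual SIG-VISA parameterization.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=50.0, variance=1.0):
    """Squared-exponential covariance over event locations (km)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale**2)

# Historical event locations (x, y in km) and one observed waveform
# feature (e.g., a single envelope coefficient) per event -- all invented.
X_train = np.array([[0.0, 0.0], [10.0, 5.0], [40.0, 20.0], [45.0, 25.0]])
y_train = np.array([0.8, 0.75, -0.3, -0.35])
noise = 0.05

# Standard GP posterior at a candidate event location.
X_new = np.array([[42.0, 22.0]])
K = rbf_kernel(X_train, X_train) + noise**2 * np.eye(len(X_train))
K_s = rbf_kernel(X_train, X_new)
mean = K_s.T @ np.linalg.solve(K, y_train)
var = rbf_kernel(X_new, X_new) - K_s.T @ np.linalg.solve(K, K_s)

print(f"predicted feature: {mean[0]:.3f} +/- {np.sqrt(var[0, 0]):.3f}")
```

Because the prior says nearby sources produce similar waveforms, observations from the two nearest historical events dominate the prediction; this smoothness assumption is what lets waveform-matching-style gains emerge directly from the model.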

Publications on NET-VISA

Publications on SIG-VISA