Machine learning-detected signal predicts time to earthquake

Machine-learning research published today in two related papers in Nature Geoscience reports the detection of seismic signals that accurately predict the Cascadia fault’s slow slippage, a type of failure observed to precede large earthquakes in other subduction zones.

Los Alamos National Laboratory researchers applied machine learning to analyze Cascadia data and discovered that the megathrust broadcasts a constant tremor, a fingerprint of the fault’s displacement. More importantly, they found a direct parallel between the loudness of the fault’s acoustic signal and its physical changes. Cascadia’s groans, previously discounted as meaningless noise, foretold its fragility.
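The article does not describe the method in detail, but the stated link between signal loudness and fault movement suggests the general recipe reported for this line of work: compute statistical features over windows of the continuous seismic signal and regress a measure of fault slip against them. The sketch below is only an illustration of that idea, using synthetic data, hypothetical feature choices, and scikit-learn’s GradientBoostingRegressor as a stand-in for whatever model the authors actually used.

```python
# Minimal sketch (not the authors' code): regress a slip-rate proxy from
# statistical features of a continuous seismic signal. All data here are
# synthetic; feature and model choices are illustrative assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def window_features(signal, window=1000):
    """Summary statistics of each non-overlapping window of the raw signal."""
    n = len(signal) // window
    chunks = signal[: n * window].reshape(n, window)
    return np.column_stack([
        chunks.var(axis=1),                        # signal "loudness" (variance)
        np.abs(chunks).mean(axis=1),               # mean absolute amplitude
        chunks.max(axis=1) - chunks.min(axis=1),   # peak-to-peak range
    ])

# Synthetic stand-in: tremor amplitude loosely tracks an unseen slip rate.
slip_rate = np.abs(np.sin(np.linspace(0, 20, 500))) + 0.1
signal = np.concatenate([rng.normal(0, s, 1000) for s in slip_rate])

X = window_features(signal)      # one feature row per window
y = slip_rate                    # one target value per window

X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=False)
model = GradientBoostingRegressor().fit(X_train, y_train)
print(f"R^2 on held-out windows: {model.score(X_test, y_test):.2f}")
```

The key design point mirrored here is that the model never sees the raw waveform, only window-level statistics of the continuous signal, which is why a signal once dismissed as noise can carry predictive information.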

“Cascadia’s behavior was buried in the data. Until machine learning revealed precise patterns, we all discarded the continuous signal as noise, but it was full of rich information. We discovered a highly predictable sound pattern that indicates slippage and fault failure,” said Los Alamos scientist Paul Johnson. “We also found a precise link between the fragility of the fault and the signal’s strength, which can help us more accurately predict a megaquake.”

Read more at: https://phys.org/news/2018-12-machine-learning-detected-earthquake.html

Source: Machine learning-detected signal predicts time to earthquake
