Development and validation of real-time earthquake hazard models

Project reference: 1707
Operational earthquake forecasting is an emerging direction in seismology that focuses on providing real-time earthquake likelihoods as an aftershock sequence evolves. The moment a major earthquake occurs, decision-makers need expert advice both to support proactive actions that minimise loss of life and property and to lead rescue operations. Although recent research advances provide a starting scientific hypothesis for aftershock triggering mechanisms, the challenge we still face is to develop and validate testable forecast models in real-time settings.
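To make concrete the kind of quantity such a forecast provides, the sketch below computes the expected number of aftershocks in a time window from the modified Omori (Omori-Utsu) decay law. This is a generic textbook illustration with hypothetical parameter values, not the project's actual model.

```python
# Illustrative sketch only: expected aftershock count from the modified
# Omori (Omori-Utsu) law. K, c and p are hypothetical values, not
# parameters of the project's models.

def omori_rate(t, K=100.0, c=0.05, p=1.1):
    """Aftershock rate (events per day) at time t days after the mainshock."""
    return K / (c + t) ** p

def expected_aftershocks(t_start, t_end, K=100.0, c=0.05, p=1.1):
    """Expected number of aftershocks between t_start and t_end (days),
    from the closed-form integral of the Omori-Utsu rate (valid for p != 1)."""
    return K / (1.0 - p) * ((c + t_end) ** (1.0 - p) - (c + t_start) ** (1.0 - p))

# Expected number of events during the first five days of a sequence.
print(expected_aftershocks(0.0, 5.0))
```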
The focus of this project is to develop a computationally efficient environment that enables seismologists to use the preliminary data products and deliver an earthquake forecast model within a few hours. Then, using the early aftershock data from the first few days, we aim to validate our prediction matrix using statistical metrics such as log-likelihood statistics.
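As an illustration of such a validation step, the sketch below scores a hypothetical gridded forecast against observed aftershock counts using a joint Poisson log-likelihood, in the spirit of standard forecast-testing metrics; the cell rates, counts and function names are invented for the example.

```python
# Illustrative sketch: Poisson joint log-likelihood of observed aftershock
# counts under a gridded forecast (each cell treated as an independent
# Poisson variable with the forecast rate as its expectation). All numbers
# below are hypothetical.
from math import lgamma, log

def poisson_log_likelihood(forecast_rates, observed_counts):
    """Joint log-likelihood of the observed counts under the forecast rates."""
    ll = 0.0
    for mu, k in zip(forecast_rates, observed_counts):
        ll += k * log(mu) - mu - lgamma(k + 1)
    return ll

# Hypothetical 5-cell forecast (expected events per cell) versus the events
# observed in the first few days of the sequence.
rates = [0.5, 2.0, 4.5, 1.0, 0.2]
counts = [1, 3, 5, 0, 0]
print(poisson_log_likelihood(rates, counts))
```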
Ongoing scientific work on the subject covers a number of worldwide cases; a recent highlight is the UK effort to forecast the evolving seismic hazard in Italy following the devastating August–October 2016 earthquakes. It should be noted that the work will have important societal impact, since it will provide scientists with the computational tools to support informed decision-making and advisories from state officials in the UK. The British Geological Survey is among the NERC science centres providing emergency scientific advice on evolving hazards across multiple disciplines.
Earthquake forecast following the 2015 M=7.8 Nepal mainshock, also showing the distribution of earthquakes within the first 5 days. From Segou and Parsons (2016), Seismological Research Letters.
Project Mentor: Amy Krause
Site Co-ordinator: Anne Whiting
Learning Outcomes:
The student will work at the interface of geoscience and computer science on forecast models. They will translate earthquake hazard models previously developed in Matlab into a real-time data streaming application, and will learn about Python-based streaming tools, for example Apache Spark.
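For orientation, the sketch below shows one way an incoming stream of earthquake events could be consumed with Spark Structured Streaming from Python; the socket source, event schema and magnitude threshold are assumptions made for illustration and do not reflect the project's actual design.

```python
# Illustrative sketch: consuming a stream of earthquake events with
# Spark Structured Streaming (PySpark). The socket source, field names
# and thresholds are assumptions for this example only.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col, window
from pyspark.sql.types import StructType, StructField, DoubleType, TimestampType

spark = SparkSession.builder.appName("aftershock-stream").getOrCreate()

# Assumed per-event schema: origin time, epicentre and magnitude.
schema = StructType([
    StructField("time", TimestampType()),
    StructField("lat", DoubleType()),
    StructField("lon", DoubleType()),
    StructField("mag", DoubleType()),
])

# Read newline-delimited JSON events from a placeholder socket source.
events = (spark.readStream
          .format("socket")
          .option("host", "localhost")
          .option("port", 9999)
          .load()
          .select(from_json(col("value"), schema).alias("e"))
          .select("e.*"))

# Count events above magnitude 3 in one-hour windows, as a stand-in for
# the per-cell rate updates a forecast model would compute.
counts = (events.filter(col("mag") >= 3.0)
          .groupBy(window(col("time"), "1 hour"))
          .count())

query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```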
Student Prerequisites (compulsory):
A strong programming background, with an interest in HPC, parallel programming and real-time data processing and data streaming.
Student Prerequisites (desirable):
Experience in Python or Matlab, parallel programming techniques, big data engineering, and/or the willingness to learn these technologies.
Training Materials:
These will be provided to the successful student once they accept the placement.
Workplan:
- Week 1: Familiarisation with the existing Matlab-based models, and with the streaming tools and development environment (Python, Apache Spark)
- Weeks 2 & 3: Design the port of the existing models from Matlab to a stream-based architecture, in consultation with geoscientists from the British Geological Survey (BGS)
- Weeks 4–7: Develop the earthquake hazard models as outlined in the design phase; testing and documentation
- Week 8: Final report
Final Product Description:
The final product will be an earthquake forecast model running in a real-time streaming setting.
Adapting the Project: Increasing the Difficulty:
The implementation could be tested against many models, and a real-time test scenario with large datasets could be run.
Resources:
Nothing specific, just a desktop/laptop machine capable of running the development tools.
Organisation:
Edinburgh Parallel Computing Centre (University of Edinburgh)