Collaborative Research: EarthCube Data Capabilities: Volcanology hub for Interdisciplinary Collaboration, Tools and Resources (VICTOR)

  • Lev, Einat (PI)

Project: Research project

Project Details

Description

This pilot project is designed to demonstrate a new and streamlined approach to modeling volcanic eruptions and analyzing data using advanced cyberinfrastructure. Risks to life and property stem from pyroclastic flows, volcanic ash fallout, and lava flow inundation, so the volcano science community strives to develop computer models to better understand these and similar phenomena. Yet effective models require a complex interplay between volcano science, the physics of flows, and computational science. The Volcanology hub for Interdisciplinary Collaboration, Tools and Resources (VICTOR) will build upon lessons learned from previous cyberinfrastructure (VHub). Access will be through a central web portal. All components will be cloud-based, allowing demand-based resource management, workflow portability and reproducibility, and access to high-performance computing for a broader scientific community. These components will provide volcano scientists with cyberinfrastructure to accelerate model development and application, reducing the threats from volcanic eruptions. An essential component of VICTOR will be exposing the community to, and training it in, the latest approaches in computational volcanology through demonstration examples, workshops, tutorials, and webinars.

Demonstration of science and computational workflows in VICTOR involves implementing new cyberinfrastructure capabilities and libraries of models to: (1) improve model verification, validation, and benchmarking; (2) assess linked and changing hazards; and (3) connect volcanic eruptions with other parts of the magma transport system and with other related earth systems in a robust, streamlined way. These workflows will use data science techniques, including machine learning and numerical inversion, to improve parameter estimation and uncertainty quantification. Workflows will be containerized using modern computing tools such as Docker and electronic notebooks, minimizing the time-intensive steps of locating, installing, running, and testing models. It will be demonstrated that model inputs and outputs can be standardized using workflows, facilitating studies of linked- and multi-hazard scenarios. Workflows can be saved, re-run, edited, and challenged, enhancing the reproducibility and reliability of the modeling process. Ultimately, these containerized models, data science tools, and provisioned low-barrier access to computing resources will increase usability by the community and accelerate the science.
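
To illustrate the kind of numerical inversion and uncertainty quantification mentioned above, the following Python sketch fits a hypothetical lava-surface cooling model to synthetic observations and derives rough parameter uncertainties from the Jacobian of the fit. The model, parameter names, and data here are invented for the example and are not part of VICTOR or any specific workflow described in the award; they simply show the general pattern of estimating model parameters from observations.

```python
# Illustrative sketch only: a toy numerical inversion with rough uncertainty
# estimates. The cooling model, parameter names, and synthetic data are
# hypothetical and are not VICTOR code.
import numpy as np
from scipy.optimize import least_squares

def cooling_model(params, t):
    """Toy lava-surface cooling curve: T(t) = T_ambient + dT * exp(-t / tau)."""
    t_ambient, d_t, tau = params
    return t_ambient + d_t * np.exp(-t / tau)

# Synthetic "observations" generated from known parameters plus noise.
rng = np.random.default_rng(0)
t_obs = np.linspace(0.0, 10.0, 50)           # hours
true_params = np.array([25.0, 900.0, 2.5])   # deg C, deg C, hours
temp_obs = cooling_model(true_params, t_obs) + rng.normal(0.0, 5.0, t_obs.size)

def residuals(params):
    return cooling_model(params, t_obs) - temp_obs

# Numerical inversion: fit the model parameters to the observations.
fit = least_squares(residuals, x0=[20.0, 500.0, 1.0])

# Rough uncertainty quantification from the Jacobian at the solution:
# cov ~ s^2 * (J^T J)^-1, with s^2 the residual variance.
dof = t_obs.size - fit.x.size
s2 = np.sum(fit.fun ** 2) / dof
cov = s2 * np.linalg.inv(fit.jac.T @ fit.jac)
sigma = np.sqrt(np.diag(cov))

for name, value, err in zip(["T_ambient", "dT", "tau"], fit.x, sigma):
    print(f"{name} = {value:.2f} +/- {err:.2f}")
```

In a containerized workflow of the sort described above, a script like this would be packaged with its dependencies (for example, via a Docker image) so that the same inversion can be re-run, edited, and reproduced without manual installation steps.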

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Status: Finished
Effective start/end date: 9/1/21 to 8/31/24

Funding

  • National Science Foundation: US$328,117.00

ASJC Scopus Subject Areas

  • Computer Science Applications
  • Geology
  • Research and Theory
