The well sheets provide a summary of data for selected open-file New Zealand petroleum exploration wells. They are generated from the petroleum report library and GNS Science in-house post-drill data, integrating petrophysical logs, seismic data, sedimentology, and stratigraphy.
We undertake a seismic hazard analysis of the Clyde Dam site, fulfilling Phase 3 of the Clyde Dam Seismic Hazard Reassessment study. Site-specific horizontal magnitude-weighted earthquake acceleration spectra (5% damping) and relative displacement spectra are developed for the Clyde Dam site, assuming Class B (rock) site conditions. Cite as: Stirling MW, Litchfield NJ, Rhoades DA, McVerry GH, Van Dissen RJ. 2012. Clyde Dam seismic hazard reassessment, phase 3: seismic hazard analysis. Lower Hutt (NZ): GNS Science. 86 p. Consultancy Report 2012/168. Prepared for Contact Energy. doi:10.21420/8JG9-TJ73. The follow-on work from Consultancy Report 2012/168 has been developed into a journal paper for publication in the Bulletin of the Seismological Society of America: Stirling MW, Abbott ER, Rood DH, McVerry GH, Abrahamson NA, Barrell DJA, Huso R, Litchfield NJ, Luna L, Rhoades DA, et al. In review. First use of fragile geologic features to constrain the design motions for a major existing engineered structure. Bulletin of the Seismological Society of America.
A tsunami simulation model, such as COMCOT, solves a set of discretized mathematical equations that govern the physical processes of tsunami generation by various sources, propagation across ocean basins, and run-up and inundation in coastal areas. The source code of a tsunami simulation model is a collection of human-readable programming statements, translated from the discretized mathematical equations through a programming language such as FORTRAN. With a suitable compiler, the source code can be converted into an executable binary application, i.e. a numerical simulation program (model), which may run on various platforms, e.g. Windows or Linux systems. Model developers and other researchers carry out continuous validation to identify potential bugs and errors and to verify the model's accuracy against analytical solutions, results from other established models, laboratory experiments, and field observations from real tsunami events. Pre- and post-processing scripts for a tsunami model are often independent of the model's source code. They are developed to prepare input data for a tsunami simulation model and to further process and visualize the output data files of tsunami model simulations, e.g. to create inundation maps. These scripts may also be used to convert a tsunami model's proprietary data formats into other commonly used formats, e.g. GIS-compatible formats such as ESRI ARC ASCII, or vice versa. MATLAB and Python are two commonly used programming languages for these scripts; for example, MATLAB has been used to develop a set of data processing scripts for the COMCOT tsunami simulation model. COMCOT-API is a set of Python-based Application Programming Interface (API) scripts that augments and drives the COMCOT tsunami simulation model, enabling full automation of parameter studies and the development of algorithms for probabilistic hazard assessment with COMCOT as the tsunami simulation kernel.
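The format-conversion role of post-processing scripts can be illustrated with a minimal Python sketch. This is illustrative only, not COMCOT's actual scripts: the function name and the example grid values are invented. It writes a small 2-D grid of simulated wave heights to the ESRI ARC ASCII grid format, one of the GIS-compatible formats mentioned above:

```python
# Minimal sketch (not COMCOT's actual post-processing scripts): export a
# 2-D grid of simulated wave heights to the ESRI ARC ASCII grid format.

def write_esri_ascii(path, grid, xllcorner, yllcorner, cellsize, nodata=-9999.0):
    """Write a list-of-rows grid (northernmost row first) as an ESRI ASCII raster."""
    nrows = len(grid)
    ncols = len(grid[0])
    with open(path, "w") as f:
        # Six-line header required by the ESRI ASCII grid format.
        f.write(f"ncols {ncols}\n")
        f.write(f"nrows {nrows}\n")
        f.write(f"xllcorner {xllcorner}\n")
        f.write(f"yllcorner {yllcorner}\n")
        f.write(f"cellsize {cellsize}\n")
        f.write(f"NODATA_value {nodata}\n")
        # Data rows follow, space-separated, north to south.
        for row in grid:
            f.write(" ".join(f"{v:.3f}" for v in row) + "\n")

# Example: a tiny 2x3 grid of maximum wave heights (metres)
write_esri_ascii("eta_max.asc", [[0.1, 0.2, 0.3], [0.0, 0.5, 0.4]],
                 xllcorner=174.0, yllcorner=-41.5, cellsize=0.01)
```

The resulting `.asc` file can be loaded directly by most GIS packages, which is why the format is a common exchange target for model output.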
The API manages simulations on clusters with different queuing systems, farms simulation scenarios out to clusters, and collects data after simulations complete. It also creates non-uniform slip distributions as input to tsunami simulations, based on fault geometry and scaled with earthquake magnitude. User manuals provide detailed descriptions of the tsunami models, including their underlying mathematical equations, numerical discretization schemes, programming languages, software/hardware requirements, parameter setup, input and output data, example simulations, and sometimes validations. DOI: https://doi.org/10.21420/V6PE-TG20 Cite as: GNS Science. (2020). Tsunami Models Source Code, Scripts and Manuals. GNS Science. https://doi.org/10.21420/V6PE-TG20
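The magnitude scaling behind slip generation can be sketched briefly. This is a hedged illustration of the standard scaling relation, not COMCOT-API's actual code: it derives the seismic moment from moment magnitude (Hanks and Kanamori, 1979) and the average slip over a rectangular fault from M0 = mu * A * D; the function names and the example fault dimensions are invented.

```python
# Sketch of magnitude-to-slip scaling (not COMCOT-API's actual code).
import math

def seismic_moment(mw):
    """Seismic moment M0 in N*m from moment magnitude Mw
    (Hanks & Kanamori, 1979): log10(M0) = 1.5*Mw + 9.1."""
    return 10 ** (1.5 * mw + 9.1)

def average_slip(mw, length_km, width_km, rigidity=3.0e10):
    """Average slip D (m) over a rectangular fault, from M0 = mu * A * D.
    rigidity is the crustal shear modulus in Pa (3e10 is a common value)."""
    area_m2 = (length_km * 1e3) * (width_km * 1e3)
    return seismic_moment(mw) / (rigidity * area_m2)

# Example: an Mw 8.0 rupture on a 200 km x 50 km fault
slip = average_slip(8.0, 200, 50)  # roughly 4.2 m of average slip
```

A non-uniform slip distribution would then redistribute this average over subfaults while conserving the total moment; the sketch only covers the scaling step.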
The New Zealand Geothermal Analytical Laboratory (NZGAL) Information Management System records and stores all information related to samples received by NZGAL and their results. NZGAL contains: 1. Paper records relating to sample submission, client correspondence, sample treatment, measurement, results, and internal operations documents (SOPs and Quality Manual). 2. Digital databases and associated files: the QLIMS database, holding digital records of sample details from reception through measurement to results, 2011 to present; and the gas database, an MS Access database for recording geothermal gas analyses and calculations. 3. Internal digital and paper-based data relating to ongoing data quality control, measurement output, and internal procedures. DOI: https://doi.org/10.21420/D539-T756 Cite as: GNS Science. (2021). New Zealand Geothermal Analytical Laboratory Information Management System (NZGAL) [Data set]. GNS Science. https://doi.org/10.21420/D539-T756
Rupture models for a selection of large Aotearoa New Zealand earthquakes, compiled in a common format. Models derived from both seismic and geodetic data are provided, where available. The rupture models can be found on GitHub: https://github.com/GeoNet/data/tree/main/rupture-models. DOI: https://doi.org/10.21420/396B-3Y58 Cite as: GNS Science, GeoNet. (2019). GeoNet Aotearoa New Zealand Rupture Models Dataset [Data set]. GNS Science, GeoNet. https://doi.org/10.21420/396B-3Y58 For specific models, please cite using the reference paper's DOI.
Using seafloor image data to build single-taxon and community distribution models for seabed fauna in New Zealand waters. Understanding the spatial distributions of seabed biodiversity is essential for effective management of the effects of human activities, including fishing and mining. To improve understanding of seabed fauna distributions, we are developing a new database of benthic invertebrate occurrences in New Zealand waters by assembling quantitative data from all available seabed photographic surveys. By modelling the spatial relationships between taxon occurrences and environmental gradients across the region, we are able to predict the likelihood of individual taxa and communities being present in as-yet unsampled areas. In the first phase of the project, we concentrated on Chatham Rise, a region of high importance to commercial fisheries and with the highest density of available seabed imagery. Predictions from the models developed here are the first abundance-based models of benthic distributions in the New Zealand region and are the best-informed representations of seabed distributions on Chatham Rise to date, providing a resource that will have applications in marine environmental management and ecosystem research. All rasters are in GeoTIFF format at 1000 m cell size, projected to the WGS 84 / Mercator 41 (EPSG:3994) coordinate system.
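The relationship between taxon occurrence and an environmental gradient can be illustrated with a toy logistic response curve. This is purely illustrative and not the project's actual modelling code: the coefficients, the choice of depth as the sole predictor, and the function name are all invented for the sketch.

```python
# Toy single-predictor occurrence model (not the project's actual code):
# probability of a taxon being present as a logistic function of depth.
import math

def presence_probability(depth_m, b0=4.0, b1=-0.005):
    """Logistic response: P(presence) = 1 / (1 + exp(-(b0 + b1*depth))).
    b0 and b1 are made-up illustrative coefficients."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * depth_m)))

# Example: predicted presence probability declines with depth
p_shallow = presence_probability(400)   # around 0.88
p_deep = presence_probability(2000)     # around 0.002
```

The real models fit relationships like this across many environmental layers simultaneously and then evaluate them on the gridded rasters to produce the prediction surfaces described above.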