diff --git a/.github/workflows/blank.yml b/.github/workflows/blank.yml index 1a39f4a3..e6dbc24a 100644 --- a/.github/workflows/blank.yml +++ b/.github/workflows/blank.yml @@ -6,7 +6,15 @@ on: branches: - main tags: '*' + jobs: + pre-commit: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v3 + - uses: actions/setup-python@v3 + - uses: pre-commit/action@v3.0.0 + run_tests: name: Julia ${{ matrix.version }} - ${{ matrix.os }} - ${{ matrix.arch }} - ${{ github.event_name }} runs-on: ${{ matrix.os }} diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml new file mode 100644 index 00000000..1b0ab9e8 --- /dev/null +++ b/.pre-commit-config.yaml @@ -0,0 +1,20 @@ +# See https://pre-commit.com for more information +# See https://pre-commit.com/hooks.html for more hooks +repos: +- repo: https://github.com/pre-commit/pre-commit-hooks + rev: v3.2.0 + hooks: + - id: trailing-whitespace + - id: end-of-file-fixer + - id: check-yaml + - id: check-json + - id: check-added-large-files +- repo: https://github.com/codespell-project/codespell + rev: v2.2.1 + hooks: + - id: codespell + args: ["-L", "mor"] +# - repo: https://github.com/qiaojunfeng/pre-commit-julia-format +# rev: v0.1.1 +# hooks: +# - id: julia-format # formatter for Julia code diff --git a/README.md b/README.md index 1c27b6e1..f9ecd065 100644 --- a/README.md +++ b/README.md @@ -4,11 +4,11 @@ [![Dev](https://img.shields.io/badge/docs-dev-blue.svg)](https://juliageodynamics.github.io/GeophysicalModelGenerator.jl/dev/) [![Build Status](https://github.com/JuliaGeodynamics/GeophysicalModelGenerator.jl/workflows/CI/badge.svg)](https://github.com/JuliaGeodynamics/GeophysicalModelGenerator.jl/actions) -Creating consistent 3D images of geophysical and geological datasets and turning that into an input model for geodynamic simulations is often challenging. 
The aim of this package is to help with this, by providing a number of routines to easily import data and create a consistent 3D visualisation from it in the VTK-toolkit format, which can for example be viewed with [Paraview](https://www.paraview.org). In addition, we provide a range of tools that helps to generate input models to perform geodynamic simulations and import the results of such simulations back into julia. +Creating consistent 3D images of geophysical and geological datasets and turning that into an input model for geodynamic simulations is often challenging. The aim of this package is to help with this, by providing a number of routines to easily import data and create a consistent 3D visualisation from it in the VTK-toolkit format, which can for example be viewed with [Paraview](https://www.paraview.org). In addition, we provide a range of tools that help to generate input models to perform geodynamic simulations and import the results of such simulations back into julia. ![README_img](./docs/src/assets/img/Readme_pic.png) ### Contents -* [Main features](#main-features) +* [Main features](#main-features) * [Usage](#usage) * [Installation](#installation) * [Dependencies](#dependencies) @@ -26,14 +26,14 @@ Some of the key features are: - Grab screenshots of cross-sections or maps in published papers and view them in 3D (together with other data). - Create a consistent overview that includes all available data of a certain region. - Create initial model setups for the 3D geodynamic code [LaMEM](https://bitbucket.org/bkaus/lamem/src/master/). -- Import LaMEM timesteps. +- Import LaMEM timesteps. All data is transformed into either a `GeoData` or a `UTMData` structure which contains info about `longitude/latitude/depth`, `ew/ns/depth` coordinates along with an arbitrary number of scalar/vector datasets, respectively. 
All data can be exported to Paraview with the `Write_Paraview` routine, which transfers the data to a `ParaviewData` structure (that contains Cartesian Earth-Centered-Earth-Fixed (ECEF) `x/y/z` coordinates, used for plotting) - -## Usage + +## Usage The best way to learn how to use this is to install the package (see below) and look at the tutorials in the [manual](https://juliageodynamics.github.io/GeophysicalModelGenerator.jl/dev/). -## Installation +## Installation First, you need to install julia on your machine. We recommend to use the binaries from [https://julialang.org](https://julialang.org). Next, start julia and switch to the julia package manager using `]`, after which you can add the package. ```julia @@ -59,7 +59,7 @@ We rely on a number of additional packages, which are all automatically installe ## Visualising Alpine data -We have used this package to interpret various data sets of the Alps (mostly openly available, sometimes derived from published papers). You can download the resulting paraview files here (using the `*.vts` format), where we also included the julia scripts to do the work (some of which are also described in more detail in the tutorials). Just unzip the files and open the corresponding `*.vts` in Paraview. +We have used this package to interpret various data sets of the Alps (mostly openly available, sometimes derived from published papers). You can download the resulting paraview files here (using the `*.vts` format), where we also included the julia scripts to do the work (some of which are also described in more detail in the tutorials). Just unzip the files and open the corresponding `*.vts` in Paraview. [https://seafile.rlp.net/d/22b0fb85550240758552/](https://seafile.rlp.net/d/22b0fb85550240758552/) @@ -68,12 +68,12 @@ If you want your data be included here as well, give us an email (or even better You are very welcome to request new features and point out bugs by opening an issue. 
You can also help by adding features and creating a pull request. ## Development roadmap -In the pipeline: +In the pipeline: - More tutorials - Add more import tools. - Compute gravity anomalies for lon/lat datasets, rather than just x/y/z. -- Provide an interface to [geomIO](https://bitbucket.org/geomio/geomio/wiki/Home) (currently being translated from MATLAB to python) in order to allow creating 3D geometric model setups by drawing in Inkscape. -- Provide tools to create and export 3D geodynamic model setups. - +- Provide an interface to [geomIO](https://bitbucket.org/geomio/geomio/wiki/Home) (currently being translated from MATLAB to python) in order to allow creating 3D geometric model setups by drawing in Inkscape. +- Provide tools to create and export 3D geodynamic model setups. + ## Funding Development of this software package was funded by the German Research Foundation (DFG grants TH2076/7-1 and KA3367/10-1), which are part of the [SPP 2017 4DMB project](http://www.spp-mountainbuilding.de) project as well as by the European Research Council under grant ERC CoG #771143 - [MAGMA](https://magma.uni-mainz.de). The project was initiated at a [TeMaS](https://temas.uni-mainz.de) workshop with researchers from Frankfurt and Mainz where we realized that it is way too timeconsuming to collect available data of a certain region. 
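The `pre-commit` CI job and `.pre-commit-config.yaml` introduced at the top of this patch can also be run locally before pushing. A minimal sketch, assuming Python's `pip` is available to install the `pre-commit` tool:

```shell
# Install the pre-commit tool (assumes pip is on the PATH)
pip install pre-commit

# Register the hooks from .pre-commit-config.yaml as a git pre-commit hook
pre-commit install

# Run every configured hook (trailing-whitespace, end-of-file-fixer,
# check-yaml, check-json, check-added-large-files, codespell) on all
# files, mirroring what the pre-commit/action@v3.0.0 CI step does
pre-commit run --all-files
```

Note that `trailing-whitespace` and `end-of-file-fixer` modify files in place, which is what produced most of the whitespace-only hunks in this patch.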
diff --git a/docs/make.jl b/docs/make.jl index 772e757d..b80eff38 100644 --- a/docs/make.jl +++ b/docs/make.jl @@ -26,7 +26,7 @@ makedocs(; "ISC earthquake data" => "man/tutorial_ISC_data.md", "Plot GPS vectors" => "man/tutorial_GPS.md", "Read UTM data" => "man/tutorial_UTM.md", - "VoteMaps" => "man/Tutorial_Votemaps.md", + "VoteMaps" => "man/Tutorial_Votemaps.md", "Kilometer-scale volcano" => "man/tutorial_local_Flegrei.md", "Generating LaMEM model" => "man/LaPalma_example.md", "Create movies" => "man/tutorial_time_Seismicity.md" @@ -43,7 +43,7 @@ makedocs(; "LaMEM" => "man/lamem.md" ], "List of functions" => "man/listfunctions.md" - ], + ], ) deploydocs(; @@ -55,4 +55,3 @@ deploydocs(; forcepush=true, push_preview = true ) - diff --git a/docs/src/index.md b/docs/src/index.md index b63926cb..fb2cb22b 100644 --- a/docs/src/index.md +++ b/docs/src/index.md @@ -9,7 +9,7 @@ Documentation for [GeophysicalModelGenerator](https://github.com/JuliaGeodynamic The main purpose of this package is to simplify the process of going from 1D/2D/3D geophysical data to a 3D consistent model of the region. By simplifying the process of plotting the data, it becomes easier to compare different data sets, and generate a 3D models that can be used for other computations such as geodynamic simulations, or forward modelling of gravity anomalies. For this, GeophysicalModelGenerator provides the following functionality: -- A consistent GeoData structure, that holds the data along with lon/lat/depth information. +- A consistent GeoData structure, that holds the data along with lon/lat/depth information. - Routines to generate VTK files from the GeoData structure in order to visualize results in Paraview. - The ability to deal with points, 2D profiles and 3D volumes, for both scalar and vector values. - Rapidly import screenshots of published papers and compare them with other data sets in 3D using paraview. 
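The workflow that `docs/src/index.md` describes (build a consistent `GeoData` structure, then generate a VTK file for Paraview) can be sketched with the package's own routines. This is a hedged example based on the tutorials; the grid extents and the field name `Depthdata` are purely illustrative:

```julia
using GeophysicalModelGenerator

# Build a regular lon/lat/depth grid; the `km` unit comes from GeoParams,
# which GeophysicalModelGenerator re-exports
Lon, Lat, Depth = LonLatDepthGrid(10:0.5:20, 40:0.5:50, (-300:25:0)km)

# Attach a synthetic scalar field; any NamedTuple of same-sized arrays works
Data_set = GeoData(Lon, Lat, Depth, (Depthdata=Depth * 2,))

# Write a *.vts file that can be opened directly in Paraview
Write_Paraview(Data_set, "test_data")
```

Internally, `Write_Paraview` converts the `GeoData` to Cartesian ECEF `x/y/z` coordinates for plotting, as the README notes.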
diff --git a/docs/src/man/LaPalma_example.md b/docs/src/man/LaPalma_example.md index 946d9441..d68750f7 100644 --- a/docs/src/man/LaPalma_example.md +++ b/docs/src/man/LaPalma_example.md @@ -28,7 +28,7 @@ Topo = ImportTopo(lon = [-18.7, -17.1], lat=[28.0, 29.2], file="@earth_relief_03 ```` ```` -GeoData +GeoData size : (1921, 1441, 1) lon ϵ [ 341.3 : 342.9] lat ϵ [ 28.0 : 29.2] @@ -55,7 +55,7 @@ Topo=load("Topo_LaPalma.jld2","Topo") ```` ```` -GeoData +GeoData size : (1921, 1441, 1) lon ϵ [ 341.3 : 342.9] lat ϵ [ 28.0 : 29.2] @@ -72,7 +72,7 @@ data_all_EQ = load("EQ_Data.jld2","data_all_EQ") ```` ```` -GeoData +GeoData size : (6045,) lon ϵ [ -18.0341 : -17.6671] lat ϵ [ 28.3102 : 28.8144] @@ -118,7 +118,7 @@ Topo_cart = Convert2CartData(Topo, proj) ```` ```` -CartData +CartData size : (1921, 1441, 1) x ϵ [ -86.09445705828863 km : 73.67229892155609 km] y ϵ [ -63.5531883197492 km : 73.28446155584604 km] @@ -143,7 +143,7 @@ Topo_LaMEM = ProjectCartData(Topo_LaMEM, Topo, proj) ```` ```` -CartData +CartData size : (701, 651, 1) x ϵ [ -70.0 km : 70.0 km] y ϵ [ -60.0 km : 70.0 km] @@ -177,7 +177,7 @@ Grid = ReadLaMEM_InputFile("LaPalma.dat") ```` ```` -LaMEM Grid: +LaMEM Grid: nel : (64, 64, 32) marker/cell : (3, 3, 3) markers : (192, 192, 192) @@ -190,7 +190,7 @@ LaMEM Grid: The `LaMEM_grid` structure contains the number of elements in every direction and the number of markers in every cell. It also contains `Grid.X`, `Grid.Y`, `Grid.Z`, which are the coordinates of each of the markers in the 3 directions. -In a next step we need to give each of these points a `Phase` number (which is an integer, that indicates the type of the rock that point has), as well as the temperature (in Celcius). +In a next step we need to give each of these points a `Phase` number (which is an integer, that indicates the type of the rock that point has), as well as the temperature (in Celsius). 
````julia Phases = ones(Int32,size(Grid.X))*2; @@ -298,4 +298,3 @@ If you are interested in doing this, have a look at the LaMEM [wiki](https://bit --- *This page was generated using [Literate.jl](https://github.com/fredrikekre/Literate.jl).* - diff --git a/docs/src/man/Tutorial_Votemaps.md b/docs/src/man/Tutorial_Votemaps.md index 444b6d31..ef4a3591 100644 --- a/docs/src/man/Tutorial_Votemaps.md +++ b/docs/src/man/Tutorial_Votemaps.md @@ -98,4 +98,3 @@ Similarly, you could only look at a single model (even when that is perhaps easi --- *This page was generated using [Literate.jl](https://github.com/fredrikekre/Literate.jl).* - diff --git a/docs/src/man/installation.md b/docs/src/man/installation.md index 38181b7d..57484dba 100644 --- a/docs/src/man/installation.md +++ b/docs/src/man/installation.md @@ -7,7 +7,7 @@ In order to use then package you obviously need to install julia. We recommend d ### 2. Install Visual Studio Code -The julia files itself are text files (just like matlab scripts). You may want to edit or modify them at some stage, for which you can use any text editor for that. We prefer to use the freely available [Visual Studio Code](https://code.visualstudio.com) as it has a build-in terminal and is the comes with the (official) julia debugger (install the Julia extension for that). +The julia files themselves are text files (just like matlab scripts). You may want to edit or modify them at some stage, and you can use any text editor for that. We prefer to use the freely available [Visual Studio Code](https://code.visualstudio.com) as it has a built-in terminal and comes with the (official) julia debugger (install the Julia extension for that). ### 3. 
Getting started with julia You start julia on the command line with: @@ -25,19 +25,19 @@ This will start the command-line interface of julia: _/ |\__'_|_|_|\__'_| | Official https://julialang.org/ release |__/ | -julia> +julia> ``` From the julia prompt, you start the package manager by typing `]`: ```julia -(@v1.6) pkg> +(@v1.6) pkg> ``` And you return to the command line with a backspace. -Also useful is that julia has a build-in terminal, which you can reach by typing `;` on the command line: +Also useful is that julia has a built-in terminal, which you can reach by typing `;` on the command line: ```julia julia>; -shell> +shell> ``` In the shell, you can use the normal commands like listing the content of a directory, or the current path: ```julia @@ -48,7 +48,7 @@ shell> pwd ``` As before, return to the main command line (called `REPL`) with a backspace. -If you want to see help information for any julia function, type `?` followed by the command. +If you want to see help information for any julia function, type `?` followed by the command. An example for `tan` is: ```julia help?> tan @@ -73,7 +73,7 @@ search: tan tanh tand atan atanh atand instances transpose transcode contains Un 2×2 Matrix{Float64}: -1.09252 -1.09252 -1.09252 -1.09252 -``` +``` If you are in a directory that has a julia file (which have the extension `*.jl`), you can open that file with Visual Studio Code: ```julia @@ -115,17 +115,13 @@ julia> using GeophysicalModelGenerator ``` ### 5. Other useful packages -As you will work your way through the tutorials you will see that we often use external packages, for example to load ascii data files into julia. You will find detailed instructions in the respective tutorials. +As you will work your way through the tutorials you will see that we often use external packages, for example to load ascii data files into julia. You will find detailed instructions in the respective tutorials. If you already want to install some of those, here our favorites. 
Install them through the package manager: -- [CSV](https://github.com/JuliaData/CSV.jl): Read comma-separated data files into julia. -- [Plots](https://github.com/JuliaPlots/Plots.jl): Create all kinds of plots in julia (quite an extensive package, but very useful to have). +- [CSV](https://github.com/JuliaData/CSV.jl): Read comma-separated data files into julia. +- [Plots](https://github.com/JuliaPlots/Plots.jl): Create all kinds of plots in julia (quite an extensive package, but very useful to have). - [JLD2](https://github.com/JuliaIO/JLD2.jl): This allows saving julia objects (such as a tomographic model) to a binary file and load it again at a later stage. - [Geodesy](https://github.com/JuliaGeo/Geodesy.jl): Convert UTM coordinates to latitude/longitude/altitude. - [NetCDF](https://github.com/JuliaGeo/NetCDF.jl): Read NetCDF files. - [GMT](https://github.com/GenericMappingTools/GMT.jl): A julia interface to the Generic Mapping Tools (GMT), which is a highly popular package to create (geophysical) maps. Note that installing `GMT.jl` is more complicated than installing the other packages listed above, as you first need to have a working version of `GMT` on your machine (it is not yet installed automatically). Installation instructions for Windows/Linux are on their webpage. On a mac, we made the best experiences by downloading the binaries from their webpage and not using a package manager to install GMT. - - - - diff --git a/docs/src/man/lamem.md b/docs/src/man/lamem.md index 4f63e9a2..c3f7c36b 100644 --- a/docs/src/man/lamem.md +++ b/docs/src/man/lamem.md @@ -1,7 +1,7 @@ # LaMEM -In order to generate geodynamic simulations from setups created with `GeophysicalModelGenerator.jl`, we provide a few routines that directly create marker input files for the 3D geodynamic modelling software [LaMEM](https://bitbucket.org/bkaus/lamem), which is an open-source cartesian code that is well-suited to perform crustal and lithospheric-scale simulations. 
-If you want to learn how to run LaMEM simulations, please have a look at the [wiki page](https://bitbucket.org/bkaus/lamem/wiki/Home). +In order to generate geodynamic simulations from setups created with `GeophysicalModelGenerator.jl`, we provide a few routines that directly create marker input files for the 3D geodynamic modelling software [LaMEM](https://bitbucket.org/bkaus/lamem), which is an open-source cartesian code that is well-suited to perform crustal and lithospheric-scale simulations. +If you want to learn how to run LaMEM simulations, please have a look at the [wiki page](https://bitbucket.org/bkaus/lamem/wiki/Home). The routines provided here have the following functionality: - Read LaMEM *.dat files (to get the size of the domain) diff --git a/docs/src/man/listfunctions.md b/docs/src/man/listfunctions.md index ba92caa5..10975c75 100644 --- a/docs/src/man/listfunctions.md +++ b/docs/src/man/listfunctions.md @@ -4,4 +4,3 @@ This page details the some of the guidelines that should be followed when contri ```@index ``` - diff --git a/docs/src/man/projection.md b/docs/src/man/projection.md index eee26316..04756d77 100644 --- a/docs/src/man/projection.md +++ b/docs/src/man/projection.md @@ -2,7 +2,7 @@ Typically, you load a dataset by reading it into julia and either generating a `GeoData` structure (in case you have `longitude/latitude/depth` info), or as `UTMData` (in case the data is in `UTM coordinates`, which requires you to specify the zone & hemisphere). -If you write the data to `Paraview`, it is internally converted to a Paraview structure (which involves `x,y,z` Cartesian Earth-Centered-Earth-Fixed (ECEF) coordinates using the `wgs84` ellipsoid). +If you write the data to `Paraview`, it is internally converted to a Paraview structure (which involves `x,y,z` Cartesian Earth-Centered-Earth-Fixed (ECEF) coordinates using the `wgs84` ellipsoid). 
Yet, if you do geodynamic calculations the chances are that the geodynamic code does not operate in spherical coordinates, but rather use cartesian ones. In that case you should transfer your data to the `CartData` structure, which requires you to specify a `ProjectionPoint` that is a point on the map that will later have the coordinates `(0,0)` in the `CartData` structure. @@ -13,7 +13,7 @@ Converting from one coordinate system to the other is straightforward. Let's use ```julia julia> using GeophysicalModelGenerator, GMT julia> Topo = ImportTopo(lon = [-10, 45], lat=[25, 50], file="@earth_relief_20m") -GeoData +GeoData size : (165, 75, 1) lon ϵ [ -10.0 : 44.666666666666664] lat ϵ [ 25.0 : 49.666666666666664] @@ -22,13 +22,13 @@ GeoData julia> Write_Paraview(Topo,"Topo") Saved file: Topo.vts ``` -The result is shown on the globe as: +The result is shown on the globe as: ![Topo_Europe_GeoData](../assets/img/Topo_Europe_GeoData.png) You can convert this to UTM zone as: ```julia julia> convert(UTMData, Topo) -UTMData +UTMData UTM zone : 29-38 North size : (165, 75, 1) EW ϵ [ 197181.31221507967 : 769155.4572884373] @@ -48,7 +48,7 @@ ProjectionPoint(37.5, 17.3, 703311.4380385976, 4.152826288024972e6, 33, true) Projecting the `GeoData` set using this projection point is done with: ```julia julia> Convert2UTMzone(Topo,p) -UTMData +UTMData UTM zone : 33-33 North size : (165, 75, 1) EW ϵ [ -2.0750691599137965e6 : 3.581351293385453e6] @@ -56,13 +56,13 @@ UTMData depth ϵ [ -4985.5 m : 3123.0 m] fields : (:Topography,) ``` -Whereas this is now in UTM Data (in meters), it is distorted. +Whereas this is now in UTM Data (in meters), it is distorted. 
![Topo_Europe_UTMData](../assets/img/Topo_Europe_UTMData.png) Often it is more convenient to have this in `CartData`, which is done in a similar manner: ```julia julia> Topo_Cart = Convert2CartData(Topo,p) -CartData +CartData size : (165, 75, 1) x ϵ [ -2778.3805979523936 km : 2878.039855346856 km] y ϵ [ -1387.8785405466067 km : 1785.2879241356998 km] @@ -72,13 +72,13 @@ CartData This shows that the model is ~5600 by 3000 km. ![Topo_Europe_CartData](../assets/img/Topo_Europe_CartData.png) -Whereas this is ok to look at and compare with a LaMEM model setup, we cannot use it to perform internal calculations (or to generate a LaMEM model setup), because the `x` and `y` coordinates are distorted and not orthogonal. +Whereas this is ok to look at and compare with a LaMEM model setup, we cannot use it to perform internal calculations (or to generate a LaMEM model setup), because the `x` and `y` coordinates are distorted and not orthogonal. #### 2. Projecting data For use with LaMEM, you would need an orthogonal cartesian grid. From the last command above we get some idea on the area, so we can create this: ```julia julia> Topo_Cart_orth = CartData(XYZGrid(-2000:20:2000,-1000:20:1000,0)) -CartData +CartData size : (201, 101, 1) x ϵ [ -2000.0 km : 2000.0 km] y ϵ [ -1000.0 km : 1000.0 km] @@ -88,18 +88,18 @@ CartData Next, we can project the topographic data (in `GeoData` format) on this orthogonal grid ```julia julia> Topo_Cart_orth = ProjectCartData(Topo_Cart_orth, Topo, p) -CartData +CartData size : (201, 101, 1) x ϵ [ -2000.0 km : 2000.0 km] y ϵ [ -1000.0 km : 1000.0 km] z ϵ [ -4.485650671162607 km : 2.5909655318121865 km] fields : (:Topography,) -julia> Write_Paraview(Topo_Cart_orth,"Topo_Cart_orth"); +julia> Write_Paraview(Topo_Cart_orth,"Topo_Cart_orth"); ``` ![Topo_Europe_CartData_Proj](../assets/img/Topo_Europe_CartData_Proj.png) So this interpolates the topographic data from the `GeoData` to the orthogonal cartesian grid (which can be used with LaMEM, for example). 
-You can do similar projections with full 3D data sets or pointwise data. +You can do similar projections with full 3D data sets or pointwise data. #### 3. List of functions ```@docs diff --git a/docs/src/man/tutorial_Coastlines.md b/docs/src/man/tutorial_Coastlines.md index 39c39ad2..ced93de1 100644 --- a/docs/src/man/tutorial_Coastlines.md +++ b/docs/src/man/tutorial_Coastlines.md @@ -1,8 +1,8 @@ -# Add coastlines +# Add coastlines ## Goal -For orientation, it is often nice to add country borders and coastlines to your paraview plots. +For orientation, it is often nice to add country borders and coastlines to your paraview plots. ## Steps @@ -24,8 +24,8 @@ julia> Lon,Lat,Depth = LonLatDepthGrid(lon[ind_lon],lat[ind_lat],0km); julia> data_surf = zeros(size(Lon)); julia> data_surf[:,:,1] = data[ind_lon,ind_lat] julia> data_surface = GeoData(Lon, Lat, Depth, (SurfaceType=data_surf,)) -julia> Write_Paraview(data_surface, "ContinentOcean") +julia> Write_Paraview(data_surface, "ContinentOcean") ``` The result is shown here, together with Moho data -![Tutorial_Coastlines](../assets/img/Tutorial_Coastlines.png) \ No newline at end of file +![Tutorial_Coastlines](../assets/img/Tutorial_Coastlines.png) diff --git a/docs/src/man/tutorial_GMT_Topography.md b/docs/src/man/tutorial_GMT_Topography.md index d233fb38..5829ac6b 100644 --- a/docs/src/man/tutorial_GMT_Topography.md +++ b/docs/src/man/tutorial_GMT_Topography.md @@ -1,41 +1,41 @@ -# Extract topographic data from GMT.jl +# Extract topographic data from GMT.jl ## Goal In many cases, we want to add topographic data as well to our visualization. This tutorial shows how to use [GMT.jl](https://github.com/GenericMappingTools/GMT.jl) to download data from a certain region, and transfer that. !!! note - It may be tricky to get GMT.jl installed and working correctly on your system (at least until someone prevides a BinaryBuilder package for julia, that is). 
You first need to have a working version of GMT on your system and only after that, you can install `GMT.jl`. See the installation instructions on their webpage for details. - On a MacBook Pro, a tested procedure to install GMT and to make it work with julia is to directly install the binaries for Julia, GMT (and possibly Ghostscript) and not use any package manager (such as spack or homebrew). + It may be tricky to get GMT.jl installed and working correctly on your system (at least until someone provides a BinaryBuilder package for julia, that is). You first need to have a working version of GMT on your system and only after that, you can install `GMT.jl`. See the installation instructions on their webpage for details. + On a MacBook Pro, a tested procedure to install GMT and to make it work with julia is to directly install the binaries for Julia, GMT (and possibly Ghostscript) and not use any package manager (such as spack or homebrew). ## Steps #### 1. Download topographic data of the Alpine region -The nice thing about GMT is that it automatically downloads data for you for a certain region and with a certain resolution. As this is a routine that you may use often in your daily workflow, we added the function `ImportTopo` that simplifies this. Note that this function only is available once `GMT` is loaded. +The nice thing about GMT is that it automatically downloads data for you for a certain region and with a certain resolution. As this is a routine that you may use often in your daily workflow, we added the function `ImportTopo` that simplifies this. Note that this function is only available once `GMT` is loaded. 
```julia julia> using GeophysicalModelGenerator, GMT julia> Topo = ImportTopo([4,20,37,49], file="@earth_relief_01m.grd") -GeoData +GeoData size : (960, 720, 1) lon ϵ [ 4.0 : 19.983333333333334] lat ϵ [ 37.0 : 48.983333333333334] depth ϵ [ -3.8725 km : 4.2495 km] fields: (:Topography,) ``` -The data is available in different resolutions; see [here](http://gmt.soest.hawaii.edu/doc/latest/grdimage.html) for an overview. Generally, it is advisable to not use the largest resolution if you have a large area. +The data is available in different resolutions; see [here](http://gmt.soest.hawaii.edu/doc/latest/grdimage.html) for an overview. Generally, it is advisable to not use the largest resolution if you have a large area. #### 2. Save Transforming this to Paraview is a piece of cake: ```julia -julia> Write_Paraview(Topo, "Topography_Alps") +julia> Write_Paraview(Topo, "Topography_Alps") ``` The result is shown here, together with Moho data ![Tutorial_GMT_topography](../assets/img/Tutorial_GMT_topography.png) -In case you are interested: we are employing the `oleron` scientific colormap here. \ No newline at end of file +In case you are interested: we are employing the `oleron` scientific colormap here. diff --git a/docs/src/man/tutorial_GMT_Topography_GeologicalMap.md b/docs/src/man/tutorial_GMT_Topography_GeologicalMap.md index c8abcaa1..a716c75d 100644 --- a/docs/src/man/tutorial_GMT_Topography_GeologicalMap.md +++ b/docs/src/man/tutorial_GMT_Topography_GeologicalMap.md @@ -5,21 +5,21 @@ In many cases, we want to add topographic data as well a information about tectonic units to our visualization. This tutorial shows how to use [GMT.jl](https://github.com/GenericMappingTools/GMT.jl) to import data from an ETOPO1 file for a certain region, load a geological map from a raster graphics file (here: PNG), drape it over the topography and transfer that. !!! 
note - It may be tricky to get GMT.jl installed and working correctly on your system (at least until someone prevides a BinaryBuilder package for julia, that is). You first need to have a working version of GMT on your system and only after that, you can install `GMT.jl`. See the installation instructions on their webpage for details. - On a MacBook Pro, a tested procedure to install GMT and to make it work with julia is to directly install the binaries for Julia, GMT (and possibly Ghostscript) and not use any package manager (such as spack or homebrew). + It may be tricky to get GMT.jl installed and working correctly on your system (at least until someone provides a BinaryBuilder package for julia, that is). You first need to have a working version of GMT on your system and only after that, you can install `GMT.jl`. See the installation instructions on their webpage for details. + On a MacBook Pro, a tested procedure to install GMT and to make it work with julia is to directly install the binaries for Julia, GMT (and possibly Ghostscript) and not use any package manager (such as spack or homebrew). ## Steps #### 1. Download topographic data and tectonic maps of the Alpine region -The ETOPO1 data file used in this example can be downloaded here: +The ETOPO1 data file used in this example can be downloaded here: [https://ngdc.noaa.gov/mgg/global/global.html](https://ngdc.noaa.gov/mgg/global/global.html). For this example we downloaded *ETOPO1_Ice_g_gmt4.grd* and stored it directly in the folder where we will be working. For the geological map, we download the data from the [SPP 4DMB repository](http://www.spp-mountainbuilding.de/data/Maps.zip) and extract the zip file (also in the current folder). In this data set, a *gmt* file with the data for different tectonic units is given in *./ -tectonic_maps_4dmb_2020_09_17/GMT_example/alcapadi_polygons.gmt* . +tectonic_maps_4dmb_2020_09_17/GMT_example/alcapadi_polygons.gmt* . -#### 2. 
Create a tectonic map with orthogonal projection -To create a png with an orthogonal map projection (which we need for the png import), we do the following in julia: +#### 2. Create a tectonic map with orthogonal projection +To create a png with an orthogonal map projection (which we need for the png import), we do the following in julia: ```julia -julia> +julia> julia> using GMT julia> filename_gmt = "./tectonic_maps_4dmb_2020_09_17/GMT_example/alcapadi_polygons.gmt" julia> plot(filename_gmt,region="4/20/37/49",show=true) @@ -28,17 +28,17 @@ This opens a window with the plotted map. Save this image in your current workin ![Tutorial_GMT_topography_GeologicalMap_PNG](../assets/img/Tutorial_GMT_topography_GeologicalMap_PNG.png) -#### 3. Import data to paraview +#### 3. Import data to paraview Now, to import the ETOPO1 topography data and to drape the geologic map over it, open julia again. Load the following packages: ```julia julia> using GMT, NearestNeighbors, GeoParams, GeophysicalModelGenerator ``` -First, define the filenames of the files you want to import: +First, define the filenames of the files you want to import: ```julia -julia> filename_topo = "./ETOPO1/ETOPO1_Ice_g_gmt4.grd" +julia> filename_topo = "./ETOPO1/ETOPO1_Ice_g_gmt4.grd" julia> filename_geo = "./tectonicmap_SPP.png" ``` -Next, define the region that you want to visualize (note that we use the same coordinates here as we used previously for the generation of the geological map): +Next, define the region that you want to visualize (note that we use the same coordinates here as we used previously for the generation of the geological map): ```julia julia> lat_min = 37.0 julia> lat_max = 49.0 @@ -58,7 +58,7 @@ At this step, only the topographic data is imported. 
Now we have to import the t julia> Corner_LowerLeft = ( lon_min, lat_min , 0.0) julia> Corner_UpperRight = (lon_max, lat_max , 0.0) ``` -and import the png file with the GMG function *Screenshot_To_GeoData*: +and import the png file with the GMG function *Screenshot_To_GeoData*: ```julia julia> DataPNG = Screenshot_To_GeoData(filename_geo, Corner_LowerLeft, Corner_UpperRight) @@ -118,4 +118,4 @@ The result is shown here: ![Tutorial_GMT_topography_GeologicalMap](../assets/img/Tutorial_GMT_topography_GeologicalMap.png) -In case you are interested: we are employing the `oleron` scientific colormap from [Fabio Crameri's scientific colormap package](https://www.fabiocrameri.ch/colourmaps/) here. \ No newline at end of file +In case you are interested: we are employing the `oleron` scientific colormap from [Fabio Crameri's scientific colormap package](https://www.fabiocrameri.ch/colourmaps/) here. diff --git a/docs/src/man/tutorial_GPS.md b/docs/src/man/tutorial_GPS.md index 89eb9548..f7ab7a54 100644 --- a/docs/src/man/tutorial_GPS.md +++ b/docs/src/man/tutorial_GPS.md @@ -2,13 +2,13 @@ ## Goal -The aim of this tutorial is to show you how you can download and plot GPS data, which are vector data. +The aim of this tutorial is to show you how you can download and plot GPS data, which are vector data. The example is based on a paper by Sanchez et al. (2018) [https://essd.copernicus.org/articles/10/1503/2018/#section7](https://essd.copernicus.org/articles/10/1503/2018/#section7) ## Steps -#### 1. Download and import GPS data: +#### 1. Download and import GPS data: The data related to the paper can be downloaded from: https://doi.pangaea.de/10.1594/PANGAEA.886889 There you will find links to several data sets. Some are the data on the actual stations and some are interpolated data on a grid. 
Here, we will use the gridded data as an example, and will therefore download the following data sets:
@@ -37,7 +37,7 @@ julia> data_file = CSV.File("ALPS2017_DEF_VT.GRD",datarow=18,header=false,deli
```
We can read the numerical data from the file with:
```julia
-julia> data = ParseColumns_CSV_File(data_file, 4);
+julia> data = ParseColumns_CSV_File(data_file, 4);
julia> lon_Vz, lat_Vz, Vz_vec = data[:,1], data[:,2], data[:,3];
```

@@ -51,7 +51,7 @@ julia> Plots.scatter(lon_Vz,lat_Vz)
![Tutorial_GPS_1](../assets/img/Tutorial_GPS_1.png)

So clearly, this is a fully regular grid.
-We can determine the size of the grid with
+We can determine the size of the grid with
```julia
julia> unique(lon_Vz)
41-element Vector{Float64}:
@@ -113,7 +113,7 @@ julia> Plots.scatter(lon_Hz,lat_Hz)
```
![Tutorial_GPS_2](../assets/img/Tutorial_GPS_2.png)

-So it appears that the horizontal velocities are given on the same regular grid as well, but not in the water.
+So it appears that the horizontal velocities are given on the same regular grid as well, but not in the water.
This thus requires a bit more work. The strategy we take is to first define 2D matrices of horizontal velocities with the same size as Vz, initialized with `NaN` (not a number), which is treated specially by Paraview.
```julia
@@ -138,11 +138,11 @@ julia> Vn = Vn*1000;
```
And the magnitude is:
```julia
-julia> Vmagnitude = sqrt.(Ve.^2 + Vn.^2 + Vz.^2);
+julia> Vmagnitude = sqrt.(Ve.^2 + Vn.^2 + Vz.^2);
```

#### 4. Interpolate topography on grid
-At this stage we have the 3D velocity components on a grid. Yet, we don't have information yet about the elevation of the stations (as the provided data set did not give this).
+At this stage we have the 3D velocity components on a grid. Yet, we do not yet have information about the elevation of the stations (as the provided data set did not give this).
We could ignore that and set the elevation to zero, which would allow saving the data directly to paraview. Yet, a better way is to load the topographic map of the area and interpolate the elevation to the velocity grid. We are using the `GMT.jl` to load the topographic data: @@ -154,7 +154,7 @@ julia> Elevation = gmtread("@earth_relief_01m.grd", limits=[3,17,42,50]); Next, we use the `Interpolations.jl` package to interpolate the topography: ```julia julia> using Interpolations -julia> interpol = LinearInterpolation((Elevation.x[1:end-1], Elevation.y[1:end-1]), Elevation.z'); +julia> interpol = LinearInterpolation((Elevation.x[1:end-1], Elevation.y[1:end-1]), Elevation.z'); julia> height = interpol.(lon,lat)/1e3; ``` @@ -163,7 +163,7 @@ At this stage, we have all we need. As the velocity is a vector field, we need t ```julia julia> GPS_Sanchez_grid = GeoData(lon,lat,height,(Velocity_mm_year=(Ve,Vn,Vz),V_north=Vn*mm/yr, V_east=Ve*mm/yr, V_vertical=Vz*mm/yr, Vmagnitude = Vmagnitude*mm/yr, Topography = height*km)) -GeoData +GeoData size : (41, 31, 1) lon ϵ [ 4.0 : 16.0] lat ϵ [ 43.0 : 49.0] @@ -182,4 +182,3 @@ In order to plot the velocities as arrows, you need to select the `Glyph` tool ( ![Tutorial_GPS_4](../assets/img/Tutorial_GPS_4.png) The arrows can now be colored by the individual velocity components or its magnitude. - diff --git a/docs/src/man/tutorial_ISC_data.md b/docs/src/man/tutorial_ISC_data.md index 25ee0c8c..8d43dad9 100644 --- a/docs/src/man/tutorial_ISC_data.md +++ b/docs/src/man/tutorial_ISC_data.md @@ -4,18 +4,18 @@ This explains how to load earthquake data obtained from the ISC catalogue. ## Steps -#### 1. Download data -You can get data from the ISC catalogue here: +#### 1. 
Download data
+You can get data from the ISC catalogue here:
[http://www.isc.ac.uk/iscbulletin/search/catalogue/](http://www.isc.ac.uk/iscbulletin/search/catalogue/)
The catalogue will give you an on-screen CSV output that will then have to be copied to a file of your choice (here we will call it *ISC1.dat*). Do that and start julia from the directory where it was downloaded.

#### 2. Read data into Julia
-The main data-file, `ISC1.dat`, has 23 lines of comments (indicated with `#`), after which the data starts. We can use the julia package [https://github.com/JuliaData/CSV.jl](CSV.jl) to read in the data, and tell it that the data is seperated by `,`.
+The main data-file, `ISC1.dat`, has 23 lines of comments (indicated with `#`), after which the data starts. We can use the julia package [CSV.jl](https://github.com/JuliaData/CSV.jl) to read in the data, and tell it that the data is separated by `,`.
```julia-repl
julia> using CSV, GeophysicalModelGenerator
julia> data_file = CSV.File("ISC1.dat",datarow=24,header=false,delim=',')
```
-As this data contains a lot of information that we are not interested in at the moment and which is given in non-numeric fomats (e.g. date, time etc.), we will use our helper function *ParseColumns_CSV_File* to only extract columns with numeric data.
+As this data contains a lot of information that we are not interested in at the moment and which is given in non-numeric formats (e.g. date, time etc.), we will use our helper function *ParseColumns_CSV_File* to only extract columns with numeric data.
```julia-repl julia> data = ParseColumns_CSV_File(data_file, 14) julia> lon = data[:,2]; diff --git a/docs/src/man/tutorial_MohoTopo.md b/docs/src/man/tutorial_MohoTopo.md index 4912936d..64040ae6 100644 --- a/docs/src/man/tutorial_MohoTopo.md +++ b/docs/src/man/tutorial_MohoTopo.md @@ -1,20 +1,20 @@ # Moho topography ## Goal -This explains how to load the Moho topography for Italy and the Alps and create a paraview file +This explains how to load the Moho topography for Italy and the Alps and create a paraview file Spada, M., Bianchi, I., Kissling, E., Agostinetti, N.P., Wiemer, S., 2013. *Combining controlled-source seismology and receiver function information to derive 3-D Moho topography for Italy.* Geophysical Journal International 194, 1050–1068. [doi:10.1093/gji/ggt148](https://doi.org/10.1093/gji/ggt148) ## Steps -#### 1. Download data +#### 1. Download data The data is available as digital dataset on the researchgate page of Prof. Edi Kissling [https://www.researchgate.net/publication/322682919_Moho_Map_Data-WesternAlps-SpadaETAL2013](https://www.researchgate.net/publication/322682919_Moho_Map_Data-WesternAlps-SpadaETAL2013) We have also uploaded it here: [https://seafile.rlp.net/d/a50881f45aa34cdeb3c0/](https://seafile.rlp.net/d/a50881f45aa34cdeb3c0/) -The full data set actually includes 3 different Moho's (Europe, Adria, Tyrrhenia-Corsica). To simplify matters, we have split the full file into 3 seperate ascii files and uploaded it. +The full data set actually includes 3 different Moho's (Europe, Adria, Tyrrhenia-Corsica). To simplify matters, we have split the full file into 3 separate ascii files and uploaded it. Please download the files `Moho_Map_Data-WesternAlps-SpadaETAL2013_Moho1.txt`, `Moho_Map_Data-WesternAlps-SpadaETAL2013_Moho2.txt` and `Moho_Map_Data-WesternAlps-SpadaETAL2013_Moho3.txt` @@ -30,7 +30,7 @@ Note that depth is made negative. #### 3. 
Reformat the data -Next, let's check if the data is spaced in a regular manner in Lon/Lat direction. +Next, let's check if the data is spaced in a regular manner in Lon/Lat direction. For that, we plot it using the `Plots` package (you may have to install that first on your machine). ```julia julia> using Plots @@ -45,13 +45,13 @@ The easiest way to transfer this to Paraview is to simply save this as 3D data p ```julia julia> using GeophysicalModelGenerator julia> data_Moho1 = GeoData(lon,lat,depth,(MohoDepth=depth*km,)) -GeoData +GeoData size : (12355,) lon ϵ [ 4.00026 - 11.99991] lat ϵ [ 42.51778 - 48.99544] depth ϵ [ -57.46 km - -21.34 km] fields: (:MohoDepth,) -julia> Write_Paraview(data_Moho1, "Spada_Moho_Europe", PointsData=true) +julia> Write_Paraview(data_Moho1, "Spada_Moho_Europe", PointsData=true) ``` And we can do the same with the other two Moho's: @@ -59,11 +59,11 @@ And we can do the same with the other two Moho's: julia> data =readdlm("Moho_Map_Data-WesternAlps-SpadaETAL2013_Moho2.txt",' ',Float64,'\n', skipstart=38,header=false); julia> lon, lat, depth = data[:,1], data[:,2], -data[:,3]; julia> data_Moho2 = GeoData(lon,lat,depth,(MohoDepth=depth*km,)) -julia> Write_Paraview(data_Moho2, "Spada_Moho_Adria", PointsData=true) +julia> Write_Paraview(data_Moho2, "Spada_Moho_Adria", PointsData=true) julia> data =readdlm("Moho_Map_Data-WesternAlps-SpadaETAL2013_Moho3.txt",' ',Float64,'\n', skipstart=38,header=false); julia> lon, lat, depth = data[:,1], data[:,2], -data[:,3]; julia> data_Moho3 = GeoData(lon,lat,depth,(MohoDepth=depth*km,)) -julia> Write_Paraview(data_Moho3, "Spada_Moho_Tyrrhenia", PointsData=true) +julia> Write_Paraview(data_Moho3, "Spada_Moho_Tyrrhenia", PointsData=true) ``` If we plot this in paraview, it looks like this: @@ -71,7 +71,7 @@ If we plot this in paraview, it looks like this: #### 3.1 Fitting a mesh through the data -So obviously, the Moho is discontinuous between these three Mohos. 
Often, it looks nicer if we fit a regular surface through these data points. To do this we first combine the data points of the 3 surfaces into one set of points +So obviously, the Moho is discontinuous between these three Mohos. Often, it looks nicer if we fit a regular surface through these data points. To do this we first combine the data points of the 3 surfaces into one set of points ``` julia> lon = [data_Moho1.lon.val; data_Moho2.lon.val; data_Moho3.lon.val]; @@ -80,18 +80,18 @@ julia> depth = [data_Moho1.depth.val; data_Moho2.depth.val; data_Moho3.depth.val julia> data_Moho_combined = GeoData(lon, lat, depth, (MohoDepth=depth*km,)) ``` -Next, we define a regular lon/lat grid +Next, we define a regular lon/lat grid ```julia julia> Lon,Lat,Depth = LonLatDepthGrid(4.1:0.1:11.9,42.5:.1:49,-30km); ``` -And we will use a nearest neighbor interpolation method to fit a surface through the data. This has the advantage that it will take the discontinuities into account. We will use the package [NearestNeighbors.jl](https://github.com/KristofferC/NearestNeighbors.jl) for this, which you may have to install first +And we will use a nearest neighbor interpolation method to fit a surface through the data. This has the advantage that it will take the discontinuities into account. We will use the package [NearestNeighbors.jl](https://github.com/KristofferC/NearestNeighbors.jl) for this, which you may have to install first ```julia julia> using NearestNeighbors julia> kdtree = KDTree([lon'; lat']) julia> idxs, dists = knn(kdtree, [Lon[:]'; Lat[:]'], 1, true) ``` -`idxs` contains the indices of the closest points to the grid in (Lon,Lat). Next +`idxs` contains the indices of the closest points to the grid in (Lon,Lat). 
Next ```julia julia> Depth = zeros(size(Lon))*km; julia> for i=1:length(idxs) @@ -102,7 +102,7 @@ Now, we can create a `GeoData` structure with the regular surface and save it to ```julia julia> data_Moho = GeoData(Lon, Lat, Depth, (MohoDepth=Depth,)) -julia> Write_Paraview(data_Moho, "Spada_Moho_combined") +julia> Write_Paraview(data_Moho, "Spada_Moho_combined") ``` The result is shown here, where the previous points are colored white and are a bit smaller. Obviously, the datasets coincide well. ![DataPoints_Moho_surface](../assets/img/Tutorial_MohoSpada_Surface_Paraview.png) @@ -113,6 +113,3 @@ The full julia script that does it all is given [here](https://github.com/JuliaG ```julia julia> include("MohoTopo_Spada.jl") ``` - - - diff --git a/docs/src/man/tutorial_Screenshot_To_Paraview.md b/docs/src/man/tutorial_Screenshot_To_Paraview.md index ddc734cb..7278c2a6 100644 --- a/docs/src/man/tutorial_Screenshot_To_Paraview.md +++ b/docs/src/man/tutorial_Screenshot_To_Paraview.md @@ -1,12 +1,12 @@ -# Import profiles/maps from published papers +# Import profiles/maps from published papers ## Goal -Ideally, all data should be availabe in digital format, after which you could use the tools described in the other tutorial to transform them into `GeoData` and export them to VTK. +Ideally, all data should be available in digital format, after which you could use the tools described in the other tutorial to transform them into `GeoData` and export them to VTK. Yet, the reality is different and often data is not (yet) available, or papers are old and the authors can no longer be contacted. For that reason, `GeophysicalModelGenerator` has tools that allow you to transfer a screenshot from any published paper into `GeoData/Paraview` and see it in 3D at the correct geographic location. This can be done for vertical profiles and for mapviews, which gives you a quick and easy way to see those papers in a new (3D) light. -Here, we explain how. +Here, we explain how. 
- [Import profiles/maps from published papers](#import-profilesmaps-from-published-papers) - [Goal](#goal) - [General procedure](#general-procedure) @@ -50,7 +50,7 @@ Extracting GeoData from: Lippitsch_Fig13a.png └ lower right = (17.23 , 43.8 , -400.0 ) └ upper left = (4.65 , 45.73 , 0.0 ) └ upper right = (17.23 , 43.8 , 0.0 ) -GeoData +GeoData size : (325, 824, 1) lon ϵ [ 4.6499999999999995 : 17.230000000000004] lat ϵ [ 43.79999999999999 : 45.730000000000004] @@ -59,7 +59,7 @@ GeoData ``` Finally, you save it in Paraview format as always: ```julia -julia> Write_Paraview(data_profile1, "Lippitsch_Fig13a_profile") +julia> Write_Paraview(data_profile1, "Lippitsch_Fig13a_profile") ``` You can open this in paraview. Here, it is shown along with topographic data (made transparent): @@ -81,10 +81,10 @@ Corner_UpperRight = (15.5, 50.0 , -150.0) Corner_LowerRight = (15.5, 43.0 , -150.0) Corner_UpperLeft = (3.5 , 50.0 , -150.0) data_Fig13_map = Screenshot_To_GeoData("Fig13_mapview.png",Corner_LowerLeft, Corner_UpperRight, Corner_LowerRight=Corner_LowerRight,Corner_UpperLeft=Corner_UpperLeft) -Write_Paraview(data_Fig13_map, "Lippitsch_Fig13_mapview") +Write_Paraview(data_Fig13_map, "Lippitsch_Fig13_mapview") ``` -Once added to paraview (together with a few additional map views from the same paper): +Once added to paraview (together with a few additional map views from the same paper): ![Tutorial_ScreenShots_Lippitsch_4](../assets/img/Tutorial_ScreenShots_Lippitsch_4.png) #### 4. 
Using an automatic digitizer to pick points on map
@@ -112,10 +112,10 @@ The general approach is simple: open a multiblock file, and pass the filename to
An example showing you how this works is:
```julia
julia> vtmfile = vtk_multiblock("Lippitsch_CrossSections")
-julia> Write_Paraview(data_Fig12_90km, vtmfile)
-julia> Write_Paraview(data_Fig12_180km, vtmfile)
-julia> Write_Paraview(data_Fig12_300km, vtmfile)
-julia> Write_Paraview(data_Fig12_400km, vtmfile)
+julia> Write_Paraview(data_Fig12_90km, vtmfile)
+julia> Write_Paraview(data_Fig12_180km, vtmfile)
+julia> Write_Paraview(data_Fig12_300km, vtmfile)
+julia> Write_Paraview(data_Fig12_400km, vtmfile)
julia> vtk_save(vtmfile)
```
Note that you have to create the cross-sections first (see the julia script below).
@@ -127,4 +127,3 @@ The full julia script that interprets all the figures is given [here](https://gi
```julia
julia> include("Lippitsch_Screenshots.jl")
```
-
diff --git a/docs/src/man/tutorial_UTM.md b/docs/src/man/tutorial_UTM.md
index e13ef20d..166e87a7 100644
--- a/docs/src/man/tutorial_UTM.md
+++ b/docs/src/man/tutorial_UTM.md
@@ -11,12 +11,12 @@ Spooner, Cameron; Scheck-Wenderoth, Magdalena; Götze, Hans-Jürgen; Ebbing, Jö

## Steps

-#### 1. Download and import UTM data:
+#### 1. Download and import UTM data:
Download the data file `2019-004_Spooner_Lithospheric Mantle.txt` from [http://doi.org/10.5880/GFZ.4.5.2019.004](http://doi.org/10.5880/GFZ.4.5.2019.004) and make sure that you change to that directory in julia.
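Before reading the file, it is worth checking that julia's working directory actually contains it. A small sketch (the temporary directory and `touch` call below are only stand-ins for your real download folder and the downloaded file):

```julia
# Stand-ins: in practice you would cd() to the folder where you saved the download.
fname   = "2019-004_Spooner_Lithospheric Mantle.txt"
datadir = mktempdir()              # hypothetical download folder
touch(joinpath(datadir, fname))    # hypothetical downloaded file
cd(datadir)
isfile(fname)                      # should be true before calling readdlm on the file
```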
If you open the data with a text editor, you'll see that it looks like: ``` -# These data are freely available under the Creative Commons Attribution 4.0 International Licence (CC BY 4.0) -# when using the data please cite as: +# These data are freely available under the Creative Commons Attribution 4.0 International Licence (CC BY 4.0) +# when using the data please cite as: # Spooner, Cameron; Scheck-Wenderoth, Magdalena; Götze, Hans-Jürgen; Ebbing, Jörg; Hetényi, György (2019): 3D Gravity Constrained Model of Density Distribution Across the Alpine Lithosphere. GFZ Data Services. http://doi.org/10.5880/GFZ.4.5.2019.004 X COORD (UTM Zone 32N) Y COORD (UTM Zone 32N) TOP (m.a.s.l) THICKNESS (m) DENSITY (Kg/m3) 286635 4898615 -24823.2533 70418.125 3305 @@ -39,30 +39,30 @@ julia> data=readdlm("2019-004_Spooner_Lithospheric Mantle.txt",skipstart=4) 286635.0 4.91862e6 -25443.5 69410.0 3305.0 286635.0 4.93862e6 -28025.2 66402.5 3305.0 286635.0 4.95862e6 -32278.1 61663.5 3305.0 - ⋮ + ⋮ 926635.0 5.45862e6 -35302.7 83215.6 3335.0 926635.0 5.47862e6 -34908.6 84203.0 3335.0 926635.0 5.49862e6 -34652.4 85398.3 3335.0 ``` We can read the numerical data from the file with: -```julia +```julia julia> ew, ns, depth, thick, rho = data[:,1], data[:,2], data[:,3], data[:,4], data[:,5]; ``` #### 2. Check & reshape vertical velocity -The data is initially available as 1D columns, which needs to be reshaped into 2D arrays. +The data is initially available as 1D columns, which needs to be reshaped into 2D arrays. We first reshape it into 2D arrays (using `reshape`). Yet, if we later want to visualize a perturbed surface in paraview, we need to save this as a 3D array (with 1 as 3rd dimension). 
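To illustrate why the trailing `1` in the size tuple matters, here is a toy version with made-up numbers (unrelated to the actual data file):

```julia
# Six values on a 2×3 (ns × ew) grid, stored as one flat column like in the file.
v   = collect(1.0:6.0)
res = (2, 3, 1)          # (n_ns, n_ew, 1): the trailing 1 makes it a 3D array
V   = reshape(v, res)

size(V)                  # (2, 3, 1): a single "sheet" that paraview can display in 3D
```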
```julia
julia> res = ( length(unique(ns)), length(unique(ew)), 1)
-julia> EW = reshape(ew,res)
+julia> EW = reshape(ew,res)
julia> NS = reshape(ns,res)
julia> Depth = reshape(depth,res)
julia> T = reshape(thick,res)
julia> Rho = reshape(rho,res)
```

-Next we can examine `EW`:
+Next we can examine `EW`:
```julia
julia> EW
31×33×1 Array{Float64, 3}:
@@ -72,7 +72,7 @@ julia> EW
286635.0 306635.0 326635.0 346635.0 366635.0 386635.0 406635.0 826635.0 846635.0 866635.0 886635.0 906635.0 926635.0
286635.0 306635.0 326635.0 346635.0 366635.0 386635.0 406635.0 826635.0 846635.0 866635.0 886635.0 906635.0 926635.0
286635.0 306635.0 326635.0 346635.0 366635.0 386635.0 406635.0 826635.0 846635.0 866635.0 886635.0 906635.0 926635.0
- ⋮ ⋮ ⋱ ⋮
+ ⋮ ⋮ ⋱ ⋮
286635.0 306635.0 326635.0 346635.0 366635.0 386635.0 406635.0 826635.0 846635.0 866635.0 886635.0 906635.0 926635.0
286635.0 306635.0 326635.0 346635.0 366635.0 386635.0 406635.0 826635.0 846635.0 866635.0 886635.0 906635.0 926635.0
286635.0 306635.0 326635.0 346635.0 366635.0 386635.0 406635.0 826635.0 846635.0 866635.0 886635.0 906635.0 926635.0
@@ -80,11 +80,11 @@ julia> EW
```
So, in fact, the EW array varies in the 2nd dimension. It should, however, vary in the first dimension, which is why we need to apply a permutation & switch the first and second dimensions:
```julia
-julia> EW = permutedims(EW,[2 1 3])
-julia> NS = permutedims(NS,[2 1 3])
-julia> Depth = permutedims(Depth,[2 1 3])
-julia> T = permutedims(T,[2 1 3])
-julia> Rho = permutedims(Rho,[2 1 3])
+julia> EW = permutedims(EW,[2 1 3])
+julia> NS = permutedims(NS,[2 1 3])
+julia> Depth = permutedims(Depth,[2 1 3])
+julia> T = permutedims(T,[2 1 3])
+julia> Rho = permutedims(Rho,[2 1 3])
```


@@ -93,7 +93,7 @@ Next we can add the data to the `UTMData` structure. 
In this, we use the information that the data is given in UTM zone 32 North (the arguments `32, true`):
```julia
julia> Data_LM = UTMData(EW,NS,Depth,32, true, (Thickness=T/1e3*km, Density=Rho*kg/m^3 ))
-UTMData
+UTMData
UTM zone : 32-32 North
size : (33, 31, 1)
EW ϵ [ 286635.0 : 926635.0]
@@ -107,9 +107,9 @@ UTMData

We can transfer this into a GeoData structure as:

-```julia
+```julia
julia> Data_LM_lonlat = convert(GeoData,Data_LM)
-GeoData
+GeoData
size : (33, 31, 1)
lon ϵ [ 6.046903388679526 : 14.892535147436673]
lat ϵ [ 44.11618022783332 : 49.64004892531006]
diff --git a/docs/src/man/tutorial_load3DSeismicData.md b/docs/src/man/tutorial_load3DSeismicData.md
index 24909c1a..ec38fb24 100644
--- a/docs/src/man/tutorial_load3DSeismicData.md
+++ b/docs/src/man/tutorial_load3DSeismicData.md
@@ -2,17 +2,17 @@

## Goal
This explains how to load a 3D P-wave model and plot it in Paraview as a 3D volumetric data set. It also shows how you can create horizontal or vertical cross-sections through the data in a straightforward manner and how you can extract subsets of the data.
-The example is the P-wave velocity model of the Alps as described in:
+The example is the P-wave velocity model of the Alps as described in:
Zhao, L., Paul, A., Malusà, M.G., Xu, X., Zheng, T., Solarino, S., Guillot, S., Schwartz, S., Dumont, T., Salimbeni, S., Aubert, C., Pondrelli, S., Wang, Q., Zhu, R., 2016. *Continuity of the Alpine slab unraveled by high-resolution P wave tomography*. Journal of Geophysical Research: Solid Earth 121, 8720–8737. [doi:10.1002/2016JB013310](https://doi.org/10.1002/2016JB013310)

The data is given in ASCII format with longitude/latitude/depth/velocity anomaly (percentage) columns.

## Steps
-#### 1. Download data
+#### 1. Download data
The data can be downloaded from [https://seafile.rlp.net/d/a50881f45aa34cdeb3c0/](https://seafile.rlp.net/d/a50881f45aa34cdeb3c0/), where you should download the file `Zhao_etal_JGR_2016_Pwave_Alps_3D_k60.txt`. Do that and start julia from the directory where it was downloaded.

#### 2.
Read data into Julia -The dataset has no comments, and the data values in every row are separated by a space. In order to read this into julia as a matrix, we can use the build-in julia package `DelimitedFiles`. We want the resulting data to be stored as double precision values (`Float64`), and the end of every line is a linebreak (`\n`). +The dataset has no comments, and the data values in every row are separated by a space. In order to read this into julia as a matrix, we can use the built-in julia package `DelimitedFiles`. We want the resulting data to be stored as double precision values (`Float64`), and the end of every line is a linebreak (`\n`). ```julia julia> using DelimitedFiles julia> data=readdlm("Zhao_etal_JGR_2016_Pwave_Alps_3D_k60.txt",' ',Float64,'\n', skipstart=0,header=false) @@ -23,7 +23,7 @@ julia> data=readdlm("Zhao_etal_JGR_2016_Pwave_Alps_3D_k60.txt",' ',Float64,'\n', 0.45 38.0 -1001.0 -0.059 0.6 38.0 -1001.0 -0.055 0.75 38.0 -1001.0 -0.057 - ⋮ + ⋮ 17.25 51.95 -1.0 -0.01 17.4 51.95 -1.0 -0.005 17.55 51.95 -1.0 0.003 @@ -61,7 +61,7 @@ julia> Depth_vec = unique(depth) -1.0 ``` So the data has a vertical spacing of 10 km. -Next, let's check if the data is spaced in a regular manner in Lon/Lat direction. +Next, let's check if the data is spaced in a regular manner in Lon/Lat direction. For that, we read the data at a given depth level (say -101km) and plot it using the `Plots` package (you may have to install that first on your machine). ```julia julia> using Plots @@ -70,7 +70,7 @@ julia> scatter(lon[ind],lat[ind],marker_z=dVp_perc[ind], ylabel="latitude",xlabe ``` ![DataPoints](../assets/img/Tutorial_Zhao_LatLon_data.png) -Note that we employ the scientific colormap `roma` here. [This](https://docs.juliaplots.org/latest/generated/colorschemes/) gives an overview of available colormaps. You can download the colormaps for Paraview [here](https://www.fabiocrameri.ch/visualisation/). +Note that we employ the scientific colormap `roma` here. 
[This](https://docs.juliaplots.org/latest/generated/colorschemes/) gives an overview of available colormaps. You can download the colormaps for Paraview [here](https://www.fabiocrameri.ch/visualisation/). Clearly, the data is given as regular Lat/Lon points: @@ -134,7 +134,7 @@ Next we create a paraview file: ```julia julia> using GeophysicalModelGenerator julia> Data_set = GeoData(Lon,Lat,Depth,(dVp_Percentage=dVp_perc_3D,)) -GeoData +GeoData size : (121, 94, 101) lon ϵ [ 0.0 - 18.0] lat ϵ [ 38.0 - 51.95] @@ -147,7 +147,7 @@ julia> Write_Paraview(Data_set, "Zhao_etal_2016_dVp_percentage") #### 4. Plotting data in Paraview In paraview you should open the `*.vts` file, and press `Apply` (left menu) after doing that. Once you did that you can select `dVp_Percentage` and `Surface` (see red ellipses below)/. -In paraview you can open the file and visualize it. +In paraview you can open the file and visualize it. ![Paraview_1](../assets/img/Tutorial_Zhao_Paraview_1.png) @@ -181,8 +181,8 @@ In many cases you would like to create cross-sections through the 3D data sets a There is a simple way to achieve this using the `CrossSection` function. To make a cross-section at a given depth: ```julia -julia> Data_cross = CrossSection(Data_set, Depth_level=-100km) -GeoData +julia> Data_cross = CrossSection(Data_set, Depth_level=-100km) +GeoData size : (121, 94, 1) lon ϵ [ 0.0 : 18.0] lat ϵ [ 38.0 : 51.95] @@ -196,12 +196,12 @@ julia> Write_Paraview(Data_cross, "Zhao_CrossSection_100km") Or at a specific longitude: ```julia julia> Data_cross = CrossSection(Data_set, Lon_level=10) -GeoData +GeoData size : (1, 94, 101) lon ϵ [ 10.05 : 10.05] lat ϵ [ 38.0 : 51.95] depth ϵ [ -1001.0 km : -1.0 km] - fields: (:dVp_Percentage,) + fields: (:dVp_Percentage,) julia> Write_Paraview(Data_cross, "Zhao_CrossSection_Lon10") 1-element Vector{String}: "Zhao_CrossSection_Lon10.vts" @@ -211,7 +211,7 @@ As you see, this cross-section is not taken at exactly 10 degrees longitude. 
That is because the original grid has no point at exactly 10°; without interpolation, the closest available data points (here at longitude 10.05°) are returned.
If you wish to interpolate the data, specify `Interpolate=true`:
```julia
julia> Data_cross = CrossSection(Data_set, Lon_level=10, Interpolate=true)
-GeoData
+GeoData
size : (1, 100, 100)
lon ϵ [ 10.0 : 10.0]
lat ϵ [ 38.0 : 51.95]
@@ -224,7 +224,7 @@ as you see, this causes the data to be interpolated on a `(100,100)` grid (which

We can also create a diagonal cut through the model:
```julia
julia> Data_cross = CrossSection(Data_set, Start=(1.0,39), End=(18,50))
-GeoData
+GeoData
size : (100, 100, 1)
lon ϵ [ 1.0 : 18.0]
lat ϵ [ 39.0 : 50.0]
@@ -233,7 +233,7 @@ GeoData
julia> Write_Paraview(Data_cross, "Zhao_CrossSection_diagonal")
```

-Here an image that shows the resulting cross-sections:
+Here is an image that shows the resulting cross-sections:

![Paraview_7](../assets/img/Tutorial_Zhao_Paraview_7.png)

@@ -242,7 +242,7 @@ Sometimes, the data set covers a large region (e.g., the whole Earth), and you a
```julia
julia> Data_subset = ExtractSubvolume(Data_set,Lon_level=(5,12), Lat_level=(40,45))
-GeoData
+GeoData
size : (48, 35, 101)
lon ϵ [ 4.95 : 12.0]
lat ϵ [ 39.95 : 45.05]
@@ -257,7 +257,7 @@ By default, we extract the original data and do not interpolate it on a new grid
In some cases, you will want to interpolate the data on a different grid. Use the `Interpolate=true` option for that:
```julia
julia> Data_subset_interp = ExtractSubvolume(Data_set,Lon_level=(5,12), Lat_level=(40,45), Interpolate=true)
-GeoData
+GeoData
size : (50, 50, 50)
lon ϵ [ 5.0 : 12.0]
lat ϵ [ 40.0 : 45.0]
@@ -266,7 +266,7 @@ GeoData
julia> Write_Paraview(Data_subset, "Zhao_Subset_interp")
```
## Load and save data to disk
-It would be useful to save the 3D data set we just created to disk, such that we can easily load it again at a later stage and create cross-sections etc, or compare it with other models. 
+It would be useful to save the 3D data set we just created to disk, such that we can easily load it again at a later stage and create cross-sections etc., or compare it with other models.
It is quite easy to do so with the [JLD2.jl](https://github.com/JuliaIO/JLD2.jl) package:
```julia
julia> using JLD2
@@ -286,6 +286,3 @@ The full julia script that does it all is given [here](https://github.com/JuliaG
```julia
julia> include("Alps_VpModel_Zhao_etal_JGR2016.jl")
```
-
-
-
diff --git a/docs/src/man/tutorial_loadirregular3DSeismicData.md b/docs/src/man/tutorial_loadirregular3DSeismicData.md
index 2a9de39c..5e8e5188 100644
--- a/docs/src/man/tutorial_loadirregular3DSeismicData.md
+++ b/docs/src/man/tutorial_loadirregular3DSeismicData.md
@@ -9,11 +9,11 @@ As the data is not given in a regular lon/lat grid, we first interpolate it to a

## Steps

-#### 1. Download data
+#### 1. Download data
The data can be downloaded from [https://www.seismologie.ifg.uni-kiel.de/en/research/research-data/mere2020model](https://www.seismologie.ifg.uni-kiel.de/en/research/research-data/mere2020model). Do that and start julia from the directory where it was downloaded.

#### 2. Read data into Julia
-The main data-file, `El-Sharkawy-etal-G3.2020_MeRE2020_Mediterranean.csv`, has 23 lines of comments (indicated with `#`), after which the data starts. We can use the build-in package `DelimitedFiles` to read in the data, and tell it that the data is seperated by `|`. We also want the resulting data to be stored as double precision values (`Float64`), and the end of every line is a linebreak (`\n`).
+The main data-file, `El-Sharkawy-etal-G3.2020_MeRE2020_Mediterranean.csv`, has 23 lines of comments (indicated with `#`), after which the data starts. We can use the built-in package `DelimitedFiles` to read in the data, and tell it that the data is separated by `|`. We also want the resulting data to be stored as double precision values (`Float64`), and the end of every line is a linebreak (`\n`).
```julia-repl julia> using DelimitedFiles julia> data=readdlm("El-Sharkawy-etal-G3.2020_MeRE2020_Mediterranean.csv",'|',Float64,'\n', skipstart=23,header=false) @@ -27,7 +27,7 @@ julia> data=readdlm("El-Sharkawy-etal-G3.2020_MeRE2020_Mediterranean.csv",'|',Fl 40.26 -10.98 50.0 4.36 42.19 -10.97 50.0 4.38 44.1 -10.97 50.0 4.38 - ⋮ + ⋮ ``` Next, extract vectors from it: ``` @@ -55,7 +55,7 @@ The result looks like this: So this is somewhat regular but not entirely and in some areas data points are missing. It is possible to create a VTK mesh that exactly respects this data, but for that we need knowledge on how the points are connected in 3D. The comments in the file do not provide this information, which is why we interpolate it on a regular lon/lat grid here which uses the same depth levels as in the data. -We extract the available depth levels with +We extract the available depth levels with ```julia julia> Depth_vec = unique(depth) 301-element Vector{Float64}: @@ -78,7 +78,7 @@ which shows that the data set goes from `[-350:1:-50]`. Let's create a regular grid, which describes a somewhat smaller area than the data-points to ensure that we can do an interpolation without having to extrapolate ```julia -julia> using GeophysicalModelGenerator +julia> using GeophysicalModelGenerator julia> Lon,Lat,Depth = LonLatDepthGrid(-10:0.5:40,32:0.25:50,Depth_vec); julia> size(Lon) (101, 73, 301) @@ -93,9 +93,9 @@ julia> scatter!(Lon[:,:,1],Lat[:,:,1],color=:white, markersize=1.5, markertype=" #### 3.2 Interpolate to a regular grid -Next, we need a method to interpolate the irregular datapoints @ a certain depth level to the white data points. +Next, we need a method to interpolate the irregular datapoints @ a certain depth level to the white data points. -There are a number of ways to do this, for example by employing [GMT.jl](https://github.com/GenericMappingTools/GMT.jl), or by using [GeoStats.jl](https://juliaearth.github.io/GeoStats.jl/stable/index.html). 
+There are a number of ways to do this, for example by employing [GMT.jl](https://github.com/GenericMappingTools/GMT.jl), or by using [GeoStats.jl](https://juliaearth.github.io/GeoStats.jl/stable/index.html).
 In this example, we will employ GeoStats. If you haven't installed it yet, do that with
 ```julia
 julia> ]
@@ -132,10 +132,10 @@ julia> P = EstimationProblem(Geo, Cgrid, :Vs)
   data:      12278 MeshData{2,Float64}
   domain:    101×73 CartesianGrid{2,Float64}
   variables: Vs (Float64)
-julia> S = IDW(:Vs => (distance=Euclidean(),neighbors=2)); 
+julia> S = IDW(:Vs => (distance=Euclidean(),neighbors=2));
 julia> sol = solve(P, S)
 ```
-Here, we interpolated the data based on the Euclidean distance. Other methods, such as Kriging, can be used as well. 
+Here, we interpolated the data based on the Euclidean distance. Other methods, such as Kriging, can be used as well.
 Next, we can extract the data
 ```julia
 julia> sol_Vs = values(sol).Vs
@@ -153,7 +153,7 @@ julia> for iz=1:size(Depth,3)
             coord = PointSet([lon[ind]'; lat[ind]'])
             Geo   = georef((Vs=Vs[ind],), coord)
             P     = EstimationProblem(Geo, Cgrid, :Vs)
-            S     = IDW(:Vs => (distance=Euclidean(),neighbors=2)); 
+            S     = IDW(:Vs => (distance=Euclidean(),neighbors=2));
             sol   = solve(P, S)
             sol_Vs= values(sol).Vs
             Vs_2D = reshape(sol_Vs, size(domain(sol)))
@@ -162,16 +162,16 @@ julia> for iz=1:size(Depth,3)
 ```
 
 #### 4. Generate Paraview file
-Once the 3D velocity matrix has been generated, producing a Paraview file is done with the following command 
+Once the 3D velocity matrix has been generated, producing a Paraview file is done with the following command
 ```julia
 julia> using GeophysicalModelGenerator
-julia> Data_set = GeoData(Lon,Lat,Depth,(Vs_km_s=Vs_3D,)) 
-GeoData 
+julia> Data_set = GeoData(Lon,Lat,Depth,(Vs_km_s=Vs_3D,))
+GeoData
   size  : (101, 73, 301)
   lon   ϵ [ -10.0 - 40.0]
   lat   ϵ [ 32.0 - 50.0]
   depth ϵ [ -350.0 km - -50.0 km]
-  fields: (:Vs_km_s,) 
+  fields: (:Vs_km_s,)
 julia> Write_Paraview(Data_set, "MeRe_ElSharkawy")
 1-element Vector{String}:
  "MeRe_ElSharkawy.vts"
@@ -195,6 +195,3 @@ The full julia script that does it all is given [here](https://github.com/JuliaG
 ```julia
 julia> include("MeRe_ElSharkawy.jl")
 ```
-
-
-
diff --git a/docs/src/man/tutorial_loadregular3DSeismicData_netCDF.md b/docs/src/man/tutorial_loadregular3DSeismicData_netCDF.md
index 280d5b93..bc3ce5e5 100644
--- a/docs/src/man/tutorial_loadregular3DSeismicData_netCDF.md
+++ b/docs/src/man/tutorial_loadregular3DSeismicData_netCDF.md
@@ -7,7 +7,7 @@ El-Sharkawy et al. (2020), *The Slab Puzzle of the Alpine‐Mediterranean Region
 
 ## Steps
 
-#### 1. Download data 
+#### 1. Download data
 The data is can be downloaded from [https://ds.iris.edu/files/products/emc/emc-files/El-Sharkawy-etal-G3.2020-MeRE2020-Mediterranean-0.0.nc](https://ds.iris.edu/files/products/emc/emc-files/El-Sharkawy-etal-G3.2020-MeRE2020-Mediterranean-0.0.nc). Do that and start julia from the directory where it was downloaded.
 
 #### 2. Read data into Julia
@@ -26,79 +26,79 @@ julia> ncinfo("El-Sharkawy-etal-G3.2020-MeRE2020-Mediterranean-0.0.nc")
 
 ##### Dimensions #####
 
-Name Length 
+Name Length
 --------------------------------------------------------------------------------------------------------------------------
-depth 301 
-latitude 100 
-longitude 100 
+depth 301
+latitude 100
+longitude 100
 
 ##### Variables #####
 
-Name Type Dimensions 
+Name Type Dimensions
 --------------------------------------------------------------------------------------------------------------------------
-depth FLOAT depth 
-latitude FLOAT latitude 
-longitude FLOAT longitude 
-Vs FLOAT longitude latitude depth 
+depth FLOAT depth
+latitude FLOAT latitude
+longitude FLOAT longitude
+Vs FLOAT longitude latitude depth
 
 ##### Attributes #####
 
-Variable Name Value 
+Variable Name Value
 --------------------------------------------------------------------------------------------------------------------------
-global author_email amr.elsharkawy@ifg.uni-kiel.de 
-global data_revision r.0.0 
+global author_email amr.elsharkawy@ifg.uni-kiel.de
+global data_revision r.0.0
 global author_institution Institute of Geosciences, University of Kiel, Otto-Hahn Pl..
 global keywords seismic tomography, shear wave, Mediterranean, phase veloc..
 global acknowledgment Model was provided by Dr. El-Sharkawy, Institute of Geosci..
 global history 2020-09-29 14:37:43 UTC Converted to netCDF by GeoCSV_2_ne..
-global repository_pid doi:10.17611/dp/emc.2020.meresvelsh.1 
-global id MeRE2020 
-global author_name Amr EL-Sharkawy 
-global comment model converted to netCDF by IRIS EMC 
+global repository_pid doi:10.17611/dp/emc.2020.meresvelsh.1
+global id MeRE2020
+global author_name Amr EL-Sharkawy
+global comment model converted to netCDF by IRIS EMC
 global NCO netCDF Operators version 4.7.5 (Homepage = http://nco.sf.n..
 global summary MeRE2020 is a high-resolution Shear‐wave velocity mod..
-global repository_institution IRIS DMC 
-global netCDF_file El-Sharkawy-etal-G3.2020-MeRE2020-Mediterranean-1.0.nc 
-global author_url https://www.seismologie.ifg.uni-kiel.de 
-global reference El-Sharkawy, et al. (2020) 
-global repository_name EMC 
-global Conventions CF-1.0 
-global Metadata_Conventions Unidata Dataset Discovery v1.0 
+global repository_institution IRIS DMC
+global netCDF_file El-Sharkawy-etal-G3.2020-MeRE2020-Mediterranean-1.0.nc
+global author_url https://www.seismologie.ifg.uni-kiel.de
+global reference El-Sharkawy, et al. (2020)
+global repository_name EMC
+global Conventions CF-1.0
+global Metadata_Conventions Unidata Dataset Discovery v1.0
 global title The Slab Puzzle of the Alpine‐Mediterranean Region: I..
-depth units km 
-depth long_name depth in km 
-depth display_name depth in km 
-depth positive down 
-latitude units degrees_north 
-latitude long_name Latitude; positive north 
-latitude standard_name latitude 
-longitude units degrees_east 
-longitude long_name Longitude; positive east 
-longitude standard_name longitude 
-Vs units km.s-1 
-Vs long_name Shear wave velocity 
-Vs display_name S Velocity (km/s) 
+depth units km
+depth long_name depth in km
+depth display_name depth in km
+depth positive down
+latitude units degrees_north
+latitude long_name Latitude; positive north
+latitude standard_name latitude
+longitude units degrees_east
+longitude long_name Longitude; positive east
+longitude standard_name longitude
+Vs units km.s-1
+Vs long_name Shear wave velocity
+Vs display_name S Velocity (km/s)
 ```
 As you can see, there is quite some information present in this file. The most important information here are the different variables stored in this file:
 ```julia-repl
 ##### Variables #####
 
-Name Type Dimensions 
+Name Type Dimensions
 --------------------------------------------------------------------------------------------------------------------------
-depth FLOAT depth 
-latitude FLOAT latitude 
-longitude FLOAT longitude 
-Vs FLOAT longitude latitude depth 
+depth FLOAT depth
+latitude FLOAT latitude
+longitude FLOAT longitude
+Vs FLOAT longitude latitude depth
 ```
-Here we can see that there are four variables in this file, three of them (depth,latitude, longitude) having a single dimension and the fourth one (Vs) having dimensions of the three previous variables. The three one-dimensional vectors therefore denote a regualr grid of coordinates defining the locations where Vs is stored.
-To load this data, we can now simply use the commmand *ncread*:
+Here we can see that there are four variables in this file, three of them (depth,latitude, longitude) having a single dimension and the fourth one (Vs) having dimensions of the three previous variables. The three one-dimensional vectors therefore denote a regular grid of coordinates defining the locations where Vs is stored.
+To load this data, we can now simply use the command *ncread*:
 ```julia-repl
 julia> lat = ncread(filename,"latitude")
 julia> lon = ncread(filename,"longitude")
 julia> depth = ncread(filename,"depth")
 julia> Vs_3D = ncread(filename,"Vs")
-julia> depth = -1 .* depth 
+julia> depth = -1 .* depth
 ```
 Note that we multiplied depth with -1. This is necessary to make depth to be negative, as that is what `GeophysicalModelGenerator.jl` expects.
@@ -108,16 +108,16 @@ In the netCDF file, coordinates are given as 1D vectors denoting the location of
 Lon3D,Lat3D,Depth3D = LonLatDepthGrid(lon, lat, depth);
 ```
 #### 4. Generate Paraview file
-Once the 3D coordinate matrix has been generated, producing a Paraview file is done with the following command 
+Once the 3D coordinate matrix has been generated, producing a Paraview file is done with the following command
 ```julia
 julia> using GeophysicalModelGenerator
-julia> Data_set = GeoData(Lon,Lat,Depth,(Vs_km_s=Vs_3D,)) 
-GeoData 
+julia> Data_set = GeoData(Lon,Lat,Depth,(Vs_km_s=Vs_3D,))
+GeoData
   size  : (100, 100, 301)
   lon   ϵ [ 29.0 - 51.0]
   lat   ϵ [ -11.0 - 45.9900016784668]
   depth ϵ [ -350.0 km - -50.0 km]
-  fields: (:Vs_km_s,) 
+  fields: (:Vs_km_s,)
 julia> Write_Paraview(Data_set, "MeRe_ElSharkawy")
 1-element Vector{String}:
  "MeRe_ElSharkawy.vts"
@@ -140,6 +140,3 @@ The full julia script that does it all is given [here](https://github.com/JuliaG
 ```julia
 julia> include("MeRe_ElSharkawy.jl")
 ```
-
-
-
diff --git a/docs/src/man/tutorial_local_Flegrei.md b/docs/src/man/tutorial_local_Flegrei.md
index 4bbf6a02..02522c66 100644
--- a/docs/src/man/tutorial_local_Flegrei.md
+++ b/docs/src/man/tutorial_local_Flegrei.md
@@ -44,7 +44,7 @@ The first column gives us a temporal marker we can use to plot earthquakes in di
 julia> using DelimitedFiles, GeophysicalModelGenerator, Glob, GeoStats
 julia> data_80s = readdlm("SeismicLocations/Seismicity_UTM_1983_1984.txt", '\t', skipstart=0, header=false);
 julia> data_00s = readdlm("SeismicLocations/Seismicity_UTM_2005_2016.txt", ' ', skipstart=0, header=false);
-julia> data = vcat(data_80s,data_00s) 
+julia> data = vcat(data_80s,data_00s)
 julia> time = data[:,1];
 julia> WE = data[:,2];
 julia> SN = data[:,3];
@@ -145,4 +145,4 @@ This is one of the horizontal sections created by the code in the previous model
 
 ![Tutorial_Flegrei_Noise](../assets/img/Flegrei_Noise.png)
 
-If you want to run the entire example, you can find the .jl code [here](https://github.com/JuliaGeodynamics/GeophysicalModelGenerator.jl/blob/main/tutorial/Tutorial_Flegrei.jl)
\ No newline at end of file
+If you want to run the entire example, you can find the .jl code [here](https://github.com/JuliaGeodynamics/GeophysicalModelGenerator.jl/blob/main/tutorial/Tutorial_Flegrei.jl)
diff --git a/docs/src/man/tutorials.md b/docs/src/man/tutorials.md
index 79af59d1..f98e1eda 100644
--- a/docs/src/man/tutorials.md
+++ b/docs/src/man/tutorials.md
@@ -6,7 +6,7 @@ The best way to learn how to use julia and GeophysicalModelGenerator to visualiz
 2. [Moho topography data](./tutorial_MohoTopo.md). Shows how to plot Moho data as 3D points in paraview, and how to fit a surface through it.
 3. [Topography](./tutorial_GMT_Topography.md). Shows how to quickly obtain the topography of any part of the world using GMT & transfer that to paraview.
 4. [Coastlines](./tutorial_Coastlines.md). Shows how to generate land/sea surfaces.
-5. [Import screenshots](./tutorial_Screenshot_To_Paraview.md). Gives examples how you can easily import screenshots from published papers and visualize them in 3D 
+5. [Import screenshots](./tutorial_Screenshot_To_Paraview.md). Gives examples how you can easily import screenshots from published papers and visualize them in 3D
 6. [3D seismic tomography data on irregular grid](./tutorial_loadirregular3DSeismicData.md). Shows how to interpolate 3D seismic data, given on an irregular lon/lat grid, to a regular grid and create Paraview input from it.
 7. [Topography and geological maps](./tutorial_GMT_Topography_GeologicalMap.md). Shows how to import ETOPO1 topography, how to drape a geological map over it & transfer that to Paraview.
 8. [ISC earthquake data](./tutorial_ISC_data.md). Shows how to import earthquake data from the ISC catalogue.
diff --git a/docs/src/man/visualise.md b/docs/src/man/visualise.md
index 7345b159..b486cfdd 100644
--- a/docs/src/man/visualise.md
+++ b/docs/src/man/visualise.md
@@ -13,22 +13,22 @@ The last line loads packages we need to read in the pre-generated `JLD2` file (s
 Lets load the data, by first moving to the correct directory (will likely be different on your machine).
 ```julia
 julia> ;
-shell> cd ~/Downloads/Zhao_etal_2016_data/ 
+shell> cd ~/Downloads/Zhao_etal_2016_data/
 ```
 Now you can use the `backspace` key to return to the `REPL`, where we will load the data
 ```julia
-julia> Data = JLD2.load("Zhao_Pwave.jld2","Data_set_Zhao2016_Vp"); 
+julia> Data = JLD2.load("Zhao_Pwave.jld2","Data_set_Zhao2016_Vp");
 ```
 At this stage you can look at it with
 ```julia
-julia> Visualise(Data); 
+julia> Visualise(Data);
 ```
 Note that this tends to take a while, the first time you do this (faster afterwards). Let's add topography to the plot as well, which requires us to first load that:
 ```julia
 julia> Topo = ImportTopo([0,18,38,52], file="@earth_relief_01m.grd");
-julia> Visualise(Data, Topography=Topo); 
+julia> Visualise(Data, Topography=Topo);
 ```
 Which will look like:
 ![Tutorial_Visualize](../assets/img/Tutorial_Visualize.png)
@@ -41,7 +41,7 @@ ProjectionPoint(45.0, 10.0, 578815.302916711, 4.983436768349297e6, 32, true)
 julia> Data_Cart = CartData(XYZGrid(-600:10:600,-600:10:600,-1000:10:-1));
 julia> Topo_Cart = CartData(XYZGrid(-600:10:600,-600:10:600,0));
 julia> Topo_Cart = ProjectCartData(Topo_Cart, Topo, p)
-CartData 
+CartData
     size    : (121, 121, 1)
     x       ϵ [ -600.0 : 600.0]
     y       ϵ [ -600.0 : 600.0]
@@ -49,7 +49,7 @@ CartData
     fields  : (:Topography,)
   attributes: ["note"]
 julia> Data_Cart = ProjectCartData(Data_Cart, Data, p)
-CartData 
+CartData
     size    : (121, 121, 100)
     x       ϵ [ -600.0 : 600.0]
     y       ϵ [ -600.0 : 600.0]
diff --git a/docs/src/scripts/LaPalma.dat b/docs/src/scripts/LaPalma.dat
index f64edccf..89ae7bf0 100644
--- a/docs/src/scripts/LaPalma.dat
+++ b/docs/src/scripts/LaPalma.dat
@@ -3,7 +3,7 @@
 # Data is provided in the variables seg_x, seg_y, seg_z and includes:
 #       * coordinates of the delimiters between the segments (n-1 points)
 #       * number of cells (n points)
-#       * bias coefficients (n points) 
+#       * bias coefficients (n points)
 
 #===============================================================================
@@ -40,11 +40,11 @@ unit_stress = 1e9
 # relative geometry tolerance for grid manipulations (default 1e-9)
 gtol = 1e-9
- 
-# Number of cells for all segments 
+
+# Number of cells for all segments
 nel_x = 64
-nel_y = 64 
-nel_z = 32 
+nel_y = 64
+nel_z = 32
 
 # Coordinates of all segments (including start and end points)
 coord_x = -50 50
@@ -62,7 +62,7 @@ unit_stress = 1e9
 surf_max_angle = 45.0 # maximum angle with horizon (smoothed if larger)
 surf_topo_file = LaPalma_Topo.topo # initial topography file (redundant)
 erosion_model = 0 # erosion model [0-none (default), 1-infinitely fast]
-sediment_model = 0 # sedimentation model [0-none (dafault), 1-prescribed rate]
+sediment_model = 0 # sedimentation model [0-none (default), 1-prescribed rate]
 
 #===============================================================================
 # Boundary conditions
@@ -79,7 +79,7 @@ unit_stress = 1e9
 
 # Free surface top boundary flag
 open_top_bound = 1
- 
+
 #===============================================================================
 # Jacobian & residual evaluation parameters
 #===============================================================================
@@ -154,7 +154,7 @@ unit_stress = 1e9
 out_avd = 1 # activate AVD phase output
 out_avd_pvd = 1 # activate writing .pvd file
 out_avd_ref = 1 # AVD grid refinement factor
- 
+
 # free surface output
 out_surf = 1 # activate surface output
 out_surf_pvd = 1 # activate writing .pvd file
@@ -172,19 +172,19 @@ unit_stress = 1e9
 Name = air
 ID = 0
 rho = 100
-alpha = 3e-5 
+alpha = 3e-5
 
-# Linear Viscosity 
+# Linear Viscosity
 eta = 1e17
 
 # Elastic parameters
 G = 1e10
 nu = 0.2
- 
-# Thermal parameters 
+
+# Thermal parameters
 k = 30
 Cp = 1000
- 
+
 # Plastic parameters
 ch = 10e6
 fr = 30
@@ -195,19 +195,19 @@ unit_stress = 1e9
 Name = water
 ID = 1
 rho = 100
-alpha = 3e-5 
+alpha = 3e-5
 
-# Linear Viscosity 
+# Linear Viscosity
 eta = 1e17
 
 # Elastic parameters
 G = 1e10
 nu = 0.2
- 
-# Thermal parameters 
+
+# Thermal parameters
 k = 30
 Cp = 1000
- 
+
 # Plastic parameters
 ch = 10e6
 fr = 30
@@ -220,7 +220,7 @@ unit_stress = 1e9
 ID = 2
 rho = 2900
 alpha = 3e-5
- 
+
 # dislocation viscosity
 #disl_prof = Plagioclase_An75-Ranalli_1995
 #disl_prof = Wet_Quarzite-Ranalli_1995
@@ -230,18 +230,18 @@ unit_stress = 1e9
 # Elastic parameters
 G = 3e10
 nu = 0.2
- 
+
 # Thermal parameters
 k = 3
 Cp = 1000
- 
+
 # Plastic parameters
 ch = 10e6
 fr = 30
- 
- 
- 
+
+
+
 # ------------------- Mantle lithosphere-------------------
 Name = Mantle
@@ -255,11 +255,11 @@ unit_stress = 1e9
 # Elastic parameters
 G = 6.5e10
 nu = 0.2
- 
+
 # Thermal parameters
 k = 3.3
 Cp = 1000
- 
+
 # Plastic parameters
 ch = 10e6
 fr = 30
@@ -318,9 +318,9 @@ unit_stress = 1e9
 # PETSc options
 #===============================================================================
- 
+
 # SNES
- 
+
 -snes_npicard 5
 -snes_max_it 200
 -snes_rtol 1e-6
@@ -328,11 +328,11 @@ unit_stress = 1e9
 -snes_PicardSwitchToNewton_rtol 1e-7 #-7
 #-snes_NewtonSwitchToPicard_it 1 # number of Newton iterations after which we switch back to Picard
 # -snes_monitor
-# -snes_linesearch_monitor 
+# -snes_linesearch_monitor
 
 # Jacobian solver
- -js_ksp_type fgmres 
- -js_ksp_monitor 
+ -js_ksp_type fgmres
+ -js_ksp_monitor
 -js_ksp_max_it 30
 -js_ksp_rtol 1e-5
 -js_ksp_atol 1e-6
@@ -345,17 +345,13 @@ unit_stress = 1e9
 -gmg_pc_mg_galerkin
 -gmg_pc_mg_type multiplicative
 -gmg_pc_mg_cycle_type v
- 
+
 -gmg_mg_levels_ksp_type richardson
 -gmg_mg_levels_ksp_richardson_scale 0.5
 -gmg_mg_levels_ksp_max_it 5
 -gmg_mg_levels_pc_type jacobi
 
 -crs_pc_factor_mat_solver_package pastix
-
-
-
-
-
diff --git a/docs/src/scripts/LaPalma_example.jl b/docs/src/scripts/LaPalma_example.jl
index e1575367..4ce8b412 100644
--- a/docs/src/scripts/LaPalma_example.jl
+++ b/docs/src/scripts/LaPalma_example.jl
@@ -2,13 +2,13 @@
 #
 # ## Aim
 # In this tutorial, your will learn how to use real data to create a geodynamic model setup with LaMEM. We will use the data of La Palma, which is a volcanic island that started erupting in mid september 2021.
-# LaMEM is a cartesian geodynamic model, which implies that you will have to transfer the data from `GeoData` to `CartData`. +# LaMEM is a cartesian geodynamic model, which implies that you will have to transfer the data from `GeoData` to `CartData`. # # # ## 1. Load data # We will use two types of data to create the model -# 1) Topography -# 2) Earthquake locations +# 1) Topography +# 2) Earthquake locations # We start with loading the required packages using GeophysicalModelGenerator, JLD2 @@ -17,41 +17,41 @@ using GMT # In case you managed to install GMT on your machine, you can automatically download the topography with Topo = ImportTopo(lon = [-18.7, -17.1], lat=[28.0, 29.2], file="@earth_relief_03s.grd") -# In case you did not manage that, we have prepared a JLD2 file [here](https://seafile.rlp.net/f/03286a46a9f040a69b0f/?dl=1). +# In case you did not manage that, we have prepared a JLD2 file [here](https://seafile.rlp.net/f/03286a46a9f040a69b0f/?dl=1). # Download it, and doublecheck that you are in the same directory as the data file with: pwd() -# Load the data: +# Load the data: Topo=load("Topo_LaPalma.jld2","Topo") # Next, lets load the seismicity. We have prepared a file with earthquake locations up to early November (from [https://www.ign.es/web/ign/portal/vlc-catalogo](https://www.ign.es/web/ign/portal/vlc-catalogo)). # Download that [here](https://seafile.rlp.net/f/03286a46a9f040a69b0f/?dl=1) data_all_EQ = load("EQ_Data.jld2","data_all_EQ") -# Write the data to paraview with +# Write the data to paraview with Write_Paraview(data_all_EQ,"data_all_EQ",PointsData=true) Write_Paraview(Topo,"Topo") # As earthquakes are point-wise data, you have to specify this. # ![LaPalma_EQTopo_GeoData](../assets/img/TopoEQs_LaPalma_GeoData.png) -# Note that this data is not in "easy" coordinates (see coordinate axis in the plot, where `z` is *not* pointing upwards). 
+# Note that this data is not in "easy" coordinates (see coordinate axis in the plot, where `z` is *not* pointing upwards). -# ## 2. Convert data -# In order to create model setups, it is helpful to first transfer the data to Cartesian. +# ## 2. Convert data +# In order to create model setups, it is helpful to first transfer the data to Cartesian. # This requires us to first determine a *projection point*, that is fixed. Often, it is helpful to use the center of the topography for this. In the present example, we will center the model around La Palma itself: -proj = ProjectionPoint(Lon=-17.84, Lat=28.56) +proj = ProjectionPoint(Lon=-17.84, Lat=28.56) # Once this is done you can convert the topographic data to the cartesian reference frame EQ_cart = Convert2CartData(data_all_EQ, proj); Topo_cart = Convert2CartData(Topo, proj) -# It is important to realize that the cartesian coordinates of the topographic grid is no longer strictly orthogonal after this conversion. You don't notice that in the current example, as the model domain is rather small. -# In other cases, however, this is quite substantial (e.g., India-Asia collision zone). +# It is important to realize that the cartesian coordinates of the topographic grid is no longer strictly orthogonal after this conversion. You don't notice that in the current example, as the model domain is rather small. +# In other cases, however, this is quite substantial (e.g., India-Asia collision zone). # LaMEM needs an orthogonal grid of topography, which we can create with: Topo_LaMEM = CartData(XYZGrid(-70:.2:70,-60:.2:70,0)); # In a next step, the routine `ProjectCartData` projects a `GeoData` structure to a `CartData` struct -Topo_LaMEM = ProjectCartData(Topo_LaMEM, Topo, proj) +Topo_LaMEM = ProjectCartData(Topo_LaMEM, Topo, proj) # Let's have a look at the data: Write_Paraview(EQ_cart,"EQ_cart",PointsData=true) @@ -59,20 +59,20 @@ Write_Paraview(Topo_LaMEM,"Topo_LaMEM") # ## 3. 
Create LaMEM setup # -# In a next step, we need to read the LaMEM input file of the simulation. In particular, this will read the lateral dimensions of the grid and +# In a next step, we need to read the LaMEM input file of the simulation. In particular, this will read the lateral dimensions of the grid and # the number of control volumes (elements), you want to apply in every direction. -# The LaMEM input file can be downloaded [here](../scripts/LaPalma.dat). +# The LaMEM input file can be downloaded [here](../scripts/LaPalma.dat). # Make sure you are in the same directory as the *.dat file & execute the following command Grid = ReadLaMEM_InputFile("LaPalma.dat") -# The `LaMEM_grid` structure contains the number of elements in every direction and the number of markers in every cell. +# The `LaMEM_grid` structure contains the number of elements in every direction and the number of markers in every cell. # It also contains `Grid.X`, `Grid.Y`, `Grid.Z`, which are the coordinates of each of the markers in the 3 directions. -# -# In a next step we need to give each of these points a `Phase` number (which is an integer, that indicates the type of the rock that point has), as well as the temperature (in Celcius). +# +# In a next step we need to give each of these points a `Phase` number (which is an integer, that indicates the type of the rock that point has), as well as the temperature (in Celsius). Phases = ones(Int32,size(Grid.X))*2; Temp = ones(Float64,size(Grid.X)); -# In this example we set the temperature based on the depth, assuming a constant geotherm. Depth is given in kilometers +# In this example we set the temperature based on the depth, assuming a constant geotherm. 
Depth is given in kilometers Geotherm = 30; Temp = -Grid.Z.*Geotherm; @@ -86,7 +86,7 @@ ind = findall( Grid.Z .< -40); Phases[ind] .= 3; # Everything above the free surface is assumed to be "air" -ind = AboveSurface(Grid, Topo_cart); +ind = AboveSurface(Grid, Topo_cart); Phases[ind] .= 0; # And all "air" points that are below sea-level becomes "water" @@ -94,9 +94,9 @@ ind = findall( (Phases.==0) .& (Grid.Z .< 0)); Phases[ind] .= 1; # You can interpret the seismicity in different ways. If you assume that there are fully molten magma chambers, there should be no earthquakes within the magma chamber (as stresses are liklely to be very small). -# If, however, this is a partially molten mush, extraction of melt of that mush will change the fluid pressure and may thus release build-up stresses. +# If, however, this is a partially molten mush, extraction of melt of that mush will change the fluid pressure and may thus release build-up stresses. # We will assume the second option in our model setup. -# +# # Looking at the seismicity, there is a swarm at around 35 km depth AddSphere!(Phases,Temp,Grid, cen=(0,0,-35), radius=5, phase=ConstantPhase(5), T=ConstantTemp(1200)); @@ -122,4 +122,4 @@ Save_LaMEMMarkersParallel(Model3D) #src Note: The markdown page is generated using: -#src Literate.markdown("LaPalma_example.jl",outputdir="../man/", execute=true) \ No newline at end of file +#src Literate.markdown("LaPalma_example.jl",outputdir="../man/", execute=true) diff --git a/docs/src/scripts/Tutorial_Votemaps.jl b/docs/src/scripts/Tutorial_Votemaps.jl index 597f26cc..52b4b408 100644 --- a/docs/src/scripts/Tutorial_Votemaps.jl +++ b/docs/src/scripts/Tutorial_Votemaps.jl @@ -6,10 +6,10 @@ # ## Steps # # #### 1. Load data -# We assume that you have all tomographic models already available as *.JLD2 files. In our example we use the data uploaded to +# We assume that you have all tomographic models already available as *.JLD2 files. 
In our example we use the data uploaded to # [https://seafile.rlp.net/d/22b0fb85550240758552/](https://seafile.rlp.net/d/22b0fb85550240758552/) # -# Specifically, we will use the tomographic models of Paffrath, Zhao and Koulakov, as we have them all available in processed form +# Specifically, we will use the tomographic models of Paffrath, Zhao and Koulakov, as we have them all available in processed form # Download the corresponding *.jld2 files to the same directory using JLD2, GeophysicalModelGenerator @@ -19,7 +19,7 @@ Pwave_Paffrath = load("Paffrath_Pwave.jld2","Data_set_Paffrath2021_Vp") # The 3 data sets all contain tomographic models for roughly the alpine area, but as you can see, they all have a different resolution and the fields are sometimes called different as well: Pwave_Paffrath -GeoData +GeoData size : (162, 130, 42) lon ϵ [ -13.3019 : 35.3019] lat ϵ [ 30.7638 : 61.2362] @@ -27,7 +27,7 @@ GeoData fields: (:dVp_Percentage, :Vp, :Resolution) PSwave_Koulakov - GeoData + GeoData size : (108, 81, 35) lon ϵ [ 4.0 : 20.049999999999997] lat ϵ [ 37.035928143712574 : 49.01197604790419] @@ -40,19 +40,19 @@ PSwave_Koulakov # - Everything that fulfills this criteria gets the number 1, everything else 0. # - Do the same with other tomographic models (using the same criteria or a different one). # - Make sure that all tomographic models are interpolated to the same grid. -# - Sum up the 3 models. +# - Sum up the 3 models. # - The resulting 3D map will have the number `3` at locations where all 3 models see a high-velocity anomaly (say a slab), implying that this feature is consistent between all models. Features that have a number 1 or 2 are only seen in a single or in 2/3 models. # # The result of this gives a feeling which features are consistent between the 3 models. 
It is ofcourse still subjective, as you still have to choose a criteria and as we are assuming in this that the 3 tomographic models are comparable (which they are not as they are produced using different amounts of datam, etc. etc.) # -# So how do we create Votemaps? +# So how do we create Votemaps? # Doing this is rather simple: Data_VoteMap = VoteMap( [Pwave_Paffrath, PSwave_Koulakov, Pwave_Zhao], ["dVp_Percentage>3.0","dVp_percentage>2.0","dVp_Percentage>2.0"], dims=(100,100,100)) # This will look at the common `lon`,`lat`,`depth` ranges between all 3 models, interpret each of the models to a common grid of size `(100,100,100)` and apply each of the criteria specified # The resulting `GeoData` struct looks like: -GeoData +GeoData size : (100, 100, 100) lon ϵ [ 4.0 : 18.0] lat ϵ [ 38.0 : 49.01197604790419] @@ -65,13 +65,10 @@ Write_Paraview(Data_VoteMap, "VoteMap_Alps") # In paraview, this gives # ![Tutorial_VoteMap](../assets/img/Tutorial_VoteMap.png) -# You can ofcourse argue that newer tomographic models include more data & should therefore count more +# You can ofcourse argue that newer tomographic models include more data & should therefore count more # A simple way to take that into account is to list the model twice: Data_VoteMap = VoteMap( [Pwave_Paffrath, Pwave_Paffrath, PSwave_Koulakov, Pwave_Zhao], - ["dVp_Percentage>3.0","dVp_Percentage>3.0", "dVp_percentage>2.0","dVp_Percentage>2.0"], + ["dVp_Percentage>3.0","dVp_Percentage>3.0", "dVp_percentage>2.0","dVp_Percentage>2.0"], dims=(100,100,100)) # Similarly, you could only look at a single model (even when that is perhaps easier done directly in paraview, where you can simply select a different isocontour value) - - - diff --git a/src/GMT_utils.jl b/src/GMT_utils.jl index 50accad0..ed6a69e8 100644 --- a/src/GMT_utils.jl +++ b/src/GMT_utils.jl @@ -5,11 +5,11 @@ using .GMT export ImportTopo """ - Topo = ImportTopo(limits; file::String="@earth_relief_01m.grd") - -Uses GMT to download the topography of a 
certain region, specified with limits=[lon_min, lon_max, lat_min, lat_max] + Topo = ImportTopo(limits; file::String="@earth_relief_01m.grd") -Note: +Uses GMT to download the topography of a certain region, specified with limits=[lon_min, lon_max, lat_min, lat_max] + +Note: ==== - latitude values in the southern hemisphere should have a minus sign (e.g., -2.8) - longitude values that are "west" should *either* come with a minus sign *or* are defined by values >180 @@ -34,10 +34,10 @@ Note: *Note*: this routine is only available once the GMT.jl package is loaded on the REPL -# Example +# Example ```julia-repl julia> Topo = ImportTopo([4,20,37,49]); -GeoData +GeoData size : (960, 720, 1) lon ϵ [ 4.0 : 19.983333333333334] lat ϵ [ 37.0 : 48.983333333333334] @@ -54,14 +54,14 @@ julia> Write_Paraview(Topo,"Topo_Alps") function ImportTopo(limits; file::String="@earth_relief_01m.grd") # Correct if negative values are given (longitude coordinates that are west) - ind = findall(limits[1:2] .< 0); - + ind = findall(limits[1:2] .< 0); + if (limits[1] < 0) && (limits[2] < 0) - limits[ind] .= 360 .+ limits[ind]; - limits[1:2] = sort(limits[1:2]) + limits[ind] .= 360 .+ limits[ind]; + limits[1:2] = sort(limits[1:2]) end - # Download topo file + # Download topo file G = gmtread(file, limits=limits, grid=true); # Transfer to GeoData @@ -69,7 +69,7 @@ function ImportTopo(limits; file::String="@earth_relief_01m.grd") Lon,Lat,Depth = LonLatDepthGrid(G.x[1:nx],G.y[1:ny],0); Depth[:,:,1] = 1e-3*G.z'; Topo = GeoData(Lon, Lat, Depth, (Topography=Depth*km,)) - + return Topo end diff --git a/src/GeophysicalModelGenerator.jl b/src/GeophysicalModelGenerator.jl index 26ced113..16f9127b 100644 --- a/src/GeophysicalModelGenerator.jl +++ b/src/GeophysicalModelGenerator.jl @@ -7,22 +7,22 @@ using Requires # Load & export some useful commands/functions from GeoParams: import GeoParams using .GeoParams -export - @u_str, uconvert, upreffered, unit, ustrip, NoUnits, # Units - GeoUnit, GEO_units, 
SI_units, NO_units, AbstratGeoUnits, +export + @u_str, uconvert, upreffered, unit, ustrip, NoUnits, # Units + GeoUnit, GEO_units, SI_units, NO_units, AbstratGeoUnits, Nondimensionalize, Nondimensionalize!, Dimensionalize, Dimensionalize!, - superscript, upreferred, GEO, SI, NONE, isDimensional, + superscript, upreferred, GEO, SI, NONE, isDimensional, km, m, cm, mm, Myrs, yr, s, MPa, Pa, Pas, K, C, kg, mol, isDimensional, Value, NumValue, Unit, UnitValue export ReadCSV_LatLon, meshgrid, voxGrav -abstract type AbstractGeneralGrid end # general grid types +abstract type AbstractGeneralGrid end # general grid types export AbstractGeneralGrid # julia standard library packages -using DelimitedFiles, Statistics +using DelimitedFiles, Statistics # other packages using WriteVTK, Colors, MeshIO, FileIO, Interpolations, Geodesy @@ -52,7 +52,7 @@ function __init__() println("Loading GLMakie plotting routines within GMG") @eval include("./Visualisation.jl") end - + end diff --git a/src/LaMEM_io.jl b/src/LaMEM_io.jl index 455e3099..b0d12140 100644 --- a/src/LaMEM_io.jl +++ b/src/LaMEM_io.jl @@ -4,9 +4,9 @@ using Glob using Interpolations # LaMEM I/O -# +# # These are routines that help to create a LaMEM marker files from a ParaviewData structure, which can be used to perform geodynamic simulations -# We also include routines with which we can read LaMEM *.pvtr files into julia +# We also include routines with which we can read LaMEM *.pvtr files into julia export LaMEM_grid, ReadLaMEM_InputFile export Save_LaMEMMarkersParallel, Save_LaMEMTopography @@ -21,47 +21,47 @@ struct LaMEM_grid <: AbstractGeneralGrid nmark_y :: Int64 nmark_z :: Int64 # total number of markers - nump_x :: Int64 + nump_x :: Int64 nump_y :: Int64 nump_z :: Int64 # total number of elements in grid nel_x :: Int64 nel_y :: Int64 nel_z :: Int64 - # exent of the grid + # extent of the grid W :: Float64 L :: Float64 H :: Float64 # start and end coordinates of grid segments - coord_x - coord_y + coord_x + coord_y 
coord_z # 1D vectors with marker coordinates x_vec y_vec z_vec # grid with marker coordinates - X - Y + X + Y Z # 1D vectors with node coordinates xn_vec yn_vec zn_vec # grid with node coordinates - Xn - Yn - Zn + Xn + Yn + Zn end -""" +""" ParaviewData(Grid::LaMEM_grid, fields::NamedTuple) Creates a `ParaviewData` struct from a LaMEM grid and from fields stored on that grid. Note that one needs to have a field `Phases` and optionally a field `Temp` to create LaMEM marker files. """ ParaviewData(Grid::LaMEM_grid, fields::NamedTuple) = ParaviewData(Grid.X, Grid.Y, Grid.Z, fields) -""" +""" CartData(Grid::LaMEM_grid, fields::NamedTuple) Creates a `CartData` struct from a LaMEM grid and from fields stored on that grid. Note that one needs to have a field `Phases` and optionally a field `Temp` to create LaMEM marker files. @@ -90,7 +90,7 @@ end """ value = ParseValue_LaMEM_InputFile(file,keyword,type) -Extracts a certain `keyword` from a LaMEM input `file` and convert it to a certain type +Extracts a certain `keyword` from a LaMEM input `file` and convert it to a certain type # Example ```julia @@ -123,7 +123,7 @@ function ParseValue_LaMEM_InputFile(file,keyword,type) end end end - + end return value @@ -132,14 +132,14 @@ end """ - Grid::LaMEM_grid = ReadLaMEM_InputFile(file) + Grid::LaMEM_grid = ReadLaMEM_InputFile(file) Parses a LaMEM input file and stores grid information in the `Grid` structure. 
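As context for the parser that follows: the counts stored in `LaMEM_grid` are linked — total markers per direction are elements times markers-per-cell, and marker coordinates are cell-centred at spacing `W/nump_x`. A quick Python sketch of that bookkeeping (the numbers 32 and 3 are taken from the example docstrings in this file; `coord_x` is a hypothetical segment):

```python
# Grid bookkeeping for one direction, mirroring LaMEM_grid / ReadLaMEM_InputFile.
# Assumed inputs: nel_x = 32 elements, nmark_x = 3 markers per cell.
nel_x = 32
nmark_x = 3
nump_x = nel_x * nmark_x          # total markers in x

coord_x = (-3.0, 3.0)             # start/end of the grid segment (made up)
W = coord_x[-1] - coord_x[0]      # extent of the grid
dx = W / nump_x                   # marker spacing

# cell-centred markers, like Julia's `coord_x[1]+dx/2 : dx : coord_x[end]-dx/2`
x = [coord_x[0] + dx / 2 + i * dx for i in range(nump_x)]
```

With 32 elements and 3 markers per cell this gives the 96 markers per direction shown in the `ReadLaMEM_InputFile` example output.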
# Example ```julia -julia> Grid = ReadLaMEM_InputFile("SaltModels.dat") -LaMEM Grid: +julia> Grid = ReadLaMEM_InputFile("SaltModels.dat") +LaMEM Grid: nel : (32, 32, 32) marker/cell : (3, 3, 3) markers : (96, 96, 96) @@ -163,7 +163,7 @@ function ReadLaMEM_InputFile(file) coord_y = ParseValue_LaMEM_InputFile(file,"coord_y",Float64); coord_z = ParseValue_LaMEM_InputFile(file,"coord_z",Float64); - # compute infromation from file + # compute information from file W = coord_x[end]-coord_x[1]; L = coord_y[end]-coord_y[1]; H = coord_z[end]-coord_z[1]; @@ -189,14 +189,14 @@ function ReadLaMEM_InputFile(file) zn = coord_z[1] : dz : coord_z[end]; # node grid - Xn,Yn,Zn = XYZGrid(xn, yn, zn); + Xn,Yn,Zn = XYZGrid(xn, yn, zn); # marker spacing dx = W / nump_x; dy = L / nump_y; dz = H / nump_z; - # marker coordinate vectors + # marker coordinate vectors x = coord_x[1]+dx/2 : dx : coord_x[end]-dx/2; y = coord_y[1]+dy/2 : dy : coord_y[end]-dy/2; z = coord_z[1]+dz/2 : dz : coord_z[end]-dz/2; @@ -219,7 +219,7 @@ function ReadLaMEM_InputFile(file) # make coordinate vectors xn = make1DCoords(nseg_x, nel_x, coord_x, bias_x); yn = make1DCoords(nseg_y, nel_y, coord_y, bias_y); - zn = make1DCoords(nseg_z, nel_z, coord_z, bias_z); + zn = make1DCoords(nseg_z, nel_z, coord_z, bias_z); # node grid Xn,Yn,Zn = XYZGrid(xn, yn, zn); @@ -237,7 +237,7 @@ function ReadLaMEM_InputFile(file) # finish Grid Grid = LaMEM_grid( nmark_x, nmark_y, nmark_z, nump_x, nump_y, nump_z, - nel_x_tot, nel_y_tot, nel_z_tot, + nel_x_tot, nel_y_tot, nel_z_tot, W, L, H, coord_x, coord_y, coord_z, x, y, z, @@ -335,12 +335,12 @@ end """ Save_LaMEMMarkersParallel(Grid::CartData; PartitioningFile=empty, directory="./markers", verbose=true) -Saves a LaMEM marker file from the `CartData` structure `Grid`. It must have a field called `Phases`, holding phase information (as integers) and optionally a field `Temp` with temperature info. +Saves a LaMEM marker file from the `CartData` structure `Grid`. 
It must have a field called `Phases`, holding phase information (as integers) and optionally a field `Temp` with temperature info. It is possible to provide a LaMEM partitioning file `PartitioningFile`. If not, output is assumed to be for one processor. The size of `Grid` should be consistent with what is provided in the LaMEM input file. In practice, the size of the mesh can be retrieved from a LaMEM input file using `ReadLaMEM_InputFile`. -# Example +# Example ``` julia> Grid = ReadLaMEM_InputFile("LaMEM_input_file.dat") @@ -349,7 +349,7 @@ julia> Temp = ones(Float64,size(Grid.X)); julia> Model3D = CartData(Grid, (Phases=Phases,Temp=Temp)) julia> Save_LaMEMMarkersParallel(Model3D) Writing LaMEM marker file -> ./markers/mdb.00000000.dat -``` +``` If you want to create a LaMEM input file for multiple processors: ``` julia> Save_LaMEMMarkersParallel(Model3D, PartitioningFile="ProcessorPartitioning_4cpu_1.2.2.bin") @@ -365,22 +365,22 @@ function Save_LaMEMMarkersParallel(Grid::CartData; PartitioningFile=empty, direc x = ustrip.(Grid.x.val[:,1,1]); y = ustrip.(Grid.y.val[1,:,1]); z = ustrip.(Grid.z.val[1,1,:]); - + if haskey(Grid.fields,:Phases) - Phases = Grid.fields[:Phases]; + Phases = Grid.fields[:Phases]; else error("You must provide the field :Phases in the structure") end - + if haskey(Grid.fields,:Temp) - Temp = Grid.fields[:Temp]; + Temp = Grid.fields[:Temp]; else if verbose println("Field :Temp is not provided; setting it to zero") end Temp = zeros(size(Phases)); end - + if PartitioningFile==empty # in case we run this on 1 processor only Nprocx = 1; @@ -388,8 +388,8 @@ function Save_LaMEMMarkersParallel(Grid::CartData; PartitioningFile=empty, direc Nprocz = 1; xc,yc,zc = x,y,z; else - Nprocx,Nprocy,Nprocz, - xc,yc,zc, + Nprocx,Nprocy,Nprocz, + xc,yc,zc, nNodeX,nNodeY,nNodeZ = GetProcessorPartitioning(PartitioningFile) end @@ -410,7 +410,7 @@ function Save_LaMEMMarkersParallel(Grid::CartData; PartitioningFile=empty, direc # Loop over all processors partition 
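An aside on the per-processor loop that follows: the marker buffer interleaves five properties per particle (x, y, z, phase, T) into one flat vector via strided slices (`lvec_prtcls[1:num_prop:end] = part_x[:]`, and so on). A small Python sketch of the same packing, with made-up particle values:

```python
# Interleave five per-particle properties into one flat vector, mirroring
# the strided assignments in Save_LaMEMMarkersParallel. Values are made up.
num_prop = 5                       # [x, y, z, phase, T]
part_x, part_y, part_z = [1.0, 2.0], [3.0, 4.0], [5.0, 6.0]
phases, temps = [0.0, 1.0], [700.0, 800.0]
n = len(part_x)

lvec_prtcls = [0.0] * (num_prop * n)
lvec_prtcls[0::num_prop] = part_x  # Julia's 1:num_prop:end is Python's 0::num_prop
lvec_prtcls[1::num_prop] = part_y
lvec_prtcls[2::num_prop] = part_z
lvec_prtcls[3::num_prop] = phases
lvec_prtcls[4::num_prop] = temps
# all five properties of particle i now sit at lvec_prtcls[i*num_prop:(i+1)*num_prop]
```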
for n=1:Nproc # Extract coordinates for current processor - + part_x = ustrip.(Grid.x.val[x_start[n]:x_end[n],y_start[n]:y_end[n],z_start[n]:z_end[n]]); part_y = ustrip.(Grid.y.val[x_start[n]:x_end[n],y_start[n]:y_end[n],z_start[n]:z_end[n]]); part_z = ustrip.(Grid.z.val[x_start[n]:x_end[n],y_start[n]:y_end[n],z_start[n]:z_end[n]]); @@ -421,9 +421,9 @@ function Save_LaMEMMarkersParallel(Grid::CartData; PartitioningFile=empty, direc # Information vector per processor num_prop = 5; # number of properties we save [x/y/z/phase/T] lvec_info = num_particles; - + lvec_prtcls = zeros(Float64,num_prop*num_particles); - + lvec_prtcls[1:num_prop:end] = part_x[:]; lvec_prtcls[2:num_prop:end] = part_y[:]; lvec_prtcls[3:num_prop:end] = part_z[:]; @@ -474,7 +474,7 @@ function get_numscheme(Nprocx,Nprocy,Nprocz) nix = zeros(Int64, Nprocx*Nprocy*Nprocz) njy = zeros(Int64, Nprocx*Nprocy*Nprocz) nkz = zeros(Int64, Nprocx*Nprocy*Nprocz) - + num=0; for k=1:Nprocz for j=1:Nprocy @@ -487,7 +487,7 @@ function get_numscheme(Nprocx,Nprocy,Nprocz) end end end - + return n,nix,njy,nkz end @@ -508,9 +508,9 @@ function PetscBinaryWrite_Vec(filename, A) write(f,hton(Float64(1211214))); # header (not actually used) write(f,hton(Float64(nummark))); # info about # of markers written - + for i=2:n - write(f,hton(Float64(A[i]))); # Write data itself + write(f,hton(Float64(A[i]))); # Write data itself end end @@ -521,13 +521,13 @@ end """ nProcX,nProcY,nProcZ, xc,yc,zc, nNodeX,nNodeY,nNodeZ = GetProcessorPartitioning(filename) -Reads a LaMEM processor partitioning file, used to create marker files, and returns the parallel layout +Reads a LaMEM processor partitioning file, used to create marker files, and returns the parallel layout """ function GetProcessorPartitioning(filename) io = open(filename, "r") - + nProcX = ntoh(read(io,Int32)) nProcY = ntoh(read(io,Int32)) nProcZ = ntoh(read(io,Int32)) @@ -544,17 +544,17 @@ function GetProcessorPartitioning(filename) xcoor = [ntoh(read(io,Float64)) for 
i=1:nNodeX].*CharLength; ycoor = [ntoh(read(io,Float64)) for i=1:nNodeY].*CharLength; zcoor = [ntoh(read(io,Float64)) for i=1:nNodeZ].*CharLength; - + xc = xcoor[iX .+ 1] yc = ycoor[iY .+ 1] zc = zcoor[iZ .+ 1] close(io) - return nProcX,nProcY,nProcZ, - xc,yc,zc, + return nProcX,nProcY,nProcZ, + xc,yc,zc, nNodeX,nNodeY,nNodeZ - + end @@ -564,7 +564,7 @@ end coord, Data_3D_Arrays, Name_Vec = ReadData_VTR(fname) Reads a VTR (structured grid) VTK file `fname` and extracts the coordinates, data arrays and names of the data. -In general, this only contains a piece of the data, and one should open a `*.pvtr` file to retrieve the full data +In general, this only contains a piece of the data, and one should open a `*.pvtr` file to retrieve the full data """ function ReadData_VTR(fname, FullSize) file = open(fname, "r") @@ -578,7 +578,7 @@ function ReadData_VTR(fname, FullSize) while header==true line = readline(file) - line_strip = lstrip(line) + line_strip = lstrip(line) if startswith(line_strip, "") @@ -599,32 +599,32 @@ function ReadData_VTR(fname, FullSize) end if startswith(line_strip, "") - line_strip = lstrip(readline(file)) + line_strip = lstrip(readline(file)) while ~startswith(line_strip, "") Type, Name, NumberOfComponents, Offset = Parse_VTR_Line(line_strip); num += 1 Offset_Vec = [Offset_Vec; Offset]; - Name_Vec = [Name_Vec; Name]; - Type_Vec = [Type_Vec; Type]; + Name_Vec = [Name_Vec; Name]; + Type_Vec = [Type_Vec; Type]; NumComp_Vec = [NumComp_Vec; NumberOfComponents]; - line_strip = lstrip(readline(file)) - end + line_strip = lstrip(readline(file)) + end end if startswith(line_strip, "") - line_strip = lstrip(readline(file)) + line_strip = lstrip(readline(file)) while ~startswith(line_strip, "") Type, Name, NumberOfComponents, Offset = Parse_VTR_Line(line_strip); num += 1 - + Offset_Vec = [Offset_Vec; Offset]; - Name_Vec = [Name_Vec; Name]; - Type_Vec = [Type_Vec; Type]; + Name_Vec = [Name_Vec; Name]; + Type_Vec = [Type_Vec; Type]; NumComp_Vec = 
[NumComp_Vec; NumberOfComponents]; - line_strip = lstrip(readline(file)) + line_strip = lstrip(readline(file)) # if we have cell Data, for some reason we need to increment this by one. PieceExtent[1:2:end] .+= 1 - end - + end + end if startswith(line_strip, " Data = ReadData_PVTR("Haaksbergen.pvtr", "./Timestep_00000005_3.35780500e-01/") -ParaviewData +ParaviewData size : (33, 33, 33) x ϵ [ -3.0 : 3.0] y ϵ [ -2.0 : 2.0] @@ -796,13 +796,13 @@ function ReadData_PVTR(fname, dir) while header==true line = readline(file) - line_strip = lstrip(line) + line_strip = lstrip(line) if startswith(line_strip, " 1 || length(unique(trunc.(diff(Topo.y.val[1,:,1]), digits=8))) > 1 x1 = ustrip(Topo.x.val[end,1,1]); y1 = ustrip(Topo.y.val[1,end,1]); dx = (x1-x0) / (nx-1); dy = (y1-y0) / (ny-1); - + itp = LinearInterpolation((Topo.x.val[:,1,1], Topo.y.val[1,:,1]), ustrip.(Topo.fields.Topography[:,:,1])); Topo_itp = [itp(x,y) for x in x0:dx:x1, y in y0:dy:y1]; @@ -916,7 +916,7 @@ function Save_LaMEMTopography(Topo::CartData, filename::String) dy = ustrip(Topo.y.val[2,2,1]) - y0 # Code the topograhic data into a vector Topo_vec = [ nx;ny;x0;y0;dx;dy; ustrip.(Topo.fields.Topography[:])] - end + end # Write as PetscBinary file PetscBinaryWrite_Vec(filename, Topo_vec) @@ -937,15 +937,15 @@ Likewise for the `mpiexec` directory (if not specified it is assumed to be avail function CreatePartitioningFile(LaMEM_input::String,NumProc::Int64; LaMEM_dir::String=pwd(), LaMEM_options="", MPI_dir="") # Create string to execute LaMEM - mpi_str = MPI_dir*"mpiexec -n $(NumProc) " + mpi_str = MPI_dir*"mpiexec -n $(NumProc) " LaMEM_str = LaMEM_dir*"/"*"LaMEM -ParamFile "*LaMEM_input*" -mode save_grid " str = mpi_str*LaMEM_str - + println("Executing command: $str") - + # Run exit=run(`sh -c $str`, wait=false); - + # Retrieve newest file if success(exit) files=readdir(glob"ProcessorPartitioning_*.bin") @@ -955,8 +955,8 @@ function CreatePartitioningFile(LaMEM_input::String,NumProc::Int64; LaMEM_dir::S end 
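A note on the binary format used throughout this file: the `hton`/`ntoh` calls mean every value is written and read big-endian, which PETSc expects regardless of host byte order. A hedged Python sketch of the same round-trip (the header constant 1211214 is copied from `PetscBinaryWrite_Vec` above; the rest is illustrative):

```python
import struct

def write_vec_bigendian(values):
    # Header + count + payload, all as big-endian float64 (">d"),
    # matching the hton(Float64(...)) calls in PetscBinaryWrite_Vec.
    buf = struct.pack(">d", 1211214.0)            # header (not actually used)
    buf += struct.pack(">d", float(len(values)))  # number of values written
    for v in values:
        buf += struct.pack(">d", float(v))
    return buf

def read_vec_bigendian(buf):
    # ntoh equivalent: unpack big-endian float64s back into host order
    n = len(buf) // 8
    return list(struct.unpack(">" + "d" * n, buf))

vals = read_vec_bigendian(write_vec_bigendian([1.5, -2.0]))
# vals[0] is the header, vals[1] the count, the rest the payload
```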
id = findall(time_modified.==maximum(time_modified)) # last modified PartFile = files[id] - - println("Successfuly generated PartitioningFile: $(PartFile[1])") + + println("Successfully generated PartitioningFile: $(PartFile[1])") else error("Something went wrong with executing command ") end diff --git a/src/Paraview_output.jl b/src/Paraview_output.jl index cda6771c..a57ca728 100644 --- a/src/Paraview_output.jl +++ b/src/Paraview_output.jl @@ -10,32 +10,32 @@ export Write_Paraview, Movie_Paraview Writes a structure with `Geodata`` to a paraview (or VTK) file. If you have unstructured points (e.g., earthquake data), set `PointsData=true`. In case you want to create a movie in Paraview, and this is a timestep of that movie you also have to pass `time` and `pvd` -# Example 1: Write a 3D volume +# Example 1: Write a 3D volume ```julia-repl julia> Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,(-300:25:0)km); -julia> Data_set = GeoData(Lat,Lon,Depth,(Depthdata=Depth,LonData=Lon)) +julia> Data_set = GeoData(Lat,Lon,Depth,(Depthdata=Depth,LonData=Lon)) julia> Write_Paraview(Data_set, "test_depth3D") ``` # Example 2: Horizontal slice @ given depth ```julia-repl julia> Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,10km); -julia> Data_set = GeoData(Lat,Lon,Depth,(Topography=Depth,)) +julia> Data_set = GeoData(Lat,Lon,Depth,(Topography=Depth,)) julia> Write_Paraview(Data_set, "test") ``` # Example 3: Case with topography ```julia-repl julia> Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,10km); -julia> Depth[2:4,2:4,1] .= 25km -julia> Data_set = GeoData(Lat,Lon,Depth,(Topography=Depth,)) +julia> Depth[2:4,2:4,1] .= 25km +julia> Data_set = GeoData(Lat,Lon,Depth,(Topography=Depth,)) julia> Write_Paraview(Data_set, "test2") ``` # Example 4: Profile ```julia-repl julia> Lon,Lat,Depth = LonLatDepthGrid(10:20,35,(-300:25:0)km); -julia> Data_set = GeoData(Lat,Lon,Depth,(DataSet=Depth,Depth=Depth)) +julia> Data_set = GeoData(Lat,Lon,Depth,(DataSet=Depth,Depth=Depth)) julia> 
Write_Paraview(Data_set, "test") ``` @@ -44,12 +44,12 @@ julia> Write_Paraview(Data_set, "test") julia> Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,10km); julia> Ve, Vn, Vz = ones(size(Depth)), ones(size(Depth))*0.5, zeros(size(Depth)); julia> Data_set = GeoData(Lat,Lon,Depth,(DataSet=Depth, Velocity=(Ve,Vn,Vz))) -GeoData +GeoData size : (11, 11, 1) lon ϵ [ 30.0 - 40.0] lat ϵ [ 10.0 - 20.0] depth ϵ [ 10.0 km - 10.0 km] - fields: (:DataSet, :Velocity) + fields: (:DataSet, :Velocity) julia> Write_Paraview(Data_set, "test_Velocity") ``` @@ -58,7 +58,7 @@ Note that these points should be 1D vectors. ```julia-repl julia> Lon,Lat,Depth = LonLatDepthGrid(10:5:20,35:2:40,(-300:50:0)km); julia> Lon=Lon[:]; Lat=Lat[:]; Depth=Depth[:]; -julia> Data_set = GeoData(Lat,Lon,Depth,(DataSet=Depth[:],Depth=Depth*10)); +julia> Data_set = GeoData(Lat,Lon,Depth,(DataSet=Depth[:],Depth=Depth*10)); julia> Write_Paraview(Data_set, "test_Points", PointsData=true) ``` @@ -76,31 +76,31 @@ function Write_Paraview(DataSet::ParaviewData, filename="test"; PointsData=false filename = joinpath(directory, filename); # add directory name to pathname end - # Create VT* file - if PointsData + # Create VT* file + if PointsData # in case we write a dataset with unconnected points (e.g., GPS data, EQ locations etc.) 
npoints = length(DataSet.x) cells = [MeshCell(VTKCellTypes.VTK_VERTEX, (i, )) for i = 1:npoints] x = ustrip.(DataSet.x.val); x = x[:]; y = ustrip.(DataSet.y.val); y = y[:]; - z = ustrip.(DataSet.z.val); z = z[:]; + z = ustrip.(DataSet.z.val); z = z[:]; vtkfile = vtk_grid(filename, x,y,z, cells) else # for connected 3D grids, 2D planes or 1D lines - vtkfile = vtk_grid(filename, ustrip.(DataSet.x.val), ustrip.(DataSet.y.val), ustrip.(DataSet.z.val)) + vtkfile = vtk_grid(filename, ustrip.(DataSet.x.val), ustrip.(DataSet.y.val), ustrip.(DataSet.z.val)) end # Add data fields to VT* file names = String.(collect(keys(DataSet.fields))); # this is how to retrieve the names of the data fields for (index, name) in enumerate(names) - + if typeof(DataSet.fields[index])<: Tuple # if we do a tuple of velocities, it appears difficult to deal with units # This will require some more work unit_name = "" - Data = DataSet.fields[index] + Data = DataSet.fields[index] if unit(Data[1][1])!=NoUnits error("potential error as vector data fields have units; please save them with no units!") end @@ -108,19 +108,19 @@ function Write_Paraview(DataSet::ParaviewData, filename="test"; PointsData=false unit_name = unit(DataSet.fields[index][1]) Data = ustrip.(DataSet.fields[index]) end - + name_with_units = join([name," [$(unit_name)]"]); # add units to the name of the field - if PointsData - vtkfile[name_with_units, VTKPointData()] = Data[:]; + if PointsData + vtkfile[name_with_units, VTKPointData()] = Data[:]; else - vtkfile[name_with_units] = Data; + vtkfile[name_with_units] = Data; end end outfiles = vtk_save(vtkfile); println("Saved file: $(outfiles[1])") if !isnothing(pvd) - # Write movie + # Write movie pvd[time] = vtkfile end @@ -131,12 +131,12 @@ end Write_Paraview(DataSet::GeoData, filename::Any; PointsData=false, pvd=nothing, time=nothing, directory=nothing) = Write_Paraview(convert(ParaviewData,DataSet), filename, PointsData=PointsData, pvd=pvd, time=time, directory=directory); """ - 
Write_Paraview(DataSet::UTMData, filename::Any; PointsData=false, pvd=nothing, time=nothing, directory=nothing) + Write_Paraview(DataSet::UTMData, filename::Any; PointsData=false, pvd=nothing, time=nothing, directory=nothing) -Writes a `UTMData` structure to paraview. Note that this data is *not* transformed into an Earth-like framework, but remains cartesian instead. +Writes a `UTMData` structure to paraview. Note that this data is *not* transformed into an Earth-like framework, but remains cartesian instead. """ -function Write_Paraview(DataSet::UTMData, filename::Any; PointsData=false, pvd=nothing, time=nothing, directory=nothing) - +function Write_Paraview(DataSet::UTMData, filename::Any; PointsData=false, pvd=nothing, time=nothing, directory=nothing) + PVData = ParaviewData(DataSet.EW, DataSet.NS, DataSet.depth.val, DataSet.fields) outfiles = Write_Paraview(PVData, filename, PointsData=PointsData, pvd=pvd, time=time, directory=directory); @@ -144,12 +144,12 @@ function Write_Paraview(DataSet::UTMData, filename::Any; PointsData=false, pvd=n end """ - Write_Paraview(DataSet::CartData, filename::Any; PointsData=false, pvd=nothing, time=nothing, directory=nothing) + Write_Paraview(DataSet::CartData, filename::Any; PointsData=false, pvd=nothing, time=nothing, directory=nothing) -Writes a `CartData` structure to paraview. +Writes a `CartData` structure to paraview. 
""" -function Write_Paraview(DataSet::CartData, filename::Any; PointsData=false, pvd=nothing, time=nothing, directory=nothing) - +function Write_Paraview(DataSet::CartData, filename::Any; PointsData=false, pvd=nothing, time=nothing, directory=nothing) + PVData = ParaviewData(DataSet.x.val, DataSet.y.val, DataSet.z.val, DataSet.fields) outfiles = Write_Paraview(PVData, filename, PointsData=PointsData, pvd=pvd, time=time, directory=directory); @@ -162,7 +162,7 @@ end pvd = Movie_Paraview(; name="Movie", pvd=pvd, Finalize::Bool=false, Initialize::Bool=true) If you want to make a movie of your data set, you can use this routine to initialize and to finalize the movie-file. -It will create a `*.pvd` file, which you can open in Paraview +It will create a `*.pvd` file, which you can open in Paraview Individual timesteps are added to the movie by passing `pvd` and the time of the timestep to the `Write_Paraview` routine. @@ -182,15 +182,12 @@ Movie_Paraview(pvd=movie, Finalize=true) function Movie_Paraview(; name="Movie", pvd=nothing, Finalize::Bool=false, Initialize::Bool=true) if (Initialize) & !(Finalize) - pvd = paraview_collection(name) + pvd = paraview_collection(name) end if Finalize vtk_save(pvd) println("Saved PVD file") end - + return pvd end - - - \ No newline at end of file diff --git a/src/Setup_geometry.jl b/src/Setup_geometry.jl index af2092e4..92bdfb1f 100644 --- a/src/Setup_geometry.jl +++ b/src/Setup_geometry.jl @@ -1,17 +1,17 @@ using Base: Int64, Float64, NamedTuple using Printf using Parameters # helps setting default parameters in structures -using SpecialFunctions: erfc +using SpecialFunctions: erfc # Setup_geometry -# +# # These are routines that help to create input geometries, such as slabs with a given angle # export AddBox!, AddSphere!, AddEllipsoid!, AddCylinder!, makeVolcTopo, ConstantTemp, LinearTemp, HalfspaceCoolingTemp, SpreadingRateTemp, - ConstantPhase, LithosphericPhases, + ConstantPhase, LithosphericPhases, Compute_ThermalStructure, 
Compute_Phase @@ -35,17 +35,17 @@ Parameters - Origin - the origin, used to rotate the box around. Default is the left-front-top corner - StrikeAngle - strike angle of slab - DipAngle - dip angle of slab -- phase - specifies the phase of the box. See `ConstantPhase()`,`LithosphericPhases()` -- T - specifies the temperature of the box. See `ConstantTemp()`,`LinearTemp()`,`HalfspaceCoolingTemp()`,`SpreadingRateTemp()` +- phase - specifies the phase of the box. See `ConstantPhase()`,`LithosphericPhases()` +- T - specifies the temperature of the box. See `ConstantTemp()`,`LinearTemp()`,`HalfspaceCoolingTemp()`,`SpreadingRateTemp()` Examples ======== Example 1) Box with constant phase and temperature & a dip angle of 10 degrees: -```julia +```julia julia> Grid = ReadLaMEM_InputFile("test_files/SaltModels.dat") -LaMEM Grid: +LaMEM Grid: nel : (32, 32, 32) marker/cell : (3, 3, 3) markers : (96, 96, 96) @@ -56,38 +56,38 @@ julia> Phases = zeros(Int32, size(Grid.X)); julia> Temp = zeros(Float64, size(Grid.X)); julia> AddBox!(Phases,Temp,Grid, xlim=(0,500), zlim=(-50,0), phase=ConstantPhase(3), DipAngle=10, T=ConstantTemp(1000)) julia> Model3D = ParaviewData(Grid, (Phases=Phases,Temp=Temp)); # Create Cartesian model -julia> Write_Paraview(Model3D,"LaMEM_ModelSetup") # Save model to paraview +julia> Write_Paraview(Model3D,"LaMEM_ModelSetup") # Save model to paraview 1-element Vector{String}: - "LaMEM_ModelSetup.vts" + "LaMEM_ModelSetup.vts" ``` Example 2) Box with halfspace cooling profile -```julia +```julia julia> Grid = ReadLaMEM_InputFile("test_files/SaltModels.dat") julia> Phases = zeros(Int32, size(Grid.X)); julia> Temp = zeros(Float64, size(Grid.X)); julia> AddBox!(Phases,Temp,Grid, xlim=(0,500), zlim=(-50,0), phase=ConstantPhase(3), DipAngle=10, T=ConstantTemp(1000)) julia> Model3D = ParaviewData(Grid, (Phases=Phases,Temp=Temp)); # Create Cartesian model -julia> Write_Paraview(Model3D,"LaMEM_ModelSetup") # Save model to paraview +julia> 
Write_Paraview(Model3D,"LaMEM_ModelSetup") # Save model to paraview 1-element Vector{String}: - "LaMEM_ModelSetup.vts" + "LaMEM_ModelSetup.vts" ``` """ function AddBox!(Phase, Temp, Grid::AbstractGeneralGrid; # required input xlim=Tuple{2}, ylim=nothing, zlim=Tuple{2}, # limits of the box Origin=nothing, StrikeAngle=0, DipAngle=0, # origin & dip/strike phase = ConstantPhase(1), # Sets the phase number(s) in the box - T=nothing ) # Sets the thermal structure (various fucntions are available) - + T=nothing ) # Sets the thermal structure (various functions are available) + # Retrieve 3D data arrays for the grid X,Y,Z = coordinate_grids(Grid) - # Limits of block - if ylim==nothing - ylim = (minimum(Y), maximum(Y)) + # Limits of block + if ylim==nothing + ylim = (minimum(Y), maximum(Y)) end - - if Origin==nothing + + if Origin==nothing Origin = (xlim[1], ylim[1], zlim[2]) # upper-left corner end @@ -102,10 +102,10 @@ function AddBox!(Phase, Temp, Grid::AbstractGeneralGrid; # requi # Set phase number & thermal structure in the full domain ztop = zlim[2] - Origin[3] zbot = zlim[1] - Origin[3] - ind = findall( (Xrot .>= (xlim[1] - Origin[1])) .& (Xrot .<= (xlim[2] - Origin[1])) .& - (Yrot .>= (ylim[1] - Origin[2])) .& (Yrot .<= (ylim[2] - Origin[2])) .& + ind = findall( (Xrot .>= (xlim[1] - Origin[1])) .& (Xrot .<= (xlim[2] - Origin[1])) .& + (Yrot .>= (ylim[1] - Origin[2])) .& (Yrot .<= (ylim[2] - Origin[2])) .& (Zrot .>= zbot) .& (Zrot .<= ztop) ) - + # Compute thermal structure accordingly. See routines below for different options if T != nothing @@ -114,7 +114,7 @@ function AddBox!(Phase, Temp, Grid::AbstractGeneralGrid; # requi # Set the phase. Different routines are available for that - see below. 
Phase[ind] = Compute_Phase(Phase[ind], Temp[ind], Xrot[ind], Yrot[ind], Zrot[ind], phase) - + return nothing end @@ -133,17 +133,17 @@ Parameters - Grid - LaMEM grid structure (usually obtained with ReadLaMEM_InputFile) - cen - center coordinates of sphere - radius - radius of sphere -- phase - specifies the phase of the box. See `ConstantPhase()`,`LithosphericPhases()` -- T - specifies the temperature of the box. See `ConstantTemp()`,`LinearTemp()`,`HalfspaceCoolingTemp()`,`SpreadingRateTemp()` +- phase - specifies the phase of the box. See `ConstantPhase()`,`LithosphericPhases()` +- T - specifies the temperature of the box. See `ConstantTemp()`,`LinearTemp()`,`HalfspaceCoolingTemp()`,`SpreadingRateTemp()` Example ======== Sphere with constant phase and temperature: -```julia +```julia julia> Grid = ReadLaMEM_InputFile("test_files/SaltModels.dat") -LaMEM Grid: +LaMEM Grid: nel : (32, 32, 32) marker/cell : (3, 3, 3) markers : (96, 96, 96) @@ -154,16 +154,16 @@ julia> Phases = zeros(Int32, size(Grid.X)); julia> Temp = zeros(Float64, size(Grid.X)); julia> AddSphere!(Phases,Temp,Grid, cen=(0,0,-1), radius=0.5, phase=ConstantPhase(2), T=ConstantTemp(800)) julia> Model3D = ParaviewData(Grid, (Phases=Phases,Temp=Temp)); # Create Cartesian model -julia> Write_Paraview(Model3D,"LaMEM_ModelSetup") # Save model to paraview +julia> Write_Paraview(Model3D,"LaMEM_ModelSetup") # Save model to paraview 1-element Vector{String}: - "LaMEM_ModelSetup.vts" + "LaMEM_ModelSetup.vts" ``` """ function AddSphere!(Phase, Temp, Grid::AbstractGeneralGrid; # required input cen=Tuple{3}, radius=Tuple{1}, # center and radius of the sphere phase = ConstantPhase(1), # Sets the phase number(s) in the sphere - T=nothing ) # Sets the thermal structure (various fucntions are available) - + T=nothing ) # Sets the thermal structure (various functions are available) + # Retrieve 3D data arrays for the grid X,Y,Z = coordinate_grids(Grid) @@ -200,17 +200,17 @@ Parameters - Origin - the origin, used to 
rotate the box around. Default is the left-front-top corner - StrikeAngle - strike angle of slab - DipAngle - dip angle of slab -- phase - specifies the phase of the box. See `ConstantPhase()`,`LithosphericPhases()` -- T - specifies the temperature of the box. See `ConstantTemp()`,`LinearTemp()`,`HalfspaceCoolingTemp()`,`SpreadingRateTemp()` +- phase - specifies the phase of the box. See `ConstantPhase()`,`LithosphericPhases()` +- T - specifies the temperature of the box. See `ConstantTemp()`,`LinearTemp()`,`HalfspaceCoolingTemp()`,`SpreadingRateTemp()` Example ======== Ellipsoid with constant phase and temperature, rotated 90 degrees and tilted by 45 degrees: -```julia +```julia julia> Grid = ReadLaMEM_InputFile("test_files/SaltModels.dat") -LaMEM Grid: +LaMEM Grid: nel : (32, 32, 32) marker/cell : (3, 3, 3) markers : (96, 96, 96) @@ -221,18 +221,18 @@ julia> Phases = zeros(Int32, size(Grid.X)); julia> Temp = zeros(Float64, size(Grid.X)); julia> AddEllipsoid!(Phases,Temp,Grid, cen=(-1,-1,-1), axes=(0.2,0.1,0.5), StrikeAngle=90, DipAngle=45, phase=ConstantPhase(3), T=ConstantTemp(600)) julia> Model3D = ParaviewData(Grid, (Phases=Phases,Temp=Temp)); # Create Cartesian model -julia> Write_Paraview(Model3D,"LaMEM_ModelSetup") # Save model to paraview +julia> Write_Paraview(Model3D,"LaMEM_ModelSetup") # Save model to paraview 1-element Vector{String}: - "LaMEM_ModelSetup.vts" + "LaMEM_ModelSetup.vts" ``` """ function AddEllipsoid!(Phase, Temp, Grid::AbstractGeneralGrid; # required input cen=Tuple{3}, axes=Tuple{3}, # center and semi-axes of the ellpsoid Origin=nothing, StrikeAngle=0, DipAngle=0, # origin & dip/strike phase = ConstantPhase(1), # Sets the phase number(s) in the box - T=nothing ) # Sets the thermal structure (various fucntions are available) + T=nothing ) # Sets the thermal structure (various functions are available) - if Origin==nothing + if Origin==nothing Origin = cen # center end @@ -281,17 +281,17 @@ Parameters - base - center coordinate of bottom of 
cylinder - cap - center coordinate of top of cylinder - radius - radius of the cylinder -- phase - specifies the phase of the box. See `ConstantPhase()`,`LithosphericPhases()` -- T - specifies the temperature of the box. See `ConstantTemp()`,`LinearTemp()`,`HalfspaceCoolingTemp()`,`SpreadingRateTemp()` +- phase - specifies the phase of the box. See `ConstantPhase()`,`LithosphericPhases()` +- T - specifies the temperature of the box. See `ConstantTemp()`,`LinearTemp()`,`HalfspaceCoolingTemp()`,`SpreadingRateTemp()` Example ======== Cylinder with constant phase and temperature: -```julia +```julia julia> Grid = ReadLaMEM_InputFile("test_files/SaltModels.dat") -LaMEM Grid: +LaMEM Grid: nel : (32, 32, 32) marker/cell : (3, 3, 3) markers : (96, 96, 96) @@ -302,16 +302,16 @@ julia> Phases = zeros(Int32, size(Grid.X)); julia> Temp = zeros(Float64, size(Grid.X)); julia> AddCylinder!(Phases,Temp,Grid, base=(-1,-1,-1.5), cap=(1,1,-0.5), radius=0.25, phase=ConstantPhase(4), T=ConstantTemp(400)) julia> Model3D = ParaviewData(Grid, (Phases=Phases,Temp=Temp)); # Create Cartesian model -julia> Write_Paraview(Model3D,"LaMEM_ModelSetup") # Save model to paraview +julia> Write_Paraview(Model3D,"LaMEM_ModelSetup") # Save model to paraview 1-element Vector{String}: - "LaMEM_ModelSetup.vts" + "LaMEM_ModelSetup.vts" ``` """ function AddCylinder!(Phase, Temp, Grid::AbstractGeneralGrid; # required input base=Tuple{3}, cap=Tuple{3}, radius=Tuple{1}, # center and radius of the sphere phase = ConstantPhase(1), # Sets the phase number(s) in the sphere - T=nothing ) # Sets the thermal structure (various fucntions are available) - + T=nothing ) # Sets the thermal structure (various functions are available) + # axis vector of cylinder axVec = cap .- base ax2 = (axVec[1]^2 + axVec[2]^2 + axVec[3]^2) @@ -355,8 +355,8 @@ function Rot3D!(X,Y,Z, StrikeAngle, DipAngle) for i in eachindex(X) CoordVec = [X[i], Y[i], Z[i]] - CoordRot = rotz*CoordVec; - CoordRot = roty*CoordRot; + CoordRot = 
rotz*CoordVec; + CoordRot = roty*CoordRot; X[i] = CoordRot[1]; Y[i] = CoordRot[2]; Z[i] = CoordRot[3]; @@ -390,9 +390,9 @@ Example ======== Cylinder with constant phase and temperature: -```julia +```julia julia> Grid = ReadLaMEM_InputFile("test_files/SaltModels.dat") -LaMEM Grid: +LaMEM Grid: nel : (32, 32, 32) marker/cell : (3, 3, 3) markers : (96, 96, 96) @@ -400,7 +400,7 @@ LaMEM Grid: y ϵ [-2.0 : 2.0] z ϵ [-2.0 : 0.0] julia> Topo = makeVolcTopo(Grid, center=[0.0,0.0], height=0.4, radius=1.5, crater=0.5, base=0.1) -CartData +CartData size : (33, 33, 1) x ϵ [ -3.0 : 3.0] y ϵ [ -2.0 : 2.0] @@ -408,21 +408,21 @@ CartData fields : (:Topography,) attributes: ["note"] julia> Topo = makeVolcTopo(Grid, center=[0.0,0.0], height=0.8, radius=0.5, crater=0.0, base=0.4, background=Topo.fields.Topography) -CartData +CartData size : (33, 33, 1) x ϵ [ -3.0 : 3.0] y ϵ [ -2.0 : 2.0] z ϵ [ 0.1 : 0.8] fields : (:Topography,) attributes: ["note"] -julia> Write_Paraview(Topo,"VolcanoTopo") # Save topography to paraview -Saved file: VolcanoTopo.vts +julia> Write_Paraview(Topo,"VolcanoTopo") # Save topography to paraview +Saved file: VolcanoTopo.vts ``` """ -function makeVolcTopo(Grid::LaMEM_grid; - center::Array{Float64, 1}, - height::Float64, - radius::Float64, +function makeVolcTopo(Grid::LaMEM_grid; + center::Array{Float64, 1}, + height::Float64, + radius::Float64, crater=0.0, base=0.0, background=nothing) @@ -443,7 +443,7 @@ function makeVolcTopo(Grid::LaMEM_grid; # get radial distance from crater rim RD .-= crater - + # find position relative to crater rim dr = radius - crater pos = (-RD ./ dr .+ 1) @@ -456,7 +456,7 @@ function makeVolcTopo(Grid::LaMEM_grid; else background = nondimensionalize(background, CharUnits) if size(background) == size(X) - H .= background + H .= background elseif size(background) == size(reshape(X,nx,ny,1)) H .= background[:,:,1] else @@ -467,7 +467,7 @@ function makeVolcTopo(Grid::LaMEM_grid; H[ind] .= pos[ind] .* (height-base) .+ base ind = 
findall(x->x>= 1.0, pos) H[ind] .= height - + # dimensionalize Topo = dimensionalize(H, km, CharUnits) @@ -476,12 +476,12 @@ function makeVolcTopo(Grid::LaMEM_grid; end -abstract type AbstractThermalStructure end +abstract type AbstractThermalStructure end """ ConstantTemp(T=1000) - + Sets a constant temperature inside the box Parameters @@ -500,7 +500,7 @@ end """ LinearTemp(Ttop=0, Tbot=1000) - + Set a linear temperature structure from top to bottom Parameters @@ -526,7 +526,7 @@ end """ HalfspaceCoolingTemp(Tsurface=0, Tmantle=1350, Age=60, Adiabat=0) - + Sets a halfspace temperature structure in plate Parameters @@ -553,7 +553,7 @@ function Compute_ThermalStructure(Temp, X, Y, Z, s::HalfspaceCoolingTemp) ThermalAge = Age*1e6*SecYear; MantleAdiabaticT = Tmantle .+ Adiabat*abs.(Z); # Adiabatic temperature of mantle - + for i in eachindex(Temp) Temp[i] = (Tsurface .- Tmantle)*erfc((abs.(Z[i])*1e3)./(2*sqrt(kappa*ThermalAge))) + MantleAdiabaticT[i]; end @@ -563,7 +563,7 @@ end """ SpreadingRateTemp(Tsurface=0, Tmantle=1350, Adiabat=0, MORside="left",SpreadingVel=3, AgeRidge=0, maxAge=80) - + Sets a halfspace temperature structure within the box, combined with a spreading rate (which implies that the plate age varies) Parameters @@ -593,28 +593,28 @@ function Compute_ThermalStructure(Temp, X, Y, Z, s::SpreadingRateTemp) kappa = 1e-6; SecYear = 3600*24*365 dz = Z[end]-Z[1]; - + MantleAdiabaticT = Tmantle .+ Adiabat*abs.(Z); # Adiabatic temperature of mantle - + if MORside=="left" - Distance = X .- X[1,1,1]; + Distance = X .- X[1,1,1]; elseif MORside=="right" - Distance = X[end,1,1] .- X; + Distance = X[end,1,1] .- X; elseif MORside=="front" - Distance = Y .- Y[1,1,1]; + Distance = Y .- Y[1,1,1]; elseif MORside=="back" - Distance = Y[1,end,1] .- Y; + Distance = Y[1,end,1] .- Y; else error("unknown side") end - + for i in eachindex(Temp) ThermalAge = abs(Distance[i]*1e3*1e2)/SpreadingVel + AgeRidge*1e6; # Thermal age in years if ThermalAge>maxAge*1e6 ThermalAge = 
maxAge*1e6 end - + ThermalAge = ThermalAge*SecYear; Temp[i] = (Tsurface .- Tmantle)*erfc((abs.(Z[i])*1e3)./(2*sqrt(kappa*ThermalAge))) + MantleAdiabaticT[i]; @@ -624,12 +624,12 @@ end -abstract type AbstractPhaseNumber end +abstract type AbstractPhaseNumber end """ ConstantPhase(phase=1) - + Sets a constant phase inside the box Parameters @@ -649,7 +649,7 @@ end """ LithosphericPhases(Layers=[10 20 15], Phases=[1 2 3 4], Tlab=nothing ) - + This allows defining a layered lithosphere. Layering is defined from the top downwards. Parameters @@ -709,4 +709,4 @@ function Compute_Phase(Phase, Temp, X, Y, Z, s::LithosphericPhases; Ztop=0) end # allow AbstractGeneralGrid instead of Z and Ztop -Compute_Phase(Phase, Temp, Grid::LaMEM_grid, s::LithosphericPhases) = Compute_Phase(Phase, Temp, Grid.X, Grid.Y, Grid.Z, s::LithosphericPhases, Ztop=maximum(Grid.coord_z)) \ No newline at end of file +Compute_Phase(Phase, Temp, Grid::LaMEM_grid, s::LithosphericPhases) = Compute_Phase(Phase, Temp, Grid.X, Grid.Y, Grid.Z, s::LithosphericPhases, Ztop=maximum(Grid.coord_z)) diff --git a/src/Visualisation.jl b/src/Visualisation.jl index 162c9121..3800c3dd 100644 --- a/src/Visualisation.jl +++ b/src/Visualisation.jl @@ -15,7 +15,7 @@ Note that you may have to use `ProjectCartData` to project it to orthogonal cart function Visualise(Data; Topography=nothing, Topo_range=nothing) - axis_equal = false; # in case we use x/y/z data in km, this is useful + axis_equal = false; # in case we use x/y/z data in km, this is useful if isa(Data,GeoData) x = Data.lon.val[:,1,1]; xlab = "lon" @@ -23,7 +23,7 @@ function Visualise(Data; Topography=nothing, Topo_range=nothing) z = Data.depth.val[1,1,:]; zlab = "depth [km]" orthogonal = true; elseif isa(Data,CartData) - # Determine + # Determine x = Data.x.val[:,1,1]; xlab = "X [km]" y = Data.y.val[1,:,1]; ylab = "Y [km]" z = Data.z.val[1,1,:]; zlab = "Z [km]" @@ -39,7 +39,7 @@ function Visualise(Data; Topography=nothing, Topo_range=nothing) else error("not yet 
implemented ") end - + if !axis_equal x_vec = 1:length(x) y_vec = 1:length(y) @@ -53,31 +53,31 @@ function Visualise(Data; Topography=nothing, Topo_range=nothing) dx = (maximum(x) - minimum(x))/(length(x)-1); dy = (maximum(y) - minimum(y))/(length(y)-1); dz = (maximum(z) - minimum(z))/(length(z)-1); - + if !isnothing(Topography) if isa(Topography,GeoData) - x_topo = (Topography.lon.val .- x[1])/dx; - y_topo = (Topography.lat.val .- y[1])/dy; - z_topo = (Topography.depth.val .- z[1])/dz; + x_topo = (Topography.lon.val .- x[1])/dx; + y_topo = (Topography.lat.val .- y[1])/dy; + z_topo = (Topography.depth.val .- z[1])/dz; elseif isa(Topography,CartData) - x_topo = Topography.x.val; - y_topo = Topography.y.val; - z_topo = Topography.z.val; + x_topo = Topography.x.val; + y_topo = Topography.y.val; + z_topo = Topography.z.val; end if isnothing(Topo_range) - # base + # base topo_max = round(maximum(ustrip.(Topography.fields.Topography)),digits=1) Topo_range = (-topo_max, topo_max) end end - + data_names = keys(Data.fields) # Names of the fields data_selected = Observable(Symbol(data_names[1])) data_string = @lift String($data_selected) get_vol(f_name) = Data.fields[f_name] - vol = lift(get_vol, data_selected) + vol = lift(get_vol, data_selected) fig = Figure(resolution = (2000,2000), fontsize=20) ax = LScene(fig[1, 1:2], scenekw = (camera = cam3d!, raw = false)) @@ -93,10 +93,10 @@ function Visualise(Data; Topography=nothing, Topo_range=nothing) format = v -> string(round((v-1)*dz + z[1], digits = 2))), (label = "Transparency topo", range = 0:.01:1), ) - + # Create dropdown menus menu_dataset = Menu(fig, options = [String.(data_names)...], default=String(data_selected[])) - menu_colormap = Menu(fig, options = ["roma","romaO","vik","turku","davos","batlow","tab10","tab20","bone"], + menu_colormap = Menu(fig, options = ["roma","romaO","vik","turku","davos","batlow","tab10","tab20","bone"], default="roma") # Colorbar limits @@ -104,14 +104,14 @@ function Visualise(Data; 
Topography=nothing, Topo_range=nothing) cmax = @lift round(maximum($vol), digits=2) cmin_str = @lift string($cmin) cmax_str = @lift string($cmax) - + cmin_box = Textbox(fig, stored_string = cmin_str,width = nothing) cmax_box = Textbox(fig, stored_string = cmax_str,width = nothing) iso_level = Observable([1.7]) iso_alpha = Observable(0.5); iso_box = Textbox(fig, stored_string ="$(iso_level[][1])",width = nothing) - iso_toggle = Toggle(fig, active = true); + iso_toggle = Toggle(fig, active = true); iso_slide = Slider(fig, range = 0:.01:1) set_close_to!(iso_slide, iso_alpha[]) @@ -127,8 +127,8 @@ function Visualise(Data; Topography=nothing, Topo_range=nothing) lo = sgrid.layout nc = ncols(lo) - - # Note: volumeslices & GLMakie in general seems to have a bit of an issue with + + # Note: volumeslices & GLMakie in general seems to have a bit of an issue with # using real coordinates. In many cases the numerical values of lon/lat are much smaller than the depth values, # & not centered around zero. 
# @@ -165,7 +165,7 @@ function Visualise(Data; Topography=nothing, Topo_range=nothing) ylabel!(ax.scene, ylab) zlabel!(ax.scene, zlab) - # + # cb = Colorbar(fig[1, 3], plt, vertical = true, label=data_string, height = Relative(0.6)) # connect sliders to `volumeslices` update methods @@ -181,8 +181,8 @@ function Visualise(Data; Topography=nothing, Topo_range=nothing) set_close_to!(sl_xy, .5length(z_vec)) end set_close_to!(sl_alpha_topo, .5) - - # change color limits + + # change color limits on(cmin_box.stored_string) do s ra = plt[:colorrange] plt[:colorrange] = (parse(Float64,s), ra.val[2]) @@ -192,7 +192,7 @@ function Visualise(Data; Topography=nothing, Topo_range=nothing) plt[:colorrange] = (ra.val[1], parse(Float64,s)) end - # Change data + # Change data on(menu_dataset.selection) do s data_selected[] = Symbol(s) plt[:colorrange] = (cmin[], cmax[]) @@ -203,20 +203,20 @@ function Visualise(Data; Topography=nothing, Topo_range=nothing) set_close_to!(sl_yz, sl_yz.value[]) set_close_to!(sl_xz, sl_xz.value[]) set_close_to!(sl_xy, sl_xy.value[]) - + end - - # Change colormap + + # Change colormap on(menu_colormap.selection) do s plt.colormap = s end - # Create isosurface? + # Create isosurface? 
on(iso_toggle.active) do s iso.visible = s end on(iso_box.stored_string) do s - iso_level[] = [parse(Float64,s)] + iso_level[] = [parse(Float64,s)] end on(iso_slide.value) do v iso_alpha[] = v @@ -252,6 +252,6 @@ function Visualise(Data; Topography=nothing, Topo_range=nothing) display(fig) - + return nothing -end \ No newline at end of file +end diff --git a/src/coord_conversion.jl b/src/coord_conversion.jl index 20519b4f..8d60e30b 100644 --- a/src/coord_conversion.jl +++ b/src/coord_conversion.jl @@ -20,15 +20,15 @@ function ConvertGeo2Cart(inputdata::GeoData) depth = inputdata.depth.values R = rearth+inputdata.depth.values # radius from the center of the earth - + # compute Cartesian coordinates and assign them to a ValueList variable X = ValueList("x",inputdata.depth.unit,R.*cosd(inputdata.lon.values).*cosd(inputdata.lat.values)) Y = ValueList("y",inputdata.depth.unit,R.*sind(inputdata.lon.values).*cosd(inputdata.lat.values)) Z = ValueList("z",inputdata.depth.unit,R.*sind(inputdata.lat.values)) # assign all data to the respective struct - convdata = ParaviewData(X,Y,Z,inputdata.values) + convdata = ParaviewData(X,Y,Z,inputdata.values) return convdata -end \ No newline at end of file +end diff --git a/src/data_conversion.jl b/src/data_conversion.jl index 30acafc8..7bb5d261 100644 --- a/src/data_conversion.jl +++ b/src/data_conversion.jl @@ -1,20 +1,20 @@ """ -**data_conversion.jl** contains functions to convert imported data from e.g. CSV or netCDF files to the GeoData format -that is used internally to store all data related to a certain data set. For importing files, use standard methods, such -as CSV import using *readdlm* (see the [DelimitedFiles.jl](https://docs.julialang.org/en/v1/stdlib/DelimitedFiles/) package) -or netCDF import (see the [NetCDF.jl](https://github.com/JuliaGeo/NetCDF.jl) package). +**data_conversion.jl** contains functions to convert imported data from e.g. 
CSV or netCDF files to the GeoData format +that is used internally to store all data related to a certain data set. For importing files, use standard methods, such +as CSV import using *readdlm* (see the [DelimitedFiles.jl](https://docs.julialang.org/en/v1/stdlib/DelimitedFiles/) package) +or netCDF import (see the [NetCDF.jl](https://github.com/JuliaGeo/NetCDF.jl) package). """ """ -Using *readdlm* on CSV files will provide two output arguments, one containing the header as a Array{AbstractString, 2} and +Using *readdlm* on CSV files will provide two output arguments, one containing the header as a Array{AbstractString, 2} and the other one containing the data as Array{Float64, 2}. """ ############ CONVERT DLM DATA TO GEO DATA function DLM2Geo(hdr::Array{AbstractString, 2},data::Array{Float64, 2},DepthCon::AbstractString) - + # initialize array of structures to store the data - # while doing so, separate the unit from the variable name + # while doing so, separate the unit from the variable name ndata = size(data,1) # number of entries nfields = size(data,2) # number of fields @@ -39,7 +39,7 @@ function DLM2Geo(hdr::Array{AbstractString, 2},data::Array{Float64, 2},DepthCon: LatData = data[1:end,ifield] elseif occursin("depth",hdr[ifield]) # ISSUE: WE DEFINE DEPTH AS NEGATIVE, BUT HOW DO WE SET THAT? 
- # WE COULD ADD A FLAG THAT INDICATES THE DEPTH CONVENTION AND + # WE COULD ADD A FLAG THAT INDICATES THE DEPTH CONVENTION AND # TREAT IT ACCORDINGLY depth_ind = ifield; varname = GetVariableName(hdr[ifield])# get variable name @@ -86,7 +86,7 @@ function DLM2Geo(hdr::Array{AbstractString, 2},data::Array{Float64, 2},DepthCon: # take care of the header strings varname = GetVariableName(tmp_hdr[ihdr])# get variable name - varunit = GetVariableUnit(tmp_hdr[ihdr])# get variable unit + varunit = GetVariableUnit(tmp_hdr[ihdr])# get variable unit if cmp(varunit,"%")==0 tmp_hdr[ihdr] = string(varname,"_percentage") else @@ -100,26 +100,26 @@ function DLM2Geo(hdr::Array{AbstractString, 2},data::Array{Float64, 2},DepthCon: hdr_tpl = Tuple(Symbol(x) for x in tmp_hdr) # convert header to tuple data_tpl = Tuple.(tmp_vec for i in size(tmp_vec,1)) # convert data to tuple tmp = NamedTuple{hdr_tpl}(data_tpl) - + println(typeof(tmp)) # initialize data structure importdata = GeoData(LonData,LatData,DepthData,tmp) # assign data to output - return importdata + return importdata end -########### REARRANGE DATA TO OBTAIN A 3D MATIX IF NECESSARY ########## +########### REARRANGE DATA TO OBTAIN A 3D MATRIX IF NECESSARY ########## function RearrangeToMatrix() end ########### CONVERT NETCDF DATA TO GEO DATA ######## """ -Converting netCDF data to GeoData is fairly easy. One can read in Data from a netCDF file using ncread("filename","variable") -(the contents of the netcdf file can be queried beforehand using ncinfo("filename")). Via *ncread*, one then reads in all the +Converting netCDF data to GeoData is fairly easy. One can read in Data from a netCDF file using ncread("filename","variable") +(the contents of the netcdf file can be queried beforehand using ncinfo("filename")). Via *ncread*, one then reads in all the desired variables. NetCDF2Geo then takes care of converting this data to GeoData. 
""" function NetCDF2Geo() @@ -136,7 +136,7 @@ function GetVariableName(inputstring::SubString{String}) indfirst = nothing iloop = 1 str2find = ["(","[","{"] - # find first occurence of one of the brackets + # find first occurrence of one of the brackets while isnothing(indfirst) indfirst = findfirst(str2find[iloop],inputstring) iloop = iloop + 1 @@ -144,7 +144,7 @@ function GetVariableName(inputstring::SubString{String}) break end end - + # either return the whole inputstring or only the part before the unit if isnothing(indfirst) return inputstring @@ -180,9 +180,3 @@ function GetVariableUnit(inputstring::SubString{String}) end end - - - - - - diff --git a/src/data_import.jl b/src/data_import.jl index 127ff75a..1c6e157a 100644 --- a/src/data_import.jl +++ b/src/data_import.jl @@ -1,6 +1,6 @@ # This is data_import.jl # -# This file contains functions to import different data types. +# This file contains functions to import different data types. # # Author: Marcel Thielmann, 05/2021 @@ -13,9 +13,9 @@ function ReadCSV_LatLon(filename::AbstractString,DepthCon::AbstractString) # import data from file with coordinates given in lat/lon/depth format and additional data given in additional columns # the idea here is to assign the data to a structure of the type GeoData which will then be used for further processing data,hdr = readdlm(filename,',', Float64,'\n'; header=true, skipblanks=true, comments=true, comment_char='#') - + # initialize array of structures to store the data - # while doing so, separate the unit from the variable name + # while doing so, separate the unit from the variable name ndata = size(data,1) # number of entries nfields = size(data,2) # number of fields @@ -40,7 +40,7 @@ function ReadCSV_LatLon(filename::AbstractString,DepthCon::AbstractString) LatData = data[1:end,ifield] elseif occursin("depth",hdr[ifield]) # ISSUE: WE DEFINE DEPTH AS NEGATIVE, BUT HOW DO WE SET THAT? 
- # WE COULD ADD A FLAG THAT INDICATES THE DEPTH CONVENTION AND + # WE COULD ADD A FLAG THAT INDICATES THE DEPTH CONVENTION AND # TREAT IT ACCORDINGLY depth_ind = ifield; varname = GetVariableName(hdr[ifield])# get variable name @@ -78,7 +78,7 @@ function ReadCSV_LatLon(filename::AbstractString,DepthCon::AbstractString) # take care of the header strings varname = GetVariableName(tmp_hdr[ihdr])# get variable name - varunit = GetVariableUnit(tmp_hdr[ihdr])# get variable unit + varunit = GetVariableUnit(tmp_hdr[ihdr])# get variable unit if cmp(varunit,"%")==0 tmp_hdr[ihdr] = string(varname,"_percentage") else @@ -92,14 +92,14 @@ function ReadCSV_LatLon(filename::AbstractString,DepthCon::AbstractString) hdr_tpl = Tuple(Symbol(x) for x in tmp_hdr) # convert header to tuple data_tpl = Tuple.(tmp_vec for i in size(tmp_vec,1)) # convert data to tuple tmp = NamedTuple{hdr_tpl}(data_tpl) - + println(typeof(tmp)) # initialize data structure importdata = GeoData(LonData,LatData,DepthData,tmp) # assign data to output - return importdata + return importdata end @@ -110,7 +110,7 @@ function GetVariableName(inputstring::SubString{String}) indfirst = nothing iloop = 1 str2find = ["(","[","{"] - # find first occurence of one of the brackets + # find first occurrence of one of the brackets while isnothing(indfirst) indfirst = findfirst(str2find[iloop],inputstring) iloop = iloop + 1 @@ -118,7 +118,7 @@ function GetVariableName(inputstring::SubString{String}) break end end - + # either return the whole inputstring or only the part before the unit if isnothing(indfirst) return inputstring @@ -173,7 +173,7 @@ function Screenshot_To_GeoData(filename::String, Corner_LowerLeft, Corner_UpperR img = load(filename) # load image # Define lon/lat/depth of lower left corner - + # try to determine if this is a horizontal profile or not if abs(Corner_UpperRight[3]-Corner_LowerLeft[3])>0.0 DepthProfile = true @@ -181,7 +181,7 @@ function Screenshot_To_GeoData(filename::String, Corner_LowerLeft, 
Corner_UpperR DepthProfile = false end - # We should be able to either define 4 corners or only 2 and reconstruct the other two from the + # We should be able to either define 4 corners or only 2 and reconstruct the other two from the if isnothing(Corner_LowerRight) || isnothing(Corner_UpperLeft) if DepthProfile Corner_LowerRight = (Corner_UpperRight[1], Corner_UpperRight[2], Corner_LowerLeft[3]) @@ -235,7 +235,7 @@ function Screenshot_To_GeoData(filename::String, Corner_LowerLeft, Corner_UpperR # i = 2; Corners_lat = [Corner_LowerLeft[i] Corner_LowerRight[i]; Corner_UpperLeft[i] Corner_UpperRight[i];] # i = 3; Corners_depth = [Corner_LowerLeft[i] Corner_LowerRight[i]; Corner_UpperLeft[i] Corner_UpperRight[i];] - + # Extract the colors from the grid img_RGB = convert.(RGB, img) # convert to RGB data @@ -263,7 +263,7 @@ function Screenshot_To_GeoData(filename::String, Corner_LowerLeft, Corner_UpperR Y = interp_linear_lat.(X_int, Y_int); Depth = interp_linear_depth.(X_int, Y_int); - # Transfer to 3D arrays (check if needed or not; if yes, redo error message in struct routin) + # Transfer to 3D arrays (check if needed or not; if yes, redo error message in struct routine) red = zeros(size(Depth)); red[:,:,1] = r; green = zeros(size(Depth)); green[:,:,1] = g; blue = zeros(size(Depth)); blue[:,:,1] = b; @@ -288,7 +288,7 @@ end Does the same as `Screenshot_To_GeoData`, but returns a `CartData` structure """ function Screenshot_To_CartData(filename::String, Corner_LowerLeft, Corner_UpperRight; Corner_LowerRight=nothing, Corner_UpperLeft=nothing) - + # first create a GeoData struct Data_Cart = Screenshot_To_GeoData(filename, Corner_LowerLeft, Corner_UpperRight; Corner_LowerRight=Corner_LowerRight, Corner_UpperLeft=Corner_UpperLeft, Cartesian=true) @@ -300,7 +300,7 @@ end """ Data = Screenshot_To_UTMData(filename::String, Corner_LowerLeft, Corner_UpperRight; Corner_LowerRight=nothing, Corner_UpperLeft=nothing, UTMzone::Int64=nothing, isnorth::Bool=true) -Does the same as 
`Screenshot_To_GeoData`, but returns for UTM data +Does the same as `Screenshot_To_GeoData`, but returns a `UTMData` structure Note that you have to specify the `UTMzone` and `isnorth` """ function Screenshot_To_UTMData(filename::String, Corner_LowerLeft, Corner_UpperRight; Corner_LowerRight=nothing, Corner_UpperLeft=nothing, UTMzone::Int64=nothing, isnorth::Bool=true) @@ -311,4 +311,3 @@ function Screenshot_To_UTMData(filename::String, Corner_LowerLeft, Corner_UpperR return Data_UTM end - diff --git a/src/data_types.jl b/src/data_types.jl index ac0c4f1d..dca5c633 100644 --- a/src/data_types.jl +++ b/src/data_types.jl @@ -7,7 +7,7 @@ export GeoData, ParaviewData, UTMData, CartData, LonLatDepthGrid, XYZGrid, Velocity_SphericalToCartesian!, Convert2UTMzone, Convert2CartData, ProjectionPoint, coordinate_grids, CreateCartGrid, CartGrid, flip - + """ struct ProjectionPoint @@ -22,7 +22,7 @@ export GeoData, ParaviewData, UTMData, CartData, Structure that holds the coordinates of a point that is used to project a data set from Lon/Lat to a Cartesian grid and vice-versa. 
""" struct ProjectionPoint - Lat :: Float64 + Lat :: Float64 Lon :: Float64 EW :: Float64 NS :: Float64 @@ -37,8 +37,8 @@ Defines a projection point used for map projections, by specifying latitude and """ function ProjectionPoint(; Lat=49.9929, Lon=8.2473) # Default = Mainz (center of universe) - x_lla = LLA(Lat, Lon, 0.0); # Lat/Lon/Alt of geodesy package - x_utmz = UTMZ(x_lla, wgs84) # UTMZ of + x_lla = LLA(Lat, Lon, 0.0); # Lat/Lon/Alt of geodesy package + x_utmz = UTMZ(x_lla, wgs84) # UTMZ of ProjectionPoint(Lat, Lon, x_utmz.x, x_utmz.y, Int64(x_utmz.zone), x_utmz.isnorth) end @@ -50,10 +50,10 @@ Defines a projection point used for map projections, by specifying UTM coordinat """ function ProjectionPoint(EW::Float64, NS::Float64, Zone::Int64, isnorth::Bool) - - x_utmz = UTMZ(EW,NS,0.0,Zone, isnorth) # UTMZ of - x_lla = LLA(x_utmz, wgs84); # Lat/Lon/Alt of geodesy package - + + x_utmz = UTMZ(EW,NS,0.0,Zone, isnorth) # UTMZ of + x_lla = LLA(x_utmz, wgs84); # Lat/Lon/Alt of geodesy package + ProjectionPoint(x_lla.lat, x_lla.lon, EW, NS, Zone, isnorth) end @@ -65,21 +65,21 @@ mutable struct ValueList values::Vector{Float64} end -""" +""" GeoData(lon::Any, lat:Any, depth::GeoUnit, fields::NamedTuple) - + Data structure that holds one or several fields with longitude, latitude and depth information. - `depth` can have units of meter, kilometer or be unitless; it will be converted to km. -- `fields` should ideally be a NamedTuple which allows you to specify the names of each of the fields. +- `fields` should ideally be a NamedTuple which allows you to specify the names of each of the fields. - In case you only pass one array we will convert it to a NamedTuple with default name. - A single field should be added as `(DataFieldName=Data,)` (don't forget the comma at the end). - Multiple fields can be added as well. `lon`,`lat`,`depth` should all have the same size as each of the `fields`. 
- In case you want to display a vector field in paraview, add it as a tuple: `(Velocity=(Veast,Vnorth,Vup), Veast=Veast, Vnorth=Vnorth, Vup=Vup)`; we automatically apply a vector transformation when transforming this to a `ParaviewData` structure from which we generate Paraview output. As this changes the magnitude of the arrows, you will no longer see the `[Veast,Vnorth,Vup]` components in Paraview which is why it is a good idea to store them as separate Fields. -- Yet, there is one exception: if the name of the 3-component field is `colors`, we do not apply this vector transformation as this field is regarded to contain RGB colors. +- Yet, there is one exception: if the name of the 3-component field is `colors`, we do not apply this vector transformation as this field is regarded to contain RGB colors. - `Lat`,`Lon`,`Depth` should have the same size as the `Data` array. The ordering of the arrays is important. If they are 3D arrays, as in the example below, we assume that the first dimension corresponds to `lon`, second dimension to `lat` and third dimension to `depth` (which should be in km). See below for an example. 
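The array-ordering convention spelled out above (first dimension varies `lon`, second `lat`, third `depth`) is easy to get wrong when building grids by hand. A minimal sketch of that layout, written here in plain Python for illustration (the helper name `lon_lat_depth_grid` is invented for this example and only mirrors what the package's grid constructors produce):

```python
def lon_lat_depth_grid(lon, lat, depth):
    """Build 3-D nested lists where dimension 1 varies lon,
    dimension 2 varies lat, and dimension 3 varies depth."""
    Lon = [[[lo for _ in depth] for _ in lat] for lo in lon]
    Lat = [[[la for _ in depth] for la in lat] for _ in lon]
    Dep = [[[d for d in depth] for _ in lat] for _ in lon]
    return Lon, Lat, Dep

# same values as the GeoData docstring example: 3 lon, 4 lat, 3 depth entries
Lon3D, Lat3D, Dep3D = lon_lat_depth_grid([11.0, 15.0, 19.0],
                                         [1.0, 4.0, 7.0, 10.0],
                                         [-20.0, -15.0, -10.0])
```

Each output has size `(3, 4, 3)` — lon × lat × depth — which is the ordering the constructor checks for.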
-# Example +# Example ```julia-repl julia> Lat = 1.0:3:10.0; julia> Lon = 11.0:4:20.0; @@ -107,12 +107,12 @@ julia> Lat3D 1.0 4.0 7.0 10.0 1.0 4.0 7.0 10.0 1.0 4.0 7.0 10.0 - + [:, :, 2] = 1.0 4.0 7.0 10.0 1.0 4.0 7.0 10.0 1.0 4.0 7.0 10.0 - + [:, :, 3] = 1.0 4.0 7.0 10.0 1.0 4.0 7.0 10.0 @@ -123,19 +123,19 @@ julia> Depth3D -20.0 km -20.0 km -20.0 km -20.0 km -20.0 km -20.0 km -20.0 km -20.0 km -20.0 km -20.0 km -20.0 km -20.0 km - + [:, :, 2] = -15.0 km -15.0 km -15.0 km -15.0 km -15.0 km -15.0 km -15.0 km -15.0 km -15.0 km -15.0 km -15.0 km -15.0 km - + [:, :, 3] = -10.0 km -10.0 km -10.0 km -10.0 km -10.0 km -10.0 km -10.0 km -10.0 km -10.0 km -10.0 km -10.0 km -10.0 km julia> Data = zeros(size(Lon3D)); -julia> Data_set = GeophysicalModelGenerator.GeoData(Lon3D,Lat3D,Depth3D,(DataFieldName=Data,)) -GeoData +julia> Data_set = GeophysicalModelGenerator.GeoData(Lon3D,Lat3D,Depth3D,(DataFieldName=Data,)) +GeoData size : (3, 4, 3) lon ϵ [ 11.0 : 19.0] lat ϵ [ 1.0 : 10.0] @@ -146,16 +146,16 @@ GeoData """ struct GeoData <: AbstractGeneralGrid lon :: GeoUnit - lat :: GeoUnit + lat :: GeoUnit depth :: GeoUnit fields :: NamedTuple atts :: Dict - + # Ensure that the data is of the correct format function GeoData(lon,lat,depth,fields,atts=nothing) - + # check depth & convert it to units of km in case no units are given or it has different length units - if unit.(depth[1])==NoUnits + if unit.(depth[1])==NoUnits depth = depth*km # in case depth has no dimensions end depth = uconvert.(km,depth) # convert to km @@ -168,7 +168,7 @@ struct GeoData <: AbstractGeneralGrid if isa(lon, StepRangeLen) lon = Vector(lon); end - + # Check ordering of the arrays in case of 3D if sum(size(lon).>1)==3 if maximum(abs.(diff(lon,dims=2)))>maximum(abs.(diff(lon,dims=1))) || maximum(abs.(diff(lon,dims=3)))>maximum(abs.(diff(lon,dims=1))) @@ -197,10 +197,10 @@ struct GeoData <: AbstractGeneralGrid DataField = DataField[1]; # in case we have velocity vectors as input end - if 
!(size(lon)==size(lat)==size(depth)==size(DataField)) + if !(size(lon)==size(lat)==size(depth)==size(DataField)) error("The size of Lon/Lat/Depth and the Fields should all be the same!") end - + if isnothing(atts) # if nothing is given as attributes, then we note that in GeoData atts = Dict("note" => "No attributes were given to this dataset") @@ -209,8 +209,8 @@ struct GeoData <: AbstractGeneralGrid if !(typeof(atts)<: Dict) error("Attributes should be given as Dict!") end - end - + end + return new(lon,lat,depth,fields,atts) end @@ -239,7 +239,7 @@ Cartesian data in `x/y/z` coordinates to be used with Paraview. This is usually generated automatically from the `GeoData` structure, but you can also do this manually: ```julia-repl -julia> Data_set = GeophysicalModelGenerator.GeoData(1.0:10.0,11.0:20.0,(-20:-11)*km,(DataFieldName=(-20:-11),)) +julia> Data_set = GeophysicalModelGenerator.GeoData(1.0:10.0,11.0:20.0,(-20:-11)*km,(DataFieldName=(-20:-11),)) julia> Data_cart = convert(ParaviewData, Data_set) ``` """ @@ -261,24 +261,24 @@ function Base.show(io::IO, d::ParaviewData) end # conversion function from GeoData -> ParaviewData -function Base.convert(::Type{ParaviewData}, d::GeoData) - +function Base.convert(::Type{ParaviewData}, d::GeoData) + # Utilize the Geodesy.jl package & use the Cartesian Earth-Centered-Earth-Fixed (ECEF) coordinate system lon = Array(ustrip.(d.lon.val)); lat = Array(ustrip.(d.lat.val)); LLA_Data = LLA.(lat,lon, Array(ustrip.(d.depth.val))*1000); # convert to LLA from Geodesy package X,Y,Z = zeros(size(lon)), zeros(size(lon)), zeros(size(lon)); - + # convert to cartesian ECEF reference frame. 
Note that we use kilometers and the wgs84 for i in eachindex(X) - data_xyz = ECEF(LLA_Data[i], wgs84) + data_xyz = ECEF(LLA_Data[i], wgs84) X[i] = data_xyz.x/1e3; Y[i] = data_xyz.y/1e3; Z[i] = data_xyz.z/1e3; end - - # This is the 'old' implementation, which does not employ a reference ellipsoid + + # This is the 'old' implementation, which does not employ a reference ellipsoid # X = R .* cosd.( lon ) .* cosd.( lat ); # Y = R .* sind.( lon ) .* cosd.( lat ); # Z = R .* sind.( lat ); @@ -289,7 +289,7 @@ function Base.convert(::Type{ParaviewData}, d::GeoData) if typeof(d.fields[i]) <: Tuple if length(d.fields[i]) == 3 # the tuple has length 3, which is therefore assumed to be a velocity vector - + # If the field name contains the string "color" we do not apply a vector transformation as it is supposed to contain RGB colors if !occursin("color", string(field_names[i])) println("Applying a vector transformation to field: $(field_names[i])") @@ -306,29 +306,29 @@ end -""" +""" UTMData(EW::Any, NS:Any, depth::GeoUnit, UTMZone::Int, NorthernHemisphere=true, fields::NamedTuple) - + Data structure that holds one or several fields with UTM coordinates (east-west), (north-south) and depth information. - `depth` can have units of meters, kilometer or be unitless; it will be converted to meters (as UTMZ is usually in meters) -- `fields` should ideally be a NamedTuple which allows you to specify the names of each of the fields. +- `fields` should ideally be a NamedTuple which allows you to specify the names of each of the fields. - In case you only pass one array we will convert it to a NamedTuple with default name. - A single field should be added as `(DataFieldName=Data,)` (don't forget the comma at the end). - Multiple fields can be added as well. 
- In case you want to display a vector field in paraview, add it as a tuple: `(Velocity=(Veast,Vnorth,Vup), Veast=Veast, Vnorth=Vnorth, Vup=Vup)`; we automatically apply a vector transformation when transforming this to a `ParaviewData` structure from which we generate Paraview output. As this changes the magnitude of the arrows, you will no longer see the `[Veast,Vnorth,Vup]` components in Paraview which is why it is a good idea to store them as separate Fields. -- Yet, there is one exception: if the name of the 3-component field is `colors`, we do not apply this vector transformation as this field is regarded to contain RGB colors. +- Yet, there is one exception: if the name of the 3-component field is `colors`, we do not apply this vector transformation as this field is regarded to contain RGB colors. - `Lat`,`Lon`,`Depth` should have the same size as the `Data` array. The ordering of the arrays is important. If they are 3D arrays, as in the example below, we assume that the first dimension corresponds to `lon`, second dimension to `lat` and third dimension to `depth` (which should be in km). See below for an example. 
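For intuition about the `GeoData → ParaviewData` conversion earlier in this diff: the code uses Geodesy.jl's WGS84 ECEF transform, but keeps a commented-out simpler spherical variant (`X = R .* cosd.(lon) .* cosd.(lat)`, etc.). That simplified, non-ellipsoidal version can be sketched as follows — plain Python for illustration, with a 6371 km mean Earth radius and the function name both being assumptions of this sketch (the real code uses the WGS84 ellipsoid):

```python
import math

R_EARTH_KM = 6371.0  # assumed mean Earth radius; the package uses the WGS84 ellipsoid

def spherical_to_cartesian(lon_deg, lat_deg, depth_km):
    """Simplified spherical version of the lon/lat/depth -> x/y/z conversion,
    cf. the commented-out lines in Base.convert(::Type{ParaviewData}, d::GeoData)."""
    r = R_EARTH_KM + depth_km               # depth is negative below the surface
    lon = math.radians(lon_deg)
    lat = math.radians(lat_deg)
    x = r * math.cos(lon) * math.cos(lat)
    y = r * math.sin(lon) * math.cos(lat)
    z = r * math.sin(lat)
    return x, y, z

# a surface point at lon = lat = 0 lies on the positive x-axis
x, y, z = spherical_to_cartesian(0.0, 0.0, 0.0)
```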
-# Example +# Example ```julia-repl julia> ew = 422123.0:100:433623.0 julia> ns = 4.514137e6:100:4.523637e6 julia> depth = -5400:250:600 julia> EW,NS,Depth = XYZGrid(ew, ns, depth); julia> Data = ustrip.(Depth); -julia> Data_set = UTMData(EW,NS,Depth,33, true, (FakeData=Data,Data2=Data.+1.)) -UTMData +julia> Data_set = UTMData(EW,NS,Depth,33, true, (FakeData=Data,Data2=Data.+1.)) +UTMData UTM zone : 33-33 North size : (116, 96, 25) EW ϵ [ 422123.0 : 433623.0] @@ -340,7 +340,7 @@ UTMData If you wish, you can convert this from `UTMData` to `GeoData` with ```julia-repl julia> Data_set1 = convert(GeoData, Data_set) -GeoData +GeoData size : (116, 96, 25) lon ϵ [ 14.075969111533457 : 14.213417764154963] lat ϵ [ 40.77452227533946 : 40.86110443583479] @@ -357,18 +357,18 @@ julia> Write_Paraview(Data_set1, "Data_set1") """ struct UTMData <: AbstractGeneralGrid EW :: GeoUnit - NS :: GeoUnit + NS :: GeoUnit depth :: GeoUnit zone :: Any northern :: Any - fields :: NamedTuple + fields :: NamedTuple atts :: Dict - + # Ensure that the data is of the correct format function UTMData(EW,NS,depth,zone,northern,fields,atts=nothing) - + # check depth & convert it to units of km in case no units are given or it has different length units - if unit.(depth)[1]==NoUnits + if unit.(depth)[1]==NoUnits depth = depth*m # in case depth has no dimensions end depth = uconvert.(m,depth) # convert to meters @@ -402,7 +402,7 @@ struct UTMData <: AbstractGeneralGrid DataField = DataField[1]; # in case we have velocity vectors as input end - if !(size(EW)==size(NS)==size(depth)==size(DataField)) + if !(size(EW)==size(NS)==size(depth)==size(DataField)) error("The size of EW/NS/Depth and the Fields should all be the same!") end @@ -410,7 +410,7 @@ struct UTMData <: AbstractGeneralGrid zone = ones(Int64,size(EW))*zone northern = ones(Bool,size(EW))*northern end - + # take care of attributes if isnothing(atts) # if nothing is given as attributes, then we note that in GeoData @@ -423,7 +423,7 @@ struct 
UTMData <: AbstractGeneralGrid end return new(EW,NS,depth,zone,northern, fields,atts) - + end end @@ -449,7 +449,7 @@ end """ Converts a `UTMData` structure to a `GeoData` structure """ -function Base.convert(::Type{GeoData}, d::UTMData) +function Base.convert(::Type{GeoData}, d::UTMData) Lat = zeros(size(d.EW)); Lon = zeros(size(d.EW)); @@ -463,7 +463,7 @@ function Base.convert(::Type{GeoData}, d::UTMData) Lat[i] = lla_i.lat Lon[i] = lon - end + end # handle the case where an old GeoData structure is converted if any( propertynames(d) .== :atts) @@ -484,7 +484,7 @@ end """ Converts a `GeoData` structure to a `UTMData` structure """ -function Base.convert(::Type{UTMData}, d::GeoData) +function Base.convert(::Type{UTMData}, d::GeoData) EW = zeros(size(d.lon)); NS = zeros(size(d.lon)); @@ -496,13 +496,13 @@ function Base.convert(::Type{UTMData}, d::GeoData) # Use functions of the Geodesy package to convert to LLA lla_i = LLA(d.lat.val[i],d.lon.val[i],Float64(ustrip.(d.depth.val[i])*1e3)) utmz_i = UTMZ(lla_i, wgs84) - + EW[i] = utmz_i.x NS[i] = utmz_i.y depth[i] = utmz_i.z zone[i] = utmz_i.zone; northern[i] = utmz_i.isnorth - end + end # handle the case where an old GeoData structure is converted if any( propertynames(d) .== :atts) @@ -522,11 +522,11 @@ end This flips the data in the structure in a certain dimension (default is z [3]) """ function flip(Data::GeoData, dimension=3) - + depth = reverse(Data.depth.val,dims=dimension)*Data.depth.unit # flip depth - lon = reverse(Data.lon.val,dims=dimension)*Data.lon.unit # flip - lat = reverse(Data.lat.val,dims=dimension)*Data.lat.unit # flip - + lon = reverse(Data.lon.val,dims=dimension)*Data.lon.unit # flip + lat = reverse(Data.lat.val,dims=dimension)*Data.lat.unit # flip + # flip fields fields = Data.fields; name_keys = keys(fields) @@ -540,20 +540,20 @@ end """ - Convert2UTMzone(d::GeoData, p::ProjectionPoint) + Convert2UTMzone(d::GeoData, p::ProjectionPoint) -Converts a `GeoData` structure to fixed UTM zone, around a 
given `ProjectionPoint` +Converts a `GeoData` structure to fixed UTM zone, around a given `ProjectionPoint` This is useful when using real data as input for a cartesian geodynamic model setup (such as in LaMEM). In that case, we need to project map coordinates to cartesian coordinates. One way to do this is by using UTM coordinates. Close to the `ProjectionPoint` the resulting coordinates will be rectilinear and distance in meters. The map distortion becomes larger the further you are away from the center. - + """ -function Convert2UTMzone(d::GeoData, proj::ProjectionPoint) +function Convert2UTMzone(d::GeoData, proj::ProjectionPoint) EW = zeros(size(d.lon)); NS = zeros(size(d.lon)); zone = zeros(Int64,size(d.lon)); northern = zeros(Bool,size(d.lon)); - trans = UTMfromLLA(proj.zone, proj.isnorth, wgs84) + trans = UTMfromLLA(proj.zone, proj.isnorth, wgs84) for i in eachindex(d.lon.val) # Use functions of the Geodesy package to convert to LLA @@ -564,7 +564,7 @@ function Convert2UTMzone(d::GeoData, proj::ProjectionPoint) NS[i] = utm_i.y zone[i] = proj.zone; northern[i] = proj.isnorth - end + end # handle the case where an old GeoData structure is converted if any( propertynames(d) .== :atts) @@ -579,21 +579,21 @@ end -""" +""" CartData(x::Any, y::Any, z::GeoUnit, fields::NamedTuple) - + Data structure that holds one or several fields with Cartesian x/y/z coordinates. Distances are in kilometers - `x`,`y`,`z` can have units of meters, kilometer or be unitless; they will be converted to kilometers -- `fields` should ideally be a NamedTuple which allows you to specify the names of each of the fields. +- `fields` should ideally be a NamedTuple which allows you to specify the names of each of the fields. - In case you only pass one array we will convert it to a NamedTuple with default name. - A single field should be added as `(DataFieldName=Data,)` (don't forget the comma at the end). - Multiple fields can be added as well. 
- In case you want to display a vector field in paraview, add it as a tuple: `(Velocity=(Veast,Vnorth,Vup), Veast=Veast, Vnorth=Vnorth, Vup=Vup)`; we automatically apply a vector transformation when transforming this to a `ParaviewData` structure from which we generate Paraview output. As this changes the magnitude of the arrows, you will no longer see the `[Veast,Vnorth,Vup]` components in Paraview, which is why it is a good idea to store them as separate fields. -- Yet, there is one exception: if the name of the 3-component field is `colors`, we do not apply this vector transformation as this field is regarded to contain RGB colors. +- Yet, there is one exception: if the name of the 3-component field is `colors`, we do not apply this vector transformation as this field is regarded to contain RGB colors. - `x`,`y`,`z` should have the same size as the `Data` array. The ordering of the arrays is important. If they are 3D arrays, as in the example below, we assume that the first dimension corresponds to `x`, second dimension to `y` and third dimension to `z` (which should be in km). See below for an example.
-# Example +# Example ```julia-repl julia> x = 0:2:10 julia> y = -5:5 @@ -601,7 +601,7 @@ julia> z = -10:2:2 julia> X,Y,Z = XYZGrid(x, y, z); julia> Data = Z julia> Data_set = CartData(X,Y,Z, (FakeData=Data,Data2=Data.+1.)) -CartData +CartData size : (6, 11, 7) x ϵ [ 0.0 km : 10.0 km] y ϵ [ -5.0 km : 5.0 km] @@ -620,7 +620,7 @@ julia> Write_Paraview(Data_set, "Data_set") If you wish, you can convert this to `GeoData` (which will simply convert the coordinates) ```julia-repl julia> Data_set1 = convert(GeoData, Data_set) -GeoData +GeoData size : (116, 96, 25) lon ϵ [ 14.075969111533457 : 14.213417764154963] lat ϵ [ 40.77452227533946 : 40.86110443583479] @@ -632,14 +632,14 @@ which would allow visualizing this in paraview in the usual manner: """ struct CartData <: AbstractGeneralGrid x :: GeoUnit - y :: GeoUnit + y :: GeoUnit z :: GeoUnit fields :: NamedTuple - atts :: Dict - + atts :: Dict + # Ensure that the data is of the correct format function CartData(x,y,z,fields,atts=nothing) - + # Check ordering of the arrays in case of 3D if sum(size(x).>1)==3 if maximum(abs.(diff(x,dims=2)))>maximum(abs.(diff(x,dims=1))) || maximum(abs.(diff(x,dims=3)))>maximum(abs.(diff(x,dims=1))) @@ -654,7 +654,7 @@ struct CartData <: AbstractGeneralGrid x = Convert!(x,km) y = Convert!(y,km) z = Convert!(z,km) - + # fields should be a NamedTuple.
In case we simply provide an array, lets transfer it accordingly if !(typeof(fields)<: NamedTuple) if (typeof(fields)<: Tuple) @@ -673,7 +673,7 @@ struct CartData <: AbstractGeneralGrid DataField = DataField[1]; # in case we have velocity vectors as input end - if !(size(x)==size(y)==size(z)==size(DataField)) + if !(size(x)==size(y)==size(z)==size(DataField)) error("The size of x/y/z and the Fields should all be the same!") end @@ -689,7 +689,7 @@ struct CartData <: AbstractGeneralGrid end return new(x,y,z,fields,atts) - + end end @@ -711,10 +711,10 @@ end CartData(xyz::Tuple{Array,Array,Array}) This creates a `CartData` struct if you have a Tuple with 3D coordinates as input. -# Example +# Example ```julia julia> data = CartData(XYZGrid(-10:10,-5:5,0)) -CartData +CartData size : (21, 11, 1) x ϵ [ -10.0 km : 10.0 km] y ϵ [ -5.0 km : 5.0 km] @@ -728,11 +728,11 @@ CartData(xyz::Tuple) = CartData(xyz[1],xyz[2],xyz[3],(Z=xyz[3],)) """ - Convert2UTMzone(d::CartData, proj::ProjectionPoint) + Convert2UTMzone(d::CartData, proj::ProjectionPoint) This transfers a `CartData` dataset to a `UTMData` dataset, that has a single UTM zone. 
The point around which we project is `ProjectionPoint` """ -function Convert2UTMzone(d::CartData, proj::ProjectionPoint) +function Convert2UTMzone(d::CartData, proj::ProjectionPoint) return UTMData(ustrip.(d.x.val).*1e3 .+ proj.EW,ustrip.(d.y.val).*1e3 .+ proj.NS, ustrip.(d.z.val).*1e3,proj.zone, proj.isnorth, d.fields, d.atts) @@ -743,7 +743,7 @@ end Convert2CartData(d::UTMData, proj::ProjectionPoint) Converts a `UTMData` structure to a `CartData` structure, which essentially transfers the dimensions to km """ -function Convert2CartData(d::UTMData, proj::ProjectionPoint) +function Convert2CartData(d::UTMData, proj::ProjectionPoint) # handle the case where an old structure is converted if any( propertynames(d) .== :atts) @@ -761,7 +761,7 @@ end Convert2CartData(d::GeoData, proj::ProjectionPoint) Converts a `GeoData` structure to a `CartData` structure, which essentially transfers the dimensions to km """ -function Convert2CartData(d::GeoData, proj::ProjectionPoint) +function Convert2CartData(d::GeoData, proj::ProjectionPoint) d_UTM = Convert2UTMzone(d,proj) return CartData( (ustrip.(d_UTM.EW.val) .- proj.EW)./1e3, (ustrip.(d_UTM.NS.val) .- proj.NS)./1e3, @@ -809,7 +809,7 @@ function LonLatDepthGrid(Lon::Any, Lat::Any, Depth::Any) if nLon==nLat==nDepth==1 error("Cannot use this routine for a 3D point (no need to create a grid in that case") - end + end if maximum([length(size(Lon)), length(size(Lat)), length(size(Depth))])>1 error("You can only give 1D vectors or numbers as input") end @@ -861,32 +861,32 @@ end In-place conversion of velocities in spherical velocities `[Veast, Vnorth, Vup]` to cartesian coordinates (for use in paraview). NOTE: the magnitude of the vector will be the same, but the individual `[Veast, Vnorth, Vup]` components -will not be retained correctly (as a different `[x,y,z]` coordinate system is used in paraview). +will not be retained correctly (as a different `[x,y,z]` coordinate system is used in paraview). 
Therefore, if you want to display or color that correctly in Paraview, you need to store these magnitudes as separate fields """ function Velocity_SphericalToCartesian!(Data::GeoData, Velocity::Tuple) - # Note: This is partly based on scripts originally written by Tobias Baumann, Uni Mainz + # Note: This is partly based on scripts originally written by Tobias Baumann, Uni Mainz for i in eachindex(Data.lat.val) az = Data.lon.val[i]; el = Data.lat.val[i]; R = [-sind(az) -sind(el)*cosd(az) cosd(el)*cosd(az); - cosd(az) -sind(el)*sind(az) cosd(el)*sind(az); + cosd(az) -sind(el)*sind(az) cosd(el)*sind(az); 0.0 cosd(el) sind(el) ]; - + V_sph = [Velocity[1][i]; Velocity[2][i]; Velocity[3][i] ]; - + # Normalize spherical velocity V_mag = sum(sqrt.(V_sph.^2)); # magnitude - V_norm = V_sph/V_mag + V_norm = V_sph/V_mag V_xyz_norm = R*V_norm; V_xyz = V_xyz_norm.*V_mag; # scale with magnitude - # in-place saving of rotated velocity - Velocity[1][i] = V_xyz[1]; + # in-place saving of rotated velocity + Velocity[1][i] = V_xyz[1]; Velocity[2][i] = V_xyz[2]; Velocity[3][i] = V_xyz[3]; end @@ -894,7 +894,7 @@ end # Internal function that converts arrays to a GeoUnit with certain units function Convert!(d,u) - if unit.(d)[1]==NoUnits + if unit.(d)[1]==NoUnits d = d*u # in case it has no dimensions end d = uconvert.(u,d) # convert to u @@ -953,18 +953,18 @@ struct CartGrid{FT, D} <: AbstractGeneralGrid N :: NTuple{D,Int} # Number of grid points in every direction Δ :: NTuple{D,FT} # (constant) spacing in every direction L :: NTuple{D,FT} # Domain size - min :: NTuple{D,FT} # start of the grid in every direction - max :: NTuple{D,FT} # end of the grid in every direction + min :: NTuple{D,FT} # start of the grid in every direction + max :: NTuple{D,FT} # end of the grid in every direction coord1D :: NTuple{D,StepRangeLen{FT}} # Tuple with 1D vectors in all directions coord1D_cen :: NTuple{D,StepRangeLen{FT}} # Tuple with 1D vectors of center points in all directions -end +end """ Grid = 
CreateCartGrid(; size=(), x = nothing, z = nothing, y = nothing, extent = nothing, CharDim = nothing) -Creates a 1D, 2D or 3D cartesian grid of given size. Grid can be created by defining the size and either the `extent` (length) of the grid in all directions, or by defining start & end points (`x`,`y`,`z`). +Creates a 1D, 2D or 3D cartesian grid of given size. Grid can be created by defining the size and either the `extent` (length) of the grid in all directions, or by defining start & end points (`x`,`y`,`z`). If you specify `CharDim` (a structure with characteristic dimensions created with `GeoParams.jl`), we will nondimensionalize the grd before creating the struct. Spacing is assumed to be constant in a given direction @@ -980,22 +980,22 @@ Note: since this is mostly for solid Earth geoscience applications, the second d A basic case with non-dimensional units: ```julia julia> Grid = CreateCartGrid(size=(10,20),x=(0.,10), z=(2.,10)) -Grid{Float64, 2} - size: (10, 20) - length: (10.0, 8.0) - domain: x ∈ [0.0, 10.0], z ∈ [2.0, 10.0] - grid spacing Δ: (1.1111111111111112, 0.42105263157894735) +Grid{Float64, 2} + size: (10, 20) + length: (10.0, 8.0) + domain: x ∈ [0.0, 10.0], z ∈ [2.0, 10.0] + grid spacing Δ: (1.1111111111111112, 0.42105263157894735) ``` An example with dimensional units: ```julia julia> CharDim = GEO_units() julia> Grid = CreateCartGrid(size=(10,20),x=(0.0km, 10km), z=(-20km, 10km), CharDim=CharDim) -CartGrid{Float64, 2} - size: (10, 20) - length: (0.01, 0.03) - domain: x ∈ [0.0, 0.01], z ∈ [-0.02, 0.01] - grid spacing Δ: (0.0011111111111111111, 0.0015789473684210528) +CartGrid{Float64, 2} + size: (10, 20) + length: (0.01, 0.03) + domain: x ∈ [0.0, 0.01], z ∈ [-0.02, 0.01] + grid spacing Δ: (0.0011111111111111111, 0.0015789473684210528) ``` @@ -1007,16 +1007,16 @@ function CreateCartGrid(; extent = nothing, CharDim = nothing ) - + if isa(size, Number) size = (size,) # transfer to tuple end if isa(extent, Number) - extent = (extent,) + extent = 
(extent,) end N = size - dim = length(N) - + dim = length(N) + # Specify domain by length in every direction if !isnothing(extent) x,y,z = nothing, nothing, nothing @@ -1043,13 +1043,13 @@ function CreateCartGrid(; L = (x[2] - x[1], y[2] - y[1], z[2] - z[1]) X₁= (x[1], y[1], z[1]) end - Xₙ = X₁ .+ L - Δ = L ./ (N .- 1) - - # nondimensionalize + Xₙ = X₁ .+ L + Δ = L ./ (N .- 1) + + # nondimensionalize if !isnothing(CharDim) X₁, Xₙ, Δ, L = GeoUnit.(X₁), GeoUnit.(Xₙ), GeoUnit.(Δ), GeoUnit.(L) - + X₁ = ntuple( i -> nondimensionalize(X₁[i], CharDim), dim) Xₙ = ntuple( i -> nondimensionalize(Xₙ[i], CharDim), dim) Δ = ntuple( i -> nondimensionalize(Δ[i], CharDim), dim) @@ -1058,18 +1058,18 @@ function CreateCartGrid(; X₁, Xₙ, Δ, L = NumValue.(X₁), NumValue.(Xₙ), NumValue.(Δ), NumValue.(L) end - # Generate 1D coordinate arrays of vertexes in all directions + # Generate 1D coordinate arrays of vertices in all directions coord1D=() for idim=1:dim coord1D = (coord1D..., range(X₁[idim], Xₙ[idim]; length = N[idim] )) end - + # Generate 1D coordinate arrays of centers in all directions coord1D_cen=() for idim=1:dim coord1D_cen = (coord1D_cen..., range(X₁[idim]+Δ[idim]/2, Xₙ[idim]-Δ[idim]/2; length = N[idim]-1 )) end - + ConstantΔ = true; return CartGrid(ConstantΔ,N,Δ,L,X₁,Xₙ,coord1D, coord1D_cen) @@ -1079,7 +1079,7 @@ end # view grid object function show(io::IO, g::CartGrid{FT, DIM}) where {FT, DIM} - + print(io, "CartGrid{$FT, $DIM} \n", " size: $(g.N) \n", " length: $(g.L) \n", @@ -1090,7 +1090,7 @@ end # nice printing of grid function domain_string(grid::CartGrid{FT, DIM}) where {FT, DIM} - + xₗ, xᵣ = grid.coord1D[1][1], grid.coord1D[1][end] if DIM>1 yₗ, yᵣ = grid.coord1D[2][1], grid.coord1D[2][end] @@ -1120,7 +1120,7 @@ function coordinate_grids(Data::CartGrid) end """ - Data = CartData(Grid::CartGrid, fields::NamedTuple; y_val=0.0) + Data = CartData(Grid::CartGrid, fields::NamedTuple; y_val=0.0) Returns a CartData set given a cartesian grid `Grid` and `fields` defined on that
grid. """ @@ -1136,8 +1136,8 @@ function CartData(Grid::CartGrid, fields::NamedTuple; y_val=0.0) dat = reshape(fields[ifield],Grid.N[1],1,Grid.N[2]); # reshape into 3D form fields = merge(fields, [names[ifield] => dat]) end - + end - + return CartData(X,Y,Z, fields) -end \ No newline at end of file +end diff --git a/src/stl.jl b/src/stl.jl index 45e69b06..97a5300e 100644 --- a/src/stl.jl +++ b/src/stl.jl @@ -1,12 +1,12 @@ # NOTE: this remains WIP, which is why it is not yet in the documentation -# These are routines that allow importing *.stl triangular surfaces and employ them +# These are routines that allow importing *.stl triangular surfaces and employ them using FileIO using GeometryBasics: TriangleP, Mesh, normals, PointMeta, coordinates using LinearAlgebra -# Warning: the TriangleIntersect dependency does not seem to work on different machines, as the developer did not add a version nunber.. -# That forces us to remove it here, and +# Warning: the TriangleIntersect dependency does not seem to work on different machines, as the developer did not add a version number.. 
+# That forces us to remove it here, and #using TriangleIntersect export Ray, Intersection, IntersectRayTriangle, load, TriangleNormal, Point, IntersectRayMesh, coordinates @@ -19,12 +19,12 @@ Base.convert(::Type{Triangle}, t::TriangleP) = Triangle(convert(Point,t[1]) Triangle(t::TriangleP) = convert(Triangle,t) -# Define a few routins to allow computing the intersection of a ray with a triangle +# Define a few routines to allow computing the intersection of a ray with a triangle #/(p::GeometryBasics.Point, n::Number) = GeometryBasics.Point(p.x/n, p.y/n, p.z/n) #unitize(p::GeometryBasics.Point) = p/(p*p) #= -# Intersection +# Intersection struct Intersection ip::Point # intersecting point id::Float64 # intersecting distance @@ -63,7 +63,7 @@ function IntersectRayTriangle(r::Ray, t::TriangleP, normal) v2v2 = dot(v2,v2) v1v2 = dot(v1,v2) t_denom = v1v2*v1v2 - v1v1*v2v2 - + denom = dot(normal,r.direction) denom == 0 && return no_intersection ri = normal*(t[1] - r.origin) / denom @@ -80,7 +80,7 @@ function IntersectRayTriangle(r::Ray, t::TriangleP, normal) t_intersection >= 1 && return no_intersection s_intersection + t_intersection >= 1 && return no_intersection return Intersection(t[1] + s_intersection*v1+t_intersection*v2, ri, true) - + end =# @@ -119,31 +119,31 @@ function STLToSurface(name::String, Xin, Yin, minZ) else X,Y = Xin, Yin; end - + Z = ones(size(X))*NaN - + minM = minimum.(mesh) maxM = maximum.(mesh) max_x = [maxM[i][1] for i=1:length(maxM)] min_x = [minM[i][1] for i=1:length(maxM)] max_y = [maxM[i][2] for i=1:length(maxM)] min_y = [minM[i][2] for i=1:length(maxM)] - + for i in eachindex(X) - + r_up = Ray(Point(X[i],Y[i],minZ),Point(0.0, 0.0, 1.0)) - - ind = findall( (X[i] .>= min_x) .& (X[i] .<= max_x) .& + + ind = findall( (X[i] .>= min_x) .& (X[i] .<= max_x) .& (Y[i] .>= min_y) .& (Y[i] .<= max_y) ) for iT in eachindex(ind) - + is = intersect(r_up, Triangle(mesh[ind[iT]])) - if is.is_intersection + if is.is_intersection Z[i] = is.ip.z end - + end end 
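The `IsInsideClosedSTL` routine further down in this file implements the generalized winding-number test. Independent of the mesh types used here, the underlying computation — summing the signed solid angle each triangle subtends at the query point (the Van Oosterom–Strackee formula) — can be sketched in plain Julia. The tetrahedron and all names below are illustrative, not part of the package:

```julia
using LinearAlgebra

# Signed solid angle subtended by triangle (v1, v2, v3) as seen from point p,
# via the Van Oosterom–Strackee formula
function solid_angle(p, v1, v2, v3)
    a, b, c = v1 .- p, v2 .- p, v3 .- p
    num = det(hcat(a, b, c))
    den = norm(a)*norm(b)*norm(c) + dot(a,b)*norm(c) + dot(b,c)*norm(a) + dot(c,a)*norm(b)
    return 2atan(num, den)
end

# Generalized winding number: ±1 for points inside a closed, consistently
# oriented surface, ~0 for points outside
winding_number(p, faces) = sum(solid_angle(p, f...) for f in faces) / (4π)

# A unit tetrahedron with outward-oriented faces serves as a toy closed "mesh"
A, B, C, D = [0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]
faces = [(B, C, D), (A, C, B), (A, B, D), (A, D, C)]
```

For a closed, consistently oriented surface the winding number is ±1 inside and 0 outside, which is the inside/outside criterion the package applies (up to its `eps` tolerance).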
@@ -153,7 +153,7 @@ function STLToSurface(name::String, Xin, Yin, minZ) X_out[:,:,1] = X; Y_out[:,:,1] = Y; Z_out[:,:,1] = Z; - + Data = ParaviewData(X_out,Y_out,Z_out,(depth=Z_out,)) return Data @@ -162,26 +162,26 @@ end =# -""" - inside = IsInsideClosedSTL(mesh::Mesh, Pt, eps=1e-3) +""" + inside = IsInsideClosedSTL(mesh::Mesh, Pt, eps=1e-3) Determine whether a point `Pt` is inside a 3D closed triangular `*.stl` surface or not. -This implements the winding number method, following the python code: +This implements the winding number method, following the python code: https://github.com/marmakoide/inside-3d-mesh This again is described in the following [paper](https://igl.ethz.ch/projects/winding-number/) by Alec Jacobson, Ladislav Kavan and Olga Sorkine-Hornung. """ -function IsInsideClosedSTL(mesh::Mesh, Pt::Vector, eps=1e-3) +function IsInsideClosedSTL(mesh::Mesh, Pt::Vector, eps=1e-3) # Compute triangle vertices and their norms relative to X - M_vec = [mesh.position[i]-Pt[:] for i in eachindex(mesh.position)]; + M_vec = [mesh.position[i]-Pt[:] for i in eachindex(mesh.position)]; M = zeros(length(M_vec),3); for i=1:length(M_vec) M[i,:] = Float64.(M_vec[i][1:3]); end M_norm = sqrt.(sum(M.^2,dims=2)) - + # Accumulate generalized winding number per triangle winding_number = 0. for iT=1:length(mesh) @@ -204,8 +204,7 @@ function IsInsideClosedSTL(mesh::Mesh, Pt::Vector, eps=1e-3) else isinside = false end - - + + return isinside end - diff --git a/src/transformation.jl b/src/transformation.jl index da79dbf7..607a061f 100644 --- a/src/transformation.jl +++ b/src/transformation.jl @@ -10,35 +10,35 @@ export ProjectCartData Projects all datafields from the GeoData struct `d` to the CartData struct `d_cart`, around the projection point `p`. `d_cart` *must* be an orthogonal cartesian grid (deformed doesn't work; use `Convert2CartData(d, proj)`, where `proj` is a projection point in that case). 
-# Note: -- If `d_cart` and `d` are horizontal surfaces (3rd dimension has size==1), it also interpolates the depth coordinate. +# Note: +- If `d_cart` and `d` are horizontal surfaces (3rd dimension has size==1), it also interpolates the depth coordinate. """ function ProjectCartData(d_cart::CartData, d::GeoData, p::ProjectionPoint) Data_UTM = Convert2UTMzone(d_cart, p) - Data_lonlat = convert(GeoData,Data_UTM) - + Data_lonlat = convert(GeoData,Data_UTM) + # Check whether the data sets have the same sign. If not, we may have to shift one by 360 degrees min_lon_cart, max_lon_cart = minimum(Data_lonlat.lon.val), maximum(Data_lonlat.lon.val) min_lon, max_lon = minimum(d.lon.val), maximum(d.lon.val) - + if (sign(min_lon)!=sign(min_lon_cart)) && (sign(max_lon)!=sign(max_lon_cart)) - # the longitude data has a different sign. This can happen if one of them is "West" (and has negative values), whereas the other has + # the longitude data has a different sign. This can happen if one of them is "West" (and has negative values), whereas the other has if (min_lon_cart<0) Data_lonlat = GeoData(Data_lonlat.lon.val .+ 360, Data_lonlat.lat.val, ustrip.(Data_lonlat.depth.val), Data_lonlat.fields) end end - + if size(Data_lonlat.lon.val,3)==1 z_new, fields_new = InterpolateDataFields2D(d,Data_lonlat.lon.val, Data_lonlat.lat.val) - - # Create new struct + + # Create new struct d_cart = CartData(d_cart.x.val,d_cart.y.val,z_new,fields_new) - + else - d_data = InterpolateDataFields(d, Data_lonlat.lon.val, Data_lonlat.lat.val, Data_lonlat.depth.val) + d_data = InterpolateDataFields(d, Data_lonlat.lon.val, Data_lonlat.lat.val, Data_lonlat.depth.val) d_cart = CartData(d_cart.x.val,d_cart.y.val,d_cart.z.val,d_data.fields) - + end return d_cart @@ -51,26 +51,26 @@ end Projects all datafields from the UTMData struct `d` to the CartData struct `d_cart`, around the projection point `p`. 
`d_cart` *must* be an orthogonal cartesian grid (deformed doesn't work; use `Convert2CartData(d, proj)`, where `proj` is a projection point in that case). - - # Note: - - If `d_cart` and `d` are horizontal surfaces (3rd dimension has size==1), it also interpolates the depth coordinate. - + + # Note: + - If `d_cart` and `d` are horizontal surfaces (3rd dimension has size==1), it also interpolates the depth coordinate. + """ function ProjectCartData(d_cart::CartData, d::UTMData, p::ProjectionPoint) Data_UTM = Convert2UTMzone(d_cart, p) - + if size(Data_UTM.EW.val,3)==1 z_new, fields_new = InterpolateDataFields2D(d,Data_UTM.EW.val, Data_UTM.NS.val) - - # Create new struct + + # Create new struct d_cart = CartData(d_cart.x.val,d_cart.y.val,z_new,fields_new) - + else - d_data = InterpolateDataFields(d, Data_UTM.EW.val, Data_UTM.NS.val, Data_UTM.depth.val) + d_data = InterpolateDataFields(d, Data_UTM.EW.val, Data_UTM.NS.val, Data_UTM.depth.val) d_cart = CartData(d_cart.x.val,d_cart.y.val,d_cart.z.val,d_data.fields) - + end return d_cart -end \ No newline at end of file +end diff --git a/src/utils.jl b/src/utils.jl index 4e746a35..048b5b52 100644 --- a/src/utils.jl +++ b/src/utils.jl @@ -1,4 +1,4 @@ -# few utils that are useful +# few utils that are useful export meshgrid, CrossSection, ExtractSubvolume, SubtractHorizontalMean export ParseColumns_CSV_File, AboveSurface, BelowSurface, VoteMap @@ -29,7 +29,7 @@ end """ CrossSection(Volume::GeoData; dims=(100,100), Interpolate=false, Depth_level=nothing, Lat_level=nothing, Lon_level=nothing, Start=nothing, End=nothing ) -Creates a cross-section through a volumetric (3D) `GeoData` object. +Creates a cross-section through a volumetric (3D) `GeoData` object. - Cross-sections can be horizontal (map view at a given depth), if `Depth_level` is specified - They can also be vertical, either by specifying `Lon_level` or `Lat_level` (for a fixed lon/lat), or by defining both `Start=(lon,lat)` & `End=(lon,lat)` points.
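For the `Start`/`End` case, the profile is conceptually a straight line in lon/lat sampled at `dims` points. A rough sketch of that sampling (the variable names here are illustrative, not the function's internals):

```julia
# Hypothetical end points and resolution for a vertical section
Start, End = (10.0, 30.0), (20.0, 40.0)
dims = (5, 3)                                 # (points along the profile, depth levels)

lon_p   = LinRange(Start[1], End[1], dims[1]) # longitude of each profile point
lat_p   = LinRange(Start[2], End[2], dims[1]) # latitude follows the same straight line
depth_p = LinRange(-300.0, 0.0, dims[2])      # depth levels of the section
```

Each (lon_p[i], lat_p[i], depth_p[j]) pair then becomes one point of the 2D section on which the data fields are interpolated.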
@@ -40,9 +40,9 @@ Creates a cross-section through a volumetric (3D) `GeoData` object. julia> Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,(-300:25:0)km); julia> Data = Depth*2; # some data julia> Vx,Vy,Vz = ustrip(Data*3),ustrip(Data*4),ustrip(Data*5); -julia> Data_set3D = GeoData(Lon,Lat,Depth,(Depthdata=Data,LonData=Lon, Velocity=(Vx,Vy,Vz))); -julia> Data_cross = CrossSection(Data_set3D, Depth_level=-100km) -GeoData +julia> Data_set3D = GeoData(Lon,Lat,Depth,(Depthdata=Data,LonData=Lon, Velocity=(Vx,Vy,Vz))); +julia> Data_cross = CrossSection(Data_set3D, Depth_level=-100km) +GeoData size : (11, 11, 1) lon ϵ [ 10.0 : 20.0] lat ϵ [ 30.0 : 40.0] @@ -58,7 +58,7 @@ function CrossSection(V::AbstractGeneralGrid; dims=(100,100), Interpolate=false, X,Y,Z = coordinate_grids(V) if !isnothing(Depth_level) # Horizontal slice - CheckBounds(Z, Depth_level) + CheckBounds(Z, Depth_level) if Interpolate Lon,Lat,Depth = LonLatDepthGrid( LinRange(minimum(X), maximum(X), dims[1]), LinRange(minimum(Y), maximum(Y), dims[2]), @@ -72,7 +72,7 @@ function CrossSection(V::AbstractGeneralGrid; dims=(100,100), Interpolate=false, end if !isnothing(Lat_level) # vertical slice @ given latitude - CheckBounds(Y, Lat_level) + CheckBounds(Y, Lat_level) if Interpolate Lon,Lat,Depth = LonLatDepthGrid( LinRange(minimum(X), maximum(X), dims[1]), Lat_level, @@ -86,8 +86,8 @@ function CrossSection(V::AbstractGeneralGrid; dims=(100,100), Interpolate=false, end if !isnothing(Lon_level) # vertical slice @ given longitude - CheckBounds(X, Lon_level) - if Interpolate + CheckBounds(X, Lon_level) + if Interpolate Lon,Lat,Depth = LonLatDepthGrid( Lon_level, LinRange(minimum(Y), maximum(Y), dims[1]), LinRange(minimum(Z), maximum(Z), dims[2])) @@ -117,17 +117,17 @@ function CrossSection(V::AbstractGeneralGrid; dims=(100,100), Interpolate=false, Lon = zeros(dims[1],dims[2],1) Lat = zeros(dims[1],dims[2],1) Depth = zeros(dims[1],dims[2],1)*Depth_p[1] - + # We need 3D matrixes for the paraview writing routine to know 
we are in 3D Lon[:,:,1] = Lon_p[:,1,:] Lat[:,:,1] = Lat_p[1,:,:] Depth[:,:,1] = Depth_p[1,:,:] - + end if Interpolate # Interpolate data on profile - DataProfile = InterpolateDataFields(V, Lon, Lat, Depth); + DataProfile = InterpolateDataFields(V, Lon, Lat, Depth); else # extract data (no interpolation) DataProfile = ExtractDataSets(V, iLon, iLat, iDepth); @@ -145,7 +145,7 @@ Extract or "cuts-out" a piece of a 2D or 3D GeoData set, defined by `Lon`, `Lat` This is useful if you are only interested in a part of a much larger data set. -- `Lon_level`,`Lat_level` and `Depth_level` should be tuples that indicate `(minimum_value, maximum_value)` along the respective direction. If not specified we use the full range. +- `Lon_level`,`Lat_level` and `Depth_level` should be tuples that indicate `(minimum_value, maximum_value)` along the respective direction. If not specified we use the full range. - By default, `Interpolate=false` and we find the closest indices within the data set (so your new data set will not go exactly from minimum to maximum). - Alternatively, if `Interpolate=true` we interpolate the data onto a new grid that has dimensions `dims`. This can be useful to compare data sets that are originally given in different resolutions.
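The closest-index selection used when `Interpolate=false` — the `argmin(abs.(...))` pattern that appears in the function body — can be sketched on a made-up 1D longitude axis:

```julia
lon = collect(10.0:1.0:20.0)   # a simple 1D longitude axis (illustrative)
Lon_level = (10.4, 12.2)       # requested (minimum_value, maximum_value)

# snap each requested bound to the nearest existing grid point, keep that index range
i_s = argmin(abs.(lon .- Lon_level[1]))
i_e = argmin(abs.(lon .- Lon_level[2]))
iLon = i_s:i_e
extracted = lon[iLon]
```

The extracted range runs over existing grid points (here 10.0 to 12.0), which is why the result does not go exactly from the requested minimum to maximum.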
@@ -155,14 +155,14 @@ julia> Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,(-300:25:0)km); julia> Data = Depth*2; # some data julia> Vx,Vy,Vz = ustrip(Data*3),ustrip(Data*4),ustrip(Data*5); julia> Data_set3D = GeoData(Lon,Lat,Depth,(Depthdata=Data,LonData=Lon, Velocity=(Vx,Vy,Vz))) -GeoData +GeoData size : (11, 11, 13) lon ϵ [ 10.0 : 20.0] lat ϵ [ 30.0 : 40.0] depth ϵ [ -300.0 km : 0.0 km] fields: (:Depthdata, :LonData, :Velocity) julia> Data_extracted = ExtractSubvolume(Data_set3D,Lon_level=(10,12),Lat_level=(35,40)) -GeoData +GeoData size : (3, 6, 13) lon ϵ [ 10.0 : 12.0] lat ϵ [ 35.0 : 40.0] @@ -175,7 +175,7 @@ By default it extracts the data points closest to the area defined by Lon_level/ Alternatively, you can also interpolate the data onto a new grid: ```julia julia> Data_extracted = ExtractSubvolume(Data_set3D,Lon_level=(10,12),Lat_level=(35,40), Interpolate=true, dims=(50,51,52)) -GeoData +GeoData size : (50, 51, 52) lon ϵ [ 10.0 : 12.0] lat ϵ [ 35.0 : 40.0] @@ -205,10 +205,10 @@ function ExtractSubvolume(V::GeoData; Interpolate=false, Lon_level=nothing, Lat_ # Don't interpolate i_s, i_e = argmin(abs.(V.lon.val[:,1,1] .- Lon_level[1])), argmin(abs.(V.lon.val[:,1,1] .- Lon_level[2])) iLon = i_s:i_e; - + i_s, i_e = argmin(abs.(V.lat.val[1,:,1] .- Lat_level[1])), argmin(abs.(V.lat.val[1,:,1] .- Lat_level[2])) iLat = i_s:i_e; - + i_s, i_e = argmin(abs.(V.depth.val[1,1,:] .- ustrip(Depth_level[1]))), argmin(abs.(V.depth.val[1,1,:] .- ustrip(Depth_level[2]))) step = 1; if i_emax_Data error("Outside bounds [$min_Data : $max_Data]; $Data_Cross") @@ -231,7 +231,7 @@ function CheckBounds(Data::GeoUnit, Data_Cross) end function CheckBounds(Data::AbstractArray, Data_Cross) - + min_Data, max_Data = NumValue(minimum(Data)), NumValue(maximum(Data)); if ustrip(Data_Cross) < min_Data || ustrip(Data_Cross)>max_Data error("Outside bounds [$min_Data : $max_Data]; $Data_Cross") @@ -274,7 +274,7 @@ function InterpolateDataFields(V::AbstractGeneralGrid, Lon, Lat, Depth) for i = 
1:length(V.fields) if typeof(V.fields[i]) <: Tuple # vector or anything that contains more than 1 field - data_tuple = fields_new[i] # we have a tuple (likely a vector field), so we have to loop + data_tuple = fields_new[i] # we have a tuple (likely a vector field), so we have to loop data_array = zeros(size(Lon,1),size(Lon,2),size(Lon,3),length(data_tuple)); # create a 3D array that holds the 2D interpolated values unit_array = zeros(size(data_array)); @@ -285,7 +285,7 @@ function InterpolateDataFields(V::AbstractGeneralGrid, Lon, Lat, Depth) else interpol = LinearInterpolation((Lon_vec, Lat_vec, Depth_vec), ustrip.(data_tuple[j]),extrapolation_bc = Flat()); # create interpolation object end - data_array[:,:,:,j] = interpol.(Lon, Lat, ustrip.(Depth)); + data_array[:,:,:,j] = interpol.(Lon, Lat, ustrip.(Depth)); end data_new = tuple([data_array[:,:,:,c] for c in 1:size(data_array,4)]...) # transform 3D matrix to tuple @@ -299,13 +299,13 @@ function InterpolateDataFields(V::AbstractGeneralGrid, Lon, Lat, Depth) end data_new = interpol.(Lon, Lat, ustrip.(Depth)); # interpolate data field end - - # replace the one + + # replace the one new_field = NamedTuple{(field_names[i],)}((data_new,)) # Create a tuple with same name fields_new = merge(fields_new, new_field); # replace the field in fields_new - + end - + # Create a GeoData struct with the newly interpolated fields if isa(V,GeoData) @@ -340,7 +340,7 @@ function InterpolateDataFields(V::UTMData, EW, NS, Depth) for i = 1:length(V.fields) if typeof(V.fields[i]) <: Tuple # vector or anything that contains more than 1 field - data_tuple = fields_new[i] # we have a tuple (likely a vector field), so we have to loop + data_tuple = fields_new[i] # we have a tuple (likely a vector field), so we have to loop data_array = zeros(size(EW,1),size(EW,2),size(EW,3),length(data_tuple)); # create a 3D array that holds the 2D interpolated values unit_array = zeros(size(data_array)); @@ -351,7 +351,7 @@ function 
InterpolateDataFields(V::UTMData, EW, NS, Depth) else interpol = LinearInterpolation((EW_vec, NS_vec, Depth_vec), ustrip.(data_tuple[j]),extrapolation_bc = Flat()); # create interpolation object end - data_array[:,:,:,j] = interpol.(EW, NS, Depth); + data_array[:,:,:,j] = interpol.(EW, NS, Depth); end data_new = tuple([data_array[:,:,:,c] for c in 1:size(data_array,4)]...) # transform 3D matrix to tuple @@ -365,13 +365,13 @@ function InterpolateDataFields(V::UTMData, EW, NS, Depth) end data_new = interpol.(EW, NS, Depth); # interpolate data field end - - # replace the one + + # replace the one new_field = NamedTuple{(field_names[i],)}((data_new,)) # Create a tuple with same name fields_new = merge(fields_new, new_field); # replace the field in fields_new - + end - + # Create a GeoData struct with the newly interpolated fields Data_profile = UTMData(EW, NS, Depth, fields_new); @@ -388,13 +388,13 @@ function InterpolateDataFields2D(V::GeoData, Lon, Lat) Lon_vec = V.lon.val[:,1,1]; Lat_vec = V.lat.val[1,:,1]; - + fields_new = V.fields; field_names = keys(fields_new); for i = 1:length(V.fields) if typeof(V.fields[i]) <: Tuple # vector or anything that contains more than 1 field - data_tuple = fields_new[i] # we have a tuple (likely a vector field), so we have to loop + data_tuple = fields_new[i] # we have a tuple (likely a vector field), so we have to loop data_array = zeros(size(Lon,1),size(Lon,2),size(Lon,3),length(data_tuple)); # create a 3D array that holds the 2D interpolated values unit_array = zeros(size(data_array)); @@ -404,7 +404,7 @@ function InterpolateDataFields2D(V::GeoData, Lon, Lat) else interpol = LinearInterpolation((Lon_vec, Lat_vec), ustrip.(data_tuple[j]),extrapolation_bc = Flat()); # create interpolation object end - data_array[:,:,1,j] = interpol.(Lon, Lat); + data_array[:,:,1,j] = interpol.(Lon, Lat); end if length(size(data_tuple[1]))==3 data_new = tuple([data_array[:,:,:,c] for c in 1:size(data_array,4)]...) 
# transform 3D matrix to tuple @@ -421,11 +421,11 @@ function InterpolateDataFields2D(V::GeoData, Lon, Lat) data_new = interpol.(Lon, Lat); # interpolate data field end - - # replace the one + + # replace the one new_field = NamedTuple{(field_names[i],)}((data_new,)) # Create a tuple with same name fields_new = merge(fields_new, new_field); # replace the field in fields_new - + end # Interpolate z-coordinate as well @@ -434,8 +434,8 @@ function InterpolateDataFields2D(V::GeoData, Lon, Lat) else interpol = LinearInterpolation((Lon_vec, Lat_vec), V.depth.val, extrapolation_bc = Flat()); # create interpolation object end - depth_new = interpol.(Lon, Lat); - + depth_new = interpol.(Lon, Lat); + # Create a GeoData struct with the newly interpolated fields # Data_profile = GeoData(Lon, Lat, Depth*0, fields_new); @@ -452,19 +452,19 @@ function InterpolateDataFields2D(V::UTMData, EW, NS) EW_vec = V.EW.val[:,1,1]; NS_vec = V.NS.val[1,:,1]; - + fields_new = V.fields; field_names = keys(fields_new); for i = 1:length(V.fields) if typeof(V.fields[i]) <: Tuple # vector or anything that contains more than 1 field - data_tuple = fields_new[i] # we have a tuple (likely a vector field), so we have to loop + data_tuple = fields_new[i] # we have a tuple (likely a vector field), so we have to loop data_array = zeros(size(EW,1),size(EW,2),size(EW,3),length(data_tuple)); # create a 3D array that holds the 2D interpolated values unit_array = zeros(size(data_array)); for j=1:length(data_tuple) interpol = LinearInterpolation((EW_vec, NS_vec), ustrip.(data_tuple[j]),extrapolation_bc = Flat()); # create interpolation object - data_array[:,:,1,j] = interpol.(EW, NS); + data_array[:,:,1,j] = interpol.(EW, NS); end data_new = tuple([data_array[:,:,1,c] for c in 1:size(data_array,4)]...) 
# transform 3D matrix to tuple @@ -478,11 +478,11 @@ function InterpolateDataFields2D(V::UTMData, EW, NS) data_new = interpol.(EW, NS); # interpolate data field end - - # replace the one + + # replace the one new_field = NamedTuple{(field_names[i],)}((data_new,)) # Create a tuple with same name fields_new = merge(fields_new, new_field); # replace the field in fields_new - + end # Interpolate z-coordinate as well @@ -491,8 +491,8 @@ function InterpolateDataFields2D(V::UTMData, EW, NS) else interpol = LinearInterpolation((EW_vec, NS_vec), V.depth.val, extrapolation_bc = Flat()); # create interpolation object end - depth_new = interpol.(EW, NS); - + depth_new = interpol.(EW, NS); + # Create a UTMData struct with the newly interpolated fields # Data_profile = UTMData(EW, NS, Depth*0, fields_new); @@ -508,21 +508,21 @@ Interpolates a 3D data set `V` on a surface defined by `Surf`. nex # Example ```julia julia> Data -ParaviewData +ParaviewData size : (33, 33, 33) x ϵ [ -3.0 : 3.0] y ϵ [ -2.0 : 2.0] z ϵ [ -2.0 : 0.0] fields: (:phase, :density, :visc_total, :visc_creep, :velocity, :pressure, :temperature, :dev_stress, :strain_rate, :j2_dev_stress, :j2_strain_rate, :plast_strain, :plast_dissip, :tot_displ, :yield, :moment_res, :cont_res) julia> surf -ParaviewData +ParaviewData size : (96, 96, 1) x ϵ [ -2.9671875 : 3.2671875] y ϵ [ -1.9791666666666667 : 1.9791666666666667] z ϵ [ -1.5353766679763794 : -0.69925457239151] fields: (:Depth,) julia> Surf_interp = InterpolateDataOnSurface(Data, surf) - ParaviewData + ParaviewData size : (96, 96, 1) x ϵ [ -2.9671875 : 3.2671875] y ϵ [ -1.9791666666666667 : 1.9791666666666667] @@ -531,7 +531,7 @@ julia> Surf_interp = InterpolateDataOnSurface(Data, surf) ``` """ function InterpolateDataOnSurface(V::ParaviewData, Surf::ParaviewData) - + # Create GeoData structure: V_geo = GeoData(V.x.val, V.y.val, V.z.val, V.fields) V_geo.depth.val = ustrip(V_geo.depth.val); @@ -553,7 +553,7 @@ end Interpolates a 3D data set `V` on a surface defined by 
`Surf` """ function InterpolateDataOnSurface(V::GeoData, Surf::GeoData) - + Surf_interp = InterpolateDataFields(V, Surf.lon.val, Surf.lat.val, Surf.depth.val) return Surf_interp @@ -570,7 +570,7 @@ function ExtractDataSets(V::AbstractGeneralGrid, iLon, iLat, iDepth) Lon = zeros(typeof(X[1]), length(iLon),length(iLat),length(iDepth)); Lat = zeros(typeof(Y[1]), length(iLon),length(iLat),length(iDepth)); Depth = zeros(typeof(Z[1]), length(iLon),length(iLat),length(iDepth)); - + iLo = 1:length(iLon); iLa = 1:length(iLat); iDe = 1:length(iDepth) @@ -583,13 +583,13 @@ function ExtractDataSets(V::AbstractGeneralGrid, iLon, iLat, iDepth) for i = 1:length(V.fields) if typeof(V.fields[i]) <: Tuple # vector or anything that contains more than 1 field - data_tuple = fields_new[i] # we have a tuple (likely a vector field), so we have to loop + data_tuple = fields_new[i] # we have a tuple (likely a vector field), so we have to loop data_array = zeros(typeof(data_tuple[1][1]),length(iLon),length(iLat),length(iDepth),length(data_tuple)); # create a 3D array that holds the 2D interpolated values unit_array = zeros(size(data_array)); for j=1:length(data_tuple) data_field = data_tuple[j]; - data_array[:,:,:,j] = data_field[iLon, iLat, iDepth]; + data_array[:,:,:,j] = data_field[iLon, iLat, iDepth]; end data_new = tuple([data_array[:,:,:,c] for c in 1:size(data_array,4)]...) 
# transform 4D matrix to tuple @@ -598,13 +598,13 @@ function ExtractDataSets(V::AbstractGeneralGrid, iLon, iLat, iDepth) data_new = zeros(typeof(V.fields[i][1]), length(iLon),length(iLat),length(iDepth)); data_new[iLo,iLa,iDe] = V.fields[i][iLon, iLat, iDepth] # interpolate data field end - - # replace the one + + # replace the one new_field = NamedTuple{(field_names[i],)}((data_new,)) # Create a tuple with same name fields_new = merge(fields_new, new_field); # replace the field in fields_new - + end - + # Create a GeoData struct with the newly interpolated fields if isa(V,GeoData) @@ -634,15 +634,15 @@ function SubtractHorizontalMean(V::AbstractArray{T, 3}; Percentage=false) where if Percentage V_sub = zeros(size(V)); # no units else - V_sub = zeros(typeof(V[1]), size(V)); + V_sub = zeros(typeof(V[1]), size(V)); end for iLayer = 1:NumLayers average = mean(filter(!isnan, vec(V[:,:,iLayer]))); - + if Percentage V_sub[:,:,iLayer] = ustrip(V[:,:,iLayer]) .- ustrip(average); - V_sub[:,:,iLayer] = V_sub[:,:,iLayer]./ustrip(average)*100.0; # the result is normalized + V_sub[:,:,iLayer] = V_sub[:,:,iLayer]./ustrip(average)*100.0; # the result is normalized else V_sub[:,:,iLayer] = V[:,:,iLayer] .- average; end @@ -667,15 +667,15 @@ function SubtractHorizontalMean(V::AbstractArray{T, 2}; Percentage=false) where if Percentage V_sub = zeros(size(V)); # no units else - V_sub = zeros(typeof(V[1]), size(V)); + V_sub = zeros(typeof(V[1]), size(V)); end for iLayer = 1:NumLayers average = mean(filter(!isnan, vec(V[:,iLayer]))); - + if Percentage V_sub[:,iLayer] = ustrip(V[:,iLayer]) .- ustrip(average); - V_sub[:,iLayer] = V_sub[:,iLayer]./ustrip(average)*100.0; # the result is normalized + V_sub[:,iLayer] = V_sub[:,iLayer]./ustrip(average)*100.0; # the result is normalized else V_sub[:,iLayer] = V[:,iLayer] .- average; end @@ -688,7 +688,7 @@ end -""" +""" ParseColumns_CSV_File(data_file, num_columns) This parses numbers from CSV file that is read in with `CSV.File`. 
@@ -696,7 +696,7 @@ That is useful in case the CSV files has tables that contain both strings (e.g., # Example -This example assumes that the data starts at line 18, that the colums are separated by spaces, and that it contains at most 4 columns with data: +This example assumes that the data starts at line 18, that the columns are separated by spaces, and that it contains at most 4 columns with data: ```julia-repl julia> using CSV julia> data_file = CSV.File("FileName.txt",datarow=18,header=false,delim=' ') @@ -705,7 +705,7 @@ julia> data = ParseColumns_CSV_File(data_file, 4) """ function ParseColumns_CSV_File(data_file, num_columns) - data = zeros(size(data_file,1), num_columns); + data = zeros(size(data_file,1), num_columns); for (row_num,row) in enumerate(data_file) num = 0; for i=1:length(row) @@ -713,7 +713,7 @@ function ParseColumns_CSV_File(data_file, num_columns) num += 1; data[row_num,num] = row[i] else - + try parse(Float64,row[i]) num += 1; data[row_num,num] = parse(Float64,row[i]) @@ -727,7 +727,7 @@ function ParseColumns_CSV_File(data_file, num_columns) end -""" +""" AboveSurface(Data::GeoData, DataSurface::GeoData; above=true) Returns a boolean array of size(Data.Lon), which is true for points that are above the surface DataSurface (or for points below if above=false). 
@@ -738,17 +738,17 @@ This can be used, for example, to mask points above/below the Moho in a volumetr First we create a 3D data set and a 2D surface: ```julia julia> Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,(-300:25:0)km); -julia> Data = Depth*2; +julia> Data = Depth*2; julia> Data_set3D = GeoData(Lon,Lat,Depth,(Depthdata=Data,LonData=Lon)) -GeoData +GeoData size : (11, 11, 13) lon ϵ [ 10.0 : 20.0] lat ϵ [ 30.0 : 40.0] depth ϵ [ -300.0 km : 0.0 km] fields: (:Depthdata, :LonData) -julia> Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,-40km); +julia> Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,-40km); julia> Data_Moho = GeoData(Lon,Lat,Depth+Lon*km, (MohoDepth=Depth,)) - GeoData + GeoData size : (11, 11, 1) lon ϵ [ 10.0 : 20.0] lat ϵ [ 30.0 : 40.0] @@ -757,13 +757,13 @@ julia> Data_Moho = GeoData(Lon,Lat,Depth+Lon*km, (MohoDepth=Depth,)) ``` Next, we intersect the surface with the data set: ```julia -julia> Above = AboveSurface(Data_set3D, Data_Moho); +julia> Above = AboveSurface(Data_set3D, Data_Moho); ``` Now, `Above` is a boolean array that is true for points above the surface and false for points below and at the surface. """ function AboveSurface(Data::GeoData, DataSurface::GeoData; above=true) - + if size(DataSurface.lon)[3]!=1 error("It seems that DataSurface is not a surface") end @@ -874,20 +874,20 @@ The way it works is: - Filter data points in one model (e.g., areas with a velocity anomaly > 2 percent). Set everything that satisfies this criteria to 1 and everything else to 0. - Sum the results of the different datasets -If a feature is consistent between different datasets, it will have larger values. +If a feature is consistent between different datasets, it will have larger values. 
# Example We assume that we have 2 seismic velocity datasets `Data_Zhao_Pwave` and `DataKoulakov_Alps`: ```julia julia> Data_Zhao_Pwave -GeoData +GeoData size : (121, 94, 101) lon ϵ [ 0.0 : 18.0] lat ϵ [ 38.0 : 51.95] depth ϵ [ -1001.0 km : -1.0 km] fields: (:dVp_Percentage,) julia> DataKoulakov_Alps - GeoData + GeoData size : (108, 81, 35) lon ϵ [ 4.0 : 20.049999999999997] lat ϵ [ 37.035928143712574 : 49.01197604790419] @@ -895,9 +895,9 @@ julia> DataKoulakov_Alps fields: (:dVp_percentage, :dVs_percentage) ``` You can create a VoteMap which combines the two data sets with: -```julia +```julia julia> Data_VoteMap = VoteMap([Data_Zhao_Pwave,DataKoulakov_Alps],["dVp_Percentage>2.5","dVp_percentage>3.0"]) -GeoData +GeoData size : (50, 50, 50) lon ϵ [ 4.0 : 18.0] lat ϵ [ 38.0 : 49.01197604790419] @@ -906,9 +906,9 @@ GeoData ``` You can also create a VoteMap of a single dataset: -```julia +```julia julia> Data_VoteMap = VoteMap(Data_Zhao_Pwave,"dVp_Percentage>2.5", dims=(50,51,52)) -GeoData +GeoData size : (50, 51, 52) lon ϵ [ 0.0 : 18.0] lat ϵ [ 38.0 : 51.95] @@ -924,7 +924,7 @@ function VoteMap(DataSets::Vector{GeoData}, criteria::Vector{String}; dims=(50,5 if length(criteria) != numDataSets error("Need the same number of criteria as the number of data sets") end - + # Determine the overlapping lon/lat/depth regions of all datasets lon_limits = [minimum(DataSets[1].lon.val); maximum(DataSets[1].lon.val)]; lat_limits = [minimum(DataSets[1].lat.val); maximum(DataSets[1].lat.val)]; @@ -935,7 +935,7 @@ function VoteMap(DataSets::Vector{GeoData}, criteria::Vector{String}; dims=(50,5 lat_limits[1] = maximum([lat_limits[1] minimum(DataSets[i].lat.val)]); lat_limits[2] = minimum([lat_limits[2] maximum(DataSets[i].lat.val)]); - + z_limits[1] = maximum([z_limits[1] minimum(DataSets[i].depth.val)]); z_limits[2] = minimum([z_limits[2] maximum(DataSets[i].depth.val)]); end @@ -944,7 +944,7 @@ function VoteMap(DataSets::Vector{GeoData}, criteria::Vector{String}; dims=(50,5 VoteMap 
= zeros(Int64,dims) for i=1:numDataSets VoteMap_Local = zeros(Int64,dims) - + # Interpolate data set to smaller domain DataSet = ExtractSubvolume(DataSets[i]; Interpolate=true, Lon_level=lon_limits, Lat_level=lat_limits, Depth_level=z_limits, dims=dims); @@ -957,10 +957,10 @@ function VoteMap(DataSets::Vector{GeoData}, criteria::Vector{String}; dims=(50,5 end Array3D = ustrip.(DataSet.fields[expr.args[2]]); # strip units, just in case - - # Modify the value, to be Array3D + + # Modify the value, to be Array3D expr_mod = Expr(:call, expr.args[1], :($Array3D), expr.args[3]); # modify the original expression to use Array3D as variable name - + # The expression should have a ".", such as Array .> 1.0. If not, it will not apply this in a pointwise manner # Here, we add this dot if it is not there yet if cmp(String(expr_mod.args[1])[1],Char('.'))==1 @@ -970,7 +970,7 @@ function VoteMap(DataSets::Vector{GeoData}, criteria::Vector{String}; dims=(50,5 ind = eval(expr_mod); # evaluate the modified expression VoteMap_Local[ind] .= 1; # assign vote-map - VoteMap = VoteMap + VoteMap_Local; # Sum + VoteMap = VoteMap + VoteMap_Local; # Sum end DataSet = ExtractSubvolume(DataSets[1], Interpolate=true, Lon_level=lon_limits, Lat_level=lat_limits, Depth_level=z_limits, dims=dims); @@ -989,7 +989,7 @@ end """ Data_R = RotateTranslateScale(Data::ParaviewData; Rotate=0, Translate=(0,0,0), Scale=(1.0,1.0,1.0)) -Does an in-place rotation, translation and scaling of the Cartesian dataset `Data`. +Does an in-place rotation, translation and scaling of the Cartesian dataset `Data`. 
# Parameters Note that we apply the transformations in exactly this order: @@ -1001,7 +1001,7 @@ Note that we apply the transformations in exactly this order: ```julia julia> X,Y,Z = XYZGrid(10:20,30:40,-50:-10); julia> Data_C = ParaviewData(X,Y,Z,(Depth=Z,)) -ParaviewData +ParaviewData size : (11, 11, 41) x ϵ [ 10.0 : 20.0] y ϵ [ 30.0 : 40.0] @@ -1009,7 +1009,7 @@ ParaviewData fields: (:Depth,) julia> Data_R = RotateTranslateScale(Data_C, Rotate=30); julia> Data_R -ParaviewData +ParaviewData size : (11, 11, 41) x ϵ [ 8.169872981077807 : 21.83012701892219] y ϵ [ 28.16987298107781 : 41.83012701892219] @@ -1020,7 +1020,7 @@ ParaviewData function RotateTranslateScale(Data::ParaviewData; Rotate=0, Translate=(0,0,0), Scale=(1.0,1.0,1.0)) X,Y,Z = Data.x.val, Data.y.val, Data.z.val; # Extract coordinates - Xr,Yr,Zr = X,Y,Z; # Rotated coordinates + Xr,Yr,Zr = X,Y,Z; # Rotated coordinates # 1) Scaling if length(Scale)==1 @@ -1032,39 +1032,39 @@ function RotateTranslateScale(Data::ParaviewData; Rotate=0, Translate=(0,0,0), S # 2) 2D rotation around X/Y axis, around center of box - Xm,Ym = mean(X), mean(Y); + Xm,Ym = mean(X), mean(Y); R = [cosd(Rotate[1]) -sind(Rotate[1]); sind(Rotate[1]) cosd(Rotate[1])]; # 2D rotation matrix for i in eachindex(X) Rot_XY = R*[X[i]-Xm; Y[i]-Ym]; Xr[i] = Rot_XY[1] + Xm; Yr[i] = Rot_XY[2] + Ym; - end - + end + # 3) Add translation Xr .+= Translate[1]; Yr .+= Translate[2]; Zr .+= Translate[3]; - + # Modify original structure #Data.x.val = Xr; #Data.y.val = Yr; #Data.z.val = Zr; - + return ParaviewData(Xr,Yr,Zr, Data.fields) end """ - Topo = DrapeOnTopo(Topo::GeoData, Data::GeoData) + Topo = DrapeOnTopo(Topo::GeoData, Data::GeoData) -This drapes fields of a data set `Data` on the topography `Topo` +This drapes fields of a data set `Data` on the topography `Topo` """ function DrapeOnTopo(Topo::GeoData, Data::GeoData) - + Lon,Lat,Depth = LonLatDepthGrid( Topo.lon.val[:,1,1], Topo.lat.val[1,:,1],Topo.depth.val[1,1,:]); # use nearest neighbour to 
interpolate data @@ -1076,50 +1076,50 @@ function DrapeOnTopo(Topo::GeoData, Data::GeoData) idx_out = findall( (Lon .< minimum(Data.lon.val)) .| (Lon .> maximum(Data.lon.val)) .| (Lat .< minimum(Data.lat.val)) .| (Lat .> maximum(Data.lat.val)) ) - + fields_new = Topo.fields; field_names = keys(Data.fields); - + for i = 1:length(Data.fields) - + if typeof(Data.fields[i]) <: Tuple # vector or anything that contains more than 1 field - data_tuple = Data.fields[i] # we have a tuple (likely a vector field), so we have to loop + data_tuple = Data.fields[i] # we have a tuple (likely a vector field), so we have to loop data_array = zeros(typeof(data_tuple[1][1]),size(Topo.lon.val,1),size(Topo.lon.val,2),size(Topo.lon.val,3),length(Data.fields[i])); # create a 3D array that holds the 2D interpolated values unit_array = zeros(size(data_array)); for j=1:length(data_tuple) data_field = data_tuple[j]; - tmp = data_array[:,:,:,1]; + tmp = data_array[:,:,:,1]; tmp = data_field[idx] data_array[:,:,:,j] = tmp end - + data_new = tuple([data_array[:,:,:,c] for c in 1:size(data_array,4)]...) # transform 4D matrix to tuple # remove points outside domain for j=1:length(data_tuple) - tmp = data_new[j]; + tmp = data_new[j]; tmp[idx_out] .= NaN data_array[:,:,:,j] = tmp end data_new = tuple([data_array[:,:,:,c] for c in 1:size(data_array,4)]...) 
# transform 4D matrix to tuple else - + # scalar field data_new = zeros(typeof(Data.fields[i][1]), size(Topo.lon.val,1),size(Topo.lon.val,2),size(Topo.lon.val,3)); data_new = Data.fields[i][idx] # interpolate data field - + end - - # replace the one + + # replace the one new_field = NamedTuple{(field_names[i],)}((data_new,)) # Create a tuple with same name fields_new = merge(fields_new, new_field); # replace the field in fields_new - - end + + end Topo_new = GeoData(Topo.lon.val,Topo.lat.val,Topo.depth.val, fields_new) @@ -1129,10 +1129,10 @@ function DrapeOnTopo(Topo::GeoData, Data::GeoData) end -""" +""" DrapeOnTopo(Topo::CartData, Data::CartData) -Drapes Cartesian Data on topography +Drapes Cartesian Data on topography """ function DrapeOnTopo(Topo::CartData, Data::CartData) Topo_lonlat = GeoData(ustrip.(Topo.x.val),ustrip.(Topo.y.val), ustrip.(Topo.z.val), Topo.fields ) @@ -1145,19 +1145,19 @@ function DrapeOnTopo(Topo::CartData, Data::CartData) return Topo_new end -""" +""" LithostaticPressure!(Plithos::Array, Density::Array, dz::Number; g=9.81) Computes lithostatic pressure from a 3D density array, assuming constant spacing `dz` in the vertical direction. Optionally, the gravitational acceleration `g` can be specified.
""" function LithostaticPressure!(Plithos::Array{T,N}, Density::Array{T,N}, dz::Number; g=9.81) where {T,N} - + Plithos[:] = Density*dz*g; - + selectdim(Plithos,N,size(Plithos)[N]) .= 0 # set upper row to zero - + Plithos[:] = reverse!(cumsum(reverse!(Plithos),dims=N)) - + return nothing end diff --git a/src/voxel_gravity.jl b/src/voxel_gravity.jl index 5c8eb8c6..ab83a90e 100644 --- a/src/voxel_gravity.jl +++ b/src/voxel_gravity.jl @@ -13,24 +13,24 @@ export voxGrav voxGrav(X::Array{Float64, 3}, Y::Array{Float64, 3}, Z::Array{Float64, 3}, RHO::Array{Float64, 3}; refMod="AVG", lengthUnit="m", rhoTol=1e-9, Topo=[], outName="Bouguer", printing=true) - Computes Bouguer anomalies and gradients - - Required arguments: - X,Y,Z: 3D matrices with the coordinates of the grid (X should vary in the first dimension, Y in the second, Z (vertical) in the thrid) - RHO: 3D matrix with the densitiy at each grid point [kg/m^3] - - Optional arguments: - refMod: 1D vector with the reference density for each depth - Alternatively, the strings "NE", "SE", "SW", "NW", "AVG" can be used. - In that case, one of the corners of `RHO` is used as reference model. - In case of "AVG" the reference model is the average of each depth slice. - lengthUnit: The unit of the coordinates and topography file. Either "m" or "km" - rhoTol: density differences smaller than rhoTol will be ignored [kg/m^3] - Topo: 2D matrix with the topography of the surface (only relevant for the paraview output) - outName: name of the paraview output (do not include file type) + Computes Bouguer anomalies and gradients + + Required arguments: + X,Y,Z: 3D matrices with the coordinates of the grid (X should vary in the first dimension, Y in the second, Z (vertical) in the third) + RHO: 3D matrix with the densitiy at each grid point [kg/m^3] + + Optional arguments: + refMod: 1D vector with the reference density for each depth + Alternatively, the strings "NE", "SE", "SW", "NW", "AVG" can be used. 
+ In that case, one of the corners of `RHO` is used as reference model. + In case of "AVG" the reference model is the average of each depth slice. + lengthUnit: The unit of the coordinates and topography file. Either "m" or "km" + rhoTol: density differences smaller than rhoTol will be ignored [kg/m^3] + Topo: 2D matrix with the topography of the surface (only relevant for the paraview output) + outName: name of the paraview output (do not include file type) printing: activate printing of additional information [true or false] """ -function voxGrav(X::Array{Float64, 3}, Y::Array{Float64, 3}, Z::Array{Float64, 3}, RHO::Array{Float64, 3}; +function voxGrav(X::Array{Float64, 3}, Y::Array{Float64, 3}, Z::Array{Float64, 3}, RHO::Array{Float64, 3}; refMod="AVG", lengthUnit="m", rhoTol=1e-9, Topo=[], outName="Bouguer", printing=true) ## check input @@ -58,13 +58,13 @@ function voxGrav(X::Array{Float64, 3}, Y::Array{Float64, 3}, Z::Array{Float64, 3 ny = size(X,2); nz = size(X,3); - # substract reference model + # subtract reference model for i = 1 : nz RHO[:,:,i] .= RHO[:,:,i] .- RefMod[i] end # interpolate density grid to cell centers - DRHO = RHO[1:end-1,1:end-1,1:end-1] .+ RHO[2:end,1:end-1,1:end-1] .+ RHO[2:end,2:end,1:end-1] .+ RHO[1:end-1,2:end,1:end-1] + + DRHO = RHO[1:end-1,1:end-1,1:end-1] .+ RHO[2:end,1:end-1,1:end-1] .+ RHO[2:end,2:end,1:end-1] .+ RHO[1:end-1,2:end,1:end-1] + RHO[1:end-1,1:end-1,2:end] .+ RHO[2:end,1:end-1,2:end] .+ RHO[2:end,2:end,2:end] .+ RHO[1:end-1,2:end,2:end] DRHO = DRHO ./ 8 @@ -119,7 +119,7 @@ function voxGrav(X::Array{Float64, 3}, Y::Array{Float64, 3}, Z::Array{Float64, 3 coords = permutedims(coords,[4,1,2,3]) vtkfile = vtk_grid(outName,coords) - if printing + if printing @printf "%.3f %% of the domain contained anomalous densities. 
If this is more than expected, adjust rhoTol for faster computation.\n" 100*frac @printf "Writing output...\n" vtkfile["Bouguer Anomaly [mGal]"] = dg @@ -137,7 +137,7 @@ function voxGrav(X::Array{Float64, 3}, Y::Array{Float64, 3}, Z::Array{Float64, 3 if orient == 1 return dg, gradX, gradY - else + else return permutedims(dg, [2,1]), permutedims(gradX, [2,1]), permutedims(gradY, [2,1]) end end @@ -208,7 +208,7 @@ function checkInput(X, Y, Z, RHO, refMod, lengthUnit, rhoTol, Topo, outName, pri else error("lengthUnit should be \"m\" or \"km\".") end - + # reference model if !(typeof(refMod) == String) if length(refMod) == nz diff --git a/test/TestData.csv b/test/TestData.csv index e6c1f92e..90ef756f 100755 --- a/test/TestData.csv +++ b/test/TestData.csv @@ -1,10 +1,10 @@ # Test data to be used to test data import in GeophysicalModelGenerator.jl -# taken from the Joint Pamir-Hindu Kush earthquake catalog for events +# taken from the Joint Pamir-Hindu Kush earthquake catalog for events # deeper than 50 km from Kufner et al., 2016 -# +# # this data is only used for testing the csv import -# -# See also Kufner et. al. 2016 for more information and the original data +# +# See also Kufner et. al. 
2016 for more information and the original data longitude(deg),latitude(deg),depth(km),local_magnitude,absError(km) 68.740,36.424,52.04,5.5,9.7 @@ -16,4 +16,4 @@ longitude(deg),latitude(deg),depth(km),local_magnitude,absError(km) 68.950,36.240,82.55,5.2,5.5 71.303,36.301,84.36,5.1,3.2 71.567,36.401,90.90,5.3,3.6 -70.829,36.321,110.41,5.3,3.6 \ No newline at end of file +70.829,36.321,110.41,5.3,3.6 diff --git a/test/runtests.jl b/test/runtests.jl index 94e09837..38da6e28 100644 --- a/test/runtests.jl +++ b/test/runtests.jl @@ -32,8 +32,7 @@ end include("test_stl.jl") end -# Cleanup +# Cleanup foreach(rm, filter(endswith(".vts"), readdir())) foreach(rm, filter(endswith(".vtu"), readdir())) rm("./markers/",recursive=true) - diff --git a/test/test_data_import.jl b/test/test_data_import.jl index f76edaa5..db94b069 100644 --- a/test/test_data_import.jl +++ b/test/test_data_import.jl @@ -28,7 +28,7 @@ using GeophysicalModelGenerator # test loading images (profiles & mapviews) -# Extact & save profile in GeoData format +# Extract & save profile in GeoData format filename = "test.png"; # fake png Corner_LowerLeft = (18.0, 51.0, -590.0) Corner_UpperRight = (9.0, 42.0, 0.0) @@ -74,5 +74,3 @@ data_Image = Screenshot_To_UTMData(filename,Corner_LowerLeft, Corner_ @test data_Image.EW.val[22] ≈ 0.42424242424242425 @test data_Image.NS.val[22] ≈ 48.666666666666664 @test Value(data_Image.depth[22]) ≈ -15.0m - - diff --git a/test/test_data_types.jl b/test/test_data_types.jl index 0df8d3d5..90e0a1dd 100644 --- a/test/test_data_types.jl +++ b/test/test_data_types.jl @@ -10,12 +10,12 @@ Data_set = GeoData(Lat,Lon,Depth,(FakeData=Data,Data2=Data.+1.)) # crea @test Value(Data_set.depth[2])==-19km Depth1 = (-20.:-11.)*m; # depth has units of m -Data_set1 = GeoData(Lat,Lon,Depth1,(FakeData=Data,Data2=Data.+1.)) +Data_set1 = GeoData(Lat,Lon,Depth1,(FakeData=Data,Data2=Data.+1.)) @test Value(Data_set1.depth[2])==-0.019km @test Data_set.atts["note"]=="No attributes were given to this dataset" 
# check whether the default attribute assignment works Depth2 = -20.:-11.; # no units -Data_set2 = GeoData(Lat,Lon,Depth2,(FakeData=Data,Data2=Data.+1.)) +Data_set2 = GeoData(Lat,Lon,Depth2,(FakeData=Data,Data2=Data.+1.)) @test Value(Data_set2.depth[2])==-19km # test that it works if we give a Data array, rather than a NamedTuple, as input (we add a default name) @@ -37,14 +37,14 @@ Data_set4 = GeoData(Lat,Lon,Depth,(Data,),att_dict) # check that an error is thrown if a different size input is given for depth Depth2 = (-100:10:1.0) # no units -@test_throws ErrorException GeoData(Lat,Lon,Depth2,(FakeData=Data,Data2=Data.+1.)) +@test_throws ErrorException GeoData(Lat,Lon,Depth2,(FakeData=Data,Data2=Data.+1.)) # Convert 1D vector to cartesian structure Data_cart = convert(ParaviewData,Data_set) @test Data_cart.x[3] ≈ 6189.685604255086 @test Data_cart.y[3] ≈ 324.3876769792181 -@test Data_cart.z[3] ≈ 1421.35608984477 +@test Data_cart.z[3] ≈ 1421.35608984477 # Create Lon/Lat/Depth and X/Y/Z grids from given numbers or 1D vectors Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,(-10:-1)km); @@ -71,21 +71,21 @@ X,Y,Z = XYZGrid(10,30,(-10:-1)km); @test_throws ErrorException XYZGrid(10:20,30:40,[20 30; 40 50]); -# Create 3D arrays & convert them +# Create 3D arrays & convert them Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,(-10:-1)km); # 3D grid Data = ustrip(Depth); -Data_set1 = GeoData(Lon,Lat,Depth,(FakeData=Data,Data2=Data.+1.)) +Data_set1 = GeoData(Lon,Lat,Depth,(FakeData=Data,Data2=Data.+1.)) @test size(Data_set1.depth)==(11, 11, 10) @test Value(Data_set1.depth[1,2,3])==-8.0km # double-check that giving 3D arrays in the wrong ordering triggers a warning message -@test_throws ErrorException("It appears that the lon array has a wrong ordering") GeoData(Lat,Lon,Depth,(FakeData=Data,Data2=Data.+1.)) +@test_throws ErrorException("It appears that the lon array has a wrong ordering") GeoData(Lat,Lon,Depth,(FakeData=Data,Data2=Data.+1.)) -# Create 2D arrays & convert them +# Create 
2D arrays & convert them Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,-50km); Data = ustrip(Depth); -Data_set2 = GeoData(Lon,Lat,Depth,(FakeData=Data,Data2=Data.+1.)) +Data_set2 = GeoData(Lon,Lat,Depth,(FakeData=Data,Data2=Data.+1.)) @test Value(Data_set2.depth[2,2])== -50.0km # Convert the 2D and 3D arrays to their cartesian counterparts @@ -113,7 +113,7 @@ ns = 4.514137e6:100:4.523637e6 depth = -5400:250:600 EW,NS,Depth = XYZGrid(ew, ns, depth); Data = ustrip.(Depth); -Data_set = UTMData(EW,NS,Depth,33, true, (FakeData=Data,Data2=Data.+1.)) +Data_set = UTMData(EW,NS,Depth,33, true, (FakeData=Data,Data2=Data.+1.)) @test Data_set.EW[3,4,2] ≈ 422323.0 @test Data_set.NS[3,4,2] ≈ 4.514437e6 @@ -128,9 +128,9 @@ Data_set1 = convert(GeoData, Data_set) @test Data_set1.lat[20] ≈ 40.77470011887963 @test Value(Data_set1.depth[20]) == -5400m -# Convert from GeoData -> UTMData +# Convert from GeoData -> UTMData Data_set2 = convert(UTMData, Data_set1) -@test sum(abs.(Data_set2.EW.val-Data_set.EW.val)) < 1e-5 +@test sum(abs.(Data_set2.EW.val-Data_set.EW.val)) < 1e-5 # Test Projection point for negative values proj = ProjectionPoint(Lat= -2.8, Lon=36) @@ -140,7 +140,7 @@ proj = ProjectionPoint(Lat= -2.8, Lon=36) # Convert from GeoData -> UTMData, but for a fixed zone (used for map projection) proj = ProjectionPoint(Lat= 40.77470011887963, Lon=14.099668158564413) Data_set3 = Convert2UTMzone(Data_set1, proj) -@test Data_set3.EW.val[100] ≈ 432022.99999999994 +@test Data_set3.EW.val[100] ≈ 432022.99999999994 # Create CartData structure x = 0:2:10 @@ -167,4 +167,4 @@ Data_set5 = Convert2CartData(Data_set, proj) # Convert result back to UTM (convert LaMEM results back to UTM & afterwards to GeoData) Data_set6 = Convert2UTMzone(Data_set5, proj) @test sum(Data_set.EW.val-Data_set6.EW.val) ≈ 0.0 -@test sum(Data_set.NS.val-Data_set6.NS.val) ≈ 0.0 \ No newline at end of file +@test sum(Data_set.NS.val-Data_set6.NS.val) ≈ 0.0 diff --git a/test/test_files/GeometricPrimitives.dat 
b/test/test_files/GeometricPrimitives.dat index 75c9cd61..5cae630b 100644 --- a/test/test_files/GeometricPrimitives.dat +++ b/test/test_files/GeometricPrimitives.dat @@ -19,4 +19,4 @@ nmark_x = 3 # markers per cell in x-direction nmark_y = 3 # ... y-direction - nmark_z = 3 # ... z-direction \ No newline at end of file + nmark_z = 3 # ... z-direction diff --git a/test/test_files/SaltModels.dat b/test/test_files/SaltModels.dat index 1ee6dc30..b40dec6f 100644 --- a/test/test_files/SaltModels.dat +++ b/test/test_files/SaltModels.dat @@ -1,8 +1,8 @@ # This is a simple 3D subduction setup a free slip upper boundary -# The geometry is created in MATLAB, and linear rheologies are being used apart from the +# The geometry is created in MATLAB, and linear rheologies are being used apart from the # crust, which is plastic to allow detachment of the upper boundary # -# The MATLAB file CreateMarkers_Subduction_Linear_FreeSlip_parallel.m is used to create +# The MATLAB file CreateMarkers_Subduction_Linear_FreeSlip_parallel.m is used to create # the input geometry. 
# A multigrid solver is employed @@ -10,13 +10,13 @@ # Scaling #=============================================================================== - units = geo # geological units - + units = geo # geological units + unit_temperature = 1000 unit_length = 1000 # in m unit_viscosity = 1e20 unit_stress = 1e9 - + #=============================================================================== # Time stepping parameters #=============================================================================== @@ -54,7 +54,7 @@ # No-slip boundary flag mask (left right front back bottom top) noslip = 0 0 0 0 1 0 - + #=============================================================================== # Solution parameters & controls @@ -64,12 +64,12 @@ FSSA = 1.0 # free surface stabilization parameter [0 - 1] init_guess = 0 # initial guess flag eta_min = 1e18 # viscosity upper bound - eta_ref = 1e20 # reference viscosity for initial guess + eta_ref = 1e20 # reference viscosity for initial guess eta_max = 1e25 # viscosity lower limit p_litho_plast = 1 # use lithostatic pressure for plasticity (which makes the yield stress purely depth-dependent!) 
- + #=============================================================================== # Solver options #=============================================================================== @@ -78,12 +78,12 @@ MGSweeps = 10 # number of MG smoothening steps per level [default=10] MGSmoother = chebyshev # type of smoothener used [chebyshev or jacobi] MGCoarseSolver = mumps # coarse grid solver [direct/mumps/superlu_dist or redundant - more options specifiable through the command-line options -crs_ksp_type & -crs_pc_type] - + #=============================================================================== # Model setup & advection #=============================================================================== - msetup = files + msetup = files mark_load_file = ./markers/mdb # marker input file (extension is .xxxxxxxx.dat) nmark_x = 3 # markers per cell in x-direction @@ -95,9 +95,9 @@ interp = minmod # velocity interpolation scheme mark_ctrl = basic # marker control type nmark_lim = 8 100 # min/max number per cell - - + + #=============================================================================== # Output #=============================================================================== @@ -124,13 +124,13 @@ out_tot_displ = 1 out_moment_res = 1 out_cont_res = 1 - + # AVD phase viewer output options (requires activation) out_avd = 1 # activate AVD phase output out_avd_pvd = 1 # activate writing .pvd file out_avd_ref = 3 # AVD grid refinement factor - + # Free surface output options (can be activated only if surface tracking is enabled) out_surf = 1 # activate surface output @@ -138,7 +138,7 @@ out_surf_velocity = 1 out_surf_topography = 1 out_surf_amplitude = 1 - + #=============================================================================== # Material phase parameters #=============================================================================== @@ -155,12 +155,12 @@ ID = 1 # phase id rho = 3250 # density eta = 5e22 - + - - - - + + + + 
#=============================================================================== # PETSc options #=============================================================================== diff --git a/test/test_files/Subduction2D_FreeSlip_Particles_Linear_DirectSolver.dat b/test/test_files/Subduction2D_FreeSlip_Particles_Linear_DirectSolver.dat index 7ba48579..db2719bb 100644 --- a/test/test_files/Subduction2D_FreeSlip_Particles_Linear_DirectSolver.dat +++ b/test/test_files/Subduction2D_FreeSlip_Particles_Linear_DirectSolver.dat @@ -1,8 +1,8 @@ # This is a simple 2D subduction setup a free slip upper boundary -# The geometry is created in MATLAB, and linear rheologies are being used apart from the +# The geometry is created in MATLAB, and linear rheologies are being used apart from the # crust, which is plastic to allow detachment of the upper boundary # -# The MATLAB file CreateMarkers_Subduction_Linear_FreeSlip_parallel.m is used to create +# The MATLAB file CreateMarkers_Subduction_Linear_FreeSlip_parallel.m is used to create # the input geometry. 
# A direct solver is employed @@ -10,13 +10,13 @@ # Scaling #=============================================================================== - units = geo # geological units - + units = geo # geological units + unit_temperature = 1000 unit_length = 1e3 unit_viscosity = 1e20 unit_stress = 1e9 - + #=============================================================================== # Time stepping parameters #=============================================================================== @@ -54,7 +54,7 @@ # No-slip boundary flag mask (left right front back bottom top) noslip = 0 0 0 0 1 0 - + #=============================================================================== # Solution parameters & controls #=============================================================================== @@ -63,23 +63,23 @@ FSSA = 1.0 # free surface stabilization parameter [0 - 1] init_guess = 0 # initial guess flag eta_min = 1e18 # viscosity upper bound - eta_ref = 1e20 # reference viscosity for initial guess - eta_max = 1e25 # viscosity lower limit + eta_ref = 1e20 # reference viscosity for initial guess + eta_max = 1e25 # viscosity lower limit p_litho_plast = 1 # use lithostatic pressure for plasticity #=============================================================================== # Solver options #=============================================================================== SolverType = direct # solver [direct or multigrid] - DirectSolver = mumps # mumps/superlu_dist/pastix + DirectSolver = mumps # mumps/superlu_dist/pastix DirectPenalty = 1e5 - + #=============================================================================== # Model setup & advection #=============================================================================== - msetup = files + msetup = files mark_load_file = ./markers/mdb # marker input file (extension is .xxxxxxxx.dat) nmark_x = 3 # markers per cell in x-direction @@ -91,9 +91,9 @@ interp = minmod # velocity interpolation scheme mark_ctrl = basic # marker 
control type nmark_lim = 8 100 # min/max number per cell - - + + #=============================================================================== # Output #=============================================================================== @@ -120,13 +120,13 @@ out_tot_displ = 1 out_moment_res = 1 out_cont_res = 1 - + # AVD phase viewer output options (requires activation) out_avd = 1 # activate AVD phase output out_avd_pvd = 1 # activate writing .pvd file out_avd_ref = 3 # AVD grid refinement factor - + #=============================================================================== # Material phase parameters @@ -144,23 +144,23 @@ ID = 1 # phase id rho = 3250 # density eta = 5e22 - + # plastic properties of crust ch = 5e6 # cohesion [MPa] - fr = 5 # friction angle - + fr = 5 # friction angle + - - + + # Define properties of mantle lithosphere/slab ID = 2 # phase id - + rho = 3250 # density eta = 5e22 # viscosity - - + + #=============================================================================== # PETSc options #=============================================================================== diff --git a/test/test_files/non-uniform_grid.dat b/test/test_files/non-uniform_grid.dat index a2391f4d..e663be4a 100644 --- a/test/test_files/non-uniform_grid.dat +++ b/test/test_files/non-uniform_grid.dat @@ -25,7 +25,7 @@ coord_x = 0.0 16.0 coord_y = -0.5 0.0 0.5 coord_z = -10.0 -5.0 -4.0 0.0 - + # Bias ratios for all segments bias_x = 1.0 @@ -41,4 +41,3 @@ nmark_z = 2 # ... 
z-direction #=============================================================================== - diff --git a/test/test_lamem.jl b/test/test_lamem.jl index 53fe0a4e..23e70df4 100644 --- a/test/test_lamem.jl +++ b/test/test_lamem.jl @@ -17,7 +17,7 @@ Temp = zeros(Float64, size(Grid.X)); Model3D = CartData(Grid, (Phases=Grid.Z,)); @test Value(Model3D.y[100])==-1.9375km -# Read Partitioning file: +# Read Partitioning file: PartitioningFile = "test_files/ProcessorPartitioning_4cpu_1.2.2.bin" Nprocx,Nprocy,Nprocz, xc,yc,zc, nNodeX,nNodeY,nNodeZ = GetProcessorPartitioning(PartitioningFile) @test Nprocz==2 @@ -60,7 +60,7 @@ AddBox!(Phases,Temp,Grid, xlim=(0,500), zlim=(-500,-20), phase=LithosphericPhase @test sum(Temp) == 1.189394358568891e9 Model3D = CartData(Grid, (Phases=Phases,Temp=Temp)); -Write_Paraview(Model3D,"LaMEM_ModelSetup") # Save model to paraview +Write_Paraview(Model3D,"LaMEM_ModelSetup") # Save model to paraview # Test writing a LaMEM topography file diff --git a/test/test_paraview.jl b/test/test_paraview.jl index 69f0ea37..f5df66ac 100644 --- a/test/test_paraview.jl +++ b/test/test_paraview.jl @@ -7,56 +7,56 @@ using GeophysicalModelGenerator # Generate a 3D grid Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,(-300:25:0)km); Data = Depth*2; # some data -Data_set = GeoData(Lon,Lat,Depth,(Depthdata=Data,LonData=Lon)) +Data_set = GeoData(Lon,Lat,Depth,(Depthdata=Data,LonData=Lon)) @test Write_Paraview(Data_set, "test_depth3D") == nothing # Horizontal profile @ 10 km height Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,10km); Depth[2:4,2:4,1] .= 25km # add some fake topography -Data_set2 = GeoData(Lon,Lat,Depth,(Topography=Depth,)) +Data_set2 = GeoData(Lon,Lat,Depth,(Topography=Depth,)) @test Write_Paraview(Data_set2, "test2") == nothing # Cross sections Lon,Lat,Depth = LonLatDepthGrid(10:20,35,(-300:25:0)km); -Data_set3 = GeoData(Lon,Lat,Depth,(DataSet=Depth,)) +Data_set3 = GeoData(Lon,Lat,Depth,(DataSet=Depth,)) @test Write_Paraview(Data_set3, "test3") == nothing 
Lon,Lat,Depth = LonLatDepthGrid(15,30:40,(-300:25:0)km); -Data_set4 = GeoData(Lon,Lat,Depth,(DataSet=Depth,)) +Data_set4 = GeoData(Lon,Lat,Depth,(DataSet=Depth,)) @test Write_Paraview(Data_set4, "test4") == nothing Lon,Lat,Depth = LonLatDepthGrid(15,35,(-300:25:0)km); -Data_set5 = GeoData(Lon,Lat,Depth,(DataSet=Depth,)) +Data_set5 = GeoData(Lon,Lat,Depth,(DataSet=Depth,)) @test Write_Paraview(Data_set5, "test5") == nothing -# Test saving vectors +# Test saving vectors Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,50km); Ve = zeros(size(Depth)) .+ 1.0; Vn = zeros(size(Depth)); Vz = zeros(size(Depth)); -Velocity = (copy(Ve),copy(Vn),copy(Vz)) # tuple with 3 values, which -Data_set_vel = GeoData(Lon,Lat,Depth,(Velocity=Velocity, Veast=Velocity[1]*cm/yr, Vnorth=Velocity[2]*cm/yr, Vup=Velocity[3]*cm/yr)) +Velocity = (copy(Ve),copy(Vn),copy(Vz)) # tuple with 3 values, which +Data_set_vel = GeoData(Lon,Lat,Depth,(Velocity=Velocity, Veast=Velocity[1]*cm/yr, Vnorth=Velocity[2]*cm/yr, Vup=Velocity[3]*cm/yr)) @test Write_Paraview(Data_set_vel, "test_Vel") == nothing # Test saving colors -red = zeros(size(Lon)); -green = zeros(size(Lon)); -blue = zeros(size(Lon)); +red = zeros(size(Lon)); +green = zeros(size(Lon)); +blue = zeros(size(Lon)); Data_set_color = GeoData(Lon, Lat, Depth, (Velocity=Velocity,colors=(red,green,blue),color2=(red,green,blue))) @test Write_Paraview(Data_set_color, "test_Color") == nothing # Manually test the in-place conversion from spherical -> cartesian (done automatically when converting GeoData->ParaviewData ) -Vel_Cart = (copy(Ve),copy(Vn),copy(Vz)) +Vel_Cart = (copy(Ve),copy(Vn),copy(Vz)) Velocity_SphericalToCartesian!(Data_set_vel, Vel_Cart); @test Vel_Cart[2][15] ≈ 0.9743700647852352 @test Vel_Cart[1][15] ≈ -0.224951054343865 @test Vel_Cart[3][15] ≈ 0.0 # Test saving unstructured point data (EQ, or GPS points) -Data_set_VelPoints = GeoData(Lon[:],Lat[:],ustrip.(Depth[:]),(Velocity=(copy(Ve[:]),copy(Vn[:]),copy(Vz[:])), Veast=Ve[:]*mm/yr, 
Vnorth=Vn[:]*cm/yr, Vup=Vz[:]*cm/yr)) +Data_set_VelPoints = GeoData(Lon[:],Lat[:],ustrip.(Depth[:]),(Velocity=(copy(Ve[:]),copy(Vn[:]),copy(Vz[:])), Veast=Ve[:]*mm/yr, Vnorth=Vn[:]*cm/yr, Vup=Vz[:]*cm/yr)) @test Write_Paraview(Data_set_VelPoints, "test_Vel_points", PointsData=true) == nothing -end \ No newline at end of file +end diff --git a/test/test_setup_geometry.jl b/test/test_setup_geometry.jl index 67ae3b1f..1073ceb5 100644 --- a/test/test_setup_geometry.jl +++ b/test/test_setup_geometry.jl @@ -7,7 +7,7 @@ Lon3D,Lat3D,Depth3D = LonLatDepthGrid(1.0:1:10.0, 11.0:1:20.0, (-20:1:-10)*km) Data = zeros(size(Lon3D)); Temp = ones(Float64, size(Data))*1350; Phases = zeros(Int32, size(Data)); -Grid = GeoData(Lon3D,Lat3D,Depth3D,(DataFieldName=Data,)) +Grid = GeoData(Lon3D,Lat3D,Depth3D,(DataFieldName=Data,)) AddBox!(Phases,Temp,Grid, xlim=(2,4), zlim=(-15,-10), phase=ConstantPhase(3), DipAngle=10, T=LinearTemp(Tbot=1350, Ttop=200)) @test sum(Temp[1,1,:]) ≈ 14850.0 @@ -15,12 +15,12 @@ AddBox!(Phases,Temp,Grid, xlim=(2,4), zlim=(-15,-10), phase=ConstantPhase(3), Di AddEllipsoid!(Phases,Temp,Grid, cen=(4,15,-17), axes=(1,2,3), StrikeAngle=90, DipAngle=45, phase=ConstantPhase(2), T=ConstantTemp(600)) @test sum(Temp[1,1,:]) ≈ 14850.0 -# CartData +# CartData X,Y,Z = XYZGrid(1.0:1:10.0, 11.0:1:20.0, -20:1:-10); Data = zeros(size(X)); Temp = ones(Float64, size(Data))*1350; Phases = zeros(Int32, size(Data)); -Grid = CartData(X,Y,Z,(DataFieldName=Data,)) +Grid = CartData(X,Y,Z,(DataFieldName=Data,)) AddBox!(Phases,Temp,Grid, xlim=(2,4), zlim=(-15,-10), phase=ConstantPhase(3), DipAngle=10, T=LinearTemp(Tbot=1350, Ttop=200)) @test sum(Temp[1,1,:]) ≈ 14850.0 diff --git a/test/test_stl.jl b/test/test_stl.jl index 8fc62343..ce854949 100644 --- a/test/test_stl.jl +++ b/test/test_stl.jl @@ -9,8 +9,8 @@ X,Y,Z = XYZGrid(150:180, -15:2:15, 10:5:60) # Create mesh Phase = zeros(size(X)); for i in eachindex(X) - inside = IsInsideClosedSTL(mesh, [X[i], Y[i], Z[i]]) - if inside + inside = 
IsInsideClosedSTL(mesh, [X[i], Y[i], Z[i]]) + if inside Phase[i] = 1; end end diff --git a/test/test_transformation.jl b/test/test_transformation.jl index ff502a5b..d791e339 100644 --- a/test/test_transformation.jl +++ b/test/test_transformation.jl @@ -4,17 +4,17 @@ using GeophysicalModelGenerator # Create 3D volume with some fake data Lon,Lat,Depth = LonLatDepthGrid(5:25,20:50,(-1300:100:0)km); -Data_set3D = GeoData(Lon,Lat,Depth,(Depthdata=Depth*2 + Lon*km,LonData=Lon)) +Data_set3D = GeoData(Lon,Lat,Depth,(Depthdata=Depth*2 + Lon*km,LonData=Lon)) proj = ProjectionPoint(Lon=20,Lat=35) # Convert this 3D dataset to a Cartesian dataset (the grid will not be orthogonal) -Data_set3D_Cart = Convert2CartData(Data_set3D, proj) +Data_set3D_Cart = Convert2CartData(Data_set3D, proj) @test sum(abs.(Value(Data_set3D_Cart.x))) ≈ 5.293469089428514e6km # Create Cartesian grid X,Y,Z = XYZGrid(-500:100:500,-900:200:900,(-500:100:0)km); -Data_Cart = CartData(X,Y,Z,(Z=Z,)) +Data_Cart = CartData(X,Y,Z,(Z=Z,)) # Project values of Data_set3D to the cartesian data Data_Cart = ProjectCartData(Data_Cart, Data_set3D, proj) @@ -26,12 +26,12 @@ Data_Cart = ProjectCartData(Data_Cart, Data_set3D, proj) # Next, 3D surface (like topography) Lon,Lat,Depth = LonLatDepthGrid(5:25,20:50,0); Depth = cos.(Lon/5).*sin.(Lat)*10 -Data_surf = GeoData(Lon,Lat,Depth,(Z=Depth,)) -Data_surf_Cart = Convert2CartData(Data_surf, proj) +Data_surf = GeoData(Lon,Lat,Depth,(Z=Depth,)) +Data_surf_Cart = Convert2CartData(Data_surf, proj) # Cartesian surface X,Y,Z = XYZGrid(-500:10:500,-900:20:900,0); -Data_Cart = CartData(X,Y,Z,(Z=Z,)) +Data_Cart = CartData(X,Y,Z,(Z=Z,)) Data_Cart = ProjectCartData(Data_Cart, Data_surf, proj) @test sum(Value(Data_Cart.z)) ≈ 1858.2487019158766km @@ -42,7 +42,7 @@ Data_Cart = ProjectCartData(Data_Cart, Data_surf, proj) WE,SN,depth = XYZGrid(420000:1000:430000, 4510000:1000:4520000, 0); Data_surfUTM = UTMData(WE, SN, depth, 33, true, (Depth = WE,)); -Data_Cart = CartData(X,Y,Z,(Z=Z,)) 
+Data_Cart = CartData(X,Y,Z,(Z=Z,)) Data_Cart = ProjectCartData(Data_Cart, Data_surfUTM, proj) @test sum(Value(Data_Cart.z)) ≈ 0.0km diff --git a/test/test_utils.jl b/test/test_utils.jl index 18ae4445..c3cfdd7e 100644 --- a/test/test_utils.jl +++ b/test/test_utils.jl @@ -7,23 +7,23 @@ using GeophysicalModelGenerator Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,-50km); Data1 = Depth*2; # some data Vx1,Vy1,Vz1 = Data1*3,Data1*4,Data1*5 -Data_set2D = GeoData(Lon,Lat,Depth,(Depthdata=Data1,LonData1=Lon, Velocity=(Vx1,Vy1,Vz1))) +Data_set2D = GeoData(Lon,Lat,Depth,(Depthdata=Data1,LonData1=Lon, Velocity=(Vx1,Vy1,Vz1))) @test_throws ErrorException CrossSection(Data_set2D, Depth_level=-10) # Create 3D volume with some fake data Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,(-300:25:0)km); Data = Depth*2; # some data Vx,Vy,Vz = ustrip(Data*3),ustrip(Data*4),ustrip(Data*5); -Data_set3D = GeoData(Lon,Lat,Depth,(Depthdata=Data,LonData=Lon, Velocity=(Vx,Vy,Vz))) +Data_set3D = GeoData(Lon,Lat,Depth,(Depthdata=Data,LonData=Lon, Velocity=(Vx,Vy,Vz))) # Create 3D cartesian dataset -Data_setCart3D = CartData(Lon,Lat,Depth,(Depthdata=Data,LonData=Lon, Velocity=(Vx,Vy,Vz))) +Data_setCart3D = CartData(Lon,Lat,Depth,(Depthdata=Data,LonData=Lon, Velocity=(Vx,Vy,Vz))) # Create 3D volume with some fake data Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,(0:-25:-300)km); Data = Depth*2; # some data Vx,Vy,Vz = ustrip(Data*3),ustrip(Data*4),ustrip(Data*5); -Data_set3D_reverse = GeoData(Lon,Lat,Depth,(Depthdata=Data,LonData=Lon, Velocity=(Vx,Vy,Vz))) +Data_set3D_reverse = GeoData(Lon,Lat,Depth,(Depthdata=Data,LonData=Lon, Velocity=(Vx,Vy,Vz))) # Create cross-sections in various directions (no interpolation which is default) test_cross = CrossSection(Data_set3D, Depth_level=-100km) @@ -105,16 +105,16 @@ Data_pert = SubtractHorizontalMean(Data, Percentage=true) # 3D wi @test Data_pert[1000] == 0.0 Data2D = Data[:,1,:]; -Data_pert = SubtractHorizontalMean(Data2D, Percentage=true) # 2D version with 
units [dp the same along a vertical profile] +Data_pert = SubtractHorizontalMean(Data2D, Percentage=true) # 2D version with units [dp the same along a vertical profile] -Data_set2D = GeoData(Lon,Lat,Depth,(Depthdata=Data,LonData=Lon,Pertdata=Data_pert ,Velocity=(Vx,Vy,Vz))) +Data_set2D = GeoData(Lon,Lat,Depth,(Depthdata=Data,LonData=Lon,Pertdata=Data_pert ,Velocity=(Vx,Vy,Vz))) @test Data_set2D.fields[3][10,8,1] == 0 # Create surface ("Moho") Lon,Lat,Depth = LonLatDepthGrid(10:20,30:40,-40km); Depth = Depth + Lon*km; # some fake topography on Moho -Data_Moho = GeoData(Lon,Lat,Depth,(MohoDepth=Depth,LonData=Lon)) +Data_Moho = GeoData(Lon,Lat,Depth,(MohoDepth=Depth,LonData=Lon)) # Test intersecting a surface with 2D or 3D data sets @@ -140,7 +140,7 @@ Data_VoteMap = VoteMap(Data_set3D_reverse, "Depthdata<-560",dims=(10,10,10)) @test Data_VoteMap.fields[:VoteMap][101]==0 @test Data_VoteMap.fields[:VoteMap][100]==1 -# Combine 2 datasets +# Combine 2 datasets Data_VoteMap = VoteMap([Data_set3D_reverse, Data_set3D], ["Depthdata<-560","LonData>19"],dims=(10,10,10)) @test Data_VoteMap.fields[:VoteMap][10,9,1]==2 @test Data_VoteMap.fields[:VoteMap][9 ,9,1]==1 @@ -160,4 +160,4 @@ Data_C1 = RotateTranslateScale(Data_C, Scale=10, Rotate=10, Translate=(1,2,3)); @test Data_C1.z.val[20] == -497.0 -# Test \ No newline at end of file +# Test diff --git a/tutorial/Alps_VpModel_Zhao_etal_JGR2016.jl b/tutorial/Alps_VpModel_Zhao_etal_JGR2016.jl index fa5de440..f6da4a90 100644 --- a/tutorial/Alps_VpModel_Zhao_etal_JGR2016.jl +++ b/tutorial/Alps_VpModel_Zhao_etal_JGR2016.jl @@ -1,4 +1,4 @@ -# This shows how to plot a 3D seismic tomography model +# This shows how to plot a 3D seismic tomography model # The paper that describes it is: # # Zhao, L., Paul, A., Malusà, M.G., Xu, X., Zheng, T., Solarino, S., Guillot, S., Schwartz, S., Dumont, T., Salimbeni, S., Aubert, C., Pondrelli, S., Wang, Q., Zhu, R., 2016. Continuity of the Alpine slab unraveled by high-resolution P wave tomography. 
Journal of Geophysical Research: Solid Earth 121, 8720–8737. doi:10.1002/2016JB013310 @@ -28,8 +28,8 @@ dVp_perc_3D = reshape(dVp_perc, resolution); Data_set = GeoData(Lon,Lat,Depth,(dVp_Percentage=dVp_perc_3D,)) Write_Paraview(Data_set, "Zhao_etal_2016_dVp_percentage") -# extract cross-sections -Data_cross = CrossSection(Data_set, Depth_level=-100km) +# extract cross-sections +Data_cross = CrossSection(Data_set, Depth_level=-100km) Write_Paraview(Data_cross, "Zhao_CrossSection_100km") Data_cross = CrossSection(Data_set, Lon_level=10) diff --git a/tutorial/Lippitsch_Screenshots.jl b/tutorial/Lippitsch_Screenshots.jl index 19936c26..7efbde5a 100644 --- a/tutorial/Lippitsch_Screenshots.jl +++ b/tutorial/Lippitsch_Screenshots.jl @@ -6,20 +6,20 @@ # # For convenience we created a zip file with all cross-sections and mapviews here: # -# +# using GeophysicalModelGenerator # Process cross-sections of Figure 13. Note that we estimated some of the lon/lat locations - + data_Fig13a = Screenshot_To_GeoData("Lippitsch_Fig13a.png",( 4.65,45.73, -400.0), (17.23, 43.80, 0.0)) -Write_Paraview(data_Fig13a, "Lippitsch_Fig13a") - +Write_Paraview(data_Fig13a, "Lippitsch_Fig13a") + data_Fig13b = Screenshot_To_GeoData("Lippitsch_Fig13b.png",( 5.51,51.53, -400.0), (12.04, 43.68 , 0.0)) -Write_Paraview(data_Fig13b, "Lippitsch_Fig13b") - +Write_Paraview(data_Fig13b, "Lippitsch_Fig13b") + data_Fig13c = Screenshot_To_GeoData("Lippitsch_Fig13c.png",(17.78,50.95, -400.0), (11.66, 43.68, 0.0)) -Write_Paraview(data_Fig13c, "Lippitsch_Fig13c") +Write_Paraview(data_Fig13c, "Lippitsch_Fig13c") # Mapview images Corner_LowerLeft = ( 3.5, 43.0 , -150.0) @@ -27,7 +27,7 @@ Corner_UpperRight = (15.5, 50.0 , -150.0) Corner_LowerRight = (15.5, 43.0 , -150.0) Corner_UpperLeft = (3.5 , 50.0 , -150.0) data_Fig13_map = Screenshot_To_GeoData("Fig13_mapview.png",Corner_LowerLeft, Corner_UpperRight, Corner_LowerRight=Corner_LowerRight,Corner_UpperLeft=Corner_UpperLeft) -Write_Paraview(data_Fig13_map, 
"Lippitsch_Fig13_mapview") +Write_Paraview(data_Fig13_map, "Lippitsch_Fig13_mapview") Depth = -90.0; Corner_LowerLeft = (Corner_LowerLeft[1], Corner_LowerLeft[2], Depth) @@ -35,7 +35,7 @@ Corner_UpperRight = (Corner_UpperRight[1], Corner_UpperRight[2], Dept Corner_LowerRight = (Corner_LowerRight[1], Corner_LowerRight[2], Depth) Corner_UpperLeft = (Corner_UpperLeft[1], Corner_UpperLeft[2], Depth) data_Fig12_90km = Screenshot_To_GeoData("Fig12_90km.png",Corner_LowerLeft, Corner_UpperRight, Corner_LowerRight=Corner_LowerRight,Corner_UpperLeft=Corner_UpperLeft) -Write_Paraview(data_Fig12_90km, "Lippitsch_Fig12_90km") +Write_Paraview(data_Fig12_90km, "Lippitsch_Fig12_90km") Depth = -180.0; Corner_LowerLeft = (Corner_LowerLeft[1], Corner_LowerLeft[2], Depth) @@ -43,7 +43,7 @@ Corner_UpperRight = (Corner_UpperRight[1], Corner_UpperRight[2], Dept Corner_LowerRight = (Corner_LowerRight[1], Corner_LowerRight[2], Depth) Corner_UpperLeft = (Corner_UpperLeft[1], Corner_UpperLeft[2], Depth) data_Fig12_180km = Screenshot_To_GeoData("Fig12_180km.png",Corner_LowerLeft, Corner_UpperRight, Corner_LowerRight=Corner_LowerRight,Corner_UpperLeft=Corner_UpperLeft) -Write_Paraview(data_Fig12_180km, "Lippitsch_Fig12_180km") +Write_Paraview(data_Fig12_180km, "Lippitsch_Fig12_180km") Depth = -300.0; Corner_LowerLeft = (Corner_LowerLeft[1], Corner_LowerLeft[2], Depth) @@ -51,7 +51,7 @@ Corner_UpperRight = (Corner_UpperRight[1], Corner_UpperRight[2], Dept Corner_LowerRight = (Corner_LowerRight[1], Corner_LowerRight[2], Depth) Corner_UpperLeft = (Corner_UpperLeft[1], Corner_UpperLeft[2], Depth) data_Fig12_300km = Screenshot_To_GeoData("Fig12_300km.png",Corner_LowerLeft, Corner_UpperRight, Corner_LowerRight=Corner_LowerRight,Corner_UpperLeft=Corner_UpperLeft) -Write_Paraview(data_Fig12_300km, "Lippitsch_Fig12_300km") +Write_Paraview(data_Fig12_300km, "Lippitsch_Fig12_300km") Depth = -400.0; Corner_LowerLeft = (Corner_LowerLeft[1], Corner_LowerLeft[2], Depth) @@ -59,14 +59,13 @@ Corner_UpperRight 
= (Corner_UpperRight[1], Corner_UpperRight[2], Dept Corner_LowerRight = (Corner_LowerRight[1], Corner_LowerRight[2], Depth) Corner_UpperLeft = (Corner_UpperLeft[1], Corner_UpperLeft[2], Depth) data_Fig12_400km = Screenshot_To_GeoData("Fig12_400km.png",Corner_LowerLeft, Corner_UpperRight, Corner_LowerRight=Corner_LowerRight,Corner_UpperLeft=Corner_UpperLeft) -Write_Paraview(data_Fig12_400km, "Lippitsch_Fig12_400km") +Write_Paraview(data_Fig12_400km, "Lippitsch_Fig12_400km") # Example of how we can save this to a multibock *.VTM file (which allows you to open all files @ once in paraview) vtmfile = vtk_multiblock("Lippitsch_CrossSections") -Write_Paraview(data_Fig12_90km, vtmfile) -Write_Paraview(data_Fig12_180km, vtmfile) -Write_Paraview(data_Fig12_300km, vtmfile) -Write_Paraview(data_Fig12_400km, vtmfile) +Write_Paraview(data_Fig12_90km, vtmfile) +Write_Paraview(data_Fig12_180km, vtmfile) +Write_Paraview(data_Fig12_300km, vtmfile) +Write_Paraview(data_Fig12_400km, vtmfile) vtk_save(vtmfile) - diff --git a/tutorial/MeRe_ElSharkawy.jl b/tutorial/MeRe_ElSharkawy.jl index 4fefc6fc..ab73dec4 100644 --- a/tutorial/MeRe_ElSharkawy.jl +++ b/tutorial/MeRe_ElSharkawy.jl @@ -1,4 +1,4 @@ -# This shows how to +# This shows how to # You will need to download the data from: # https://www.seismologie.ifg.uni-kiel.de/en/research/research-data/mere2020model. 
@@ -27,7 +27,7 @@ for iz=1:size(Depth,3) coord = PointSet([lon[ind]'; lat[ind]']) Geo = georef((Vs=Vs[ind],), coord) P = EstimationProblem(Geo, Cgrid, :Vs) - S = IDW(:Vs => (distance=Euclidean(),neighbors=2)); + S = IDW(:Vs => (distance=Euclidean(),neighbors=2)); sol = solve(P, S) sol_Vs= values(sol).Vs Vs_2D = reshape(sol_Vs, size(domain(sol))) @@ -35,5 +35,5 @@ for iz=1:size(Depth,3) end # Save data to paraview: -Data_set = GeoData(Lon,Lat,Depth,(Vs_km_s=Vs_3D,)) -Write_Paraview(Data_set, "MeRe_ElSharkawy") \ No newline at end of file +Data_set = GeoData(Lon,Lat,Depth,(Vs_km_s=Vs_3D,)) +Write_Paraview(Data_set, "MeRe_ElSharkawy") diff --git a/tutorial/MohoTopo_Spada.jl b/tutorial/MohoTopo_Spada.jl index 09e51c5d..c44ec731 100644 --- a/tutorial/MohoTopo_Spada.jl +++ b/tutorial/MohoTopo_Spada.jl @@ -14,17 +14,17 @@ lon, lat, depth = data[:,1], data[:,2], -data[:,3]; data_Moho1 = GeoData(lon,lat,depth,(MohoDepth=depth*km,)) # Write as data points: -Write_Paraview(data_Moho1, "Spada_Moho_Europe", PointsData=true) +Write_Paraview(data_Moho1, "Spada_Moho_Europe", PointsData=true) # Do the same with the other Moho's: data =readdlm("Moho_Map_Data-WesternAlps-SpadaETAL2013_Moho2.txt",' ',Float64,'\n', skipstart=38,header=false); lon, lat, depth = data[:,1], data[:,2], -data[:,3]; Tutorial_MohoSpada_LonLat_Paraview = GeoData(lon,lat,depth,(MohoDepth=depth*km,)) -Write_Paraview(data_Moho2, "Spada_Moho_Adria", PointsData=true) +Write_Paraview(data_Moho2, "Spada_Moho_Adria", PointsData=true) data =readdlm("Moho_Map_Data-WesternAlps-SpadaETAL2013_Moho3.txt",' ',Float64,'\n', skipstart=38,header=false); lon, lat, depth = data[:,1], data[:,2], -data[:,3]; data_Moho3 = GeoData(lon,lat,depth,(MohoDepth=depth*km,)) -Write_Paraview(data_Moho3, "Spada_Moho_Tyrrhenia", PointsData=true) +Write_Paraview(data_Moho3, "Spada_Moho_Tyrrhenia", PointsData=true) # Combine all data points into one data set: lon = [data_Moho1.lon.val; data_Moho2.lon.val; data_Moho3.lon.val]; @@ -44,5 +44,4 @@ 
for i=1:length(idxs) Depth[i] = depth[idxs[i]][1] end data_Moho = GeoData(Lon, Lat, Depth, (MohoDepth=Depth,)) -Write_Paraview(data_Moho, "Spada_Moho_combined") - +Write_Paraview(data_Moho, "Spada_Moho_combined") diff --git a/tutorial/Tutorial_CSV.jl b/tutorial/Tutorial_CSV.jl index 4bb6a21b..8881952e 100644 --- a/tutorial/Tutorial_CSV.jl +++ b/tutorial/Tutorial_CSV.jl @@ -1,5 +1,5 @@ """ -This is Tutorial_CSV.jl. It contains all the necessary commands to import a CSV file from a seismic tomography, +This is Tutorial_CSV.jl. It contains all the necessary commands to import a CSV file from a seismic tomography, to convert it to the GeoData format and to export it to a Paraview format. The following steps are performed: 1. Read data from this file. 2. Put the data in a GeoData format (this is the format that is used internally in the GMG). @@ -36,7 +36,7 @@ for iz=1:size(Depth,3) coord = PointSet([lon[ind]'; lat[ind]']) Geo = georef((Vs=Vs[ind],), coord) P = EstimationProblem(Geo, Cgrid, :Vs) - S = IDW(:Vs => (distance=Euclidean(),neighbors=2)); + S = IDW(:Vs => (distance=Euclidean(),neighbors=2)); sol = solve(P, S) sol_Vs= values(sol).Vs Vs_2D = reshape(sol_Vs, size(domain(sol))) diff --git a/tutorial/Tutorial_Flegrei.jl b/tutorial/Tutorial_Flegrei.jl index a9f8f940..d01a162f 100644 --- a/tutorial/Tutorial_Flegrei.jl +++ b/tutorial/Tutorial_Flegrei.jl @@ -2,8 +2,8 @@ This is Tutorial_Flegrei.jl. It contains all the necessary commands to import the data files from geophysical models at the volcano and represent them in cartesian and UTM coordinates. The following steps are performed: -1. Export earthquake data with a time label in both formats. -2. Export a velocity models in both formats. +1. Export earthquake data with a time label in both formats. +2. Export a velocity models in both formats. 3. Scale and interpolate a shear-wave model and export in both formats. 
You will need to download the zipped folder containing all files from: @@ -18,7 +18,7 @@ data_80s = readdlm("SeismicLocations/Seismicity_UTM_1983_1984.txt", ' data_00s = readdlm("SeismicLocations/Seismicity_UTM_2005_2016.txt", ' ', skipstart=0, header=false); # 1.1 create a single file and detine the timing and locations -data = vcat(data_80s, data_00s) +data = vcat(data_80s, data_00s) time = data[:,1]; WE = data[:,2]; SN = data[:,3]; diff --git a/tutorial/Tutorial_GPS.jl b/tutorial/Tutorial_GPS.jl index 84f9abd5..ccfcaa98 100644 --- a/tutorial/Tutorial_GPS.jl +++ b/tutorial/Tutorial_GPS.jl @@ -7,8 +7,8 @@ using DataFrames, CSV # Read in coordinates of the grids (not stations as they are given in a different reference frame) # -# Note: the Vz velocity is given on a fully regular grid; yet Ve/Vn only on on-land stations which makes this -# a bit tricky. +# Note: the Vz velocity is given on a fully regular grid; yet Ve/Vn only on on-land stations which makes this +# a bit tricky. # # The approach we take here is to first read in the Vz data points & reshape it to 2D matrixes # Next, we read the Ve/Vn data and add them to the Vz grid @@ -46,7 +46,7 @@ Vmagnitude = sqrt.(Ve.^2 + Vn.^2 + Vz.^2); # velocity magnitude in m # Finally, it would be nice to put the data points on the topography. # The elevation of the points is not given in the GPS dataset, so we use GMT to extract a topographic grid -# and interpolate the elevation on the GPS grid locations +# and interpolate the elevation on the GPS grid locations using GMT, Interpolations Elevation = gmtread("@earth_relief_01m.grd", limits=[3,17,42,50]); Lon_Topo,Lat_Topo,dummy = LonLatDepthGrid(Elevation.x[1:end-1],Elevation.y[1:end-1],0); diff --git a/tutorial/Tutorial_TopoGeological_PNG.jl b/tutorial/Tutorial_TopoGeological_PNG.jl index 09eef4c9..49949828 100644 --- a/tutorial/Tutorial_TopoGeological_PNG.jl +++ b/tutorial/Tutorial_TopoGeological_PNG.jl @@ -1,12 +1,12 @@ """ -This is Tutorial_TopoGeological_PNG.jl. 
It contains all the necessary commands to import the netCDF file from ETOPO1 and +This is Tutorial_TopoGeological_PNG.jl. It contains all the necessary commands to import the netCDF file from ETOPO1 and to convert it to the GeoData format. The, a png file with the geological map of the Alps. The following steps are performed: 1. Read data from this file. 2. Select a longitude and latitude range. 3. Put the data in a GeoData format (this is the format that is used internally in the GMG). 4. Export that data to a format readable by Paraview. -ETOPO1 data files used in this example can be downloaded here: +ETOPO1 data files used in this example can be downloaded here: [https://ngdc.noaa.gov/mgg/global/global.html](https://ngdc.noaa.gov/mgg/global/global.html) """ @@ -40,7 +40,7 @@ DataPNG = Screenshot_To_GeoData(filename_geo, Corner_LowerLeft, Corner_UpperRigh # 4. interpolate geological map data colors onto topo grid using nearest neighbor interpolation -# 4.1 set up the tree and determine nearest neighbors +# 4.1 set up the tree and determine nearest neighbors coord = [vec(DataPNG.lon.val)';vec(DataPNG.lat.val)']; kdtree = KDTree(coord; leafsize = 10); points = [vec(Lon)';vec(Lat)']; diff --git a/tutorial/Tutorial_TopoGeological_SHP.jl b/tutorial/Tutorial_TopoGeological_SHP.jl index 93659edc..28145450 100644 --- a/tutorial/Tutorial_TopoGeological_SHP.jl +++ b/tutorial/Tutorial_TopoGeological_SHP.jl @@ -1,12 +1,12 @@ """ -This is Tutorial_ETOPO1.jl. It contains all the necessary commands to import the netCDF file from ETOPO1, +This is Tutorial_ETOPO1.jl. It contains all the necessary commands to import the netCDF file from ETOPO1, to convert it to the GeoData format and to export it to a Paraview format. The following steps are performed: 1. Read data from this file. 2. Select a longitude and latitude range. 3. Put the data in a GeoData format (this is the format that is used internally in the GMG). 4. Export that data to a format readable by Paraview. 
-ETOPO1 data files used in this example can be downloaded here: +ETOPO1 data files used in this example can be downloaded here: [https://ngdc.noaa.gov/mgg/global/global.html](https://ngdc.noaa.gov/mgg/global/global.html) doi:10.7289/V5C8276M @@ -42,7 +42,7 @@ tectunits = unique(table.tect_unit); # get tectonic units for ishape = 1:length(table) phase = findall(==(0),cmp.(table.tect_unit[ishape], tectunits)); # assign a certain phase to the respective tectonic unit - + #xv = zeros(length(table.geometry[ishape].points),1); # preallocate the vector to hold the different points #yv = zeros(length(table.geometry[ishape].points),1); # preallocate the vector to hold the different points xv = Vector{Float64}(undef,length(table.geometry[ishape].points)); @@ -52,7 +52,7 @@ tectunits = unique(table.tect_unit); # get tectonic units xv[ipoint] = table.geometry[ishape].points[ipoint].x; yv[ipoint] = table.geometry[ishape].points[ipoint].y; end - + xa = vec(Lon); ya = vec(Lat); @@ -60,13 +60,13 @@ tectunits = unique(table.tect_unit); # get tectonic units points = Point.(xa,ya); inside = [isinside(p, polygon; allowonedge=true) for p in points] - + inside = inside .& (vec(TectUnitData).==0); TectUnitData[inside] .= phase; end - - + + # Set up the Data structure Data_set = GeoData(Lon, Lat, Depth, (Topography=Depth*km,tect_unit=TectUnitData)) diff --git a/tutorial/Tutorial_netCDF.jl b/tutorial/Tutorial_netCDF.jl index 4abe22df..04ee6edb 100644 --- a/tutorial/Tutorial_netCDF.jl +++ b/tutorial/Tutorial_netCDF.jl @@ -1,11 +1,11 @@ """ -This is Tutorial_netCDF.jl. It contains all the necessary commands to import a netCDF file from a seismic tomography, +This is Tutorial_netCDF.jl. It contains all the necessary commands to import a netCDF file from a seismic tomography, to convert it to the GeoData format and to export it to a Paraview format. The following steps are performed: 1. Read data from this file. 2. 
Put the data in a GeoData format (this is the format that is used internally in the GMG). 3. Export that data to a format readable by Paraview. -The data file used in this example can be downloaded here: +The data file used in this example can be downloaded here: [https://ds.iris.edu/files/products/emc/emc-files/El-Sharkawy-etal-G3.2020-MeRE2020-Mediterranean-0.0.nc](https://ds.iris.edu/files/products/emc/emc-files/El-Sharkawy-etal-G3.2020-MeRE2020-Mediterranean-0.0.nc) """ @@ -16,13 +16,13 @@ filename = "./El-Sharkawy-etal-G3.2020-MeRE2020-Mediterranean-0.0.nc" # this onl # 2. load desired data -# Now check with ncinfo(filename), what the variables are called exactly and what the contents of your netCDF file are +# Now check with ncinfo(filename), what the variables are called exactly and what the contents of your netCDF file are lat = ncread(filename,"latitude"); lon = ncread(filename,"longitude"); depth = ncread(filename,"depth"); vs = ncread(filename,"Vs"); -depth = -1 .* depth # CAREFUL: MAKE SURE DEPTH IS NEGATIVE, AS THIS IS THE ASSUMTPION IN GeoData +depth = -1 .* depth # CAREFUL: MAKE SURE DEPTH IS NEGATIVE, AS THIS IS THE ASSUMPTION IN GeoData # For netCDF data, 3D coordinates of a regular grid are only given as 1D vectors. As we need to compute Cartesian coordinates for # Paraview, we need the full matrix of grid coordinates @@ -33,4 +33,3 @@ Data_set = GeoData(Lon3D,Lat3D,Depth3D,(VS=vs,)) # Export the data structure to Paraview format Write_Paraview(Data_set, "test_netcdf_3D") -