ADCIRC User Guide
The ADCIRC (ADvanced CIRCulation) model is a system of computer programs often used in Coastal Engineering Storm Surge research for solving time dependent, free surface circulation and transport problems in two and three dimensions. These programs utilize the finite element method in space allowing the use of highly flexible, unstructured grids.
An Example ADCIRC Unstructured Triangular 2D Mesh for the Houston/Galveston Bay, TX Region
ADCIRC is often coupled with the wind wave model SWAN (Simulating WAves Nearshore), especially in storm-surge applications where wave radiation stress can have important effects on ocean circulation and vice versa. Typical research topics include:
- prediction of storm surge and flooding
- modeling tides and wind driven circulation
- larval transport studies
- near shore marine operations
- dredging feasibility and material disposal studies
The following user guide gives a brief overview of ADCIRC and how it and supporting programs can be run on DesignSafe.
ADCIRC Applications
ADCIRC is a suite of Fortran programs, for either parallel or serial execution. The main components are:
adcirc
- Non-parallelized (serial) version of ADCIRC. This version of the application is ideal for smaller simulations and runs on a single node on Frontera. Runtimes are subject to current wait times in the Frontera job queue.
- The serial adcirc program can also be run within DesignSafe via the ADCIRC Interactive VM.
padcirc
- Parallelized version of ADCIRC. This version of the application uses multiple compute nodes on TACC's Frontera or Lonestar6 HPC resources and is ideal for larger simulations. Runtimes are subject to current wait times in the HPC job queues.
- Within DesignSafe, padcirc simulations can be run within the ADCIRC Interactive VM, in the HPC JupyterHub, and via the TACC HPC queues on TACC's Frontera, Stampede3, and Lonestar6 HPC resources.
adcswan/padcswan
- Serial/parallelized versions of ADCIRC coupled with SWAN. The tightly coupled SWAN + ADCIRC paradigm allows both wave and circulation interactions to be solved on the same unstructured mesh, resulting in a more accurate and efficient solution technique.
- This version of the application uses multiple nodes on TACC's Frontera or Lonestar6 HPC resources and is ideal for larger simulations. Runtimes are subject to current wait times in the HPC job queues.
adcprep
- adcprep is a utility program that prepares input files for PADCIRC & PADCSWAN simulations. It partitions the mesh across the parallel processes and distributes the necessary input files, such as fort.15, fort.14, and fort.13, through a user-friendly interface.
- Note: this utility only needs to be run for the parallel versions of ADCIRC and ADCIRC+SWAN.
Along with the above programs, utilities commonly used in conjunction with ADCIRC include:
- FigureGen - A Fortran program for visualizing ADCIRC inputs and outputs over the grid. It has a variety of capabilities and can be run within DesignSafe as a stand-alone app or through the Interactive ADCIRC VM. See the FigureGen documentation for more information.
- Kalpana - A Python package for visualizing ADCIRC inputs/outputs and converting them into shapefiles and Google KMZ files for visualization in QGIS. Kalpana can also be run through the Interactive ADCIRC VM or as a standalone application. See the Kalpana documentation for more information.
Decision Matrix for ADCIRC Applications
Deciding which DesignSafe application to run depends on your problem domain and size. In general, the serial adcirc application is only used for testing and benchmarking, as most problems of interest require large grids. The easiest way to determine the size of your ADCIRC problem is to measure it in terms of the number of finite elements in your grid. This can be found at the top of the fort.14 file (see input files for more information):
$ head -n 5 fort.14
Quarter Annular Grid - Example 1 ! ALPHANUMERIC DESCRIPTOR FOR GRID FILE
96 63 ! NE,NP - NUMBER OF ELEMENTS AND NUMBER OF NODAL POINTS
1 60960.0 0.0 3.0480 ! NODE NO., X, Y, DEPTH
2 76200.0 0.0 4.7625
3 91440.0 0.0 6.8580
The Quarter Annular Grid example can be found in the CommunityData folder at `CommunityData/Use Case Products/ADCIRC/adcirc/adcirc_quarterannular-2d`.
For example, for the common benchmark test case involving a hypothetical quarter annular grid, we can see that the problem size is 96 finite elements. This test can easily be run using the serial adcirc version, and no prior adcprep run is required.
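If you prefer to read these counts programmatically, the following is a minimal Python sketch (the two-line header layout follows the fort.14 excerpt above; `read_fort14_size` is a hypothetical helper, not part of ADCIRC):

```python
def read_fort14_size(lines):
    """Return (num_elements, num_nodes) from the first lines of a fort.14 file.

    Line 1 is a free-form grid description; line 2 holds NE and NP.
    """
    ne, np_ = lines[1].split()[:2]
    return int(ne), int(np_)

# Header of the quarter annular example shown above
header = [
    "Quarter Annular Grid - Example 1",
    "96 63",
]
print(read_fort14_size(header))  # (96, 63)
```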
For cases using parallel processing, the main deciding factor is how many parallel processes to use.
Scaling studies have shown that targeting about 2000 nodes per process is ideal for these scenarios.
Thus the following table can be helpful for deciding where and when to run each application.
# Elements | ADCPREP? | # Nodes per Process | Sequential ADCIRC | Parallel ADCIRC | SWAN + ADCIRC |
---|---|---|---|---|---|
< 1000 | No | 1 | ✅ | 🔶 1 | 🔶 2 |
1000 - 1 million | Yes | 2000 | ❌ | ✅ | ✅ |
> 1 million | Yes 3 | 2000 | ❌ | ✅ 4 | ✅ |
- ✅: Recommended for this scenario.
- ❌: Not recommended or not yet available.
- 🔶: Viable under certain conditions or for certain job sizes. Please refer to footnotes.
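The ~2,000-nodes-per-process guideline from the scaling studies above can be turned into a quick sizing calculation. Here is a minimal Python sketch (the helper name is illustrative; 3,070 is the node count of the Shinnecock Inlet example used later in this guide):

```python
import math

def recommended_processes(num_mesh_nodes, target_nodes_per_process=2000):
    """Suggest a process count so each process owns ~2,000 mesh nodes."""
    return max(1, math.ceil(num_mesh_nodes / target_nodes_per_process))

print(recommended_processes(63))         # 1 -> serial adcirc is fine
print(recommended_processes(3070))       # 2 -> small padcirc run
print(recommended_processes(1_000_000))  # 500
```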
ADCIRC On DesignSafe
DesignSafe offers a variety of platforms on which to run and test ADCIRC-related applications. Behind the scenes, everything is powered by the Tapis API, which connects the compute resources with the analysis environments. In the context of running ADCIRC, the two important things to keep in mind are (1) where the computation is running and (2) through what interface you interact with that compute platform. At a high level, the compute platforms DesignSafe offers, from most powerful to least, are:
- High Performance Computing (HPC) Job Queues - Job queues configured to handle jobs requiring multiple compute nodes, with GPU compute nodes also available. ADCIRC can be run on HPC job queues in the following ways:
- HPC ADCIRC Applications - By using the pre-configured HPC applications, either via the Web Portal or through the Tapis API. These HPC applications run pre-built versions of ADCIRC on inputs that you can upload to your MyData on DesignSafe.
- HPC Jupyter Instances - These are Jupyter images running on an HPC queue and can provide GPU support. For the moment, no native ADCIRC applications are supported in the HPC Jupyter instances, but ADCIRC can be installed in these environments.
- TACC - By requesting a specific allocation on TACC. This is usually done if more resources are required for larger runs. Please open a ticket if your use case requires more than the resources provided by the pre-configured HPC applications. For more information on requesting HPC allocations, please refer to the HPC Allocations documentation.
- JupyterHub Images - These run on dedicated VMs, so they can handle more computation, but not as much as the HPC job queues, which have access to multiple nodes for massively parallel jobs.
- Interactive VMs - These run on shared VM resources, and therefore handle the lightest form of computations. The Interactive VM is launched from the web-portal, and offers a convenient environment for testing ADCIRC applications before running in a production environment.
ADCIRC Through the Interactive VM
The Interactive VM is a Docker image running on a shared VM, with ADCIRC and supporting utilities pre-built for easy testing and development of ADCIRC-related applications within the DesignSafe environment.
Advantages and Disadvantages of Interactive ADCIRC VM
A few advantages of using the ADCIRC VM include:
- No queue wait time - Don't have to wait in the HPC queue to test input files.
- Pre-compiled versions of ADCIRC and supporting utilities such as FigureGen and Kalpana.
- Convenient JupyterLab interface, with plugins for GitHub repo management, code formatting, and more.
Disadvantages include:
- VM runs on a shared resource - can be slow if many users are on the VM at once.
- Limited compute power - To simulate hurricanes at high fidelity, ADCIRC needs to run on very large grids, which may take too long in the VM. Furthermore, the memory requirements for plotting and visualizing the grids and associated data may be too large for the Interactive VM.
Overall, the Interactive VM is meant to be a testing and learning environment. It is ideal for configuring and testing smaller versions of large jobs before submitting to the HPC queue, to verify inputs/outputs are configured correctly for ADCIRC and supporting programs.
Getting Started
You can access the interactive VM via the DesignSafe-CI workspace by selecting "Workspace" > "Tools & Applications" > "Simulation" > "ADCIRC" > Select "Jupyter" > "Interactive VM for ADCIRC" to start the interactive VM.
Selecting the Interactive VM for ADCIRC
The Interactive VM will spawn a JupyterLab instance for you on a shared VM outside the HPC queues, so wait time should be minimal (although it may be a little longer if it's your first time). After your job goes into the "Running" stage, a dialogue box should prompt you to connect to your instance.
Once Job is running and window appears, click on Connect
Once you click connect you should see a familiar JupyterLab interface:
Jupyter Lab Interface Launcher screen provides Kalpana kernel and terminal environment with ADCIRC and FigureGen
Example - Running an ADCIRC simulation
One way to run an ADCIRC simulation in the interactive VM is via the linux terminal available from the Jupyter Lab interface. Open a new terminal from the launcher window, navigate to your MyData directory and create a new directory for your ADCIRC run. We copy into this directory some example ADCIRC input files corresponding to the ADCIRC Shinnecock Inlet test case (see ADCIRC data available on DesignSafe for more example cases).
cd ~/work/MyData
cp -r ~/work/CommunityData/Use\ Case\ Products/ADCIRC/adcirc/adcirc_shinnecock_inlet .
Serial Run
Now from within this directory we can run the code in serial by simply running the adcirc command from the root directory containing the ADCIRC input files.
cd ~/work/MyData/adcirc_shinnecock_inlet
adcirc
You should see an output similar to:
Example ADCIRC output indicating the max elevation and maximum water velocity values and location at each time step.
Note the outputs are created in the same directory as the inputs (see left folder bar in Jupyter Lab interface).
Parallel Run
To run the same simulation in parallel, we must first run adcprep to prep the files for a parallel run. If we want to run the same simulation with four parallel processes, we must (from a clean simulation directory) run adcprep twice.
cd ~/work/MyData
cp -r ~/work/CommunityData/Use\ Case\ Products/ADCIRC/adcirc/adcirc_shinnecock_inlet .
adcprep
Note adcprep is an interactive program.
Example of running adcprep the first time to partition the mesh.
On the first run, you want to partition the mesh, entering in order:
- Number of processes for the parallel run - Be careful that it does not exceed the number of processes available for an mpi run.
- Action to perform - Option 1 on the first run to partition the mesh, which must be done first.
- Name of the fort.14 file - In our case, the default name fort.14.
After partitioning the mesh, a partmesh.txt and a metis_graph.txt file should be created.
On the second run, you will input, in the following order:
- Number of processes for the parallel run - Be careful that it does not exceed the number of processes available for an mpi run.
- Action to perform - Option 2 on the second run to prep the rest of the input files.
Example of running adcprep the second time to prep individual PE run directories.
Note how after the second adcprep run, PE* directories are created for the input/output files corresponding to each individual process.
After both runs of adcprep, padcirc can now be run. Note that it must be launched using the mpirun command, specifying the number of processes.
mpirun -np 4 padcirc
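Because adcprep reads its answers from standard input, the two interactive passes above can also be scripted rather than typed. Here is a minimal Python sketch (the answer order mirrors the interactive session described above; `adcprep_stdin` is an illustrative helper, and actually running the commented driver requires adcprep on your PATH inside the run directory):

```python
import subprocess

def adcprep_stdin(num_processes, step, fort14="fort.14"):
    """Answers for one interactive adcprep pass, newline-separated.

    step 1 partitions the mesh (asks for process count, action, grid file);
    step 2 preps the remaining inputs (asks for process count and action only).
    """
    answers = [str(num_processes), str(step)]
    if step == 1:
        answers.append(fort14)
    return "\n".join(answers) + "\n"

# Hypothetical driver for a 4-process run (uncomment inside the run directory):
# subprocess.run(["adcprep"], input=adcprep_stdin(4, 1), text=True, check=True)
# subprocess.run(["adcprep"], input=adcprep_stdin(4, 2), text=True, check=True)
print(adcprep_stdin(4, 1))
```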
Running ADCIRC on HPC Resources
ADCIRC can be run on HPC resources at TACC via DesignSafe's pre-configured HPC applications. Currently these are all configured to run only on TACC's Frontera supercomputer.
App ID | App Name |
---|---|
adcirc_netcdf_55_Frontera-55.01u4 | ADCIRC-V55 (Frontera) |
padcirc_swan-net_frontera_v55-55.00u4 | PADCIRC SWAN (Frontera) - V55 |
padcirc-frontera-55.01u4 | PADCIRC (Frontera) - V55 |
Note that while the web portal provides a convenient interface for submitting HPC jobs, the Tapis API provides a more programmatic way to interact with and launch jobs. The corresponding Tapis application IDs of the available web-portal apps are listed in the table above. We review how to run HPC jobs through both of these interfaces below.
Using the Web Portal
To run ADCIRC through the DesignSafe web portal:
- Select the appropriate ADCIRC application from the Simulation tab in the Workspace.
- Locate your Input Directory (Folder) with your input files that are in the Data Depot and follow the onscreen directions to enter this directory in the form.
- For the Parallel versions, enter your Mesh File into the form (usually fort.14 file).
- Enter a maximum job runtime in the form. See guidance on form for selecting a runtime.
- Enter a job name.
- Enter an output archive location or use the default provided.
- For the Parallel versions, select the number of nodes to be used for your job. Larger data files run more efficiently on higher node counts.
- Click Run to submit your job.
- Check the job status by clicking on the arrow in the upper right of the job submission form.
Using Tapis
Note: These instructions are for Tapis v2. See the Tapis v2 documentation for how to install the Tapis API and for more in-depth documentation. The steps below use the Tapis Command Line Interface (CLI) and assume you have authenticated with Tapis using the tapis auth init command. Note that DesignSafe's native JupyterHub environment comes with the Tapis API pre-installed, so the following can be run from within a regular Jupyter Analysis Environment from the Web Portal.
The same ADCIRC applications that are run through the front-end interface can be run via the Tapis v2 API.
For example, to view configurations for the PADCIRC (Frontera) V55 application, we can simply perform a tapis apps show command:
$ tapis apps show padcirc-frontera-55.01u4
+--------------------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Field | Value |
+--------------------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------+
| id | padcirc-frontera-55.01u4 |
| name | padcirc-frontera |
| version | 55.01 |
| revision | 4 |
| label | PADCIRC (Frontera) - V55 |
| lastModified | a year ago |
| shortDescription | Parallel ADCIRC is a computer program for solving systems of shallow water equations. |
| longDescription | PADCIRC is the parallel version of the ADCIRC which is optimized for enhanced performance on multiple computer nodes to run very large models. It includes |
| | MPI library calls to allow it to operate at high efficiency on parallel machines. |
| owner | ds_admin |
| isPublic | True |
| executionType | HPC |
| executionSystem | designsafe.community.exec.frontera |
| deploymentSystem | designsafe.storage.default |
| available | True |
| parallelism | PARALLEL |
| defaultProcessorsPerNode | 168 |
| defaultMemoryPerNode | 192 |
| defaultNodeCount | 3 |
| defaultMaxRunTime | 02:00:00 |
| defaultQueue | normal |
| helpURI | https://www.designsafe-ci.org/rw/user-guides/tools-applications/simulation/adcirc/ |
| deploymentPath | /applications/padcirc-frontera-55.01u4.zip |
| templatePath | wrapper-frontera.sh |
| testPath | test/test.sh |
| checkpointable | False |
| uuid | 4548497563320577555-242ac11b-0001-005 |
| icon | None |
+--------------------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------+
To get an example job .json config for submitting this job we can use the tapis jobs init
command:
$ tapis jobs init padcirc-frontera-55.01u4 > test_job.json
$ cat test_job.json
{
"name": "padcirc-frontera-job-1715717562412",
"appId": "padcirc-frontera-55.01u4",
"batchQueue": "normal",
"maxRunTime": "01:00:00",
"memoryPerNode": "192GB",
"nodeCount": 1,
"processorsPerNode": 168,
"archive": true,
"inputs": {
"inputDirectory": "agave://designsafe.storage.community/app_examples/adcirc/EC2001"
},
"parameters": {},
"notifications": [
{
"event": "*",
"persistent": true,
"url": "carlosd@tacc.utexas.edu"
}
]
}
Note how the input directory is a DesignSafe Agave URI (see the documentation on how to use Agave URIs).
Now modify the test job as follows:
- Change the queue to the development queue (to wait less time)
- Node Count to 1 and the processors per node to 40
- Runtime to 30 minutes
- Notifications email accordingly (note that it should default to the email address associated with your DesignSafe account).
The resulting json file should look like:
{
"name": "padcirc-frontera-job-1715717815835",
"appId": "padcirc-frontera-55.01u4",
"batchQueue": "development",
"maxRunTime": "00:30:00",
"memoryPerNode": "192GB",
"nodeCount": 1,
"processorsPerNode": 40,
"archive": true,
"inputs": {
"inputDirectory": "agave://designsafe.storage.community/app_examples/adcirc/EC2001"
},
"parameters": {},
"notifications": [
{
"event": "*",
"persistent": true,
"url": "carlosd@tacc.utexas.edu"
}
]
}
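The same edits can be applied programmatically instead of by hand, e.g. from a DesignSafe Jupyter notebook. Below is a short sketch (only the fields being changed are shown in the template dict, and `customize_job` is an illustrative helper, not part of the Tapis CLI):

```python
import json

def customize_job(job):
    """Return a copy of a job config with the development-queue settings applied."""
    job = dict(job)
    job["batchQueue"] = "development"   # shorter wait than the normal queue
    job["nodeCount"] = 1
    job["processorsPerNode"] = 40
    job["maxRunTime"] = "00:30:00"
    return job

# Fields from the `tapis jobs init` template, trimmed to the ones we change
template = {
    "appId": "padcirc-frontera-55.01u4",
    "batchQueue": "normal",
    "maxRunTime": "01:00:00",
    "nodeCount": 1,
    "processorsPerNode": 168,
}
print(json.dumps(customize_job(template), indent=2))
```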
Now to submit the job you can perform a tapis jobs submit
command, specifying the job config file just created:
$ tapis jobs submit -F test_job.json
fatal: not a git repository (or any of the parent directories): .git
+--------+------------------------------------------+
| Field | Value |
+--------+------------------------------------------+
| id | f6949b3e-a5c9-4c8d-b985-c4bfb4baccb4-007 |
| name | padcirc-frontera-job-1715717815835 |
| status | ACCEPTED |
+--------+------------------------------------------+
Note: You can ignore the "fatal" git repository error.
To view the status of your job, you can list all your jobs (using a head
command to get the first couple of jobs):
$ tapis jobs list | head -n 4
+------------------------------------------+--------------------------------------------------------------+----------+
| id | name | status |
+------------------------------------------+--------------------------------------------------------------+----------+
| f6949b3e-a5c9-4c8d-b985-c4bfb4baccb4-007 | padcirc-frontera-job-1715717815835 | RUNNING |
Note your job will be viewable from the front-end web interface as well:
Jobs submitted via the Tapis API will be visible on the web portal.
And to see the complete job config you can always run a tapis jobs show command:
$ tapis jobs show f6949b3e-a5c9-4c8d-b985-c4bfb4baccb4-007
+--------------------+----------------------------------------------------------------------------------------------------------------+
| Field | Value |
+--------------------+----------------------------------------------------------------------------------------------------------------+
| accepted | 2024-05-14T21:59:43.351Z |
| appId | padcirc-frontera-55.01u4 |
| appUuid | 4548497563320577555-242ac11b-0001-005 |
| archive | True |
| archiveOnAppError | False |
| archivePath | clos21/archive/jobs/job-f6949b3e-a5c9-4c8d-b985-c4bfb4baccb4-007 |
| archiveSystem | designsafe.storage.default |
| blockedCount | 0 |
| created | 2024-05-14T21:59:43.354Z |
| ended | 21 minutes ago |
| failedStatusChecks | 0 |
| id | f6949b3e-a5c9-4c8d-b985-c4bfb4baccb4-007 |
| lastStatusCheck | 21 minutes ago |
| lastStatusMessage | Transitioning from status ARCHIVING to FINISHED in phase ARCHIVING. |
| lastUpdated | 2024-05-14T22:01:22.494Z |
| maxHours | 0.5 |
| memoryPerNode | 192.0 |
| name | padcirc-frontera-job-1715717815835 |
| nodeCount | 1 |
| owner | clos21 |
| processorsPerNode | 40 |
| remoteEnded | 21 minutes ago |
| remoteJobId | 6316891 |
| remoteOutcome | FINISHED |
| remoteQueue | development |
| remoteStarted | 2024-05-14T22:00:05.629Z |
| remoteStatusChecks | 2 |
| remoteSubmitted | 22 minutes ago |
| schedulerJobId | None |
| status | FINISHED |
| submitRetries | 0 |
| systemId | designsafe.community.exec.frontera |
| tenantId | designsafe |
| tenantQueue | aloe.jobq.designsafe.submit.DefaultQueue |
| visible | True |
| workPath | /scratch1/05400/ds_apps/clos21/job-f6949b3e-a5c9-4c8d-b985-c4bfb4baccb4-007-padcirc-frontera-job-1715717815835 |
+--------------------+----------------------------------------------------------------------------------------------------------------+
Note: once the status of the job reaches the FINISHED state, you should be able to find the job outputs in the archive directory.
Here we see it is on the storage system designsafe.storage.default at the path clos21/archive/jobs/job-f6949b3e-a5c9-4c8d-b985-c4bfb4baccb4-007.
This path is in my MyData directory, which I can view from the front end to see my output ADCIRC files:
Outputs will be found in MyData within the DataDepot.
See the FigureGen and Kalpana documentation for info on how to visualize ADCIRC output files.
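The default archive location can be assembled from your username and the job id, which is handy when locating outputs from a script. A sketch follows (the pattern is inferred from the archivePath field in the example job above, so treat the helper as illustrative):

```python
def archive_path(username, job_id):
    """Default DesignSafe archive location for a Tapis v2 job.

    Pattern inferred from the archivePath shown in the job output above:
    <username>/archive/jobs/job-<job id>
    """
    return f"{username}/archive/jobs/job-{job_id}"

print(archive_path("clos21", "f6949b3e-a5c9-4c8d-b985-c4bfb4baccb4-007"))
```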
ADCIRC Reference
The section below provides a brief overview of more technical aspects of ADCIRC for quick reference. Links to supporting documentation are included or can be found below in the external documentation.
ADCIRC Command Line Options
adcirc
Command Line Options
Option | Description | Special Notes |
---|---|---|
-I INPUTDIR |
Set the directory for input files. | |
-O GLOBALDIR |
Set the directory for fulldomain output files. | |
-W NUM_WRITERS |
Dedicate NUM_WRITERS MPI processes to writing ascii output files. | Affects ascii formatted fort.63, fort.64, fort.73, and fort.74 files. |
adcprep
Command Line Options
Option | Description | Special Notes |
---|---|---|
--np NUM_SUBDOMAINS |
Decompose the domain into NUM_SUBDOMAINS subdomains. | Required for parallel computation. |
--partmesh |
Partition the mesh only, resulting in a partmesh.txt file. | Should be done first. Generates partmesh.txt for subdomain assignments. |
--prepall |
Decompose all ADCIRC input files using the partmesh.txt file. | Requires previous execution with --partmesh . Expects default input file names. |
adcprep
Runs
The usual workflow of running adcprep consists of two steps: (1) partitioning the mesh into subdomains that each core will work on, and (2) decomposing the other input files over the partitioned mesh.
Note that running adcprep
alone with no command line options will bring up an interactive menu.
Common adcprep
options used include:
- Partitioning Mesh Only
adcprep --partmesh --np 32
This command partitions the mesh into 32 subdomains, creating a partmesh.txt file.
- Preparing All Input Files
adcprep --prepall --np 32
Utilizes the previously created partmesh.txt file to decompose all input files into PE* subdirectories.
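Before launching padcirc, it can be worth sanity-checking that adcprep produced one subdomain directory per process. A small sketch (this assumes the zero-padded four-digit PE0000-style directory naming; the check itself is illustrative, not part of ADCIRC):

```python
import os
import re
import tempfile

def check_pe_dirs(run_dir, num_processes):
    """Verify one PExxxx subdirectory exists per parallel process."""
    pe_dirs = [d for d in os.listdir(run_dir) if re.fullmatch(r"PE\d{4}", d)]
    return len(pe_dirs) == num_processes

# Simulate the layout adcprep would leave behind for a 4-process run
with tempfile.TemporaryDirectory() as run_dir:
    for i in range(4):
        os.mkdir(os.path.join(run_dir, f"PE{i:04d}"))
    print(check_pe_dirs(run_dir, 4))  # True
```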
PADCIRC Runs
Some common options used when using PADCIRC are the following:
- Specifying Input/Output Directories
padcirc -I /path/to/input -O /path/to/output
Looks for input files in /path/to/input
and writes output files to /path/to/output
.
- Adjusting Writer Cores
padcirc -W 4
Dedicates 4 MPI processes to write ASCII output files.
For more information see - ADCIRC Webpage Documentation
ADCIRC Input Files
Input File Table Summary
Default File Name(s) | Description | Condition |
---|---|---|
fort.14 |
Grid and Boundary Information File | Required |
fort.15 |
Model Parameter and Periodic Boundary Condition File | Required |
fort.10 |
Passive Scalar Transport Input File | Conditional |
fort.11 |
Density Initial Condition Input File | Conditional |
fort.13 |
Nodal Attributes File | Conditional |
fort.19 |
Non-periodic Elevation Boundary Condition File | Conditional |
fort.20 |
Non-periodic, Normal Flux Boundary Condition File | Conditional |
fort.22 |
Meteorological Forcing Data | Conditional |
fort.200, ... |
Multiple File Meteorological Forcing Input | Conditional |
fort.23 |
Wave Radiation Stress Forcing File | Conditional |
fort.24 |
Self Attraction/Earth Load Tide Forcing File | Conditional |
fort.25 , 225/227 |
Ice Coverage Input Files | Conditional |
fort.35 |
Level of No Motion Boundary Condition Input | Conditional |
fort.36 |
Salinity Boundary Condition Input | Conditional |
fort.37 |
Temperature Boundary Condition Input | Conditional |
fort.38 |
Surface Temperature Boundary Values | Conditional |
fort.39 |
Salinity and Temperature River Boundary Values | Conditional |
fort.67 or fort.68 |
2DDI Hot Start Files | Conditional |
fort.141 |
Time Varying Bathymetry Input File | Conditional |
elev_stat.151 |
Elevation Station Location input file | Conditional |
vel_stat.151 |
Velocity Station Location input file | Conditional |
conc_stat.151 |
Concentration Station Location input file | Conditional |
met_stat.151 |
Meteorological Recording Station Location Input file | Conditional |
N/A | Time-Varying Weir Input File | Conditional |
N/A | Time Varying Weirs Schedule File | Conditional |
ADCIRC Output Files
ADCIRC Outputs Summary
Default File Name(s) | Description | Simulation Type |
---|---|---|
fort.6 | Screen Output | Always |
fort.16 | General Diagnostic Output | Always |
fort.33 | Iterative Solver ITPACKV 2D Diagnostic Output | Specific setting |
fort.41 | 3D Density, Temperature and/or Salinity at Specified Recording Stations | 3D simulation |
fort.42 | 3D Velocity at Specified Recording Stations | 3D simulation |
fort.43 | 3D Turbulence at Specified Recording Stations | 3D simulation |
fort.44 | 3D Density, Temperature and/or Salinity at All Nodes in the Model Grid | 3D simulation |
fort.45 | 3D Velocity at All Nodes in the Model Grid | 3D simulation |
fort.46 | 3D Turbulence at All Nodes in the Model Grid | 3D simulation |
fort.47 | Temperature Values at the Surface Layer | Specific setting |
fort.51 | Elevation Harmonic Constituents at Specified Elevation Recording Stations | Harmonic analysis |
fort.52 | Depth-averaged Velocity Harmonic Constituents at Specified Velocity Stations | Harmonic analysis |
fort.53 | Elevation Harmonic Constituents at All Nodes in the Model Grid | Harmonic analysis |
fort.54 | Depth-averaged Velocity Harmonic Constituents at All Nodes in the Model Grid | Harmonic analysis |
fort.55 | Harmonic Constituent Diagnostic Output | Harmonic analysis |
fort.61 | Elevation Time Series at Specified Elevation Recording Stations | Time series output |
fort.62 | Depth-averaged Velocity Time Series at Specified Velocity Recording Stations | Time series output |
fort.63 | Elevation Time Series at All Nodes in the Model Grid | Time series output |
fort.64 | Depth-averaged Velocity Time Series at All Nodes in the Model Grid | Time series output |
maxele.63, maxvel.63, maxwvel.63, maxrs.63, minpr.63 | Global Maximum and Minimum files for the Model Run | Specific setting |
fort.67, fort.68 | Hot Start Output | Restart capability |
fort.71 | Atmospheric Pressure Time Series at Specified Meteorological Recording Stations | Meteorological input |
fort.72 | Wind Velocity Time Series at Specified Meteorological Recording Stations | Meteorological input |
fort.73 | Atmospheric Pressure Time Series at All Nodes in the Model Grid | Meteorological input |
fort.74 | Wind Stress or Velocity Time Series at All Nodes in the Model Grid | Meteorological input |
fort.75 | Bathymetry Time Series at Specified Bathymetry Recording Stations | Specific setting |
fort.76 | Bathymetry Time Series at All Nodes in the Model Grid | Specific setting |
fort.77 | Time-varying weir output file | Specific structure |
fort.81 | Depth-averaged Scalar Concentration Time Series at Specified Concentration Recording Stations | Scalar transport |
fort.83 | Depth-averaged Scalar Concentration Time Series at All Nodes in the Model Grid | Scalar transport |
fort.90 | Primitive Weighting in Continuity Equation Time Series at All Nodes in the Model Grid | Specific setting |
fort.91 | Ice Coverage Fields at Specified Recording Stations | Ice modeling |
fort.93 | Ice Coverage Fields at All Nodes in the Model Grid | Ice modeling |
ADCIRC Examples
Quarter Annular Harbor with Tidal Forcing Example: ADCIRC Simulation Guide
The Quarter Annular Harbor is commonly used as a test case to assess the performance of finite element numerical schemes applied to the shallow water equations.
Problem Setup
The Quarter Annular Harbor problem features a domain that is a quarter of an annulus, bounded by land on three sides and an open ocean boundary. The setup includes:
- Inner radius (`r1`): 60,960 m
- Outer radius (`r2`): 152,400 m
- Bathymetry: varies quadratically from `h1` = 3.048 m at `r1` to `h2` = 19.05 m at `r2`
- Finite element grid: radial spacing of 15,240 m and angular spacing of 11.25 degrees
The problem's geometry tests the model's performance in both horizontal coordinate directions, with an emphasis on identifying spurious modes and numerical dissipation.
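The quadratic depth profile can be checked against the node depths in the fort.14 excerpt shown earlier in this guide. A short Python sketch (the form h(r) = h1 * (r / r1)**2 is inferred from the stated radii and depths, so treat it as illustrative):

```python
R1, H1 = 60960.0, 3.048    # inner radius (m) and depth there (m)
R2, H2 = 152400.0, 19.05   # outer radius (m) and depth there (m)

def depth(r):
    """Depth varying quadratically with radius: h(r) = h1 * (r / r1)**2."""
    return H1 * (r / R1) ** 2

# Matches the depths in the fort.14 excerpt: nodes at r = 60960, 76200, 91440 m
for r in (60960.0, 76200.0, 91440.0):
    print(r, depth(r))
```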
ADCIRC Inputs
Two primary input files are required:
- Grid and Boundary Information File (`fort.14`) - This file outlines the mesh configuration, including:
- Grid Information: 96 elements and 63 nodes.
- Nodal Information: Node number, horizontal coordinates, and depth.
- Elemental Information: Element number, nodes per element, and comprising node numbers.
- Boundary Conditions:
- Elevation specified boundary: 1 segment with 9 nodes (Node 7 to Node 63).
- Normal flow boundary: 1 segment with 21 nodes (Node 63 to Node 7).
- Model Parameter and Periodic Boundary Condition File (`fort.15`) - Specifies model parameters, including:
- Initialization: Cold started from a state of rest.
- Coordinate System: Cartesian.
- Nonlinearities: Finite amplitude, advection, and quadratic bottom friction.
- Forcings: No tidal potential or wind stress. Gravity in m/sĀ².
- Boundary Forcing: Sinusoidal elevation with a period of 44,712 s, amplitude of 0.3048 m, and phase of 0 degrees, ramped up over the first two days.
- Simulation Duration: 5 days with a time step of 174.656 s.
- Output Settings: Water level and velocity time series output at specified intervals and locations. Harmonic analysis of model elevation and velocity fields for the M2 constituent on the final day. Hot start files generated every 512 time steps.
ADCIRC Outputs
The simulation generates several output files, briefly summarized as follows:
- General Diagnostic Output (`fort.16`): Echoes input file information, ADCIRC processing data, and error messages.
- Iterative Solver Diagnostic (`fort.33`): Contains solver diagnostics, typically empty after successful runs.
- Harmonic Constituents:
  - Elevation at specified stations (`fort.51`).
  - Velocity at specified stations (`fort.52`).
  - Elevation at all nodes (`fort.53`).
  - Velocity at all nodes (`fort.54`).
- Time Series Output:
  - Elevation at specified stations (`fort.61`).
  - Velocity at specified stations (`fort.62`).
  - Elevation at all nodes (`fort.63`).
  - Velocity at all nodes (`fort.64`).
- Hot Start Files (`fort.67`, `fort.68`): Facilitate restarting simulations from specific states.
Running Example
This simulation example is best run from the ADCIRC Interactive VM.
- Start the ADCIRC Interactive VM.
- Copy the inputs from the CommunityData folder at `CommunityData/Use Case Products/ADCIRC/adcirc/adcirc_quarterannular-2d`.
- Execute ADCIRC, specifying the input files and any runtime options as needed.
References
- ADCIRC Website Examples
- Lynch, D.R. and W.G. Gray. 1979. A wave equation model for finite element tidal computations. Computers and Fluids. 7:207-228.
Shinnecock Inlet, NY with Tidal Forcing Example: ADCIRC Simulation Guide
This documentation outlines the procedure and details for setting up and running an ADCIRC simulation focused on the tidal hydrodynamics in the vicinity of Shinnecock Inlet, NY. This example derives from a study conducted at the U.S. Army Corps of Engineers Coastal Hydraulics Laboratory. It is commonly used as a test-case for ADCIRC releases.
Problem Setup
Shinnecock Inlet is a geographical feature located along the outer shore of Long Island, New York. The simulation utilizes a finite element grid to model the hydrodynamics in this area, reflecting the following characteristics:
- The grid's discretization varies from approximately 2 km offshore to around 75 m in nearshore areas.
- Due to the coarse resolution, this model does not accurately resolve circulation near the inlet and the back bay.
The input files for this simulation can be found in the CommunityData directory at `CommunityData/Use Case Products/ADCIRC/adcirc/adcirc_shinnecock_inlet`.
ADCIRC Input
- Grid and Boundary Information File (`fort.14`)

  This file defines the simulation's spatial domain, containing:

  - 5780 elements and 3070 nodes, detailing the mesh used for the simulation.
  - Nodal and elemental information, including node numbers, horizontal coordinates, depth, and elements' composition.
  - Boundary specifications:
    - An elevation specified open boundary with 75 nodes (from node 75 to node 1).
    - A normal flow mainland boundary with 285 nodes (from node 1 to node 75).

- Model Parameter and Periodic Boundary Condition File (`fort.15`)

  This file outlines the simulation's parameters:

  - Initialization from a state of rest (cold start).
  - Use of a longitude-latitude coordinate system.
  - Inclusion of nonlinearities such as finite amplitude (with elemental wetting and drying), advection, and hybrid bottom friction.
  - The model is forced using tidal potential terms and along the elevation boundary with 5 tidal constituents (M2, S2, N2, O1, K1), ramped up over the first two days.
  - The simulation duration is 5 days with a time step of 6 seconds.
  - Output of water level and velocity time series every 300 time steps (every half hour) at all nodes from day 3.8 to day 5. No harmonic output or hot start files are produced.
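A minimal reader for the node and element tables of the `fort.14` grid file can illustrate its structure, assuming the standard layout: a title line; a line with the element and node counts (NE, NP); NP node lines of `id x y depth`; then NE element lines of `id 3 n1 n2 n3`. Boundary blocks that follow are skipped in this sketch, and `read_fort14` is an illustrative helper, not an official utility.

```python
def read_fort14(path):
    """Parse the node and element tables of an ADCIRC fort.14 grid file.

    Returns (nodes, elements): nodes maps node id -> (x, y, depth);
    elements maps element id -> (n1, n2, n3). Open/land boundary
    blocks after the element table are ignored in this sketch.
    """
    nodes, elements = {}, {}
    with open(path) as f:
        f.readline()                                  # grid title line
        ne, np_nodes = map(int, f.readline().split()[:2])
        for _ in range(np_nodes):
            parts = f.readline().split()
            nodes[int(parts[0])] = (float(parts[1]), float(parts[2]),
                                    float(parts[3]))
        for _ in range(ne):
            parts = f.readline().split()
            elements[int(parts[0])] = tuple(int(p) for p in parts[2:5])
    return nodes, elements
```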
ADCIRC Output
The simulation generates the following output files:
- General Diagnostic Output (`fort.16`): Includes input file information, ADCIRC processing data, and error messages.
- Iterative Solver Diagnostic (`fort.33`): Contains diagnostic information from the iterative solver, typically empty upon successful completion.
- Elevation Time Series (`fort.63`): Outputs elevation time series at all nodes every 300 time steps.
- Depth-averaged Velocity Time Series (`fort.64`): Outputs velocity time series at all nodes every 300 time steps.
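Since this example produces no harmonic output, constituent amplitude and phase can be recovered offline from the time series by least squares. The sketch below fits a single constituent (here M2) under the convention eta(t) = A cos(wt - phi); `fit_constituent` is an illustrative stand-in for ADCIRC's built-in harmonic analysis, not part of the model.

```python
import math

M2_PERIOD = 44712.0  # M2 period in seconds (~12.42 hours)

def fit_constituent(times, eta, period=M2_PERIOD):
    """Least-squares fit of one tidal constituent to an elevation series.

    Fits eta(t) ~ a cos(w t) + b sin(w t) by solving the 2x2 normal
    equations, then returns (amplitude, phase_deg) for the convention
    eta(t) = A cos(w t - phi).
    """
    w = 2.0 * math.pi / period
    scc = scs = sss = sc = ss = 0.0
    for t, e in zip(times, eta):
        c, s = math.cos(w * t), math.sin(w * t)
        scc += c * c; scs += c * s; sss += s * s
        sc += e * c; ss += e * s
    det = scc * sss - scs * scs
    a = (sc * sss - ss * scs) / det
    b = (ss * scc - sc * scs) / det
    return math.hypot(a, b), math.degrees(math.atan2(b, a))
```

For example, applying this to the elevation at one node sampled every 1800 s (the `fort.63` output interval) over days 3.8 to 5 recovers the M2 amplitude and phase at that node.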
References
- ADCIRC Website Examples
- Militello, A., and Kraus, N. C. (2000). Shinnecock Inlet, New York, Site Investigation, Report 4, Evaluation of Flood and Ebb Shoal Sediment Source Alternatives for the West of Shinnecock Interim Project, New York. Technical Report CHL-98-32. U.S. Army Engineer Research and Development Center, Vicksburg, Mississippi.
- Morang, A. (1999). Shinnecock Inlet, New York, Site Investigation Report 1, Morphology and Historical Behavior. Technical Report CHL-98-32, US Army Engineer Waterways Experiment Station, Vicksburg, Mississippi.
- Williams, G. L., Morang, A., Lillycrop, L. (1998). Shinnecock Inlet, New York, Site Investigation Report 2, Evaluation of Sand Bypass Options. Technical Report CHL-98-32, US Army Engineer Waterways Experiment Station, Vicksburg, Mississippi.
ADCIRC Installation (Advanced)
For the advanced user, below is a guide on how to install ADCIRC locally. The instructions can be executed within a user's JupyterHub environment (HPC and non-HPC) to obtain a local install of ADCIRC. Note this is for advanced users only.
Spack ADCIRC Installation (DesignSafe JupyterHub)
The instructions below are for DesignSafe JupyterHub instances (non-HPC). They will allow you to test and run ADCIRC examples within a Jupyter session without having to use HPC resources.
Move into your MyData directory and clone the Spack repo. Note we put the Spack repo in MyData so that it persists across Jupyter sessions.
cd ~/MyData
git clone -c feature.manyFiles=true https://github.com/spack/spack.git ~/MyData/spack
After cloning Spack, initialize it with:
source ~/MyData/spack/share/spack/setup-env.sh
This needs to be run every time a new Jupyter terminal environment is spawned. To do this automatically, add the command to your ~/.bashrc
or alternatively, set up an alias:
alias spack-setup='source ~/MyData/spack/share/spack/setup-env.sh'
Next, clone the ADCIRC Spack repository and add it to Spack:
cd ~/MyData
git clone https://github.com/adcirc/adcirc-spack.git
spack repo add ~/MyData/adcirc-spack
Now to install ADCIRC:
spack install adcirc@main +swan +grib
Note: The installation above may take a long time!
To activate ADCIRC in your environment just run:
spack load adcirc
That should make the `padcirc`, `adcirc`, `adcprep`, and `padcswan` executables available in your path.
For more information on how to use Spack, see the Spack documentation. For more information on ADCIRC's Spack repository and build options, see the ADCIRC Spack Repository.
Resources and Documentation
The following sections provide further information on useful resources for using ADCIRC.
ADCIRC Data Hosted on DesignSafe
A wealth of ADCIRC-related data can already be found on DesignSafe, in both the CommunityData and Published Projects folders. The following are a few notable locations with data available for ADCIRC simulations. Note that you will most likely want to copy these files to MyData or your HPC work directory before using them, since the CommunityData and Published Projects directories are read-only, which can lead to issues when running jobs/notebooks from those directories.
- Community Data:
  - Use Case Products - `CommunityData/Use Case Products/ADCIRC/adcirc`
  - App Examples - `CommunityData/app_examples/`
- Notable ADCIRC Published Projects:
To see a full list of ADCIRC-related data in the Data Depot, search for `ADCIRC` in the keyword search bar.
External Documentation
There are a wide variety of ADCIRC resources on the web. Below are a few to help navigate and learn more about ADCIRC.
- ADCIRC GitHub Page - As of v54, the official central information hub for all things ADCIRC. Contains source code, utility programs, and issue tracking; a good place for developers and users interested in staying up to date with the latest developments in ADCIRC, along with bug fixes and issues. Useful links include:
- Issues - For reporting bugs, searching for common issues with ADCIRC, or asking questions/feature requests.
- Test Suite - Test suite of ADCIRC examples, used for testing new releases of ADCIRC.
- ADCIRC Official Website - The older primary source for all things ADCIRC, including model description, capabilities, and latest updates. Useful subpages include:
- Input File Descriptions/Output File Descriptions - Mostly correct for basic inputs/outputs, barring any changes since v54+.
- Parameter Definitions
- Example Problems
- ADCIRC Wiki - Out of date, but still contains some useful information.
Other ADCIRC Utilities and Libraries
The ADCIRC community is vast, with utility libraries being developed at different institutions around the world. Below we highlight a few other third-party ADCIRC utilities and libraries that are not currently supported on DesignSafe, but can be useful and may be supported in the future.
Do you have an ADCIRC utility or library you'd like to add to the list? Open a ticket to contribute to the user guide!
- For models with fewer than 1000 elements, parallel ADCIRC may not be necessary, but it can be used depending on the available computational resources. ↩
- For small models, SWAN + ADCIRC might be more resource-intensive than required unless specific wave dynamics need to be resolved. ↩
- For very large models, ADCPREP can take a significant amount of time due to the decomposition of large grids. It is recommended that this data be saved and reused when possible to avoid repeated decomposition. ↩
- For very large models with complex wave-current interactions, SWAN + ADCIRC in parallel is the recommended approach. ↩