Compile and Debug SWAT with gfortran and Eclipse

As a SWAT modeller and programmer, I naturally want to dig into the SWAT source code whenever questions come up. Better still, running SWAT live and inspecting the values of all the variables helps with troubleshooting. To do this, there are two options: the commercial Intel Fortran compiler and the free gfortran. It's hard to say which one is better. I chose gfortran for its price, freedom, cross-platform support and multi-language compiling ability. For some reason, not all SWAT source files follow the same standard, so compiling SWAT with gfortran is not as easy as with Intel Fortran. After some research, I worked out the best way to compile SWAT with gfortran. All the necessary steps are documented in a single user guide, in the hope that it will be helpful.

All the effort is recorded in the following posts. And here are some important links.

Compile and Debug SWAT with GFortran and Eclipse (Windows version)

https://docs.google.com/document/d/16Do2U1_v4mZZBOV0hmcs6Gh1UvAUAXOMidpB-SE203A/edit?usp=sharing

Makefile

https://drive.google.com/file/d/0B16YhFB_9MejSG15ai0zYS1fMkU/edit?usp=sharing

Makefile Generator

https://drive.google.com/file/d/0B16YhFB_9MejZkxiakdLTHhMVlU/edit?usp=sharing

Posts

February 28, 2014 Single Document about Compiling and Debugging SWAT with GFortran and Eclipse

February 18, 2014 Makefile Updated – 1) No Need to Modify main.f any more; 2) A Single Makefile; 3) 64-bit Build

February 2, 2014 SWAT Makefile Updated – Ignore Underflow

January 28, 2014 MinGW Installation Guide for SWAT Debugging

January 28, 2014 Debug SWAT in Eclipse – Utilize Makefile

January 24, 2014 SWAT Makefile Updated – Stop running when overflow happens

January 23, 2014 Makefile – Compile SWAT using gfortran without modification

January 5, 2014 Compile and Run SWAT with Photran + MinGW – the easiest way to remove mathematical errors from SWAT


ECReader – Environment Canada Climate Data Reader

ECReader (Environment Canada Climate Data Reader) is a .NET tool to download Canadian climate data from the Environment Canada website. It comes from the SWAT modelling project I'm working on. I was tired of downloading the data from the EC website one station after another, year by year, and then processing it into the file format ArcSWAT wants. It's no fun at all. As a programmer, I'd rather spend the same time making a tool that does all this work in just one click. It would help others too. So, please leave a comment or send me an email (hawklorry@gmail.com) if you need more functions. I would be happy to add them for you. Also, please follow me on WordPress or Facebook to get notified of any further development.

Program Package

https://drive.google.com/file/d/0B16YhFB_9MejSjE4RlQ2VjZLVlk/view?usp=sharing

Source Code

https://code.google.com/p/environment-canada-climate-data-reader/

Screenshot

The main features are listed below.

  • Download climate data for a batch of stations with just one click
  • A built-in station definition window to help locate stations of interest by name or location
  • Support for different output formats, including the SWAT-ready dbf/txt format, which can be used in ArcSWAT without any post-processing (only for precipitation and temperature)
  • Support for daily and hourly data
  • A map of all climate stations, which can also be used outside the tool. The attribute table gives the station ID, name and the availability of hourly, daily and monthly data. It's a good dataset for any research requiring climate data. It's available right in the tool, and in csv, shapefile and kmz formats for different environments.

Several posts have been published on this topic (listed below). New functions will be added shortly to help fill data gaps using nearby stations with the IDW (Inverse Distance Weighting) method. Lapse rates would also be considered for precipitation and temperature in this process.

December 12, 2013 Environment Canada Climate Data Reader

December 18, 2013 How to Get Environment Canada Climate Station ID From Name

December 28, 2013 Uncompleted Data Bug Fixed, Please Update

December 28, 2013 Environment Canada Climate Stations (Shapefile and KMZ)

December 31, 2013 6 More Columns Added to Environment Canada Climate Stations (Shapefile) to Help Check Data Availability

January 2, 2014 ECRearder 1.1 – New Climate Station Definition Window – No longer need to look up station IDs yourself

January 6, 2014 File Name Convention in ECReader1.1

January 9, 2014 Save and Load Defined Stations in ECReader1.1


Work Flow of Point Source and Reservoir Monthly/Daily Discharge File Generation

When reservoir outflow is simulated with measured daily/monthly outflow, a discharge file (.day/.mon) is generated. The file is overwritten when the simulation is set up with the given simulation period. For those who don't use ArcSWAT/SWAT Editor to run the model, remember to set up the simulation before running the model if there are reservoirs or point sources in your watershed and the daily/monthly discharge method is used.

The data flow of reservoir discharge data in ArcSWAT/SWAT Editor is described below. Point source discharge data (.DAT) follows exactly the same data flow.

1. Prepare the original discharge data in dbf/txt format

Daily

“DATE”, “RESOUTFLOW”

1/1/1978,0.00

1/2/1978,0.00

Monthly

“Year”,”Resout1″,”Resout2″,”Resout3″,”Resout4″,”Resout5″,”Resout6″,”Resout7″,”Resout8″,”Resout9″,”Resout10″,”Resout11″,”Resout12″

1990,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0

1991,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0

Note: The commas between the column names are a must. You may get an error message if they are missing.
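As a sketch, the daily file above can be generated with a few lines of Python (the column names and m/d/yyyy date layout are taken from the sample; the function name is just for illustration):

```python
from datetime import date, timedelta

def format_daily_discharge(start, flows):
    # Quoted header plus one "m/d/yyyy,flow" line per day, matching the
    # sample daily file shown above
    lines = ['"DATE","RESOUTFLOW"']
    for i, q in enumerate(flows):
        d = start + timedelta(days=i)
        lines.append("%d/%d/%d,%.2f" % (d.month, d.day, d.year, q))
    return "\n".join(lines) + "\n"

# format_daily_discharge(date(1978, 1, 1), [0.0, 0.0]) reproduces the
# two sample rows above
```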

2. Import the discharge data in ArcSWAT/SWAT Editor. The discharge data is imported into the timeseries table in the project mdb. Note that TSTypeID is 0 and the time range is the same as in the original discharge file. The data here should cover the future simulation period.

3. Re-write .Res/.Lwq, which re-writes all RES files and any MON/DAY files. Note that the data in the generated DAY file doesn't start from Jan 1, 1990 (on which the discharge is 0.07 m3/s), as the simulation period hasn't been set. ArcSWAT/SWAT Editor may use an arbitrary starting and ending date here to make the time range big enough (like from 1/1/1000 to 1/1/3001). This is not the final version used in the simulation.

.DAY

Daily Reservoir Outflow file: .day file Subbasin:40 9/25/2014 12:00:00 AM ArcSWAT 2012.10_1.15

0.00

0.00

0.00

0.00

0.00

0.00

0.00

0.00

0.00

4. Set up the simulation. The MON/DAY file is overwritten again to extract the discharge data that falls between the starting and ending dates from the timeseries table. Now the data in the DAY file starts from Jan 1, 1990.

.DAY

Daily Reservoir Outflow file: .day file Subbasin:40 9/25/2014 12:00:00 AM ArcSWAT 2012.10_1.15

0.07

0.07

0.07

0.07

0.07

0.07

0.07

0.07

0.07
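The extraction in step 4 can be sketched in Python: only the timeseries records that fall inside the simulation period end up in the final .day file (the function name and sample records are illustrative):

```python
from datetime import date

def extract_for_simulation(records, sim_start, sim_end):
    # records: (date, outflow) pairs from the project-mdb timeseries table.
    # Keep only the values inside the simulation period, in date order --
    # these are the values written to the final .day file.
    return [q for d, q in sorted(records) if sim_start <= d <= sim_end]

records = [(date(1989, 12, 31), 0.00),
           (date(1990, 1, 1), 0.07),
           (date(1990, 1, 2), 0.07)]
daily = extract_for_simulation(records, date(1990, 1, 1), date(1990, 12, 31))
# daily == [0.07, 0.07]
```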

Default CN2 in MGT

A default CN2 is defined in the mgt file, as shown in the following example.

49.00 | CN2: Initial SCS CN II value

Question 1: Where does this default CN2 come from?

The default CN2 comes from either the crop table or the urban table of SWAT2012.mdb. Four columns (CN2A, CN2B, CN2C, CN2D) are defined in the crop/urban table, giving the CN2 values for soils with hydrological groups A, B, C and D respectively. An HRU is a combination of a unique landuse, soil and slope. From the soil type, the hydrological group is obtained from the usersoil table. Then, depending on the landuse type, the CN2 value is read from the crop or urban table.

Urban is quite unique compared to crop. The default CN2 applies only to the pervious surface. For the impervious surface in an urban area, URBCN in the urban table is used. Its value is usually 98.
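The lookup can be sketched as follows. The table contents here are made-up illustration values, not actual SWAT2012.mdb data:

```python
# Illustrative fragments of the crop and usersoil tables; the numbers
# are NOT real SWAT2012.mdb values.
crop_table = {"AGRL": {"A": 67.0, "B": 77.0, "C": 83.0, "D": 87.0}}
usersoil = {"SOIL1": "B"}

def default_cn2(landuse, soil):
    group = usersoil[soil]             # hydrological group from usersoil
    return crop_table[landuse][group]  # CN2A..CN2D column for that group

# default_cn2("AGRL", "SOIL1") -> 77.0
```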

Question 2: What’s the impact of the default CN2 on model results?

The CN2 in the mgt file is the initial CN2 for the HRU. If CNOP is NOT defined in the plant/harvest/tillage operations, the curve number used in the infiltration calculation depends only on the default CN2 and the soil moisture. In this case, the default CN2 has a big impact on infiltration and surface runoff, and further on flow discharge. That's why this is usually the main calibrated parameter.

However, the default CN2 is replaced by CNOP if it's defined in any plant/harvest/tillage operation. Then the default CN2 is only the initial value and only has an impact before it's replaced. In this case, it's CNOP we should calibrate rather than CN2.

So, for scenarios that focus on tillage or crop change, it's very important to set CNOP.

Question 3: What if the default CN2 is 0?

The SWAT model checks the default CN2 and makes sure its value is between 35 and 98. See the following code from readmgt.f.

if (cn2(ihru) <= 35.0) cn2(ihru) = 35.0

if (cn2(ihru) >= 98.0) cn2(ihru) = 98.0
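The same bounds check can be written in Python as a one-line clamp (a sketch, not SWAT code):

```python
def check_cn2(cn2):
    # Equivalent of the readmgt.f check: keep CN2 within [35, 98],
    # so a default CN2 of 0 is silently raised to 35
    return min(max(cn2, 35.0), 98.0)

# check_cn2(0.0) -> 35.0, check_cn2(49.0) -> 49.0, check_cn2(100.0) -> 98.0
```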


SWAT Changes from Rev 622 to Rev 627

Update July 17, 2014: The SWAT Team has confirmed that prf_bsn should be real rather than integer. It will probably be fixed in the next release.

SWAT was updated on June 24, 2014 to Rev 627. It gives modelers more control over three parameters (prf, r2adj and surlag) and adds more outputs for the auto-irrigation operation in output.mgt. Let's see the details.

Changes from SWAT Version History

Revision 627

  • ‘surlag’ input (.bsn) changed to ‘surlag_bsn’ (if the input for surlag is <= 0 in the .hru file, the model will use surlag_bsn from .bsn file – defaulted to 4.)
  • ‘prf’ taken out (.bsn) and moved to the reach level (.rte) (if the input for prf is <= 0 in the .rte file, the model will use prf_bsn from the .bsn file, defaulted to 1.)
  • Autoirr issues in output.mgt resolved (reporting issue)
    • NOTES regarding the autoirr changes: The SCHEDULED irrigation from the .mgt input file will be labelled as: “SCHED AUTOIRR” in the output.mgt file. The actual applications will be labeled in same file as “AUTOIRR”. The ‘scheduled irrigation’ is when it was scheduled in the .mgt rotation. AUTOIRR is when it actually was triggered and applied.
  • Subdaily problem with reservoirs fixed

Revision 626

  • No significant changes.

Revision 625

  • ‘r2adj’ input (.bsn) changed to ‘r2adjbsn’
  • ‘prf’ input (.bsn) changed to ‘prf_bsn’

Revision 624

  • Minor subdaily changes
  • Format extended for MSK_K in input.std output file (from f6.2 to f8.2)

Revision 623

  • No significant changes.

 

More Details from Analysis of Source Codes

  1. prf

Description

Reach peak rate adjustment factor for sediment routing in the channel. Allows impact of peak flow rate on sediment routing and channel reshaping to be taken into account.

Modification

In the previous version, it's a basin-level parameter defined in the .bsn file, and all reaches share the same parameter value.

Now it's a reach-level parameter; each reach can define its own value. It's added to the end of the .rte file with a default value of 0, in which case the value given in the .bsn file (prf_bsn) will be used.

Potential Problem

In the previous version, the variable for prf read from .bsn is defined as real. In the new version, it's defined as integer. When trying to run the new version on a model generated by a previous version of ArcSWAT, it may give the following error from trying to read an integer value out of a real value. Changing the real value to an integer (e.g. 1.0000 to 1) solves this problem.

At line 386 of file readbsn.f (unit = 103, file = ‘basins.bsn’)

Fortran runtime error: Bad integer for item 1 in list input

Only gfortran-compiled SWAT executables have this problem; the Intel Fortran-compiled version works without problems. Should this variable be defined as real?
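As a workaround, the real value can be rewritten as an integer with a small script. This is a hedged sketch: the exact layout of the PRF line in basins.bsn is an assumption here, so check your file before relying on it:

```python
import re

def fix_prf_line(line):
    # If this is the PRF entry, rewrite its real value (e.g. 1.0000)
    # as an integer so the Rev 627 gfortran build can read it.
    # The "value | PRF : ..." layout is an assumption for illustration.
    if "PRF" not in line.upper():
        return line
    return re.sub(r"\d+\.\d+", lambda m: str(int(float(m.group(0)))),
                  line, count=1)

# fix_prf_line("   1.0000    | PRF : peak rate adjustment factor")
# -> "   1    | PRF : peak rate adjustment factor"
```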

  2. r2adj

Description

Soil retention parameter adjustment factor (greater than 1)

Modification

Similar to prf, this basin-level parameter is refined to the HRU level, giving you more control over this parameter for different HRUs. It is read from the HRU file after surlag, and the default value is 1. The value defined in the .bsn file (r2adj_bsn) is used when the HRU value is less than or equal to 0.

  3. surlag

Description

Surface runoff lag time. This parameter is needed in subbasins where the time of concentration is greater than 1 day. SURLAG is used to create a “storage” for surface runoff to allow the runoff to take longer than 1 day to reach the subbasin outlet.

Modification

Similar to r2adj, this basin-level parameter is refined to an HRU-level parameter. It is read after n_lnco and before r2adj. If it couldn't be read, it is set to surlag_bsn defined in .bsn, whose default value is 4.
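All three parameters share the same fallback rule, which can be sketched as follows (the function name is illustrative):

```python
def effective_value(local_value, basin_value):
    # Rev 627 fallback rule for surlag/prf/r2adj: a missing or
    # non-positive HRU/reach value means "use the basin-level default"
    if local_value is None or local_value <= 0:
        return basin_value
    return local_value

# effective_value(0.0, 4.0) -> 4.0  (falls back to surlag_bsn)
# effective_value(2.0, 4.0) -> 2.0  (local value wins)
```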

  4. Auto Irrigation

Output information from irr_rch.f and irr_res.f is added to output.mgt. It's still labeled "AUTOIRR" rather than "SCHED AUTOIRR" as described in the SWAT version history.


ArcSWAT Bug – Index was outside the bounds of the array.: IN, mWriteInputFiles.sol

Description

  • ArcSWAT 2012.10.15
  • SSURGO Soil Database is used
  • The error message in the title is given when running Create Tables

Reason

When generating the sol table, ArcSWAT does the following work.

  1. Get all unique soil IDs from hrus table in project database.
  2. Create a new table (tbSoilList) in SSURGO database (which is usually located in C:\Swat\ArcSWAT\Databases\SWAT_US_SSURGO_Soils.mdb) and copy all soil IDs into this table.
  3. Find soil parameter for each soil ID from table SSURGO_Soils.
  4. Write the soil parameter into sol table.

The error message comes from Step 4. ArcSWAT doesn't check the case where a soil ID couldn't be found in the SSURGO_Soils table. In that case, the returned soil parameter is an empty array. Trying to read the first element of this array produces the error message in the title.

Solution

Check the soil shapefile/grid to make sure all soil IDs have been defined in the SSURGO_Soils table. Then redo the Land Use/Soil/Slope Definition.

Note: This should also work for other soil options (user soil lookup table or STATSGO) and other ArcSWAT versions.

Analysis of ArcSWAT Output Export – Why doesn’t SWAT Change Result Format a little bit?

What happens when the "Import Files to Database" button is clicked?

  1. Copy SWATOutput.mdb from C:\swat\ArcSWAT\Databases\SWATOutput.mdb to [ArcSWAT Project]\Scenarios\Default\TablesOut. The previous database is overwritten.
  2. Copy Schema.ini from C:\swat\ArcSWAT\Databases\Schema.ini to [ArcSWAT Project]\Scenarios\Default\txtinout. This file defines the data columns in the text files generated in step 3. More information could be found on MSDN.
  3. Read the SWAT output files (output.rch, output.sub, etc.) and generate corresponding text files (outputsub.txt, outputRch.txt, outputSed.txt, outputHru.txt, outputRsv.txt, outputPst.txt) in [ArcSWAT Project]\Scenarios\Default\txtinout. The result files are read line by line, and a string parser is used to extract the data values for each column, which is slightly different for each SWAT version.
  4. Copy the data from the generated text files to tables in SWATOutput.mdb through a "SELECT INTO" statement.
  5. Delete all the text files generated in step 3.

Most of the time is spent in steps 3 and 4, where the data format is converted twice: first from SWAT format to text format, and then from text format to mdb format, with the first conversion probably costing more.
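The version-specific string parsing in step 3 amounts to cutting each fixed-width output line into columns. A minimal sketch (the widths here are illustrative only; real SWAT output widths vary by file and version):

```python
def parse_fixed_width(line, widths):
    # Cut one output line into column strings by fixed widths; the
    # importer needs a version-specific width list for each output file
    fields, pos = [], 0
    for w in widths:
        fields.append(line[pos:pos + w].strip())
        pos += w
    return fields

# parse_fixed_width("REACH    1    365", [5, 5, 7])
# -> ['REACH', '1', '365']
```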

So the question is: why doesn't SWAT just generate the results in the text format required in step 3? Is there any advantage of the current result format over text format? No doubt, generating results in text format would greatly reduce the time spent in this import function.

The result format would also have impact on SWAT_CUP, which would read the specific results from result files after each simulation to calculate the objective functions. From my experience, sometimes the time spent on result reading is even longer than the simulation time!

For SWAT result analysis, I usually want outputs for a specific element (hru, subbasin or reach, etc.) in a specific time period. It's a query process and would get the best performance from a database, e.g. mdb. This may be the thinking behind the "Import Files to Database" function. It's also true for SWAT_CUP. So, why not just generate the outputs in a database format directly in the SWAT model? Then we wouldn't need to spend extra time converting the format, and it would be a lot easier to do the result analysis, especially for daily outputs.

Makefile Updated – 1) No Need to Modify main.f any more; 2) A Single Makefile; 3) 64-bit Build

Download Link

Makefile Generator, Makefile

The Makefile is updated again. Main changes are:

  1. modparm.f is no longer compiled, keeping the SWAT code unchanged.
  2. Separate Makefiles for debug and release are no longer generated. A single Makefile is generated instead, with four make targets: debug32, debug64, rel32 and rel64, matching the four official SWAT executables.
  3. A 64-bit build is added to the Makefile.

No Need to Modify main.f any more

In the previous version of the Makefile, to compile SWAT successfully, the first line of main.f (include 'modparm.f') had to be commented out or deleted, which was inconvenient, though not too bad.

It has been found that the problem lies in the compilation of modparm.f and main.f. The parm.mod file is generated when either modparm.f or main.f is compiled. The key here is the effect of the INCLUDE statement, described on the GNU website.

The effect of the INCLUDE directive is as if the included text directly replaced the directive in the source file prior to interpretation of the program.

So, include 'modparm.f' means adding all the contents of modparm.f to main.f, i.e. all the variables are also defined in main.f. If both modparm.f and main.f are compiled to object files and used in the link operation, all the variables are defined twice and the linker reports "multiple definition" errors.

The solution is very simple: don't compile modparm.f at all. Then all the variables are defined only in main.f, and include 'modparm.f' doesn't need to be commented out or deleted.

A Single Makefile

The new Makefile is put in the source code folder rather than in the debug and release folders. Sub-folders (debug32, debug64, rel32 and rel64) are created to hold all the object files and the mod file. The final executable files will be in the source code folder, with different names similar to the official SWAT executables. More details are listed in the following table.

Command      | 32-bit or 64-bit | Debug or release | Executable name | Comment
make debug32 | 32               | debug            | swat_debug32    |
make debug64 | 64               | debug            | swat_debug64    | Not supported by MinGW
make rel32   | 32               | release          | swat_rel32      |
make rel64   | 64               | release          | swat_rel64      | Not supported by MinGW

64-bit Build

The 64-bit build is added to the Makefile through the flag -m64, although MinGW doesn't support it. To compile 64-bit SWAT, use either MinGW-w64 or Rtools instead. More details will be published shortly in an integrated document.

SWAT Makefile Updated – Ignore Underflow

The SWAT Makefile is updated again. The underflow flag is removed to get the same results as the official SWAT.

Download Link

Makefile Generator, Debug Makefile, Release Makefile

In an attempt to run SWAT (rev 615) in Eclipse and compare it with the official SWAT, the official SWAT could run to the end without problems, but Eclipse-SWAT stopped due to underflow (erroneous arithmetic operation).

The underflow happens in subwq.f Line 88.

wtmp = 5.0 + 0.75 * tmpav(j)

where

tmpav is the average air temperature on the current day in the HRU

From the Expression window, the value of tmpav(j) is 4.6281851e-038. The underflow should happen when 0.75 * tmpav(j) is calculated.
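Underflow here means the result drops into the subnormal (denormalized) range, which -ffpe-trap=underflow turns into a fatal trap. A Python sketch of the condition, using double precision (SWAT's single-precision reals underflow around 1.18e-38, which is why a tmpav of 4.63e-38 multiplied by 0.75 is on the edge; the values below are a double-precision analogue):

```python
import sys

def is_subnormal(x):
    # True when x has underflowed into the subnormal range -- the
    # condition the underflow trap turns into a fatal error
    return x != 0.0 and abs(x) < sys.float_info.min

# Double-precision analogue of the subwq.f case: multiplying a value
# just above the smallest normal double (~2.23e-308) by 0.75 underflows
tiny = 2.5e-308
# is_subnormal(0.75 * tiny) -> True
```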

The fact that the official SWAT could run to the end means underflow is ignored, i.e. the compilation flag for underflow is turned off. As mentioned in the previous post, SWAT Makefile Updated – Stop running when overflow happens, the following flag is used to stop SWAT when invalid, zero, overflow or underflow happens.

-ffpe-trap=invalid,zero,overflow,underflow

To run Eclipse-SWAT successfully, just remove the underflow flag and re-compile the source code. The flag becomes:

-ffpe-trap=invalid,zero,overflow

After that, Eclipse-SWAT could run to the end and get exactly the same results as the official SWAT.