The SAST Flood Water Balance (1993)

Pawel J. Mizgalewicz and David R. Maidment

Center for Research in Water Resources

University of Texas at Austin

09/09/1996

 
 
 

1. Introduction

Following the 1993 Midwest flood, on November 24, 1993, President Clinton established the Scientific Assessment and Strategy Team (SAST) to study the effects of the flood and to make recommendations about future flood preparedness. The SAST joined the Interagency Floodplain Management Review Committee (FMRC) on January 10, 1994 (FMRC 1994). As part of this effort, the SAST project identified a need for a daily water balance of the flooded area to determine how much water fell in the flooded area and how quickly it moved through the landscape.

The goal of this project is to calculate a daily water balance for the SAST region for 1993. (Information about the flood discharges and the precipitation in the Upper Mississippi River Basin in 1993 can be found in Parrett et al., 1993, and Wahl et al., 1993.) A USGS Web site designated for SAST can be found at: http://edcwww2.cr.usgs.gov/./sast-home.html . The location and extent of the SAST region are shown in Figure 1.1.

Spatial resolution: The SAST region is subdivided into 266 units defined by the locations of the USGS gauging stations. The stations were selected based on an analysis by Scott White of the University of Utah (scott.white@geog.utah.edu).

Temporal resolution: daily average values of the precipitation depth, surface water storage, and streamflow discharge; monthly values of evapotranspiration, taken as constant for each day of the month.


Figure 1.1. Location and the extent of the SAST region

The change in storage is calculated on a daily basis (01/01/93 through 09/30/93) for each gauged zone from the following equation:

dS = P - E + Qin - Qout

where:
dS = change in storage [mm/d];
P = precipitation depth [mm/d];
E = evapotranspiration [mm/d];
Qin = sum of the inflows that enter the gauged zone as average depth over gauged zone area [mm/d];
Qout = outflow from the gauged zone as average depth over gauged zone area [mm/d].
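
As a point of reference for the calculations described in the sections that follow, a minimal sketch of this balance for a single zone and a single day is given below. It is written in C (the language used later for the b4v1 program); the function and variable names are illustrative and do not come from the project code.

#include <stdio.h>

/* Daily storage change for one gauged zone, all terms in mm/d.
   Illustrative only; the project code (b4v1, Section 6.3) also converts
   the flow terms from cfs to an equivalent depth over the zone area. */
double storage_change(double p, double e, double qin, double qout)
{
    return p - e + qin - qout;          /* dS = P - E + Qin - Qout */
}

int main(void)
{
    /* hypothetical values for one zone on one day */
    double ds = storage_change(12.0, 3.5, 1.2, 4.0);
    printf("dS = %.1f mm/d\n", ds);     /* prints dS = 5.7 mm/d */
    return 0;
}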


2. GIS database of daily precipitation

The GIS database of the precipitation record is created in five steps:

  1. The data are extracted from the Hydrosphere CD-ROMs in ASCII format (http://www.hydrosphere.com/ );
  2. The ASCII files are imported into an Arc/Info database format (INFO file);
  3. An ASCII file that contains: station ID, latitude, and longitude is created;
  4. A point coverage of stations is built; and
  5. The INFO file that contains station description and the precipitation record is merged (joined) with the PAT (point attribute table) of the stations coverage.


2.1 Extracting data from the Hydrosphere CD-ROM

The following steps were performed to select and extract the daily precipitation depth for all days of 1993. Two selection criteria were used: state and latitude-longitude. The menu selections are listed below:

Menu:
  Select State
  Latitude/Longitude
  Mark All Stations
  Export: Export marked stations
    Precipitation total [in]
    Format: delimited
    Data for year: 1993, add ->


The states and the latitude-longitude ranges utilized in the data extraction process are listed in Table 1.

Table 1. Selection of the climate stations from the Hydrosphere CD-ROMs

State          Abr.  FIPS   Latitude          Longitude
                            Min      Max      Min      Max
-------------------------------------------------------------
North Dakota   ND     38    45:00    49:00    96:00    101:00
South Dakota   SD     46    42:00    -        96:00    100:30
Nebraska       NE     31    39:00    -        95:00    99:30
Kansas         KS     20    38:00    -        94:00    97:00
Minnesota      MN     27    43:00    49:00    91:00    -
Iowa           IA     19    40:00    -        90:00    -
Missouri       MO     29    36:00    -        89:00    -
Wisconsin      WI     55    42:00    -        87:00    -
Illinois       IL     17    36:00    -        87:00    -
Michigan       MI     26    41:00    -        85:10    -
Indiana        IN     18    40:00    -        85:10    -
Kentucky       KY     21    36:00    -        88:00    -

A total of 1509 records were extracted. An example of one record, for the station GALESBURG, is presented below. It must be noted that all values for a given station are stored in one line. Thus, spreadsheet programs such as Microsoft Excel cannot be used to manipulate the data (the record exceeds the Excel limit of about 256 columns).

Example of precipitation data extracted from the Hydrosphere CD-ROM, a record for Galesburg, IL, 1993. (this line is more than 3000 characters long)


3320,GALESBURG,17,3,6,1948,10,1994,47,98.421,KNOX,770.000,4057,09023,9,0,IN,PRCP,1993,JAN,2.130,0.073,0.640,0.000,0.000,0.000,0.230,0.460,0.300,0.000,0.000,0.100,0.000,0.160,9998.000,9998.000,0.220,0.010,0.000,0.000,0.000,0.000,0.000,0.000,0.640,0.010,0.000,0.000,0.000,0.000,0.000,0.000,0.000,0.000,0.000,FEB,1.330,0.055,0.400,0.000,0.000,0.000,0.000,0.000,0.000,0.000,0.000,0.000,0.000,9998.000,0.000,0.390,0.050,0.030,0.000,0.100,9998.000,0.000,0.000,0.000,0.400,0.110,9998.000,0.000,9998.000,0.250,0.000,0.000,9999.000,9999.000,9999.000,MAR,3.390,0.126,1.210,0.000,0.000,0.000,0.280,0.680,0.010,0.000,0.000,0.000,0.000,0.090,0.040,9998.000,9998.000,0.000,0.000,0.160,0.130,0.000,0.050,0.110,9998.000,0.420,1.210,0.020,0.020,9998.000,0.000,0.000,0.000,0.000,0.170,APR,4.490,0.166,0.940,0.000,0.290,0.020,0.000,0.000,0.000,9998.000,0.000,0.380,0.260,0.000,0.000,0.000,0.000,0.450,0.940,0.750,0.160,9998.000,0.000,0.690,0.190,0.000,0.000,0.000,0.000,0.000,0.000,9998.000,0.220,0.140,9999.000,MAY,2.940,0.098,0.860,0.000,0.060,0.040,0.130,0.350,0.230,0.000,0.170,0.000,0.000,0.000,0.030,0.320,0.000,0.000,9998.000,0.020,0.000,0.000,0.000,0.000,0.000,0.000,0.860,0.090,0.000,0.000,0.000,0.060,0.110,0.010,0.460,JUN,5.630,0.201,1.580,0.000,0.000,0.580,0.130,0.120,0.670,0.000,9998.000,0.230,0.300,0.000,0.000,0.010,0.000,0.000,0.000,0.000,0.000,0.160,0.100,0.730,0.000,0.000,0.000,0.000,1.580,0.000,0.000,9998.000,0.180,0.840,9999.000,JUL,12.120,0.433,6.130,0.000,0.530,0.090,0.040,0.000,0.000,0.260,0.000,0.240,0.000,0.350,0.150,0.120,0.000,0.480,9998.000,0.010,0.420,0.230,9998.000,0.000,0.000,0.060,2.440,6.130,0.420,9998.000,0.000,0.150,0.000,0.000,0.000,AUG,9.730,0.347,3.070,0.000,0.360,0.000,0.000,0.060,0.000,0.170,0.000,0.000,0.000,3.070,9998.000,0.240,0.110,0.000,0.430,1.760,0.560,0.000,1.000,0.000,0.000,0.000,0.760,9998.000,0.000,0.000,0.000,9998.000,0.300,0.000,0.910,SEP,4.580,0.170,1.530,0.000,9998.000,0.300,0.290,0.000,0.000,0.720,0.000,0.050,0.000,0.000,0.000,0.000,0.000,1.530,0.340,0.010,0.000,0.000,0.010,0.010,0.030,9998.000,9998.000,0.000,0.000,1.240,0.050,0.000,0.000,0.000,9999.000,OCT,0.990,0.033,0.360,0.000,0.000,0.000,0.000,0.000,0.000,0.000,0.000,0.000,0.360,9998.000,0.000,0.000,0.000,0.000,0.000,0.260,0.310,0.000,0.050,0.000,0.010,0.000,0.000,0.000,0.000,0.000,0.000,0.000,0.000,0.000,0.000,NOV,1.580,0.066,0.580,0.000,0.000,0.000,0.050,9998.000,9998.000,9998.000,9998.000,0.000,0.000,0.000,0.000,0.000,0.580,0.010,0.220,0.000,0.050,9998.000,0.000,0.000,0.000,0.000,0.000,0.070,0.350,0.220,9998.000,0.030,0.000,0.000,9999.000,DEC,1.500,0.060,0.320,0.000,0.000,0.310,0.000,0.320,0.000,0.040,0.000,0.000,0.000,9998.000,0.000,0.000,9998.000,0.320,0.130,9998.000,0.020,0.180,0.060,9998.000,0.000,9998.000,9998.000,0.000,0.070,0.050,0.000,0.000,0.000,0.000,0.000,ANN,50.410,0.154,6.130,0.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000,9999.000

The data from each state were exported into a separate file and then all files were combined into one file with the UNIX cat command:

cat  ia.txt il.txt in.txt ks.txt ky.txt mi.txt mn.txt mo.txt nd.txt ne.txt sd.txt wi.txt  > total.txt

I used GNU gawk to replace all values equal to 9999 (a missing observation) with -1 and all values equal to 9998 (a trace amount of precipitation) with 0.001 [in]; that is, trace precipitation is assumed to be 0.001 in.

(I used gawk because the original awk had problems with the length of the line or with the file size)

gawk '{gsub(/9999/, "-1"); gsub(/9998\.000/, "0.001"); print}' abcin > abcout &

2.2 Converting precipitation data into Arc/Info format

There are two possible ways to create an INFO file of the precipitation.

  1. Utilizing Arc/Info TABLES
  2. Using ArcView

I selected ArcView. It is much easier and faster to create an INFO table in ArcView than in TABLES. Before importing data from an ASCII file, TABLES requires a definition of all items (name, width, precision, type, etc.). Specifying more than 400 items can be a time-consuming process.

To import the ASCII file into ArcView, the first line of the ASCII data must contain the names of all items. I created such a line and added it to the ASCII precipitation database with a simple UNIX cat command.

Example of the header line for precipitation data:


id,stname,state,division,startm,startyr,endm,endyr,reccnt,percent,county,datum,latitude,longitude,mcritday,mcritmth,units,parameter,year,jan,jan_tot,jan_mean,jan_max,jan_min,p930101,p930102,p930103,p930104,p930105,p930106,p930107,p930108,p930109,p930110,p930111,p930112,p930113,p930114,p930115,p930116,p930117,p930118,p930119,p930120,p930121,p930122,p930123,p930124,p930125,p930126,p930127,p930128,p930129,p930130,p930131,feb,feb_tot,feb_mean,feb_max,feb_min,p930201,p930202,p930203,p930204,p930205,p930206,p930207,p930208,p930209,p930210,p930211,p930212,p930213,p930214,p930215,p930216,p930217,p930218,p930219,p930220,p930221,p930222,p930223,p930224,p930225,p930226,p930227,p930228,p930229,p930230,p930231,mar,mar_tot,mar_mean,mar_max,mar_min,p930301,p930302,p930303,p930304,p930305,p930306,p930307,p930308,p930309,p930310,p930311,p930312,p930313,p930314,p930315,p930316,p930317,p930318,p930319,p930320,p930321,p930322,p930323,p930324,p930325,p930326,p930327,p930328,p930329,p930330,p930331,apr,apr_tot,apr_mean,apr_max,apr_min,p930401,p930402,p930403,p930404,p930405,p930406,p930407,p930408,p930409,p930410,p930411,p930412,p930413,p930414,p930415,p930416,p930417,p930418,p930419,p930420,p930421,p930422,p930423,p930424,p930425,p930426,p930427,p930428,p930429,p930430,p930431,may,may_tot,may_mean,may_max,may_min,p930501,p930502,p930503,p930504,p930505,p930506,p930507,p930508,p930509,p930510,p930511,p930512,p930513,p930514,p930515,p930516,p930517,p930518,p930519,p930520,p930521,p930522,p930523,p930524,p930525,p930526,p930527,p930528,p930529,p930530,p930531,jun,jun_tot,jun_mean,jun_max,jun_min,p930601,p930602,p930603,p930604,p930605,p930606,p930607,p930608,p930609,p930610,p930611,p930612,p930613,p930614,p930615,p930616,p930617,p930618,p930619,p930620,p930621,p930622,p930623,p930624,p930625,p930626,p930627,p930628,p930629,p930630,p930631,jul,jul_tot,jul_mean,jul_max,jul_min,p930701,p930702,p930703,p930704,p930705,p930706,p930707,p930708,p930709,p930710,p930711,p930712,p930713,p930714,p930715,p930716,p930717,p930718,p930719,p930720,p930721,p930722,p930723,p930724,p930725,p930726,p930727,p930728,p930729,p930730,p930731,aug,aug_tot,aug_mean,aug_max,aug_min,p930801,p930802,p930803,p930804,p930805,p930806,p930807,p930808,p930809,p930810,p930811,p930812,p930813,p930814,p930815,p930816,p930817,p930818,p930819,p930820,p930821,p930822,p930823,p930824,p930825,p930826,p930827,p930828,p930829,p930830,p930831,sep,sep_tot,sep_mean,sep_max,sep_min,p930901,p930902,p930903,p930904,p930905,p930906,p930907,p930908,p930909,p930910,p930911,p930912,p930913,p930914,p930915,p930916,p930917,p930918,p930919,p930920,p930921,p930922,p930923,p930924,p930925,p930926,p930927,p930928,p930929,p930930,p930931,oct,oct_tot,oct_mean,oct_max,oct_min,p931001,p931002,p931003,p931004,p931005,p931006,p931007,p931008,p931009,p931010,p931011,p931012,p931013,p931014,p931015,p931016,p931017,p931018,p931019,p931020,p931021,p931022,p931023,p931024,p931025,p931026,p931027,p931028,p931029,p931030,p931031,nov,nov_tot,nov_mean,nov_max,nov_min,p931101,p931102,p931103,p931104,p931105,p931106,p931107,p931108,p931109,p931110,p931111,p931112,p931113,p931114,p931115,p931116,p931117,p931118,p931119,p931120,p931121,p931122,p931123,p931124,p931125,p931126,p931127,p931128,p931129,p931130,p931131,dec,dec_tot,dec_mean,dec_max,dec_min,p931201,p931202,p931203,p931204,p931205,p931206,p931207,p931208,p931209,p931210,p931211,p931212,p931213,p931214,p931215,p931216,p931217,p931218,p931219,p931220,p931221,p931222,p931223,p931224,p931225,p931226,p931227,p931228,p931229,p931230,p93123
1,ann,ann_tot,ann_mean,ann_max,ann_min,p931301,p931302,p931303,p931304,p931305,p931306,p931307,p931308,p931309,p931310,p931311,p931312,p931313,p931314,p931315,p931316,p931317,p931318,p931319,p931320,p931321,p931322,p931323,p931324,p931325,p931326,p931327,p931328,p931329,p931330,p931331 


The file was opened in ArcView [Tables, Add, file of type: Delimited Text (*.txt), File Name: ...] and then saved as an Arc/Info INFO file [File, Export, INFO]. The process would be much easier if there were a way to extract data from the Hydrosphere CD-ROM using a command-line method, without the Hydrosphere Windows interface.

Since this INFO file had to be joined with the PAT (Point Attribute Table) of the climate stations point coverage, an item that contains a unique station ID for the US was added using the Arc/Info additem command (line 1, Listing 2.1). The ID was created in TABLES by multiplying the station state FIPS code (item state) by 100000 and adding to the result the published station ID (item id); for example, Kentucky station 402 becomes 21 * 100000 + 402 = 2100402.

Listing 2.1 Adding item that contains station ID

  1. additem totalinf totalinf st_id 12 12 I
  2. Tables:
  3. select totalinf
  4. calculate st_id = state * 100000 + id


2.3 Location of stations (latitude and longitude)

The data exported from the Hydrosphere CD-ROMs (Hydrosphere, 1994) contain a description of the climate stations, including latitude and longitude. Since the coordinates are stored in degrees and minutes (e.g., 4057 = 40 degrees 57 minutes), they are converted into decimal seconds of arc; a short conversion sketch in C is given after Listing 2.2. Listing 2.2 shows an example of the ASCII file that contains the station ID, longitude, and latitude (in decimal seconds; negative values are west longitudes). Note that the station ID is calculated from the state FIPS number and the NCSC station number, i.e.

st_id = state * 100000 + id (21 is the FIPS code for Kentucky)

Listing 2.2 Example of a file containing station IDs and coordinates

2100402,-320280,132780
2101727,-320820,132360
2103223,-317760,133260
2103295,-316800,132840
2104967,-319800,133080
2105150,-317040,134400
2105233,-319080,132420
2105235,-319200,132660
2105694,-317940,131820
2106110,-319560,133440
2106117,-318960,133560
END
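
The following C sketch reproduces the degrees-and-minutes to decimal-seconds conversion used to build the file above. It assumes the coordinates are read as separate degree and minute values and that west longitudes carry a negative sign, as in Listing 2.2; the helper name dm_to_seconds is ours.

#include <stdio.h>

/* Convert degrees and minutes of arc to decimal seconds.
   The sign of the degrees carries the hemisphere (west longitude < 0). */
long dm_to_seconds(int deg, int min)
{
    long s = (deg < 0 ? -(long)deg : (long)deg) * 3600L + (long)min * 60L;
    return deg < 0 ? -s : s;
}

int main(void)
{
    /* 88 deg 58 min W, 36 deg 53 min N: reproduces the coordinates of the
       first record in Listing 2.2 (station 2100402) */
    printf("%ld,%ld\n", dm_to_seconds(-88, 58), dm_to_seconds(36, 53));
    return 0;
}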


2.4 Point coverage of the climate stations

Listing 2.3 shows the Arc/Info dialog for creating a point coverage of the climate stations (xxxgeo) from the latitude-longitude coordinates stored in the ASCII file xy.csv.

Listing 2.3 Creating point coverage of climate stations from latitude-longitude coordinates

  1. Arc: generate xxxgeo
  2. Generate: input xy.csv
  3. Generate: points
  4. Generate: q
  5. Arc: build xxxgeo point

To make the point coverage compatible with the other maps used in this study, the map was projected into the Albers coordinate system as shown in Listing 2.4. The projection parameters used are the standard ones for USGS maps, except that the Albers rather than the Lambert projection was used to preserve correct surface area throughout the study region.

Listing 2.4 Projecting the point coverage of NCSC climate stations from Geographic system into Albers coordinates

  1. Arc: project cover xxxgeo xxx
  2. Input
  3. Projection geographic
  4. units ds
  5. Parameters
  6. output
  7. Projection ALBERS
  8. Zunits NO
  9. Units METERS
  10. Spheroid GRS1980
  11. Xshift 0.0000000000
  12. Yshift 0.0000000000
  13. Parameters
  14. 29 30 0.000 /* 1st standard parallel
  15. 45 30 0.000 /* 2nd standard parallel
  16. -96 0 0.000 /* central meridian
  17. 23 0 0.000 /* latitude of projection's origin
  18. 0.00000 /* false easting (meters)
  19. 0.00000 /* false northing (meters)
  20. end


2.5 GIS database of the daily precipitation

A GIS database is a database that contains both spatial and descriptive information. To create the GIS database of the precipitation record, the INFO file with daily precipitation needs to be joined with the PAT of the climate stations coverage. This can be performed in Arc/Info with the joinitem command, which requires a common item. The join item st_id was added in Arc/Info (additem command, line 1, Listing 2.5) and in TABLES the values from item xxx-id were copied into item st_id (line 4, Listing 2.5). Note that here the station ID is not the NCSC ID but is calculated as
state FIPS * 100000 + NCSC station ID.

Listing 2.5 Joining the PAT of point coverage with the Info file of precipitation depth

  1. additem xxx.pat xxx.pat st_id 12 12 I
  2. Tables:
  3. select xxx.pat
  4. calculate st_id = xxx-id
  5. q
  6. Usage: JOINITEM <in_info_file> <join_info_file> <out_info_file> <relate_item>
  7. <start_item> {LINEAR | ORDERED | LINK}
  8. joinitem xxx.pat totalinf xxx.pat st_id

where: xxx.pat is the point attribute table of the climate station coverage and totalinf is the Info file with the precipitation record.

It is easier to join INFO files using ArcView than Arc/Info (no additional item needs to be created). Figure 2.1 shows the map of the 1509 climate stations that were extracted from the Hydrosphere CD-ROM.

Figure 2.1 Climate stations extracted from the Hydrosphere CD-ROM.

To reduce the size of the digital map, only the stations inside the SAST region and within a 50 km buffer outside the SAST region were selected for further analysis. Listing 2.6 presents the application of the Arc/Info clip command. Coverage buffer50 is a map of SAST with a 50 km buffer zone. The name of the new point coverage of the precipitation stations is prec50. Figure 2.2 presents the NCSC stations selected for spatial redistribution of precipitation depth in the SAST region.

Figure 2.2 Selected NCSC stations within SAST region and 50 km wide buffer zone

Listing 2.6 Selecting stations within SAST extended by 50 km buffer zone.

Usage: CLIP <in_cover> <clip_cover> <out_cover>
{POLY | LINE | POINT | NET | LINK | RAW} {fuzzy_tolerance}
clip xxx ~/flood/data/buffer50 prec50 point

Listing 2.7 shows a few items from the point attribute table:

Listing 2.7 Selected items of the prec50.pat.

COLUMN   ITEM NAME     WIDTH OUTPUT  TYPE N.DEC  ALTERNATE NAME  INDEXED?
    1  AREA                4    12     F      3                     -
    5  PERIMETER           4    12     F      3                     -
    9  PREC50#             4     5     B      -                     -
   13  PREC50-ID           4     5     B      -                     -
   17  ST_ID              12    12     I      -                     -
   29  ID                  4     4     I      -                     -
   33  STNAME             23    23     C      -                     -
   56  STATE               2     2     I      -                     -
   58  DIVISION            2     2     I      -                     -
   60  STARTM              2     2     I      -                     -
   62  STARTYR             4     4     I      -                     -
   66  ENDM                2     2     I      -                     -
   68  ENDYR               4     4     I      -                     -
   72  RECCNT              3     3     I      -                     -
   75  PERCENT             6     6     N      3                     -
   81  COUNTY             17    17     C      -                     -
   98  DATUM               8     8     N      3                     -
  106  LATITUDE            4     4     I      -                     -
  110  LONGITUDE           5     5     I      -                     -
  115  MCRITDAY            1     1     N      -                     -
  116  MCRITMTH            1     1     N      -                     -
  117  UNITS               2     2     C      -                     -
  119  PARAMETER           4     4     C      -                     -
  123  YEAR                4     4     I      -                     -
  127  JAN                 3     3     C      -                     -
  130  JAN_TOT             8     8     N      3                     -
  138  JAN_MEAN            8     8     N      3                     -
  146  JAN_MAX             8     8     N      3                     -
  154  JAN_MIN             8     8     N      3                     -
  162  P930101             8     8     N      3                     -
  170  P930102             8     8     N      3                     -
  178  P930103             8     8     N      3                     -
  186  P930104             8     8     N      3                     -
  194  P930105             8     8     N      3                     -
  202  P930106             8     8     N      3                     -
  210  P930107             8     8     N      3                     -
  218  P930108             8     8     N      3                     -
  226  P930109             8     8     N      3                     -
  234  P930110             8     8     N      3                     -
  242  P930111             8     8     N      3                     -
  250  P930112             8     8     N      3                     -
  258  P930113             8     8     N      3                     -
  266  P930114             8     8     N      3                     -
  274  P930115             8     8     N      3                     -

...

3. Spatial distribution of the precipitation depth

A grid of distributed precipitation depth has been created for each day's precipitation directly from the point coverage of precipitation gauging stations. The inverse distance weighting procedure (the GRID function idw) has been applied. To shorten the calculations and to save disk space, a 4000 m * 4000 m cell size was used for the precipitation grids.
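
The GRID idw function has its own options (weighting power, search radius, number of sample points); the short C sketch below illustrates only the basic inverse-distance-squared idea for a single output cell, using made-up station coordinates and depths.

#include <stdio.h>

/* Inverse-distance-squared estimate at (x, y) from n gauges.
   Conceptual sketch of what an IDW interpolator does for one grid cell;
   search radii, sample counts, and NODATA handling are omitted. */
double idw2(const double xs[], const double ys[], const double ps[],
            int n, double x, double y)
{
    double wsum = 0.0, psum = 0.0;
    for (int i = 0; i < n; i++) {
        double dx = x - xs[i], dy = y - ys[i];
        double d2 = dx * dx + dy * dy;
        if (d2 == 0.0)
            return ps[i];               /* cell coincides with a gauge */
        wsum += 1.0 / d2;
        psum += ps[i] / d2;
    }
    return psum / wsum;
}

int main(void)
{
    /* three hypothetical gauges (coordinates in m, depths in mm) */
    double xs[] = { 0.0, 8000.0, 0.0 };
    double ys[] = { 0.0, 0.0, 8000.0 };
    double ps[] = { 10.0, 30.0, 20.0 };
    printf("%.2f mm\n", idw2(xs, ys, ps, 3, 2000.0, 2000.0));  /* about 14.29 mm */
    return 0;
}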

Listing 3.1 Creating grids of daily precipitation depth

/* -----------------------------------------------------------------------
/* PRECIPITATION REDISTRIBUTION
/* -----------------------------------------------------------------------
/* input
&s data = $HOME/flood/precip/poinmap/prec50   /* point coverage
&s wgrid = $HOME/flood/dem/buf50g             /* grid window sample
grid
&r rainsast 5 1 12 31 ~/flood/dem/buf50g 4000 ~/flood/precip/poinmap/prec50 YES
/* &run rainsast 4 1 4 31 %wgrid% 4000 %data% YES
q
/* &return /* return from this AML
/* -----------------------------------------------------------------------

The procedure rainsast.aml performs all calculations. It sets the cell size to the user-supplied value (here 4000 m), sets the extent of the new grid to the extent of buf50g (a grid that serves as a template for the size of new grids), and calculates the new grid. This process is shown in Listing 3.2.

Listing 3.2 Creating the precipitation grid by the inverse squared distance weight method

  1. setcell 4000
  2. setwindow buf50g
  3. p930408 = idw ( prec50 , p930408 )

Since the procedure described above must be executed for each day of 1993, it has been wrapped in an AML &DO loop that repeats the calculations for each day of the specified time period, i.e., from day %fd% of month %fm% to day %td% of month %tm%. An example of the &DO block is presented in Listing 3.3.

Listing 3.3 Application of the &DO command to repeat action for each day and each month of 1993

  1. &DO mt = %fm% &to %tm%
  2. &if %mt% lt 10 &then
  3. &sv nitem1 = [subst %nitem0% m %mt% ]
  4. &else
  5. &sv nitem1 = [subst %nitem0% 0m %mt% ]
  6. &DO dy = 1 &to 31
  7. &s ab = [calc %mt% = %fm%] AND [calc %dy% lt %fd%]
  8. &s bb = [calc %mt% = %tm%] AND [calc %dy% gt %td%]
  9. &s cc = %ab% OR %bb%
  10. &IF NOT %cc% &THEN
  11. &do
  12. &if %dy% lt 10 &then
  13. &sv nitem = [subst %nitem1% d %dy% ]
  14. &else
  15. &sv nitem = [subst %nitem1% 0d %dy% ]
  16. &type %nitem%
  17. /* check if the item exists
  18. &s iexists = [iteminfo %xdat% -point %nitem% -exists]
  19. &if %iexists% &then
  20. &do
  21. &sv selec = res %nitem% %klm% 0
  22. &type selecting stations %selec% [date -vfull]
  23. arc reselect %xdat% xxxxx1 point
  24. %selec%
  25. ~
  26. n
  27. n
  28. arc build xxxxx1 point
  29. %nitem% = int ( %cfac% * idw ( xxxxx1, %nitem% ) )
  30. kill xxxxx1 all
  31. &end
  32. &end
  33. &end
  34. &end

To include all available measurements of precipitation depth in the spatial redistribution process, all stations that have a measurement for the processing day have been used. For each day, station selection was performed (Arc/Info command reselect) and then the grids of spatial distribution of the precipitation depth were calculated (IDW procedure).

Listing 3.4 Example of the AML that selects precipitation stations that have a complete record on June 8, 1993, and creates the grid of precipitation depth (adapted from RAINSAST.AML)

  1. arc reselect prec50 xxxxx1 point
  2. res p930608 ge 0
  3. ~
  4. n
  5. n
  6. arc build xxxxx1 point
  7. p930608 = int (25400 * idw ( xxxxx1, p930608 ) )
  8. kill xxxxx1 all

The number 25400 is the unit conversion factor (1 inch = 25.4 mm). Millimeters are multiplied by 1000 to preserve accuracy when the values are converted to integers. (To get the results back to mm, the grids need to be divided by 1000.) Figure 3.1 shows an example of the precipitation grid (precipitation spatially distributed for 1993/07/19).
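
A small arithmetic sketch of this scaling (C; the helper names are ours, not from the project code):

#include <stdio.h>

/* Store inches as an integer equal to mm * 1000: 1 in = 25.4 mm, so the factor is 25400. */
int inches_to_grid(double inches) { return (int)(25400.0 * inches); }

/* Recover millimeters from the stored integer grid value. */
double grid_to_mm(int stored)     { return stored / 1000.0; }

int main(void)
{
    double gauge_in = 0.50;                 /* example gauge reading, inches */
    int cell = inches_to_grid(gauge_in);    /* 12700 is stored in the grid   */
    printf("%d -> %.3f mm\n", cell, grid_to_mm(cell));   /* 12700 -> 12.700 mm */
    return 0;
}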

Figure 3.1 Precipitation depth in SAST interpolated for 19 July 1993.



4. Grids of monthly evaporation


4.1 Potential evaporation data

Grids of the monthly evaporation have been constructed using data downloaded by anonymous FTP:
ftp site: srv1rvares.er.usgs.gov
text file: pub/SAST/cde93.dat

The file cde93.dat contains the potential ET for each month of 1993 at the climate-division spatial scale. This file has been converted into comma-delimited format and a units column has been added (file name: ev1.txt). The coverage of the USA divided into climate divisions was downloaded from http://nsdi.usgs.gov/nsdi/wais/water/climate_div.HTML. Figure 4.1 presents the US divided into climate divisions. The divisions applied for the SAST evaporation estimates are identified by a different color.

Figure 4.1 Climate divisions - coverage clim_div

Two items have been added to the clim_div.pat:

  1. link_id (string, 4 characters)
  2. atext (string, two characters)

and the climate division zone identification code has been calculated:

calculate [atext] = [Div#]
calculate [link_id] = [St] + [atext]

where [St] contains the two letters identifying the state (e.g. TX for Texas).

In this research, the climate division is identified by two letters that specify the state and a number that defines the division within the state. For example, climate division 3 in Washington has the identification code WA3. The data from the file cde93.dat have been included in the polygon attribute table of the climate divisions coverage in the following steps:

  1. Import data in text format into ArcView;
  2. calculate the division identification codes;
  3. link resulting table with the clim_div.pat;
  4. save results as clim_div.pat

The climate divisions for which data were included in the file cde93.dat have been selected in the following way:

Listing 4.1 Selection of the SAST climate divisions

  1. Arc: reselect ev2 ev4 poly
  2. Reselecting POLYGON features from EV2 to create EV4
  3. Enter a logical expression. (Enter a blank line when finished)
  4. >: res year eq 1993
  5. >:
  6. Do you wish to re-enter expression (Y/N)? n
  7. Do you wish to enter another expression (Y/N)? n
  8. 89 features out of 360 selected.
  9. Reselecting polygons...
  10. Number of Polygons (Input,Output) = 360 91
  11. Number of Arcs (Input,Output) = 1139 305
  12. Creating EV4.pat...
  13. Creating EV4.aat...
  14. 216 unique nodes built for /EXPORT/HOME2/PAWEL/FLOOD/EVAPOR/EV4
  15. Arc:

Figure 4.2 shows the climate divisions selected for evaporation analysis in the SAST region:

Figure 4.2 Climate divisions applied to SAST water balance estimation.

Since the file cde93.dat does not contain data for all divisions that compose the SAST region, the values for the following divisions have been calculated from the evaporation published for the neighboring divisions:

   -----------------------------------------------------------
   State Division [St] [St#] [Div#] 
   -----------------------------------------------------------
   Kansas    North East    KS   14     3
             East Central       14     6
   N. Dakota Central       ND   32     5  
             North Central      32     2 (within 50 km buffer)
             South Central      32     8 (within 50 km buffer)
   S. Dakota North Central SD   39     2
   Nebraska  North Central NE   25     2 (within 50 km buffer)
   -----------------------------------------------------------

Figure 4.3 shows a map of the evaporation estimated for July 1993 by climate division (data from the file cde93.dat).

Figure 4.3 Evaporation in July 1993 (data from the USGS file cde93.dat)


4.2 Grids of the potential evaporation in the SAST region

For each month of 1993 a grid of the evaporation has been created. The structure of these grids, i.e. the cell size and the grid extent, is the same as the structure of the precipitation grids. Listing 4.2 shows the GRID dialog for converting the polygon coverage of the evaporation into the monthly grids (plus an annual grid).

Listing 4.2 Converting the vector map of the potential evaporation (ev4) into a grid.

  1. setcell ../precip/gridprc/p930101
  2. setwindow ../precip/gridprc/p930101
  3. e930100 = int (25400 * polygrid (ev4 , jan ) )
  4. e930200 = int (25400 * polygrid (ev4 , feb ) )
  5. e930300 = int (25400 * polygrid (ev4 , mar ) )
  6. e930400 = int (25400 * polygrid (ev4 , apr ) )
  7. e930500 = int (25400 * polygrid (ev4 , may ) )
  8. e930600 = int (25400 * polygrid (ev4 , jun ) )
  9. e930700 = int (25400 * polygrid (ev4 , jul ) )
  10. e930800 = int (25400 * polygrid (ev4 , aug ) )
  11. e930900 = int (25400 * polygrid (ev4 , sep ) )
  12. e931000 = int (25400 * polygrid (ev4 , oct ) )
  13. e931100 = int (25400 * polygrid (ev4 , nov ) )
  14. e931200 = int (25400 * polygrid (ev4 , dec ) )
  15. e931300 = int (25400 * polygrid (ev4 , ann ) )

The multiplication factor 25400 converts units from inches to micrometers (10^-6 m). Micrometers are used to preserve the precision of the values after they are converted into integers.


5. Dividing the SAST region into gauged zones

The process of dividing a region into gauged zones (zones that are defined by the locations of the USGS gauging stations) is a complex one. It can be divided into several sub-processes that are described in the following sections. A 500 m cell size Digital Elevation Model (DEM) is utilized in this research. It was created by resampling the 1-degree DEMs and released by the USGS on CD-ROM (Rea and Cederstrand, 1995).

5.1 Enhancement of the stream delineation process

The grid that describes the cell-to-cell flow (flow direction) is crucial for all hydrologic analyses performed in a raster environment. Procedures such as stream and watershed boundary delineation, division of a basin into modeling units, stream slope calculation, estimation of flow path length, and connection of hydrologic units are examples of operations that cannot be performed without the map of flow direction. Moreover, the accuracy of all derived information depends on the precision of the flow direction grid. Therefore, an effort has been made to develop a method to improve the map that represents the flow paths.

The RF1 digital 1:500,000 map of the U.S. rivers was selected as the basis for the spatial framework of the entire flow system, and this map of existing rivers is used to correct the flow system determined from the DEM.

The DEM adjustment is based on converting the RF1 into grid form and then increasing the elevation of all DEM cells that do not lie on the gridded RF1 by an arbitrarily large value (e.g., 10000 m). This operation forces the Grid GIS to create a map of flow direction that is compatible with the flow system represented by the vector map of rivers (RF1). The method of enhancing the flow system development has the following disadvantages:

  1. Since the elevations of the DEM are changed, the modified DEM cannot be used for tasks other than the flow direction estimation.
  2. The stream network has to be represented by a single line. A double-line description of rivers can be utilized only if the distance between the lines is smaller than the cell size.
  3. All existing loops have to be removed or opened to eliminate ambiguous flow paths.
  4. All lakes have to be converted into line representations or polygons.
  5. The cell width applied in the adjustment process should be smaller than half of the distance between any two streams in RF1, to avoid connecting stream networks from different basins when converting from vector to raster format.

If the river network does not fulfill the above-mentioned requirements, some editing after converting RF1 into grid format is necessary.

5.2 Converting river map (RF1) from vector format into grid representation.

The following GRID commands have been applied to specify the cell size and the map extent of the rasterized RF1:

setcell stdem
setwindow stdem

where stdem is the DEM.

The portion of the RF1 map that represents rivers in the SAST region has been converted into grid format by applying the linegrid command:

strf1 = linegrid ( rf1 )


5.3 Creating a map of the flow direction and delineation of the stream network.

To ensure that the flow paths delineated from the DEM are in line with the RF1 river network, the terrain map is adjusted. Elevations of all cells that do not represent the real stream network have been raised by 10000 m:

stdem2 = con ( isnull ( strf1 ), stdem + 10000 , stdem)

The depressions are removed to ensure that the whole region contributes to the runoff, and the flow direction grid stfdr has been built:

fill stdem2 stfill2 stfdr

Rivers are delineated under the assumption that runoff from a drainage area of 100 km2 produces a stream; 400 cells of 500 m (1 cell = 0.25 km2) constitute an area of 100 km2. The value one is assigned to all cells that have more than 400 accumulated upstream cells "flowing" into them:

stfac = flowaccumulation ( stfdr )
ststr = con ( stfac > 400 , 1, 0 )


5.4 Creating a map of the gauging zones

This preliminary map of the flow system (stfdr) and the river system (ststr) was used by Scott White (scott.white@geog.utah.edu) to locate the USGS gauging stations on the rivers.

A watershed is explicitly defined by its outlet point. A gauged zone is explicitly defined by the watershed outlet point (outflow from the zone) and the upstream gauging stations (inflow points). Thus, the precision of the location of the gauging stations on the river is crucial for proper drainage area delineation.

The map of gauging stations prepared by Scott White has been edited to minimize the error between the drainage area published by the USGS (data from Scott White) and the drainage area determined from the DEM. The drainage areas of selected zones have been adjusted by:

  1. Moving the stations downstream or upstream by 2-3 cells (to include or exclude a river tributary from the delineated watershed);
  2. Building flow barriers to correct the flow path:
    a) separating streams connected at the beginning;
    b) separating streams connected in the middle;
  3. Relocating stations onto streams that changed position after the DEM was edited (different flow direction grids lead to different flow paths).

Two stations have been removed:

  1. One was deleted due to discrepancies between the delineated drainage area and the area published by the USGS (the area published by the USGS is not consistent with the HUC boundary or with RF1, and the HUC boundary is not consistent with RF1);
  2. The other one was removed since it was situated outside the SAST boundaries.

Dummy stations have been used to exclude portions of the delineated drainage area that do not belong to SAST. The locations of the corrected areas are presented in Fig. 5.1, Fig. 5.2, and Fig. 5.3.


Figure 5.1. Location of the dummy stations 998, 997, and 996 that have been applied to correct the SAST northern boundary.


Figure 5.2. Location of the dummy stations 995 and 992 that have been applied to correct the SAST eastern boundary.


Figure 5.3. Location of the stations selected to eliminate the drainage area west of SAST. The stations "cut" the following rivers: Missouri R., Platte R., Kansas R., Osage R., and Gasconade R.

One gauging zone (6479438), located in the northern lake region, was not corrected due to the lack of consistent information about flow direction (no RF1 rivers, questionable HUC boundaries). The drainage area error for this zone is as large as 50%: the USGS estimate of the drainage area is 2608 km2, whereas the drainage area delineated from the DEM is 1554 km2. This error propagates to downstream zones, i.e., the values of the drainage area above stations 6479525 and 6480000 are affected.

Ninety percent of the stations have a delineated drainage area close to the USGS estimate, with differences smaller than 5%. Listing 5.1 shows the final steps of the gauged zone delineation.

Listing 5.1 Selected steps of the division of SAST into water balance units (gauged zones).

  1. out56 = con (ed5v6 > 1, ed5v6 )
  2. tot56wsh = watershed ( st5fdr , out56 )
  3. zones56 = con ( tot56wsh > 1000000, tot56wsh)

The grid ed5v6 contains the delineated stream network (cell value = 1), dummy stations (cell values 900 - 999), cells that cut the drainage areas west of SAST (cell value < 1000000), and the gauging stations (cell value = station ID).

In line 1 of Listing 5.1, the cells that represent the stream network (major flow paths) are eliminated in the selection process. The con (condition) command is a selection command: if the cell value in ed5v6 is greater than one, the cell is copied to the grid out56; otherwise NODATA is assigned to the respective cell. This copies only the cells that represent stations (including dummy stations) from the grid ed5v6 to the grid out56.

In line 2 the watersheds are delineated (including watersheds that have outlets at the dummy cells). The grid st5fdr contains the flow direction indicators. In line 3 the watersheds that are determined by the USGS stations are selected and copied to the grid zones56 (the dummy watersheds are eliminated). Zones56 is used to calculate the average evapotranspiration, the average precipitation depth, and the water balance.

Figure 5.4 shows the gauged zones, the location of the USGS gauging stations, and the station connectivity (flow paths). All features have been determined using the 500 m DEM.


Figure 5.4. SAST divided into gauged zones, USGS gauging stations selected for water balance calculations, and the stations flow links.

5.5 Connectivity of gauged zones

To calculate the flow balance, the sum of the inflows into each gauged zone must be calculated. The stations immediately upstream of a station that is the zone outlet can be identified if the position of each station on the flow path is known. The Arc/Info script nextwsh.aml creates an INFO file with two items that store the gauged zone ID number and the ID number of the next zone on the flow path (downstream zone).

The listing of the macro nextwsh.aml is shown in the Appendix. The major part of this AML assigns to each cell that corresponds to a watershed outlet (out56) the value of the cell in the watershed grid zones56 that is pointed to by the flow direction st5fdr. In line 1 of Listing 5.2, a value of zero is assigned to all cells that have NODATA (grid xwsh). This step is necessary to ensure that the next watershed for the last zone on the flow path is indicated by an ID equal to zero (no downstream units). Lines 2-9 create a grid zonenxt similar to the grid of watershed outlets out56 but with the ID of the downstream unit (a C sketch of the direction-to-offset logic follows Listing 5.2). Figure 5.5 shows the flow system (zones, stations, and flow links) in Iowa.

Listing 5.2 Identification of the downstream zone ID

  1. xwsh = con ( isnull ( zones56 ), 0, zones56 )
  2. zonenxt = con ( out56 > 0, con (st5fdr == 1, xwsh(1,0), ~
  3. con (st5fdr == 2, xwsh(1,1), ~
  4. con (st5fdr == 4, xwsh(0,1), ~
  5. con (st5fdr == 8, xwsh(-1,1), ~
  6. con (st5fdr == 16, xwsh(-1,0), ~
  7. con (st5fdr == 32, xwsh(-1,-1), ~
  8. con (st5fdr == 64, xwsh(0,-1), ~
  9. con (st5fdr == 128, xwsh(1,-1), -1)))))))))
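
The nested con() calls above map the eight flow direction codes produced by flowdirection() to cell offsets. A compact C sketch of the same lookup is given below; it is illustrative only (grid I/O, bounds checks, and NODATA handling are omitted, and the array layout is ours).

#include <stdio.h>

/* D8 flow-direction codes: 1=E, 2=SE, 4=S, 8=SW, 16=W, 32=NW, 64=N, 128=NE.
   Offsets are (column, row) with rows increasing downward, matching the
   xwsh(dx,dy) references in Listing 5.2. */
typedef struct { int dx, dy; } Offset;

static Offset d8_offset(int code)
{
    switch (code) {
    case   1: return (Offset){  1,  0 };
    case   2: return (Offset){  1,  1 };
    case   4: return (Offset){  0,  1 };
    case   8: return (Offset){ -1,  1 };
    case  16: return (Offset){ -1,  0 };
    case  32: return (Offset){ -1, -1 };
    case  64: return (Offset){  0, -1 };
    case 128: return (Offset){  1, -1 };
    default:  return (Offset){  0,  0 };   /* undefined flow direction */
    }
}

/* Downstream zone ID for an outlet cell at (col, row): step one cell in the
   flow direction and read the zone grid there. */
static int next_zone(const int *zones, const int *fdir,
                     int ncols, int col, int row)
{
    Offset o = d8_offset(fdir[row * ncols + col]);
    return zones[(row + o.dy) * ncols + (col + o.dx)];
}

int main(void)
{
    /* 3 x 3 toy grids: the outlet at the center flows east (code 1) into zone 42 */
    int fdir[9]  = { 0, 0, 0,   0, 1, 0,   0, 0, 0 };
    int zones[9] = { 0, 0, 0,   0, 7, 42,  0, 0, 0 };
    printf("next zone = %d\n", next_zone(zones, fdir, 3, 1, 1));   /* prints 42 */
    return 0;
}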


Figure 5.5. Flow system in Iowa used for water balance estimation in the SAST region.

6. Zonal Water Balance

Once the SAST region has been subdivided into zones, the average precipitation and average evaporation for each gauged zone can be estimated. Section 6.1 explains the process of calculating the average precipitation depth for each day of 1993 and storing the averages in an INFO file. Section 6.2 presents a similar methodology applied to calculate the average zonal evaporation depth for the 12 months of 1993. The estimation of the water balance in the gauged zones is presented in Section 6.3.

6.1 Average precipitation depth

Two Arc/Info macros have been applied to create the INFO files that store the gauged zone precipitation depths. The first AML (Listing 6.1) just executes the other macro, rmap56.aml, for each month of 1993.

Parameters supplied to rmap56.aml:
p_path = $home/flood/precip/gridprc/ (path to the folder in which the daily precipitation grids are stored);
zone = zones56 (grid that defines the zones, 56 is the map-set version number);
id = unit_id (name of an item that stores station/zone ID);
rmap56 (name of the AML that calculates the zonal average and stores the results in the info file prc1, prc2, prc3,... prc12);
1 1 1 31 (start month, start day, end month, end day);
10 (multiplier applied to the cell values before they are converted into integers; one decimal place preserved);
p (prefix used to create precipitation items names).

Listing 6.1 Calculation of the average zonal precipitation ( execution of rmap56.aml).

  1. &messages &off
  2. &s p_path = $home/flood/precip/gridprc/
  3. &s zone = zones56
  4. &s id = unit_id
  5. /*
  6. &type start jan [date -vfull]
  7. &r rmap56 1 1 1 31 %p_path% %zone% prc1 %id% 10 p
  8. &type start of feb [date -vfull]
  9. &r rmap56 2 1 2 28 %p_path% %zone% prc2 %id% 10 p
  10. &type start of mar [date -vfull]
  11. &r rmap56 3 1 3 31 %p_path% %zone% prc3 %id% 10 p
  12. &type start of apr [date -vfull]
  13. &r rmap56 4 1 4 30 %p_path% %zone% prc4 %id% 10 p
  14. &type start of may [date -vfull]
  15. &r rmap56 5 1 5 31 %p_path% %zone% prc5 %id% 10 p
  16. &type start of jun [date -vfull]
  17. &r rmap56 6 1 6 30 %p_path% %zone% prc6 %id% 10 p
  18. &type start of jul [date -vfull]
  19. &r rmap56 7 1 7 31 %p_path% %zone% prc7 %id% 10 p
  20. &type start of aug [date -vfull]
  21. &r rmap56 8 1 8 31 %p_path% %zone% prc8 %id% 10 p
  22. &type start of sep [date -vfull]
  23. &r rmap56 9 1 9 30 %p_path% %zone% prc9 %id% 10 p
  24. &type start of oct [date -vfull]
  25. &r rmap56 10 1 10 31 %p_path% %zone% prc10 %id% 10 p
  26. &type start of nov [date -vfull]
  27. &r rmap56 11 1 11 30 %p_path% %zone% prc11 %id% 10 p
  28. &type start of dec [date -vfull]
  29. &r rmap56 12 1 12 31 %p_path% %zone% prc12 %id% 10 p
  30. &type end of dec [date -vfull]
  31. &messages &on
  32. q
  33. q
  34. &return

The macro rmap56.aml uses the GRID command zonalmean to calculate the average values. Before the precipitation depth is written to the INFO table, it is converted back to mm (the grid values are stored as mm multiplied by 1000; see Section 3). The program rmap56.aml is a customized version of raininfo.aml. Figure 6.1 presents an example of the estimated precipitation depth in the SAST gauged zones for July 19, 1993.
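
For reference, the sketch below shows conceptually what the zonal averaging does (per-zone means of the cell values); the actual zonalmean function returns a grid of these means, and rmap56.aml then writes the per-zone values to the INFO file. The toy zone IDs 0-2 and the array layout are ours.

#include <stdio.h>

#define NZONES 3    /* toy example: zone IDs 0, 1, 2 */

/* Mean of 'values' within each zone of 'zones'; cells with a zone ID outside
   0..NZONES-1 are treated as NODATA and skipped. */
void zonal_mean(const int *zones, const double *values, int ncells,
                double *mean_out)
{
    double sum[NZONES] = { 0.0 };
    int    cnt[NZONES] = { 0 };
    for (int i = 0; i < ncells; i++) {
        int z = zones[i];
        if (z < 0 || z >= NZONES) continue;    /* NODATA or outside the zones */
        sum[z] += values[i];
        cnt[z]++;
    }
    for (int z = 0; z < NZONES; z++)
        mean_out[z] = cnt[z] ? sum[z] / cnt[z] : 0.0;
}

int main(void)
{
    int    zones[6]  = { 0, 0, 1, 1, 2, -1 };
    double precip[6] = { 10.0, 14.0, 8.0, 4.0, 20.0, 99.0 };
    double mean[NZONES];
    zonal_mean(zones, precip, 6, mean);
    for (int z = 0; z < NZONES; z++)
        printf("zone %d: %.1f mm\n", z, mean[z]);    /* 12.0, 6.0, 20.0 */
    return 0;
}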


Figure 6.1. Estimated precipitation depth in the SAST gauged zones for July 19, 1993.

It is possible to calculate zonal average for the whole year utilizing the following parameters for rmap56 macro:

&r rmap56 1 1 12 31 %p_path% %zone% prc93 %id% 10 p

but the process is more controllable if the calculations are split into monthly time steps (if the computer crashes, only up to one month, 31 days, needs to be recalculated).

6.2 Average evaporation

A method similar to the one described above has been applied to create the INFO file of the monthly evaporation. The macro emap56.aml is just a crude modification of rmap56.aml. It must be noted that the evaporation is in [mm/month], not in [mm/d] as the precipitation is.

Parameters of emap56.aml:
fm = 1 (first month, 1 - January);
tm = 13 (annual depth, not utilized in this research);
zone = zones56 (grid that defines the gauged zones);
outinf = evap.dat (Info table with zonal evaporation depth);
id_itm = unit_id (Item that contains zone ID);
prc = 10 (precision).
prx = e (prefix used to create evaporation item name)

6.3 Water Balance

Due to Arc/Info restrictions on the size of an INFO file, the calculations had to be divided into monthly time steps. The macro bal1.aml is an example of such a macro; it calculates the water balance for one month (here January).

Parameters for bal1.aml (example for January):
flw = d56eqp1.dat (Info file that contains: zone ID, ID of the downstream zone (flow connectivity), zone area in km2, flow rate, evaporation (mm/d), and precipitation (mm/d) for January);
fm = 1 (from month);
fd = 1 (from day);
tm = 1 (to month);
td = 31 (to day);
out = bal1 (output Info file);
data56 = an Info file that contains the zone ID numbers (a "base" or prototype Info file: copyinfo data56 %out%).

The GIS time series system created here uses metric units and the international date format (year-month-day) to name the items in the INFO tables. An exception is the flow rate database developed at the University of Utah: the flow rate is in cfs and the item names are in month-day-year format.

The water balance is calculated by the b4v1 program, written in the C language. An AML macro exports the ASCII data required by b4v1, executes b4v1, and creates an INFO file in which it stores the results of the b4v1 calculations.

Program b4v1 uses a conversion factor equal to 2.4466 (1 cfs = 2446.6 m3/d, 1 m3 = 10^9 mm3, 1 km2 = 10^12 mm2). The following two lines, excerpted from b4v1, calculate the balance; a fuller sketch follows the variable definitions below.

x1 = 2.4466 * (gsqo[i] - gsin[i] ) / unar[i];
bal[i] = unpr[i] - unev[i] - x1;
where:
gsqo[i] outflow from the zone i [cfs];
gsin[i] sum of inflows into the zone i [cfs];
unar[i] zone area [km2];
bal[i] water balance [mm/d];
unpr[i] average precipitation depth [mm/d] (daily values);
unev[i] average evaporation [mm/d] (monthly values);
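
A slightly fuller, self-contained sketch of the same calculation is given below (C). It adds the summation of inflows from upstream zones using the downstream-zone IDs produced by nextwsh.aml; the array layout, names, and numerical values are ours, not b4v1's.

#include <stdio.h>

#define NZ 3    /* toy example with three zones */

int main(void)
{
    /* hypothetical zone attributes for one day */
    int    unit_id[NZ] = { 101, 102, 103 };            /* zone (station) IDs      */
    int    next_id[NZ] = { 103, 103, 0 };              /* downstream zone, 0=none */
    double unar[NZ]    = { 2500.0, 1800.0, 3200.0 };   /* zone area, km2          */
    double gsqo[NZ]    = { 4000.0, 2500.0, 9000.0 };   /* outflow at outlet, cfs  */
    double unpr[NZ]    = { 12.0, 9.0, 11.0 };          /* precipitation, mm/d     */
    double unev[NZ]    = { 3.0, 3.0, 3.5 };            /* evaporation, mm/d       */

    /* sum of inflows: each zone contributes its outflow to its downstream zone */
    double gsin[NZ] = { 0.0 };
    for (int i = 0; i < NZ; i++)
        for (int j = 0; j < NZ; j++)
            if (next_id[i] == unit_id[j])
                gsin[j] += gsqo[i];

    for (int i = 0; i < NZ; i++) {
        /* 1 cfs over 1 km2: 2446.6 m3/d / 1e6 m2 = 2.4466e-3 m/d = 2.4466 mm/d */
        double x1  = 2.4466 * (gsqo[i] - gsin[i]) / unar[i];
        double bal = unpr[i] - unev[i] - x1;
        printf("zone %d: dS = %.2f mm/d\n", unit_id[i], bal);
    }
    return 0;
}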

6.4 Model calibration

To check whether the storage depth (water balance) is reasonable, the change in storage for the SAST watershed has been calculated for the period from 01 Jan 93 to 30 Sep 93. The results, as well as the 30-day moving average, are shown in Figure 6.2. There is a visible decreasing trend. The moving average also increases locally in June and decreases locally in July.


Figure 6.2. Incremental storage [mm] in the SAST region for 01/01/93 - 30/09/93

Figure 6.3 presents cumulative storage estimated for the SAST region.


Figure 6.3 Cumulative storage [mm] in the SAST region for 01/01/93 - 30/09/93

A systematic negative balance, i.e. more water leaving the region (evaporation, outflow) than entering it (inflow, precipitation), indicates that the water balance is not correct. It is unlikely that the precipitation or the flow record exhibits such a decreasing behavior. I checked some gauged zones and have not detected any errors in the calculations. In my opinion, the evaporation data introduce this error; the evaporation losses seem to be overestimated. Figure 6.4 shows the cumulative water balance after the evaporation has been decreased by about 14 mm/month.


Figure 6.4 Adjusted cumulative storage [mm] in the SAST region for 01/01/93 - 30/09/93 (the evaporation depth has been decreased by about 0.47 mm/day)

The evaporation depth can be adjusted in the following steps:

  1. determine adjustment coefficient (a = 0..1);
  2. calculate new evaporation depth in each climate division for each month of 1993;
  3. estimate balance.

For each day of 1993, from January 01 to December 31, the cumulative precipitation depth in the SAST region, the cumulative inflow into the SAST region, the cumulative outflow from the region (the Mississippi River discharge at Thebes), and the cumulative evapotranspiration depth have been calculated. The discharges recorded at the following stations were utilized to calculate the surface water inflow into and the outflow from the SAST region:

Inflow stations:
6926500 Osage River near St. Thomas, Missouri
6934000 Gasconade River near Rich Fountain, MO
6467500 Missouri River at Yankton, SD
6805500 Platte River at Louisville, NE
6892350 Kansas River at Desoto, KS
Outflow station:
7022000 Mississippi River at Thebes, IL

Figure 6.5 shows cumulative values of water balance components.


Figure 6.5. Cumulative values of the water balance components (units: mm / # of days from January 01);

qoc - surface outflow (Mississippi R. discharge at Thebes);
qic - surface inflow (sum of discharges of Osage R., Gasconade R., Missouri R., Platte R., and Kansas R.);
pc - precipitation depth;
ec - potential evaporation.

The evaporation adjustment coefficient was calculated from the following formula:

k = ( qic + pc - qoc ) / ec
k = 645.4 / 810.6 = 0.796
where:
k = evaporation adjustment coefficient = 0.796;
qoc = total surface outflow = 594.5 [mm/yr];
qic = total surface inflow = 121.8 [mm/yr];
pc = total precipitation depth = 1118.2 [mm/yr];
ec = total potential evaporation = 810.6 [mm/yr].

Figure 6.6 compares the cumulative qic + pc - qoc with different fractions of the evapotranspiration:


Figure 6.6. Comparison of qic + pc - qoc with evaporation depth;

ec = potential evaporation;
ec775 = ec * 0.775 (narrowed later to 0.796);
ecx = ec * 0.9.

The total water balance for the SAST region is shown in Figure 6.7. The adjusted potential evaporation (77.5%) was utilized in calculations.


Figure 6.7. Water storage in the SAST region in 1993 [mm];

The water balance in each gauged zone was calculated applying the adjusted evaporation values. Figure 6.8 presents the cumulative storage in the gauged zones along the Mississippi River.


Figure 6.8 Example of the water storage in gauged zones along the Mississippi River

There are gauged zones for which the discharge measurements contain errors. Figure 6.9 shows an example of water storage estimated from an incorrect flow record. It is not clear whether the error is a result of wrong measurements or of corrupted data. Further investigation is needed.


Figure 6.9 Water storage in gauged zones along the Iowa River; an example of the influence of an erroneous discharge time series on the local water balance (Station 5454500 at Iowa City, Iowa).

Figure 6.10 compares discharge measured at three sites along the Iowa River:

  1. Upstream station, the Iowa River at Marengo, IA , Station ID = 5453100
  2. Middle station, the Iowa River at Iowa City, IA, Station ID = 5454500
  3. Downstream station, Iowa River near Lone Tree, IA, Station ID = 5455700

The error in recorded discharge at station 5454500 is evident.


Figure 6.10 Example of discharge data measured at stations on the Iowa River upstream of, at, and downstream of Iowa City (stations 5453100, 5454500, and 5455700).

7. Summary and Conclusions

A methodology for estimating a spatially distributed daily water balance for a large region has been developed. A daily water balance has been calculated for 261 gauged zones of the SAST region. The gauged zones, as well as the flow connectivity between them, have been determined from the 15" DEM (Digital Elevation Model) and from RF1 (Reach File 1) using Arc/Info GRID tools.

The following GIS databases have been created:

  1. daily precipitation depth at more than 1500 climate stations (the record from 1078 stations was used to create 365 grids of precipitation spatially distributed over the SAST region);
  2. monthly evaporation depth for the 12 months of 1993 for the NOAA climate divisions that overlie the SAST region;
  3. observed daily discharge at 261 USGS gauging stations, for the days from 01 January 93 to 30 September 93.

Since the AML calculations are very slow, the water balance was calculated by a program written in the C language. Arc/Info TABLES was utilized to extract the data from the database, run the C program, and import the results into an INFO file. The evaporation data were multiplied by a coefficient of 0.796. This coefficient ensures that the total SAST water storage at the end of December 1993 is similar to the storage at the beginning of the year (01/01/93).

The storage in the Midwest increased by as much as 1.5 m of water depth in about two months (from the end of May/beginning of June to the end of July/beginning of August). Some measurements of the discharge are not correct (e.g. the flow record from station 5454500, the Iowa River at Iowa City, IA). The source of error needs further investigation.

APPENDIX


RAINSAST.AML


/* -----------------------------------------------------------------------
/* Program:   RAINSAST.AML
/* Purpose:   Creates GRIDS of daily precipitation (10e-3 cm/d)
/*            Before this AML runs IDW (inverse distance weight) it
/*            selects all stations that have precipitation depth record
/*            for a given day (set of stations selected can vary from
/*            day to day.
/*            Runs only in GRID
/* -----------------------------------------------------------------------
/* Usage:     &r RAINSAST <fm> <fd> <tm> <td> <wgrid> <xcell>
/*            <xdat> {YES|NO}
/* Arguments: (input)
/*            (yr = 93)
/*            fm, fd = from year, from month
/*            tm, td = to year, to month
/*            wgrid  = a grid for seting the GRID window
/*            xcell  = cellsize of the new grids
/*            xdat   = point coverage that contains data
/*            yes|no = yes - include records with values = 0
/*                     no  - exclude records and stations that have 0
/*                     (default)
/* Temporary: xxxxx1
/* Globals:
/* Locals:    templ, nitem, nit, mt, dy, selec (variables)
/* -----------------------------------------------------------------------
/* Calls:
/* -----------------------------------------------------------------------
/* Notes:     Data are stored in PAT (point). Each record is related to
/*            gaging stations. The following naming convention is used
/*            for items: myymmdd, where m indicates monthly values,
/*            yy = year, mm = month, dd = day; for example the item that
/*            stores the records for 13 March 1976 has the name p760313
/* -----------------------------------------------------------------------
/* History:   Coded by Pawel Mizgalewicz 08/08/96
/*
/* -----------------------------------------------------------------------
&args fm fd tm td wgrid xcell xdat zero
&severity &error &routine bailout
&if [SHOW PROGRAM] <> GRID &then ; &return This only runs in GRID.
&if [null %xdat%] &then
  &do
    &call usage
    &return
  &end
&if [null %zero%] &then ; &s zero = YES
&s xx = [translate %zero%]
&select %xx%
  &when NO
    &s klm = gt
  &when YES
    &s klm = ge
  &otherwise
    &do
      &call usage
      &return
    &end
&end /* select
/* ========== settings !!! ================
/* year: 93
&s yr = 93
&s templ = pyy0m0d
&s nitem0 = [subst %templ% yy %yr% ]
/* unit convertion factor:
/* 1 inch = 25.4 mm = 0.0254 m (mm are mutipied by 1000
/* to preserve accuracy when converted to INTEGER)
/* To get the results back to mm, the grids need to be
/* divided by 1000
&s cfac = 25400
/* ========================================
&messages &off
/* ====== set the window and the cell size ===========
setwindow %wgrid%
setcell %xcell%
&DO mt = %fm% &to %tm%
  &if %mt% lt 10 &then
    &sv nitem1 = [subst %nitem0% m %mt% ]
  &else
    &sv nitem1 = [subst %nitem0% 0m %mt% ]
  &DO dy = 1 &to 31
    &s ab = [calc %mt% = %fm%] AND [calc %dy% lt %fd%]
    &s bb = [calc %mt% = %tm%] AND [calc %dy% gt %td%]
    &s cc = %ab% OR %bb%
    &IF NOT %cc% &THEN
      &do
        &if %dy% lt 10 &then
          &sv nitem = [subst %nitem1% d %dy% ]
        &else
          &sv nitem = [subst %nitem1% 0d %dy% ]
        &type %nitem%
        /* check if the item exists
        &s iexists = [iteminfo %xdat% -point %nitem% -exists]
        &if %iexists% &then
          &do
            /* &s namit%nit% = %nitem%
            &sv selec = res %nitem% %klm% 0
            &type selecting stations %selec%
            arc reselect %xdat% xxxxx1 point
            %selec%
            ~
            n
            n
          &end
        arc build xxxxx1 point
        %nitem% = int ( %cfac% * idw ( xxxxx1, %nitem% ) )
        kill xxxxx1 all
      &end
  &end
&end
&messages &on
&return
/*-------------
&routine USAGE
/*-------------
&type Usage: &r RAINSAST <from_mt> <from_dy> <to_mt> <to_dy> <a_grid>
&type <cell_size> <point_coverage> {YES|NO}
&return &inform
/*-------------
&routine EXIT
/*-------------
/* delete all temporary files:
&if [exists xxxxx1 -COVER] &then ; kill xxxxx1 all
&messages &on
&return
/*--------------
&routine BAILOUT
/*--------------
&severity &error &ignore
&call exit
&return &warning An error has occurred in SELDATA.AML
/* -----------------------------------------------------------------------
/* EXAMPLE OF APPLICATION
/* -----------------------------------------------------------------------
/*        /* input
/* &s data = $HOME/flood/precip/poinmap/prec50   /* point coverage
/* &s wgrid = $HOME/flood/dem/buf50g             /* grid window sample
/* grid
/* &run rainsast 1 1 1 4 %wgrid% 2000 %data% YES
/* q
/* &return /* return from the example AML
/* -----------------------------------------------------------------------
&r rs 1 1 1 4 ~/flood/dem/buf50g 2000 ~/flood/precip/poinmap/prec50 YES

NEXTWSH.AML


/* -----------------------------------------------------------------------
/* Program:   NEXTWSH.AML
/* Purpose:   Creates an INFO file that contains two items: the unit ID and
/*            the downstream watershed ID. Also creates a grid of watershed
/*            outlets whose values are the downstream watershed IDs.
/*            Runs only in GRID
/* -----------------------------------------------------------------------
/* Usage:     &r NEXTWSH <fdir> <fwsh> <flpp> <infn> <idin> <nxin> <fnxt>
/* Arguments: (input)
/*            fdir = (grid) flow direction
/*            fwsh = (grid) watersheds (partial drainage areas)
/*                   value in VAT = unit_id number
/*            flpp = (grid) pour points (watershed outlets)
/*                   value in VAT = unit_id number
/*            (output)
/*            infn = (info file) name of the INFO file to be created
/*            idin = (item) name of the item in which the unit_id number
/*                   will be stored
/*            nxin = (item) name of the item in which the downstream
/*                   unit_id will be stored
/*            fnxt = (grid) similar to %flpp% (watershed outlets), but
/*                   the values in the VAT are not the unit_id numbers;
/*                   they are equal to the downstream unit ID number.
/* Temporary: xxxtmp, xxxcomnx (GRIDs)
/*            xxxtmp1 (INFO)
/* Globals:
/* Locals:    see Temporary
/* -----------------------------------------------------------------------
/* Calls:
/* -----------------------------------------------------------------------
/* Notes:     The ID of the next watershed for the most downstream
/*            watershed equals 0.
/*            If the flow direction at a watershed pour point can't be
/*            determined, the next watershed ID = -1.
/*            This AML checks neither for the existence and correctness
/*            of the input files nor for the existence of files that have
/*            the same names as the temporary files and the files to be
/*            created.
/*            If an error occurs then all grids and all INFO files that
/*            have the same names as the output grids/INFO files and the
/*            temporary grids/INFO files are erased !!!
/* -----------------------------------------------------------------------
/* History:   Original coding by Pawel Mizgalewicz 12/20/93
/*            (tested on the Allegheny River basin). Major part
/*            extracted from net4.aml (tested for the Cedar River
/*            - Iowa River basin, grid = 3000^2 cells, basin =
/*            3.2*10e6 cells of size 100m*100m),
/*            and converted into a stand-alone procedure 06/10/95
/*            e-mail: pawel@nile.crwr.utexas.edu
/* =======================================================================
&args fdir fwsh flpp infn idin nxin fnxt
&severity &error &routine bailout
&if [SHOW PROGRAM] <> GRID &then
  &return This only runs in GRID.
&if [null %fnxt%] &then
  &do
    &call usage
    &return
  &end
&messages &off
/* -----------------------------------------------------------------------
/* Downstream watershed
/* -----------------------------------------------------------------------
&type searching for next watershed [date -vfull]
&sv wsh = xxxtmp
%wsh% = con ( isnull ( %fwsh% ), 0, %fwsh% )
&type start %fnxt% [date -vfull]
%fnxt% = con (%flpp% > 0, con (%fdir% == 1, %wsh%(1,0), ~
              con (%fdir% == 2, %wsh%(1,1), ~
              con (%fdir% == 4, %wsh%(0,1), ~
              con (%fdir% == 8, %wsh%(-1,1), ~
              con (%fdir% == 16, %wsh%(-1,0), ~
              con (%fdir% == 32, %wsh%(-1,-1), ~
              con (%fdir% == 64, %wsh%(0,-1), ~
              con (%fdir% == 128, %wsh%(1,-1), -1)))))))))
kill xxxtmp all
&if not [exists %fnxt%.vat -info] &then
  buildvat %fnxt%
/* the %fnxt% grid is similar to the %flpp% (watershed outlets) grid.
/* The cell value is equal to the next watershed ID number
xxxcomnx = combine ( %flpp% , %fnxt% )
/* -----------------------------------------------------------------------
/* Create info file
/* -----------------------------------------------------------------------
arc additem xxxcomnx.vat xxxtmp1 %idin% 4 8 B
arc additem xxxtmp1 xxxtmp1 %nxin% 4 8 B
cursor xxnext declare xxxtmp1 info rw
cursor xxnext open
&do &while %:xxnext.aml$next%
  &s :xxnext.%idin% = [value :xxnext.%flpp%]
  &s :xxnext.%nxin% = [value :xxnext.%fnxt%]
  cursor xxnext next
&end
cursor xxnext close
cursor xxnext remove
arc pullitems xxxtmp1 %infn%
%idin%
%nxin%
end
kill xxxcomnx all
&s x = [delete xxxtmp1 -info]
&messages &on
&return
/*-------------
&routine USAGE
/*-------------
&type Usage: &r NEXTWSH <fdir_grid> <wsh_grid> <pourpt_grid> <newINF_name>
&type        <id_item_name> <next_id_item_name> <nextwsh_grid>
&return &inform
/*-------------
&routine EXIT
/*-------------
/* delete all temporary files:
&if [exists xxxtmp1 -info] &then
  &s x = [delete xxxtmp1 -info]
&if [exists xxxtmp -GRID] &then
  kill xxxtmp all
&if [exists xxxcomnx -GRID] &then
  kill xxxcomnx all
&if [exists %fnxt% -GRID] &then
  kill %fnxt% all
&return
/*--------------
&routine BAILOUT
/*--------------
&severity &error &ignore
&call exit
&return &warning An error has occurred in NEXTWSH.AML
/* -----------------------------------------------------------------------
/* EXAMPLE OF APPLICATION
/* -----------------------------------------------------------------------
/* grid
/*
/* input
/* &s fdir = $HOME/iowa/data/crfdr   /* crfdr = flowdirection(elev_grid)
/* &s fwsh = crwsh                   /* watersheds
/* &s flpp = crlpp                   /* pour points
/*
/* output
/* &s fnxt = crnxt                   /* outflow, value = downstream wshd ID
/* &s infn = %fnxt%.dat              /* new info file
/* &s idin = unit_id                 /* item (unit ID)
/* &s nxin = next_id                 /* item (downstream unit ID)
/*
/* &run nextwsh %fdir% %fwsh% %flpp% %infn% %idin% %nxin% %fnxt%
/*
/* q                                 /* quit from GRID
/*
/* &return                           /* return from example AML
/* -----------------------------------------------------------------------
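The nested con(...) expression above maps each pour-point cell to the watershed value of its downstream neighbor using the standard D8 flow-direction codes. The C sketch below is illustrative only (not from the original code); it simply lists the code-to-offset mapping that the expression implements, with offsets given as (column, row) and positive row pointing south:

#include <stdio.h>

int main(void)
{
    /* ESRI D8 codes: 1=E, 2=SE, 4=S, 8=SW, 16=W, 32=NW, 64=N, 128=NE */
    int code[8] = {  1,  2,  4,  8, 16, 32, 64, 128 };
    int dcol[8] = {  1,  1,  0, -1, -1, -1,  0,   1 };
    int drow[8] = {  0,  1,  1,  1,  0, -1, -1,  -1 };

    for (int i = 0; i < 8; ++i)
        printf("fdir %3d -> downstream cell offset (%2d,%2d)\n",
               code[i], dcol[i], drow[i]);
    return 0;
}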

RMAP56.AML


/* -----------------------------------------------------------------------
/* Program:   RMAP56.AML
/* Purpose:   Creates an INFO file that contains the average value of
/*            rainfall (or another parameter) for specified zones. The
/*            daily values are extracted from the precipitation grids.
/* Method:    Zonal average values are calculated and then stored in an
/*            INFO table (one item per day), day by day.
/*            Runs only in GRID
/* -----------------------------------------------------------------------
/* Usage:     &r RMAP56 <fm> <fd> <tm> <td> <p_path> <zone> <outinf>
/*                      <id_itm> <prc> {prx}
/* Arguments: (input)
/*            fm, fd = from month, from day
/*            tm, td = to month, to day
/*            p_path = directory in which the rainfall maps (grids) are
/*                     stored
/*            zone   = (grid) defines the zones and their IDs
/*                     (the grid name is also used as an item name)
/*            outinf = (info) output INFO file to be created
/*            id_itm = (item) name of the item in which the zone ID will
/*                     be stored
/*            prc    = (number) "precision" - the data must be converted
/*                     into integer values. Before they are converted they
/*                     are multiplied by <prc> to preserve the decimal
/*                     part of the value.
/*            prx    = (prefix) prefix of the names of the precipitation
/*                     maps and of the items that are created in the
/*                     <outinf> INFO file = <prx> + name ("p" = default).
/* Temporary: xxx, xxx2, xxxcom (grid), xxxtmp1 (info)
/* Globals:
/* Locals:    ab, bb, cc, nitem, prfnitem, yr, mt (variables)
/* -----------------------------------------------------------------------
/* Calls:
/* -----------------------------------------------------------------------
/* Notes:     The following naming convention is used for the items in the
/*            PAT: pyymmdd, where p indicates precipitation (there is also
/*            a user-defined prefix), yy = year, mm = month, dd = day;
/*            for example, with the default prefix "p", the item that
/*            contains the records for 17 March 1976 is named p760317.
/*            !!!! The precipitation grids used here are in 10e-3 [mm].
/*            To go back to [mm], the %prc% inside this AML is multiplied
/*            by 1000 before the values are stored in the INFO table !!
/* -----------------------------------------------------------------------
/* History:   Adapted from RAININFO.AML 09/10/95
/*            PM.
/* -----------------------------------------------------------------------
&args fm fd tm td p_path zone outinf id_itm prc prx
&severity &error &routine bailout
&if [SHOW PROGRAM] <> GRID &then
  &return This only runs in GRID.
&if [null %prc%] &then
  &do
    &call usage
    &return
  &end
&if [null %prx%] &then
  &s prx = p
&messages &off
/* GRID settings:
&s xxcell = [show setcell]
&s xxwindow = [show setwindow]
/* -----------------------------------------------------------------------
/* Make a copy of the %zone%.VAT file and add item %zone%
/* -----------------------------------------------------------------------
arc additem %zone%.vat xxxtmp1 %zone% 4 5 B
cursor xxcur1 declare xxxtmp1 info rw
cursor xxcur1 open
&do &while %:xxcur1.aml$next%
  &s :xxcur1.%zone% = %:xxcur1.VALUE%
  cursor xxcur1 next
&end
cursor xxcur1 close
cursor xxcur1 remove
/* ========== settings !!! ================
/* year: 93
&s yr = 93
&s templ = yy0m0d
&s nitem0 = [subst %templ% yy %yr% ]
/* ========================================
/* -----------------------------------------------------------------------
/* Do it for all selected years and months
/* -----------------------------------------------------------------------
&DO mt = %fm% &to %tm%
  &if %mt% lt 10 &then
    &sv nitem1 = [subst %nitem0% m %mt% ]
  &else
    &sv nitem1 = [subst %nitem0% 0m %mt% ]
  &DO dy = 1 &to 31
    &s ab = [calc %mt% = %fm%] AND [calc %dy% lt %fd%]
    &s bb = [calc %mt% = %tm%] AND [calc %dy% gt %td%]
    &s cc = %ab% OR %bb%
    &IF NOT %cc% &THEN
      &do
        &if %dy% lt 10 &then
          &sv nitem = [subst %nitem1% d %dy% ]
        &else
          &sv nitem = [subst %nitem1% 0d %dy% ]
        &type processing item %nitem% ... [date -vfull]
        /* ----------------------------------------------------------------
        /* &DO yr = %ty% &to %fy% &by -1
        /* &DO mt = 12 &to 1 &by -1
        /* &s ab = [calc %yr% = %fy%] AND [calc %mt% lt %fm%]
        /* &s bb = [calc %yr% = %ty%] AND [calc %mt% gt %tm%]
        /* &s cc = %ab% OR %bb%
        /* &IF NOT %cc% &THEN
        /* &do
        /* &s templ = mxxxx0y
        /* &s nitem = [subst %templ% xxxx %yr% ]
        /* &if %mt% lt 10 &then
        /* &sv nitem = [subst %nitem% y %mt% ]
        /* &else
        /* &sv nitem = [subst %nitem% 0y %mt% ]
        /* &type processing item %nitem% ... [date -vfull]
        /* ----------------------------------------------------------------
        /* put together the name of the precipitation grid
        &sv xxx = %p_path%%prx%%nitem%
        setcell %zone%
        setwindow %zone%
        /* calculate the zonal mean (units 10e-3 mm/d)
        xxx3 = zonalmean ( %zone% , %xxx% )
        /* multiply by the precision factor
        xxx4 = %prc% * xxx3
        kill xxx3 all
        xxx2 = int ( xxx4 )
        kill xxx4 all
        /* xxx2 = int ( %prc% * zonalmean ( %zone% , %xxx% ))
        /* here an error may occur:
        xxxcom = combine (%zone%, xxx2)
        kill xxx2 all
        &s prfnitem = %prx%%nitem%
        arc additem xxxcom.vat xxxcom.vat %prfnitem% 4 12 F 4 xxx2
        cursor xxcur1 declare xxxcom.vat info rw
        cursor xxcur1 open
        /* !!! %prc% is multiplied by 1000 because the precipitation grids
        /* are not in [mm] but in 10e-3 [mm] (to preserve precision
        /* in integer form)
        &s prc2 = [calc %prc% * 1000]
        &do &while %:xxcur1.aml$next%
          &s :xxcur1.%prfnitem% = [calc %:xxcur1.xxx2% / %prc2%]
          cursor xxcur1 next
        &end
        cursor xxcur1 close
        cursor xxcur1 remove
        arc joinitem xxxtmp1 xxxcom.vat xxxtmp1 %zone% %zone%
        arc dropitem xxxtmp1 xxxtmp1 xxx2
        kill xxxcom all
      &end
  &end
&end
/* -----------------------------------------------------------------------
&type Adding ID item ... [date -vfull]
/* -----------------------------------------------------------------------
arc additem xxxtmp1 xxxtmp1 %id_itm% 4 8 B # %zone%
cursor xxcur2 declare xxxtmp1 info rw
cursor xxcur2 open
&do &while %:xxcur2.aml$next%
  &s :xxcur2.%id_itm% = [value :xxcur2.%zone%]
  cursor xxcur2 next
&end
cursor xxcur2 close
cursor xxcur2 remove
/* -----------------------------------------------------------------------
&type Cleaning the INFO table (dropping redundant items) ... [date -vfull]
/* -----------------------------------------------------------------------
arc dropitem xxxtmp1 xxxtmp1 value
arc dropitem xxxtmp1 xxxtmp1 count
arc dropitem xxxtmp1 %outinf% %zone%
&s x = [delete xxxtmp1 -info]
/* restore GRID settings:
setcell %xxcell%
setwindow %xxwindow%
&messages &on
&return
/*-------------
&routine USAGE
/*-------------
&type Usage: &r RMAP56 <fm> <fd> <tm> <td> <p_path> <zone> <out_inf>
&type        <ID_item> <prc> {prefix}
&return &inform
/*-------------
/* &routine CHECK
/*-------------
/* If a temporary file exists, inform and exit
/* If the file to be built exists, inform and exit
/* If an input file is not correct or does not exist, inform and exit
/* return
/*-------------
&routine EXIT
/*-------------
/* delete all temporary files:
&if [exists xxxtmp1 -info] &then
  &s x = [delete xxxtmp1 -info]
&if [exists xxx -GRID] &then
  kill xxx all
&if [exists xxx2 -GRID] &then
  kill xxx2 all
&if [exists xxx3 -GRID] &then
  kill xxx3 all
&if [exists xxx4 -GRID] &then
  kill xxx4 all
&if [exists xxxcom -GRID] &then
  kill xxxcom all
/* restore GRID settings:
setcell %xxcell%
setwindow %xxwindow%
&messages &on
&return
/*--------------
&routine BAILOUT
/*--------------
&severity &error &ignore
&call exit
&return &warning An error has occurred in RMAP56.AML
/* -----------------------------------------------------------------------
/* EXAMPLE OF APPLICATION
/* -----------------------------------------------------------------------
/* grid
/* &s fm = 1
/* &s fd = 1
/* &s tm = 1
/* &s td = 31
/* &s p_path = $home/iowa/rainmaps/
/* &s zone = crwsh
/* &s id = unit_id
/*
/* &r rmap56 %fm% %fd% %tm% %td% %p_path% %zone% prcinf2 %id% 10 p
/*
/* q
/* &return
/* -----------------------------------------------------------------------
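Because the zonal means are carried through GRID as integers, RMAP56.AML multiplies by <prc> before truncation and divides by <prc> * 1000 when it writes millimetres to the INFO table (the extra factor of 1000 undoes the 10e-3 mm storage units of the precipitation grids). A minimal C sketch of that round trip, with a hypothetical zonal-mean value:

#include <stdio.h>

int main(void)
{
    double zonal_mean = 12345.6;   /* hypothetical zonal mean, 10e-3 mm/d */
    int    prc = 10;               /* "precision" argument of the AML     */

    long   stored = (long)(prc * zonal_mean);        /* int(prc * zonalmean)  */
    double mm     = stored / (double)(prc * 1000);   /* value written to INFO */

    printf("stored integer = %ld, recovered depth = %.4f mm/d\n", stored, mm);
    return 0;
}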

EMAP56.AML


grid
&messages &off
/* -----------------------------------------------------------------------
/* Program:   EMAP56.AML
/* !!!! This is a version of RMAP56.AML !!! Some comments written here
/* are not valid !!! Parameters are set inside this AML; they are
/* not passed through &args. A monthly time step is applied !!!!!
/* Purpose:   Creates an INFO file that contains the average value of
/*            evaporation (or another parameter) for specified zones.
/* Method:    Zonal average values are calculated and then stored in an
/*            INFO table (one item per month), month by month.
/*            Runs only in GRID
/* -----------------------------------------------------------------------
/* Usage:     &r RMAP56 <fm> <fd> <tm> <td> <p_path> <zone> <outinf>
/*                      <id_itm> <prc> {prx}
/* Arguments: (input)
/*            fm, fd = from month, from day
/*            tm, td = to month, to day
/*            p_path = directory in which the maps (grids) are stored
/*            zone   = (grid) defines the zones and their IDs
/*                     (the grid name is also used as an item name)
/*            outinf = (info) output INFO file to be created
/*            id_itm = (item) name of the item in which the zone ID will
/*                     be stored
/*            prc    = (number) "precision" - the data must be converted
/*                     into integer values. Before they are converted they
/*                     are multiplied by <prc> to preserve the decimal
/*                     part of the value.
/*            prx    = (prefix) prefix of the names of the input maps and
/*                     of the items that are created in the <outinf> INFO
/*                     file = <prx> + name ("e" here).
/* Temporary: xxx, xxx2, xxxcom (grid), xxxtmp1 (info)
/* Globals:
/* Locals:    ab, bb, cc, nitem, prfnitem, yr, mt (variables)
/* -----------------------------------------------------------------------
/* Calls:
/* -----------------------------------------------------------------------
/* Notes:     The following naming convention is used for the items in the
/*            PAT: prefix + yymm00, where yy = year and mm = month.
/*            !!!! The grids used here are in 10e-3 [mm].
/*            To go back to [mm], the %prc% inside this AML is multiplied
/*            by 1000 before the values are stored in the INFO table !!
/* -----------------------------------------------------------------------
/* History:   Adapted from RAININFO.AML 09/10/95
/*            PM.
/* -----------------------------------------------------------------------
/* &args fm fd tm td p_path zone outinf id_itm prc prx
&s fm = 1
&s tm = 13
&s zone zones56
&s outinf evap.dat
&s id_itm unit_id
&s prc = 10
&s prx = e
&severity &error &routine bailout
&if [SHOW PROGRAM] <> GRID &then
  &return This only runs in GRID.
&if [null %prc%] &then
  &do
    &call usage
    &return
  &end
&if [null %prx%] &then
  &s prx = e
&messages &off
/* GRID settings:
&s xxcell = [show setcell]
&s xxwindow = [show setwindow]
/* -----------------------------------------------------------------------
/* Make a copy of the %zone%.VAT file and add item %zone%
/* -----------------------------------------------------------------------
arc additem %zone%.vat xxxtmp1 %zone% 4 5 B
cursor xxcur1 declare xxxtmp1 info rw
cursor xxcur1 open
&do &while %:xxcur1.aml$next%
  &s :xxcur1.%zone% = %:xxcur1.VALUE%
  cursor xxcur1 next
&end
cursor xxcur1 close
cursor xxcur1 remove
/* ========== settings !!! ================
/* year: 93
&s yr = 93
&s templ = yy0m00
&s nitem0 = [subst %templ% yy %yr% ]
/* ========================================
/* -----------------------------------------------------------------------
/* Do it for all selected years and months
/* -----------------------------------------------------------------------
&DO mt = %fm% &to %tm%
  &if %mt% lt 10 &then
    &sv nitem = [subst %nitem0% m %mt% ]
  &else
    &sv nitem = [subst %nitem0% 0m %mt% ]
  &type processing item %nitem% ... [date -vfull]
  /* ----------------------------------------------------------------------
  /* put together the name of the evaporation grid
  &sv xxx = %prx%%nitem%
  setcell %zone%
  setwindow %zone%
  /* calculate the zonal mean (units 10e-3 mm/d)
  xxx3 = zonalmean ( %zone% , %xxx% )
  /* multiply by the precision factor
  xxx4 = %prc% * xxx3
  kill xxx3 all
  xxx2 = int ( xxx4 )
  kill xxx4 all
  /* here an error may occur:
  xxxcom = combine (%zone%, xxx2)
  kill xxx2 all
  &s prfnitem = %prx%%nitem%
  arc additem xxxcom.vat xxxcom.vat %prfnitem% 4 12 F 4 xxx2
  cursor xxcur1 declare xxxcom.vat info rw
  cursor xxcur1 open
  /* !!! %prc% is multiplied by 1000 because the grids are not in [mm]
  /* but in 10e-3 [mm] (to preserve precision in integer form)
  &s prc2 = [calc %prc% * 1000]
  &do &while %:xxcur1.aml$next%
    &s :xxcur1.%prfnitem% = [calc %:xxcur1.xxx2% / %prc2%]
    cursor xxcur1 next
  &end
  cursor xxcur1 close
  cursor xxcur1 remove
  arc joinitem xxxtmp1 xxxcom.vat xxxtmp1 %zone% %zone%
  arc dropitem xxxtmp1 xxxtmp1 xxx2
  kill xxxcom all
  /* &end
  /* &end
&end
/* -----------------------------------------------------------------------
&type Adding ID item ... [date -vfull]
/* -----------------------------------------------------------------------
arc additem xxxtmp1 xxxtmp1 %id_itm% 4 8 B # %zone%
cursor xxcur2 declare xxxtmp1 info rw
cursor xxcur2 open
&do &while %:xxcur2.aml$next%
  &s :xxcur2.%id_itm% = [value :xxcur2.%zone%]
  cursor xxcur2 next
&end
cursor xxcur2 close
cursor xxcur2 remove
/* -----------------------------------------------------------------------
&type Cleaning the INFO table (dropping redundant items) ... [date -vfull]
/* -----------------------------------------------------------------------
arc dropitem xxxtmp1 xxxtmp1 value
arc dropitem xxxtmp1 xxxtmp1 count
arc dropitem xxxtmp1 %outinf% %zone%
&s x = [delete xxxtmp1 -info]
/* restore GRID settings:
setcell %xxcell%
setwindow %xxwindow%
&messages &on
&type end of dec [date -vfull]
q
q
&return
/*-------------
&routine USAGE
/*-------------
&type Usage: &r RMAP56 <fm> <fd> <tm> <td> <p_path> <zone> <out_inf>
&type        <ID_item> <prc> {prefix}
&return &inform
/*-------------
/* &routine CHECK
/*-------------
/* If a temporary file exists, inform and exit
/* If the file to be built exists, inform and exit
/* If an input file is not correct or does not exist, inform and exit
/* return
/*-------------
&routine EXIT
/*-------------
/* delete all temporary files:
&if [exists xxxtmp1 -info] &then
  &s x = [delete xxxtmp1 -info]
&if [exists xxx -GRID] &then
  kill xxx all
&if [exists xxx2 -GRID] &then
  kill xxx2 all
&if [exists xxx3 -GRID] &then
  kill xxx3 all
&if [exists xxx4 -GRID] &then
  kill xxx4 all
&if [exists xxxcom -GRID] &then
  kill xxxcom all
/* restore GRID settings:
setcell %xxcell%
setwindow %xxwindow%
&messages &on
&return
/*--------------
&routine BAILOUT
/*--------------
&severity &error &ignore
&call exit
&return &warning An error has occurred in EMAP56.AML
/* -----------------------------------------------------------------------
/* EXAMPLE OF APPLICATION
/* -----------------------------------------------------------------------
/* grid
/* &s fm = 1990
/* &s fd = 1
/* &s tm = 1990
/* &s td = 12
/* &s p_path = $home/iowa/rainmaps/
/* &s zone = crwsh
/* &s id = unit_id
/*
/* &r rmap56 %fm% %fd% %tm% %td% %p_path% %zone% prcinf2 %id% 10 p
/*
/* q
/* &return
/* -----------------------------------------------------------------------
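The practical difference from RMAP56.AML is the monthly item naming: prefix e and template yy0m00, so the day field is fixed at 00 (for example e930100 for January 1993). BAL1.AML relies on this when it unloads e%nitem00%. A tiny C sketch of the convention (illustrative only, assuming the yy0m00 template shown above):

#include <stdio.h>

int main(void)
{
    int yr = 93;
    /* monthly evaporation items: eYYMM00 */
    for (int mt = 1; mt <= 12; ++mt)
        printf("month %2d -> item e%02d%02d00\n", mt, yr, mt);
    return 0;
}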

BAL1.AML


/* version for the daily time step ! JANUARY !!!
&s flw = d56eqp1.dat   /* total flow and evaporation, January precipitation
/* specify period: JANUARY
&s fm = 1
&s fd = 1
&s tm = 1
&s td = 31
/* output info file (copied from data56, which must exist):
&s out = bal1
copyinfo data56 %out%
/* input1  xxflwin  (name of the file written by the unload command)
/* input2  xxprcin
/* input3  xxevain
/* output  xxbalout
&sv oldname = unit_id
tables
/* ========== settings !!! ================
/* year: 93
&s yr = 93
&s templ = yy0m0d
&s nitem0 = [subst %templ% yy %yr% ]
/* ========================================
/* -----------------------------------------------------------------------
/* Do it for all selected (years and) months and days
/* -----------------------------------------------------------------------
&DO mt = %fm% &to %tm%
  &if %mt% lt 10 &then
    &sv nitem1 = [subst %nitem0% m %mt% ]
  &else
    &sv nitem1 = [subst %nitem0% 0m %mt% ]
  &sv nitem00 = [subst %nitem1% 0d 00 ]
  /* ====== unload monthly evaporation depth ==============================
  /* select %eva%
  /* unload xxevain unit_id unit_nx area_km2 order e%nitem00% DELIMITED INIT
  &DO dy = 1 &to 31
    &s ab = [calc %mt% = %fm%] AND [calc %dy% lt %fd%]
    &s bb = [calc %mt% = %tm%] AND [calc %dy% gt %td%]
    &s cc = %ab% OR %bb%
    &IF NOT %cc% &THEN
      &do
        &if %dy% lt 10 &then
          &sv nitem = [subst %nitem1% d %dy% ]
        &else
          &sv nitem = [subst %nitem1% 0d %dy% ]
        &type processing item %nitem% ... [date -vfull]
        /* ----------------------------------------------------------------
        &sv ddd = [substr %nitem% 5 2 ]
        &sv mmm = [substr %nitem% 3 2 ]
        &sv yyy = [substr %nitem% 1 2 ]
        &sv scott = q%mmm%%ddd%%yyy%
        /* &sv eva = [substr %nitem% 1 4 ]
        /* &sv eva2 = e%eva%00
        select %flw%
        unload xxflwin unit_id next_id areakm2 e%nitem00% %scott% p%nitem% DELIMITED INIT
        /* select %prc%
        /* unload xxprcin unit_id p%nitem% DELIMITED INIT
        /* select %eva%
        /* unload xxevain unit_id unit_nx gswsh area_km2 order e%nitem% DELIMITED INIT
        &sys b4v1 xxflwin xxbalout
        /* wait until the output file appears:
        &s i = 0
        &do &until [exists xxbalout -file]
          &s i = %i% + 1
        &end
        define x%nitem%.dat
        unit_id,4,8,B
        b%nitem%,4,12,F,6
        ~
        add from xxbalout
        select x%nitem%.dat
        /* ------- delete what is not needed ---------
        &s x = [delete xxbalout -file]
        &if %x% eq 0 &then
          &type file xxbalout has been deleted
        &else
          &type ERROR: could not delete file xxbalout
        &s x = [delete xxflwin -file]
        &if %x% eq 0 &then
          &type file xxflwin has been deleted
        &else
          &type ERROR: could not delete file xxflwin
        &sys arc joinitem %out% x%nitem%.dat %out% unit_id %oldname%
        kill x%nitem%.dat
        &sv oldname = b%nitem%
      &end
  &end
&end
q
q
&return
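BAL1.AML communicates with the b4v1 program through a comma-delimited file whose column order must match both the unload statement above and the fscanf format in b4v1.c: unit_id, next_id, area [km2], evaporation [mm/d], outflow [cfs], precipitation [mm/d]. The C sketch below (illustrative values only, not part of the original toolset) writes one such record:

#include <stdio.h>

int main(void)
{
    long  unit_id = 101, next_id = 102;   /* hypothetical zone IDs          */
    float area_km2 = 1234.5f;             /* hypothetical zone area [km2]   */
    float evap_mm = 2.1f;                 /* monthly evaporation [mm/d]     */
    float flow_cfs = 350.0f;              /* gaged outflow [cfs]            */
    float prec_mm = 15.3f;                /* daily precipitation [mm/d]     */

    /* one line of the xxflwin exchange file read by b4v1 */
    printf("%li,%li,%f,%f,%f,%f\n",
           unit_id, next_id, area_km2, evap_mm, flow_cfs, prec_mm);
    return 0;
}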

B4V1.C


#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#define SIZE 512

/* b4v1: daily water balance for the gaged zones.
 * argv[1] = comma-delimited input (unloaded from INFO by BAL1.AML):
 *           unit_id, next_id, area [km2], evaporation [mm/d],
 *           outflow [cfs], precipitation [mm/d]
 * argv[2] = output file: unit_id, balance [mm/d]
 */
int main(int argc, char *argv[])
{
    int i, j, gsni, gsnn;
    long unid[1000], unnx[1000], id, nx;
    float gsqo[1000], unar[1000], unev[1000], unpr[1000], bal[1000],
          gsin[1000], ar, qo, pr, ev, x1;
    char buf[SIZE];
    FILE *fqin, *ftable2;

    if (argc < 3) {
        printf("wrong number of arguments = %d\n", argc);
        exit(1);
    }

    /* ------------------------ flow data ----------------------- */
    fqin = fopen(argv[1], "r");
    /* fgets(buf, SIZE, fqin); */
    i = 0;
    while (fscanf(fqin, "%li,%li,%f,%f,%f,%f",
                  &id, &nx, &ar, &ev, &qo, &pr) != EOF) {
        unid[i] = id;
        unnx[i] = nx;
        unar[i] = ar;
        unev[i] = ev;
        gsqo[i] = qo;
        unpr[i] = pr;
        /* printf("record %li,%li,%f,%f,%f,%f\n",
                  unid[i], unnx[i], unar[i], unev[i], gsqo[i], unpr[i]); */
        ++i;
    }
    gsnn = i - 1;
    gsni = i;
    fclose(fqin);

    /* Set initial values of the river water inflow: */
    for (i = 0; i <= gsnn; ++i) {
        gsin[i] = 0.0;
    }

    /* Calculate river water inflow: add the outflow of each unit to the
       inflow of its downstream unit (unnx[i] == unid[j]): */
    for (i = 0; i <= gsnn; ++i) {
        for (j = 0; j <= gsnn; ++j)
            if (unnx[i] == unid[j]) {
                gsin[j] = gsin[j] + gsqo[i];
            }
    }

    /* Calculate (outflow - inflow) * conversion factor / area:
     * 1 cfs = 2446.6 m3/d; 1 km2 = 10^6 m2;
     * cfs * 2446.6 / (area_km2 * 10^6) = m/d  =>  cfs * 2.4466 / area_km2 = mm/d */
    for (i = 0; i <= gsnn; ++i) {
        x1 = 2.4466 * (gsqo[i] - gsin[i]) / unar[i];
        bal[i] = unpr[i] - unev[i] - x1;
    }

    /* replace very large numbers by -999 */
    /* for (i = 0; i <= gsnn; ++i) {
        if (fabs(bal[i]) > 999) {
            bal[i] = -999;
        }
    } */

    /* Write results to the output file */
    ftable2 = fopen(argv[2], "w");
    /* arc view version:
       fprintf(ftable2, "\"%s\",\"%s\"\n", argv[3], argv[4]); */
    for (j = 0; j <= gsnn; ++j)
        fprintf(ftable2, "%li,%f\n", unid[j], bal[j]);
    fclose(ftable2);
    return 0;
}
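The 2.4466 factor in b4v1.c converts a discharge in cfs to an equivalent depth in mm/d over an area given in km2: 1 cfs = 0.0283168 m3/s * 86400 s/d = 2446.6 m3/d, and spreading that volume over 10^6 m2 per km2 while expressing the depth in mm gives 2446.6 / 10^6 * 1000 = 2.4466 mm/d per (cfs/km2). A short check of that factor (a sketch, not part of the original program; the discharge and area values are hypothetical):

#include <stdio.h>

/* net outflow depth in mm/d for a zone */
static double flow_depth_mm(double qout_cfs, double qin_cfs, double area_km2)
{
    return 2.4466 * (qout_cfs - qin_cfs) / area_km2;
}

int main(void)
{
    /* hypothetical zone: 350 cfs out, 120 cfs in, 1234.5 km2 */
    printf("net outflow depth = %.3f mm/d\n",
           flow_depth_mm(350.0, 120.0, 1234.5));
    return 0;
}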

How to ... notes



References

Bhowmik, N.G., A.G. Buck, S.A. Changnon, R.H. Dalton, A. Durgunoglu, M. Demissie, A.R. Juhl, H.V. Knapp, K.E. Kunkel, S.A. McConkey, R.W. Scott, K.P. Singh, T.D. Soong, R.E. Sparks, A.P. Visocky, D.R. Vonnahme, W.M. Wendland, 1994. The 1993 Flood on the Mississippi River in Illinois. Illinois State Water Survey Miscellaneous Publications 151.

FMRC, 1994. A Blueprint for Change: Sharing the Challenge: Floodplain Management into the 21st Century. Report of the Interagency Floodplain Management Review Committee to the Administration Floodplain Management Task Force, Washington, D.C.

Hydrosphere, 1993b. Climatedata, TD 3200 Summary of the Day, Cooperative Observer Network; User Manual and CD-ROMs, Hydrosphere Data Products, Inc., 1002 Walnut, Suite 200, Boulder, CO 80301 (http://www.hydrosphere.com/).

Parrett, Ch., Melcher, N. B., and James, R. W. Jr., 1993. Flood Discharges in the Upper Mississippi River Basin, 1993. U.S. Geological Survey Circular 1120-A, 14 p.

Rea, A. and J.R. Cederstrand, 1993. GCIP Reference Data Set (GREDS), GEWEX Continental-Scale International Project, U.S. Geological Survey Open-File Report 94-388, CD-ROM.