Urban Flood Risk Model - Serv.built values we cannot explain

Sure, @rxchelzhxng: davefisher@stanford.edu

Thanks! I have sent an email with the attached Urban Flood Risk Model inputs and outputs :slight_smile:

It seems the attached files were too large for the email to go through, so I have resent the email with a link to the Drive instead. Please let me know if you got it :slight_smile:

Hi @rxchelzhxng, thanks for sharing your data. I don’t know the exact value you used for the rainfall depth for those particular results, but here’s what I see for the other numbers. In particular, the pixel size of the input lulc raster is not exactly what you thought it was.

pixel_area = 13.8466 m * 13.8466 m
Q_m3.tif + runoff_retention_m3.tif = 6.48043 m^3 (true for all pixels, I added the rasters together.)
Rainfall volume (??? meters * pixel_area) = ???

For a rainfall depth of 0.038 m, volume = 0.038 m * 191.7 m^2 = 7.286 m^3. I’m not sure how to explain that small discrepancy, but I’m also not sure 0.038 m was the value you used. It’s worth double-checking that with the model’s log, or your python script.

Let me know what you think!

Hi Dave,

The rainfall depth I entered was 33.8 mm, so it seems that the rainfall volume, 6.4804 m^3, is the same as your calculated sum of Q_m3.tif + runoff_retention_m3.tif. I am interested in how you calculated the pixel_area to be 13.8466 m * 13.8466 m = 191.7 m^2, since I calculated the area of the lulc raster as 226 km^2. I know the ground-truth area of that watershed and it is a lot bigger than 191.7 m^2.
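That per-pixel number does work out exactly; here is the quick arithmetic check in python, using the 33.8 mm depth and the 13.8466 m pixel size quoted above:

# Per-pixel water volume = rainfall depth (m) * pixel area (m^2)
pixel_area = 13.8466 * 13.8466      # ~191.73 m^2, from the raster metadata
rainfall_depth = 33.8 / 1000        # 33.8 mm converted to meters
print(rainfall_depth * pixel_area)  # ~6.4804 m^3, matching Q_m3 + runoff_retention_m3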

Thanks once again!

Oh good!

The pixel dimensions come straight from the raster metadata itself, no calculations necessary. You can see this in various ways, for example in the “layer properties” menu of QGIS or ArcGIS, or with gdalinfo at the command line (if you have GDAL installed). That’s how the InVEST model determines the pixel size: by reading the raster’s metadata. 191.7 m^2 is the area of one single pixel in the raster. And all the model equations are applied per pixel, so that’s the relevant pixel area, rather than the area of the entire watershed.
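If you’d rather check it from python than in QGIS, here is a minimal sketch using GDAL (the path is just a placeholder):

from osgeo import gdal

ds = gdal.Open('path/to/lulc.tif')  # placeholder path
gt = ds.GetGeoTransform()
print(gt[1], gt[5])                 # pixel width and height, in CRS units (height is usually negative)
print(gt[1] * abs(gt[5]))           # area of one pixel, e.g. 13.8466 * 13.8466 = ~191.7 m^2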

Hope that helps,

I just checked the layer properties in QGIS and I do see 13.8466 by 13.8466 for the dimensions of one pixel. I originally thought it was 10 m by 10 m (100 m^2), but that small difference in pixel size results in almost a two-fold difference in area. Thanks for pointing that out!

For this calculation, did you take the maximum flood volume in the Q_m3.tif raster? That’s where I’m seeing the 6.48043. If not, could you explain how you added the rasters?

Also, if I assume that the rainfall volume per pixel is 6.48 m^3 and I multiply by the number of pixels (2480 * 2412) to get the total rainfall volume, I get 3.88E7 m^3. But this isn’t the same as summing the output from the flood_risk_service.shp file. Any thoughts on why this could be?

No problem! As an aside, this is where it could be important to consider the projection & coordinate system you’re using. I noticed the inputs are in a “pseudo-mercator” projection (aka the Google Maps projection), which is notoriously bad at representing accurate area. Areas are highly distorted/exaggerated at high latitudes. For fairly local analyses like this, a UTM projection can be a good choice.
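To get a feel for the size of that distortion, here is a rough, hypothetical check (the latitude below is made up for illustration, not taken from your data):

import math

# In pseudo-mercator (EPSG:3857), the linear scale factor grows roughly as 1/cos(latitude),
# so pixel sizes are stretched away from the equator.
latitude = 44.0                               # hypothetical latitude, for illustration only
scale = 1 / math.cos(math.radians(latitude))
print(10 * scale)                             # a true 10 m cell would show up as ~13.9 m here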

I used the “raster calculator” in QGIS. I knew that the two rasters were perfectly aligned (same extent and cell size), so the calculator could simply “stack” them and do the sum on each pixel stack, producing a third raster. All the pixels in the output raster had that same 6.4… value.
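If you ever want to do that same sum in python instead of QGIS, a minimal sketch with pygeoprocessing might look like this (the output path and nodata value are placeholders, nodata handling is omitted for brevity, and the exact arguments depend on your pygeoprocessing version):

from osgeo import gdal
import pygeoprocessing

# Apply a local operation (here, a per-pixel sum) to the stack of aligned rasters,
# writing the result to a new raster.
pygeoprocessing.raster_calculator(
    [('Q_m3.tif', 1), ('runoff_retention_m3.tif', 1)],  # (path, band) pairs
    lambda q, r: q + r,                                 # runs block-by-block on numpy arrays
    'total_volume.tif',                                 # placeholder output path
    gdal.GDT_Float32,
    -1)                                                 # placeholder nodata value for the output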

Yep! Many of the pixels in that array of 2480 x 2412 are outside the boundary of the watershed polygon. Those pixels have a “nodata” value. QGIS hides them for convenience, but they are still part of that 2480 x 2412 grid. If you want to reproduce the watershed summary step, the GIS tool to use is “zonal statistics”. That’s exactly what the model does for you, using a very well-tested zonal stats tool.
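If you want to see that for yourself, here is a quick sketch (the path is a placeholder, and it assumes the nodata value is set in the file):

from osgeo import gdal

band = gdal.Open('Q_m3.tif').GetRasterBand(1)  # placeholder path
array = band.ReadAsArray()
valid = array != band.GetNoDataValue()
print(array.size)   # every pixel in the grid, i.e. rows * columns
print(valid.sum())  # only the pixels inside the watershed carry real values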


This is a good point! Does this mean the projection I’m currently using is the reason the lulc pixel resolution is 13.85 m per pixel instead of the 10 m per pixel documented for the Copernicus satellite data?

Thanks for the tips on using the “zonal statistics” plugin and the “raster calculator”!

That’s a good question and I’m not really sure off-hand. If possible, I would try going back to the original dataset, before it was projected to pseudo-mercator, and check its projection & cell size. Then, if it’s not already in meters, project it to a different coordinate system, maybe UTM or an “equal-area” projection that uses meters as units, and check the resulting cell size.


@rxchelzhxng -

Any time we reproject a raster, we get to choose what pixel size it has. Often, data comes in geographic coordinates, where the units are degrees. When we reproject/warp that to a projected coordinate system, the GIS tool asks what we want the pixel size to be. It will give a default value, but you can change it. Personally, I always choose a round number like 10 m, and don’t leave the default. So like @dave says, it might be worth going back to the original dataset and redoing the projection, choosing 10 m (or whatever you want) as the resolution. Just remember that if you change it too much from the original resolution, you’ll start to change the underlying land use data.

~ Stacie


Hi Stacie,

Thanks for the insight! I am interested in validating the original dataset, which is in EPSG:3857 (WGS 84 / Pseudo-Mercator), in Python before reprojecting it to a different coordinate system. Is there a way to do this in Python without a GIS tool?

With the GIS tool, do you know where I can view the default value of the pixel size and change it? I am currently seeing this:

PROJCRS["WGS 84 / Pseudo-Mercator",
    BASEGEOGCRS["WGS 84",
        DATUM["World Geodetic System 1984",
            ELLIPSOID["WGS 84",6378137,298.257223563,
                LENGTHUNIT["metre",1]]]

Is the lengthunit where you typically round to 10m?

Thanks once again!

@rxchelzhxng, if you’re running InVEST from Python, then you should have pygeoprocessing installed. So you could do:

import pygeoprocessing
raster_info = pygeoprocessing.get_raster_info('path/to/your/raster.tif')
raster_info['pixel_size']  # (x, y) pixel dimensions, in the units of the raster's CRS

That’s exactly what the InVEST models do to find that information. But you cannot change these values. To change the pixel size, you have to go back to the original dataset and project (or “warp”) it, choosing a pixel size for the new raster, like Stacie says. It’s easiest to do this in GIS, but if you want to do it in Python, you could use pygeoprocessing.warp_raster. You can find the documentation here: pygeoprocessing/geoprocessing.py at 76e82720532b588f3fc0763db8a6c64c0bdfab43 · natcap/pygeoprocessing · GitHub
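If you do want to stay in Python for that warp step too, a minimal sketch might look like the following (the paths and the EPSG code are placeholders; double-check the argument names against the linked docs for your version):

from osgeo import osr
import pygeoprocessing

# Build the target CRS (EPSG:32610 is just an example UTM zone; pick the right one for your area).
srs = osr.SpatialReference()
srs.ImportFromEPSG(32610)

# Warp to 10 m pixels; 'near' resampling keeps the lulc class values intact.
pygeoprocessing.warp_raster(
    'path/to/original_lulc.tif',   # placeholder input path
    (10, -10),                     # target pixel size (x, y); y is negative by convention
    'path/to/lulc_utm_10m.tif',    # placeholder output path
    'near',
    target_projection_wkt=srs.ExportToWkt())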


Currently, I’ve tried “Set Layer CRS” and it seems that the pixel size stays the same.
I don’t see where to change the default value of the pixel size in QGIS. Any ideas on where it could be?

We cannot change the pixel size of a raster; it’s an intrinsic part of the file. Instead, we have to create a new raster with the desired target pixel size. In QGIS, do this by right-clicking the original raster in the Layers panel, then “Export” > “Save As”. There you can choose where to save the new file and choose the “resolution”, which will always have units that match the units of the CRS (which you can also change at that point, or leave the same).


Hi,

I am trying to reproduce this zonal statistics step from my Urban Flood Risk Mitigation model run, in order to derive the number of pixels in the raster that fall within the actual watershed region. I took a look at the documentation here https://github.com/natcap/pygeoprocessing/blob/76e82720532b588f3fc0763db8a6c64c0bdfab43/src/pygeoprocessing/geoprocessing.py#L1859 for the zonal_statistics function and wanted to know more about the output dictionary.

Would the count parameter in the dictionary be the number of pixels that represent the watershed region?

Is the sum parameter supposed to be the sum of the count pixels and the nodata_count pixels?

Thanks

Hi @rxchelzhxng, that’s right: count is the number of valid pixels inside the polygon. But sum is not about the number of pixels; it’s the sum of the values of all the pixels inside the polygon.
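For reference, here is a minimal sketch of that call and the dictionary it returns (the paths are placeholders; see the linked docs for the full keyword arguments):

import pygeoprocessing

# Summarize Q_m3.tif within the watershed polygon(s).
stats = pygeoprocessing.zonal_statistics(
    ('Q_m3.tif', 1),   # (raster path, band index)
    'watershed.shp')   # the aggregating polygon vector

# The result is keyed by each polygon's FID.
for fid, s in stats.items():
    print(fid, s['count'])         # valid pixels inside the polygon
    print(fid, s['nodata_count'])  # nodata pixels inside the polygon
    print(fid, s['sum'])           # sum of the pixel values (not a pixel count)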


Thanks, that clarifies it!