Starting an InVEST python model

Dear community

I have successfully installed the python InVEST code and would like to run a pollination model.

Is it right that I just have to run:
invest -vvv run ?

Where do I have to store all my input files for the model?

How do I have to feed in all the input files for the model?

Kind regards
Sibylle

Hi sibylles, that’s a good start!

To run an InVEST model from the command line, you need to specify the model name, the path to a datastack (a JSON file that specifies the inputs), and the path to a workspace directory (where you want the output files to go).

The format looks like this:

invest -vvv run <modelname> -d <datastack json file> -w <output_workspace>

and you can get more info about commands and options in the InVEST CLI documentation (and/or by running invest --help).

You can store your input files anywhere on your computer. The key to feeding those inputs to the model is the datastack JSON file.

To understand what an InVEST datastack looks like, I’d recommend looking at the sample data. If you haven’t already downloaded the InVEST sample data, you can get it from the InVEST Downloads page. Follow the link to Individual Sample Datasets for InVEST, then select pollination.zip to download sample data that can be used to run the pollination model. If you open the file pollination_willamette.invs.json, you’ll see a JSON representation of the inputs (args) required by the pollination model. To run the model with your own data, you will need to create a JSON file with this same format, where the value of each arg is the path to an input file.
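To give you a rough idea of the shape of that file, here is a minimal sketch of building one in Python. The arg names are the ones the pollination model reports in its own log output; the exact top-level keys are an assumption on my part, so mirror whatever pollination_willamette.invs.json actually uses:

import json

# Rough sketch of writing a pollination datastack file by hand. The arg names
# come from the pollination model's log output; the top-level keys
# ("model_name", "invest_version", "args") are an assumption -- compare with
# pollination_willamette.invs.json and copy its structure.
datastack = {
    "model_name": "natcap.invest.pollination",
    "invest_version": "3.14.2",
    "args": {
        "farm_vector_path": "farms.shp",
        "guild_table_path": "guild_table.csv",
        "landcover_biophysical_table_path": "landcover_biophysical_table.csv",
        "landcover_raster_path": "landcover.tif",
        "results_suffix": "",
    },
}

with open("my_pollination_datastack.invs.json", "w") as f:
    json.dump(datastack, f, indent=4)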

I’d also recommend running the model with the sample data first, just to make sure things are working as expected. So, for example, if you are inside the pollination sample data directory, you can run invest -vvv run pollination -d pollination_willamette.invs.json -w ../path/to/output-directory (where ../path/to/output-directory is the relative path to a directory where you want the output files to go).


Many thanks @edavis

Would you recommend this tool: Best JSON Formatter and JSON Validator: Online JSON Formatter?

Yes, I am also working with the sample data to test everything, but JSON is new to me, as I was working with the Workbench and individual files before.

Kind regards
Sibylle

Many thanks @edavis

I have now tried the InVEST sample data.
Unfortunately I got an error, possibly because I am working with a newer GDAL version (I am not allowed to install 3.3.0 due to an SSL issue):
(env-invest) C:\Users\f80809100>gdalinfo --version
GDAL 3.9.2, released 2024/08/13

For the JSON input, is it possible to provide a full path to the file? I suppose I should not change the working directory, because then the InVEST Python code would not be found.

Code
(env-invest) C:\Users\f80809100>invest -vvv run pollination -d pollination_willamette.invs.json -w O:/Data-Work/27_Natural_Resources-RE/271_KLIM_Work/CC_Impacts/NCCS/Data/M2b_Pollination/InVEST/InVEST_Sample-Data
09/21/2024 14:13:16 natcap.invest.cli INFO Imported target natcap.invest.pollination from <module ‘natcap.invest.pollination’ from ‘C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\natcap\invest\pollination.py’>
09/21/2024 14:13:16 natcap.invest.utils Level 100 Writing log messages to [O:/Data-Work/27_Natural_Resources-RE/271_KLIM_Work/CC_Impacts/NCCS/Data/M2b_Pollination/InVEST/InVEST_Sample-Data\InVEST-natcap.invest.pollination-log-2024-09-21--14_13_16.txt]
09/21/2024 14:13:16 natcap.invest.cli Level 100 Starting model with parameters:
Arguments for InVEST natcap.invest.pollination 3.14.2:
farm_vector_path farms.shp
guild_table_path guild_table.csv
landcover_biophysical_table_path landcover_biophysical_table.csv
landcover_raster_path landcover.tif
results_suffix
workspace_dir O:/Data-Work/27_Natural_Resources-RE/271_KLIM_Work/CC_Impacts/NCCS/Data/M2b_Pollination/InVEST/InVEST_Sample-Data

09/21/2024 14:13:16 natcap.invest.utils ERROR Exception while executing natcap.invest.pollination
Traceback (most recent call last):
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\natcap\invest\utils.py”, line 165, in prepare_workspace
yield
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\natcap\invest\cli.py”, line 470, in main
model_module.execute(parsed_datastack.args)
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\natcap\invest\pollination.py”, line 525, in execute
scenario_variables = _parse_scenario_variables(args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\natcap\invest\pollination.py”, line 1209, in _parse_scenario_variables
guild_df = validation.get_validated_dataframe(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\natcap\invest\validation.py”, line 623, in get_validated_dataframe
df = utils.read_csv_to_dataframe(csv_path, **read_csv_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\natcap\invest\utils.py”, line 438, in read_csv_to_dataframe
df = pandas.read_csv(
^^^^^^^^^^^^^^^^
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\pandas\io\parsers\readers.py”, line 1026, in read_csv
return _read(filepath_or_buffer, kwds)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\pandas\io\parsers\readers.py”, line 620, in _read
parser = TextFileReader(filepath_or_buffer, **kwds)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\pandas\io\parsers\readers.py”, line 1620, in __init__
self._engine = self._make_engine(f, self.engine)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\pandas\io\parsers\readers.py”, line 1880, in _make_engine
self.handles = get_handle(
^^^^^^^^^^^
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\pandas\io\common.py”, line 873, in get_handle
handle = open(
^^^^^
FileNotFoundError: [Errno 2] No such file or directory: ‘guild_table.csv’
09/21/2024 14:13:16 natcap.invest.utils INFO Elapsed time: 0.16s
09/21/2024 14:13:16 natcap.invest.utils INFO Execution finished; version: 3.14.2
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Scripts\invest.exe\__main__.py", line 7, in <module>
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\natcap\invest\cli.py”, line 470, in main
model_module.execute(parsed_datastack.args)
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\natcap\invest\pollination.py”, line 525, in execute
scenario_variables = _parse_scenario_variables(args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\natcap\invest\pollination.py”, line 1209, in _parse_scenario_variables
guild_df = validation.get_validated_dataframe(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\natcap\invest\validation.py”, line 623, in get_validated_dataframe
df = utils.read_csv_to_dataframe(csv_path, **read_csv_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\natcap\invest\utils.py”, line 438, in read_csv_to_dataframe
df = pandas.read_csv(
^^^^^^^^^^^^^^^^
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\pandas\io\parsers\readers.py”, line 1026, in read_csv
return _read(filepath_or_buffer, kwds)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\pandas\io\parsers\readers.py”, line 620, in _read
parser = TextFileReader(filepath_or_buffer, **kwds)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\pandas\io\parsers\readers.py”, line 1620, in __init__
self._engine = self._make_engine(f, self.engine)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\pandas\io\parsers\readers.py”, line 1880, in _make_engine
self.handles = get_handle(
^^^^^^^^^^^
File “C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\pandas\io\common.py”, line 873, in get_handle
handle = open(
^^^^^
FileNotFoundError: [Errno 2] No such file or directory: ‘guild_table.csv’

GDAL
(env-invest) C:\Users\f80809100>pip install gdal==3.3.0
Collecting gdal==3.3.0
Using cached GDAL-3.3.0.tar.gz (743 kB)
Preparing metadata (setup.py) … done
Building wheels for collected packages: gdal
Building wheel for gdal (setup.py) … error
error: subprocess-exited-with-error

× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> [115 lines of output]
error: Microsoft Visual C++ 14.0 or greater is required. Get it with “Microsoft C++ Build Tools”: Microsoft C++ Build Tools - Visual Studio
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for gdal
Running setup.py clean for gdal
Failed to build gdal
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (gdal)

Thanks a lot @edavis @dave @jdouglass

In the meantime, IT restored my admin rights. Unfortunately I got an error demanding Microsoft Visual C++ 14.0 or greater. This is odd, as this error did not appear a few weeks ago, when I was able to run the InVEST Python package without any error (also with admin rights).

I have now reinstalled Visual Studio, but the error still appears, even after restarting the computer. Do I need to create a link or configure something else? Is there another solution?

Kind regards
Sibylle

The error you are seeing when you try to run the model with the sample data (FileNotFoundError: [Errno 2] No such file or directory: 'guild_table.csv') indicates that the input files are not where they need to be. Make sure all the pollination sample data files are in the same folder as pollination_willamette.invs.json.
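If you would rather keep the inputs somewhere else (for example on your O: drive), one option is to edit the datastack so every input arg is an absolute path. Here is a small sketch of doing that programmatically; it assumes the datastack keeps its inputs under an "args" key and that the args ending in "_path" are filenames relative to your data folder, so please double-check it against your actual JSON:

import json
import os

# Sketch: rewrite the bare filenames in the sample datastack into absolute
# paths, so the inputs no longer have to sit next to the JSON file.
# Assumes an "args" top-level key and that args ending in "_path" are
# filenames relative to DATA_DIR.
DATA_DIR = r"O:\Data-Work\27_Natural_Resources-RE\271_KLIM_Work\CC_Impacts\NCCS\Data\M2b_Pollination\InVEST\InVEST_Sample-Data"

with open("pollination_willamette.invs.json") as f:
    datastack = json.load(f)

for key, value in datastack["args"].items():
    if key.endswith("_path") and isinstance(value, str) and value:
        datastack["args"][key] = os.path.join(DATA_DIR, value)

with open("pollination_absolute_paths.invs.json", "w") as f:
    json.dump(datastack, f, indent=4)

You would then pass the new JSON to invest run with -d as before.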

As for the Microsoft Visual C++ error, try following the instructions in this Stack Overflow answer (starting with “You can also go to your windows settings”) to see if you can find the version of Microsoft Visual C++ installed on your computer. If you can’t find it, you may need to install the Microsoft Visual C++ Tools separately.

Are you able to install the InVEST Workbench on your machine? If you don’t need to use the command line interface, you might find the Workbench a simpler way to run the models!


@edavis, @sibylles is running the InVEST pollination model within a batch-processing job, so while the workbench would work, I do think this will be easier to do within a scripting environment!
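For reference, the scripting route can be as simple as building the args dictionary in Python and calling the model's execute() function, which is the same entry point the CLI invokes (you can see model_module.execute in your traceback). A minimal sketch, with placeholder paths you would swap for your own data:

# Minimal sketch: running the pollination model from a Python script instead
# of the CLI. execute(args) is the entry point the CLI calls; the paths below
# are placeholders for your own inputs and workspace.
from natcap.invest import pollination

args = {
    "workspace_dir": r"O:\path\to\output_workspace",
    "results_suffix": "",
    "farm_vector_path": r"O:\path\to\farms.shp",
    "guild_table_path": r"O:\path\to\guild_table.csv",
    "landcover_biophysical_table_path": r"O:\path\to\landcover_biophysical_table.csv",
    "landcover_raster_path": r"O:\path\to\landcover.tif",
}

if __name__ == "__main__":
    pollination.execute(args)

This also makes it straightforward to loop over many inputs for batch processing.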

Just to add on to @edavis's good comments here, the Visual C++ error is only being presented because of the pip install gdal==3.3.0 command. Given that you had already installed GDAL into your conda environment, I don't think you should need to use pip to install GDAL at all. In fact, the model failing where it did indicates to me that GDAL is already available to your program. So I think we can ignore the Visual C++ issue as long as you are only using the GDAL that was installed with conda.
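If you want to confirm which GDAL your conda environment's Python is actually using (rather than what pip tried to build), a quick check from inside the activated env-invest environment:

# Quick sanity check of the GDAL bindings available to the active environment.
from osgeo import gdal

print(gdal.__version__)     # should match the conda-installed GDAL, e.g. 3.9.2
print(gdal.VersionInfo())   # numeric build string from the underlying library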


Many thanks @edavis @jdouglass

Yes, I prefer using the InVEST Python version, because we have now adapted the pollination model to take pixel-level species data. I am a new Python user, so many thanks for the help; I am extremely curious to see the output of our extended model. :-)

Great news: I am able to run the model using GDAL 3.9.2, without installing 3.3.0. I have also contacted our internal IT helpdesk to resolve the C++ issue. As you can see from my screenshot, C++ is installed, but somehow the Anaconda prompt does not detect it. There were some warnings during the run, but in my opinion they are not relevant: see the code below.

pollination_willamette.invs.json
I thought this file included all the input data. Why do I need to additionally provide all the other files, like guild_table.csv, landcover.tif and farms.shp?

Path
Is it possible to provide a path to my input file pollination_willamette.invs.json?
At the moment my working directory is "(env-invest) C:\Users\f80809100>", so I was forced to copy the input files there from the original path: O:\Data-Work\27_Natural_Resources-RE\271_KLIM_Work\CC_Impacts\NCCS\Data\M2b_Pollination\InVEST\InVEST_Sample-Data. I suppose that I should not have to change to the directory C:\Users to run InVEST successfully.

Code:
(env-invest) C:\Users\f80809100>invest -vvv run pollination -d pollination_willamette.invs.json -w O:/Data-Work/27_Natural_Resources-RE/271_KLIM_Work/CC_Impacts/NCCS/Data/M2b_Pollination/InVEST/InVEST_Sample-Data
09/24/2024 08:31:41 natcap.invest.cli INFO Imported target natcap.invest.pollination from <module ‘natcap.invest.pollination’ from ‘C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\natcap\invest\pollination.py’>
09/24/2024 08:31:41 natcap.invest.utils Level 100 Writing log messages to [O:/Data-Work/27_Natural_Resources-RE/271_KLIM_Work/CC_Impacts/NCCS/Data/M2b_Pollination/InVEST/InVEST_Sample-Data\InVEST-natcap.invest.pollination-log-2024-09-24--08_31_41.txt]
09/24/2024 08:31:41 natcap.invest.cli Level 100 Starting model with parameters:
Arguments for InVEST natcap.invest.pollination 3.14.2:
farm_vector_path C:\Users\f80809100\farms.shp
guild_table_path C:\Users\f80809100\guild_table.csv
landcover_biophysical_table_path C:\Users\f80809100\landcover_biophysical_table.csv
landcover_raster_path C:\Users\f80809100\landcover.tif
results_suffix
workspace_dir O:/Data-Work/27_Natural_Resources-RE/271_KLIM_Work/CC_Impacts/NCCS/Data/M2b_Pollination/InVEST/InVEST_Sample-Data

09/24/2024 08:31:41 natcap.invest.pollination INFO Checking to make sure guild table has all expected headers
09/24/2024 08:31:41 natcap.invest.pollination INFO Checking that farm polygon has expected headers
09/24/2024 08:31:41 py.warnings WARNING C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\osgeo\gdal.py:312: FutureWarning: Neither gdal.UseExceptions() nor gdal.DontUseExceptions() has been explicitly called. In GDAL 4.0, exceptions will be enabled by default.
warnings.warn(

09/24/2024 08:31:43 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:31:43 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:31:43 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-1 (stats_worker), started daemon 11124)>
09/24/2024 08:31:44 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:31:44 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:31:44 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:31:44 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:31:44 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-2 (stats_worker), started daemon 9816)>
09/24/2024 08:31:45 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:31:45 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:31:45 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:31:45 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:31:45 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-3 (stats_worker), started daemon 7316)>
09/24/2024 08:31:46 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:31:46 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:31:46 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:31:46 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:31:46 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-4 (stats_worker), started daemon 10196)>
09/24/2024 08:31:46 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:31:46 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:31:47 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:31:47 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:31:47 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-5 (stats_worker), started daemon 11960)>
09/24/2024 08:31:47 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:31:47 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:31:48 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:31:48 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:31:48 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-6 (stats_worker), started daemon 1572)>
09/24/2024 08:31:48 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:31:48 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:31:48 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:31:48 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:31:48 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-7 (stats_worker), started daemon 9324)>
09/24/2024 08:31:49 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:31:49 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:31:49 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:31:49 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:31:49 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-8 (stats_worker), started daemon 6704)>
09/24/2024 08:31:49 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:31:49 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:31:50 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:31:50 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:31:50 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-9 (stats_worker), started daemon 4168)>
09/24/2024 08:31:50 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:31:50 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:31:51 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:31:51 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:31:51 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-10 (stats_worker), started daemon 6716)>
09/24/2024 08:31:51 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:31:51 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:31:52 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:31:52 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:31:52 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-11 (stats_worker), started daemon 10628)>
09/24/2024 08:31:52 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:31:52 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:31:53 pygeoprocessing.geoprocessing INFO starting convolve
09/24/2024 08:31:53 pygeoprocessing.geoprocessing DEBUG start fill work queue thread
09/24/2024 08:31:53 pygeoprocessing.geoprocessing DEBUG fill work queue
09/24/2024 08:31:53 pygeoprocessing.geoprocessing DEBUG start worker thread
09/24/2024 08:31:53 pygeoprocessing.geoprocessing INFO 48 sent to workers, wait for worker results
09/24/2024 08:31:54 pygeoprocessing.geoprocessing DEBUG work queue full
09/24/2024 08:31:54 pygeoprocessing.geoprocessing INFO convolution worker 100.0% complete on floral_resources_apis.tif
09/24/2024 08:31:54 pygeoprocessing.geoprocessing INFO need to normalize result so nodata values are not included
09/24/2024 08:31:55 pygeoprocessing.geoprocessing INFO convolution nodata normalize 100.0% complete on floral_resources_apis.tif
09/24/2024 08:31:56 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:31:56 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:31:56 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-14 (stats_worker), started daemon 1116)>
09/24/2024 08:31:57 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:31:57 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:31:58 pygeoprocessing.geoprocessing INFO starting convolve
09/24/2024 08:31:58 pygeoprocessing.geoprocessing DEBUG start fill work queue thread
09/24/2024 08:31:58 pygeoprocessing.geoprocessing DEBUG fill work queue
09/24/2024 08:31:58 pygeoprocessing.geoprocessing DEBUG start worker thread
09/24/2024 08:31:58 pygeoprocessing.geoprocessing INFO 48 sent to workers, wait for worker results
09/24/2024 08:31:59 pygeoprocessing.geoprocessing DEBUG work queue full
09/24/2024 08:31:59 pygeoprocessing.geoprocessing INFO convolution worker 100.0% complete on convolve_ps_apis.tif
09/24/2024 08:32:00 pygeoprocessing.geoprocessing INFO need to normalize result so nodata values are not included
09/24/2024 08:32:00 pygeoprocessing.geoprocessing INFO convolution nodata normalize 100.0% complete on convolve_ps_apis.tif
09/24/2024 08:32:01 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:01 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:01 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-17 (stats_worker), started daemon 8048)>
09/24/2024 08:32:02 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:02 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:03 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:03 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:03 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-18 (stats_worker), started daemon 11156)>
09/24/2024 08:32:04 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:04 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:05 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:05 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:05 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-19 (stats_worker), started daemon 12244)>
09/24/2024 08:32:05 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:05 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:06 pygeoprocessing.geoprocessing INFO starting convolve
09/24/2024 08:32:06 pygeoprocessing.geoprocessing DEBUG start fill work queue thread
09/24/2024 08:32:06 pygeoprocessing.geoprocessing DEBUG fill work queue
09/24/2024 08:32:06 pygeoprocessing.geoprocessing DEBUG start worker thread
09/24/2024 08:32:06 pygeoprocessing.geoprocessing INFO 48 sent to workers, wait for worker results
09/24/2024 08:32:07 pygeoprocessing.geoprocessing DEBUG work queue full
09/24/2024 08:32:07 pygeoprocessing.geoprocessing INFO convolution worker 100.0% complete on floral_resources_apis2.tif
09/24/2024 08:32:07 pygeoprocessing.geoprocessing INFO need to normalize result so nodata values are not included
09/24/2024 08:32:08 pygeoprocessing.geoprocessing INFO convolution nodata normalize 100.0% complete on floral_resources_apis2.tif
09/24/2024 08:32:10 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:10 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:10 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-22 (stats_worker), started daemon 10180)>
09/24/2024 08:32:10 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:10 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:12 pygeoprocessing.geoprocessing INFO starting convolve
09/24/2024 08:32:12 pygeoprocessing.geoprocessing DEBUG start fill work queue thread
09/24/2024 08:32:12 pygeoprocessing.geoprocessing DEBUG fill work queue
09/24/2024 08:32:12 pygeoprocessing.geoprocessing DEBUG start worker thread
09/24/2024 08:32:12 pygeoprocessing.geoprocessing INFO 48 sent to workers, wait for worker results
09/24/2024 08:32:13 pygeoprocessing.geoprocessing DEBUG work queue full
09/24/2024 08:32:13 pygeoprocessing.geoprocessing INFO convolution worker 100.0% complete on convolve_ps_apis2.tif
09/24/2024 08:32:13 pygeoprocessing.geoprocessing INFO need to normalize result so nodata values are not included
09/24/2024 08:32:14 pygeoprocessing.geoprocessing INFO convolution nodata normalize 100.0% complete on convolve_ps_apis2.tif
09/24/2024 08:32:15 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:15 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:15 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-25 (stats_worker), started daemon 10936)>
09/24/2024 08:32:16 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:16 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:17 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:17 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:17 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-26 (stats_worker), started daemon 6736)>
09/24/2024 08:32:17 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:17 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:18 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:18 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:18 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-27 (stats_worker), started daemon 2708)>
09/24/2024 08:32:18 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:18 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:19 pygeoprocessing.geoprocessing INFO starting convolve
09/24/2024 08:32:19 pygeoprocessing.geoprocessing DEBUG start fill work queue thread
09/24/2024 08:32:19 pygeoprocessing.geoprocessing DEBUG fill work queue
09/24/2024 08:32:19 pygeoprocessing.geoprocessing DEBUG start worker thread
09/24/2024 08:32:19 pygeoprocessing.geoprocessing INFO 192 sent to workers, wait for worker results
09/24/2024 08:32:24 pygeoprocessing.geoprocessing DEBUG work queue full
09/24/2024 08:32:24 pygeoprocessing.geoprocessing INFO convolution worker 100.0% complete on floral_resources_bombus.tif
09/24/2024 08:32:24 pygeoprocessing.geoprocessing INFO need to normalize result so nodata values are not included
09/24/2024 08:32:25 pygeoprocessing.geoprocessing INFO convolution nodata normalizer approximately 2.6% complete on floral_resources_bombus.tif
09/24/2024 08:32:25 pygeoprocessing.geoprocessing INFO convolution nodata normalize 100.0% complete on floral_resources_bombus.tif
09/24/2024 08:32:26 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:26 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:26 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-30 (stats_worker), started daemon 4388)>
09/24/2024 08:32:27 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:27 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:28 pygeoprocessing.geoprocessing INFO starting convolve
09/24/2024 08:32:28 pygeoprocessing.geoprocessing DEBUG start fill work queue thread
09/24/2024 08:32:28 pygeoprocessing.geoprocessing DEBUG fill work queue
09/24/2024 08:32:28 pygeoprocessing.geoprocessing DEBUG start worker thread
09/24/2024 08:32:28 pygeoprocessing.geoprocessing INFO 192 sent to workers, wait for worker results
09/24/2024 08:32:33 pygeoprocessing.geoprocessing DEBUG work queue full
09/24/2024 08:32:33 pygeoprocessing.geoprocessing INFO convolution worker approximately 97.9% complete on convolve_ps_bombus.tif
09/24/2024 08:32:33 pygeoprocessing.geoprocessing INFO convolution worker 100.0% complete on convolve_ps_bombus.tif
09/24/2024 08:32:33 pygeoprocessing.geoprocessing INFO need to normalize result so nodata values are not included
09/24/2024 08:32:34 pygeoprocessing.geoprocessing INFO convolution nodata normalize 100.0% complete on convolve_ps_bombus.tif
09/24/2024 08:32:35 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:35 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:35 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-33 (stats_worker), started daemon 8528)>
09/24/2024 08:32:36 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:36 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:36 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:36 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:36 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-34 (stats_worker), started daemon 6072)>
09/24/2024 08:32:37 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:37 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:38 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:38 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:38 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-35 (stats_worker), started daemon 10464)>
09/24/2024 08:32:38 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:38 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:39 pygeoprocessing.geoprocessing INFO starting convolve
09/24/2024 08:32:39 pygeoprocessing.geoprocessing DEBUG start fill work queue thread
09/24/2024 08:32:39 pygeoprocessing.geoprocessing DEBUG fill work queue
09/24/2024 08:32:39 pygeoprocessing.geoprocessing DEBUG start worker thread
09/24/2024 08:32:39 pygeoprocessing.geoprocessing INFO 48 sent to workers, wait for worker results
09/24/2024 08:32:40 pygeoprocessing.geoprocessing DEBUG work queue full
09/24/2024 08:32:41 pygeoprocessing.geoprocessing INFO convolution worker 100.0% complete on floral_resources_bombus2.tif
09/24/2024 08:32:41 pygeoprocessing.geoprocessing INFO need to normalize result so nodata values are not included
09/24/2024 08:32:41 pygeoprocessing.geoprocessing INFO convolution nodata normalize 100.0% complete on floral_resources_bombus2.tif
09/24/2024 08:32:43 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:43 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:43 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-38 (stats_worker), started daemon 10828)>
09/24/2024 08:32:43 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:43 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:44 pygeoprocessing.geoprocessing INFO starting convolve
09/24/2024 08:32:44 pygeoprocessing.geoprocessing DEBUG start fill work queue thread
09/24/2024 08:32:45 pygeoprocessing.geoprocessing DEBUG fill work queue
09/24/2024 08:32:45 pygeoprocessing.geoprocessing DEBUG start worker thread
09/24/2024 08:32:45 pygeoprocessing.geoprocessing INFO 48 sent to workers, wait for worker results
09/24/2024 08:32:46 pygeoprocessing.geoprocessing DEBUG work queue full
09/24/2024 08:32:46 pygeoprocessing.geoprocessing INFO convolution worker 100.0% complete on convolve_ps_bombus2.tif
09/24/2024 08:32:46 pygeoprocessing.geoprocessing INFO need to normalize result so nodata values are not included
09/24/2024 08:32:47 pygeoprocessing.geoprocessing INFO convolution nodata normalize 100.0% complete on convolve_ps_bombus2.tif
09/24/2024 08:32:48 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:48 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:48 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-41 (stats_worker), started daemon 7416)>
09/24/2024 08:32:49 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:49 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:49 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:49 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:49 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-42 (stats_worker), started daemon 10412)>
09/24/2024 08:32:50 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:50 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:51 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:51 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:51 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-43 (stats_worker), started daemon 7512)>
09/24/2024 08:32:52 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:52 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:53 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:53 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:53 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-44 (stats_worker), started daemon 11192)>
09/24/2024 08:32:54 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:54 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:55 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:55 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:55 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-45 (stats_worker), started daemon 1292)>
09/24/2024 08:32:57 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:57 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:58 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:58 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:58 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-46 (stats_worker), started daemon 1696)>
09/24/2024 08:32:58 pygeoprocessing.geoprocessing INFO 100.0% complete
09/24/2024 08:32:58 pygeoprocessing.geoprocessing INFO Waiting for raster stats worker result.
09/24/2024 08:32:59 pygeoprocessing.geoprocessing INFO starting stats_worker
09/24/2024 08:32:59 pygeoprocessing.geoprocessing_core DEBUG stats worker PID: 9740
09/24/2024 08:32:59 pygeoprocessing.geoprocessing INFO started stats_worker <Thread(Thread-47 (stats_worker), started daemon 3396)>
09/24/2024 08:32:59 py.warnings WARNING C:\Users\f80809100\AppData\Local\miniconda3\envs\env-invest\Lib\site-packages\natcap\invest\pollination.py:1396: RuntimeWarning: overflow encountered in add
result[local_valid_mask] += array[local_valid_mask]

09/24/2024 08:33:03 natcap.invest.utils INFO Elapsed time: 1m 22.260000000000005s
09/24/2024 08:33:03 natcap.invest.utils INFO Execution finished; version: 3.14.2
