NDR: taskgraph locked & NoneType not iterable

I am attempting batch processing with the NDR model using the attached script and input data. The script loops over multiple LULC years and, for each LULC year, over multiple watersheds. I have used the script multiple times on one device and it has worked fine; however, when I run it on two other devices, I receive the following errors each time.

If I run the script on an empty workspace directory, I always get this: “(taskgraph.Task) Task._execute_sqlite(1594) WARNING TaskGraph database is locked because another process is using it, waiting for a bit of time to try again”

If I attempt to run the model again without deleting the taskgraph_cache folder, I get this: “TypeError: ‘NoneType’ object is not iterable”

I have tried moving my workspace directory to different drives (C, D, and a network drive). I have also made sure that no other tasks are running in Task Manager, and I have tried deleting the workspace and restarting my computer between runs. I've tried running the script with and without GDAL exceptions enabled, and have used both 'n_workers': 1 and 'n_workers': -1 in the args dictionary. However, nothing has made a difference.

Interestingly, I can't recall doing anything differently on the computer where it runs versus the ones where it doesn't.

On both devices where the script doesn't run, I have tried running the model on one watershed at a time using the Workbench, and that works fine.

Thank you for the help!

Hi @AS-Conley, welcome to the forum, and thanks so much for providing all these details! The errors you are running into seem to be related to an issue in the latest release of sqlite (3.49.1).

Can you please try the following:

  1. Delete the taskgraph_cache directory from the workspace.
  2. In the conda environment you are using to run InVEST, downgrade sqlite to 3.48.0.
  3. Try running the model again.
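For step 2, the usual command inside the activated environment is conda install "sqlite=3.48.0". One way to confirm the downgrade actually took effect (a minimal sketch; the 3.48.0 target and buggy 3.49.1 version are from this thread) is to check which SQLite library the environment's Python is linked against:

```python
# Check the SQLite library version the current environment's Python uses.
# Run this inside the InVEST conda environment after the downgrade;
# it should report 3.48.0 rather than the problematic 3.49.1.
import sqlite3

print(sqlite3.sqlite_version)
```

If this still prints 3.49.1, the downgrade was likely applied to a different environment than the one running the script.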

That should resolve your issue; if not, please let us know!


Hi @edavis, that solved the issue perfectly! Thank you so much for the quick reply!