We obtained some results, and for the USLE output, most of the model output is consistent with values found in the literature: 98.3 % of the pixels have USLE values less than or equal to 100 tons/pixel (pixel size is 90 x 90 m).
However, some pixels present very high values, and I am looking for ways to analyze these outliers.
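In case it helps anyone doing the same kind of check, here is a minimal sketch of how the distribution can be summarized with numpy/rasterio. The filename usle.tif is just a placeholder for your own USLE output raster, and the nodata handling may need adjusting to your run:

```python
import numpy as np
import rasterio

# Placeholder path; point this at your own USLE output raster.
with rasterio.open("usle.tif") as src:
    usle = src.read(1).astype(float)
    if src.nodata is not None:
        usle[usle == src.nodata] = np.nan  # treat nodata as NaN for the stats

valid = usle[~np.isnan(usle)]
print(f"share of pixels <= 100 t/pixel: {np.mean(valid <= 100):.1%}")
for p in (95, 99, 99.9, 100):
    print(f"{p}th percentile: {np.percentile(valid, p):.1f} t/pixel")
```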
The software team or @RafaSchmitt might have more information on this, but I do know that the USLE equation was developed for low-slope conditions and does not tend to perform as well in places with steep slopes. Are your pixels with very high values in mountainous/steep places? I have definitely seen surprisingly high values in this situation.
Usually it's just a few pixels, and if the values don't skew the results significantly, we don't worry about it. Or, if we're doing an analysis with relative values (high/low) rather than absolute values, it's OK to keep the high areas as they are. A few times the values in these pixels have been so high that they skewed the results, or at least the maps produced; in those cases I've removed the outliers or, for visualization, capped the maximum value at something lower.
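For the visualization case, a percentile cap is easy to script. This is just a sketch under the same assumptions as above (rasterio/numpy, placeholder filenames); pick whatever cap makes sense for your data:

```python
import numpy as np
import rasterio

with rasterio.open("usle.tif") as src:  # placeholder path
    usle = src.read(1).astype("float32")
    profile = src.profile
    if src.nodata is not None:
        usle[usle == src.nodata] = np.nan

cap = np.nanpercentile(usle, 99)  # e.g. cap at the 99th percentile
# np.minimum propagates NaN, so nodata pixels stay nodata.
capped = np.minimum(usle, cap).astype("float32")

profile.update(dtype="float32", nodata=np.nan)
with rasterio.open("usle_capped.tif", "w", **profile) as dst:
    dst.write(capped, 1)
```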
I’ll let others speak more to the underlying model processes causing these high values.
I created a slope map to analyze this situation and confirmed that my pixels with very high values are indeed located in steep areas. So I am thinking of removing those few pixels from the analysis or assigning them the average value.
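If it's useful, the "assign the average" option can be sketched like this (again rasterio/numpy with placeholder filenames; the 100 t/pixel cutoff is just the literature-based threshold mentioned above):

```python
import numpy as np
import rasterio

with rasterio.open("usle.tif") as src:  # placeholder path
    usle = src.read(1).astype("float32")
    profile = src.profile
    if src.nodata is not None:
        usle[usle == src.nodata] = np.nan

cutoff = 100.0                              # literature-based threshold
mean_ok = np.nanmean(usle[usle <= cutoff])  # mean of the non-outlier pixels
# NaN > cutoff is False, so nodata pixels are left untouched.
cleaned = np.where(usle > cutoff, mean_ok, usle).astype("float32")

profile.update(dtype="float32", nodata=np.nan)
with rasterio.open("usle_cleaned.tif", "w", **profile) as dst:
    dst.write(cleaned, 1)
```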
Also, I think it is useful to point out that I was using the default value for the Threshold Flow Accumulation. I did some trials, and the maximum USLE value was reduced significantly when I used a lower value (100).
I’ve eliminated any pixels with slopes over 40 degrees (we have steep, steep cliffs in Hawaii) as one step to deal with this.
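In case a script version helps: here is a minimal sketch of that slope screen, assuming a slope raster in degrees (slope_deg.tif, a placeholder name) that is aligned to the USLE raster:

```python
import numpy as np
import rasterio

# Placeholder paths; the two rasters are assumed to share grid and extent.
with rasterio.open("usle.tif") as usle_src, \
        rasterio.open("slope_deg.tif") as slope_src:
    usle = usle_src.read(1).astype("float32")
    slope = slope_src.read(1)
    profile = usle_src.profile
    old_nodata = usle_src.nodata

new_nodata = -9999.0
if old_nodata is not None:
    usle[usle == old_nodata] = new_nodata  # carry the original nodata through
usle[slope > 40] = new_nodata              # drop pixels steeper than 40 degrees

profile.update(dtype="float32", nodata=new_nodata)
with rasterio.open("usle_slope_masked.tif", "w", **profile) as dst:
    dst.write(usle, 1)
```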
It might help to get a geology and soils map to see whether any areas that are contributing high sediment loads are actually unlikely sources of fine sediment.