Block reduction is useful for decimating data. But it always returns the data as arrays. It's sometimes useful to generate a grid of the reduction results, for example to plot point density or standard deviation on a map.
For this, we could have a new function:
```python
block_statistics(function, coordinates, data, block_size=None, block_shape=None, ...)
```
It would split the data into blocks of the given size, calculate the function on each block, and populate an `xarray.DataArray` with the values. Blocks that don't have data will be filled with `np.nan`. The function can be anything that takes an array and returns a single value. It could also be the string `"count"`, in which case we count the data points in each block, or `"density"`, in which case we count and then divide by the block area.
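To make the idea concrete, here is a minimal sketch of how such a function might work. This is not the proposed implementation: the names, the square-block binning with `np.digitize`, and the plain-array return value (instead of the `xarray.DataArray` the real function would build) are all assumptions made for illustration.

```python
import math
import numpy as np


def block_statistics(function, coordinates, data, block_size):
    """Hypothetical sketch: apply *function* to *data* in square blocks.

    Returns the grid of values plus the block-center coordinates. The
    real function would wrap these in an xarray.DataArray instead.
    """
    east, north = (np.asarray(c, dtype=float) for c in coordinates)
    data = np.asarray(data, dtype=float)
    # "count" is a special case: just count the points in each block.
    # ("density" would additionally divide the count by block_size**2.)
    func = len if function == "count" else function

    def edges(coord):
        # Enough blocks of block_size to cover the coordinate span.
        n_blocks = max(1, math.ceil((coord.max() - coord.min()) / block_size))
        return coord.min() + block_size * np.arange(n_blocks + 1)

    east_edges, north_edges = edges(east), edges(north)
    shape = (north_edges.size - 1, east_edges.size - 1)
    # Assign each point to a block (points on the outer edge fall into
    # the last block because of the clip).
    i = np.clip(np.digitize(north, north_edges) - 1, 0, shape[0] - 1)
    j = np.clip(np.digitize(east, east_edges) - 1, 0, shape[1] - 1)
    values = np.full(shape, np.nan)  # blocks without data stay NaN
    for bi in range(shape[0]):
        for bj in range(shape[1]):
            in_block = (i == bi) & (j == bj)
            if in_block.any():
                values[bi, bj] = func(data[in_block])
    # Block centers become the coordinates of the output grid.
    east_centers = east_edges[:-1] + block_size / 2
    north_centers = north_edges[:-1] + block_size / 2
    return values, east_centers, north_centers
```

Passing `np.std` here would produce the standard-deviation grid mentioned above, and `"count"` the point-density map (after dividing by block area).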