`numpy.mean` Used With A Tuple As `axis` Argument: Not Working With A Masked Array
Solution 1:
For a `MaskedArray` argument, `numpy.mean` calls `MaskedArray.mean`, which doesn't support a tuple `axis` argument. You can get the correct behavior by reimplementing `MaskedArray.mean` in terms of operations that do support tuples for `axis`:
import numpy

def mean(a, axis=None):
    if a.mask is numpy.ma.nomask:
        return super(numpy.ma.MaskedArray, a).mean(axis=axis)
    counts = numpy.logical_not(a.mask).sum(axis=axis)
    if counts.shape:
        sums = a.filled(0).sum(axis=axis)
        mask = (counts == 0)
        return numpy.ma.MaskedArray(data=sums * 1. / counts, mask=mask, copy=False)
    elif counts:
        # Return scalar, not array
        return a.filled(0).sum(axis=axis) * 1. / counts
    else:
        # Masked scalar
        return numpy.ma.masked
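As a quick usage sketch (the array and mask here are invented for illustration), reducing a 3-D masked array over two axes at once:

a = numpy.ma.MaskedArray(numpy.arange(24, dtype=float).reshape(2, 3, 4),
                         mask=numpy.arange(24).reshape(2, 3, 4) % 5 == 0)
print(mean(a, axis=(0, 2)))  # one masked mean per position along axis 1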
or, if you're willing to rely on `MaskedArray.sum` working with a tuple `axis` (which you likely are, given that you're using undocumented behavior of `numpy.mean`),
def mean(a, axis=None):
    if a.mask is numpy.ma.nomask:
        return super(numpy.ma.MaskedArray, a).mean(axis=axis)
    sums = a.sum(axis=axis)
    counts = numpy.logical_not(a.mask).sum(axis=axis)
    result = sums * 1. / counts
    return result
where we're relying on `MaskedArray.sum` to handle the mask.
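A quick sanity check one might run (made-up data; assumes the `mean` above is in scope) compares a tuple-`axis` reduction against per-column means computed from the unmasked values only:

a = numpy.ma.MaskedArray(numpy.arange(12, dtype=float).reshape(3, 4),
                         mask=numpy.arange(12).reshape(3, 4) % 3 == 0)
expected = numpy.array([a[:, j].compressed().mean() for j in range(4)])
print(mean(a, axis=(0,)))  # a one-element tuple still exercises the tuple path
print(expected)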
I have only lightly tested these functions; before using them, make sure they actually work, and write some tests. For example, if the output is 0-dimensional and there are no masked values, whether the output is a 0-d MaskedArray or a scalar depends on whether the input mask is `nomask` or an array of all False. This is the same as the default `MaskedArray.mean` behavior, but it may not be what you want; I suspect the default behavior is a bug.
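A test for that edge case might simply inspect the return types (a hedged sketch; the inputs are invented, and `mean` is whichever reimplementation above you chose):

data = numpy.ones((2, 2))
a_nomask = numpy.ma.MaskedArray(data)                             # mask is nomask
a_allfalse = numpy.ma.MaskedArray(data, mask=numpy.zeros((2, 2), bool))
print(type(mean(a_nomask, axis=(0, 1))))    # the two types...
print(type(mean(a_allfalse, axis=(0, 1))))  # ...may differ, as noted above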