Limitations and Pitfalls¶
No support for in-place array operations¶
In-place array operations lead to ambiguity in gradient definition, so the team decided to exclude support for all in-place array operations. For example, the following mutable array operation is not allowed in MinPy:
import minpy.numpy as np
a = np.zeros((2, 3))
a.transpose()
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-1-c6777f33a28a> in <module>()
      1 import minpy.numpy as np
      2 a = np.zeros((2,3))
----> 3 a.transpose()

AttributeError: 'Array' object has no attribute 'transpose'
But you can use an immutable operation instead:
a = np.transpose(a)  # instead of a.transpose(), which is feasible in NumPy.
                     # In MinPy, it will raise an error, since we can't
                     # calculate its gradient.
Another common example that is not supported is:
a[0, 1] = 12
The system will still allow you to perform such an operation, but keep in mind that autograd will fail in this case.
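The same element update can be expressed without mutation by building a new array. The sketch below uses plain NumPy (MinPy may not be installed in your environment), and the helper name `set_element` is an illustration, not a MinPy API:

```python
import numpy as np

def set_element(a, idx, value):
    """Return a copy of `a` with a[idx] replaced by `value`,
    leaving `a` itself untouched (a pure, differentiable-style update)."""
    mask = np.zeros(a.shape, dtype=bool)
    mask[idx] = True              # mutation happens only on a fresh mask
    return np.where(mask, value, a)

a = np.zeros((2, 3))
b = set_element(a, (0, 1), 12)
print(b[0, 1])   # 12.0
print(a[0, 1])   # 0.0 -- the original array is unchanged
```

Because every output is a function of the inputs rather than a mutation of them, an autograd system can trace the computation without ambiguity.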
Use MinPy consistently¶
If you pass NumPy arrays into MinPy operations, in some cases the computation will still happen in NumPy's namespace instead of MinPy's. For example:
import minpy.numpy as np

def simple_add(a, b):
    return a + b

# Now we declare two NumPy arrays:
import numpy as npp
a = npp.ones((10, 10))
b = npp.zeros((10, 10))
If we pass a and b into the function simple_add, the add operation will happen in NumPy's namespace. This is not the expected behavior, so we recommend using MinPy arrays consistently.
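The underlying reason is that Python dispatches `+` on the operand types. With two plain ndarrays, the result is a plain ndarray and never enters MinPy's tracking; the check below is written in plain NumPy only so it can run without MinPy installed:

```python
import numpy as np

def simple_add(a, b):
    return a + b

a = np.ones((10, 10))
b = np.zeros((10, 10))
out = simple_add(a, b)

# Both inputs are numpy.ndarray, so `+` resolves to NumPy's own add;
# the result type confirms the computation stayed in NumPy's namespace.
print(type(out))             # <class 'numpy.ndarray'>
print(type(out).__module__)  # numpy
```

A wrapper-array library can only record operations on its own array type, which is why mixing plain NumPy inputs silently bypasses gradient tracking.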
On the other hand, if you want to recover a NumPy array from a MinPy array for other packages like matplotlib, you can use asnumpy, which will return the corresponding NumPy array:
a = np.zeros((2, 3))
print(type(a))            # this is a MinPy array
print(type(a.asnumpy()))  # convert to NumPy array
<class 'minpy.array.Array'>
<type 'numpy.ndarray'>
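The conversion pattern can be sketched with a toy wrapper class (a stand-in written here because MinPy may not be installed; the class below is not MinPy's actual `Array` implementation, only the `asnumpy()` boundary it exposes):

```python
import numpy as np

class Array:
    """Toy stand-in for a wrapper array type with an asnumpy() escape hatch."""
    def __init__(self, data):
        self._data = np.asarray(data)

    def asnumpy(self):
        # Hand back the underlying ndarray for consumers
        # (e.g. matplotlib) that expect plain NumPy input.
        return self._data

a = Array(np.zeros((2, 3)))
raw = a.asnumpy()
print(type(raw))   # <class 'numpy.ndarray'>
```

Converting only at the boundary to external libraries keeps the rest of the computation inside the tracked array type, so gradients are not lost mid-pipeline.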
Not all submodules are supported¶
Since the NumPy package is distributed as multiple submodules, not all submodules are currently supported. If you find an unsupported submodule (such as numpy.random), please raise an issue on GitHub and the dev team will add support as soon as possible.
No support for multiple executions of the same MXNet symbol before BP¶
Unlike MinPy’s primitives, an MXNet symbol keeps internal state to record gradient information. Applying the same symbol to different data will therefore break BP at a later stage. Instead, you can create a duplicate symbol to fulfil the same goal, and designate the same parameter name on both symbols for parameter sharing.