Handler#

class ivy.utils.backend.handler.ContextManager(module)[source]#

Bases: object

__init__(module)[source]#
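
ContextManager carries no rendered docstring here; it is a small helper for scoping a backend to a block of code. A minimal sketch, assuming (as the name and signature suggest) that it sets the given backend module on entry and restores the previous backend on exit:

>>> import ivy
>>> import ivy.functional.backends.numpy as ivy_np
>>> with ivy.utils.backend.handler.ContextManager(ivy_np):
...     x = ivy.native_array([1.0])
...     print(type(x))
<class 'numpy.ndarray'>
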
ivy.utils.backend.handler.choose_random_backend(excluded=None)[source]#
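
No docstring is rendered for choose_random_backend; going by the name and signature, it picks a backend at random, skipping any listed in excluded. A hedged sketch, assuming it returns the backend name as a string:

>>> ivy.utils.backend.handler.choose_random_backend(excluded=["numpy", "jax"])
'torch'   # a randomly chosen backend; actual output varies per call
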
ivy.utils.backend.handler.current_backend(*args, **kwargs)[source]#

Return the current backend. Priorities: global_backend > argument’s backend.

Parameters:

*args/**kwargs

the arguments from which to try to infer the backend, when there is no globally set backend.

Returns:

ret – Ivy’s current backend.

Examples

If no global backend is set, then the backend is inferred from the arguments:

>>> import numpy as np
>>> x = np.array([2.0])
>>> print(ivy.current_backend(x))
<module 'ivy.functional.backends.numpy' from '/ivy/ivy/functional/backends/numpy/__init__.py'>

The global backend set in set_backend has priority over any arguments passed to current_backend:

>>> import numpy as np
>>> ivy.set_backend("jax")
>>> x = np.array([2.0])
>>> print(ivy.current_backend(x))
<module 'ivy.functional.backends.jax' from '/ivy/ivy/functional/backends/jax/__init__.py'>
ivy.utils.backend.handler.dynamic_backend_converter(backend_stack)[source]#
ivy.utils.backend.handler.prevent_access_locally(fn)[source]#
ivy.utils.backend.handler.previous_backend()[source]#

Unset the current global backend and adjust the ivy dict so that a previously set global backend (if any) becomes the backend again; otherwise we return to Ivy's own implementations.

Returns:

ret – the backend that was unset, or None if there was no set global backend.

Examples

Torch is the most recently set backend, and so is the backend used in the first example. However, as the second example shows, if previous_backend is called before ivy.native_array, then tensorflow becomes the current backend, and any torch implementations in the Ivy dict are swapped for their tensorflow counterparts:

>>> ivy.set_backend("tensorflow")
>>> ivy.set_backend("torch")
>>> x = ivy.native_array([1])
>>> print(type(x))

<class 'torch.Tensor'>

>>> ivy.set_backend("tensorflow")
>>> ivy.set_backend("torch")
>>> ivy.previous_backend()
>>> x = ivy.native_array([1])
>>> print(type(x))
<class 'tensorflow.python.framework.ops.EagerTensor'>
ivy.utils.backend.handler.set_backend(backend, dynamic=False)[source]#

Set backend to be the global backend.

If dynamic = True, all ivy.Array and ivy.Container objects will also be converted to the new backend (see the sketch after the examples below).

Examples

If we set the global backend to be numpy, then subsequent calls to ivy functions will be called from Ivy’s numpy backend:

>>> ivy.set_backend("numpy")
>>> native = ivy.native_array([1])
>>> print(type(native))
<class 'numpy.ndarray'>

Or with jax as the global backend:

>>> ivy.set_backend("jax")
>>> native = ivy.native_array([1])
>>> print(type(native))
<class 'jaxlib.xla_extension.ArrayImpl'>
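
The dynamic = True conversion can be sketched as follows; the native types shown are illustrative, assuming ivy.Array exposes its native array via the data attribute:

>>> ivy.set_backend("numpy")
>>> x = ivy.array([1.0, 2.0])
>>> print(type(x.data))
<class 'numpy.ndarray'>
>>> ivy.set_backend("torch", dynamic=True)
>>> print(type(x.data))
<class 'torch.Tensor'>
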
ivy.utils.backend.handler.set_backend_to_specific_version(backend)[source]#

Update the backend dict so that the original function name points to the version-specific implementation.

Parameters:

backend – the backend module for which version support is provided

ivy.utils.backend.handler.set_jax_backend()[source]#

Set JAX to be the global backend.

Equivalent to ivy.set_backend("jax").
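
As a quick check that this wrapper behaves like set_backend (the same pattern applies to the other set_*_backend helpers below; ivy.current_backend_str returns the name of the active global backend):

>>> ivy.set_jax_backend()
>>> print(ivy.current_backend_str())
jax
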

ivy.utils.backend.handler.set_mxnet_backend()[source]#

Set MXNet to be the global backend.

Equivalent to ivy.set_backend("mx").

ivy.utils.backend.handler.set_numpy_backend()[source]#

Set NumPy to be the global backend.

Equivalent to ivy.set_backend("numpy").

ivy.utils.backend.handler.set_paddle_backend()[source]#

Set PaddlePaddle to be the global backend.

Equivalent to ivy.set_backend("paddle").

ivy.utils.backend.handler.set_tensorflow_backend()[source]#

Set TensorFlow to be the global backend.

Equivalent to ivy.set_backend("tensorflow").

ivy.utils.backend.handler.set_torch_backend()[source]#

Set PyTorch to be the global backend.

Equivalent to ivy.set_backend("torch").

ivy.utils.backend.handler.unset_backend()[source]#
ivy.utils.backend.handler.with_backend(backend, cached=True)[source]#
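
No docstring is rendered for with_backend; unlike set_backend, it returns a backend-pinned copy of the ivy module rather than mutating the global backend (cached=True presumably reuses a previously constructed copy). A hedged sketch:

>>> ivy_np = ivy.with_backend("numpy")
>>> x = ivy_np.native_array([1.0])
>>> print(type(x))
<class 'numpy.ndarray'>
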

This should have given you an overview of the handler submodule. If you have any questions, please feel free to reach out on our Discord!