ivy.unify()#
⚠️ Warning: The tracer and the transpiler are not publicly available yet, so certain parts of this doc won't work as expected for now!
Ivy's unify function is an alias for ivy.transpile(..., to="ivy", ...). You can learn more about the transpiler on the transpile() page.
Unify API#
- ivy.unify(*objs, source=None, args=None, kwargs=None, **transpile_kwargs)#
Transpiles an object into Ivy code. It's an alias for ivy.transpile(..., to="ivy", ...).
- Parameters:
  - objs (Callable) – Native callable(s) to transpile.
  - source (Optional[str]) – The framework that obj is from. This must be provided unless obj is a framework-specific module.
  - args (Optional[Tuple]) – If specified, arguments that will be used to unify eagerly.
  - kwargs (Optional[dict]) – If specified, keyword arguments that will be used to unify eagerly.
  - transpile_kwargs – Arbitrary keyword arguments that will be passed to ivy.transpile.
- Return type:
Union[Graph, LazyGraph, ModuleType, ivy.Module]
- Returns:
A transpiled Graph or a non-initialized LazyGraph. If the object is a native trainable module, the corresponding module in the target framework will be returned. If the object is a ModuleType, the function will return a copy of the module with every method lazily transpiled.
Usage#
As we mentioned, ivy.unify() is an alias for ivy.transpile(..., to="ivy", ...), so you can use it in the same way as ivy.transpile(). In this case, instead of getting a graph composed of functions from the functional API of the target framework, the function returns a graph fully composed of Ivy functions, allowing you to run the graph in any framework directly.
import ivy
import jax

ivy.set_backend("jax")

def test_fn(x):
    return jax.numpy.sum(x)

x1 = ivy.array([1., 2.])

# transpiled_func and unified_func will produce the same result
transpiled_func = ivy.transpile(test_fn, to="ivy", args=(x1,))
unified_func = ivy.unify(test_fn, args=(x1,))
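Because the unified graph is composed solely of Ivy functions, it can then be run under any backend directly. A minimal sketch continuing the snippet above (the printed value is the same regardless of the active backend):

import torch

# Switch to a different backend and call the same unified graph.
ivy.set_backend("torch")
x2 = torch.tensor([1., 2.])
print(unified_func(x2))  # the Ivy graph now runs with the torch backend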
Examples#
Below, we will define a function in torch and call it with different native arguments. First, we define the torch function and unify it:
import ivy
import torch

def normalize(x):
    mean = torch.mean(x)
    std = torch.std(x)
    return torch.div(torch.sub(x, mean), std)

normalize = ivy.unify(normalize, source="torch")
Now we can call the function with different Ivy backends:
import numpy as np
import jax.numpy as jnp
import tensorflow as tf

# create a random numpy array for testing
x = np.random.uniform(size=10).astype(np.float32)

# numpy
ivy.set_backend("numpy")
print(normalize(x))

# jax
x_ = jnp.array(x)
ivy.set_backend("jax")
print(normalize(x_))

# tensorflow
x_ = tf.constant(x)
ivy.set_backend("tensorflow")
print(normalize(x_))

# torch
x_ = torch.tensor(x)
ivy.set_backend("torch")
print(normalize(x_))
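Since no args were passed to ivy.unify() above, unification happens lazily and the graph is only initialized on the first call. To unify eagerly instead, sample arguments can be provided up front, as described in the Unify API section. A minimal sketch, assuming the same normalize function:

import ivy
import torch

def normalize(x):
    mean = torch.mean(x)
    std = torch.std(x)
    return torch.div(torch.sub(x, mean), std)

# Passing args traces the graph immediately (eager unification),
# so no extra tracing work happens on the first call.
x = torch.rand(10)
normalize = ivy.unify(normalize, source="torch", args=(x,))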