# einsum

Graph.einsum(equation, tensors, *, name=None)

Perform tensor contraction via Einstein summation convention.

Use this node to perform multi-dimensional, linear algebraic array operations between tensors.

### Parameters

• equation (str) – The equation describing the tensor contraction. The format is the same as in NumPy’s einsum.
• tensors (list[np.ndarray or Tensor]) – The tensors to be contracted.
• name (str or None, optional) – The name of the node.

### Returns

The contracted Tensor.

### Return type

Tensor

### See also

Graph.matmul : Matrix multiplication between objects.

Graph.sum : Sum elements in a tensor along one or multiple axes.

Graph.trace : Trace of an object.

Graph.transpose : Reorder the dimensions of a tensor.

## Notes

Tensor contraction with the Einstein summation convention creates a new tensor whose elements are computed element-wise from the elements of other tensors. This applies to any tensor operation whose result you can write as an equation relating the elements of the output to sums over products of elements of the inputs.

The element-wise equation of the operation is summarized by a string describing the Einstein summation to be performed on the inputs. For example, the matrix multiplication between two matrices can be written as

$R_{ik} = \sum_j P_{ij} Q_{jk}.$

To convert this element-wise equation to the appropriate string, you can: remove the summation sign and the tensor names (ik = ij * jk), move the output to the right (ij * jk = ik), and replace “*” with “,” and “=” with “->” (ij,jk->ik). You can also use an ellipsis (…) to broadcast over unchanged dimensions.
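Since the `equation` format is the same as in NumPy's einsum, you can prototype and sanity-check a contraction string with `np.einsum` before building the graph node. A minimal sketch applying the recipe above to matrix multiplication (the array shapes here are arbitrary illustration choices):

```python
import numpy as np

# Matrix multiplication R_ik = sum_j P_ij Q_jk becomes "ij,jk->ik".
P = np.arange(6, dtype=float).reshape(2, 3)
Q = np.arange(12, dtype=float).reshape(3, 4)

# Contract over the shared index j; the result matches a plain matrix product.
R = np.einsum("ij,jk->ik", P, Q)
assert np.allclose(R, P @ Q)

# An ellipsis broadcasts over unchanged leading dimensions: each of the
# 5 stacked matrices in P_batch is multiplied by Q independently.
P_batch = np.arange(30, dtype=float).reshape(5, 2, 3)
R_batch = np.einsum("...ij,...jk->...ik", P_batch, Q)
assert R_batch.shape == (5, 2, 4)
```

A string validated this way can then be passed unchanged as the `equation` argument of `graph.einsum`.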

## Examples

>>> x = np.arange(16, dtype=float)

Diagonal of a matrix.

>>> graph.einsum("ii->i", [x.reshape(4, 4)], name="diagonal")
<Tensor: name="diagonal", operation_name="einsum", shape=(4,)>
>>> result = bo.execute_graph(graph=graph, output_node_names="diagonal")
>>> result["output"]["diagonal"]["value"]
array([0., 5., 10., 15.])

Trace of a matrix.

>>> graph.einsum("ii->", [x.reshape(4, 4)], name="trace")
<Tensor: name="trace", operation_name="einsum", shape=()>
>>> result = bo.execute_graph(graph=graph, output_node_names="trace")
>>> result["output"]["trace"]["value"]
30.0

Sum over matrix axis.

>>> graph.einsum("ij->i", [x.reshape((4, 4))], name="sum_1")
<Tensor: name="sum_1", operation_name="einsum", shape=(4,)>
>>> result = bo.execute_graph(graph=graph, output_node_names="sum_1")
>>> result["output"]["sum_1"]["value"]
array([ 6., 22., 38., 54.])

Sum over tensor axis ignoring leading dimensions.

>>> graph.einsum("...ji->...i", [x.reshape((2, 2, 4))], name="sum_2")
<Tensor: name="sum_2", operation_name="einsum", shape=(2, 4)>
>>> result = bo.execute_graph(graph=graph, output_node_names="sum_2")
>>> result["output"]["sum_2"]["value"]
array([[ 4.,  6.,  8., 10.],
       [20., 22., 24., 26.]])

Reorder tensor axes.

>>> graph.einsum("ijk->jki", [x.reshape((8, 1, 2))], name="reorder")
<Tensor: name="reorder", operation_name="einsum", shape=(1, 2, 8)>
>>> result = bo.execute_graph(graph=graph, output_node_names="reorder")
>>> result["output"]["reorder"]["value"]
array([[[ 0.,  2.,  4.,  6.,  8., 10., 12., 14.],
        [ 1.,  3.,  5.,  7.,  9., 11., 13., 15.]]])

Vector inner product.

>>> graph.einsum("i,i->", [x, np.ones(16)], name="inner")
<Tensor: name="inner", operation_name="einsum", shape=()>
>>> result = bo.execute_graph(graph=graph, output_node_names="inner")
>>> result["output"]["inner"]["value"]
120.0

Matrix-vector multiplication.

>>> graph.einsum("ij,j->i", [x.reshape((4, 4)), np.ones(4)], name="multiplication")
<Tensor: name="multiplication", operation_name="einsum", shape=(4,)>
>>> result = bo.execute_graph(graph=graph, output_node_names="multiplication")
>>> result["output"]["multiplication"]["value"]
array([ 6., 22., 38., 54.])

Vector outer product.

>>> graph.einsum("i,j->ij", [x[:2], x[:3]], name="outer")
<Tensor: name="outer", operation_name="einsum", shape=(2, 3)>
>>> result = bo.execute_graph(graph=graph, output_node_names="outer")
>>> result["output"]["outer"]["value"]
array([[0., 0., 0.],
       [0., 1., 2.]])

Tensor contraction.

>>> graph.einsum(
...     "ijk,jil->kl", [x.reshape((4, 2, 2)), x.reshape((2, 4, 2))], name="contraction"
... )
<Tensor: name="contraction", operation_name="einsum", shape=(2, 2)>
>>> result = bo.execute_graph(graph=graph, output_node_names="contraction")
>>> result["output"]["contraction"]["value"]
array([[504., 560.],
       [560., 624.]])

Trace along first two axes.

>>> graph.einsum("ii...->i...", [x.reshape((2, 2, 4))], name="trace_2")
<Tensor: name="trace_2", operation_name="einsum", shape=(2, 4)>
>>> result = bo.execute_graph(graph=graph, output_node_names="trace_2")
>>> result["output"]["trace_2"]["value"]
array([[ 0.,  1.,  2.,  3.],
       [12., 13., 14., 15.]])

Matrix multiplication using the left-most indices.

>>> graph.einsum(
...     "ij...,jk...->ik...", [x.reshape((1, 4, 4)), x.reshape((4, 1, 4))], name="left_most"
... )
<Tensor: name="left_most", operation_name="einsum", shape=(1, 1, 4)>
>>> result = bo.execute_graph(graph=graph, output_node_names="left_most")
>>> result["output"]["left_most"]["value"]
array([[[224., 276., 336., 404.]]])