onnx_ir.convenience

onnx_ir.convenience.convert_attribute(name, attr, attr_type=None)

Convert a Python object to a _core.Attr object.

This function is useful when constructing nodes with attributes. It infers the attribute type based on the type of the Python value.

Parameters:
  • name – The name of the attribute.

  • attr – The Python object to convert.

  • attr_type – The attribute type. Required when attr is None; otherwise it may be omitted and inferred from attr.

Returns:

An Attr object.

Raises:
  • ValueError – If attr is None and attr_type is not provided.

  • TypeError – If the type of the attribute is not supported.

Return type:

Attr
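For example, a plain float is inferred as a FLOAT attribute, and attr_type can be passed to make the choice explicit. A minimal sketch (the names are illustrative and the exact reprs may differ between versions):

>>> import onnx_ir as ir
>>> from onnx_ir.convenience import convert_attribute
>>> convert_attribute("alpha", 0.5)
Attr('alpha', FLOAT, 0.5)
>>> convert_attribute("axes", [0, 1], attr_type=ir.AttributeType.INTS)
Attr('axes', INTS, [0, 1])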

onnx_ir.convenience.convert_attributes(attrs)

Convert a dictionary of attributes to a list of _core.Attr objects.

It infers the attribute type based on the type of the value. The supported types are: int, float, str, Sequence[int], Sequence[float], Sequence[str], _core.Tensor, and _core.Attr:

>>> import onnx_ir as ir
>>> import onnx
>>> import numpy as np
>>> attrs = {
...     "int": 1,
...     "float": 1.0,
...     "str": "hello",
...     "ints": [1, 2, 3],
...     "floats": [1.0, 2.0, 3.0],
...     "strings": ["hello", "world"],
...     "tensor": ir.Tensor(np.array([1.0, 2.0, 3.0])),
...     "tensor_proto":
...         onnx.TensorProto(
...             dims=[3],
...             data_type=onnx.TensorProto.FLOAT,
...             float_data=[1.0, 2.0, 3.0],
...             name="proto",
...         ),
...     "graph": ir.Graph([], [], nodes=[], name="graph0"),
...     "graphs": [ir.Graph([], [], nodes=[], name="graph1"), ir.Graph([], [], nodes=[], name="graph2")],
...     "type_proto": ir.TensorType(ir.DataType.FLOAT),
...     "type_protos": [ir.TensorType(ir.DataType.FLOAT), ir.TensorType(ir.DataType.FLOAT)],
... }
>>> convert_attributes(attrs)
[Attr('int', INT, 1), Attr('float', FLOAT, 1.0), Attr('str', STRING, 'hello'), Attr('ints', INTS, [1, 2, 3]), Attr('floats', FLOATS, [1.0, 2.0, 3.0]), Attr('strings', STRINGS, ['hello', 'world']), Attr('tensor', TENSOR, Tensor<DOUBLE,[3]>(array([1., 2., 3.]), name=None)), Attr('tensor_proto', TENSOR, TensorProtoTensor<FLOAT,[3]>(array([1., 2., 3.], dtype=float32), name='proto')), Attr('graph', INTS, Graph(
    name='graph0',
    inputs=(

    ),
    outputs=(

    ),
    len()=0
)), Attr('graphs', GRAPHS, [Graph(
    name='graph1',
    inputs=(

    ),
    outputs=(

    ),
    len()=0
), Graph(
    name='graph2',
    inputs=(

    ),
    outputs=(

    ),
    len()=0
)]), Attr('type_proto', TYPE_PROTO, Tensor(FLOAT)), Attr('type_protos', TYPE_PROTOS, [Tensor(FLOAT), Tensor(FLOAT)])]
Parameters:

attrs (Mapping[str, str | int | float | Sequence[int] | Sequence[float] | Sequence[str] | TensorProtocol | TensorProto | Attr | GraphProtocol | Sequence[GraphProtocol] | GraphProto | TypeProtocol | Sequence[TypeProtocol] | None]) – A dictionary of {<attribute name>: <python objects>} to convert.

Returns:

A list of _core.Attr objects.

Return type:

list[Attr]
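The returned list can be passed straight to a node constructor. A hedged sketch, assuming ir.Node accepts an iterable of Attr objects for its attributes argument (the domain and op name below are illustrative):

>>> import onnx_ir as ir
>>> from onnx_ir.convenience import convert_attributes
>>> x = ir.Input("x")
>>> node = ir.Node("custom", "MyOp", [x], attributes=convert_attributes({"alpha": 0.5, "axes": [0, 1]}))
>>> sorted(node.attributes)
['alpha', 'axes']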

onnx_ir.convenience.create_value_mapping(graph)

Return a dictionary mapping names to values in the graph.

The mapping includes values from subgraphs. Duplicated names are omitted, and the first value with that name is returned. Values with empty names are excluded from the mapping.

Parameters:

graph (Graph) – The graph to extract the mapping from.

Returns:

A dictionary mapping names to values.

Return type:

dict[str, Value]
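A minimal sketch of looking up a node output by name (the graph and value names are illustrative):

>>> import onnx_ir as ir
>>> from onnx_ir.convenience import create_value_mapping
>>> x = ir.Input("x")
>>> node = ir.Node("", "Identity", [x])
>>> node.outputs[0].name = "y"
>>> graph = ir.Graph([x], node.outputs, nodes=[node], name="g")
>>> mapping = create_value_mapping(graph)
>>> mapping["y"] is node.outputs[0]
True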

onnx_ir.convenience.get_const_tensor(value, propagate_shape_type=False)

Get the constant tensor from a value, if it exists.

A constant tensor can be obtained if the value has a const_value set (as in the case of an initializer) or if the value is produced by a Constant node.

This function will not alter the const_value of the value, but it will propagate the shape and type of the constant tensor to the value if propagate_shape_type is set to True.

Parameters:
  • value (Value) – The value to get the constant tensor from.

  • propagate_shape_type (bool) – If True, the shape and type of the constant tensor will be propagated to the value.

Returns:

The constant tensor if it exists, otherwise None.

Raises:

ValueError – If the Constant node does not have exactly one output or one attribute.

Return type:

TensorProtocol | None
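A minimal sketch with a value that carries a const_value, as an initializer would (the name and tensor contents are illustrative):

>>> import onnx_ir as ir
>>> import numpy as np
>>> from onnx_ir.convenience import get_const_tensor
>>> weight = ir.Value(name="w")
>>> weight.const_value = ir.Tensor(np.array([1.0, 2.0], dtype=np.float32))
>>> get_const_tensor(weight).numpy()
array([1., 2.], dtype=float32)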

onnx_ir.convenience.replace_all_uses_with(values, replacements)

Replace all uses of the given values with the replacements.

This is useful when nodes in the graph are replaced with new nodes, where the old users need to be updated to use the outputs of the new nodes.

For example, suppose we have the following graph:

A -> {B, C}

We want to replace the node A with a new node D:

>>> import onnx_ir as ir
>>> input = ir.Input("input")
>>> node_a = ir.Node("", "A", [input])
>>> node_b = ir.Node("", "B", node_a.outputs)
>>> node_c = ir.Node("", "C", node_a.outputs)
>>> node_d = ir.Node("", "D", [input])
>>> replace_all_uses_with(node_a.outputs, node_d.outputs)
>>> len(node_b.inputs)
1
>>> node_b.inputs[0].producer().op_type
'D'
>>> len(node_c.inputs)
1
>>> node_c.inputs[0].producer().op_type
'D'
>>> len(node_a.outputs[0].uses())
0

When values and replacements are sequences, they are zipped into pairs. All users of the first value are replaced with the first replacement, and so on.

Note

You still need to update the graph outputs if any of the values being replaced are part of the graph outputs. Be sure to remove the old nodes from the graph using graph.remove() if they are no longer needed; both steps are sketched at the end of this entry.

Parameters:
  • values (ValueProtocol | Sequence[ValueProtocol]) – The value or values to be replaced.

  • replacements (ValueProtocol | Sequence[ValueProtocol]) – The new value or values to use as inputs.

Return type:

None
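A hedged sketch of the cleanup described in the note above, assuming the replaced value is also listed among the graph outputs (the op names are illustrative):

>>> import onnx_ir as ir
>>> from onnx_ir.convenience import replace_all_uses_with
>>> x = ir.Input("x")
>>> old = ir.Node("", "Old", [x])
>>> new = ir.Node("", "New", [x])
>>> consumer = ir.Node("", "Consumer", old.outputs)
>>> g = ir.Graph([x], [old.outputs[0], consumer.outputs[0]], nodes=[old, new, consumer], name="g")
>>> replace_all_uses_with(old.outputs, new.outputs)
>>> g.outputs[0] = new.outputs[0]  # graph outputs are not rewired automatically
>>> g.remove(old)                  # the replaced node is no longer needed
>>> [n.op_type for n in g]
['New', 'Consumer']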

onnx_ir.convenience.replace_nodes_and_values(graph_or_function, /, insertion_point, old_nodes, new_nodes, old_values, new_values)

Replaces nodes and values in the graph or function.

Parameters:
  • graph_or_function (Graph | Function) – The graph or function to replace nodes and values in.

  • insertion_point (Node) – The node to insert the new nodes after.

  • old_nodes (Sequence[Node]) – The nodes to replace.

  • new_nodes (Sequence[Node]) – The nodes to replace with.

  • old_values (Sequence[Value]) – The values to replace.

  • new_values (Sequence[Value]) – The values to replace with.

Return type:

None
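A hedged sketch of a one-for-one node swap, assuming the function inserts the new nodes after insertion_point, rewires uses of the old values to the new values, and removes the old nodes (the op names are illustrative):

>>> import onnx_ir as ir
>>> from onnx_ir.convenience import replace_nodes_and_values
>>> x = ir.Input("x")
>>> old_node = ir.Node("", "Identity", [x])
>>> consumer = ir.Node("", "Relu", old_node.outputs)
>>> graph = ir.Graph([x], consumer.outputs, nodes=[old_node, consumer], name="g")
>>> new_node = ir.Node("", "Abs", [x])
>>> replace_nodes_and_values(graph, old_node, [old_node], [new_node], [old_node.outputs[0]], [new_node.outputs[0]])
>>> sorted(n.op_type for n in graph)
['Abs', 'Relu']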