
onnx-mlir: onnx-mlir/include/OnnxMlirRuntime.h Source File
OnnxMlirRuntime.h

/*
 * SPDX-License-Identifier: Apache-2.0
 */

//===------- OnnxMlirRuntime.h - ONNX-MLIR Runtime API Declarations -------===//
//
// Copyright 2019-2023 The IBM Research Authors.
//
// =============================================================================
//
// This file contains declaration of external OMTensor data structures and
// helper functions.
//
//===----------------------------------------------------------------------===//
#ifndef ONNX_MLIR_ONNXMLIRRUNTIME_H
#define ONNX_MLIR_ONNXMLIRRUNTIME_H

#ifdef __cplusplus
#include <cstdint>
#else
#include <stdbool.h>
#include <stdint.h>
#endif

#include <onnx-mlir/Runtime/OMEntryPoint.h>
#include <onnx-mlir/Runtime/OMInstrument.h>
#include <onnx-mlir/Runtime/OMSignature.h>
#include <onnx-mlir/Runtime/OMTensor.h>
#include <onnx-mlir/Runtime/OMTensorList.h>

/* ... lines 31-179 of the header (the file-level documentation comment) are
   omitted from this listing ... */

#endif // ONNX_MLIR_ONNXMLIRRUNTIME_H
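
For context, the sketch below shows how this header is typically consumed from C when running a model that onnx-mlir has compiled into a shared library. It is a minimal illustration, assuming the compiled library exports the default entry point run_main_graph and expects a single 3x2 float32 input; the entry-point name, shape, and data values are illustrative assumptions, not declarations from the header above. The omTensor* and omTensorList* helpers used here come from the OMTensor and OMTensorList headers that this file includes.

#include <OnnxMlirRuntime.h>
#include <stdio.h>

/* Entry point generated by onnx-mlir for the compiled model.
   "run_main_graph" is the default name; assumed here for illustration. */
OMTensorList *run_main_graph(OMTensorList *);

int main(void) {
  /* Illustrative input: a 3x2 float32 tensor. */
  static float data[] = {1.f, 2.f, 3.f, 4.f, 5.f, 6.f};
  int64_t shape[] = {3, 2};

  /* Wrap the raw buffer in an OMTensor, then in an OMTensorList. */
  OMTensor *x = omTensorCreate(data, shape, 2, ONNX_TYPE_FLOAT);
  OMTensor *inputs[] = {x};
  OMTensorList *inputList = omTensorListCreate(inputs, 1);

  /* Run inference and read back the first output tensor. */
  OMTensorList *outputList = run_main_graph(inputList);
  OMTensor *y = omTensorListGetOmtByIndex(outputList, 0);
  float *yData = (float *)omTensorGetDataPtr(y);
  printf("first output element: %f\n", yData[0]);

  /* Destroy the lists and the OMTensors they hold (the static data
     buffer itself is not owned by the tensors). */
  omTensorListDestroy(outputList);
  omTensorListDestroy(inputList);
  return 0;
}

A driver along these lines is typically built against the onnx-mlir include directory and linked with the shared library generated for the model (for example, roughly: cc --std=c99 driver.c model.so -I <onnx-mlir>/include -o driver); the exact file names and flags depend on how the model was compiled.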