onnx.hub¶
ModelInfo¶
- class onnx.hub.ModelInfo(raw_model_info: dict[str, Any])[source]¶
A class representing a model’s properties and metadata in the ONNX Hub. It extracts the model name, path, SHA, tags, etc. from the raw_model_info dict passed in.
- model¶
The name of the model.
- model_path¶
The path to the model, relative to the model zoo (https://github.com/onnx/models/) repo root.
- metadata¶
Additional metadata for the model, such as its size, IO ports, etc.
- model_sha¶
The SHA256 digest of the model file.
- tags¶
A set of tags associated with the model.
- opset¶
The opset version of the model.
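A minimal sketch of reading these attributes, using a ModelInfo obtained via get_model_info (documented below); the model name "mnist" is an assumed example and may not match the current manifest:

```python
from onnx import hub

# Fetch the manifest entry for a model from the default onnx/models:main repo.
# "mnist" is an assumed example name; hub model names are case-sensitive.
info = hub.get_model_info("mnist")

print(info.model)       # model name as listed in the manifest
print(info.model_path)  # path relative to the model zoo repo root
print(info.model_sha)   # SHA256 digest of the model file
print(info.opset)       # opset version of the model
print(info.tags)        # set of tags, e.g. {"vision", "classification"}
print(info.metadata)    # extra details such as model size and IO ports
```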
download_model_with_test_data¶
- onnx.hub.download_model_with_test_data(model: str, repo: str = 'onnx/models:main', opset: int | None = None, force_reload: bool = False, silent: bool = False) → str | None [source]¶
Downloads a model along with its test data by name from the onnx model hub and returns the directory to which the files have been extracted. Users are responsible for making sure the model comes from a trusted source and that the data is safe to extract.
- Parameters:
model – The name of the onnx model in the manifest. This field is case-sensitive.
repo – The location of the model repo in the format “user/repo[:branch]”. If no branch is given, it defaults to “main”.
opset – The opset version of the model to download. The default of None automatically chooses the largest opset.
force_reload – Whether to force the model to re-download even if it is already found in the cache.
silent – Whether to suppress the warning message if the repo is not trusted.
- Returns:
The directory path (str) to which the model and test data were extracted, or None.
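A short usage sketch; "mnist" is an assumed example name, and the layout of the extracted directory depends on the archive published in the model zoo:

```python
from onnx import hub

# Download the model together with its test data and get back the directory
# into which the archive was extracted. force_reload=True bypasses the cache.
extracted_dir = hub.download_model_with_test_data("mnist", force_reload=False)

if extracted_dir is not None:
    print(f"Model and test data extracted to: {extracted_dir}")
```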
get_model_info¶
- onnx.hub.get_model_info(model: str, repo: str = 'onnx/models:main', opset: int | None = None) → ModelInfo [source]¶
Gets the model info matching the given name and opset.
- Parameters:
model – The name of the onnx model in the manifest. This field is case-sensitive.
repo – The location of the model repo in the format “user/repo[:branch]”. If no branch is given, it defaults to “main”.
opset – The opset version of the model to get. The default of None returns the model with the largest opset.
- Returns:
The ModelInfo entry matching the given name and opset.
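A brief sketch of requesting a specific opset; the model name "resnet50" and opset 12 are assumed example values and may not exist in the current manifest:

```python
from onnx import hub

# Look up the manifest entry for a specific opset instead of the largest one.
info = hub.get_model_info("resnet50", repo="onnx/models:main", opset=12)
print(info.model, info.opset, info.model_sha)
```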
list_models¶
- onnx.hub.list_models(repo: str = 'onnx/models:main', model: str | None = None, tags: list[str] | None = None) → list[ModelInfo] [source]¶
Gets the list of model info entries matching the given name and tags.
- Parameters:
repo – The location of the model repo in the format “user/repo[:branch]”. If no branch is given, it defaults to “main”.
model – The name of the model to search for. If None, all models with matching tags are returned.
tags – A list of tags to filter models by. If None, all models with a matching name are returned.
- Returns:
A list of ModelInfo objects matching the given name and tags.
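A short sketch of filtering the manifest; the "vision" tag is an assumed example, and the available tags depend on the repo’s manifest:

```python
from onnx import hub

# List every model in the default repo, then narrow the listing by tag.
all_models = hub.list_models()
vision_models = hub.list_models(tags=["vision"])

print(f"{len(all_models)} models total, {len(vision_models)} tagged 'vision'")
for info in vision_models[:5]:
    print(info.model, sorted(info.tags))
```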
load¶
- onnx.hub.load(model: str, repo: str = 'onnx/models:main', opset: int | None = None, force_reload: bool = False, silent: bool = False) → ModelProto | None [source]¶
Downloads a model by name from the onnx model hub and returns it as an in-memory ModelProto.
- Parameters:
model – The name of the onnx model in the manifest. This field is case-sensitive.
repo – The location of the model repo in the format “user/repo[:branch]”. If no branch is given, it defaults to “main”.
opset – The opset version of the model to download. The default of None automatically chooses the largest opset.
force_reload – Whether to force the model to re-download even if it is already found in the cache.
silent – Whether to suppress the warning message if the repo is not trusted.
- Returns:
The downloaded model as a ModelProto, or None.
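A minimal sketch of downloading and using a model; "mnist" is an assumed example name, and checking or saving the result is optional:

```python
import onnx
from onnx import hub

# Download the model (or fetch it from the local cache) as a ModelProto.
model = hub.load("mnist")

if model is not None:
    # The result is an ordinary ModelProto; it can be checked and saved as usual.
    onnx.checker.check_model(model)
    onnx.save(model, "mnist.onnx")
```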
load_composite_model¶
- onnx.hub.load_composite_model(network_model: str, preprocessing_model: str, network_repo: str = 'onnx/models:main', preprocessing_repo: str = 'onnx/models:main', opset: int | None = None, force_reload: bool = False, silent: bool = False) → ModelProto | None [source]¶
Builds a composite model, including data preprocessing, by downloading a network model and a preprocessing model and combining them into a single model.
- Parameters:
network_model – The name of the onnx network model in the manifest.
preprocessing_model – The name of the preprocessing model in the manifest.
network_repo – The location of the network model repo in the format “user/repo[:branch]”. If no branch is given, it defaults to “main”.
preprocessing_repo – The location of the preprocessing model repo in the format “user/repo[:branch]”. If no branch is given, it defaults to “main”.
opset – The opset version of the models to download. The default of None automatically chooses the largest opset.
force_reload – Whether to force the models to re-download even if they are already found in the cache.
silent – Whether to suppress the warning message if the repo is not trusted.
- Returns:
The composite model as a ModelProto, or None.
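A rough sketch under assumed names; both model names below are hypothetical and must be replaced with a network and a compatible preprocessing model from the hub manifest:

```python
from onnx import hub

# Download a network and a preprocessing model and combine them into a single
# ModelProto that includes the preprocessing steps.
composite = hub.load_composite_model(
    network_model="ResNet50-fp32",         # assumed example name
    preprocessing_model="ResNet-preproc",   # assumed example name
)

if composite is not None:
    print(len(composite.graph.node), "nodes in the combined graph")
```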