October 16, 2018

ONNX Runtime for inferencing machine learning models now in preview

We are excited to release the preview of ONNX Runtime, a high-performance inference engine for machine learning models in the Open Neural Network Exchange (ONNX) format. ONNX Runtime is compatible with ONNX version 1.2 and comes in Python packages that support both CPU and GPU to enable inferencing using Azure Machine Learning service and on any Linux machine running Ubuntu 16. READ MORE

September 6, 2018

ONNX version 1.3 Released

We are excited to announce that the v1.3 release of ONNX is now available! For those who aren't yet familiar with ONNX, you can learn more about the project, who is involved, and what tools are available at the onnx.ai site. READ MORE

July 23, 2018

ONNX Model Zoo: Developing a face recognition application with ONNX models

Today, Amazon Web Services (AWS), Facebook and Microsoft are pleased to announce that the Open Neural Network Exchange (ONNX) Model Zoo is publicly available. ONNX is an open standard format for deep learning models that enables interoperability between deep learning frameworks such as Apache MXNet, Caffe2, Microsoft Cognitive Toolkit, and PyTorch. ONNX Model Zoo enables developers to easily and quickly get started with deep learning using any framework supporting ONNX. READ MORE

July 12, 2018

Vespa introduces ONNX support

ONNX (Open Neural Network Exchange) is an open format for sharing neural networks and other machine-learned models between various machine learning and deep learning frameworks. As the open big data serving engine, Vespa aims to make it simple to evaluate machine learned models at serving time at scale. READ MORE

July 9, 2018

Announcing ML.NET 0.3 with support for ONNX

Two months ago, at //Build 2018, Microsoft released ML.NET 0.1, a cross-platform, open source machine learning framework for .NET developers. Today they are happy to announce the latest version: ML.NET 0.3. This release adds support for exporting models to the ONNX format, enables creating new types of models with Factorization Machines, LightGBM, Ensembles, and LightLDA, and addresses a variety of issues and feedback received from the community. READ MORE

July 3, 2018

MathWorks joins the Open Neural Network Exchange

AI/ML researchers and developers can now export a trained MathWorks Neural Network Toolbox deep learning network to the ONNX (Open Neural Network Exchange) model format. They can then import the ONNX model to other deep learning frameworks that support ONNX model import. READ MORE

June 7, 2018

BITMAIN partners with Skymizer on an open source compiler for ONNX to speed up AI development

BITMAIN and Skymizer today announced their cooperation on ONNC, an open source compiler aiming to connect ONNX to all AI ASICs. Sophon, BITMAIN’s AI ASIC solution, will be the first hardware platform for ONNC development. This will make it easier for the broad ONNX audience to use Sophon for their deep learning inference work. READ MORE

June 6, 2018

Type annotations for ONNX

At Facebook, we work with community best practices to ensure high code quality, readability and reliability. In line with this, we just added type annotations to our Python code to help ONNX developers more easily contribute to the project.

These type annotations are used by mypy within the ONNX CI systems to ensure a continuously high code quality standard. We also have type annotations for our APIs, which means your tools built on top of the ONNX APIs can use static analysis tools like mypy to ensure they are using the APIs correctly. READ MORE
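As a small illustration (not taken from the ONNX codebase itself), comment-style annotations of the kind described above look like this; mypy checks them statically, while at runtime they are plain comments and impose no cost:

```python
from typing import List, Optional


def node_names(names, prefix=None):
    # type: (List[str], Optional[str]) -> List[str]
    """Return node names, optionally keeping only those with a given prefix."""
    if prefix is None:
        return list(names)
    return [n for n in names if n.startswith(prefix)]


# mypy would flag a call like node_names(42) as a type error before it
# ever runs; correct calls behave as usual.
print(node_names(["Conv_0", "Relu_1", "Conv_2"], prefix="Conv"))
```

The comment form keeps annotated code compatible with every Python version, which is why it suits a library with a broad user base.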

June 5, 2018

HPE to join the Open Neural Network Exchange

Hewlett Packard Enterprise is joining ONNX to work alongside industry leaders in pushing open AI standards. They will be joining Microsoft, Facebook, and Amazon, the founders of ONNX, and ONNX partners like AMD, NVIDIA, IBM and other industry leaders to push open artificial intelligence (AI) standards forward in the coming years. READ MORE

May 2, 2018

ONNX Expansion Speeds AI Development

In the beginning of the recent deep learning revolution, researchers had only a handful of tools (such as Torch, Theano, and Caffe) to work with, but today there is a robust ecosystem of deep learning frameworks and hardware runtimes. While this growing toolbox is extremely useful, each framework has the potential to become an island unto itself without interoperability. Achieving interoperability, however, requires a lot of custom integration work for each possible framework/runtime pair, and reimplementing models to move between frameworks is typically difficult and can slow development by weeks or months. READ MORE

May 2, 2018

Introducing ONNX to Core ML model converter

Today, we're pleased to add production-grade Core ML support through the availability of an ONNX to Core ML model converter. Core ML enables developers to quickly build apps with intelligent new features across Apple products. This new capability allows developers to use their favorite ONNX-compliant framework to design, train, and test their models, and then seamlessly integrate them into apps for Apple products. READ MORE

April 20, 2018

Skymizer connects ONNX to all deep learning accelerator ASICs

Skymizer, a compiler company founded in 2013, will launch “ONNC” (Open Neural Network Compiler), an open source compiler for ONNX backed by its unique compiler technologies.

Hundreds of AI chips will be released in the near future; the latest figures indicate that 34 IC and IP vendors will provide various AI chips and deep learning accelerator (DLA) ASICs in 2018. All of this reflects the urgent need for an open compiler that supports different AI chips. READ MORE

April 10, 2018

AI chip company, BITMAIN, is officially joining and embracing ONNX AI software ecosystem

BITMAIN, founded in 2013, was best known for its massive success in the digital currency industry. In 2017, BITMAIN launched its AI solution, Sophon, and its first AI chip, the BM1680, and began selling it to the market. BITMAIN is committed to providing the most powerful and energy-efficient AI solutions to the market. To let customers unleash the power of Sophon with minimal development work, BITMAIN is implementing an inference platform that supports all ONNX models on all Sophon solutions. READ MORE

March 13, 2018

ONNX working groups established

We are excited to announce the formation of community working groups. Working groups will bring together ONNX partners and members of the community to help steer the direction of ONNX. We have created 4 new working groups to provide guidance and feedback on the topics of Quantization, RNNs and Control Flow, Test and Compliance, and Training. READ MORE

March 7, 2018

ONNX models to be runnable natively on 100s of millions of Windows devices

Today Microsoft is announcing that the next major update to Windows will include the ability to run Open Neural Network Exchange (ONNX) models natively with hardware acceleration. This brings hundreds of millions of Windows devices, ranging from IoT edge devices to HoloLens to 2-in-1s and desktop PCs, into the ONNX ecosystem. Data scientists and developers creating AI models will be able to deploy their innovations to this large user base. And every developer building apps on Windows 10 will be able to use AI models to deliver more powerful and engaging experiences. READ MORE

February 22, 2018

MediaTek Joins Open Neural Network Exchange to Evolve its Edge AI Platform

MediaTek today announced that it has joined the Open Neural Network Exchange (ONNX) to drive AI innovation and support the evolution of its edge AI platform. Existing involvement in the Android Neural Network (ANN), combined with its new support of and participation in ONNX, is part of MediaTek’s strategic imperative to continue integrating AI across its technology portfolio. READ MORE

February 5, 2018

Model Server for Apache MXNet introduces ONNX support and Amazon CloudWatch integration

Today AWS released version 0.2 of Model Server for Apache MXNet (MMS), an open source library that packages and serves deep learning models for making predictions with just a few lines of code. With the new release, engineers are now able to serve ONNX models, and can publish operational metrics directly to Amazon CloudWatch, where they can create dashboards and alarms. READ MORE

January 17, 2018

ONNX support by Chainer

Today, we announce ONNX-Chainer, an open source Python package to export Chainer models to Open Neural Network Exchange (ONNX) format. This blog post explains how to export a model written in Chainer into ONNX by using chainer/onnx-chainer. READ MORE

December 6, 2017

ONNX V1 released

In September, we released an early version of the Open Neural Network Exchange format (ONNX) with a call to the community to join us and help create an open, flexible standard to enable deep learning frameworks and tools to interoperate. Today Facebook, AWS, and Microsoft are excited to announce that with the support of the community and new partners the first version of ONNX is now production-ready. READ MORE

December 6, 2017

Announcing ONNX 1.0

Today, the Open Neural Network Exchange (ONNX) working group, which includes Amazon Web Services (AWS), Facebook, and Microsoft, announces the availability of ONNX 1.0. This release introduces the stable and production-ready version of the ONNX format. ONNX is an open standard format for deep learning models that enables interoperability between deep learning frameworks such as Apache MXNet, PyTorch, Caffe2, and Microsoft Cognitive Toolkit. ONNX 1.0 enables users to move deep learning models between frameworks, making it easier to put them into production. For example, developers can build sophisticated computer vision models using frameworks such as PyTorch and run them for inference using CNTK or Apache MXNet. READ MORE

December 6, 2017

Announcing ONNX 1.0 – An open ecosystem for AI

Today we are announcing that Open Neural Network Exchange (ONNX) is production-ready. ONNX is an open source model representation for interoperability and innovation in the AI ecosystem that Microsoft co-developed. The ONNX format is the basis of an open ecosystem that makes AI more accessible and valuable to all: developers can choose the right framework for their task, framework authors can focus on innovative enhancements, and hardware vendors can streamline optimizations. READ MORE

December 4, 2017

NVIDIA GPU Cloud Now Available to Hundreds of Thousands of AI Researchers Using NVIDIA Desktop GPUs

NVIDIA today announced that hundreds of thousands of AI researchers using desktop GPUs can now tap into the power of NVIDIA GPU Cloud (NGC) as the company has extended NGC support to NVIDIA TITAN.

NVIDIA also announced expanded NGC capabilities — adding new software and other key updates to the NGC container registry — to provide researchers a broader, more powerful set of tools to advance their AI and high performance computing research and development efforts. READ MORE

November 16, 2017

Announcing ONNX support for Apache MXNet

Today, AWS announces the availability of ONNX-MXNet, an open source Python package to import ONNX (Open Neural Network Exchange) deep learning models into Apache MXNet (Incubating). MXNet is a fully featured and scalable deep learning framework that offers APIs across popular languages such as Python, Scala, and R. With ONNX format support for MXNet, developers can build and train models with other frameworks, such as PyTorch, Microsoft Cognitive Toolkit (CNTK), or Caffe2, and import these models into MXNet to run them for inference using MXNet’s highly optimized and scalable engine. READ MORE

November 16, 2017

Amazon Web Services to join ONNX AI format, drive MXNet support

The Open Neural Network Exchange (ONNX) is a community project originally launched in September 2017 to increase interoperability between deep learning tools. ONNX is a standard for representing deep learning models that enables these models to be transferred between frameworks. It is the first step toward an open ecosystem where AI developers can easily move between state-of-the-art tools and choose the combination that works best for them. READ MORE

November 16, 2017

Support for open AI ecosystem grows as Amazon Web Services joins ONNX AI format

It’s been an exciting few months! In September we introduced the Open Neural Network Exchange (ONNX) format that we created with Facebook to increase interoperability and reduce friction for developing and deploying AI. In October a number of companies that share our goals announced their support for ONNX. READ MORE

October 11, 2017

Open standards for deep learning to simplify development of neural networks

Among the various fields of exploration in artificial intelligence, deep learning is an exciting and increasingly important area of research that holds great potential for helping computers understand and extract meaning from data, e.g. deciphering images and sounds.

To help further the creation and adoption of interoperable deep learning models, IBM joined the Open Neural Network Exchange (ONNX), a new industry ecosystem that was established by Facebook and Microsoft in September. ONNX provides a common open format to represent deep learning models. The ONNX initiative envisions the flexibility to move deep learning models seamlessly between open-source frameworks to accelerate development for data scientists. READ MORE

October 10, 2017

ONNX AI Format Adds Partners

Today, following the introduction of the Open Neural Network Exchange (ONNX) format on September 7, AMD, ARM, Huawei, IBM, Intel, and Qualcomm have announced their support for ONNX. These companies, like Facebook and Microsoft, recognize the benefits ONNX’s open ecosystem provides engineers and researchers by allowing them to more easily move between state-of-the-art machine learning tools and choose the best combination for their projects. ONNX also makes it easier for optimizations to reach more developers. Any tool exporting ONNX models can benefit from ONNX-compatible runtimes and libraries designed to maximize performance on some of the best AI hardware in the industry. READ MORE

October 10, 2017

Microsoft and Facebook's call for an open AI ecosystem gains broader industry momentum

Last month we introduced the Open Neural Network Exchange (ONNX) format with Facebook to increase interoperability and reduce friction for developing and deploying AI. Since then we’ve talked with many companies that share our goals and recognize the benefits of the ONNX open ecosystem. READ MORE

October 10, 2017

AMD announces ONNX support

AMD is excited to see the emergence of the Open Neural Network Exchange (ONNX) format, a common model format that bridges three industry-leading deep learning frameworks (PyTorch, Caffe2, and CNTK) and gives our customers a simpler path to explore their networks on a rich foundation of framework interoperability. READ MORE

October 10, 2017

Arm joins Facebook and Microsoft to bring next-generation AI to life

At Arm, our commitment to artificial intelligence (AI) starts with developing and delivering technologies that are secure, scalable, and power-efficient. After all, AI is already simplifying and transforming our lives, but we’re really only scratching the surface of what’s possible. AI will increasingly happen on end devices, whether it’s your smartphone or your car, which means we’ll continue to see more compute power and AI algorithms on those devices. As part of that effort, we’re excited to announce that we’ve joined industry leaders on an open-source project that aims to enable interoperability and innovation in the AI framework ecosystem. READ MORE

October 10, 2017

QTI announces support for ONNX, simplifying AI choices for developers

QUALCOMM - So you’ve started working with neural networks and artificial intelligence (AI), but did you find it hard to choose one machine learning framework over another, like Caffe/Caffe2, TensorFlow, Cognitive Toolkit, or PyTorch? Whether you’re training your own models or using freely available ones, you’ll want to choose a framework that you stick with all the way through production. READ MORE

October 10, 2017

Intel Joins Open Neural Network Exchange Ecosystem to Expand Developer Choice in Deep Learning Frameworks

As part of Intel’s commitment to furthering artificial intelligence across the industry, Intel is joining Microsoft, Facebook, and others to participate in the Open Neural Network Exchange (ONNX) project. By joining the project, we plan to further expand the choices developers have on top of frameworks powered by the Intel® Nervana™ Graph library and deployment through our Deep Learning Deployment Toolkit. Developers should have the freedom to choose the best software and hardware to build their artificial intelligence model and not be locked into one solution based on a framework. Deep learning is better when developers can move models from framework to framework and use the best hardware platform for the job. READ MORE

September 7, 2017

Facebook and Microsoft introduce new open ecosystem for interchangeable AI frameworks

Facebook and Microsoft are today introducing the Open Neural Network Exchange (ONNX) format, a standard for representing deep learning models that enables models to be transferred between frameworks. ONNX is the first step toward an open ecosystem where AI developers can easily move between state-of-the-art tools and choose the combination that is best for them. READ MORE

September 7, 2017

Microsoft and Facebook create open ecosystem for AI model interoperability

At Microsoft our commitment is to make AI more accessible and valuable for everyone. We offer a variety of platforms and tools to facilitate this, including our Cognitive Toolkit, an open source framework for building deep neural networks. We also work with other organizations that share our views to help the AI community.

Today we are excited to announce the Open Neural Network Exchange (ONNX) format in conjunction with Facebook. ONNX provides a shared model representation for interoperability and innovation in the AI framework ecosystem. Cognitive Toolkit, Caffe2, and PyTorch will all be supporting ONNX. Microsoft and Facebook co-developed ONNX as an open source project, and we hope the community will help us evolve it. READ MORE