ONNX

from Wikipedia, the free encyclopedia
Basic data

Current version: 1.7.0 (May 9, 2020)
Operating system: cross-platform
License: MIT License
Website: onnx.ai

ONNX (Open Neural Network Exchange) is an open format for representing deep learning models. With ONNX, AI developers can move models between different tools and choose the combination of tools that works best for them. ONNX is jointly developed and supported by Microsoft, Amazon, Facebook and other partners as an open-source project.

ONNX enables models to be trained in one framework and then transferred to another framework for inference, for example for face, gesture or object recognition. This lets developers use the right combination of tools for each task. As of 2019, ONNX models were supported in Caffe2, Microsoft Cognitive Toolkit, MXNet, PyTorch and OpenCV, and interfaces exist for many other popular frameworks and libraries.

The LF AI Foundation, a sub-organization of the Linux Foundation, is an ecosystem-building organization that supports open-source innovation in artificial intelligence (AI), machine learning (ML) and deep learning (DL). It accepted ONNX as a graduate project on November 14, 2019. This move of ONNX under the umbrella of the LF AI Foundation was seen as an important milestone in establishing ONNX as a vendor-neutral open format standard.

The ONNX Model Zoo is a collection of pre-trained deep learning models available in the ONNX format. Each model is accompanied by Jupyter notebooks for model training and for performing inference with the trained model. The notebooks are written in Python and contain links to the training data set as well as references to the original scientific paper that describes the model architecture.

ONNX.js

ONNX.js is a JavaScript library for running ONNX models in web browsers and on Node.js. With ONNX.js, web developers can integrate and test pre-trained ONNX models directly in the web browser. This offers several advantages: reduced server-client communication, protection of user data, and cross-platform machine learning without installing software on the client.

ONNX.js can execute models on both the CPU and the GPU. For CPU execution it uses WebAssembly, which allows models to run at near-native speed. In addition, ONNX.js uses Web Workers to provide a multi-threaded environment for parallelizing data processing. Empirical evaluation shows promising performance gains on the CPU from taking full advantage of WebAssembly and Web Workers. For GPU execution, ONNX.js uses WebGL, a standard API for accessing GPU functions.
