Microsoft AI Research Open-Sources ONNX Script Library for Directly Authoring ONNX Models in Python

In the ever-evolving landscape of machine learning, ONNX (Open Neural Network Exchange) models have emerged as a pivotal technology, offering a standardized and flexible representation that spans a diverse range of hardware platforms and runtime environments. From cloud-based supercomputers to resource-constrained edge devices like smartphones and web browsers, ONNX models empower seamless execution across the spectrum.

Central to ONNX’s power is its graph format, usually represented using the Protobuf format. However, ONNX is more than just a graph representation; it consists of a concise set of primitive operators, which are universally implemented by runtimes and hardware vendors. To maintain a broad ecosystem and minimize overhead, ONNX deliberately keeps its operator count low, encouraging modularity through ONNX functions.

While machine learning models are often conceived using high-level frameworks such as PyTorch and TensorFlow, deploying them to production necessitates a transition. Models are exported to ONNX using framework-provided tools, followed by optimization for specific targets using tools like Olive.

Meet ONNX Script, a novel open-source library designed by the Microsoft team to facilitate the direct creation of ONNX models using Python. ONNX Script prioritizes clean, idiomatic Python syntax and fosters composability through ONNX-native functions. This approach simplifies model authoring and facilitates integration with existing Python tooling, enhancing readability and productivity. Importantly, ONNX Script is the cornerstone upon which the future PyTorch ONNX exporter, supporting TorchDynamo, is being built.

Before ONNX Script’s advent, crafting ONNX models demanded a deep understanding of the specification and serialization format. While a helper API improved this process, it required familiarity with ONNX’s intricacies. ONNX Script takes a different approach by embedding deeply within Python on two levels:

1. Strongly Typed API for Operators: ONNX Script offers a strongly typed API for all 186 operators as of opset 19. This allows standard Python tooling, linters, and IDEs to provide valuable feedback and enforce correctness.

2. Natural Python Constructs: ONNX Script supports Python language features, including conditionals, loops, binary and unary operators, slicing, and more. Python expressions such as `a + b` are translated to the corresponding ONNX operator, `Add(a, b)`.

ONNX Script’s integration with the PyTorch ONNX exporter responds to the evolving landscape. The advent of PyTorch 2.0 and TorchDynamo marks a transition from TorchScript, necessitating a major overhaul of the ONNX exporter. ONNX Script was conceived as the foundation for this transformation, reimagining the fundamentals of the exporter’s architecture. The development of Torchlib—a pure ONNX implementation of PyTorch operators—simplifies the exporter’s role by translating FX graph nodes into ONNX graph nodes without operator-level concerns.

Furthermore, ONNX Script enables augmenting PyTorch model code with custom ONNX functions as specialized operators, enhancing the model’s flexibility and functionality.

ONNX Script integrates naturally with the Python ecosystem, which makes testing and debugging straightforward. With built-in support for NumPy, functions can be debugged using standard Python tooling or IDEs such as Visual Studio Code.

ONNX Script’s future is promising. It streamlines ONNX model authoring and opens doors to extending the ONNX standard itself. Core operators and higher-order functions contributed to the ONNX standard can be authored in ONNX Script, expediting the standard’s evolution. In the coming months, ONNX Script will support converting existing ONNX models into ONNX Script, enabling smoother editing of existing models and facilitating optimization passes. The goal is to propose ONNX Script’s inclusion within the ONNX GitHub organization, solidifying its place in the machine learning landscape.

Check out the Microsoft Blog and GitHub. All credit for this research goes to the researchers on this project.

Niharika is a technical consulting intern at Marktechpost. She is a third-year undergraduate, currently pursuing her B.Tech from the Indian Institute of Technology (IIT), Kharagpur. She is a highly enthusiastic individual with a keen interest in machine learning, data science, and AI, and an avid reader of the latest developments in these fields.
