Developers Can Now Use ONNX Runtime (Machine Learning Inference Engine) To Build Machine Learning Applications Across Android And iOS Platforms Through Xamarin

Traditionally, AI models have run on powerful servers in the cloud, and "on-device machine learning," such as inference directly on mobile phones, has been comparatively rare. This is mainly because mobile devices lack the storage, compute resources, and power that AI models typically demand. Despite these constraints, on-device AI can be quite helpful in certain problem scenarios.

To help bring AI models to mobile devices, Microsoft has recently released ONNX Runtime version 1.10, which supports building C# applications with Xamarin. Xamarin is an open-source platform for building applications using C# and .NET. This release enables developers to build cross-platform AI applications for Android and iOS using Xamarin.Forms. Microsoft has also added a sample Xamarin application that runs a ResNet classifier on Android and iOS phones using ONNX Runtime's NuGet package. Detailed steps for adding the ONNX Runtime package and learning about Xamarin.Forms applications can be found here.

ONNX Runtime supports models exported from deep learning frameworks such as PyTorch and TensorFlow, as well as classical machine learning libraries such as scikit-learn, LightGBM, and XGBoost. It is also compatible with a wide range of hardware, delivering faster inference by using the best available accelerator on each device. The ONNX Runtime mobile packages offer a considerable boost for implementing Android and iOS AI models, optimized for a smaller storage footprint. A list of available packages for different platforms can be found here.

There are considerable advantages to moving AI inference on-device. The latency of uploading inputs to a server and downloading results is eliminated, enabling real-time processing, such as tracking, classifying, and detecting objects through the mobile camera, without any network connectivity. Because all processing happens offline, no extra mobile data charges are incurred, ultimately reducing the cost of using the app for the end user. Data privacy also improves, since data never leaves the device, which is especially important for applications handling sensitive data.

As Microsoft receives developers' feedback, it will continue to update the packages. In the near future, we can expect on-device training packages as well.

References: