
Apple's Core ML 2 vs. Google ML Kit: What's the difference?



At Apple's Worldwide Developers Conference on Tuesday, the Cupertino company announced Core ML 2, a new version of its machine learning software development kit (SDK) for iOS devices. It's not the only game in town, though: just a few months ago, Google announced ML Kit, a cross-platform AI SDK for iOS and Android devices. Both toolkits are designed to take on the heavy lifting of optimizing large AI models and datasets for mobile apps. So what sets them apart?

Core ML

Apple's Core ML debuted in June 2017 as an easy way for developers to integrate machine learning models into their iOS apps. Core ML 2 is much the same, but more efficient: Apple says it's 30 percent faster thanks to batch prediction, and that quantization can shrink models by as much as 75 percent.
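To make the batch prediction feature concrete, here is a minimal Swift sketch. It assumes a hypothetical image classifier named FlowerClassifier.mlmodel added to an Xcode project (Xcode generates the FlowerClassifier and FlowerClassifierInput classes from it); MLArrayBatchProvider and predictions(fromBatch:) are the Core ML 2 batch APIs.

import CoreML
import CoreVideo

// Sketch of Core ML 2 batch prediction with a hypothetical generated model class.
func classifyBatch(_ frames: [CVPixelBuffer]) throws -> [String] {
    let classifier = FlowerClassifier()

    // Wrap all inputs in a single MLBatchProvider so Core ML can run them
    // as one batch instead of issuing a separate prediction call per frame.
    let inputs = frames.map { FlowerClassifierInput(image: $0) as MLFeatureProvider }
    let batch = MLArrayBatchProvider(array: inputs)

    let outputs = try classifier.model.predictions(fromBatch: batch)

    // Read the predicted class label out of each result.
    return (0..<outputs.count).map { index in
        outputs.features(at: index)
            .featureValue(for: "classLabel")?.stringValue ?? "unknown"
    }
}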

Still, it's not perfect. Unlike Google's ML Kit, it isn't cross-platform (Android is not supported), and it's an on-device-only framework: cloud-hosted models and features like versioning require a third-party service, such as IBM's Watson Studio.

Core ML is restrictive in other ways, too. The latest version supports 16-bit floating point, which can greatly reduce the size of AI models, but models can't be compressed into smaller packages beyond that, and they can't be updated at runtime; trained models are imported into Apple's Xcode development environment and packaged into the app bundle.
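For illustration, here is a short Swift sketch of that packaging model, again using the hypothetical FlowerClassifier name: the app loads the compiled .mlmodelc that Xcode placed in its bundle, so shipping a newer model means shipping a new build.

import CoreML
import Foundation

// Load the compiled model (.mlmodelc) that Xcode packaged into the app bundle.
// "FlowerClassifier" is a placeholder name for whatever model the app ships with.
func loadBundledModel() throws -> MLModel {
    guard let modelURL = Bundle.main.url(forResource: "FlowerClassifier",
                                         withExtension: "mlmodelc") else {
        fatalError("Model missing from the app bundle")
    }
    // The bundle's contents are fixed at install time, so updating this model
    // means submitting a new version of the app.
    return try MLModel(contentsOf: modelURL)
}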

None of that should minimize Core ML's advantages, of course. It ships with four ready-made machine learning models based on popular open source projects, plus a converter that works with Facebook's Caffe and Caffe2, Keras, scikit-learn, XGBoost, LibSVM, and Google's TensorFlow Lite. (Developers can build custom converters for frameworks that aren't supported.) Keeping everything on-device also brings privacy benefits, since apps don't need to send data over a network, and Apple says Core ML is optimized for energy efficiency.

Then there's Create ML. It's a new GPU-accelerated tool for training AI models natively on Mac computers, with support for vision and natural language tasks. And because it's written in Swift, developers can use drag-and-drop programming interfaces like Xcode Playgrounds to train models.
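To give a sense of how compact that is, here is a hedged Create ML sketch for an Xcode Playground on macOS. The directory paths and model name are placeholders, and it assumes the training images are sorted into one sub-folder per class label.

import CreateML
import Foundation

// Train an image classifier from a folder of labeled sub-folders
// (e.g. TrainingImages/rose, TrainingImages/tulip, ...). Paths are placeholders.
let trainingDir = URL(fileURLWithPath: "/Users/dev/TrainingImages")
let classifier = try MLImageClassifier(trainingData: .labeledDirectories(at: trainingDir))

// Export a .mlmodel file that can be dragged straight into an Xcode project.
try classifier.write(to: URL(fileURLWithPath: "/Users/dev/FlowerClassifier.mlmodel"))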

ML Kit

At its I/O 2018 developer conference in May, Google introduced ML Kit, a set of machine learning tools for its Firebase mobile development platform. ML Kit taps the Neural Networks API on Android devices and is designed to compress and optimize machine learning models for mobile.

One major difference between ML Kit and Core ML is support for both on-device and cloud APIs. Unlike Core ML, which can't natively use models that require internet access, ML Kit can lean on Google Cloud Platform's machine learning technology for "improved" accuracy. For example, Google's on-device image labeling service recognizes about 400 labels, while the cloud-based version covers more than 10,000.
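To make the on-device/cloud split concrete, here is a rough Swift sketch against the Firebase ML Kit iOS SDK as it shipped in mid-2018. The detector names (labelDetector, cloudLabelDetector) and the photo parameter are assumptions tied to that SDK generation and may differ in later releases; FirebaseApp.configure() is assumed to have run at launch.

import Firebase
import UIKit

// Label the same photo on-device and in the cloud (Firebase ML Kit, circa 2018).
// Class and method names reflect that SDK version and are not guaranteed current.
func labelPhoto(_ photo: UIImage) {
    let vision = Vision.vision()
    let image = VisionImage(image: photo)

    // On-device labeling: works offline, roughly 400 labels.
    vision.labelDetector().detect(in: image) { labels, error in
        print("On-device labels:", labels ?? [])
    }

    // Cloud labeling: requires a network connection, but draws on 10,000+ labels.
    vision.cloudLabelDetector().detect(in: image) { labels, error in
        print("Cloud labels:", labels ?? [])
    }
}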

ML Kit provides a set of easy-to-use APIs for basic use cases: text recognition, face detection, barcode scanning, image labeling, and landmark recognition. Google says new APIs, including a smart reply API that supports contextual in-app message responses and a high-density face contour addition to the face detection API, will arrive by the end of 2018.

ML Kit doesn't restrict developers to its pre-made models, either. Custom models trained with TensorFlow Lite, Google's lightweight machine learning framework for mobile and offline use, can be deployed with ML Kit through the Firebase console, which serves and manages them dynamically. (Google says it is also working on a compression tool that converts full TensorFlow models into TensorFlow Lite models.) Because developers can decouple machine learning models from their apps and download them at runtime, they can shave megabytes off app install sizes.

Finally, ML Kit works with Firebase features such as A/B Testing, which lets developers dynamically trial different machine learning models with their users, and Cloud Firestore, which can store image labels and other data.

Which is better?

So, which machine learning framework has the upper hand? Neither, really.

Core ML 2 obviously doesn't support Android, and developers already at home with Google's Firebase will probably prefer ML Kit. Likewise, long-time Xcode users will likely gravitate toward Core ML 2.

Perhaps the biggest difference between the two is plug-and-play, first-party support: Google offers a wealth of ready-made machine learning models and APIs to choose from, including APIs for contextual message replies and barcode scanning. Apple, by contrast, is a little more hands-off.

As with many things, choosing between Core ML 2 and ML Kit mostly comes down to personal preference, and to whether the developer in question prefers an end-to-end platform like Firebase or a more piecemeal toolset like Create ML.

