Intel first launched its "Meteor Lake" based Core Ultra chips in December 2023. They include an integrated NPU (Neural Processing Unit) to help hardware-accelerate AI applications. Today, Microsoft announced it has released a developer preview of its DirectML API that adds support for NPUs, specifically the ones in Intel Core Ultra CPUs.
In a blog post, Microsoft said the preview release of DirectML 1.13.1, used with ONNX Runtime 1.17, supports NPU acceleration on PCs with Intel Core Ultra CPUs. The company worked not just with Intel but also with Samsung to help boost the performance of AI apps that use DirectML.
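For developers curious what targeting this preview might look like in practice, here is a minimal sketch of pointing an ONNX Runtime inference session at a DirectML device. The model file name and the device index are placeholders, and which index maps to the NPU (as opposed to the GPU) will vary by machine; this is not Microsoft's official sample code.

```python
# Minimal sketch (assumptions: the onnxruntime-directml package is installed,
# a local "model.onnx" file exists, and the Core Ultra NPU is exposed as
# DirectML device 1 -- actual device indices vary by system).
import numpy as np
import onnxruntime as ort

# Ask ONNX Runtime to use the DirectML execution provider. The "device_id"
# option selects which DirectML device (GPU or NPU) runs the model.
session = ort.InferenceSession(
    "model.onnx",
    providers=[("DmlExecutionProvider", {"device_id": 1})],
)

# Run one inference with dummy input matching the model's first input shape,
# substituting 1 for any dynamic dimensions.
input_meta = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in input_meta.shape]
dummy = np.random.rand(*shape).astype(np.float32)
outputs = session.run(None, {input_meta.name: dummy})
print("Output shapes:", [o.shape for o in outputs])
```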
In a separate blog post on this development, Intel says Samsung used both Microsoft's DirectML and Intel's AI Boost technology in its recently released Samsung Galaxy Book 4 notebook with Intel Core Ultra chips. Intel said that Samsung offloaded the face and object classification features of its Gallery Windows app onto the Intel Core Ultra CPU's NPU using Microsoft's DirectML API, and claims that this resulted in a faster development time for the project.
Since this is a preview release, Microsoft notes a few limitations of this version of DirectML on Intel Core Ultra-based PCs:
- NPU support in DirectML is currently only compatible with a subset of machine learning models that have been targeted for support. Some models may not run at all or may have high latency or low accuracy. We are working to improve the compatibility and performance of more models in the future.
- Models that are functional and were not selected for this developer preview may have some stability and reliability issues. We are continuing to expand coverage for additional models suitable for NPU.
Developers can provide feedback and post any issues they might encounter with this preview release at the DirectML GitHub page.