As part of the Build 2024 developer conference, Microsoft officially announced a new member of its Phi-3 family of AI small language models: the multimodal Phi-3-vision. However, there is another new Phi SLM as well, one that ties in to the company's big Copilot+ PC reveal from Monday.
Microsoft CEO Satya Nadella revealed the new SLM, Phi Silica, as part of the company's Windows Copilot Library. It is a 3.3 billion parameter model, which makes it the smallest member of the Phi family of AI models.
Microsoft added:
It is optimized to run on the NPUs in Copilot+ PCs, bringing local inferencing and achieving first-token latency performance. Developers can access the Phi Silica API and deliver user experiences across the Windows ecosystem.
The new on-device Phi Silica SLM is part of Microsoft's latest Windows Copilot Library, a set of APIs powered by over 40 on-device models that will ship with Windows. Microsoft added:
Windows Copilot Library provides a rich set of functionalities that will enhance developers’ ability to take advantage of AI in Windows. Developers will be able to access Studio Effects, Live captions translations, Phi Silica, OCR and Recall User Activity APIs as part of the Windows App SDK release in June. More APIs like Text Summarization, Vector Embeddings and RAG API will be coming later.
As reported yesterday, Copilot+ PC is the overall brand name Microsoft is using for Windows PCs that include a dedicated neural processing unit (NPU) made specifically to run AI applications like Copilot and Phi Silica.
The first Copilot+ PCs are scheduled to ship sometime in mid-June and will have Qualcomm's Arm-based Snapdragon X Elite and Plus chips inside. Microsoft and many other major PC makers will offer these kinds of laptops this summer. Intel also plans to offer its own processor for Copilot+ PCs, currently code-named Lunar Lake, sometime in the third quarter of 2024.