Cerebras Systems has launched Cerebras Inference, which the company claims is the fastest AI inference solution available, running up to 20x faster than NVIDIA GPU-based offerings at 100x better price-performance.
Google has announced the Android ML Platform. Rolling out this year, it aims to make on-device inference easier by offering a consistent API and deeper integration with the OS, while keeping app dependencies to a minimum.
Japanese software development firm OKI IDS will combine its expertise with Mipsology's FPGA-based acceleration technology to bring high-quality machine learning applications to the Japanese market.