Samsung has announced that it’s partnering with NAVER to develop semiconductors that can better handle AI workloads. The partnership will combine Samsung’s semiconductor design capabilities with NAVER’s expertise in AI algorithms and AI services.
Advances in hyperscale AI mean that ever-larger volumes of data need to be processed, and current systems do not handle these workloads well. The two Korean companies are therefore working together to develop semiconductors optimized for artificial intelligence workloads.
“Through our collaboration with NAVER, we will develop cutting-edge semiconductor solutions to solve the memory bottleneck in large-scale AI systems,” said Jinman Han, Executive Vice President of Memory Global Sales & Marketing at Samsung Electronics. “With tailored solutions that reflect the most pressing needs of AI service providers and users, we are committed to broadening our market-leading memory lineup including computational storage, PIM (processing-in-memory) and more, to fully accommodate the ever-increasing scale of data.”
Samsung says it has been developing memory and storage products designed for AI, including SmartSSDs, PIM-enabled high-bandwidth memory, and next-generation memory that supports the Compute Express Link (CXL) interface. By collaborating with NAVER, the company aims to further optimize these technologies. NAVER, for its part, will work on refining HyperCLOVA, its hyperscale language model, and on improving compression algorithms that boost computing efficiency.
NAVER isn"t too popular outside of South Korea, but inside the country, its search engine is popular. People outside of Korea will most likely know NAVER for its popular comic app, Webtoon.