VeriSilicon’s Ultra-Low Energy NPU Provides Over 40 TOPS for On-Device LLM Inference in Mobile Applications

The energy-efficient architecture scales across AI-enabled devices, including AI phones and AI PCs
Date: 2025-06-21

SHANGHAI -- VeriSilicon (688521.SH) announced that its ultra-low energy, high-performance Neural Network Processing Unit (NPU) IP now supports on-device inference of large language models (LLMs), with AI computing performance scaling beyond 40 TOPS. The energy-efficient NPU architecture is designed to meet the growing demand for generative AI capabilities on mobile platforms. It delivers powerful computing performance for AI PCs and other end devices while also meeting the increasingly stringent energy efficiency requirements of AI phones and other mobile platforms.

Built on a highly configurable and scalable architecture, VeriSilicon’s ultra-low energy NPU IP supports mixed-precision computation, advanced sparsity optimization, and parallel processing. Its design incorporates efficient memory management and sparsity-aware acceleration, which reduce computational overhead and latency, ensuring smooth and responsive AI processing. It supports hundreds of AI algorithms including AI-NR and AI-SR, and leading AI models such as Stable Diffusion and LLaMA-7B. Moreover, it can be seamlessly integrated with VeriSilicon’s other processing IPs to enable heterogeneous computing, empowering SoC designers to develop comprehensive AI solutions that meet diverse application needs.
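VeriSilicon's own toolchain is not described in the announcement, so the following is only a minimal sketch, in generic PyTorch, of the kind of model preparation that mixed-precision, sparsity-aware hardware of this sort typically exploits: pruning weights to introduce sparsity and quantizing linear layers to int8. The model, layer sizes, and 50% pruning ratio are illustrative assumptions, not VeriSilicon specifics.

```python
# Illustrative sketch only: VeriSilicon's NPU toolchain is not public, so this
# uses generic PyTorch pruning and dynamic quantization to show the kind of
# mixed-precision / sparse model an NPU of this class is built to accelerate.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical toy model standing in for a real workload.
model = nn.Sequential(
    nn.Linear(512, 1024),
    nn.ReLU(),
    nn.Linear(1024, 512),
)

# Introduce unstructured sparsity: zero out 50% of the smallest-magnitude
# weights, the kind of pattern a sparsity-aware accelerator can skip at run time.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")  # make the sparsity permanent

# Mixed precision: quantize Linear layers to int8 while the rest stays in float.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

print(quantized)
```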

VeriSilicon’s ultra-low energy NPU IP also supports popular AI frameworks such as TensorFlow Lite, ONNX, and PyTorch, thereby accelerating deployment and simplifying integration for customers across various AI use cases.
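As a rough illustration of that framework support, the sketch below exports a small PyTorch model to ONNX, one of the interchange formats named above. The model, file name, and opset version are assumptions for the example; compiling the resulting ONNX graph for the NPU itself would rely on vendor tooling that the announcement does not detail.

```python
# Illustrative sketch only: the announcement names ONNX among the supported
# frameworks, so this shows a standard PyTorch-to-ONNX export. The subsequent
# NPU compilation step would use vendor tooling not shown here.
import torch
import torch.nn as nn

# Hypothetical toy model and input shape.
model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU()).eval()
example_input = torch.randn(1, 3, 224, 224)

torch.onnx.export(
    model,
    example_input,
    "model.onnx",          # hypothetical output path
    input_names=["input"],
    output_names=["output"],
    opset_version=17,
)
```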

“Mobile devices, such as smartphones, are evolving into personal AI servers. With the rapid advancement of AIGC and multi-modal LLM technologies, the demand for AI computing is growing exponentially and becoming a key differentiator in mobile products,” said Weijin Dai, Chief Strategy Officer, Executive Vice President, and General Manager of the IP Division at VeriSilicon. “One of the most critical challenges in supporting such high AI computing workloads is energy consumption control. VeriSilicon has been continuously investing in ultra-low energy NPU development for AI phones and AI PCs. Through close collaboration with leading SoC partners, we are excited to see that our technology has been realized in silicon for next-generation AI phones and AI PCs.”


