In the classic AI arrangement, the cloud server that processes the data serves as the brain, while the smartphone or tablet PC serves as the ears and eyes, interfacing with the user to gather data. Under this scheme, however, the AI itself is far away from the site where the data is generated, which limits its ability to understand what is actually going on.
With on-device AI, by contrast, the edge devices that users interact with directly incorporate the AI themselves. The AI processes data gathered from environments it experiences firsthand, and can make decisions better suited to the user.
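The architectural difference described above can be sketched in a few lines of code. This is an illustrative simulation only: `classify` stands in for any AI model, and the network delay is a hypothetical round-trip cost, not a measured figure.

```python
import time

def classify(sensor_reading: float) -> str:
    """Toy stand-in for an AI model: labels a reading as high or low."""
    return "high" if sensor_reading > 0.5 else "low"

def cloud_inference(sensor_reading: float, network_delay: float = 0.2) -> str:
    """Classic arrangement: data travels to a remote server and back."""
    time.sleep(network_delay)          # upload to the cloud "brain"
    label = classify(sensor_reading)   # server-side processing
    time.sleep(network_delay)          # download the result
    return label

def on_device_inference(sensor_reading: float) -> str:
    """On-device AI: the model runs where the data is generated."""
    return classify(sensor_reading)

start = time.perf_counter()
cloud_label = cloud_inference(0.8)
cloud_time = time.perf_counter() - start

start = time.perf_counter()
device_label = on_device_inference(0.8)
device_time = time.perf_counter() - start

assert cloud_label == device_label  # same answer, different latency
print(f"cloud: {cloud_time:.2f}s, on-device: {device_time:.4f}s")
```

Both paths produce the same answer; what changes is that the on-device path skips the round trip entirely, which is what allows the AI to respond immediately to what it observes firsthand.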
How on-device AI will change our lives
Samsung Electronics is deeply involved in research on innovative technologies such as deep-learning algorithms and on-device AI. At the “Samsung AI Forum 2019” held in November, the company unveiled “on-device AI interpretation” technology, which performs interpretation on the device itself without the need for a central server.
Samsung’s Exynos 9 (9820) premium mobile application processor (AP) is a prime example of on-device AI. The processor incorporates a neural processing unit (NPU), a system semiconductor able to handle a variety of computations in real time without delay. This enables AI processing up to seven times faster, entirely self-contained on the mobile device. Samsung Electronics aims to further boost NPU performance to support performance-hungry on-device AI applications.
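One reason NPUs can run neural networks so quickly is that they typically compute with low-precision integers rather than full-precision floats. The sketch below illustrates the standard affine (scale and zero-point) 8-bit quantization scheme commonly used for such hardware; it is a generic illustration and does not reflect the Exynos NPU's actual internals.

```python
# Generic 8-bit quantization sketch: q = round(x / scale) + zero_point.
# NPUs accelerate the integer arithmetic in the middle of this pipeline.

def quantize(x: float, scale: float, zero_point: int) -> int:
    """Map a float to a signed 8-bit integer."""
    q = round(x / scale) + zero_point
    return max(-128, min(127, q))  # clamp to the int8 range

def dequantize(q: int, scale: float, zero_point: int) -> float:
    """Recover an approximate float from the 8-bit representation."""
    return (q - zero_point) * scale

scale, zero_point = 0.05, 0
weights = [0.8, -0.35, 0.1]
inputs = [1.0, 0.5, -0.2]

# Full-precision dot product (what a CPU/GPU might compute)
float_result = sum(w * x for w, x in zip(weights, inputs))

# Integer dot product (the kind of arithmetic an NPU accelerates)
q_weights = [quantize(w, scale, zero_point) for w in weights]
q_inputs = [quantize(x, scale, zero_point) for x in inputs]
int_accum = sum(qw * qx for qw, qx in zip(q_weights, q_inputs))
quant_result = int_accum * scale * scale  # rescale back to float range

# The low-precision result closely tracks the full-precision one
assert abs(float_result - quant_result) < 0.05
```

The key design trade-off: integer multiplies are cheaper and more parallelizable than float multiplies, and for typical neural-network workloads the small loss of precision barely affects the output, which is why this approach suits battery-powered mobile devices.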
Progress in on-device AI technology will allow devices to build a better picture of users' diets and lifestyle patterns, such as exercise habits, leading to further improvements in individually tailored services. We look forward to the many positive changes that user-centered on-device AI technology will bring to our daily lives.