
[Tech Day 2022] Hyper-intelligence: AI and future experiences

Our devices are becoming more intelligent, processing ever-larger volumes of data better and faster. The development of artificial intelligence (AI) solutions continues to expand what our devices can do and how we interact with them, and Samsung is leading the development of AI solutions integrated into its Exynos processors. At its Tech Day 2022 event, where it shared the latest innovations across its System LSI technologies, Samsung provided updates on the development of AI solutions for Exynos processors.

Importance of NPUs

Sung-Boem Park, Vice President of the Advanced AP Development Team, opened the session with an overview of Neural Processing Unit (NPU) development. Mobile devices already receive large amounts of personal data through video, audio, location services and positioning, and the additional sensors integrated into future devices will improve the quality of the data being gathered. However, the data collected only provides value to the device owner if the device has the intelligence to extract and analyze the useful information.

“An on-device AI accelerator, also known as the NPU, is the main IP that enables on-device intelligence to be applied in real time while preserving privacy,” said Park. “With the current NPU’s power and efficiency, we are already seeing dozens of AI-related applications running on mobile devices.”

Evolution of the NPU in Exynos Processors

In 2019, Samsung revealed the first NPU design in Exynos, which improved efficiency over CPUs and GPUs. Since that first iteration, Samsung has worked to improve both area and power efficiency, with each generation eliminating more wasted computation.
Samsung Exynos' evolution of AI hardware by generation
The newest generation, launched in early 2022, provides three major new features:
  • Scatter-gather helps prevent memory bottlenecks and feeds data effectively to the ALUs (Arithmetic Logic Units).
  • Extreme low power mode allows the NPU to operate without DRAM to support always-on scenarios.
  • Multi-precision ALU adds FP16 (half-precision floating-point) support to the existing INT8 and INT4 support, using a single ALU to achieve greater efficiency and flexibility.
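To make the FP16/INT8/INT4 trade-off concrete, the hypothetical sketch below quantizes the same set of weights to 8-bit and 4-bit integer grids and compares the resulting error. This is an illustration of what multi-precision arithmetic buys (lower precision means less storage and bandwidth per value, at the cost of accuracy), not Exynos code; the quantization scheme and values are invented.

```python
import numpy as np

def quantize(x, bits):
    # Symmetric uniform quantization to a signed integer grid of `bits` bits.
    qmax = 2 ** (bits - 1) - 1
    scale = np.abs(x).max() / qmax
    q = np.round(x / scale).astype(np.int32)
    return q * scale  # dequantized values, for measuring the error

rng = np.random.default_rng(42)
w = rng.standard_normal(1024).astype(np.float32)  # stand-in for FP16 weights

for bits in (8, 4):
    err = np.abs(w - quantize(w, bits)).mean()
    print(f"INT{bits}: mean abs quantization error = {err:.4f}")
```

Dropping from INT8 to INT4 halves the bits moved per weight again, which is why a datapath that can switch precision per workload gives the flexibility the article describes.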
Advancing NPU Technology

“We believe that advancements in NPU technology will continue at three different levels. The first level involves enhancing efficiency at the IP level,” explained Park. “One approach is ensuring the massive number of ALUs is well fed with data and well utilized. Another is making more ALUs available through increased compute density. Work reduction through bit-precision optimization and better gating of unused blocks can also help.”

The power limits of today’s mobile devices mean IPs will soon run at the lowest voltage the process technology supports. This will make it difficult to translate performance improvements into efficiency improvements, which is why other methods are needed to advance NPU technology.

“The second level involves improving system-wide power efficiency. As NPU efficiency improves, other components in the system start to account for a significant share of power. Components that have previously been ignored include the CPU, ISP (Image Signal Processor), NoC (Network-on-Chip), DRAM and the power management IC,” said Park. “Power consumed by these other IPs is generally proportional to the inference rate, and if they don’t improve at the same rate as the NPU, the power budget for the NPU will shrink over time. Minimizing cross-IP data movement and offloading more CPU tasks to more efficient cores are some of the ways to drive improvement at this level.”
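Park's mention of "work reduction" and "gating of unused blocks" can be illustrated by counting the multiply-accumulates (MACs) a layer would skip if zero activations were gated off, a common situation after ReLU. This is a hypothetical counting exercise, not a description of how the Exynos NPU implements gating.

```python
import numpy as np

def dense_macs(acts, weights):
    # Without gating, every activation/weight pair costs one MAC.
    return acts.size * weights.shape[1]

def zero_skipped_macs(acts, weights):
    # A zero activation contributes nothing, so a gating scheme can skip it.
    return np.count_nonzero(acts) * weights.shape[1]

rng = np.random.default_rng(0)
acts = np.maximum(rng.standard_normal((1, 256)), 0)  # ReLU output, ~50% zeros
weights = rng.standard_normal((256, 128))

total = dense_macs(acts, weights)
done = zero_skipped_macs(acts, weights)
print(f"MACs without gating: {total}, with zero-skipping: {done}")
```

With roughly half the activations zeroed by ReLU, roughly half the MACs (and the energy they would burn) can be eliminated, which is the kind of wasted computation each NPU generation has targeted.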
Multiple NPUs are specialized for different purposes.
The third level is specialization: building multiple NPUs, each specialized for a different purpose. This resolves the challenge of efficiently running multiple applications on a single NPU architecture. Taken to the extreme, each application-specific IP could embed its own NPU: for example, an individual NPU in the ISP, GPU, MFC (multi-format codec) and audio blocks. Specialization improves power efficiency at the expense of chip-level area, and Samsung believes multi-die solutions will start being used to address the resulting cost issue.

“In the future we will see extreme power efficiency in processing the wide range of sensor data that’s collected. Only when this capability is ready can applications like AR be fully realized. AR has more sensors and collects more data than mobile, and it has worse power efficiency due to limited battery size,” claimed Park. “However, it’s a space that will provide more personalized experiences. We believe Samsung’s NPU solutions provide the hyper-intelligence needed, and that Exynos processors will be at the center of your personal life.”

Multimedia Technology

Ki-Joon Hong, Vice President of the Multimedia Development Team, focused his session on multimedia core technology for hyper-intelligence and where the technology is heading. Samsung is developing the core brain in SoCs (Systems-on-Chip) for future multimedia systems and machines with human-level intelligence. These devices will use cameras, microphones and speakers to create a sensory system that mimics the human brain’s processing of sight and sound. The goal is to create a realistic virtual world using 3D images and audio, generated by the multimedia cores for use cases in AR, VR, autonomous driving and robotics.
Developing Visual Processing Capabilities

“To help us develop visual processing that mimics the brain activity behind human sight, we’ll use computational photography and high-quality imaging systems. Smart multi-frame processing and deep learning-based motion estimation will enable the recreation of high-quality images and videos,” explained Hong. “Camera systems with integrated deep learning technology will be able to replicate the human brain’s ability to store a lifetime of memories.”
AI visual solution: recognizing what you are perceiving
Samsung AI visual solutions are visual processing capabilities that mimic brain activity for human sight.
Samsung's content-aware ISP solution is an object detection solution that can recognize objects and other details about them.
Samsung has developed content-aware ISP solutions that can recognize objects and other details about them. Through adaptive pixel processing in the ISP chain, regions of the image are recognized and segmented, which can be applied in more complex scenarios such as high-resolution video recording. The object detection solution is already being used in various applications, as the computational power of NPUs and GPUs in mobile processors has enabled huge advances in computer vision and deep learning algorithms.

“Our brains are constantly sensing objects and gestures, triggering appropriate actions in response. In many ways, humans and computers respond similarly to sensory inputs. Implanting these processes into robot vision systems for use in autonomous driving could be the final step in bridging this connection,” said Hong. “In extended reality, robot systems and future smart car technology, we believe AI visual systems will be at the core of all multimedia technology innovation.”

Creating AI Audio Solutions

Hong also introduced Samsung’s AI-based audio technology, which makes machines hear and speak as humans do. “Audio is the simplest form of communication, as we send information through speech and receive it by listening,” shared Hong. “Speech enhancement algorithms and speech recognition technology based on machine learning help make conversations over the phone clearer and more natural sounding. They’re also allowing us to communicate with machines that sound human.”

Audio consumption is shifting from mass-audience formats, like those used for movies, to more intimate experiences geared toward individuals or small groups of users. “Specialized SoCs for applications like wireless earbuds provide richer sound experiences, and capabilities like active noise cancellation allow users to create their own personal acoustic space and experience,” said Hong.
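The idea behind content-aware pixel processing, applying different processing to different recognized regions, can be sketched as follows. This is a simplified, hypothetical illustration: the class labels, the segmentation mask and the box-blur "denoiser" are all invented, and a real ISP chain works on raw sensor data with far more sophisticated filters.

```python
import numpy as np

SKY, FOREGROUND = 0, 1  # invented class IDs for illustration

def adaptive_denoise(image, mask):
    """Apply heavy smoothing only where the segmentation mask says 'sky'."""
    out = image.astype(np.float32).copy()
    # Simple 5-point box blur (wraps at the borders via np.roll; fine for a demo).
    blurred = (np.roll(out, 1, 0) + np.roll(out, -1, 0) +
               np.roll(out, 1, 1) + np.roll(out, -1, 1) + out) / 5.0
    out[mask == SKY] = blurred[mask == SKY]  # foreground pixels stay untouched
    return out

image = np.random.default_rng(1).integers(0, 256, (8, 8)).astype(np.float32)
mask = np.zeros((8, 8), dtype=np.int32)
mask[4:, :] = FOREGROUND  # bottom half foreground, top half sky

result = adaptive_denoise(image, mask)
```

In a content-aware pipeline the mask would come from an NPU-run segmentation network, letting the ISP spend its quality budget where it matters, for example preserving detail on faces while smoothing noisy sky.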
“We believe the next big step will be creating true-to-life acoustic environments through mobile devices.”

Combining SoCs with AI Solutions

To bring these technologies to life, the challenge is configuring all of these advanced technologies into one package. Samsung has powerful computation units that can process complex algorithms over huge amounts of data, and its SoCs provide low-latency, low-power processing with connected sensors. Combining these capabilities with AI solutions will create exciting new multimedia experiences for users of Exynos processors.
The combination of Samsung's SoC and AI solutions will provide a new multimedia experience for Exynos processor users.
