
News

Prof. Jae-Joon Kim of SNU Receives May 2025 ‘Scientist of the Month’ Award

  • Uploaded by

Office of External Relations

  • Upload Date

    2025.05.01

  • Views

    337

- Recognized for the AI Model Compression and Custom Semiconductor Accelerator (Chip) Design
- Supporting Sustainable Growth of the AI Industry with Low-Power AI Technologies

▲ Professor Jae-Joon Kim, Department of Electrical and Computer Engineering, Seoul National University
Seoul National University College of Engineering announced that Professor Jae-Joon Kim of the Department of Electrical and Computer Engineering has been selected as the recipient of the “Scientist of the Month” award for May 2025, jointly presented by the Ministry of Science and ICT (MSIT) and the National Research Foundation of Korea (NRF).

The Scientist of the Month Award honors one researcher each month who has made significant contributions to the advancement of science and technology through outstanding research and development. The recipient is awarded a Ministerial Citation from the Ministry of Science and ICT and a prize of 10 million KRW. The award is supported by the Ministry’s Science and Technology Promotion Fund and Lottery Fund.

According to the MSIT and the NRF, Professor Kim has been recognized for his significant contributions to the compression of artificial intelligence (AI) models and the development of semiconductor accelerators* optimized for the efficient execution of lightweight models. His work lays the groundwork for foundational low-power AI technologies that can be applied across diverse environments.
* Semiconductor accelerator: A semiconductor chip optimized for specific application tasks, rather than general-purpose computation.

With the widespread adoption of large language models (LLMs) such as ChatGPT across industries and daily life, the surge in computing resources and energy consumption has emerged as a pressing societal concern.

In response, researchers are actively pursuing ways to improve AI model efficiency. Two key areas have garnered particular attention: software (S/W)-based compression techniques to reduce AI model size, and hardware (H/W)-based accelerators for efficient computation. However, these two research areas have largely progressed in isolation, often leading to practical challenges, such as degraded processing speed, when deploying compressed models on existing hardware platforms.
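To make the S/W side of this picture concrete, one widely used compression technique is post-training quantization, which shrinks a model by storing its weights in fewer bits. The sketch below is a minimal, generic illustration of that idea in NumPy; it is not the specific compression method developed in this research.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric 8-bit post-training quantization (generic illustration).

    Maps float32 weights onto int8 with a single per-tensor scale,
    shrinking storage 4x (4 bytes -> 1 byte per weight).
    """
    scale = float(np.max(np.abs(weights))) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original float weights."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((256, 256)).astype(np.float32)
q, s = quantize_int8(w)

print(w.nbytes // q.nbytes)  # 4x smaller in memory
# Rounding error per weight is bounded by half a quantization step:
print(float(np.max(np.abs(w - dequantize(q, s)))) <= 0.5 * s)
```

The catch the article describes is exactly here: the compressed int8 weights only pay off in speed and energy if the hardware underneath can actually compute on them efficiently, which is why isolated S/W-only compression can slow down on stock hardware.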

Professor Kim addressed this challenge through integrated research that bridges the software and hardware domains. He developed innovative compression techniques tailored to hardware characteristics, while simultaneously designing custom semiconductor accelerator chips capable of efficiently executing the compressed models.

His research led to a new architecture that supports AI models with variable bit precision* using a single accelerator circuit. Unlike conventional approaches that require separate hardware circuits for each precision level, Professor Kim proposed a novel method of reorganizing the execution sequence. This approach allows simpler arithmetic units to handle multiple precision levels efficiently, streamlining hardware design without compromising flexibility.
* Variable bit precision: A method of adjusting the number of bits used in computation depending on required accuracy or hardware resource constraints.
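A common way a single simple datapath can serve multiple precisions is bit-serial computation: a multi-bit multiply is reorganized into a sequence of 1-bit partial products, so the same adder/shifter handles 2-, 4-, or 8-bit operands just by running for more or fewer cycles. The sketch below illustrates that general idea only; it is not the exact execution-reordering architecture from the research.

```python
def bit_serial_mac(activations, weights, precision):
    """Multiply-accumulate built from 1-bit x full-width partial products.

    Generic bit-serial illustration (not the published circuit): one
    simple add/shift datapath supports any activation precision by
    iterating over bit planes, instead of needing a separate multiplier
    circuit per precision level.
    """
    acc = 0
    for bit in range(precision):              # one "cycle" per activation bit
        # Extract the 1-bit plane of every activation at this bit position.
        plane = [(a >> bit) & 1 for a in activations]
        partial = sum(p * w for p, w in zip(plane, weights))
        acc += partial << bit                 # weight by bit significance
    return acc

# The same function (i.e., the same datapath) serves several precisions:
print(bit_serial_mac([3, 5], [2, 4], precision=4))  # 3*2 + 5*4 = 26
print(bit_serial_mac([3, 5], [2, 4], precision=8))  # same answer, more cycles
```

In hardware terms, trading a few extra cycles for the removal of per-precision multiplier circuits is what "streamlining hardware design without compromising flexibility" looks like at the arithmetic level.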

Additionally, Professor Kim challenged the prevailing notion that accurate AI computation necessitates the use of floating-point* arithmetic units, which are typically area- and power-intensive. He demonstrated that integer-based arithmetic units, which require less area and power, can achieve comparable levels of accuracy. This insight lays a crucial foundation for the next generation of low-power AI semiconductor accelerators.
* Floating point: A representation of real numbers that allows flexible placement of the decimal point, enabling a wide range of numerical expression.
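The integer-versus-floating-point claim can be illustrated numerically: quantize both operands of a dot product to 8-bit integers, accumulate with cheap integer arithmetic, and apply a single rescale at the end. The per-tensor scaling scheme below is an assumed, generic one for illustration, not the specific arithmetic units from the paper.

```python
import numpy as np

# Floating-point reference dot product.
rng = np.random.default_rng(1)
x = rng.standard_normal(1024).astype(np.float32)
w = rng.standard_normal(1024).astype(np.float32)
ref = float(np.dot(x, w))

# Integer path: quantize both operands to 8-bit, accumulate in wide
# integers (small, low-power units in silicon), rescale once at the end.
sx = float(np.max(np.abs(x))) / 127.0   # assumed per-tensor scales
sw = float(np.max(np.abs(w))) / 127.0
qx = np.round(x / sx).astype(np.int32)
qw = np.round(w / sw).astype(np.int32)
approx = int(np.dot(qx, qw)) * sx * sw

# The integer result tracks the float result closely despite avoiding
# floating-point multipliers entirely.
print(abs(approx - ref))
```

The quantization errors of individual products largely cancel when accumulated, which is why integer pipelines can match floating-point accuracy at a fraction of the area and power.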

His findings on variable-precision AI hardware were published in June 2022 in the IEEE Journal of Solid-State Circuits (JSSC), a leading journal in semiconductor circuit design. Follow-up research on low-power AI accelerators was presented in May 2023 at the International Conference on Learning Representations (ICLR), a leading venue for cutting-edge AI research.

Commenting on the recognition, Professor Kim stated, “I will continue my research on ultra-low-power integrated circuit design to support the sustainable advancement of artificial intelligence,” adding, “I hope this work will serve as a core technology enabling AI processing even in power-constrained environments such as mobile devices.”