Editor's note: K-VIBE invites experts from various K-culture sectors to share their extraordinary discoveries about Korean culture.
Matthew Lim's AI Innovation Story: The Need for an Open-Source AI Strategy
By Matthew Lim, AI expert and director of the Korean Association of AI Management (Former Head of Digital Strategy Research at Shinhan DS)
In the movie "The Matrix," the protagonist, Neo, is faced with the shocking truth that the world he lives in is a simulation. He finds himself at a crossroads with two choices:
Should he take the red pill and confront the uncomfortable truth, or should he take the blue pill and continue to live in a comfortable illusion?
The direction of South Korea's AI strategy today seems to be at a similar crossroads.
Choosing to build on open-source resources is the "red pill": an uncomfortable path, but one that faces the truth.
Recently, the government announced an active plan to secure GPUs (Graphics Processing Units) to strengthen the country's AI competitiveness. According to local media reports, the government plans to secure 18,000 high-performance GPUs by the first half of next year.
The plan is to prioritize securing 10,000 units by the end of this year, with an additional 8,000 units to be introduced in the first half of next year.
These GPU resources will be used to develop large language models (LLMs) at the world's highest level. The plan, moved up four years from its original goal, is a response to the rapidly changing AI environment, including the emergence of the Chinese startup DeepSeek.
While it may seem a little late, it is nonetheless good news. However, simply securing hardware is not enough.
How to allocate and utilize these resources efficiently has become an even more important task.
The key question now is: How can we maximize the use of these limited resources through the right strategy?
◇ Focusing on "Usage" Rather Than "Creation"
It is time to free ourselves from the pressure of "immediately creating a Korean version of ChatGPT."
In the fierce competition among super-large AI models, it is realistically difficult for South Korea to catch up with leading companies such as OpenAI, Google, or Anthropic. But that does not mean we should give up.
The true value of generative AI lies not in the technology itself, but in how it is integrated into industries to increase efficiency or create new markets.
While cloud computing infrastructure is led by companies like Amazon and Microsoft, innovative services like Netflix and Slack have successfully built business models by utilizing this infrastructure.
AI is no different. Open-source models like Meta's LLaMA, Mistral, and Stable Diffusion already boast high performance. By fine-tuning and applying them to South Korea’s industrial needs, we can achieve remarkable productivity improvements and revenue generation.
Open-source AI models are free for anyone to use, but if you have the expertise and data to optimize them for specific tasks, you can secure a competitive advantage. South Korea already has world-class competitiveness in various industries such as manufacturing, finance, and healthcare.
What if we combine this domain knowledge and data with AI?
For example, specialized solutions for manufacturing facility diagnostics and quality control, customer-tailored consultations in finance, and medical image analysis could be quickly implemented based on open-source LLMs.
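To make this concrete, the following is a minimal sketch of what such domain adaptation can look like in practice, assuming the widely used Hugging Face stack (transformers, peft, datasets). The checkpoint name, the data file "factory_qa.jsonl," and the hyperparameters are illustrative placeholders, not details from this column.

```python
# A minimal sketch of adapting an open-source LLM to a domestic industrial task
# with LoRA. Assumes the Hugging Face stack (transformers, peft, datasets);
# the checkpoint name, data file, and hyperparameters are illustrative only.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

base_model = "mistralai/Mistral-7B-v0.1"          # any open checkpoint
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base_model, device_map="auto")

# LoRA adds small trainable matrices while the base weights stay frozen,
# which is why a modest amount of GPU capacity is enough for domain tuning.
model = get_peft_model(model, LoraConfig(
    r=16, lora_alpha=32, target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM"))

# Hypothetical domain corpus: one text record per line, e.g. quality-control
# reports or customer-consultation transcripts, in Korean.
data = load_dataset("json", data_files="factory_qa.jsonl")["train"]
data = data.map(lambda r: tokenizer(r["text"], truncation=True, max_length=512))

Trainer(
    model=model,
    args=TrainingArguments(output_dir="domain-lora", num_train_epochs=1,
                           per_device_train_batch_size=2, learning_rate=2e-4),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()
```

The point of the sketch is where the value sits: the heavy computation has already been paid for in the open checkpoint, while the Korea-specific advantage is concentrated in the domain data and the tuning step.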
Globally, many successful cases of utilizing open-source LLMs and generative AI have already emerged.
One of the most notable examples is Hugging Face, whose open-source AI platform has made high-performance open-source LLMs like Meta's LLaMA and Mistral easily accessible to anyone. The services built around that platform have earned the company a valuation of over $1 billion.
▲ Hugging Face, captured from the firm's website. (Yonhap)
In the business realm, global companies are increasingly using small LLMs (sLLMs), such as Mistral's Mistral 7B, to develop solutions specialized for their internal data. These models are gaining attention because they can deliver high performance, optimized for a specific domain, with relatively few resources.
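As a rough illustration of that "fewer resources" point, a 7-billion-parameter open model can be loaded in 4-bit quantization and queried against internal documents on a single GPU. The checkpoint name and the quality-control prompt below are assumptions for the sketch, not details reported here.

```python
# Minimal sketch: running a small open LLM (sLLM) locally over internal data.
# 4-bit quantization lets a 7B model fit on one commodity GPU; the model name
# and the prompt are placeholders.
import torch
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          BitsAndBytesConfig, pipeline)

model_id = "mistralai/Mistral-7B-Instruct-v0.2"
quant = BitsAndBytesConfig(load_in_4bit=True,
                           bnb_4bit_compute_dtype=torch.float16)

tok = AutoTokenizer.from_pretrained(model_id)
llm = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto",
                                           quantization_config=quant)
ask = pipeline("text-generation", model=llm, tokenizer=tok)

# Hypothetical internal task: summarizing a production-line QC log that
# never leaves the company's own infrastructure.
prompt = ("[INST] Summarize the defect types and their counts in the "
          "following quality-control log, in Korean:\n{qc_log}\n[/INST]")
print(ask(prompt.format(qc_log="..."), max_new_tokens=200)[0]["generated_text"])
```

Because everything runs on infrastructure the company itself controls, sensitive internal data never has to leave the premises, which is part of why these small models are attractive.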
According to a 2023 domestic AI adoption survey by Samsung SDS, while many companies (76.7%) prefer commercial solutions, open-source (11.6%) and self-developed (8.2%) solutions are also in use, indicating that diverse approaches coexist.
In particular, a recent report by Samil Accounting Corporation, "The Current State of Business Using Generative AI," reveals that generative AI is being widely utilized in traditional industries such as finance, manufacturing, and healthcare to improve customer service, automate tasks, and analyze data. The productivity and competitiveness of companies that have adopted AI have also significantly improved.
◇ Success Factors: Data Processing Capabilities and Quick Execution
The key to utilizing open-source models boils down to two factors: securing domain-specific data, and the speed with which these models are applied in the field.
First, there is a need to build "Korean-style data." We must standardize and share field-centric datasets, such as production data from domestic manufacturing, financial transaction patterns, and medical images, through a shared platform. It is also important to digitize the knowledge and experience of experts in various fields, such as doctors, teachers, and lawyers. This could become a unique competitive edge that foreign companies cannot easily replicate.
Second, there needs to be a domestic integrated platform where small and medium-sized enterprises (SMEs) and startups can easily use large language models (LLMs). We must closely examine the national AI computing centers and national supercomputing centers currently under construction by the government. Based on this infrastructure, solutions should be provided in collaboration with domestic cloud services such as Naver Cloud, KT Cloud, and NHN Cloud, enabling easy tuning and deployment of open-source LLMs.
In terms of data sovereignty and cost-effectiveness, prioritizing domestic infrastructure over global cloud providers is a preferable strategy. Creating an environment where domestic companies can experiment and apply generative AI without heavy burdens is essential.
Finally, workforce retraining is urgent. Training programs must be expanded so that professionals outside software development, across every industry, can use generative AI tools. Practical education in skills like prompt engineering is critical if non-experts are to exploit the full potential of LLMs.
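As a small example of what such practical education covers, prompt engineering is largely about giving a model a role, a concrete task, and explicit constraints. The bank scenario below is hypothetical.

```python
# Illustrative only: the kind of structured prompt a non-developer learns to
# write in a prompt-engineering course. The scenario is hypothetical.
prompt_template = """You are an assistant for a retail-bank loan officer.
Task: Explain the attached mortgage terms to a first-time home buyer.
Constraints:
- Write in plain Korean with no financial jargon.
- Keep the answer under 200 words.
- Finish by listing the two fees customers most often overlook.

Document:
{contract_text}
"""

# The same template can be filled in and sent to any of the open models
# discussed above.
print(prompt_template.format(contract_text="(contract excerpt goes here)"))
```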
Open-source AI utilization strategies can unfold through the simultaneous participation of various actors. Domestic models such as LG's ExaOne and Upstage's Solar, as well as international open-source models like China's DeepSeek, Meta's LLaMA, and Mistral, should be integrated into industry fields to develop innovative solutions.
Notably, DeepSeek has been evaluated at 82 percent on complex multi-question tests, ahead of or on par with the top U.S. models. This high-performance model needs to be optimized for South Korea's industrial environment, and that work must be pursued simultaneously across large corporations, SMEs, startups, and public institutions.
The data and know-how accumulated through this broad application should, in turn, accelerate the development of domain-specific models tailored to South Korea's industrial environment.
In this process, the government’s role should be more focused on supporting innovation through infrastructure and regulatory sandboxes rather than directly developing models. Expanding platforms like the AI Hub operated by the Ministry of Science and ICT to support not only dataset sharing but also the tuning and deployment of open-source models will be vital. Creating environments where businesses can freely innovate, such as through the designation of medical and legal field test zones, is crucial.
According to research from Epoch AI, a U.S.-based private research group, South Korea has developed 11 large language models at the level of ChatGPT, ranking third after the U.S. and China. Recently, LG's ExaOne 3.5 model was also listed as a "Notable AI" by Epoch AI, an encouraging sign of South Korea's steady progress in AI technology.
◇ AI Dominance Belongs to the ‘Quick Adopters’
For South Korea to achieve meaningful results in the AI competition, we must prioritize "technology utilization" over stubbornly insisting on "technological independence" in order to revolutionize productivity in the field. As open-source success stories accumulate in areas where South Korea has strengths—such as manufacturing, finance, and healthcare—the domestic AI ecosystem will naturally grow.
Just as the semiconductor industry secured hardware competitiveness, in the AI era we must combine data and domain knowledge to create differentiated AI solutions. In particular, if we develop Korea-specific AI solutions for areas like smart factories in manufacturing, or for regulated industries like finance and healthcare, we will be able to secure competitiveness in the global market.
Now is the time to focus on repeating small experiments with entrepreneurial spirit rather than grandiose visions. Rather than exhausting the GPU resources secured by the government to develop just one or two super-large AI models, we hope these resources will serve as the foundation for the creation of hundreds of innovative, industry-specific AI solutions.
That is the shortcut for South Korea to truly build competitiveness in the AI era.
(C) Yonhap News Agency. All Rights Reserved