Joong-il Rho, CEO of Visang Education’s Global Company
Last month, the Ministry of Education released the 'Guidelines and Selection Criteria for Learning Support Software' to assist school operation committees in their deliberations. The news hit the education industry like a bolt from the blue. While meeting the guidelines is considered a minimum qualification for school adoption, critics point out that the personal information protection standards, included as mandatory compliance items, are excessively ambiguous.

Concerns are mounting that this will hinder the development of the education industry and further exacerbate the "tilted playing field" that already favors foreign products. Money Today Network (MTN) met with Joong-il Rho, CEO of Visang Education’s Global Company, who has been a consistent voice in the AI and digital education sectors, to discuss these issues.


Q: There is a growing controversy regarding the 'Learning Support Software Selection Criteria' announced by the Ministry of Education.

Joong-il Rho: Educational content and data are absolute necessities for training AI, and such elements should be exempted and kept outside the scope of strict regulation. However, the current selection criteria apply the Personal Information Protection Act in its entirety. This structure treats educational technology as a risk factor simply because it involves AI, which means the most educationally useful tools are, ironically, placed at the greatest disadvantage.

The current guidelines are a perfect example of the proverb "refraining from making soybean paste for fear of maggots" (sacrificing a great benefit for fear of a small risk). By being excessively afraid of the minor risk to personal information, we are abandoning the far greater educational value of AI-based personalized learning. The key is not to block technology, but to manage data so that it can be used safely for educational purposes. The core issue is whether data is used with "educational intent," but the guidelines fail to capture this distinction. While President Lee Jae-myung has declared a vision of becoming an AI superpower, the Ministry of Education is moving in exactly the opposite direction on crucial educational AI.

Q: The industry is particularly taking issue with the phrase "minimum data collection."

Joong-il Rho: The problem is that the Ministry of Education, which should only be defining operating principles and governance, is interfering with such minute details. When asked what the "minimum unit" is, how far data can be used, or whether voice recognition and eye-tracking data are allowed, the Ministry cannot provide a single answer.

However, the concept of "minimum information" does not exist in educational AI. AI is a system that learns through data. For example, Boston Dynamics' physical AI walks, jumps, and balances like a human. To learn this, it collects data on joint angles, torque, sole pressure, inertial measurement units (IMU), camera footage, and records of collisions or slips. If even one of these is missing, the robot falls and fails to learn.

Educational AI is the same. Logs such as a student's utterances, errors, reaction times, repetition counts, and hesitations are the "sensory data" for learning, equivalent to the joints and sensors of an AI robot. Telling us to "collect only the minimum" is like asking a robot to walk and run without eyes or a sense of balance. If that happens, AI cannot become "smart," and Korea's educational AI will inevitably fall behind in the global competition.

Q: Education companies seem confused about whether they are even subject to these guidelines.

Joong-il Rho: This is because the guidelines regulate "pieces of technology" rather than the "educational acts occurring on a platform." Today’s educational system is a structure where multiple technologies are integrated onto a single platform. There is a Learning Management System (LMS), video classes, and AI tutors. Learning logs are collected, and personalization algorithms operate on top of them. This is not just multiple pieces of software; it is a single data-based learning platform.

Yet, the current guidelines suggest that video classes are fine, LMS is ambiguous, AI tutors are risky, and log analysis is regulated. This is like saying cars are allowed, but engines and brakes are dangerous and must be approved separately. Learning effectiveness comes from the integrated structure of a platform, not individual technologies. If only certain technologies are regulated, the entire platform stops working, and no one on the ground can judge what is subject to regulation. The current confusion is not a problem for schools or companies; it is a failure of regulatory design that ignores the platform era.

Q: Student personal information must be protected, especially after recent hacking incidents. Do you believe there is an alternative approach to the Personal Information Protection Act?

Joong-il Rho: The world has already moved beyond simple personal information protection to educational data protection and AI algorithm governance. Korea, however, still treats all learning data under a single personal information law. This does not even align with the UN's Sustainable Development Goals (SDGs). What is needed is a governance system, not mere law enforcement. Unless technology companies, schools, and the government create the rules together, the law will always lag behind the technology. Educational data should be viewed as a public resource that improves learning outcomes and creates fair opportunities, rather than as an object of surveillance.

Q: How do the US and UK regulate this?

Joong-il Rho: The US and UK clearly permit the use of data for educational purposes. On the other hand, Europe protects platform sovereignty alongside personal information through the General Data Protection Regulation (GDPR). The problem lies with Korea. We claim to want innovation like the US, but apply regulations like Europe. This is a policy of stepping on the accelerator and the brake at the same time. The result is stagnation. If we want to be an AI powerhouse, our data utilization strategy must match that ambition.

Q: How do you evaluate the Ministry of Education’s measures to foster the EdTech industry?

Joong-il Rho: Rather than fostering the industry, the Ministry's EdTech policy has been closer to a repeated collapse of the ecosystem. Through projects like digital textbooks and AIDT (AI digital textbooks), private companies have lost access to platforms and data. A bigger problem is the lack of accountability and expertise in policymaking. Continuity and long-term strategy are key in the AI industry, but the current structure, in which the officials in charge change every few months, cannot provide them. If they don't understand it, they shouldn't regulate it. Regulations born of misunderstanding quietly suffocate an industry.

Q: There are concerns that the guidelines might further promote a "tilted playing field" for foreign companies.

Joong-il Rho: If domestic data usage is regulated, we will have no choice but to rely on overseas APIs in the future. We will be forced to use models trained by foreign companies according to their standards. Those models will likely already include our children's data. We could end up in a situation where we have to pay to buy back analysis reports made from our own children's data.

Q: What do you want from the government?

Joong-il Rho: Please do not control educational data; instead, protect the utilization of data for educational purposes. AI education is not built by regulation. What is needed is not regulation, but private-public-academic governance. Relaxing regulations for real-world testing, creating data norms dedicated to educational AI, linking domestic platforms, and establishing a permanent consultative body—with just these four elements, Korea can grow AI education through trust and cooperation. EdTech is not a target for regulation; it is a national learning infrastructure that must be nurtured together.


Source: Seok-jin Yoon, Reporter at Money Today Network (MTN)