Interview with Sook-Jung Dofel, Head of AIZ: Artificial Intelligence in Corporate Learning
Artificial intelligence (AI) is a hot topic when it comes to capacity development at GIZ. AI is used to prepare learning content and in the learning process itself. In this interview, Sook-Jung Dofel, Director of the Academy for International Cooperation (AIZ), talks about the opportunities and risks of AI in corporate learning.
Ms Dofel, let’s begin by clarifying the term AI. How would you define it?
When I talk about AI, I’m referring to generative AI, which is a specific form of artificial intelligence based on machine learning. It uses a large dataset to generate new data. A well-known example of generative AI is ChatGPT. This technology is used in various fields, for example to produce text or visual content. It can help you to improve an email or optimise learning content.
So AI can make things easier for people working in corporate learning, or ‘learning experts’. What does that look like in practice?
AI can help people prepare learning materials faster. And if they also use AI as a teaching coach, the content becomes more effective. ChatGPT is very useful for creating additional learning material, such as quiz questions or role-play exercises.
There is one aspect of AI that I find particularly interesting for corporate learning: personalising learning content is the holy grail in this context. Generative AI allows you to create personalised learning experiences by adapting the content, pace and feedback to the needs and learning style of the individual learner. Some schools are already using AI to provide their students with tailored learning content based on self-tests that they complete the day before. Personalised learning content not only boosts motivation, it also promotes a culture of continuous learning by making learning an individual experience. The learning content is geared towards a person’s current level of knowledge. Of all the advantages that AI offers, personalisation is the one that benefits learners the most.
You’ve mentioned a number of plus points. What about the risks of AI in corporate learning?
One risk is certainly that AI can be wrong. ChatGPT is not a search engine: the system does not provide knowledge; it merely calculates probabilities. This can sometimes result in misleading or even false information, because the AI ‘invents’ new content based on these probabilities. It is therefore important that a human performs a final check, particularly with complex and sensitive content.
Data protection is another risk. Personal data and internal information should not be fed into external AI tools. One solution to this problem is to develop your own chatbot. Since 2023, the GIZ bot ‘KIM’ has been linked to a version of ChatGPT that works only within the company, which means that GIZ’s workforce can use AI in a safe environment. ‘KIM’ is part of the company’s IT infrastructure, so all data remains in the GIZ cloud. This protects our data.
A final well-known problem is that AI can reproduce biases. Generative AI is trained on data generated by humans and adopts the stereotypes that data contains. Here, too, the human factor is key: as a learning expert, I need to review the content and ensure that it is representative and inclusive.
Talking about the human factor, some employees are worried that AI will replace their jobs. Is that concern justified?
The way I see it, AI supports human intelligence. It makes improvements, amendments or suggestions, but it needs people who can use it well in order to deliver good results. Only human learning experts know the company’s goals, its culture and the expertise it requires. They are the ones who keep up with new trends in corporate learning and coordinate and agree on content with partners. Empathetic learning experts with a strategic mindset will therefore remain the key ingredient for successful learning.