Microsoft, OpenAI, and Anthropic are investing millions to train teachers on how to harness AI in the classroom. Rather than simply developing new tools, these companies recognise that the real change happens through teachers, who are on the front lines every day, shaping how students learn and grow.
We wanted to get the perspective of Volodymyr Getmanskyi, Head of the Artificial Intelligence Office, on:
This conversation sheds light on the future of teaching and learning — a future where technology supports, but doesn't replace, the essential role of educators.
Modern educational products lose their effectiveness very quickly. ChatGPT and similar models can already solve standard school assignments, and students actively use them. This means that completing educational activities no longer guarantees learning: there is always the temptation to simply 'ask AI.'
In this context, it's crucial for teachers to understand how AI works, how it generates answers, how to spot plagiarism, and how to evaluate the quality of a student's work. This helps maintain control over the learning process and adapt teaching to new realities.
Moreover, a teacher who uses AI effectively becomes a role model for others. That's why investing in people, not just products, is such a strategic choice.
Data protection is a balance between the technical safeguards provided by developers and the responsibility of the users themselves. Most data leaks happen not because of system breaches, but because of irresponsible use of the technology: users share sensitive information without thinking it through.
First, we need to understand what information is truly sensitive. Basic details like a name or date of birth aren’t that critical, but personal matters, such as family issues or health conditions, should never be shared with AI tools.
Second, most modern systems already have basic protections in place, so the main risk lies with the user. If teachers and students avoid oversharing, the chance of leaks is minimal. And finally, we shouldn’t overestimate AI: it’s not a family member, friend, or therapist. It’s just a tool.
For me, the biggest risk is the decline of students’ cognitive abilities. I see AI increasingly used not just as a support tool, but as a complete replacement for thinking, solving problems, writing texts, and even forming opinions.
That’s a dangerous trend. If children get used to receiving ready-made answers without truly understanding, we risk raising a generation unable to analyse, think critically, or focus for long periods. This isn’t just an education problem — it threatens the future of our economy, science, and culture. How will we train analysts, researchers, or creators if basic thinking skills disappear?
Absolutely, the role of the teacher will change significantly. AI is gradually taking over traditional functions like providing information, helping solve problems, and preparing materials. Teachers are no longer expected to know everything by heart. Instead, their focus shifts to organising the learning process, guiding students, explaining concepts, and sparking interest.
In this new context, teachers become mentors, facilitators, and sources of emotional and social support. They notice ‘weak signals’ — changes in mood, loss of motivation, emotional burnout. This requires sensitivity, trust, and flexible thinking — qualities AI still lacks.
Theoretically, we may see avatars or holograms that provide personalised teaching, remember students’ histories, and imitate human behaviour. But the question remains: can a digital entity truly replace a real teacher — someone with emotions, intuition, and empathy? Education is not just about knowledge, but also relationships, support, and personal development.
So, the teacher’s role won’t disappear; it will transform. The focus will move from delivering information to fostering human connection, guidance, and care.
Teachers will need digital literacy, data interpretation skills, and the ability to evaluate AI tools critically. Equally important will be soft skills such as adaptability, emotional intelligence, and the ability to guide students in the ethical and thoughtful use of AI.
Students should be active participants, giving feedback, co-creating learning experiences, and learning how to use AI responsibly. Their perspectives are key to building tools that are engaging, fair, and truly learner-centred.