Dr Julia Hofweber speaks with Peter Embleton-Smith about Artificial Intelligence (AI), discussing how it may be used to assist diverse students’ learning experience in Higher Education.
Transcript of interview
Dr Julia Hofweber: Okay. Today, Ai4unidiversity is interviewing Peter Embleton-Smith. Peter shares a core research interest with our project, namely the potential scaffolding role of AI in education. Peter, could you explain the notion of scaffolding and Vygotsky’s Zone of Proximal Development and how it links to AI?
Peter Embleton-Smith: Yes, I’m just going to share my slides very quickly. Hopefully, you can see that. Yes, absolutely. Vygotsky was a Russian-Soviet psychologist who’s best known for his work on the psychological development of children.
One of his theories was to do with the Zone of Proximal Development, which I have represented here a bit like Wi-Fi bars. Here, the dot represents the learner and each of those bars represents a piece of learning with increasing levels of difficulty. That first zone, or the first bar, would be what the learner can do independently, their current level of ability, which Vygotsky referred to as the Zone of Actual Development.
The second zone is what Vygotsky refers to as the Zone of Proximal Development. That would contain tasks that the learner couldn’t do alone yet but can do with some support. The third zone, the Zone of Distal Development, is beyond the learner’s reach, even with support. I like to simplify this by thinking about it like this: zone one is addition, zone two is multiplication using repeated addition, and zone three might be algebra or calculus.
That Zone of Proximal Development is where the richest learning happens, with guidance or collaboration from a more knowledgeable other.
My theory was that in a conventional classroom, you might have one teacher who would be differentiating the learning, perhaps three different ways if the learners are lucky, maybe four. But AI could be used to differentiate the learning for each individual learner and adapt the learning journey to suit the needs of the individual. It might look like AI offering real-time feedback or hints, adjusting the difficulty based on the learner’s performance, or providing explanations or examples that are tailored to the individual learner. All of that would hopefully mirror the support that a teacher or peer might offer within the learner’s Zone of Proximal Development.
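[Editor’s note] The adaptive loop Peter describes, raising or lowering task difficulty based on recent performance to keep the learner in the Zone of Proximal Development, can be sketched in a few lines of Python. Everything below is a purely illustrative assumption: the function name, thresholds and scoring scheme are not taken from any real tutoring system.

```python
def adjust_difficulty(level: int, recent_scores: list[float],
                      lo: float = 0.5, hi: float = 0.85) -> int:
    """Nudge the task level to keep success in the 'productive struggle' band.

    recent_scores are fractional scores (0.0-1.0) on the learner's last tasks.
    The lo/hi thresholds are illustrative, not empirically derived.
    """
    if not recent_scores:
        return level  # no evidence yet; keep the current level
    success_rate = sum(recent_scores) / len(recent_scores)
    if success_rate > hi:            # tasks too easy: Zone of Actual Development
        return level + 1
    if success_rate < lo:            # tasks too hard: Zone of Distal Development
        return max(1, level - 1)
    return level                     # within the Zone of Proximal Development
```

For example, a learner scoring consistently above 85% would be stepped up a level, while one scoring below 50% would be stepped down; anything in between leaves the level unchanged.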
Dr Hofweber: Okay. Thank you very much. I really like your illustration of the Zone of Proximal Development, the digitally inspired aspect. You mentioned the potential scaffolding role of AI in responding to differences in learner profiles and, obviously, in the Ai4unidiversity project, we are particularly interested in these differences or diversities. Concretely, with examples, how do you think AI could be used to support neurodiverse students?
Embleton-Smith: I think the adaptability of an AI-driven programme could be really useful for supporting neurodiverse learners, as it could be customised easily to suit the needs of the learner. This relates to adapting for different learning styles, but also for accessibility. For example, interfaces could be customised, and speech-to-text and text-to-speech could be used to help learners with dyslexia or processing difficulties. For learners with autism, virtual tutors can provide consistent and non-judgemental interaction, which might help with anxiety or social interaction issues.
Dr Hofweber: Yes. That’s very interesting what you say about autism because it’s actually been shown that autistic children sometimes learn better from robot interaction, as it doesn’t have all this human social complexity and associated anxiety. Another way in which we’re looking at diversity in our project is linguistic diversity; in other words, non-native speakers. Higher education involves many users of English as a lingua franca. In this case, how do you think AI could be used to support students who are second-language users of English?
Embleton-Smith: Absolutely. AI-powered tools could do a few things. They could simplify or explain academic texts by rephrasing them into clearer, plain English or by translating key terms, which can reduce cognitive load. Writing assistants like Grammarly, or even generative AI tools, could offer real-time feedback on grammar, structure and tone.
I know colleagues for whom English isn’t their native language, and they use these tools to good effect. These tools can also provide transcripts or subtitles for lectures or videos, which can make a big difference. Overall, it’s about making language less of a barrier and more of a bridge, helping students focus on learning the subject rather than decoding the language.
Dr Hofweber: Yes, that’s phrased very nicely — bridging the gap. Again, we have this idea of scaffolding to enable them to focus on the content rather than language barriers being a problem. Now, one of the things we are also doing in this project is exploring the use of AI experimentally. I wanted to touch upon an experiment you conducted in a workshop you recently organised on AI in higher education. Could you describe this mini-experiment?
Embleton-Smith: Yes, of course. I split the room into small groups of three or four and gave every group the same task: to design a one-hour workshop on climate change for first-year undergraduate students. Half of the groups were instructed to use AI to assist with their design, and the other half had to do it in a more traditional manner. It was an exercise to see how good generative AI was at brainstorming and coming up with ideas.
Dr Hofweber: A very innovative task. What observations did the groups make afterwards?
Embleton-Smith: The quality of the AI-produced ideas was quite good, consistent and well-structured, though often quite similar. The groups that worked without AI were much more original; for example, some proposed taking students on a mini field trip to see the impacts of climate change. The AI groups were more conventional.
The AI groups also said it was hard when they had a vision and the AI led them in a different direction. Those who created their workshops without AI were more emotionally invested. My feeling is that to enthuse learners, you need to be enthusiastic yourself — and that’s harder if you didn’t design the material.
Dr Hofweber: Yes, that’s a really interesting point. I can see how it would affect delivery if you didn’t create it yourself. And if we transfer this to students writing essays or presentations, this would also affect their delivery and how well they remember the content.
Embleton-Smith: Absolutely.
Dr Hofweber: This is a very important point about emotional attachment and connection with the material. Now, venturing into a controversial topic — the notion that AI could potentially take on some of the teaching itself. In your workshop, you presented the possibility of AI teaching roles, including the ‘Unbound’ approach. Could you briefly describe this and what participants thought?
Embleton-Smith: Yes. The Unbound Academy is an online charter school in Arizona that plans to use AI to deliver core academic studies — reading, writing, maths and science — in 25-minute blocks in the morning. The AI will deliver the content and adapt it to each learner, while teachers focus on support, guidance and life skills. While AI handles delivery, teachers monitor progress, provide motivation, lead life skills workshops and handle pastoral care. They haven’t started using it yet, but it was an interesting case study.
The reception in the room was mixed. Some disliked the idea of removing core teaching from staff. But we discussed that AI could differentiate learning in real time without capping students’ potential. If a learner improves quickly, AI could respond immediately. Some participants were resistant, but others were open to adopting it.
Dr Hofweber: And I guess it’s also a matter of resources. If you have students with additional needs, there often aren’t the human resources to support them all, whereas AI could be a complementary tool.
Embleton-Smith: Indeed.
Dr Hofweber: Okay, thank you so much for sharing your workshop insights. One last question: looking to the future, what do you think universities need to do to maximise the potential of AI in scaffolding diverse students — both neurodiverse students and non-native speakers?
Embleton-Smith: Investing in inclusive design solutions will be crucial, selecting AI tools that are accessible and flexible to support varied learning styles. Universities also need to embed digital and AI literacy across the institution, so everyone knows how to use tools effectively and ethically, and students use them to enhance their learning rather than replace it. Including diverse learners in testing and feedback is also vital, to ensure the tools are suitable for those with lived experiences of neurodiversity or studying in a second language. And we need to ask students what they actually need, and give clear policies on what’s permissible and what’s not.
Dr Hofweber: Yes, indeed. Okay, well, thank you so much, Peter. I guess we’re going to stop the recording at this point.
