Episode 88: How AI and Assistive Technology Can Take the Dis Out of Disabilities
Below, you can view or listen to Episode 88 of The Personal Brain Trainer Podcast.

In this episode of the Executive Function Brain Trainer Podcast, hosts Darius Namdaran and Dr. Erica Warren discuss the importance of becoming an expert on one's challenges and strengths. They explore the intersection of executive functions with conditions like dyslexia, ADHD, and autism, emphasizing the need to self-advocate and accommodate personal needs. The conversation touches on the significance of understanding processing styles, such as kinesthetic, interactive, and sequential, and utilizing tools like AI and Google Keep to optimize learning and productivity. Additionally, they share insightful anecdotes and practical strategies for enhancing self-awareness and leveraging individual strengths.
Listen:
Watch/Listen on YouTube:
Links:
- Speechify: https://share.speechify.com/mzxDU3e Use this link to get $60 off Speechify and 1 month FREE when you sign up for Premium.
- Shovel: Go to https://shovelapp.io/dig/108/ for 20% off and use coupon code DRWARREN
- Student Processing Inventory: https://goodsensorylearning.com/search?type=product&q=yppi
- Your Professional Processing Inventory: https://goodsensorylearning.com/search?type=product&q=yppi
- Voice Dream Reader: https://www.voicedream.com/
- Elevenlabs: https://elevenlabs.io/
- Bookshare: https://www.bookshare.org/
- Learning Ally: https://learningally.org/
- Executive Functioning Resources: https://goodsensorylearning.com/collections/executive-functioning-skills-training
- Executive Functioning Assessments: https://goodsensorylearning.com/search?type=product&q=EFCA
- How to Teach Executive Functions in Grades 1-6: https://goodsensorylearning.com/blogs/news/how-to-teach-executive-functioning-to-elementary-and-middle-school-students-and-make-it-fun
- Inner Voice: https://goodsensorylearning.com/blogs/news/inner-voice-app
- EF Coaching with Darius: https://www.ivvi.app/coaching
- EF Student Coaching with Erica: https://learningtolearn.biz/
- EF Adult Coaching: https://dropintoyourbestself.com/coaching
Brought to you by:
- https://goodsensorylearning.com
- https://learningspecialistcourses.com
- https://bulletmapacademy.com
- https://iVVi.app
- https://dropintoyourbestself.com/
- Dr Erica Warren Assessments
Transcript:
#88: How AI and Assistive Technology Can Take the Dis Out of Disabilities
Erica: Welcome to the Executive Function Brain Trainer Podcast. I'm Dr. Erica Warren.
Darius: And I'm Darius Namdaran, and we're your hosts.
Erica: Sponsored by the Executive Functions Coaching and Study Strategies certification course, a comprehensive training for educators, coaches, and parents.
Darius: Sponsored by ivvi. Imagine turning your meeting's audio into a live mind map instantly, so you remember what matters. Try ivvi for free now at ivvi.app. That's ivvi.app.
Erica: Hey, Darius. Great to see you, as always.
Darius: Great to see you, Erica. What have we got in store for today?
Erica: I've been really excited about this episode. It's been brewing for a while, and I want to talk about how AI and assistive technology takes the dis out of disabilities. I'm finding that it's really leveling the playing field for a lot of students with disabilities, and it's really enabling them to accommodate themselves in their own way of processing, which is really exciting. And interestingly enough, and we'll dive into this a little bit too, I think it's creating some new disabilities.
Darius: Oh, interesting. Okay. So for anyone who's not listened to us before, how would you define what you're talking about? What are disabilities? For example, we're talking about learning disabilities, and then dyslexia and ADHD and so on come underneath that. But people with dyslexia and ADHD often don't feel disabled, per se. We've had conversations before where you don't think of your dyslexia as a disability, and yet we are in this podcast talking about things like dyslexia as well. So let's define our terms.
Erica: Yeah, well, you know, it's interesting, because I don't think you and I really believe in disabilities. I'm very much of a social constructivist. I believe that if you do not function the way society expects you to function, then they diagnose you with a disability, because you're just not within the norm. And the reason why I even entertain or accept that terminology is because in the United States, it gets you reasonable accommodations. We have 504s, which are a form of accommodation, and we also have IEPs, individual education plans, and you have to have a disability in order to receive those types of accommodations. So I often tell my students that's why we have to use that term. But I don't believe in it, because it really is socially constructed. And I think the new AI wave that's hitting us is going to change disabilities, because there's so much in AI that is going to literally take the dis out of the disability for individuals with dyslexia, dyscalculia, dysgraphia, because it allows you to process in a way where you're no longer disabled, you're abled.
Darius: Yes.
Erica: However, there are going to come some intellectual challenges with AI, because if you're not good at conducting it, you could have a disability. If you do not have strong analytical skills, if you don't have strong reasoning skills, if you're more of a passive learner than an active learner and you're using AI inappropriately to do the work for you, so you're actually not learning, then that's going to be a new type of disability in education. So it's pretty interesting. Yeah, I think it's really going to change the landscape of what disabilities are and how we accommodate them. I see you looking off into the distance with a big smile on your face.
Darius: Yeah, yeah. That's great. Okay, time to play. So you've just laid down a challenge: it'll create a new form of disability, and I'm wondering what kind of dis it would create. We've got dyslexia, dysgraphia, dyscalculia. For those of you who don't know, dysgraphia means difficulty processing handwriting or writing; dyscalculia, difficulty processing maths and calculations. And then you've got dyspraxia, which is difficulty processing movements, because you can be clumsy
00:05:00
Darius: and not judge things quite well. And then dyslexia, difficulty processing words. So if you've got difficulty processing AI, or difficulty conducting AI, what would that be, Erica?
Erica: Well, I think we're looking at analytical skills. Disanalytical? Disanalyst?
Darius: What would... what would the word be? There needs to be a dis for analytics.
Erica: Yes, something of that sort. But you know what it comes down to? This is very interesting. You know where we're going? Executive dysfunction.
Darius: Oh, yes, yes. Executive dysfunction. Yes, it is.
Erica: So, you know, we do it a little bit differently with executive functioning. We put the dis on the functioning, not the executive. Not dis-executive functioning.
Darius: Dis-executive.
Erica: That's kind of funny.
Darius: That's quite weird, though, because dis-executive basically means not good at processing executive skills; you're not good at being an executive. If you are really good at being an employee, then you're going to find AI really, really hard, because the AI will become the employee and swap you out. If you are really good at being an executive, planning, driving and determining the direction of things, well, that's what executive function is about: I'm staying on track, I want to get there, I want to do this. In spite of all the distractions, I'm going to use my inhibitory control to stay on point. In spite of all the different changing circumstances, I'm going to use my cognitive flexibility to adapt, et cetera, in order to achieve an overall meta-aim. That is going to be highly valued in the world of AI. But okay, now we've talked about defining the terms; listeners know about that. Let's get stuck into the meat and potatoes, or whatever you would eat if you're a vegan, and talk about how AI and assistive technology can take the dis out of disability and amplify your ability. So basically, it's kind of like unwrapping the dis that is wrapped around you so that it reveals your ability and empowers your abilities.
Erica: And perhaps a nice way to do this is to look at each of the traditional disabilities and talk about how AI and assistive tech can really erase that dis piece of it. So let's take dyslexia first. How can assistive technology and AI help those with dyslexia? Wow, there's so much great stuff out there. I love Speechify. I've got a YouTube video out there about voice cloning. Voice cloning is remarkable. We've talked a lot about how, if you visualize something, you can almost trick your brain into believing that you've experienced it in real life. I feel like this is really the auditory version of visualization; I don't know if this is an actual word or not, but auditorialization. With Speechify, you can now clone your own voice. So I've had young students who can't read well at all clone their voices: they were able to express a passage out loud, voice clone it, and now they can hear themselves read to them perfectly. They get exposed to what it sounds like to hear themselves read perfectly, and it blows their mind. First of all, it gets them really excited about reading. They get to hear their future self. And just like when you visualize something enough, you can make it happen if you hear yourself reading. I can't even begin to tell you: this little boy I was working with, very flat affect, not happy about reading, an older student still having to go through reading remediation. When I showed this to him, he came back like a different kid. His face lit up, he was excited, he couldn't believe it. He couldn't wait to use it. And I'm like, you know what? I want you to start reading books that you've always wanted
00:10:00
Erica: to read that were way too hard. And now all of a sudden, he's like, absolutely. And of course, they can do one of two things: read along with the text, and now all of a sudden they're going to start to see whole words. They no longer have to learn to decode; they can see whole words, and they will learn to be readers really fast if they're looking at the words and hearing themselves read it perfectly, right? Or just close your eyes and use all your brain power to visualize. Because we need to build both of those skills to automaticity in order to be an outstanding reader. So I encourage them to do those two things separately, and once they've become automatic, then you have the cognitive space to unite them. So you can see that a child can almost do their own remediation that way. But what's so great about it is the level of excitement and how it just opens that possibility. It kind of tricks the brain into saying, oh, I can do this. Listen, that's me reading that perfectly. I'm reading a book that I never thought I would ever be able to read. And the level of confidence! So yeah, Speechify really blew my mind. And then for readers, you can also use NotebookLM.
Darius: Can I just pause for a moment there? Because there's so much more to this. When I'm coaching people and I show them how you can scan a document into a PDF and just have it read out, they're like, oh my goodness, it never crossed my mind that I could even do that. Because people are like, oh, that book's not on Audible. If a book's on Audible, great, I can listen to it as a book. But this book isn't on Audible, and this report isn't on Audible, or whatever. But you can just take your phone and take photos of the pages quite quickly. It does a quick scan, and then it just starts reading it out to you. And I highly recommend it. It seems a bit counterintuitive: sometimes I need to spend maybe 15 or 20 minutes scanning a whole book in. But let's say there's 250 pages to that book, and it takes me 20 minutes to scan it in. For me to read those 250 pages with my eyes, it's going to take six to eight hours if I'm lucky and I don't get distracted. And I might not actually finish reading it, because I get distracted; it takes a lot of will to open the book and just read it. If I scan it, it takes 20 minutes, and then I can read it at two times speed and sometimes three times speed, because my natural pace is about three times speed.
Erica: Oh, and it's amazing how fast you get used to reading at a faster and faster speed.
Darius: And also, the AI voices have become so good that it's easier to listen to an AI at three times speed than a human. Have you noticed that?
Erica: Right. Yes, because they don't just make the voice sound faster. They take away the spaces between the words instead of making it sound like a chipmunk.
Darius: Yes. Instead of taking a normal person's reading voice and then accelerating it three times, the AI is just being told: speak three times faster. And so all the intonations and changes and so on become more natural, because it's just saying, right, okay, I'm going to speak three times faster. So everything feels more intuitive. So anyway, the point is, instead of it taking six hours, it takes me two hours and 20 minutes, scanning included, to read that book with my ears, rather than...
Erica: Can I solve that problem?
Darius: Yeah.
Erica: So there's a company called Bookshare. There's also Learning Ally, and I particularly like Bookshare. If you have a disability, your membership can be authorized: I can authorize it, you could authorize it, Darius, teachers can. So you can get your Bookshare authorized if you have a disability and you're in school, and you can get these books for free. They have AI voices that read the books aloud, and of course the AI voices are getting better and better, but it also allows you to see the text, and it highlights the text as it's being read. So you can still do that kind of natural remediation: if you listen and look at the same time, you will become a better reader, because you'll start to see whole words.
Darius: But that's just America. I think you can do Bookshare as well in the UK, and it's a slightly higher subscription, but yeah...
Erica: I don't know if they have anything free worked out there. It's just that our government compensates them if you have a disability. But I think that the price is pretty darn reasonable.
Darius: It is, yes. $25 or $50 a year or so.
Erica: But, yeah, Speechify,
00:15:00
Erica: as long as you can find a PDF of something, and you usually can, you can just drop it into Speechify. Their voices are outstanding; they're, I think, the best voices I've come across. And of course, that's where they have the voice cloning. I will put a coupon code in the show notes so that you can get $60 off and the first month for free.
Darius: How much is Speechify? Give me an idea.
Erica: I don't know; it could be around a hundred a year, but I think I can get that price down pretty significantly with my coupon code. So it definitely makes it more manageable.
Darius: Yes.
Erica: And then if you recommend it to other people with your own code, they give you credit. So if you have a few friends that sign up for it, you'll probably get it for free the next year. So you can kind of wrangle it to be free.
Darius: Absolutely. I would say Speechify is the premium product of them all.
Erica: I would agree. They have so many great voices to choose from, too.
Darius: There are other options. Like I found Voice Dream Reader really useful.
Erica: I like it too, but it's just a matter of a year or two before all the voices are outstanding. I mean, you listen to NotebookLM, and their voices are the best I've heard, even better than Speechify. I mean, wow.
Darius: Speechify is like six months ahead of everyone else in terms of paying for the top-tier voices. Voice Dream Reader is cheaper, but it's still a price. Now ElevenLabs, who are the originators of these voices, have actually created a book-reading app that is free, with all of the top-tier voices. They don't have the voice cloning and so on, but it's basically a direct competitor to Voice Dream Reader, and it's currently free. So it's very worthwhile looking at. But what tends to happen with people is you use it for a month, two months, three months, and then it becomes part of your habit, and you go, actually, I want to commit to this. I'm going to invest in this; this is an investment in myself. And then they buy. I think it's really useful with dyslexia to start actively expecting and looking for where the speak-aloud function is. Even in ChatGPT, for example, there's a little voice button; you click on the voice, and you can start speaking to ChatGPT instead of typing, and then you can start listening to it. So basically, you've got both of those assistive technologies being created, which is speech to text, and then...
Erica: That's a whole other accommodation. Yeah, that's definitely a whole other accommodation, which is just amazing. Perhaps we should use that as a segue: where you can speak and it types for you. That accommodates dyslexics, but it also accommodates those with dysgraphia.
Darius: Yes.
Erica: Because now you don't have to be thinking about the process of writing, and you don't have to be thinking about spelling. There are those with dyslexia who are primarily impacted in their spelling. Well, that's no longer an issue if you're using voice to text: you just speak, and the text appears. And I often tell people, even when they're using Google Docs, that there is, what do they call it? Voice typing, that's right. If you don't know how to spell something, just turn on voice typing, and then you can just say the word and it spells it for you, which is really helpful. And then of course, we've got things like Grammarly, and you can even do spell checks and grammar checks on whatever writing platform you're using. And all of that is AI.
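(For the technically curious: the dictation features described above all sit on top of speech-to-text models. Below is a minimal sketch of that underlying step using OpenAI's Whisper transcription API. The file name is a hypothetical example, and this is just one way to do it, not how Speechify or Google's voice typing is actually built.)

```python
# Minimal speech-to-text sketch using OpenAI's Whisper API (pip install openai).
# "dictation.m4a" is a hypothetical recording of someone speaking a paragraph.
from openai import OpenAI

client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

with open("dictation.m4a", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="whisper-1",  # OpenAI's hosted speech-to-text model
        file=audio_file,
    )

# The spoken words come back as text with spelling handled by the model,
# which is the same effect Erica describes with voice typing.
print(transcript.text)
```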
Darius: It is. And the interesting thing here is that in the past we had assistive technology, and we had speech to text, but it cost a lot of money to run, so it was kind of doled out in little chunks. Like, if you did speech to text on your Apple phone, you'd get 30 seconds or a minute if you were lucky before Apple switched it off, and then you had to restart it again. And then...
Erica: Yeah, yeah, yeah, it was a pain.
Darius: So annoying. You said, I'm not even going to use this. Whereas now that's 10 minutes long on Apple; you can put it on and record, or longer. If you use Apple Notes and use the audio recording function within Apple Notes, it'll transcribe everything you say, record the audio, and put it into your notes. So why am I mentioning all of that...
Erica: Wait, wait, wait. Slow down. How are you using Apple Notes?
Darius: Have you not discovered this incredible new feature in Apple Notes?
Erica: No, I
00:20:00
Erica: have not.
Darius: It's. It's a brilliant new feature.
Erica: Oh, so you can actually speak to Apple Notes, and it will type it for you.
Darius: Let me be more precise. You know the voice note function Apple used to have that was separate. You could record a voice note and it would just be an audio recording.
Erica: They had that in Apple Notes.
Darius: They now have it in Apple Notes. So you can create a new Apple Note. Say you go in to see your accountant, who's just about to explain something complex to you. Create an accountancy note, and then you hit the paperclip icon, and it gives you the option to record audio. It records the voice note and adds it in. But better than that, it also gives you the option to transcribe it as well. So if you tap transcribe, it'll give you the full transcript and the audio inside of your note as a little block.
Erica: Oh, I just found it. They have a little icon that says record audio.
Darius: Yes, yes.
Erica: Oh, I had not seen that. Ah, that's beautiful. Oh, my goodness. Wow, that takes a lot of pain out, you know. It makes me think of one of my favorite little features of NotebookLM. Say you find a really great YouTube video that you want to use in a research paper. You can just drop the URL of that particular video into NotebookLM, and it transcribes the whole video for you.
Darius: Yeah.
Erica: So you have the video transcription.
Darius: It doesn't just transcribe it; it also watches it.
Erica: Right. So all that content is in there, and you can ask it questions. And the nice thing is it will even take you back to the place in the video where it found that information. NotebookLM is unbelievable. It accommodates so many different disabilities.
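(NotebookLM itself has no public API, but for readers who want to see how pulling a YouTube transcript can work programmatically, here's a minimal sketch using the open-source youtube-transcript-api Python package. The video ID is a hypothetical placeholder, and the library's interface may differ between versions.)

```python
# Minimal sketch: pull a YouTube video's transcript programmatically
# (pip install youtube-transcript-api).
from youtube_transcript_api import YouTubeTranscriptApi

VIDEO_ID = "abc123XYZ"  # hypothetical: the ID at the end of a YouTube URL

# Each entry is a dict like {"text": ..., "start": ..., "duration": ...}
entries = YouTubeTranscriptApi.get_transcript(VIDEO_ID)

# Stitch the snippets into one searchable transcript, keeping timestamps
for entry in entries:
    print(f"[{entry['start']:7.1f}s] {entry['text']}")
```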
Darius: Yes. You know, so actually, the underlying point in all of this, the meta point in all of this is we've had assistive technologies before, and they've all been held back because they've been so expensive.
Erica: Yes.
Darius: Now, because the AI models are becoming multimodal, they can watch a video, they can listen to the video, they can understand the whole thing, and then they can communicate about it with you at like one thousandth of the price it was two or three years ago.
Erica: Yes.
Darius: And it's now being rolled out into all sorts of apps. For someone who does have a disability, if they are switched on enough, and I hope this podcast switches you on enough that you start looking for that button that does speech to text, text to speech, multimodal images, turn-this-into-an-audio-summary or whatever, then you get it. But you often have to know to click that extra little button. It used to be three buttons deep, and it got rationed out, and it wasn't worth much unless you paid a massive subscription. Now it's up close and personal, and it's fluid, and it's so much cheaper. It's amazing.
Erica: And the important thing to note is that most of these are now available across platforms. I have one subscription to Speechify, and with that same subscription I can use it on my phone, on my computer, and on Google Chrome. And that's the game changer for me. So if I'm working on anything in Google Docs, anywhere on the Internet, even within Canva, I can now hit that little icon at the top of Google Chrome and it will read it to me.
Darius: And that's quite a big deal, because one of the weird weaknesses of Google Docs, for example, is that you can do voice typing, but it won't read the text out to you. On Apple Pages, you can select the text and it will read it out to you using Siri's voice; it's not perfect, but it's workable. But Google Docs doesn't read out to you. It's amazing.
Erica: Well, the other thing that I've noticed is that if I'm ever on a site where it won't read, I can copy and paste it into Speechify, because Speechify also has a platform where you can just paste anything in and have it read out loud to you. So it's very diversified in the ways that you can use it. And it's always a little bit of a pain to sign up and get all of those different places logged into the same account, but once it is, it works pretty seamlessly.
Darius: And I love that, Erica. But that world is going away. Speechify is going away, okay,
00:25:00
Darius: unfortunately, in some way. Because what's going to happen in the next year or two, okay, is these technologies are going to be native everywhere, and our level of expectation will shift, where you'll say, well, of course I want to see this as a mind map. Of course I want this read out to me. Of course I want you to speak this to me.
Erica: Or translate it for me. I mean, there are so many different things.
Darius: Absolutely, yeah. Show it to me in sign language. They've got AIs that will interpret sign language being done by hand and then sign back to you with an avatar. So the point is, we're getting to the point where we're not going to have individual apps to read this out to us or do this or that for us. It will all be brought in together. And that dis in terms of disability will literally melt away, because the AI model has been tuned to communicate with you according to your strengths. And it won't be some module that you download to do that. It will be native, and you won't
Erica: be forced into a program. For example, I knew this one little girl, and the way she learned how to read was she sang the text to herself; she made everything into a song. Well, with AI, you could say, turn this into a song. So it's fully accommodating what you need, and it fully accommodates your creativity and wherever you want to take it, because it specializes to your brain. And that's what I'm noticing in NotebookLM. You know, one of the things that I really have to do first with my students is figure out what their best ways of processing are. That, to me, is the blueprint. So I give them the Student Processing Inventory, or the YPPI, Your Professional Processing Inventory. I give them those assessments first to figure out their best ways of processing, because I can't read their mind, and they often aren't aware of how they process best. Once I know that, I have the blueprint, so I can teach them how to use AI and assistive technology in a way that will work for them. Because they don't always consciously know how they process, and once they consciously know, they don't necessarily know which assistive technology tools can support that. But I do know that. So then I can guide them and say, oh, this one's going to be amazing for you, because you're a simultaneous processor, you're a visual processor, and you're very hands-on. This one was made for you.
Darius: And I think, when we look into the future, even in the next year or two, AI is going to take a lot of the dis out of disability, if you've got access to it. That's the big question: if you've got access to it.
Erica: For example, let's take dyscalculia, because we haven't talked about anything to do with math.
Darius: Okay.
Erica: And ChatGPT is amazing. When I'm working with kids on math, I have a split screen, because I also want to make sure that I'm processing it the way the teacher taught it. Sometimes there are multiple ways of teaching a particular math method, and you can confuse kids more by giving them a completely different way of doing it. So I often have that open, and also to make sure, because there are so many places where you can make an error within math, that we're going through each of the steps correctly. It's amazing how it can break down the steps of how to do something. In the past, we had things like Photomath, which was great: you could take a photograph, and it would show you the steps to go through a math problem. But when you had a word problem, it couldn't do that. Now, with ChatGPT, you can throw in a word problem and say, can you show me how to do this? And it breaks it down. Oh, can you break it down into more detailed steps? Can you make it a little bit more visual for me? Can you put it into a sequence? So again, it allows you to tailor the math. Can you give me a memory strategy? I want to be able to remember the sequence of steps it takes; I'm a very sequential learner, so will you break this into a sequence of steps and then help me come up with an acronym so that I can memorize the steps? Or, here's a formula that I have to memorize; can you help me figure out how to make this easier to memorize, because I'm having trouble remembering it?
00:30:00
Erica: It can even help with memory strategies. It's really quite remarkable what it can do. So yeah, I've found that ChatGPT has been amazing for math, and I know that there are all sorts of other math tools out there, but that's usually the one that I go to for math, and I teach the kids to use it: do the math problem, then double-check that you got it correct, or, if you don't know how to do it, ask it to go over the steps of how to do it.
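(For readers who want to try the kind of prompting Erica describes in their own tools, here's a minimal sketch using OpenAI's chat API. The word problem, model name, and tutoring instructions are illustrative assumptions, not a prescribed setup.)

```python
# Minimal sketch of step-by-step math tutoring with OpenAI's chat API
# (pip install openai).
from openai import OpenAI

client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

word_problem = (
    "A train leaves at 2 p.m. travelling 60 mph. A second train leaves the "
    "same station at 3 p.m. travelling 80 mph. When does it catch up?"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {
            "role": "system",
            "content": (
                "You are a patient math tutor. Break every problem into "
                "small, numbered, sequential steps and do not skip any."
            ),
        },
        {"role": "user", "content": f"Show me how to solve this: {word_problem}"},
    ],
)

print(response.choices[0].message.content)
```

Follow-up requests like "make it more visual" or "give me an acronym for the steps" can be appended to the same messages list, which is the tailoring loop Erica describes.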
Darius: This is just the beginning. You know, the next step is when you can lift up your phone and show it the maths that you're writing, and it's watching you doing it, and then it can give you guidance. Oh, remember to do this. Oh, did you notice you forgot that? Instead of carrying a mistake all the way six lines down.
Erica: You know how it's going to change? Because so many kids, they mislearn something and nobody knows. And so they. They've been repeating over and over again, doing it wrong. And then it's really hard to unlearn something because the brain has established a neural pathway. If we can catch them early on, the second they make their mistake, then we're not going to have nearly as many difficulties in teaching something because we're not trying to backtrack where the mistake was made. But most of the time, the teachers aren't looking. They don't have the time to look at each kid to see the moment that they make a mistake. And AI is going to be brilliant. It's going to really accelerate our learning and it's going to decelerate the frustration.
Darius: So basically, we categorize these things as learning disabilities, okay? And they're incorrectly classified. I do not have a learning disability. You do not have a learning disability.
Erica: It's a learning difference.
Darius: It's a learning difference, but I do not have a learning disability. Okay? The school has a teaching disability.
Erica: Yes.
Darius: Okay. That is really what it is. They've identified a teaching disability, and then they say, right, you have a learning disability, so we'll accommodate your learning disability. But actually, what they're doing is accommodating their teaching disability by adding in extra tools and so on, because they can't teach in that way. They can't read it all out for you; they can't write it out for you, or whatever. But with AI, we will no longer have these learning disabilities, because we will be taught in the mode that we prefer. And so this teaching disability will disappear; this learning disability will fade away. It's not that we won't find certain things hard, but we will have avenues to actually learn to read in a dyslexia-friendly way, to learn in a dysgraphia-friendly way, et cetera.
Erica: The other thing, which I can't wait for, is having better tests. I have so many kids that come in to me and say, you know, everybody failed the test. And when I hear that, it means the teacher failed to teach the information. For some reason, teachers don't realize that if not a single person did well on the test, then it's not a reflection of the students; it's a reflection of them. Or the test is poorly written.
Darius: Yes.
Erica: I mean, many times it's not that the students got it wrong; it's a really bad item, and they don't throw out the item, and they should. When I was in graduate school, we were taught how important it is to evaluate the items, and if enough students miss an item, you throw it out. I've never seen a teacher do that, and it's really important. But I would imagine that AI could help them write better tests and also help them score the tests better, because there's human error. I can remember being a student, going back and saying, hey, I think this is right. And the teacher's like, oh, you're absolutely right. I missed that. I don't know what happened there.
Darius: Those tests are going to go away, because, you know, we'll have AI that will score the test on the fly, in real time, and...
Erica: And nobody will have to fail. I mean, there's no reason to fail. Failing only shuts down learning; it doesn't help us want to learn. And how often do students learn from their mistakes? They rarely learn from their mistakes, because they're not given the opportunity to retake a test. Why not? Aren't we trying to teach them? Yes, we're trying to teach them. They should always be able to retake
00:35:00
Erica: a test and learn from their mistakes because that's where you learn the most.
Darius: And a lot of people with dyslexia, for example, because their working memory is not so great, will forget that experience and that moment. You come back two weeks later and say, there are your marks, and they won't remember what on earth happened. So it doesn't mean much to them. But if you gave them that immediate feedback: yes, this is going off, oh, I'll adjust. It's that shortening of the feedback loop. AI will shorten the feedback loop. And the best thing that's going to happen, I cannot wait, Erica, is when Android XR comes out. All these glasses are going to come out, and they're going to be as common as our mobile phones. So you're going to put on glasses like the ones you've got on just now. If you record, a red light will come on. But what they will do is see what you see. So if you're in a class and you're reading a book, you don't need to scan that book in. You'll have the glasses, you open the book, it will read it out to you if you want; it'll have a little speaker, and it'll read it straight into your ear, and no one else will hear. That's going to be a game changer for a lot of very visual people. Visual and verbal are the two most difficult modes to teach to, in that it takes a lot of effort to create something visual for a person and dual-code it, and often you don't have the time to verbally give them the information, or verbally read out that book for them, or verbally support them, or have a discussion, or whatever. It's much easier to do it via text; it's just more efficient as a list. Do you know what I mean? But now, with the advent of AI and how cheap AI has become, the visual and the verbal aspects of things become much easier to do and more natural.
Erica: Well, think about what these glasses could do for executive dysfunction. A couple of things. They could let the teacher know exactly who's paying attention and who's not. They can let the person themselves know, because if we catch ourselves thinking about something else soon enough, we're not lost in that thought; we can pull ourselves back in. I think that's one of the biggest challenges in school now: kids are entertaining themselves, doing other things in class, and the teachers don't always know, whether it's on their Apple Watch or on their computer. They look like they're taking notes, but only a part of them is taking notes; another part of them is scrolling on TikTok or texting with a friend or something of that sort. It would be really interesting to see, because teachers don't always know when somebody's listening, but AI might be able to do that in time, which is scary.
Darius: Yeah, it's a tricky one. So maybe we should change our hypothesis here from "AI will take the dis out of disability" to something like "AI can take the dis out of disability if you choose to use it." Because one of the biggest challenges with these capabilities is that, first of all, you need to be on a device to make them work, and devices are designed to distract you a lot of the time. So these poor kids have got the potential of the worst of both worlds. They've got this opportunity for multimodal AI. Your word is processing styles; the term in the AI world is multimodal. These different modes of processing and communicating, whether it's words, sounds, stories, videos, art, music: AI can do all of those things simultaneously, in any way that you want, very soon if not now. So it's got all of that multimodal engagement, but then you've got all the distraction that comes with it. What do you do about that? In a way, that ties into your executive function side of things; it can become as disabling as your learning style. That's a big challenge. I'd say it's a lot easier as an adult who's got a learning disability. I hate saying that word.
Erica: I do too. Let's just call it a learning difference.
Darius: Learning difference.
Erica: That's what it is.
Darius: They think differently, they learn differently. When you switch on the multimodal
00:40:00
Darius: side of things, as an adult you've probably got much more executive function ability: no, I'm going to focus in on this, I'm going to learn this, et cetera. Whereas children are by nature much more exploratory, much more inquisitive, much more curious.
Erica: Well, you want to expose them to as much as possible so that you're helping them to find their own true strengths and true affinities.
Darius: Yeah. I would say, in my opinion, a lot of what we're talking about here is us exploring this concept rather than teaching about it. We're definitely exploring this like everyone else.
Erica: Yeah.
Darius: And the exploration that I've so far come to through this conversation with you, Erica, is that I think there's a big difference between children and adults when it comes to using AI to deal with the challenges of your learning mode. So, for example, the children you're teaching just now are using these tools outside of school, in their own time, but it's a different kettle of fish going into a classroom. When you've got 30 kids in a classroom and they're all using different things: one person's using NotebookLM, another person's using OpenAI's image generation capabilities to turn what they've just learned into this incredible image, another person's using ivvi, for example, to create a mind map and create images and so on. So there are all these different tools. How are they going to use them in class?
Erica: Well, and ironically, I find that now the schools don't let kids use any of those tools.
Darius: No, they don't.
Erica: You know, and they're going to have to get over that.
Darius: No, stop for a minute on that. I've heard people say that, and we say they've got to get over it. But do they?
Erica: Because I think so.
Darius: We've been saying that about typing in class or exams for 30 years. We've now got computers, but you've still got to handwrite an exam, for example. There's absolutely no reason for that. Every student should be able to choose whether they handwrite it or type it on a neutral computer and submit it. But we still haven't managed to do that. You've got to have a blooming learning...
Erica: Here's the other side of that coin: you want to build certain skills. A lot of kids that I'm seeing now that are in high school have handwritten so little. They don't know how to do script, and their fine motor skills are underdeveloped. So particularly for the elementary school kids, they need to develop skills. They need to go back to the rudimentary things, because you want to be able to write a handwritten letter, and you should see the letters I'm seeing kids handwrite these days. They don't know how to do script, and that's a shame. So we have to be careful that we're not allowing them to pick so early that they don't develop the core skills they need.
Darius: Okay, so let's just go back to the school issue for a moment. You made a statement which was something like, they're just going to have to get over it.
Erica: Right, right, right.
Darius: And I've seen schools being so intransigent, as a bureaucracy, as a system, that they're probably about 15 years behind the technology curve before anything actually gets implemented in class. At least 10 years behind; many are 15 years behind. So in the world of AI, what's going to happen is these children are not going to be allowed to use it in class, because the teachers will say, we need to teach you these core skills of handwriting and all the rest of it. And then they're going to go home, and they're going to use NotebookLM, and they're going to use ChatGPT, and they're going to use their Android XR glasses or their Apple Vision Pro lenses, and they're going to be visualizing things on their walls, drawing things in 3D, making things with their hands. They're going to be saying to Google Gemini, look, create an app for me about the solar system, but in the style of football clubs: I want to remember the whole solar system, and each planet is associated with a football club, and I want to see the football club on a planet so I can remember it. And it will create a literal app that they can interact with and manipulate in 3D. And then they can just say, that's really great, export that to my Airtable, you know, or my Google Docs or whatever. And they'll be doing this all natively at home, but at school they'll be living in a different world. That is the reality I see.
Erica: So when people say maybe... I think it'll probably be a combination, because, you know, there are going to be
00:45:00
Erica: those. And I think schools have to change; otherwise, there's going to be really no reason other than socialization to go there. They have to step into the real world. And kids aren't going to be able to focus in a classroom if it's not engaging enough, if it's not speaking their language. We have to speak their language, so we've got to weave these technology tools into the lessons as well. But I love the idea of a teacher allowing kids to process the information how they choose, and then the teacher just gets to go around and connect with the kids. Maybe they could even be in little pods, little groups of kids who learn similarly. All the interactive learners are together so they can interact with each other, and the teacher can go from group to group. And then maybe a student says, okay, I'm done with the interactive group; can I go over to the visual group for a while? So they're almost like little learning stations with all these diverse ways of processing, and the teacher just has to conduct the different ways of processing and allow the kids to go from station to station. That's what I would like to see.
Darius: Yeah. I suppose this highlights for me the two different worlds we mainly live in. You mainly live in the world of children and school, although you do work with some adults. I realize I'm so focused on adults now, with learning differences and how they operate within the workplace, that we are able to deploy these tools so much more rapidly there. We've got the autonomy to say, I'm going to use this tool; we've got the executive functions to stay focused on it, on the whole, et cetera. So I suppose where I see the most deployment, the most efficacy of these tools, is with the parents of children who have learning differences, because then you'll teach your child. That's what we're doing with ivvi, for example, if you don't know: it's this app that I've built, a mind mapping app that takes a real-time recording of a meeting or a lecture and turns it into a classic mind map, or takes some text and turns it into an instant mind map with AI. The thing is, the parents can start using it in the workplace, and then what happens, and this is what I'm observing, is that the parents go, oh my goodness, this is so useful for me. They start doing meetings with it, they start doing presentations with it, they start thinking through projects. Then they switch on to the fact that they go, oh my goodness, I could do a mind map with my son or daughter while planning out a project. We could go through their morning routine as a mind map and reorganize it, get it into the right sequence, do a screenshot of that little section and print it out or paste it next to their bed so they can see the sequence, and it's a visual structure. So there's this cascade from the parent down into the child, and then I suspect the child will use it outside of school, and there'll be a cascade from outside the school into the school, where the children go, Ms. Brody, why can't I just mind map this? Why can't I just do this as an image with ChatGPT? And they're like, well, okay, do a little bit in the corner, and gradually it seeps into the school. I suspect it'll work from the outside in, rather than the administrators saying, we're doing this from the inside out.
Erica: I think you're right. I see it already happening. But you know, it's interesting to pull this around to executive functioning, since it is our executive functioning podcast. I can't wait, and it's already kind of happening, with AI infiltrating calendars.
Darius: Yes.
Erica: You know, I'm really looking forward to seeing what's going to happen there. I think Shovel is a good example of how they're using it. It's a really cool app for high school and college students that helps you keep track of all of your assignments, and they're starting to integrate AI into it as well, to make sure that you've scheduled enough time: did you schedule enough time to prepare for this test or this project? And then it gives you warnings. That's bringing the AI into it. But understand that assistive technology kind of is AI.
Darius: It is. Well, it's been a very primitive, expensive AI; that's the key thing.
Erica: Yes.
Darius: Now it's very sophisticated and cheap. I'll give you an example of learning. I'm learning how to putt in golf,
00:50:00
Darius: putting, that is. I've got a little nine-hole putting green in the local park, and I go out there and putt. The other day I was thinking to myself, I could really do with a caddy, so I switched on ChatGPT and started to talk about the putting. I said, I think this putt is quite a medium putt, but I don't know how long it is. And ChatGPT says, well, you could pace it out with your legs, and then we could turn it into a distance. Oh yeah, great idea. Let's pace it out. It's 24 paces. Oh right, a pace is 2.4 foot roughly, so that's about 60 foot. And I'm going, oh great, that's a 60-foot putt. All right, now I want to raise my putter about 3 foot and then hit it, and I'm not sure if that's long enough or not. And so you have this conversation, and, oh, that went off to the right, and it's acting like a caddy and saying, well, is that because of the roll of the hill? And you start looking at it differently. Gosh, it is the roll of that hill; I never noticed that slight roll, and so on. And often you use all of these different modes of learning without realizing it when you're in conversation. If you're a very experiential learner, you want to hear a beautiful story about someone who did it before. If you're really, like you're saying, sequential, you're like, right, remind me of the steps I used to take before I putt. Oh, right, I place my feet. Oh, I like my putter to point across my toes to the hole. Oh, yes, I forgot about that.
Erica: Can you imagine when you have the glasses and then it's able to say, oh, you forgot this.
Darius: Yes.
Erica: It's going to be able to correct what you left out. Because part of the reason why it's so hard to learn all these new skills is there are all these micro-skills that you have to learn to automaticity. And it could really point out, oh, you missed this one; the reason why it didn't work is you didn't take this into consideration, or your left foot was too far to the right. It's going to be really interesting to see how much you learn.
Darius: In that short eight minutes of doing this with it, I asked it to track how many putts I did, what the lengths of the putts were, whether they broke right or left, whether the green was rough or medium or hard, et cetera. And what happens is it starts speaking back to you, because it's just in your shirt pocket or in your AirPod. And then I started saying, oh, could you stop the encouragements, please? I don't need that; I just need the facts. And then it changes, and it doesn't do all the, oh, go for it, you can do it. It's like, no, no, I don't need that, thanks very much, I'm British. You can do that when it's a really good score; you don't need to do that for every single putt. But the lesson in that is every time you interact in that way, it adjusts and says, oh, Darius doesn't like that type of positive reinforcement, but he does really like it when I remind him of some key steps and say, remember to calm your breathing or whatever. And you're like, actually, I quite like that, thank you very much. I talk to AI like that, and it gradually adapts. And in a way, a lot of these styles of learning that you're unconscious of come out when you're talking and interacting with it, and it...
Erica: Kind of if, if you're conscious enough about it or if it's able to read your subtleties. Yeah, it's interesting. I mean, it's. That's why I like my two assessments, because people aren't always conscious of them.
Darius: Yes, yes. And I wonder if AI will start making us more conscious, because we might start saying to AI, I've just been wondering, is there a pattern in what I'm doing? And you ask for some positive feedback or some critical feedback. Grok, Elon Musk's xAI model, is very good at this. You can turn it into modes where it goes into beast mode and starts totally ripping you down, saying, you know, you're terrible; honestly, you always overthink this, you always do this, I've noticed you do that, whatever. But that's an extreme, for a joke, as it were. The point is, AI will do this when you're ready to get some feedback, and maybe you're strong enough to say, look, I'm ready to take it; I've been doing quite well.
Erica: Oh, but if that becomes our norm, imagine how great people will be. I mean, I never saw
00:55:00
Erica: myself interacting with other people until I started seeing myself in videos. And once I saw myself in videos, I was like, oh, I need to change that. Oh, that's awkward; I do that. Wow, that's weird; why am I making that weird face? And then you change it. So what technology in general is doing is enabling us to see ourselves move through life, so that we can be more conscious of our own pitfalls and ultimately grow faster and grow better.
Darius: I'd say in about one to two years' time, we'll start having private AIs that are not connected to the Internet. They will be on our device, on our mobile phone or on our desktop, probably just on our mobile phone. We're just about to do this with ivvi, making the AI native inside of ivvi so it's there even if you're offline, so it's 100% private. That's a future product that we're going to do, but that's where AI is going. And when that happens, you're going to have private conversations with AI that will not go into the public domain and will stay on your device, and it will become a reflective companion for you. And that reflective companion will be a learning aid; it will be very much like a guide. In teaching terms, teachers shift from being the teacher to the guide. The best teachers are those who guide. And AI is going to operate in that sweet spot of: be my guide, don't be my boss, but be my guide to help me achieve what I want to do.
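(The fully private, on-device AI Darius describes is already possible in an early form. Here's a minimal sketch using Ollama, one open-source tool for running language models locally; the model name is an example, and this isn't necessarily how ivvi's offline AI will work.)

```python
# Minimal sketch of a fully local, private AI conversation using Ollama
# (https://ollama.com, pip install ollama). Nothing leaves your machine.
# Assumes the Ollama app is installed and you've run: ollama pull llama3
import ollama

response = ollama.chat(
    model="llama3",  # an example open model that runs on-device
    messages=[
        {
            "role": "user",
            "content": "Act as a reflective guide: ask me one question "
                       "about how I learn best.",
        }
    ],
)

print(response["message"]["content"])
```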
Erica: Well, yeah. What an exciting time to be alive. Exciting and a little bit scary.
Darius: And it's even more crazy when you're developing that app yourself. I'm finding it amazing within ivvi, because we're now moving beyond just calling on the AI to create the mind map; we're getting to the point where we're going to put the AI inside of ivvi, and then it's going to talk to you and you're going to talk to ivvi, and you're going to discuss the map, discuss what's just been said, and have a conversation with it. And then you're going to say, could you do a different image for that? Could we rearrange the mind map, or simplify it? How do we turn this into something I can remember for a test or an essay? And it just becomes this companion. And we're going to have very specific companions: you're going to have a golf companion, a mind mapping companion, a health companion, et cetera.
Erica: But even within mind mapping, you could have an academic companion, a work companion, a home life companion.
Darius: Yes, you could. I suppose another way of saying it is, instead of companions, maybe guides. Guide folders.
Erica: You know, ways of compartmentalizing, because we're not always the same people with the same needs in different environments.
Darius: Double click on that, expand on that.
Erica: Well, my needs might be different in an academic environment versus in a home environment.
Darius: Yes.
Erica: So, for example, if I had ivvi, I might use it very differently for academics than I would at home. Or I might be using it with my son, who would need it in a different way.
Darius: So you might say, hey, ivvi, be my academic guide. Or hey, ivvi, be my dyslexia coach. Or hey, be my putting coach. Yeah, et cetera. Yes, absolutely.
Erica: Be my pottery coach. Be my tennis coach.
Darius: Yes, I suppose that wouldn't happen.
Erica: Be my sleeping coach. Be my meditation coach.
Darius: Yes.
Erica: My, my yoga coach.
Darius: I don't know. Would that work within ivvi, though? Because in ivvi you would probably have a similar dynamic, but it would be more centered around mind mapping: be my planning coach, or...
Erica: Right. Well, planning is a little different than studying for a test.
Darius: That's right. Yes. Yes.
Erica: So there are little nuances; you are the same person. But it's funny, I get people from time to time, when they take the SPI, who say, well, can I take it a few times? Because when I'm doing math, I approach it completely differently than I do English. So there are some people that are kind of the same across domains, and then there are other people that really morph and change in different environments. It's fascinating. The bottom line is, we just have to be careful not to
01:00:00
Erica: assume that people are the same because they're not. We are wildly different from one another. And it's fascinating how well we do communicate. But I think that there's a lot more miscommunication out there than we're aware of.
Darius: Yeah.
Erica: We often don't even know it's miscommunication. We just create a story that that's what was said. And maybe it wasn't.
Darius: So here's my kind of theory that's developed over this conversation. The schools tell us we've got a learning disability. I believe it's a teaching disability. And that teaching disability will be solved by AI in the home environment first, because the AI will start teaching me in my mode of learning, and I'll probably do more learning after school than in school. And then gradually that will permeate all of school and work.
Erica: I agree for adults, but with children we have to be careful, because the parents may force the kids to learn in the way that they want them to learn. The parents could do it like a homeschooler.
Darius: Yes.
Erica: Yes, a homeschooler could be less accommodating.
Darius: Yeah. I suppose my point is you could include school, homeschool, or any kind of schooling where someone says: I'm going to tell you what you're going to learn and how you're going to learn it. They might think of you as having a learning disability, as having something wrong, as being unable to learn this properly. Actually, they have a teaching disability. And what will happen, and I've especially learned this with dyslexia, is these individuals, whether they're children or adults, will go sideways around the problem and find some sort of tool or technology, and AI will be a massive one. They'll find a technique and go, oh my goodness, I found that if I do this and that, I've learned it. You know, I've got ChatGPT to tell me a story about this as if it's about Birmingham Football Club or Bristol Football Club or Manchester United, whatever my favorite football club is, and it all makes sense to me now because it's in a context I understand, while my teacher doesn't understand football or soccer. They go and find a way to do it, and then they create the framework. So they will end up using AI to take the dis out.
Erica: Well, you know, what it comes down to is we're all going to have to become experts on ourselves. It goes back to, I think, our last podcast: we have to really learn who we are and what our strengths are, so that we can navigate them and advocate for ourselves.
Darius: Yes, yes.
Erica: I think one of our biggest problems in education is that kids aren't aware of how they learn best.
Darius: Yes.
Erica: And so they don't even really know how to advocate for themselves. They're stuck in the disability, in a place of learned helplessness, when in fact we've really got to learn how to navigate our challenges, which is exactly what our last podcast was about. And we've got to be aware of them not subconsciously, but consciously.
Darius: And kids are going to do that. So if they're really experiential learners, they will say to the AI, make a game for me out of this. And literally within one minute, AI can create a fully functioning Tetris game for you. Then you can say, I want it to be a three-dimensional Tetris game, and it turns it into a three-dimensional Tetris game that you can play there and then, with scores. You're playing, you're doing something, and then it's a matter.
Erica: Of time before you can say, all right, I want you to create a Tetris reality around me.
Darius: Yeah, okay, it's true. But the kids will start expecting there to be a gamification of whatever they're really engaged in. And what they'll do is start learning what their strengths are: I really love it when it turns it into a song.
Erica: Yeah, they'll be able to turn learning into activities they love. And if they love it, they'll be motivated. And if they're motivated, then they're going to be successful.
Darius: Absolutely, absolutely. And they'll discover that outside of the class and they'll bring it in. And they'll discover that outside of work and they'll bring it in.
Erica: Let's hope. Let's hope that we can bring that into the class. I mean, it's challenging, but let's hope. I love the idea of having little learning stations where they can dabble. Because, you know, the other thing we have to be careful of is we don't want to let people go down a rabbit hole to the point where they are ignoring other skills that they really need to develop. I can think of a student that I had: brilliant kid, high marks and everything, but terrible at visual-spatial skills, and he avoided them all his life. And then they haunted him; they were tripping him up in high school, and we had to go back, and in a summer we took him from the second percentile to the 98th percentile. But he had to face it. You know, if we avoid things that we don't like.
Darius: Yeah, we.
Erica: We can disable ourselves.
Darius: Yes, yes.
Erica: Later on in life. It's interesting. We've kind of gone full circle here, but yes, great discussion.
Darius: I wonder. I think Google is in a very strong place with Google Classroom, because if I were Google, I would create an AI for Google Classroom: work instructions and so on. Imagine you put Gemini 2.5 into the classroom with certain restrictions on it. It automatically depersonalizes every person in the classroom, so when it sees Gemma Barnes, it turns it into GB and anonymizes it automatically. It only keeps all of that information within the school, so it becomes a school database rather than a Google database, and it becomes like a classroom model for the teacher that only the school has. And they put all these kinds of restrictions in there: safety restrictions, privacy restrictions, all of that. The child would probably, and I'm just thinking aloud right now, be used to using Gemini at that point, let's say Gemini 3 or Gemini 4, which would be incredibly intelligent. And then they'd come into the classroom and have a classroom version of Gemini 3 or 4, which would say, are you sure you're really meant to be exploring Pokémon Go strategies right now in class? That sort of thing. Do you know what I mean?
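As a concrete illustration of the depersonalization step Darius sketches, here is a minimal example of replacing roster names with initials before any text leaves the school's systems. The roster, names, and function names are all hypothetical; this is a sketch of the idea, not a description of any actual Google Classroom feature.

```python
import re

# Hypothetical class roster; in a real deployment this would come from
# the school's own records, never from the AI provider.
ROSTER = ["Gemma Barnes", "Liam O'Connor", "Priya Shah"]

def initials(full_name: str) -> str:
    """Reduce a full name to initials, e.g. 'Gemma Barnes' -> 'GB'."""
    return "".join(part[0].upper() for part in full_name.split())

def depersonalize(text: str, roster=ROSTER) -> tuple[str, dict]:
    """Replace each roster name with initials before the text is sent
    anywhere, and return a mapping the school keeps locally so the
    teacher can reverse the substitution."""
    mapping = {}
    for name in roster:
        alias = initials(name)
        mapping[alias] = name
        text = re.sub(re.escape(name), alias, text)
    return text, mapping

note, key = depersonalize("Gemma Barnes needs extra time on the essay.")
print(note)  # "GB needs extra time on the essay."
# `key` stays in the school database, never sent to the model provider.
```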
Erica: Yeah, yeah. There'd be a way of monitoring what people are doing, how they're doing it, their level of engagement, excitement, joy, learning.
Darius: And if they are doing a Pokémon Go project, for example, and not just getting distracted by Pokémon Go in class, which I would be, they can go up to the teacher and say, the AI is saying no right now; could you authorize it? And she would say, yeah, authorized, and they carry on. I'm just thinking aloud here, but I think there is actually a way this could naturally come into classrooms that appeases the issues schools have with safety, security, and distractions.
Erica: It would change teaching. Teachers would now be facilitators of learning.
Darius: Yes.
Erica: Instead of teachers of learning. And that would work. You can't teach 35 kids, but you can facilitate learning for 35 kids very successfully.
Darius: Yes, yes. I suppose you could think of the AI in that setting as being the teacher's aide.
Erica: Right.
Darius: The teaching assistant. So you've got the facilitator's.
Erica: Assistant.
Darius: Yeah. This is Mrs. Jones's assistant. So you can ask Mrs. Jones's assistant in the class, the AI. Maybe it's got a nickname: class, what will we call Mrs. Jones's assistant this year? We'll call it Rainbow. And so you can ask Rainbow about your project, and Rainbow will tell you something that Mrs. Jones would approve of, and so on. So it would be quiet.
Erica: It's an interesting little theory that we've been riffing on.
Darius: Maybe we could do a classroom ivvi, because a lot of teachers are asking for ivvi in the classroom. What they really like is the audio recording, the transcription, and the mind map, so if they're giving an instruction, the children can see it in the mode that they want, or hear it as audio.
Erica: Yeah, they can share it with them on the smart board as it's happening.
Darius: Yeah, that's what they want: on the smart board. And then when they go home, the parents have got the link to those instructions, so they'll see them sequentially and hear them spoken, and the teacher doesn't have to type out a document. They just say, this is what I said in class. It's maybe five minutes long, a homework explanation or whatever, and it'll go home with the homework, or maybe be put into Google Classroom. That would be fun.
Erica: There are all sorts of interesting things. All right. This was great.
Darius: That was good.
Erica: Great to see you as always, Darius, and thank you for joining us today.
Darius: Yeah, see you next time. Sponsored by ivvi. Imagine turning your meeting's audio into a live mind map instantly, so you remember what matters. It's ideal for students and managers with dyslexia or ADHD. Try ivvi for free now at ivvi.app. That's ivvi.app.
Erica: Sponsored by learningspecialistcourses.com: courses and resources that support educators and coaches.
Darius: Thank you for joining us at the Executive Function Brain Trainer Podcast.
Erica: Check out our show notes for links and resources and follow us on social media.