
Can Smart Glasses Revolutionize How We Learn Languages? | Cayden Pierce | TEDxMIT

May 03, 2024
Alright, hello everyone. In November 2023 I flew across the planet to Shenzhen, China, a place that many people call the Silicon Valley of hardware, and as a researcher who makes a lot of hardware, I knew this was a place I needed to learn more about. But I grew up in an English-speaking country and didn't speak a word of Mandarin Chinese. I thought it was going to be fine; of course, people travel to countries where they don't know the language all the time. But I personally thought I would be uniquely prepared to deal with the language barrier, because my research over the past few years has focused on smart glasses, and my team and I had created a custom language translation app for smart glasses.
This is an app that listens to the speech around you in any language, translates it into your native language, and overlays it as text in your vision, so you can understand what other people are saying. So I landed confidently in China, walked out of the airport to a taxi, and the taxi driver turned around, looked at me, and said some words that I didn't understand, but they were immediately superimposed on my vision: I could see "Hello, where do you want to go?" At that moment I was dizzy; I felt like a cyborg.
I could understand what was happening around me, and I started to confidently explain to him where I was going, my hotel and where it was. The taxi driver looked at me, confused, and I realized at that moment that, for some reason, my Chinese taxi driver wasn't wearing smart glasses. So of course I took my phone out of my pocket, opened a more classic machine translation app, and we fumbled around for a couple of minutes trying to understand each other. Eventually he was able to understand what I said and we were on our way. But over the course of the next few weeks of my visit to China, I started to fall in love with the culture there, the people, and the crazy fast pace. I found that almost all of my interactions were a bit stuck on this translation issue. These improvements I had brought with me, which I thought were going to help me become better and smarter, were actually kind of a detriment that got in the way. I thought I would wake up with my high-tech smart glasses and be like Tony Stark in China, but I ended up being more like a cyborg Mr. Bean. At that moment I decided I was going to learn to speak Mandarin Chinese, and when I made that decision I had to change, in my head, the technology I was building.
I had to switch from building something that would communicate for me to something that would teach me how to communicate. Since then, I have been working on smart language-learning glasses, and to be honest, I have probably spent more time working on the language-learning glasses than actually learning the language, but I like to think of it as progress on both fronts. So how are language-learning glasses different from translation glasses in the first place? Why not run the translation all the time and learn the language that way?
Well, the problem is that if you run the translation all the time, what overlays your vision becomes a crutch. If everything you hear around you in the foreign language you're trying to learn is presented to you in your native tongue, you never actually spend time trying to process, understand, and translate that language yourself. Imagine that every time you go swimming you use floaties: you might be able to jump in the pool from time to time, but you will never actually learn to swim. So the translation functions are a crutch, but they are also useful. We don't want to get rid of translation completely, because it helps you understand information and can help you grow your vocabulary over time. So one of the first features we created for the language-learning glasses is unknown-word translation; you can see this here on the top left. This is a system that does not translate everything you hear; it only translates the words you do not know. Imagine that you travel to a country where the language you're trying to learn is spoken, and someone says something like, "Do you want to come to the library with me and my friends?" If you are an advanced beginner, you may be able to understand most of the words they said and most of the grammar, but you may have missed just one or two words. Maybe you missed "library," but that word is essential for understanding what the person is telling you. So with this system we translate only that one word you don't know.
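As a rough sketch of how an unknown-word filter like this might work: keep a set of words the learner already knows, and only send the rest to translation. The vocabulary set, the toy dictionary standing in for a real machine-translation backend, and the overlay format below are all illustrative stand-ins, not our actual implementation.

```python
# Words the learner is assumed to already know (illustrative).
KNOWN_VOCAB = {"do", "you", "want", "to", "come", "with", "me",
               "and", "my", "friends", "the"}

# Stand-in for a real machine-translation service.
TOY_DICTIONARY = {"library": "biblioteca"}

def translate_unknown_words(sentence: str, known_vocab: set[str]) -> list[tuple[str, str]]:
    """Return (word, translation) pairs for the words the learner doesn't know."""
    overlays = []
    for token in sentence.lower().split():
        word = token.strip("?.!,")
        if word not in known_vocab:
            # Only the unknown word is translated and overlaid in vision;
            # everything the learner already understands is left alone.
            overlays.append((word, TOY_DICTIONARY.get(word, f"<translate:{word}>")))
    return overlays

print(translate_unknown_words(
    "Do you want to come to the library with me and my friends?", KNOWN_VOCAB))
```

Here the learner would see only "library → biblioteca" in their vision; the rest of the sentence passes through untranslated.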
You can fill in the blank, understand what the other person is saying, perform at a slightly higher level than your fluency allows, and develop your vocabulary over time. I'm actually running this app on my glasses right now, and our friend the mannequin is running it on his smart glasses; we've attached a camera to the mannequin's eye, and I'm going to show it to you very quickly. Yes, it is working live. This is a Russian example: imagine you're someone who speaks Russian, you're trying to learn English, and you're traveling to an English-speaking country. We might turn it up a little bit so you can see it better.
The system is translating only the rare words that I say; you can see it looks beautiful, and this is real and live. The system translates only the rare words I say into Russian, so the wearer can practice their ability to interpret things said in English without worrying that the system becomes a crutch, something they rely on too much. If we could go back to the slides: another system we have developed is for when you are not actually traveling to the country whose language you are trying to learn, and instead are trying to practice at home. The best way to practice a new language is to listen to the language and to speak it, but we don't always have access to people around us we can talk to, and that is why we are developing contextual AI conversation tools.
This is a system on your wearable device, on your smart glasses, that will detect different times throughout the day when you have the opportunity to learn a little bit of the language. Maybe you're going for a run by the river, maybe you're running some errands at the supermarket, and it will spark a conversation with you that, on the one hand, is at a level of fluency you can understand, simple enough for you based on what you've learned so far, but is also contextually relevant. We know that people learn languages best when the information is relevant to the situation they are in, so the system will take your GPS location and information about what you are doing and strike up a conversation about it. Maybe you're walking around town and it asks you about the landmarks around you, or maybe you're at the grocery store and it's as simple as asking you what foods you're picking up.
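A minimal sketch of how such a context trigger might assemble a prompt for a conversational AI: map the detected place and activity to a topic, and pitch the conversation at the learner's level. The place categories, the level scale, and the prompt wording here are all invented for illustration, not the system's actual design.

```python
def conversation_prompt(place: str, activity: str, level: str, target_lang: str) -> str:
    """Build a contextually relevant, level-appropriate conversation prompt."""
    # Hypothetical mapping from detected place to a conversation topic.
    topics = {
        "supermarket": "the foods the learner is picking up",
        "riverside": "the scenery and landmarks along the run",
        "home": "the learner's plans for the day",
    }
    topic = topics.get(place, "the learner's immediate surroundings")
    return (
        f"Start a short spoken conversation in {target_lang} at {level} level "
        f"about {topic}. The learner is currently {activity}. "
        f"Use only simple vocabulary the learner is likely to know."
    )

print(conversation_prompt("supermarket", "running errands", "A2", "Mandarin Chinese"))
```

A real system would feed this prompt to a conversational model and speak the reply through the glasses; the interesting part is that the GPS and activity context, not the learner, chooses the topic.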
But the context is not just the GPS location; it's not just where we are, it's the information of the whole world around us. And what does that look like? If we can play this video: we have also developed systems that can create an immersive artificial world. You know, children every day ask their parents what this and that is, trying to develop a vocabulary that helps them understand what the things around them in the world are called. We can create that kind of immersive artificial world with smart glasses too. This is a demo from a couple of years ago where we use the glasses' point-of-view camera to run recognition of the objects in the world around us and then translate the labels of those things, so we can annotate the world we see.
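That annotation demo can be thought of as a tiny pipeline: camera frame in, detected objects out, each label translated and placed back in vision. Here is a toy sketch where the object detector and the lexicon are hard-coded stubs standing in for real vision and translation models.

```python
def detect_objects(frame) -> list[dict]:
    # Stub: a real system would run a vision model on the camera frame.
    return [
        {"label": "cup", "box": (120, 80, 60, 60)},
        {"label": "book", "box": (300, 150, 90, 40)},
    ]

# Stand-in for a translation service (English -> Mandarin, illustrative).
TOY_LEXICON = {"cup": "杯子", "book": "书"}

def annotate_frame(frame) -> list[dict]:
    """Pair each detected object's box with a bilingual overlay label."""
    annotations = []
    for obj in detect_objects(frame):
        annotations.append({
            "box": obj["box"],
            "text": f'{obj["label"]} / {TOY_LEXICON.get(obj["label"], "?")}',
        })
    return annotations

for annotation in annotate_frame(frame=None):
    print(annotation)
```

The display layer of the glasses would then draw each `text` near its `box`, so the wearer sees the name of every object in both languages.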
It surrounds us with the names of everything, helping us develop our vocabulary quickly. This area is very interesting to me for another reason: smart glasses are advancing very quickly. Just a few years ago, smart glasses were bulky, heavy, and uncomfortable to wear, but they have quickly developed to be comfortable enough, light enough, and stylish enough to be our everyday glasses. These Vuzix Z100 glasses that I am wearing now are my everyday glasses: I have my prescription built in, and I wear them every day like I used to wear my dumb prescription glasses. There are a lot more use cases I could talk about in terms of learning languages with smart glasses, but I want to quickly jump to the future, to where this technology could go.
As we build and discover these wearable super-learning tools, much of my research in the Fluid Interfaces group at the MIT Media Lab is about combining smart glasses and neurotechnology to help us learn better and faster. In neuroscience we've known for a long time that when you hear a word you don't know, or see a word you don't know, your brain emits a very specific event-related potential: a small signal that tells us you encountered a word you are not familiar with. The same applies to confusion, which we can detect with EEG when you are confused about something, and I can verify that this confusion algorithm works for me, because it always seems to light up when someone mentions quantum physics or my love life. But imagine applying that to the systems I'm talking about today. Imagine taking that unknown-word translation function in the glasses, which just guesses which words you don't know, and instead having ground truth from brain data, so we know the words you don't know, when you get confused, and the grammar that slows you down, and can fill in the blanks exactly when you need them, in the moment. And this not only helps you understand things in the moment; it could also help you over a longer period of time, because with that information about your confusion we can create a personalized study plan based on exactly where your knowledge gaps are, so that when you sit down to learn that new language, or whatever you're learning, you can focus on the things you need most. But the applications of neurotechnology to create these super-learning systems don't stop there.
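To make the unknown-word ERP idea concrete, here is a deliberately oversimplified sketch: average EEG activity in a late window of each word-locked epoch (roughly where an unfamiliarity-related deflection would appear) and flag words whose response exceeds a threshold. The window, threshold, and data are invented; a real pipeline would need artifact rejection, proper baselining, and per-user calibration.

```python
def mean_window(epoch: list[float], start: int, end: int) -> float:
    """Average amplitude of an epoch over a sample window."""
    window = epoch[start:end]
    return sum(window) / len(window)

def flag_unknown_words(epochs: dict[str, list[float]],
                       window=(30, 50), threshold=2.0) -> list[str]:
    """epochs maps each heard word to its (already averaged) EEG epoch.

    Words whose late-window response exceeds the threshold are flagged
    as likely unfamiliar to the wearer.
    """
    flagged = []
    for word, epoch in epochs.items():
        if abs(mean_window(epoch, *window)) > threshold:
            flagged.append(word)
    return flagged

# Fake epochs: a flat response for a known word, a large late
# deflection for an unfamiliar one.
epochs = {
    "library": [0.1] * 30 + [4.0] * 20 + [0.1] * 10,
    "friend": [0.1] * 60,
}
print(flag_unknown_words(epochs))  # → ['library']
```

The flagged words are exactly what an unknown-word translation feature would want as ground truth, replacing its guesses about the wearer's vocabulary.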
We've known for a long time about auditory evoked potentials: the idea that when someone is talking to me and I hear them speak, the speech signal and the electrical signal in the audio-processing part of my brain will correlate. This has been known for quite some time, but in the last few years some neuroscience researchers asked: what if we turn this around? While you listen to a speech signal, we record it, transform it a little bit, and then inject it into your brain with a small amount of brain stimulation called transcranial alternating current stimulation. What they found was surprising: people were better able to understand, remember, and pay attention to the spoken information they received. So I'm very excited about the idea of applying that kind of idea to language learning: embedding brain stimulation in glasses and using it while someone is learning a language.
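One small piece of that protocol, extracting the slow amplitude envelope of a speech signal (rectify, then smooth), can be sketched as below. The window size and the toy signal are illustrative, and this is nowhere near a complete, calibrated, or safe stimulation pipeline; it only shows the kind of slow signal that would be transformed and delivered as stimulation.

```python
def speech_envelope(samples: list[float], window: int = 4) -> list[float]:
    """Rectify the signal, then smooth it with a trailing moving average."""
    rectified = [abs(s) for s in samples]
    envelope = []
    for i in range(len(rectified)):
        chunk = rectified[max(0, i - window + 1): i + 1]
        envelope.append(sum(chunk) / len(chunk))
    return envelope

# A toy "speech" burst: silence, a loud oscillation, silence.
signal = [0.0] * 4 + [1.0, -1.0, 1.0, -1.0] + [0.0] * 4
print(speech_envelope(signal))
```

The fast oscillation disappears and what remains is a slow rise and fall tracking when speech is present, which is the feature the stimulation waveform would follow.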
Imagine trying to listen to a stream of speech in a foreign language you don't know that well, and being able to better understand and pay attention to that information using this simple brain stimulation signal. It's worth mentioning that this project is incredibly early; there are still many questions to be answered and a lot of research to be done. It's actually so early that I'm as confused as you are about why I'm here now. But I'm very happy to share these ideas with you, and I am very grateful to my collaborators at the Media Lab, my PI Pattie Maes, and my collaborators in TeamOpenSmartGlasses for all the help so far in developing these tools. And I would like to point out that this does not apply only to learning languages.
Language learning is something that excites me very much, but these kinds of interventions can be applied to all kinds of areas of learning and education, and to the understanding of information in general. Today many of us are concerned about the direction technology is taking us, especially in terms of our information, our attention, and our learning. People are worried about the devastation of attention that is happening with social media, and worried about how the technologies of the future and the social media of today are affecting children's education. But I would like to present a different narrative: a narrative in which we build educational and learning technology that is as addictive as the social media we have today, so that people can learn better than ever and humanity can become more than ever before. Thank you.
