Is Merlyn Safe for Schools? A Conversation on Privacy & Security

This is a podcast episode titled, Is Merlyn Safe for Schools? A Conversation on Privacy & Security. The summary for this episode is: Our founders started Merlyn Mind with an unwavering belief in privacy, so we do things differently. "That's the fundamental premise of this company," says Ravi Kokku, Co-founder and Chief Technology Officer of Merlyn Mind. "We want to give these advancements of AI to classrooms, but we want to do it safely in terms of security and privacy." Go deep with Ravi as he discusses how Merlyn accommodates the unique needs of education, including the privacy and security needs of schools.
Why the focus on education?
00:36 MIN
How can AI simplify a classroom?
01:16 MIN
Using AI so teachers can focus on students
00:23 MIN
Focusing on privacy and security in classrooms to ensure safety
00:49 MIN
Initial reaction to Merlyn in the classroom
01:01 MIN
Team effort on all fronts
00:42 MIN
What is deep learning?
02:57 MIN
Data and Merlyn Mind
02:26 MIN
Decision to discard data
01:17 MIN
Data for support purposes
01:18 MIN
Second form of data
01:21 MIN
Deleting the data and serving the schools
01:13 MIN
Protecting student privacy
01:00 MIN
Helping others feel safe with technology in the classroom
01:37 MIN
Bringing safety to voice-enabled applications
02:37 MIN

Levi: Welcome to Unsupervised Learning, where we bring members of the Merlyn Mind team together to talk about artificial intelligence, technology, and education. We hope you'll enjoy these conversations and learn something with us. Let's learn. Okay. Ravi Kokku. Welcome to the show. This is a really important episode because we're here with our Chief Technology Officer Ravi, and we're going to talk about Merlyn Mind and privacy and security, and how we have thought about that from the beginning of our company to now, and how we're thinking about it going forward. So, Ravi, could you just introduce yourself and tell us what does it mean to be the Chief Technology Officer at Merlyn Mind?

Ravi: Sure, Levi, thank you. So, yeah, I'm a technologist at heart and I do a lot of architecture and design decisions at Merlyn Mind. I'm a computer scientist by background, and I've done a lot of networking and operating systems research in the past. And I've also applied AI technologies in education over the last seven years now.

Levi: Okay, wonderful. So let's dig in there. Today, the topic of conversation is going to be education, AI and education, voice technology and education. So let's start with why education? Why are you focusing your life and all of your effort for the last seven, eight years, and today in particular with Merlyn Mind, on education?

Ravi: Yeah, so our goal at Merlyn Mind has been to make AI solutions that assist in making human processes more efficient. And we think that education is a space where human processes are most impactful. Especially education touches everyone's lives in the knowledge economy of today. And teachers specifically play a significant role in the education process. So at Merlyn Mind, we believe that if we design the technology really well, it can assist in making the teachers productive, and less stressed, and more connected with their students and classrooms.

Levi: Okay. Love that. Let's actually go back and even just touch on some of those things. So what does it mean to make these processes easier? To automate them, to simplify them, what does that actually mean for a teacher? Well, just talk me through, you're in a classroom, you're a teacher. What are you dealing with that is so overwhelming, and why could AI help simplify some of that?

Ravi: Yeah, it's a very, very interesting question of exactly what happens in the teaching process. The moment a teacher enters a classroom, imagine, especially in a K-12 environment, there are 20 to 30 kids in the classroom. And the teacher's job is first to make sure that the classroom is organized to even start imparting the lesson. And then she has to set up all the technology that she needs to even begin the process. And then as she goes through her lesson, she's juggling any number of applications before the lesson is completed. It can include showing videos, showing specific content, specific exercises, running assessments. There's a lot of work that happens in the classroom during that process. If teachers get busy trying to deal with this technology, switching between all these different technologies, there's a very high chance that they lose control of the classroom. So this is why I think the jobs to be done in a classroom are very important, and any assistance that we can provide to teachers in order to make that process more efficient can be highly valuable to make teachers very effective, and also less stressed out in the process.

Levi: Okay, great. So the whole foundation of our company is basically, we believe teachers are incredible at helping students learn, but so much gets in the way. We're going to use AI and some of the best new technologies, like voice technology that allows teachers to interact more naturally with computers and technology and applications, so they can focus on the students, and teach and use technology very naturally and simply to accomplish their goals. So let's now talk about why is that difficult, specifically in terms of building cutting-edge AI technology for education?

Ravi: Yeah, totally. You already touched on one of the most important problems of the field. There are many challenges in classrooms, especially in K-12 classrooms, with both adults and children being in the environment and personally identifiable data being captured in the process. Any technology maker needs to be well aware of what data we are touching with respect to privacy and security, and how we are dealing with that data. These classrooms are environments that are very dynamic, and so much happens in the classroom. And there are many laws that apply to these environments, and we need to be aware of and comply with all of them, like COPPA, FERPA, GDPR, the state-specific laws, et cetera. And the sooner a company takes up a focus on privacy and security, the better prepared it is to make sure that the technology is safe in the classrooms.

Levi: Okay. That's a fantastic overview of just how much you have to think about if you're going to build an education-specific piece of technology. Before we go into that, let's just look back a little bit. So I've been at this company now for a little more than three years. You guys founded it a little over four years ago. From day one, privacy and security were something we were thinking about. And I think there's a reason why. Right? So I've heard you tell stories, and our other founders Satya and Sharad tell stories, about when you first came up with the seed of this idea to bring voice and AI into education, you started telling everyone. Not everybody believed that this was something that was actually possible. Right? Can you tell me more about the response you got from people as you went out and told them?

Ravi: Yeah, it's interesting that by the time we started this company, voice assistants had already started appearing in different environments, especially at homes. And people were getting used to using voice in all these environments. And there were some efforts to take some of these voice systems into classrooms, and there was news around it saying, there are so many laws around this, you cannot use it. So when we said that we were going to take voice into the classroom, people were basically saying, "Good luck with that." So there was a lot of skepticism and caution around taking this technology into classrooms. And we were also cautious. But at the same time, we were optimistic that if we did things right, we could actually take it into the classroom. And we really wanted to design with that in mind as the fundamental starting point.

Levi: Yeah. I'm not an engineer, I'm not building this incredible cutting-edge technology that brings voice into the classroom, but I've seen just how much work it takes from our team. What people probably don't understand is we have this team of innovators, and technologists, and creators who are pushing the boundary on everything that's possible technically to make this work in classrooms. But we've had almost equal amounts of effort to make it work in classroom environments from a privacy, and security, and legal perspective. It's increased the difficulty of doing this dramatically. But to your point, we believe it's worth it. Can you just go back to why is it worth it? In the end of it all, if we accomplish what we want to, what are we going to achieve and why was all that work worth it?

Ravi: See, ultimately, there are a lot of concerns around voice. As a parent of a 14-year-old and an 11-year-old myself, when you talk about any kind of technology in the environment that children are in, you always think about, okay, definitely there is technology. In some sense, there is data being touched, there is data being gathered. And there will be some implications of that data being collected or stored. It's basically about using that data responsibly and nothing beyond that. If you provide that and you earn trust around how you are handling that data, I think any technology will have a chance to be used anywhere. And if you take a step back and look at where the whole technology is going, voice as a modality is something that has been driving a lot of interfaces in the last 10 years. And in the next five years, if you see, many, many solutions will be voice-enabled and multimodal. Now, if we can actually give that chance to classrooms ...

Levi: Right.

Ravi: Right. And that's the fundamental premise of starting this company, to say that we want to give the same chance of all these advancements of AI to classrooms.

Levi: And just so we're clear, when we say multimodal there, what we mean is you can interact with the solution in whatever way is most natural to you. You can speak to the device, you can touch it on a touch display, you can push a button on a remote control, you could type something on your computer, basically creating an ecosystem where you can interact with our assistant Merlyn and the automations and the simplifications of the workflows that you have, opening things, teaching things, playing videos, in whatever's way simplest for you. So that requires integration, and data, and orchestration across all of these. Let's talk a little bit about data, because you said people get worried when they think, wait, you're getting my data. That's mine. I don't want you to have my data. But data is a broad term that can mean anything. Right? So let's talk about AI generally. First, before we talk about what we're doing specifically at Merlyn, talk about why artificial intelligence is so hungry for data, and what kind of data is most useful when you're trying to improve AI algorithms.

Ravi: Sure. Yeah. So there have been a lot of advancements in the last decade or so, especially with respect to deep learning technologies. The whole point of deep learning technologies is that there are these fundamental neural networks of different configurations that take data and learn patterns from it. Once you have enough representative data flowing through these networks during the training phase, they learn models of what it means to do a particular job.

Levi: So if I-

Ravi: [inaudible] inference. Yeah.

Levi: As a layman, let's say I don't understand what deep learning means. Basically, what I hear, what I understand now after learning from you, is that before, you would've had to really be very specific and tell the computer what to do. If this, then that. When you see this, do this thing. You have to create rules and say, "This is how you should act in this situation." But that doesn't scale, and so you're basically letting the computer now learn from a set of data, saying, "Go look at this and learn on your own the patterns and what looks like what, and what's similar to what." Is that a fair kind of representation of the advancement?

Ravi: Yep. Yep. Let me give a very specific example to contextualize what you are saying. Think of a dog, right?

Levi: Okay. A dog. I have a dog.

Ravi: A dog. Yeah. Right. The moment you see a dog, think of the number of forms dogs come in. There's just a huge diversity of them.

Levi: Big, small, fat, skinny, black, white, yellow.

Ravi: Right, yeah. And with all kinds of different backgrounds. They could be in any number of backgrounds. So when you say something is a dog, it actually is a very complex process for our brains to arrive at the decision that, yeah, that's a dog. And we, our brains [inaudible].

Levi: My four-year-old, five-year-old, he could recognize any variety of dog. No matter if he's only seen one, he would look and say, that's a dog.

Ravi: Exactly. Right. Yeah. So our brain actually goes through a very complex process. Machines are not there. They don't have that reasoning capacity as much as our human brains do. But what machines are really good at, especially deep learning technologies, is that if you give them enough samples of dog images, they can automatically derive patterns. You don't actually have to say that when it looks like this, it's a dog. When it looks like this also, it's a dog. You don't have to do all those individual specifications, how their eyes should look, how their ears should look. All those patterns are automatically learned through this deep learning process. So that's the high-level idea of how the deep learning process creates patterns underneath. Once you have those patterns, you can apply them in any real-time context, saying, "Now if I have this new sample, can I say what this new sample is?" The same thing is true for voice, for example. To convert a speech signal to text in order to be able to process it on a computer, you actually need to give the system enough speech samples to be able to convert the voice to the particular text. So over time, over a number of samples, it basically ends up learning the patterns of our speech, of the sound signal that indicates specific phonemes, and as a result, specific words, and as a result, specific sentences.
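Ravi's dog example, learning patterns from labeled samples instead of hand-written rules, can be sketched in a few lines. This toy single-neuron classifier is purely illustrative: the feature names, data, and training settings are invented here, and real deep learning systems use many layers and vastly more data.

```python
# Learning a pattern from labeled samples, instead of writing if-this-then-that rules.
import math

def train(samples, labels, epochs=1000, lr=0.5):
    """Fit a single logistic neuron to labeled feature vectors via gradient descent."""
    n = len(samples[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability of "dog"
            err = p - y                       # gradient of the log-loss
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z)) > 0.5

# Hypothetical features: [has_fur, barks, has_wings]
dogs = [[1, 1, 0], [1, 1, 0]]
birds = [[0, 0, 1], [1, 0, 1]]
w, b = train(dogs + birds, [1, 1, 0, 0])

# The model was never told "fur plus barking means dog"; it derived that pattern.
print(predict(w, b, [1, 1, 0]))
```

The point of the sketch is the one Ravi makes: nobody wrote a rule about eyes or ears; the weights were derived from the samples.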

Levi: Yes.

Ravi: Right. So yeah. It's another example of the deep learning process.

Levi: Okay, great. So then let's talk about the data that we are actually capturing. Because when people think of, "Oh my gosh, I don't want you to have my data." And that caused a lot of concern. When you think about bringing voice into the classroom, and when everybody told you, "Good luck," that's going to be a problem you can't solve, what's different? What are we actually doing in a classroom? What data do we need, and how are we treating things differently? Because we're Merlyn Mind, built just for education. We didn't build this for the home, or the car, or your phone.

Ravi: Yeah. So, like I was saying, it's in general true that the efficacy of AI depends on data, and the more representative that data is, the better the performance will be of any particular technology, whether it is speech recognition, image recognition, or any other AI task that you have. But what the Merlyn Mind solution leverages is the fact that all these advancements that have happened in the consumer space can actually be directly applied when contextualized to the kind of jobs that need to be done in the classroom, through some kind of over-the-top customization, for instance. Learning speech recognition of the English language does not depend on whether it's happening outside the classroom or inside the classroom. Ultimately, what the speech recognition is learning is how you say each individual word, what the conversion is from a signal to an individual word. Now, you take that, and when you apply it in the classroom, you actually don't need any new data to say, "Oh, for the classroom environment, I need speech." All you need to know is, when these words are put together in the context of a classroom, what does that mean? So you don't need voice for that. You just need the context of a classroom instead.

Levi: This is a really important point, right? And this is actually a fundamental difference in how Merlyn Mind operates. An AI company dealing with voice in most cases really wants to have the audio recordings of voices so they can improve their models, change their models, use those voice recordings, and keep them forever to keep using them and keep iterating on them. And so what did we do differently? What did we decide to do with the audio recordings of teachers and students who interact with Merlyn?

Ravi: Yeah. So since we are leveraging all the advancements of AI that have happened in the consumer domain, we actually end up leveraging all the models that have already been created for speech recognition. So we don't need voice data to train new models anymore. What we need is just the context of the classroom. And in the context of the classroom, when someone utters a command, you just need the speech signal to convert that into text. And then you discard the voice data; you don't need to store that data anymore.

Levi: So this is critical. Discard the voice data, meaning we made a really difficult choice in our company to say, Merlyn's there. You say, "Hey, Merlyn," and it understands what you say. "Hey, Merlyn, start my presentation." That "start my presentation" goes from Levi's audio voice into text that the computer then uses in our algorithms with Merlyn to go and start the presentation, and do what I asked it to do. But we decided to delete Levi's voice. You make the command, we delete the audio recording of my voice, because we're not going to use that. Because that's not what we're there to do. That was a big, huge decision in the beginning of Merlyn Mind, to say, this is what we're going to do differently to make it work in education. And we're off to the right-

Ravi: Yeah, yeah, yeah. And that one decision really made the process of adhering to all these laws a lot simpler. The moment you don't have voice, which is personally identifiable data, adhering to the laws becomes a lot easier.
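The transcribe-then-discard flow described above can be sketched as a simple pipeline. This is a hypothetical illustration, not Merlyn Mind's actual implementation; `transcribe` here is a stand-in for a real speech-to-text model, and the file handling is simplified.

```python
# Sketch of a "convert to text, then delete the recording" pipeline.
import os
import tempfile

def transcribe(audio_path: str) -> str:
    # Stand-in for a real speech-recognition model.
    return "start my presentation"

def handle_utterance(audio_path: str) -> str:
    """Convert speech to text, then delete the recording unconditionally."""
    try:
        text = transcribe(audio_path)
    finally:
        os.remove(audio_path)  # the voice recording is never retained
    return text  # only the transcript moves on to command handling

# Usage: create a (fake) recording, process it, and note that the audio file is gone.
fd, path = tempfile.mkstemp(suffix=".wav")
os.close(fd)
command = handle_utterance(path)
print(command, os.path.exists(path))
```

The `finally` block is the design point: the recording is deleted whether or not transcription succeeds, so no code path leaves audio behind.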

Levi: Okay, great. So then let's go on to who controls ... So another big part of data, and student privacy, and security in schools and complying with the laws like COPPA, FERPA, GDPR, et cetera, is whose data is it? So even though we delete the voice data, there's still a transcript of me saying "Start my presentation." Because we have been built for education, how do we do things differently? Who controls and owns all the data that we have access to?

Ravi: Yeah. All the data that gets collected actually takes two forms. Here's the first form. Any product that you make and take in front of customers needs some amount of data to be able to provide either the features themselves, or support, or debugging. And in order to provide specific support and debugging of any of the features for the classrooms, you end up having some amount of identifiable data, saying that this particular person did utter this command, and the system did this. So that is an identifiable category of data that we don't touch unless someone actually asks for support or debugging.

Levi: Okay. So Levi calls, I'm a teacher. And I say, "Hey, I was trying to use Merlyn in the classroom and I kept saying, 'pair my device,' and it wouldn't pair. What was wrong?" And Ravi, as my helper at Merlyn Mind says, "Oh, let me look." And you go into the logs and you look up, oh, I see what was wrong. You had a network error, it wasn't connecting to the internet. Let's work on your network connection. That data that you just accessed, we don't use that to improve our product, to improve our algorithms, do anything, because that's the school's data. And we only use it for their benefit. Is that correct?

Ravi: That's right. Yeah. And the second part of the data, actually, [inaudible]. See, the system is about naturally interacting with technology using voice. There are different ways in which people say or ask for different things. Let's say I want to switch between different displays. Someone might say, "Switch to HDMI." Someone else might say, "Just show my laptop." Someone might say, "Get me HDMI 1." There are different variants in which people would want to get some service done, get some job done. The de-identified data that we use is for identifying those kinds of variants, so that the different ways in which teachers want to get their jobs done can actually be improved.

Levi: So this means whether it's Levi the teacher, Ravi the teacher, or all the other millions of teachers that are using Merlyn, those commands, "switch to HDMI," "go to HDMI," "give me HDMI," all of that comes into a set of data where we look at all of those in combination, and we have no idea who said what. We say, "Oh, we need to build our system to better work for all of these different ways to accomplish that goal," because lots of people do it in different ways.

Ravi: Yeah, exactly. Yeah. So fundamentally it's to improve the function that the system that we designed itself will provide to the teachers.
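The HDMI example above, many phrasings mapping to one classroom job, can be sketched as a small intent resolver. The intent name and regex patterns here are invented for illustration; a production assistant would typically use a natural-language-understanding model trained on de-identified variants rather than hand-written patterns.

```python
# Mapping many phrasings of the same request to one canonical intent.
import re

# Hypothetical intent table: each canonical job lists the phrasings observed for it.
INTENT_PATTERNS = {
    "switch_input_hdmi": [
        r"\bswitch to hdmi\b",
        r"\bgo to hdmi\b",
        r"\bgive me hdmi\b",
        r"\bget me hdmi\b",
        r"\bshow my laptop\b",
    ],
}

def resolve_intent(utterance: str):
    """Return the canonical intent for an utterance, or None if unrecognized."""
    text = utterance.lower()
    for intent, patterns in INTENT_PATTERNS.items():
        if any(re.search(p, text) for p in patterns):
            return intent
    return None

# Usage: three different phrasings all resolve to the same job.
for phrase in ["Switch to HDMI", "Just show my laptop", "Get me HDMI 1"]:
    print(phrase, "->", resolve_intent(phrase))
```

This is exactly what the de-identified transcripts enable: growing the list of variants per job without knowing who said any of them.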

Levi: Okay, great. So then for the data that is connected to my device, I'm a teacher I'm using Merlyn. I call and I ask, "Hey, can you help me with this?" We go back and we look at that. Let's say for whatever reason, I say, "Hey, I don't want you to keep any data from me." Or, "I'm leaving this school. Can you delete my data?" Can they do that? Does the school have control over the data? Or is it our data?

Ravi: So any personal information that is stored for the purposes of providing the features of the system, or for support or debugging, any of this information, if the school comes and requests the data, we can give it. If the school comes in and says, "Delete the data," we will delete it within a specified amount of time. And when a contract with a school ends, we also delete all the school-related data that we can identify.

Levi: Okay. And this is really important, right? Because if you look at some of the fear and concerns over voice and AI and bringing it into the classroom, people often conflate it with some of the concerns in the consumer world. And what happened in their homes and on their phones. When we started Merlyn Mind, we started it as a company, built for education, to comply with the privacy and security requests and needs of schools. We don't use that data for anything except to support the schools in delivering Merlyn, to help them, and to help make it better for them. And if at any point they want their data back, they want to delete it, any data that we have collected, it's their data. We agreed to that in our contract with the school, we're here to serve them. We're not here to make money off that data, or use that data in any other way. Is that correct?

Ravi: Exactly. Yeah. And that's the fundamental premise of us saying that, yes, we are committed to providing privacy and security for this domain. And as a result, we do not want to use the data in any purpose other than what we specify to the schools. That as a part of the feature, or as a part of the contract. And we do not sell the data, we do not use it for advertising, et cetera.

Levi: So that's one of the reasons why it matters to have companies built specifically for education. Why focusing on education and bringing AI and technology and education with this intent allows us to do it differently. But let's just look at why it matters. That's a lot of extra steps. It's setting up the company correctly, it's building and processes, it's making hard decisions, like not to save audio recordings from teachers or students or people using Merlyn. Why does protecting privacy generally matter? And specifically, why does protecting student privacy matter?

Ravi: Yeah. Yeah. I think, like we've been talking about, firstly, it's just the law. By law, you are required to provide a number of controls around privacy and security, and especially around student data. But more importantly, I think it's not just a technical question, but an emotional one. We are talking about students here, who are one of the most vulnerable parts of society, and it is their right to be protected. This is why a number of laws are in place to protect students and student data before we do anything with it. And it's the right thing to do. And we should strive to set an example for all the future generations, because, very importantly, increasingly everything around us is getting digitized. Everything is online. So the earlier we take these steps, the more we set a precedent that this is the right way to do it. This is the right way to handle data.

Levi: Okay. Thank you for sharing the why. I think we often get lost in the contracts, the legal language, the terms. And we forget that there's humans behind both of these things. We have teachers, and students, and administrators, and parents who really care about their responsibility to safeguard their students. That they have care over and need to make sure they're safe. And then on our end, we have humans building this technology who really care about helping teachers, and helping students, and helping administrators. And we're trying to all work together to do this right, so that everyone has access to the best the technology can do to help today. So what would you say to an administrator, a parent, a teacher, or a student who may have concerns about this? Who may say, "Wait a second, you're going to bring a voice assistant into the classroom? Is it safe? Am I okay? Is my data safe?" How would you respond to that as the CTO of Merlyn Mind who's thinking about this every day?

Ravi: Yeah. I think it's exactly to your point. It's very personal in nature. Everyone has some association with education, some association with classrooms. Either they themselves are in the classroom, or their children are in the classroom, or their grandchildren are in the classroom. So whatever applies in general, in terms of laws, in terms of best practices, it actually hits close to home. So we really need to be sensitive to that sentiment. And the whole premise of this company was exactly grounded in those principles, saying, how do we make sure that people will feel comfortable using what we develop around them? Not losing the possibilities that technology is opening up, and at the same time feeling safe that the technology around them is going to remain safe for them. Yeah. So there are a lot of controls that we are putting in exactly to make sure that we deliver on this. And this is something that we think is very important to make sure that this technology succeeds, and really puts classrooms at the forefront of technology.

Levi: And we haven't done it alone. Correct? For years now, we've worked with experts in privacy and security, legal experts, subject matter experts in education. And I think this is probably just a good reminder that doing these things right takes collaboration with customers. We talk to schools, and administrators, and IT staff, and we talk to lawyers, and we talk to privacy experts, because this is an ongoing battle. We're going to be doing this forever.

Ravi: Yeah, totally. Yeah. Over the last year, we have talked to legal experts and privacy experts. And we have been on top of different kinds of litigation to understand what people are not doing right. And understanding what the laws are, and even understanding how complex the laws are across different states, and how different they are. And answering long questionnaires from different schools, and really understanding what they are finally concerned about, and how we can make them feel safe that we will take care of all these things. And we make sure that we are able to help teachers, and that school administrators, who have an innate urge to help their teachers succeed, feel comfortable too. That's when they will bring any technology into their environments. And it's a continuously evolving activity. Privacy and security are not something where you can do this once and then sign off saying everything is done. This is a continuous process, but we are committed to getting it right.

Levi: So let's just look at that. Let's fast forward to next year, five years, ten years. We're going to keep evolving. We're going to keep changing. Student data privacy laws are going to keep evolving and changing. What technology can do is going to change. How do you see our commitment staying the same? How do we keep going in this direction and doing it right?

Ravi: Yeah. So, at a fundamental level, Merlyn Mind's technology is about bringing in multimodality. There is the notion of voice, plus touch, plus remote control, plus keyboard. There's a combination of interaction technologies coming into place, which will be commonplace everywhere in the world. In fact, voice will play a huge role around us in the next decade-plus. At the same time, to a point that you made a little while ago, in terms of previous interfaces, the touch interface has been there for a few decades now. Before that, keyboard interfaces were there, and mouse interfaces before that. When they came, the previous technology did not completely go away. If you see, keyboard technology is very old, but keyboard technology is still there. So any new interface that comes ends up blending into existing interfaces. And to your point a little while ago, different interfaces are good for different jobs, different activities. Even voice will reach maturity and will coexist with all other interfaces, and will be commonplace everywhere. In five years, what we want is for this interface to be in classrooms too, so that this advancement that is everywhere around us is in classrooms also, while making sure that it's both private and safe. And yeah, in general, the whole solution that we are building at Merlyn Mind, for teachers to be untethered from the computer, and able to move around and be close to students while interacting with the computer through a variety of these interfaces, is nothing short of a revolution. The whole fact that teachers are much closer to students, untethered from technology, is a big, big jump from where they currently are. And even if you talk about all the different kinds of technologies that get used in classrooms, like applications, there are many applications that get used. And all those applications can actually benefit from voice interfaces.
And that's where Merlyn Mind takes on the notion of a platform. If Merlyn Mind is there, Merlyn's devices are in the classrooms, Merlyn's interfaces are there in the classrooms, we can pretty much voice-enable all those applications safely. So that's our vision.

Levi: This is a powerful vision. It's an exciting future. I have committed my life to it. You've committed your life to it. We're doing this all day, every day. I just want to thank you, Ravi, for making time to come and have this conversation. And for anyone listening, I just want to express how special it really is to get Ravi's attention and focus on sharing this with the world. Ravi has so much to do with building this amazing technology and bringing it to life. But because he's closest to the data, and closest to our technology, and understands the decisions that have been made and how we've aligned them to work with privacy and security, getting him to have this conversation with me, I felt, was one of the best ways for teachers, administrators, parents, anyone who has questions about how we're thinking about privacy and security, to hear it from the source. We just heard from Ravi the way Merlyn Mind approaches privacy and security, why it's so important, and why it makes it possible to bring AI and voice tech into the classroom the right way to help teachers. The future looks really bright. I'm very excited about it. Thank you, Ravi, very much, for joining us today.

Ravi: Thank you, Levi.

Levi: Thank you for joining us for this episode of Unsupervised Learning. Until next time, keep learning.

