Don't Be Frightened of Student Data Privacy - Best Practices to Follow


Protecting student data privacy remains a priority for local and state education agencies across the U.S. The high number of education technology solutions in use and evolving state legislative requirements make it challenging for districts to manage tools and maintain compliance.

Video Transcript
Welcome. "Don't Be Frightened of Student Data Privacy" is the topic for our webinar, with Halloween right around the corner. My name is Rob, and I work at LearnPlatform. We're going to do some introductions in a moment. And just before we get started, of course, yes, this is being recorded.

That does not mean you should hang up right now. You should enjoy the conversation. You should participate in the Q&A, which we will keep open. We will try to answer your questions as they come in, or have the panelists address them at the end.

We're not going to share a ton of slides. This really is designed to be a conversation. We've got a unique perspective on student data privacy that we're excited to share with you all today. So, yes, the recording will be sent out to you and to your colleagues who may have signed up and, unfortunately, are not in attendance. So who is LearnPlatform? Just to start things off and discuss LearnPlatform's role in this space and this discussion.

LearnPlatform helps support K-12 stakeholders in generating evidence and using that evidence to inform decision making. We're building out edtech effectiveness systems at a number of different levels: we work with districts, states, and education service agencies, as well as now providers in the vendor space, to help providers increase visibility and build their evidence base to support education partners like yourself. How do providers better understand whether or not their product is doing what it says on the label? And since LearnPlatform started tracking edtech usage and engagement in 2018, we've seen the number of edtech tools in school districts rise dramatically.

And obviously, with school closures, that number accelerated rapidly, so it's no wonder we're having these conversations around student data privacy. What we have found is that there's real staying power in the types of tools being used, even though the pandemic has subsided. I'm not sure we can say it's done yet; knock on wood, we're getting there.

The number of edtech tools really hasn't come down, and the rapid expansion brings a number of implications around data privacy and redundant resources. We've got a funding cliff coming down the pike, so how do you make better-informed decisions about what tools to start, stop, and continue? Some visibility into this work is available from LearnPlatform: if you go to our website, you can take a look at our EdTech Top 40, which outlines the tools in use at an average school district. Again, this is what brings us here today: to discuss what we really should be thinking about when it comes to student data privacy and edtech use in K-12 settings. And with that, I'm going to introduce my amazing panelists. We have Sara Kloek, who is the Vice President of Education and Children's Policy at the Software and Information Industry Association. You may have heard of SIIA before. She works with education technology vendors, or providers, to understand state and federal legislation and the laws that impact the education space. She was previously at the U.S. Department of Education in the privacy office and has worked on education policy in the United States Congress for a member representing her home state of Minnesota.

Thank you, Sara. Thanks for joining. Next we have Jill Bronfman, who is the Privacy Counsel for Common Sense Media, one of our partners within LearnPlatform. We feature some of their privacy evaluations, which they create to help educators choose products that are more privacy protective. She comes from an academic background and has corporate experience as well, and her experience as a parent has also helped shape her concern for students.

And then Melissa Tebbenkamp has served as the director of instructional technology and is now the Chief Information Officer for Raytown Quality Schools, where she's been since 2006. She was among the first fifty in the nation to earn the CETL designation from CoSN and has led Raytown to become one of the first seven districts in the nation to earn the TLE Seal, the Trusted Learning Environment Seal. If you haven't explored that, Melissa is a wealth of information and can provide more background. So we're going to begin our panel discussion here, and I'm actually just going to stop sharing because we don't need the slides right now. I'd like to start things off. Sara, working for SIIA really gives you a national perspective, potentially beyond that.

But a unique perspective, if nothing else, where you're working with school districts, really boots on the ground, hearing some of those stories, but also looking bigger picture at what you're hearing from the vendor space, the provider space. Can you kick things off, Sara, around what you're seeing as far as some of the trends that are out there and where things are headed? What are you hearing from the providers, if you will? Sure. Yeah. I'm going to give a little bit of background on student data privacy just to level set, because there's a lot going on. It's a complicated legal framework, and it continues to evolve from year to year.

And oftentimes month to month, depending on the time of year and whether state legislatures are in session. I'm going to start with the federal level. At the federal level, we have the Family Educational Rights and Privacy Act, known as FERPA. That was one of the very first privacy laws in the U.S., passed in the 1970s, and it is intended to protect student privacy in the classroom. With respect to education technology, and Jill will go into this a little bit more about what they're doing at Common Sense Media, in many situations there is typically a contract between a tech company and a school whenever that technology company has access to student data.

FERPA requires that data accessed by technology companies be used only for the purpose for which it was disclosed, and that the data remain under the direct control of the school. If a tech company does not abide by the contract with the school and those provisions required by FERPA, then the school could be banned from using that technology for five years or more. And no tech company wants a press release out there saying their product was banned from a school. That is not a positive press release.

FERPA is actually intended to be the floor for student data privacy in the U.S. If you've been following the federal conversation on consumer privacy legislation, you've probably heard the word preemption and a push for an overall consumer privacy law. The education space is very, very different; our educational system in the United States is built on local control. So we have FERPA, which is the base level of student privacy protections. We also have the Protection of Pupil Rights Amendment and the Children's Online Privacy Protection Act at the federal level protecting learners, with COPPA, for instance, protecting those under the age of thirteen.

At the state level, since 2013 or so, dozens of states have passed legislation to add protections on top of the federal laws. California, for example, passed SOPIPA in 2014, which prohibits the sale of student data, prevents companies from using student data to advertise to students, and requires certain security measures. Many states since then have also passed student privacy laws, some similar to SOPIPA, some different; we've seen laws passed in Connecticut and New York, and most recently Minnesota passed a student privacy bill this past spring. All of them are intended to protect student privacy. Melissa will go into this a little bit more, but at the local level, districts and schools can set their own best practices and requirements for protecting student data.

We will likely see additional legislation introduced at both the state and federal level in the coming months and years intended to protect student data privacy, children's privacy, and teens' privacy. So the landscape is ever evolving; if we have the same conversation in, say, six months, I would guess there will be additional things to discuss and additional laws and implications in the education space. Like I said, ever evolving, ever changing, a really interesting space to follow, and very complicated. So kudos to everyone on this call interested in learning how best to navigate this space. Yeah, a hundred percent.

This is the first step. It's just like when you want to work out: just getting to the gym is the hardest part. So, folks on the call, you've made it.

Folks watching this recording, good on you. Watch at 2x speed and enjoy the chipmunk voices. So, Jill, how do we do this work? How do we start getting our hands wrapped around this entire system? Maybe I'm a small district and I don't have a lot of resources. Where do I go? What do I do? And what's the work that you've done, and continue to do, at Common Sense Media around your privacy evaluations?

Sorry, I was on mute. There's been a lot of attention historically on things like stranger danger and data breaches, but with our privacy evaluations we're trying to push a little bit beyond those issues, although we do address them, to also look at the risks and harms associated with sharing personal information for commercial purposes. So we're looking deeply at what kind of data is being collected about kids and why, how companies are collecting the data, and what happens to that data after they collect it. For example, and this is behind the scenes, we have a new report on virtual reality coming out soon.

It's in copy editing right now. So we're not only looking at the software and the privacy policies like we usually do, but we also did device testing. We previously had a report out on streaming services, which you may have seen. And basically what we're trying to do is start from scratch.

Like Rob says, what if you understand the basics of what Sara just described? What if you understand, okay, there are privacy laws, there are privacy regulations, and we have to follow them. How do we apply that? How do we figure out what we should do for a small district or school? Maybe you're just an educator, or even a parent, wondering whether to use a product or a service, and if you use it, how to protect your student's or your kid's privacy.

So our privacy evaluations are available on our website, which I just shared in the chat: privacy.commonsense.org. You can go right to those: click on the button to look at the privacy evaluations, which examine the privacy policies and terms of use to see what the companies say they're doing to protect privacy. It's not easy. It's not easy to look at all of these privacy policies and terms of use, and some of the larger companies, like your Amazon or your Google, have eight to ten different policies to filter through, and we feel your pain.

We feel the educators' and parents' pain, imagining looking at all these documents, which are legal documents and sometimes hard to read, never mind the quantity issue. So our privacy evaluations are intended to be a shortcut for educators and districts wondering what to use, or already using something and wondering how to use it in a way that better protects privacy. And we can look at specific issues. Sometimes districts will come to us as part of the district consortium and say, we're concerned about this particular issue; we're concerned about whether the information that kids are providing, or parents are providing, or teachers are providing, goes to third parties.

So you can look not just at our overall evaluation score or rating, but at particular issues in the standard privacy report that we put out, again, for free on the website, and examine the specific issues that concern you or that are on your checklist when you're getting products for your district. That's great. I used it when I was formerly in the district, and I know many out there also use it.

Great work, and we're going to want to circle back on that; I'm curious to hear what the next phase of that is and where your direction is headed next. But okay. So Melissa, we know about the federal laws, and we've got some state laws sprinkled around here and there. None yet in Missouri, I notice, although there's been some tracking, and everyone's starting to sense that it's probably coming, in one way, shape, or form, for their respective states.

But okay. So why is this hard? You've got resources. We understand the law. What's the challenge? Just use these things, right? Where do you want to start, Melissa? Yeah.

Yeah. That's a loaded question. Sorry. That's a loaded question. Be frightened by it.

That's right. It's hard to find a place to start because, as Jill said, we have the published privacy policies and terms of service, but sometimes, not always, the ones submitted to Common Sense or reviewed aren't what the companies actually include in the contract when it comes to the schools, or they have additional contract language that rides on top of the terms of service and privacy policy that they send to the schools or that they publish. And what I have found is that maybe there's a separate privacy policy and terms of service for schools, or the one that's published online is really geared toward the parents or the general consumer, not a school district using the platform. So it gets complicated. And I love the answer that the U.S. Department of Education's Privacy Technical Assistance Center always gives us, which is: it depends.

And it really does depend on the company and on how we approach the contract side of it, because we do need to make sure that we maintain direct control. What I have found more and more is that even though the terms posted are seemingly okay, there's always that magic line at the bottom that says "we reserve the right to change these terms at any time." And when they can change their terms and their privacy policy, then it can be argued that we don't maintain direct control. So how do we as districts navigate this? There are consortiums out there. There are model districts.

There are model contracts. And there's a lot of entities out there trying to help us get to an okay place. Trying to help the small districts, trying to help the ones who don't have the law background because let's be honest how many tech directors or lawyers how many, you know, CIOs or administrators in a school or lawyers, or how many are even sending those contracts through that channel. And instead, maybe it's the curriculum person just signing off on what is sent to them on their sales agreement and not realizing what they're getting into. And so where do we start? We start by understanding the laws and attending webinars and doing trainings and their are a lot of great resources out there to help us get an understanding of the laws and what our roles are because Furba applies to schools.

FERPA does not apply to our vendors, and we need to make sure that we understand our requirements and then find trusted partners. If you're partnering with a consortium or with an organization to help you through the contracts, make sure they understand your nuances. I liked what Sara said, that these federal laws and even some of our state laws are the floor. Missouri does have one law, by the way. We have one. It states that if we have a data breach of a student record, we must notify our attorney general and our state department of education. That is our privacy law.

So at least we've gone that far, that we have to notify if there's a breach, but it's the floor. And because it's the floor, that doesn't mean it's what we should aspire to achieve. We have our laws on one side, what we legally can do, and on the other side we have what we ethically should do and what our community expects of us. That looks different in my community than it might in Minnesota or New York or California, or even from Raytown to a district thirty-five miles away; what is expected and what is considered ethical and right by your community might be different. So make sure that whatever you're doing to be compliant also meets the expectations of what is considered acceptable risk and acceptable use of data and data privacy within your district and your community, because that could look different in my town versus your town.

And so I need to make sure that I'm doing right by my students and my children and my families, and not just adopting what you did because it's the easy way. I wish I could give an easy answer, but I think looking at it as a whole is really important, and understanding our journey and our responsibilities through that process. Yeah. Thank you.

There was a lot there, and I probably could jump in on many things. But one thing that circles back to me: I talk to tech directors and CIOs, folks across the country, mostly the Midwest is where I work, and I often ask them, did you sign a data privacy agreement with XYZ company, or did you read through the privacy policies? And if it's the latter, they are reading through these policies and spending hours and hours and hours, and they are not lawyers. So I guess, how do we convince folks that, A, there's a little bit of an easier way, and B, that you should be doing it anyway? You shouldn't just bury your head in the sand and accept the provided terms, that kind of thing.

So I guess I'm really just wondering, like, hey, convince me that I should do this work and that it's a valuable exercise. I'll say that what Melissa is navigating is difficult. I'm imagining her situation, like, almost juggling documents, a virtual representation of what she was talking about. And this is where I usually say: I'm a lawyer, but not your lawyer. That's usually what lawyers say when they're about to give legal advice.

The two phrases that Melissa mentioned that jump out to me are "order of precedence" and "incorporated by reference." Those are two things to look out for, and here's why. The order of precedence issue is like what Melissa was talking about: when you sign a contract, it jumps on top of the terms and conditions that are online. The terms and conditions that are online are generic and they're for everybody. And then Melissa might have some expertise because she's done a lot of these contracts.

Right? Or sometimes a larger school district will have some experience or expertise, and they get much better terms than you can get if you're just a smaller district or an individual relying on what you see online. So that negotiated contract will often say it has a higher order of precedence; it takes precedence over the generic online terms.

And then the other tricky thing, the other expression to always look out for, is "incorporated by reference." We get into the weeds on this with our privacy evaluations at Common Sense Media. Sometimes we have to just cut them off, because we can only look at so many documents and so many policies before we'd spend the next month evaluating one company. Sometimes you'll see in an online privacy policy, "this document incorporates by reference," and it'll list seven other policies. So you will sometimes see in our privacy evaluations, as I mentioned for the larger companies, that we look at a CCPA policy, because it has separate terms for California, or a GDPR policy, because they give separate terms for users in the EU, or a separate children's privacy policy.

To the best of our ability, we try to include in our privacy evaluations all the documents that they've incorporated by reference in their privacy policy. Sometimes it's a lot. But it's important to look at all those ancillary extra policies for two reasons. One, because they count; the company is saying they count, that they're going to use them.

And two, sometimes we have to make a judgment call, because while we're super focused on student privacy at Common Sense Media, we also want parent privacy and teacher privacy, just regular human privacy, to be protected. So we look at that tricky language in the privacy policy. If it says this only applies to children, or this only applies to residents of the EU, we're like, mm-mm.

No. No. You don't get points for that. We want privacy for everyone, and we really only give them full credit if they have the privacy standards for everyone, not just a little group of people getting special treatment. So, I love that "incorporated by reference" point.

I had a company that I was working with, and they had so many referenced documents, and within those referenced documents they referenced other documents; their contract was forty-two pages between their terms of service, privacy policy, and all of their referenced data governance and this and that. And I sent it back to them and said no, figure it out, because there were so many conflicts within the documents themselves and in the order of precedence. And this is where I caution folks.

I'm not an attorney. I do not pretend to be an attorney. I work a lot with contracts, and I have got it down to where I don't spend hours and hours and hours with every contract, because there are some that we just say no to, because they're not okay in the way that they're handling our data.

So those are the easy ones. Also use resources like Common Sense Media, use consortiums, to help eliminate the ones you know you can't work with: like, you got an F rating, we're not going to be able to work with you. Let those be easy wins. But you have to look at what comes on top of the data governance addendum that might already be signed with the consortium, because if they have a contract they're having you sign, that takes precedence over everything they just agreed to in the data governance addendum.

And now they're changing the governing law and jurisdiction to wherever they're located, not your state, or they're limiting their liability and excluding any liability or coverage that they gave you in their data governance addendum. Those are ways that I've seen companies be sneaky: we're going to say this supersedes, and I'm probably not using the right legal terms, and I'm going to limit liability to what you've paid me in the last twelve months. So if there's a data breach: no, I have no liability to you, because, by the way, you bought five years up front and we're in year three, so you paid me zero in the last twelve months, and you have no coverage and no remediation. So we have to be careful about that.

And you have to form a partnership with your legal counsel to make sure you're doing what's right for your district. Don't try to do it alone. I spent years working with my legal counsel before I really started working on contracts by myself. And I think Sara had something. Then, Rob, I want to go back to how you convince people to do this work, because I have an analogy that I use.

Yeah. I want to reiterate that a complicating factor in a lot of this is that there's no single strongest state privacy law and no single strongest federal privacy law. And lawyers love to use a lot of words; we're seeing that with some of the California regulations that are coming out. There are a lot of words in there for us to go and read.

But a lot of the complication is that New York will have some requirements that you need to have specifically in your contract, and there are requirements in Connecticut for specific things you need to have in your contract. So you'll see vendors adding various extra terms and things like that; it's because the states are requiring them to do it. They're not just doing it for fun. And it's not an easy thing.

It's not an easy thing for Common Sense Media to go through. It's not an easy thing for Melissa to go through. It's not an easy thing for me to go through. But there's a reason for it, and vendors typically are not building products specifically for each school district. They'll build one product.

So you'll see those forty-two pages of documents built out to reflect the requirements of a specific state or district, because they are required to do so. It's just the reality of our system and the nature of how things work; unfortunately, it makes it really hard. You know, really hard. Maybe I'm pointing out the obvious here, but I know the California law, and I don't think the other laws have this restriction either: the California privacy laws don't say, okay,

this only applies to California, so you can't use this standard anywhere else. So, following up on what Sara said, don't forget that all of these laws and regulations should be the baseline, the absolute minimum a company should do. And there's no law that says they can't apply the better California standards, or better other-state standards, or even the EU standards, to everyone. There's no rule that says they can't apply these better standards to everyone.

And as a matter of practicality, having worked at a large tech company as a lawyer, it's sometimes easier for them to do that. You can argue to them, hey, it would just be easier if you gave everyone the higher standard, because then your whole technical back end can be simplified. So that's kind of where we're coming from with the Common Sense Media privacy evaluations: we're saying we'd like everyone to meet not just the legal standards but the higher standards. Absolutely. So, Melissa, did you want to add to that, or did you want to convince us why we need that data privacy piece of paper that's going to protect us? In addition, you mentioned the data privacy agreement and the contract.

And kind of making sure that those two really jibe. Any specific guidance around that you can throw in too? Because I think much of the conversation really does get into, well, I signed a DPA with them, but are they hiding other things in that purchase agreement that potentially override what was in that data privacy agreement? I threw a number of things at you there; you can treat it as a choose-your-own-adventure.

Okay. So I'm going to start with the analogy first. Years ago I was on a mission to get all of my teachers to buy into my passion and to believe that we needed to do this for our students, and I had to find a way to reach our teachers.

I don't like scare tactics. There's a time and a place sometimes to scare folks, but I think we need to be real. So I was trying to figure out how to reach my teachers, and just kind of off the cuff I said, would you ever let a stranger walk in off the street, come into your classroom, and interact with your students? And of course my teachers always say no.

We would not let a person we don't know walk in off the street into our classrooms and interact with our kids. And so: well, how do you know that's not happening? We as schools take so much time to protect the physical safety of our kids. We put in what we call guided entry, where you have to come in through a set of doors, you're funneled through the office, and you only get let into that school if the secretary vets you and buzzes you in. We don't let people just walk in.

And we do that because we care so deeply about the physical safety of our kids. We do training for active shooter preparedness. We have emergency operations plans. We have physical safeguards that we put in place. Right? We not only invest in security guards, but we invest in things to keep our kids safe.

I head up safety, physical safety, for my school as well. We put so much time and effort into the physical safety of our students. And you know what? When something happens to violate that, we know immediately, for the most part. We know immediately that something has happened to violate the physical safety of our kids. But we don't always put in that same effort to protect the digital safety of our kids, which, by the way, is the social-emotional safety of our kids.

This is where kids can get preyed upon and we don't know it. And if we don't put that same amount of effort into protecting the digital safety of our kids, we're doing such an injustice to them. We don't always find out immediately if something violates kids' online privacy, or if kids are targeted because we connected with a bad vendor who is now preying on them. Not all of them are like that, but we know there are bad actors that create software just because they want connections to our kids.

Or we allow the students' data to be sold off because we're not in compliance and we're not making sure we're vetting our resources properly, or there's a data breach and that information is now out with whoever has it, and we don't always know that, or almost never know it immediately. We should prioritize the digital safety of our kids just as much as we prioritize physical safety, because otherwise what we are doing is allowing strangers off the street to have access to our students and our students' data. And if we think about what our students are sharing online when they're writing, when they're engaging with resources, think about the writing prompts alone that they might engage with in some of our online systems, the information they're sharing, and how our students could be profiled. We need to protect them.

We need to protect our children, and we should make that just as much of a priority. When I have had those conversations with my teachers, I have had some come up to us in tears, not realizing what they could potentially have done with some of the resources they're using online, asking students to share things that we would never want shared with strangers. We need to understand what we're asking our kids to do online and what data those tools have access to, and make sure we're doing that in a very safe way. So that is the story that I tell. That is the analogy.

That is how we get the personnel aligned to it. That's how we get teachers to understand. Our teachers don't want to harm our kids. They don't want to put our kids in danger. They don't want to do bad things.

They don't want to set our kids up for a data breach. They just don't know better. And when we start teaching them, people who are trained know better, and then they do better. That's where we need to reach them. Okay.

What was the other question? What was the next one? I'm sorry, that's my soapbox. Oh, I mean, it's a harsh reality, and I think one that, when folks talk about cybersecurity, often seems overlooked to some extent, because technicians want to talk about penetration tests and multi-factor authentication. Obviously those are important components, but this is the social engineering aspect of things.

This is the softer side of cybersecurity risks. We talk about it as an organization, but I think we're also out there educating folks on the importance of this. Oftentimes the conversations I'm having are about helping bring this onto the radar of people who may not necessarily think about it, or who think it's an easy undertaking. I think everybody is aware of the data privacy concerns to some extent, but where the rubber meets the road is really in the, well, how do I do this? How do I shrink this down into bite-sized pieces? I do have a question around that, but is there anything you wanted to add from your perspective? Sara, you came off mute, so I just want to give you some space. Sure.

Yeah. I wanted to talk a little bit about something I've been thinking about lately: I think there are two different kinds of products we're talking about. There's education technology that is built for use in the classroom and built for schools, and then there's technology that happens to be educational but is not necessarily built for use in the classroom and doesn't give you the requirement that both Melissa and I talked about, the direct control of student data. That might be brought in by a teacher, which is probably why Melissa needs to use some of those scare tactics and how LearnPlatform helps schools identify what products are used in the classroom. I'm looking at the Common Sense privacy evaluations right now.

We could all probably agree that Candy Crush Soda Saga is not an educational technology tool and not designed for use in the classroom. But it was reviewed; it got fifty-two percent. If you run Candy Crush Soda Saga through your curriculum committee, through your parent committee, the parents may have some issues if you're deciding to use it in your classroom to help students learn about, I don't know what they'd be learning about, colors, maybe ordering, gamification. I don't know.

Maybe it's used in a game development course or something like that. But I think it's important to recognize that there are products built for use in the classroom that help schools uphold their privacy obligations and are built by engineers who understand, and in some cases are trained on, the privacy laws. There are others that may be educational, maybe not, maybe they're Candy Crush, and I hate to pick on Candy Crush Soda Saga, but I was clicking through the Common Sense privacy reviews, and that's a very clear example that helps a lot of us think about this. But there are others on here that might be educational and do require that additional review, because they might not be designed for use in the classroom.

How is that company protecting the data? How are you as a school making sure that you're implementing policies that protect student data? Are students using aliases? Are they not signing into it? Are they just using the product and not sharing any student personal information? I think it's important to think about those two different buckets when you're building out your privacy program, because the reality is both are used in classrooms, and how are we making sure that student privacy is protected in both buckets? Probably don't use Candy Crush Soda Saga in your classroom. Yeah.

That raises a lot of issues with regard to, and there's a Q&A question on this as well, how does Common Sense Media interact with the companies? Are we just putting out our privacy evaluations for the schools and districts or parents to look at, with no interaction with the companies? We actually have a lot of interaction with the companies, and some of it is extremely positive. Like, thank you so much for doing this work.

And going through and looking at our privacy evaluations. Sometimes I have a lovely call with some marketing people or even the legal team from one of the tech providers, and they're like, our privacy policy says that? Oh no, we're going to fix that ASAP, because that is not what we're doing. So we do have those very positive interactions where we feel like we're making a difference, and when you work for a nonprofit, that's always a good thing.

And we sometimes have negative interactions with the companies, I will admit, where they're quite astonished. To answer the Q&A question: no, we do not have a notification system where we put out a press release to the public or even individually notify companies that they have either a brand-new evaluation or a refreshed evaluation. We do refreshes, and that's an important point for anybody from a company who's listening in: we don't just put out one evaluation. The content people at Common Sense Media will review a movie or a television show and just put it out there, and then it's done; they don't generally revise it unless something is in error.

But we actually regularly refresh our privacy evaluations, and you all know why that's necessary: because the privacy policies, as Melissa mentioned, change rather regularly, and sometimes without warning. So we do have software that lets us know if there's been a change, when there's been a change, and what the changes are, like a redline of the old policy versus the new policy, which is awesome because we have awesome engineers; I did not invent that myself. And, yeah, we have these one-on-one interactions with the companies. Unfortunately, we don't have what I call a tickler system where the public gets a notification that something's been changed. We're not at that level because we're a very, very small nonprofit team.
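The change-monitoring workflow Jill describes, keep a saved copy of each policy, check whether it has changed, and produce a redline of old versus new, can be sketched in a few lines. The sketch below is a minimal illustration under those assumptions, not Common Sense Media's actual tooling; the URL, cache path, and function names are hypothetical.

```python
# Minimal sketch (not Common Sense Media's actual system): detect and "redline"
# changes to a published privacy policy by diffing the current page text
# against a previously saved copy. URL and file paths are hypothetical.
import difflib
import pathlib
import urllib.request

POLICY_URL = "https://example-vendor.com/privacy"        # hypothetical policy page
CACHE_FILE = pathlib.Path("policy_cache/example-vendor.txt")

def fetch_policy_text(url: str) -> str:
    """Download the policy page and return its raw text."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def check_for_changes() -> None:
    current = fetch_policy_text(POLICY_URL)
    CACHE_FILE.parent.mkdir(parents=True, exist_ok=True)

    if CACHE_FILE.exists():
        previous = CACHE_FILE.read_text(encoding="utf-8")
        if previous == current:
            print("No change detected.")
            return
        # Unified diff serves as the "redline" of old policy vs. new policy.
        diff = difflib.unified_diff(
            previous.splitlines(), current.splitlines(),
            fromfile="old_policy", tofile="new_policy", lineterm="",
        )
        print("\n".join(diff))
    else:
        print("No cached copy yet; saving baseline.")

    CACHE_FILE.write_text(current, encoding="utf-8")

if __name__ == "__main__":
    check_for_changes()
```

In practice you would strip boilerplate HTML before diffing and schedule the check to run regularly, but the core idea is the same: cache, compare, and diff.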

It would be lovely if we had that kind of service for the public. But we do welcome interaction with the individual companies we've evaluated. Like I said, sometimes we'll refresh or reevaluate because they've let us know they have a new policy, and they're hoping it's better after looking at our evaluation and trying to do better. Or, as I mentioned at the beginning, they'll notice something in their privacy policy that is not actually what's occurring. And as Melissa mentioned with regard to policies versus contracts: when I worked at Verizon, we had three hundred lawyers, so I am quite aware that there are different teams, and different people are writing the contract versus the policy, or different people are writing different policies.

And then, as Melissa mentioned, sometimes you even see conflicts, where they're completely opposite. Your backstage information on that: it's because different lawyers are writing those different documents, and that's where it gets messy. I put in the chat a link for school districts to join our school district consortium, and while we don't have a similar group for the companies, we do welcome input from companies once they have seen their evaluation online. Thanks.

So, Jill, you're absolutely right. I've had conversations with the sales teams and engineers of companies because there was something kind of obscure in their privacy policy or terms of service, somewhere they say they retain the right to use the data for marketing. I'm like, that doesn't fall in line with what we agreed to, and we have a conversation about it, and the engineers are appalled and say, we would never. So this is what I liken it to: companies hire attorneys to help mitigate risk and limit liability.

We're all doing that, right? We're using attorneys for that. And sometimes these attorneys are really, really good at doing that, but aren't as good at talking to their engineers, or that communication isn't there, or at understanding the product, because sometimes it takes a technical understanding of the product to know what's really happening. So we as districts, and I don't want to appear like I'm vendor bashing by any means,

we need to give them some grace too, because they have these different channels and different areas of expertise working together to try to give us a product. And I know it seems like a heavy lift to do the contracts, but I don't spend that much time, and we have a lot of partners.

Most of the time I have my standard data governance addendum. I have a standard contract that says we will eliminate your terms of service and your privacy policy; you adopt my terms and we're good. Or I'll take your contract and I'll marry the two. I can spend anywhere from twenty minutes per vendor to maybe a couple of hours, but that's spread out over the course of a couple of months to get a contract in place.

Very seldom am I doing a lot of back and forth, and very seldom am I struggling with companies to really do what's right for districts. Most of them are okay with it, most of them will make minor changes, and we move on. It's not like ten years ago, when it was a hard lift. Eight years ago I had people saying, you want me to do what? I've had companies say, what is a contract? I didn't do business with them; they needed to grow up a little bit as a company before we could do business with them. But most of them want to do right.

Most of them are in it for education; they're not our enemies. There are some out there that aren't in it for the right reasons and aren't doing the right things. We need the process to review them.

And like I said, just eliminate those. Get those out of our vocabulary, get them out of our environment. We have to have that vetting process. But for the most part, they're partners with us in this mission, and they want to do right. And it's not a heavy lift anymore, not like it used to be eight years ago.

I'm not getting "nobody's ever asked us to sign something like this" anymore; that was a different conversation. I'm having fewer conversations educating our partners on what FERPA is and that they do collect FERPA records, because I've had a lot of them say, well, we're not collecting educational records. Yes, you are. You have educational records; you're housing them for us.

I'm spending less and less time doing that. And less time on the contracts. It's getting easier for us. We just need to continue making progress. Just continue doing this.

And even small districts are having success now, where before that just wasn't the case. So we're definitely making progress. I don't want to make it all sound scary. They're not all bad guys; there are a few bad actors out there that give the good guys a bad name.

And our theme is to not be frightened. Sara, yeah, it looked like you wanted to jump in. I think, just to add some nuance to what Melissa said and to answer a question that came in about getting those data protection addendums signed and how to navigate that, one thing I like to clarify is that vendors don't want to sign a standard addendum that says they will not do something they already don't do. For instance, if it says "we will protect all biometric data we collect" but they don't collect any biometric data, their lawyer probably doesn't want to sign that generic addendum about protecting biometric data, because it just doesn't jibe with what they do.

So when Melissa was talking about working to align their practices with your requirements, that's what they're looking to do: to make sure that whatever DPA or DPIA or whatever data protection agreement they're signing reflects their actual practices. It's their lawyers and their legal team wanting to make sure they're saying what they're doing so that they don't end up in lawsuits. That's the simple fact of it. It might take some additional back and forth, but it is a much different landscape than it was ten years ago. And I would agree.

And I will tell you, I know the easy button for districts is: you sign this, we're not going to allow you to negotiate, or we're not going to do business with you. You sign this. For us, I don't find it fair to ask the small startup that is trying to help my students master math facts to sign the exact same data governance addendum and meet the exact same standards as my system that houses my IEP information. That's totally different information. Now, are there a lot of things that are the same? Yes.

But my level of expectation for my IEP system is different than for a math facts app that's just tracking progress on mastery of multiplication. And we need to look at that, because we have to form partnerships somewhere along the line or we're not all going to be successful. So that's where I try to work with each company. I'm not going to ask somebody to sign off that they're keeping my biometric data safe if they're not keeping biometric data.

Just like I'm not going to ask them to keep my IEP information safe if they're not collecting IEP information. That contract looks different, if that makes sense. Yeah. Not dramatically.

It's just being open to redlining and aligning with what that practice is. So Melissa, one thing I'm curious about, and I've heard it asked both in the chat and just in district conversations: maybe I've got lots of free tools that a handful of teachers are using. I run my LearnPlatform report and I see, boy, I've got all these math tools and whatever. Presumably you don't have a piece of paper signed with those.

I assume, right? So what's your course of action? Do you go get a data privacy agreement with some of those, or do you just kind of tell the teachers; is it more around education and proactivity? What's your take there, your philosophy and approach? It depends. No, really. So we do this periodically. We watch our dashboard, and we look at what apps are being used.

And it really does depend. We've had apps where we're like, why are students spending this much time on a free app when we have a vetting process? Obviously these teachers are using it, and they thought or assumed it was okay because one teacher used it, so somebody else used it, right? You have the domino effect. But look at the amount of time students are spending on this math website that isn't aligned to my curriculum.

That is not a discussion or a decision that I make in isolation. I sit down with my curriculum team and we review it together. My director of teaching and learning and I sit down and review those applications together, the ones that aren't vetted, aren't approved, or have been denied but that people are still using, and we decide what to do. We went through this a couple of weeks ago. We blocked a few.

Because there were a significant number of students spending a significant amount of time in an application that wasn't directly aligned to our curriculum, and we're paying for proven, vetted resources that get the outcomes we need in math fact mastery, and they were using a different product. We then help plug those teachers into the resources they should be using in the classroom, and we provide coaching and support around that. We often look at data privacy as this isolated silo, right? But if we're aligning our practices to make sure we're vetting resources, we're accomplishing more than one goal.
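As an illustration of the kind of cross-referencing Melissa describes, the sketch below flags high-usage apps that are not on a district's approved list so a curriculum team could review them. This is a hypothetical example, not LearnPlatform's API or Raytown's actual process; the CSV columns, app names, and threshold are made up.

```python
# Illustrative sketch only: cross-reference edtech usage data against a
# district's approved-tools list to flag unvetted, high-usage apps for review.
# Field names, thresholds, and file formats are hypothetical.
import csv
from dataclasses import dataclass

@dataclass
class AppUsage:
    name: str
    student_hours: float   # total student time in the app this period
    student_count: int

def load_usage(path: str) -> list[AppUsage]:
    """Read a usage export with columns: name, student_hours, student_count."""
    with open(path, newline="", encoding="utf-8") as f:
        return [
            AppUsage(row["name"], float(row["student_hours"]), int(row["student_count"]))
            for row in csv.DictReader(f)
        ]

def flag_for_review(usage: list[AppUsage],
                    approved: set[str],
                    denied: set[str],
                    min_hours: float = 50.0) -> list[AppUsage]:
    """Return high-usage apps that are not approved or were explicitly denied."""
    return [
        app for app in usage
        if app.student_hours >= min_hours
        and (app.name in denied or app.name not in approved)
    ]

if __name__ == "__main__":
    usage = load_usage("monthly_usage.csv")          # hypothetical export
    approved = {"Vetted Math Tool", "District LMS"}  # from the district's vetting process
    denied = {"Unreviewed Free Math Site"}
    for app in flag_for_review(usage, approved, denied):
        print(f"Review with curriculum team: {app.name} "
              f"({app.student_hours:.0f} hrs, {app.student_count} students)")
```

A real report would join in curriculum alignment and contract status as well, but even this simple filter turns raw usage data into a review list for the team.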

We're doing more than just ensuring the safety, security, and privacy of our students. We're not only helping our cybersecurity efforts as well; we're aligning our curriculum and maintaining the fidelity of the resources we've invested in. It is impossible for me to determine from data whether a resource we're investing in is actually generating the outcomes we thought it would if we're using five other resources to help us get there as well.

So how are we determining the ROI, the return on investment, of our resources if we're not implementing those resources with fidelity? We can accomplish many goals. We can bring the entire team on board and make this effort one that has many outcomes and benefits for all involved, not just data privacy, if we do it properly. So I speak out of both sides of my mouth when we talk about data privacy: this vetting of resources and analyzing how resources are used isn't just for privacy.

It's also for fidelity of implementation, alignment of our curriculum, and ensuring that we're spending our money wisely. And we all have skin in the game when we approach it that way. I love it. Yep. It starts with vision and vertical alignment and a shared understanding of what it is you are trying to accomplish, and you're not leading with a tool.

You're really leading with the pedagogical approaches and understanding the tools that fit within that realm and ecosystem, and then rolling up your sleeves, doing the vetting and the privacy work, and continuing to support your teachers, engage them, and help them uncover the resources that are available. It's great. You know, we're pretty much out of time. If either of you wants to answer a couple of questions in the chat, feel free to jump in and type an answer live there. I'm going to really just kind of dismiss our group here. As far as next steps, I think this conversation will continue to evolve, of course.

I think there's lots we definitely could have touched upon. I would suggest reaching out to your colleagues and peers, and to folks on this panel. Panel, I appreciate your time and thank you very much for your partnerships. Melissa, working with you at the district level these past couple of years has been great; Common Sense, an amazing partner; and SIIA, Sara, we're just starting to get to know each other, me and you, but you've got a wealth of information. As far as next steps, you can go to our website and start your own free inventory dashboard.

And this isn't a conversation about running surveillance on your teachers and doing the "I gotcha." It's really about better understanding, like Melissa said: what are your resources? Are the ones you're paying for getting used? Are there others that you don't know about? Those represent not only data privacy implications but resource allocation implications. There are all sorts of questions that can be answered, and better asked, based on this type of data. We have all sorts of demos and so forth on our website; you're welcome to join those, or reach out to any of us.

And Common Sense Media has their evaluations free to the public. I would just say, feel free to drop anything else in the chat that you would like, and stay in touch. I'd ask my panelists, if you're open to it, to put any contact information you have in the chat; unfortunately, I guess we failed to put it here on the slide, but feel free. I would ask for departing words, folks. If there's anything you want to say, feel free, now is the time, but we're just so short on time.

I think there's so much packed into this conversation; it's hard to limit things. But I thank you all for your time. It's been a very insightful conversation. Feel free to unmute, say bye, say any

departing words, all that fun stuff, but without further ado, I'll get to dismissing people. Thank you. Thanks for coming. Thanks. Thank you. Alright.

In this webinar, Jill Bronfman, Privacy Counsel from Common Sense Media, Melissa Tebbenkamp, Chief Information Officer from Raytown Quality Schools, and Sara Kloek, Vice President, Education and Children’s Policy from SIIA discussed:

  • The various ways that states are regulating student data privacy and how K-12 local and state education agencies and others are responding.

  • How districts can build a culture of privacy within their organizations.

  • Third-party resources for evaluating solution providers’ privacy commitments.

  • The critical role that goals, timing and professional development play in managing change and avoiding disruption.
