
The Data Drop Panel: September 2021

Our host Heidi Saas takes a deeper dive into some of the most important, concerning, and downright fascinating data privacy and data protection items covered by the Data Drop News podcast in recent weeks.


Pro tip: get The Data Drop on your phone by subscribing to our podcast.

Heidi: I am Heidi Saas, a data privacy and technology attorney based out of the Washington, DC area. I'm also a member of the iOWN community at the Data Collaboration Alliance. Welcome to the Data Drop Panel. Each month, we gather some leading data and privacy professionals to hear about the news stories that stood out for them over the past month or so.


In the fast-paced world of data privacy, it's always interesting to hear what's raising the eyebrows and curling the fists of the practitioners. I should note that all the stories that we'll feature today have also appeared on episodes of the Data Drop News which delivers a four-minute news roundup every other week. Check it out.


So let's get started. This month on the Data Drop Panel, we have three great guests.


First up, we have Daniel Knapp, a data privacy consultant based in Atlanta, Georgia, and a principal at Red Clover Advisors. We also have Dan DeMers, the CEO and Co-founder of Cinchy and the President of the Data Collaboration Alliance. And last but not least, we have David Krueger, the Dallas-based Co-founder of Absio, a leading data protection company that's advancing distributed key cryptography.

 

President Biden signs Executive Order


Heidi: So first up is Daniel. I think you're going to be talking about President Biden and the 72 points in his executive order. Which one struck your fancy?


Dan K.: Well, we'll just go through all 72 real fast!


In all seriousness, I love the fact that this executive order is finally getting people to focus on the need for a federal privacy regulation in the United States. What I find less impressive is that, while the data privacy protection aspect of the executive order is getting a lot of attention, the reality is that its intent is to take aim at Big Tech, focusing on potential monopolies and anti-competitive behavior.


And I don't necessarily want to touch on that so much as the fact that that focus means that, in practice, the impact this will have on data privacy is really limited, in my view, because it's focused on Big Tech and the data-harvesting practices that the administration views as anti-competitive. Not that that's not important, but the reality is that, in my view, Big Tech is the portion of the sector that's best able to handle the thirty-some-odd individual state privacy regulations that are either being implemented or being discussed right now.


I think it's really the little guy that will ultimately benefit most from federal privacy regulations, because smaller companies are less equipped to shift how they respond to and handle privacy regulations from state to state. So I think it's good that the executive order is generating more talk about the need for a federal privacy regulation in the US and putting it front of mind for people.


But my concern is that people will look at this and say, "oh, now we have some kind of federal privacy rules in place." And really as it's designed, from my read, it's really only going to directly impact a small number of really big corporations. And that's the intent behind the whole executive order.


But I don't want it to get lost that there's still really a need to push for federal privacy regulations, because right now we just have a few key states implementing their own rules, more in the hopper, and a lot of small to mid-sized companies still don't really know exactly what to do about that.


And until we get something more expansive, an actual law pushed through Congress, the impact is going to be pretty limited.


Heidi: I hear you. I'm not sure that's going to get through Congress the way it is. The way I look at the executive order, and you can let me know if you agree with this or not, is that it says we'll make our best efforts to do these things.


So it's not a right, and it's not enforceable. It's not really going down the path that we need it to, but it is giving a little foreshadowing to the industry to say, we're not really cool with what you're doing anymore, and we know how you do it. I think that was kind of the unsaid statement behind those executive orders.


Does anybody else have an opinion on that? Yeah, David?


David: What I'm concerned the people in DC, and in state governments for that matter, seem to miss is that legislation does not affect data. It has no force and effect on data. Only software does that.


There's this idea that if we just get the words right in the legislation, that's somehow going to magically solve all our problems. I'm not saying that legislation is not needed, but I'm also saying there needs to be a clear recognition that if you're going to change the way we do things with data, then you're going to have to change the things that manufacture and manage data.


And that's software applications, which don't seem to be on the radar screen at all. I don't want some kind of prescriptive legislation, but there at least needs to be some accountability and some liability on the part of the people who provide software, so that they disclose exactly what can be done with the data they are producing or further processing and what controls they have on it. That's my two cents.


Heidi: I agree. Dan, did you want to get in on that?


Dan K.: David, wouldn't you say that's kind of part of the cycle? The way we're seeing things right now, it really is cyclical, right?


We have tech companies advancing at a rapid pace, with things like ad tech that have privacy implications, and therefore we have the different regulatory bodies, or in this case not so much a regulatory body as the administration, coming up with an executive order to respond to that.


And then, in some way, shape, or form, whether or not you're dubious about the motivation behind it, the tech companies in turn respond by innovating in such a way as to adhere to said regulation. So right now we're in that cycle. I don't know if it's a good thing or a bad thing, but it strikes me that's just the current state of affairs, and it seems like you're indicating there needs to be something more that happens. I'm just curious what your thought process is.


Heidi: Yeah. A private right of action would do it. That's exactly what it needs: a private right of action. If that right is enforceable and sets clear guidelines, then businesses will know what they need to do. And the ones that don't are going to have some trouble. That's how you get the big shift in the market, but nobody wants to talk about that.


So that was a quick answer right there. At this point, I want to ask Dan DeMers, did you want to get in on this, or do you want to wait until we move on to another topic?


Dan D.: Yeah, I just wanted to say that I very much agree with what David was saying. The only way that this is going to happen at scale is if the way that technology is created and managed fundamentally changes, and that requires enabling technology, but it also requires any such regulations to mandate the adoption of standards. It's for the same reason that if I'm manufacturing a car, I have to put seatbelts in the car. If I instead say that the driver of the car is responsible for the safety of the passengers, but the car doesn't need seatbelts, doesn't need crash testing, doesn't need any of these things, it's going to be very difficult to ensure the safety of my passengers even if I'm ultimately accountable for that. So we need to make sure that the cars have seatbelts and that they're crash tested.


And so the technology that gets created needs to be designed in a way that makes compliance at scale possible and respects privacy. I think that's where we're going to need to see a shift for this to make a real dent and have an impact.


Heidi: I agree.

 

Zero-click hacks threaten mobile devices


Heidi: Up next again is David. David, I think we're going to talk about zero-click hacks. Do you want to take that one first?


David: You know, we get this story that zero-click hacks threaten mobile devices. Basically, this is a piece of malware that can get loaded onto the phone, and you don't have to click a link. You don't have to do anything like that. Merely opening whatever is carrying this zero-click malware is enough to get it installed on your phone. The thing that strikes me about this is the further failure to understand authentication and how to properly implement it.


So you've got a piece of malware that's been loaded onto your device, and you've got remote execution attached to it. That's part of the story. Here's the problem with that: we know how computers work, and we know how authentication works. If we're going to authenticate something properly, we always look at the assigned values that we give something. We give it a name, a serial number, some kind of assigned semantic or numerical value. And then we also look at the physical characteristics of the thing that we're trying to authenticate.


What you have here is a provider that doesn't even bother to ask a remotely executing piece of software to verify who it is, where it's from, or what device it's reporting back to. It's an utter failure to understand that if you're going to reliably authenticate anything, you've got to do the user, you've got to do the hardware, and you've got to do the software. You have to have all three for authentication to work. This is an especially egregious case where we didn't ask for anything, but even in most cases, the only authentication we ask for is user authentication. I'm probably doing a bad job of framing the problem, but imagine this: you're protecting the money in a bank, and you'll let anybody through the vault door as long as they slip their driver's license under it, and we know those are easily faked. That's the only thing we ask for, and then we open up the vault door and let them in. Is it any wonder we have all of these problems?


This is again just showcasing the continued engineering stupidity in the way we do authentication. It's just maddening.


Heidi: I hear you on that. You know, I would also add that they probably do know how, but it is not their obligation to do so right now. Think back to when car stereos were being stolen: legislation came along to demand that a kill switch be added.


And then they stopped stealing stereos overnight, because it was regulated. The manufacturers knew how all along, but it wasn't to their benefit until regulation made them do it. As soon as they did that, people stopped stealing radios. So that's not a problem we have right now. Right now you're talking about phone jacking.


So what do we do to have a kill switch, to stop phone jacking? It's authentication and it's gotta be done the right way. So does anybody else want to get in on that? Did you have something else you wanted to add?


David: The only thing I'd add, and this goes back to Dan DeMers talking about cars: my background is in process safety, so I'm in agreement with you. If you know the way that something is going to become hazardous, that bad things are going to happen, you have full knowledge of that, and you take no action to stop it, then in any other industry but ours we have a legal term for that, and you're familiar with it. We call it negligence.


But there's no liability that attaches for letting something as stupidly egregious as zero-click malware get onto a device. Until some kind of negligence attaches to that, the cell phone provider has no reason to do anything about it, except offer a vague apology.


And that's a situation that just can't persist. My two cents.

 

Time to kill standard privacy notices?


Heidi: Dan DeMers, do you want to add on this one or did you want to start your next story? I think killing standard privacy notices was one of your topics. I'm excited to hear what you have to say about that.


Dan D.: Well, it's not a new story by any stretch. I'm sure we've all accepted thousands of such privacy agreements, and I'm sure we've all diligently read all of them. But one of the things that I've been fascinated with is Apple's move to App Tracking Transparency and what that means for the privacy agreements of the future, because it now makes digestible and consumable what I'm actually accepting by using a particular application or whatever it is I'm doing. And I wanted to open it up to the group and ask what you all think of the emergence of tracking transparency and its longer-term consequences.


What is the impact of that on kind of the standard privacy agreement when you're accepting software or playing a video game or doing that? Is that going to eliminate that as a concept? My hope is yes, but I'm curious what you all think.


Dan K.: It's hard for me to imagine a world without privacy agreements at all, just because they're so much a part of our everyday lives. But I could definitely foresee further standardization, since you have Apple, and potentially Google following suit not long after, pretty much forcing standardization in terms of how tracking actually works within apps. So maybe we end up in a situation where we can kind of templatize things.


And what would be great about that is, yes, you were kind of joking that people actually read every privacy notice, and we all know they don't. But if you basically only have to read one, once, maybe at some point you actually will. Not only would it enhance the privacy and security of the apps, it could also enhance awareness of what's actually going on.


Because at some point someone does read the template, and maybe it's only updated once a year and covers all the apps on their entire phone. So in addition to actually helping with privacy and security to begin with, I think if it raises awareness, that's a big part of what we need in general, and I'm optimistic about that.


Heidi: Yeah. If you pay people to read them and give informed consent, and I'm serious about this, with a tokenized kind of discount for your purchases online or whatever. If you pay people to read it and give you the informed consent that you need, you've got a better likelihood of raising their base knowledge.


Otherwise, they're not incentivized to read that thing. I'm not incentivized to read those. I think it's the digital answer to something no one grows out of past college: people will do just about anything for a free pizza. This is the digital free pizza.


Dan D.: I think part of the problem is that they tend to be phrased from the perspective of the person trying to protect themselves from liability rather than the person accepting them, so they're not expressed in terms of what the actual impact is. To me, that's the reframing that now makes them digestible and consumable, so you can actually understand what you're accepting. Many of you have probably heard the story.


This goes back quite a while, to a retailer in the UK that, as part of an April Fools' joke, put into its terms that you were basically signing over the rights to your eternal soul. And thousands of people happily accepted this privacy agreement and signed over the rights to their eternal soul.


And if anyone opted out, the retailer would actually reward them. But the fact that most people accepted just reinforces what we already knew: people don't read these agreements, and even if they did, they probably wouldn't understand them. I've tried myself just a couple of times. But I think reframing them in terms of the impact to you, along with the templatization, is a really big shift.

 

School posts on Facebook could threaten student privacy


Heidi: So I want to move on now to our next topic. I think we're back to Daniel, and I think Facebook was your next topic. Always something to talk about over there.


Dan K.: Yeah. I'm trying to recall which publication uncovered this, but essentially schools and school systems post information about and photos of students on their Facebook accounts all the time.


And what they realized was that even if parents and students have their own Facebook accounts and have set all the privacy settings to be as private as possible, if schools post pictures and information about their students, in some cases you don't even have to be logged in to Facebook at all to access that information.


It's alarming, but to me, not necessarily surprising. As a parent myself, whenever I sign up my kids for anything, be it registering for a new school year, a camp program, or any extracurricular activity, there's almost always a form where you sign off approval for their photos to appear somewhere.


So I don't think that much is surprising. What's a little disappointing is that you have school systems that don't necessarily understand what they're doing when they're posting kids' pictures. And part of that is on them, right? You have people running social media accounts that aren't necessarily tech savvy enough to realize what protections they ought to put in place.


And I think part of it's also on Facebook and the other social media companies, which realize that they have all of these accounts set up by school systems and other groups that involve children, yet they're not proactively doing the education for them: these are the settings that you ought to have in place in order to protect your students' privacy.


So I see two sides to it. School systems don't necessarily have the resources to have trained tech professionals run their various social media accounts, so there's going to be a gap there. But Facebook is a big enough company with enough smart people that they ought to be aware of that too.


And so, I think part of it's on them. There definitely should be training provided, or at least some kind of educational template, so that when a school group is creating a Facebook page, they can look at it and realize, okay, this is how we set things up to protect our students' and families' best interests.


Heidi: I think that's a great idea. Something that I've noticed in the last year, and you guys can add in if you have experienced this too: in asking for consent, now that the rules have changed for children's online privacy, they've split the consent into two parts. One is, can we show your kids' artwork and such. The other is, can we show their likeness and post to social media and those sorts of things. So they've separated it out, and at least in my case, I still had the opportunity for my kid to have somewhat of a normal life without the digital life of being at school. It's nice to have that option, but I don't see it in a lot of places. It's part of the reason why I'm moving.


But yeah, I think parents need to understand what the options are and not just keep checking the "I consent" boxes, because those matter. I don't know about you guys: do you still have children in the school system? How do you feel about this?


David: Well, I have grandkids in the school system. But you know, my very cynical nature wants to stop before I answer any detailed questions and ask people, what do you think Facebook is designed to do?


What's its function?


And if you think that it's designed to allow people to keep up with their friends and family and post funny cat videos and things like that, you don't understand that its function is to produce exactly this kind of content and make it available to be scraped, either by Facebook or by third parties who pay for the privilege.


Because that data has economic value, and the analysis that you can do on it has economic value. That is the function of Facebook. Posting these pictures and things like that, and making consents either non-existent or deliberately dense and hard to understand, isn't accidental. It isn't an oversight.


So when I look at reactions to a story like this and people say, oh, why did they do that? Well, they can add some additional controls. And apologies to Dan Knapp, because you have a good positive attitude. But when I see these stories, I look at them and say, oh, Facebook is doing exactly what Facebook is designed to do.


And they got caught. So they're going to put a bandaid on it, put some additional consents in there, and we have no capability at all to determine whether those consents are actually being enforced. None at all. It's kind of like, what did you expect would happen? What do you think Facebook's mission is?

 

How Apple plans to root out images of child sexual abuse


Heidi: I agree with you on Facebook. Let's talk about a company that is not founded on poaching your data to make money off of you behind your back: Apple. Let's talk about Apple. They make stuff, and the stuff they make is useful. How do we feel about what they're doing now, going on the hunt for child abuse images? Mr. Krueger, you had that one as one of your topics.


David: You remember when the Snowden revelations came out and people were all abuzz, and rightfully so; I was one of them. We were horrified at the level of surveillance capability the NSA had, both the capability itself and the exercise of it, monitoring phone call traffic and emails and text messages and so forth. We were horrified because they simply had the ability to do that. Now, the horror and consternation at them being able to do that is a separate thing from how they use the capability, right? Because it's entirely possible that the NSA could use that ability to do good things, like catch terrorists, but they could also use it to do bad things. It's a very potent capability.


So my question to the rest of you is: how is Apple's capability to scan everybody's photos on all the iPhones different from the NSA's capability?


Heidi: Go ahead, Mr. Knapp, do you want to get after this one? I see you.


Dan K.: I think it raises the exact same type of concerns, and it has the exact same flavor of potential to do good at the same time. Personally, I think it's disconcerting that after all these years Apple just kind of nonchalantly threw out there, well, we're going to do this great thing, so we're going to switch on this capability we've had all along. And you're seeing the same debate play out, right? They're trying to address a very real problem, but it also raises very real concerns that a device everyone keeps in their pocket or purse, one they have come to believe is their own, really isn't. I think on some level we've always known this, but this just confirms it. And there have been enough articles and questions put out there in the last couple of weeks asking: even if the intent is right, even if the majority of people feel the intent is okay, what if it doesn't work right?


So there's that other concern, right, about false positives. Apple hasn't put much out there other than the very basics of how it's going to work. And if they put more out there, could that have the unintended effect of alerting people who are trying to do the wrong thing, or alerting competitors to the specifics of what they're doing?


I don't know, but it's rough, because there's this ambiguity about how they're doing something for a good cause, and that is very alarming to people, because there's really no clear way to tell how much our privacy is going to be impacted and how frequently. It sounds like the whole Snowden thing all over again, just more personal, because everyone has a phone.


Heidi: Well, whenever I hear about Apple or Google doing something big about privacy, it sounds to me, number one, like cover for anti-competitive behavior, but that's my nature. And number two, I want to look at it from a different perspective, because yes, they want to start looking for child sexual abuse images, but there's already a training set for it.


The national database has a training set of this data to recognize, so they've trained the AI on this awful data to go and find it. Once they find more than 30 pictures, so you get 30 free pics before they light you up, they go to the regulators and say, hey, you've got some bad stuff in this cloud.


It does bring up privacy concerns, but so do other things in our lives. And quite frankly, there are a lot of people out there doing freaky things, and that's fine, but when you're doing freaky things that harm other people, or planning an insurrection or something, somebody needs to be looking through that just to find out what is going on out there. It's when they take action on it, if it's a government and they're taking action on it, that's where people's rights are impacted, as opposed to using it for information purposes only. Yeah, as far as I know, the NSA is listening to everything we say every day, because we learned that years ago.


And they didn't say they'd stop. They didn't. They totally didn't say they'd stop. They just kept going. They're just like, that's too bad, that's how we do it. So, Dan DeMers, did you want to get in on this one, or do you want to move on to the last topic?


Dan D.: For this one, I don't know what to think about it because I think this is just the same age-old dilemma of new technology that can be weaponized for good or weaponized for bad.


It's like the dawn of any new innovation: it creates opportunities in both directions. And, I don't know, maybe the world needs a framework for evaluating any new technological innovation to ensure that it is used for good, but beyond that, I don't know what to think about it.


It's tricky because, of course, I want to prevent this, but at the same time, I understand that the capability could be used for purposes that are not things I would want.


Heidi: Yeah, absolutely. Yeah. The government stepped in and stopped the acquisition of Grindr because they were worried about members of Congress and their privacy concerns.


Whether the members of Congress were concerned about it or not. So, I mean, yeah, the government's definitely thinking about these things. And they do have capabilities and sometimes we don't know what those capabilities are. Sometimes that's good. Sometimes it's not. But they are more or less transparent when you go into FinCEN and those sorts of regulatory entities that are watching these kinds of financial crimes.


They put out suspicious activity reports; they give you reports to let you know what kind of activity they're seeing. So they're getting that data from somewhere. Do you really want to know where at this point? We have so many different digital worlds that depend on truth and transparency in most corners.

 

Meet the Data Snails


Heidi: I think the last topic that we had here before we wrap up is the data snails.


Dan D.: Yeah. There was a report that was published that talked about how few businesses are actually compliant with things like GDPR and CCPA.


And for me it was interesting, but at the same time not surprising, knowing how difficult it is for businesses to know which regulations actually apply to them, as well as their inability to systematically comply with them. Again, coming back to the car analogy: make sure that your passengers are safe, but the car may not have seat belts.


It's a little bit like that. And I just found that, again, both interesting and at the same time not surprising. I don't know what can ultimately happen with that other than governments shifting their focus towards ensuring that the cars have seatbelts, and that the way technology is engineered and constructed has privacy built in by design such that it's non-negotiable. If someone gets in the car and doesn't put the seatbelt on, well, they've now accepted that risk. And I think that's the way the world needs to shift, versus telling people to make sure that their passengers are safe.


I don't know if that makes sense.


Heidi: Yeah, do you guys have anything you want to say on that? From what I see, the GDPR's perspective is individual rights: these people have rights, and here's how you treat their data. On the American side, it's "you businesses have obligations, and this is what you should do with people's data to meet those obligations."


It doesn't really create a personal right of privacy; it's more business regulation. So the approaches are different, they have competing interests, and they're just not working out, certainly not for transferred data. But surveillance is the biggest problem that they're having.


So I really think we're going to see a progression of activist groups over there, like None of Your Business, that aren't going to stop. They're not going to stop, and this is their biggest issue: Executive Order 12333 and FISA Section 702. They're not going to stop until we can say we don't surveil people in these ways, because it opens up surveillance of their citizens, and they have rights. So that's where the standoff is.


David: There's a key distinction that you need to make, though, and this goes back to Dan DeMers's comments here. We have a technological problem; legislation has a role, but a technological problem requires a technological solution.


That's just being logical. When you're dealing solely in the realm of rules and regulations and laws, you're dealing in the world of "won't": we're telling you that you need to do this, and whoever you're telling that to, whatever business you're regulating, says, "okay, we agree, we won't do that." "Won't" is entirely different from "can't", and what we need is "can't". That's a technological thing: you can't abuse this data in this way, you can't surveil in this way. And you don't get to "can't" unless you have a technological fix. As we've already seen from the data snail story, if all people have to do is say, well, I won't do that, then you're kind of limited in your outcomes. When my kids were little, they told me a lot of times that they wouldn't do something. It didn't always work out.


If we're talking about real solutions and stuff like that, we have to move from this world of "won't" to this world of "can't".


And that is a technological solution every time. Promising not to get hurt is not the same thing as putting a seatbelt on.


Heidi: Daniel, you wanted to get in on this one?


Dan K.: I just wanted to circle back, because this almost brings things back to the beginning, when I was referring to the executive order really only focusing on Big Tech, right? A lot of the snails, if you will, don't know better and don't know how to do better, but in many cases they also see that the various regulations in place are focused on catching the big fish.


And if you're a smaller company, you don't necessarily have the resources to invest in a privacy program, and when you see that everyone's focused on the FAANG companies and the other big players, you say, why should I do this? That's not necessarily the law's intent, but in practice we see time and time again that the focus is on what Amazon is being fined, what Google was fined.


So I think we really need to see the regulations in place fully applied to smaller and midsize companies as well. When that happens, the snails will catch up, because they'll realize, oh no, this applies to us too. So there's that side of it to consider also.


Heidi: Yeah, those are the businesses and schools that are getting hit with ransomware too.


Because they think they're so small they don't really need to deal with it, and yet they have huge cybersecurity concerns. So yeah, that's likely going to be a problem we need to solve moving forward.

 

The Data Drop is a production of the Data Collaboration Alliance, a nonprofit advancing meaningful data ownership and inclusive innovation through open research and free skills training. To learn more about our partnerships, the Information Ownership Network, or the Data Collaboration University, please visit datacollaboration.org.
