
The Data Drop Panel for March 2022

Host Debbie Reynolds and special guests take a deep dive into the noteworthy, concerning, and downright fascinating stories featured in recent episodes of the Data Drop News podcast.


Pro tip: you can listen to The Data Drop Panel on your phone by subscribing to our podcast.

About The Data Drop


The Data Drop podcast is a production of the Data Collaboration Alliance, a nonprofit dedicated to advancing meaningful data ownership and global Collaborative Intelligence.


Join Node Zero


Node Zero is a data-centric community where professionals, nonprofits, and researchers join forces to collaborate on datasets, dashboards, and open tools in support of important causes. Learn more.

 

Full Transcript


Debbie Reynolds: Hello, my name is Debbie Reynolds. I'm a data privacy strategist and expert from Chicago, Illinois, and a member of the Node Zero Community at the Data Collaboration Alliance. So welcome to the Data Drop Panel, where every month we discuss some of the leading data privacy and data news around the world and talk about things that we think the audience will care about.


So we're ripped from the headlines, but going a bit deeper on these things. You know, in this type of world, where we have these fast-paced, eyebrow-raising moves around data and privacy, it really makes sense to do these types of panels, and I'm really happy to have the special guests here. All the stories that we feature today will also be included in our sister Data Drop News podcast, which delivers a four-minute data privacy news roundup every week.


So, without further ado, I want to introduce our guests today for the Data Drop Panel. We have Jeff Jockisch, who's a data researcher and principal at Privacy Plan, a data privacy consultancy and dataset provider, from Florida. Welcome. We have Sameer Ahirrao. He's the founder and CEO of Ardent Privacy in Washington, DC. And David Kruger, who's the VP of Strategy, co-founder, and co-inventor at Absio Corporation, the creator of software-defined distributed key cryptography, from Texas. Hello. Well, before we get started, Jeff Jockisch, my buddy, has a community announcement.


Jeff Jockisch: Just wanted to make sure the community of privacy pros out there knows that we have a new privacy gateway, which is a tool for privacy pros to be able to access a lot of new data sets that the Data Collaboration Alliance has available.

So it's in beta now. If you want to come check it out, there will be a link at the bottom of this. It has a lot of great datasets available right at your fingertips, so please check it out and give us feedback.


Debbie Reynolds: Thank you, Jeff. Really appreciate that update. I highly recommend people jump in on these datasets. They're tremendously helpful, and it's really the best free resource I've ever seen. So, let's start with our first news item. David, let's start with your story: Google pushes back against changes to Australian privacy law.


David Kruger: Yeah, I just thought this was interesting, especially in light of, you know, Facebook's stock plummet. When I read this article, there's almost a little bit of a panicky undertone here.


There's this line quoted in the article: Australian consumers benefit from being able to access free digital services funded by personalized advertising. And what I continue to see here is this tight coupling of "free" to targeted advertising, right? You know, you get this free service and this is how we pay for it; a "shut up and love it" kind of attitude here.


But I think one of the things that's interesting in our space is this continued move to decouple the advertising model, you know, in a specific way. Back in 2002, when Google discovered that they were collecting this personal information, they were primarily using it for user experience improvement; then they found out they could repurpose that data and begin to target ads with it. But now you have this host of startups using privacy-enhancing technology


that allow you to do that advertising without collecting that personal information. Right? And the technology has advanced so much that setting up a separate facility just to crawl the web, catalog data, and produce search results is now a tiny fraction of what it used to cost.


Right. So it's just interesting to think about what happens to Google, and what happens to this whole model, when the cost of delivering search results, which used to be very expensive, is now incredibly cheap, and you don't have to collect users' data to be able to deliver targeted advertising.


So, you know, to me, it shows up as this continuing assumption that this model is going to remain unchanged, and I just don't think it is. And again, they're going to push back against the collecting of personal information in the exchange: well, this is free. Well, if you don't have to have their personal information, and delivering search results is cheap,

how does that change things in the long term? That's an interesting question to me.


Jeff Jockisch: There's a lot of wisdom in what you're talking about, David. There are so many different ways that I think you can provide advertising where you don't necessarily have to go down the path of the pure surveillance capitalism that we're currently using right now.

I actually heard a really interesting proposal from, and I forget the guy's name, but it was a gentleman from the EFF on a podcast a couple of months back, where he was proposing a model that he labeled "forgetful advertising." The concept was to essentially use most of the data points that Google already collects,


but once the ad is actually served, to then essentially forget all of that information and move along. That's an interesting proposition. So if you could actually have the data, present the ad, and then forget all of it, where would that lead us? Could that be a model that could actually work?
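As a thought experiment, the "forgetful advertising" idea Jeff describes can be sketched in a few lines of Python. This is purely a conceptual sketch under stated assumptions: the function name, the keyword-overlap scoring, and the ad inventory are all made up for illustration and are not any real ad system's API.

```python
def serve_forgetful_ad(session_signals, ad_inventory):
    """Pick the best-matching ad using in-memory session signals,
    then discard those signals instead of persisting a profile."""
    # Hypothetical scoring: count keyword overlap with the transient signals.
    def score(ad):
        return len(set(ad["keywords"]) & set(session_signals))

    best_ad = max(ad_inventory, key=score)

    # The key step of "forgetful advertising": once the ad is chosen,
    # the targeting data is erased rather than written to storage.
    session_signals.clear()

    return best_ad


ads = [
    {"id": "a1", "keywords": ["hiking", "boots"]},
    {"id": "a2", "keywords": ["privacy", "vpn"]},
]
signals = ["privacy", "podcast"]
chosen = serve_forgetful_ad(signals, ads)
# After serving, `signals` is empty: no profile survives the ad request.
```

The point of the sketch is that targeting and retention are separable steps; the ad can still be relevant even though nothing about the user outlives the request.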


David Kruger: Yeah. I mean, if you think about it, you don't need to collect data with modern technology to deliver the advertising. You also don't need to collect data on people to deliver search results. You do need data to distort search results, to tailor them in such a way that you can increase your ad revenue, but you don't need it just to deliver the results.


So this further uncoupling of the free service from the need to acquire all this private information is just going to continue.


Sameer Ahirrao: One comment there, right? We've got to be a little careful about the early-stage companies, the companies that are just coming into existence. Because if you understand big tech, Google, Facebook, Apple, they've already got 15 or 20 years' worth of data.


So they may not need more personal data for research, right? So even if you stop the collection of data today, and I'm not saying whether we should or shouldn't be collecting more data, they still have more advantages compared to everyone else.


David Kruger: I kind of disagree with you in one way, and that's because the value of the data they hold is the value of their company.


It's not in their intellectual property. It's not in their hardware. It's not in their people; it's in the data. Take Google, for example: 85% of its revenue comes from advertising. But the way that advertising system works, it has to be constantly refreshed with new data, because they look at the old data, compare it to new data, and do continuous, individualized trend analysis to know what ad to deliver.


Well, that's entirely dependent on having a constant influx of new data. The value of the data they hold drops very fast if it can't be continuously updated. So again, the value of what they hold is predicated entirely on the data they continue to bring in, because the two must work together to deliver targeted advertising the way that they do it.


Sameer Ahirrao: Yeah, no, I don't disagree. What I was saying was that the smaller companies should not be disadvantaged, because the regulation might end up favoring big tech: nobody else has the data they have. That's the only point I wanted to make. I completely agree.


Debbie Reynolds: Very cool. So let's go to the next topic.

Jeff: Ontario pledges to become the first province to protect workers from digital spying by bosses.


David Kruger: Yeah.


Jeff Jockisch: So let's talk a little bit about workplace surveillance. This is a big topic, because with the move to the remote workplace because of COVID, our bosses are spying on us a whole lot more, at least in most jobs.


And that's really changed workplace surveillance a whole lot. It didn't used to necessarily happen for every job. Maybe some jobs; certainly in lower-level jobs, bosses have been spying on us for a long time. But it's really moved that sort of issue to the forefront of a lot of privacy pros' minds.


And, you know, a lot of people think that they have a lot of privacy rights, but those rights change a lot in an employment situation. A lot of those rights go out the window when you sign an employment contract, at least in the United States. It's a little bit different in Europe.


But this law, essentially, it's not saying that you necessarily get all those rights back. Ontario is essentially saying that your bosses are going to have to tell you if they're spying on you, which is a really interesting way to go about this. It's not saying they can't spy, but they have to tell you if they're going to spy.


So, I think it's a really good move. I think it's a valuable move. And I'm very much in favor of this.


Debbie Reynolds: What do you think, Sameer?


Sameer Ahirrao: Oh, I mean, absolutely right. There are both sides of the story, but in terms of people being worried, right, there's more and more remote work coming up. But I think there has to be a strict balance there: what you can do, what you cannot do.


But it also brings a kind of challenge on the employer's side. Some employers need some kind of information, right, that people are not just checked out, and things of that nature. So I think it's a strict balance between the employee and the employer and their contract. But of course, continuous, extra, additional surveillance is absolutely not good.


So we've got to be careful with that. But at the same time, we need to understand the employer's perspective too: productivity should not be impacted, and neither should the employees' rights. So we need to look at both sides of it.


Debbie Reynolds: What do you think, David?


David Kruger: Well, you know, apart from Ontario's ruling, which I think is smart, spying on your employees is a dumb idea. A few years ago,


oh gosh, eight or ten years ago, there was this proliferation of companies that allowed you to contract, through them, with offshore developers primarily. Right? You could go sign your developers and assemble a team yourself. And a lot of those companies offered what was then a new technology that basically allowed you to spy on your offshore developers.


You could, you know, watch them type; they'd have to have a webcam, so you could see that they were in front of the keyboard. You'd get metered reports telling you, you know, how long people were spending each hour entering code or working on code, that type of thing. And that got a lot of uptake.


And then it went away very quickly, because they found out that people resent being spied on; they're uncomfortable with it. Productivity went down. And sort of the lesson learned from that was that spying on your employees is a poor substitute for hiring trustworthy people. Yeah.


Debbie Reynolds: And it's also a poor substitute for managing. It's like: be a better manager.


David Kruger: "Why do you think this will work?" is the question I would ask these guys. What does it say to your people about how you perceive them? And why do you think it's a good thing that you're declaring upfront that they're not trustworthy? You think that's going to be helpful?


Debbie Reynolds: Great point.


Well, Sameer, it's your turn. We recently had President Joe Biden deliver a State of the Union address, and you also wanted to talk about the DHS privacy chief, who aims to promote privacy-enhancing techniques.


Sameer Ahirrao: So I think, first, the good news: there is some traction in the US federal space for privacy, right? The State of the Union is always sought after in the larger community, so I think the mention of privacy as a topic, starting with children, was definitely outstanding. But there was an interesting statistic they put together about advertising firms.


Right. And tying back to what David was just talking about: they have, on average, 72 million data points on a child by the age of 13. I have a 14-year-old and a 12-year-old, so I can understand, right? They use the web and all this ed tech, whatever they have to use, and that's dangerous, right? So the advertisers collect data.


You know, the modern internet is all about data and how responsibly you use it. So I think it's very good news that there is attention there. I think several states are taking good action as well, including here in Maryland with educational privacy.


The other item, about the DHS privacy chief, is absolutely great too. She actually talked about putting privacy into technical design as a fundamental aspect, just like we do security by design. She will basically promote privacy-enhancing technical design in the products being procured at DHS, or at all the agencies which come under that umbrella


with DHS. So that is also good news. So I think, overall, what is happening at the federal level, or even on the commercial side, is that more and more incidents are going to happen around privacy, and we're going to see the bad impacts of them as a society. And that's going to drive these regulations. And just like Apple did, right,


that $300 billion hit to all the advertising and social media companies. So I think it's just natural: not just regulation, but all this bad stuff happening will actually improve the whole privacy value prop for all these big tech and consumer-focused companies automatically. And that's the good news. The attention from the government, especially in the State of the Union, is fantastic.


Debbie Reynolds: I also want to just throw out there, for anyone who doesn't know, that DHS is the Department of Homeland Security. I'm an anti-acronym person.


So, yeah, I actually went through all the stuff that Biden talked about in the State of the Union address, and I also noticed that statistic, Sameer, that you talked about: that by the age of 13, companies have up to 72 million data points on kids. One thing I will say:


the US gets a bad rap about its privacy regulation, but we have some of the strongest children's privacy regulations, right, for kids under the age of 13. Obviously, we know that companies don't abide by that the way they should. So in a way, I'm glad that we're strengthening those privacy protections for children.


But, you know, once you get to be, you know, 13, 15, 16, 18, your privacy protections drop precipitously from there. Interesting story. So let's go to the next topic. Let's see what we have. So, David: Google to change privacy policy on Android under UK competition oversight.


David Kruger: Yeah, it's just, again, interesting to me because of some of the language in there.


So it says that 90% of the apps available on the Google Play store are free, thanks to digital advertising. So this is really a continuation of that last story. However, the announcement says that for the app ecosystem to stay healthy, the industry must continue to evolve how digital advertising works to improve user


privacy. And it talks about how the new setting will limit the sharing of personal data with third parties, and that's the key there, "third parties," and work without cross-party identifiers. So the interesting thing about this story, to me, is the way that they conceal what is actually going on. You know, Google's advertising system is a walled garden, right?


You have to work with Google in order to be able to access the benefits of it, and you've got to pay them a cut, you know, every time somebody clicks on an ad. Right. So what's interesting to me is that they talk about this in terms of privacy. But nowhere in any of the announcements that Google's made, you know, they've abandoned FLoC,


and now they're trying to develop this new privacy-respecting ID, nowhere does it say that they're going to slow down on the collection of data for their own needs. Right. So I'm always intrigued how you can get away with saying that this is about user privacy when it doesn't have anything to do with user privacy; it has everything to do with further restricting people, you know, forcing them to use Google's walled-garden advertising ecosystem.


So I'm always a little bit intrigued how you can cast that as privacy, when your data collection isn't slowing down at all.


Debbie Reynolds: I'll jump in on this one. Yeah, so basically there are proposals around the world around third-party data sharing, and what these regulators are trying to do is stop that sort of exchange. But what happens is that Google will benefit: they can stop these third parties or whatever, but Google is a first-party data holder.


So most people use Google. You know, it's the most widely used search engine. A lot of people use it on their phone, a lot of people take their phone everywhere they go, and a lot of people have, you know, not just the phone but other devices that collect this information. And all of this makes Google a first party to that data.


So really, they don't have to stop, you know; they have this direct relationship. So in a way, they win all the way around, I would say. What are your thoughts, Jeff?


Jeff Jockisch: Well, you know, I think that the Privacy Sandbox is a hell of a lot better solution than FLoC was. So Google has definitely improved on what they were proposing before. But David's right


that they're not really doing much to regulate themselves in terms of privacy and what they're doing with data. Will their privacy practices actually help consumer privacy? Yeah, I think it's going to take more personal information off the market. There'll be fewer data brokers out there with profiles on us,


though they may still have profiles with poor-quality data that just isn't getting updated, so maybe that actually hurts us in some ways. But there'll be less data flowing to those brokers, right, and to other advertising platforms. So that's good in that sense. But aggregating it all to a couple of players like Apple and Google, that's got its own set of problems.


Debbie Reynolds: I agree. I agree. So, Jeff, you chose: Bloomberg loses appeal in landmark UK privacy case. This should be interesting.


Jeff Jockisch: Yeah. This is where, you know, different rights sort of butt up against each other. So we have a reasonable expectation of privacy; that's sort of a baseline for privacy, at least in the United States.


And that sort of started back with a court decision called Katz v. United States, back in 1967. That was really where the reasonable expectation of privacy test came about. It essentially said that Fourth Amendment searches occur when the government violates a subjective expectation of privacy that society recognizes as reasonable.


And it said that the Fourth Amendment protects people, not places. Now, what this decision is really saying, and this was a European decision, right, it's from the UK, is different: Britain's Supreme Court dismissed an appeal by Bloomberg, holding that a person who is under criminal investigation has


a reasonable expectation of privacy until they're charged. So essentially, this was a guy who wasn't famous, and he was under investigation for a crime, and Bloomberg reported on it, essentially saying, you know, hey, this guy's being investigated. But it turned out that he wasn't charged. Right? And so he sort of became known


for something he didn't do. And so the question is: should the press report on people who aren't famous, right, just because they're under suspicion of something? And the court said, no, you can't do that. So the problem is this, you know: the rights of the press to be able to report things, you know, in the public interest.


Those also carry, you know, certain rights. In the United States, there's a really important case called Cox Broadcasting v. Cohn that gets quoted a lot, where the press has First Amendment rights to be able to release names. That particular case was about the right to release the names of rape victims, which is pretty egregious.


Right. But it sort of established the rights of the press to be able to release, you know, names of non-famous people. And so how do we, you know, weigh the right of privacy against the right of the press to be able to release information? That's what this decision gets to, and it's a tough thing to balance.


Debbie Reynolds: What do you think, Sameer?


Sameer Ahirrao: So this gets into more of a legal area, so I'll take Jeff's word on that. But again, I think balance is the key here, right? We have our own right to privacy, and at the same time, what can you do? So I'll pass on talking too much about it; I'm on the engineering side of things.


Debbie Reynolds: So, yeah. David, do you want to jump in?


David Kruger: Yeah. I mean, this is interesting. I think Sameer's right: it's a balance thing, and you have to depend on people being reasonable. Along with Katz, you've got Sullivan, right, which basically gives the media sort of a free-for-all: we can say whatever we want, no matter what damage it causes, and you can't come after us because we're the media.

There are limits that have to be applied, but unfortunately you can't make a law that's one-size-fits-all. People need to be reasonable: reveal that which is reasonable to reveal, and hold close that which is unreasonable to reveal. And therein lies the rub: laws aren't going to solve this situation.


They may make it somewhat better, but you're still going to have to rely on people doing the right thing, and that's, unfortunately, always an iffy proposition.


Debbie Reynolds: Yeah. It looks like Bloomberg was trying to set a precedent here, and it was not a good one, as the court said, right? But most people should know, or hopefully Bloomberg should have known, that in the UK they're more strict around stuff like this, reporting around crimes and such.


You know, I've seen major cases where they use, you know, pseudonyms, right? Somewhat anonymous: Suspect 1, 2, 3, or whatever. Just because they know that this information gets proliferated in the media, and especially if it's bad information, or it's not accurate, they don't want it to follow the person.


So I think this should be a heads-up for any organization that's reporting on criminal matters. Also, the UK has a rule: say, for instance, someone committed a crime and went to jail, and then, you know, got out of jail. Their conviction is considered spent.


So at that point, you can't really hold it against them. That's something that people who are doing advertising, doing marketing, and doing these stories need to understand. It's just very different from the US in terms of what they consider the protection of the individual.


All right, Sameer, do you want to talk about: Irish regulator could halt Facebook and Instagram EU-US data flows in May.


Sameer Ahirrao: Yeah, that's an interesting one. There's been a lot of news around the EU and the US, and how regulators are actually going after this. And of course, I think the Max Schrems decisions and all those rulings kind of set this up.


You see that, and it's interesting; I mean, it's really serious. There was one of those recent news items about Google Analytics, where they ruled that you can't use it in the EU. So I think we're going to hear more about that, more and more, in terms of how we use all these technologies. And if you recall, there was other news where another regulator, French or German, said,


thank you, Facebook: if you can't provide your services under our laws, fine. I think Facebook had said that they might have to just pull out or not offer services in that region if the laws didn't loosen up for them, and the regulator said, thank you, we don't want those companies here. So it's becoming more interesting, more pro-citizen.


And of course, the EU respects privacy as a fundamental right; that's just an example in the legal language. One interesting thing in how they're interpreting this: it's about Facebook and Instagram, while WhatsApp is still excluded. So your phone numbers and data collection go there, but they say WhatsApp has a different data regulator as a registered entity.

So they are going after, I think, scoping Facebook and Instagram first; WhatsApp is on a different timeline. So I think with the whole regulatory friction around those data transfers between the EU and the US, or even the EU and the rest of the world, they're going to work out the concerns there. And I think more companies are going to have to change their strategies.


And if you are changing your strategy for such a large company, I think the message sent to the entire world is: hey, you should not be doing deceptive data practices in the EU. At least that's how it feels for now. So I think we're going to see more of that.


Debbie Reynolds: What do you think, Jeff?


Jeff Jockisch: Well, you know, there are a lot of high-stakes games of chicken going on.

You know, you've heard that Facebook's talked about sort of pulling out of the EU, and the EU has talked about halting the data transfers. I don't really see either of these things actually happening. I think there's probably going to be some sort of agreement that's reached. What I think is probably going to happen is that Facebook's going to figure out a solution that allows them to keep European data in Europe.


And, you know, all of this probably goes away. They're probably avoiding that solution for whatever reason; it's probably not very easy to accomplish, and probably not very efficient monetarily for them to do. But that's probably what ends up happening, in my mind.


Debbie Reynolds: We saw last year that the EU-US Privacy Shield was invalidated, and we've been waiting; we thought a renegotiation would have happened already, but it just has not. So I think a lot of these companies are caught in the middle, and it's just going to be a tough way forward.

Jeff is probably right; they'll probably come to some agreement. You know, the really big sticking point in Europe has been the CLOUD Act and US surveillance, you know, the reach of the US federal government to US companies that are based here but have operations in foreign countries. So I think until that gets straightened out, we're going to see a lot more cases like this, where they're like, you know, get out of our country


unless you can do X, Y, and Z. So this is a tough issue, for sure. I want to thank all of our special guests today: Jeff Jockisch at Privacy Plan, Sameer Ahirrao at Ardent Privacy, and David Kruger at Absio Corporation. For anyone who's interested in finding out more about the Data Collaboration Alliance and the Node Zero Community, head over to datacollaboration.org/community.


Thank you.


