The Data Drop Panel: November 2021
Updated: Jan 8
Our host Debbie Reynolds takes a deeper dive into some of the most important, concerning, and downright fascinating data privacy and data protection items covered by the Data Drop News podcast in recent weeks.
Pro tip: get The Data Drop on your phone by subscribing to our podcast.
Debbie: Hello. My name is Debbie Reynolds. I'm a global data privacy advisor and strategist from Chicago. Very happy to be here today with all of you on the Data Drop Panel. I am a member of the iOWN community and the Data Collaboration Alliance. So welcome to the Data Drop Panel. I am a guest host, and each month we gather some of the finest minds in data privacy and data to talk about the stories that are of interest to them, whether it's something that raised their eyebrows or had them clenching their fists. I should note that all the stories we feature today have also been included on the sister Data Drop News podcast, which delivers a four-minute data privacy news roundup every other week. All right, so today we have a great panel.
I have the good fortune of actually knowing two of the three guests, and I'll be happy to get to know Daniel as well. The guests are Daniel Knapp, who is a data privacy consultant based in Atlanta, Georgia, and the Principal of Red Clover Advisors.
We have Peter Barbosa, I know him. He's a co-founder and CEO of Opsware Data in Canada.
And then Jeff Jockisch, who really doesn't need much introduction. I call him the data wizard. He's a data researcher and principal at PrivacyPlan, which is a data privacy consultancy, and he's also a dataset provider.
All right, thank you gents for joining me on the Data Drop Panel today.
GDPR fines top €1 billion in Q3 2021
Debbie: Let's see. I will love to start with Peter. What news came up for you this week or this month that you really want to dig into a little bit more?
Peter: Yeah, absolutely. Happy to start this off. So, the article I'd like to call out this week I found in The Brussels Times, and it's that GDPR fines for the third quarter of 2021 almost equal 1 billion euros, which is completely insane.
I've been watching these closely since 2018, and if you actually look at the enforcement, there's been an ongoing trend of it constantly ticking upwards, month after month after month, of fines being issued to companies. So for Q3 of 2021, believe it or not, the majority of those fines came from two particular companies.
The first one, and I believe it's in my notes over here, was Amazon for 746 million euros. If you recall, I believe that was back in June or July, and more recently we saw WhatsApp, which was a 225 million euro fine. So very huge fines. Those are probably the largest fines we've ever seen issued.
It's also important to note that those are fines issued, right? So they still have to go to court, and the actual fines being paid and settled will likely be significantly less. If you remember, I think it was the case of British Airways, they were originally fined 200 million euros, but I believe the actual fine paid was 20 million. But still, a very significant jump in Q3 of this year, and it will be interesting to see how the rest of the year pans out for fines and penalties.
But yeah, those are really fines issued, not paid, but still significant. And then, you know, there were, I want to say, like four dozen much smaller fines that were issued throughout Q3 of 2021 as well.
Debbie: Right. Yeah. I think with Amazon's fine, they're not going to appeal that one. They just sort of agreed to whatever that payment was, but I think WhatsApp, they're definitely gonna appeal. So people are gonna watch that on appeal and see if it gets reduced or not. Very interesting. Yeah. I think especially because a lot of people have been complaining. I think people in the US, we thought that as soon as GDPR came out, like that day, they'd start fining people, and because it's taken so long, I think it sort of took the wind out of people's sails in some way.
But now that we're seeing these things kind of culminate, you know, there is progress happening and there are things happening. So definitely keep an eye on it.
Over 60 million wearable fitness tracking records exposed
Debbie: I would love to talk with Daniel about fitness tracking records.
Dan: Yeah. So, last month it was discovered, and it wasn't necessarily even a breach, there was an investigation, and it was uncovered that a third party known as GetHealth had a non-encrypted, non-password-protected database that they had compiled of users of their third-party app and service on Apple Watches and Fitbits, along with other wearables. And of course, the Apple Watch and Fitbit are the two biggest fitness trackers out there.
And this is an instance where, you know, the fault was entirely on the side of the third party. They had a non-encrypted database with usernames, first names, and geolocation info, along with the health info that they compiled. And you have a situation where, once again, you have users who are, I think, just trusting the service because it's accessible through their Fitbit or through their Apple Watch, so they presume it's trustworthy and, you know, up to the level of the privacy policies they've agreed to with Apple or Google, and that's not necessarily the case. And, you know, this has me wondering: what technical constraints and further policy agreements can companies such as Apple and Google roll out to better protect their end-users from third parties that are not necessarily malicious, but are not being as diligent as they should be?
You know, I think there needs to be more education at the very least, for end-users to realize, you know, even if it appears to be a trustworthy and legitimate service, you want to do a certain level of investigation on your end to make sure they're actually protecting your data. But then beyond that, are there further technical constraints that the operating system owners and the device providers and manufacturers can put in place to protect users from even potentially engaging with third parties that are a little, or in this case a lot, less up to snuff when it comes to data protection? So I think these are the sorts of conversations we're going to continue to see every time a breach such as this happens.
Debbie: Yeah, that's actually a really good one. You know, we are seeing Apple with their App Tracking Transparency almost go in the other direction, which is, you know, letting the third parties fend for themselves and create that relationship directly. So maybe there's something that can be done, because you're right.
Some people think, oh, because I'm using an Apple product or something and I go to a service, it's covered in the same way. So I think what they're trying to do is say, you know, third parties, you have to create a first-party relationship with the consumer. You have to ask them questions, ask them for consent, and things like that.
But I think you're right in terms of consumer education. People may not be aware that even though they're using a device, the other apps they use may or may not have the same protection or security.
Peter: Yeah, I think, Debbie, you're right. A lot of consumers, you know, just because they're purchasing an Apple product or a Fitbit product or that new Jawbone wristband that they really like, I think once they do purchase it, they immediately assume it's a big company, you know, a lot of validation. I bought it at the Apple store, so I'm protected as much as Apple is. But then again, you have all these third-party services that are constantly tracking and collecting this data. And I totally agree with what Dan says.
I'd like to see some tighter contracts in place between these companies that collect the data and whoever they share that data with, to actually protect consumers more, and to see those extra technical provisions and, you know, more transparency for customers as well, as far as what actually is being shared.
And obviously dumbing it down, but we'll see what happens in time.
Debbie: Excellent. I agree with that. I agree with that.
App Annie to pay $10M in landmark SEC action
Debbie: Well, Jeff, you want to talk about a landmark SEC action?
Jeff: Absolutely. Great to be here. Thanks for the introduction earlier. I want to talk about the intersection of what Peter was talking about with fines and what Dan is talking about with breaches.
You know, in this case, I think it's really a breach of trust, because the SEC has fined App Annie $10 million for what's effectively a breach of trust, right? What App Annie does, if people don't know, is rank downloads and other statistics for applications, mobile applications.
And they sort of act as a third party in the mobile ecosystem. And what's really sort of interesting about this story is a couple of different things. One, it's a pretty huge fine in the privacy world. I mean, it's not on the scale of, you know, an Amazon fine, but for a company the size of App Annie, it's pretty huge.
Right. Second, it's a fine that's coming sort of out of left field in terms of privacy regulators, because you don't really hear about the SEC making privacy fines, right? So it really sort of demonstrates just how fragmented the US privacy landscape is and how many different regulators there are out there.
And we've literally got dozens of different organizations that regulate privacy in the United States, right? And third, I think it also sort of speaks to how byzantine the mobile data ecosystem is, right? That App Annie was able to get away with this for how long, right? I mean, it was like years that they'd been doing this, where they'd been taking all this information from companies large and small and giving away information that was supposed to be de-identified.
And it wasn't and they were just giving away all of this PII to whoever they wanted to. And that's really scary.
Debbie: I agree. I agree with that. You're right. A lot of people don't associate this with the SEC, and the SEC is fining people. Actually, we're seeing them go after people for cyber stuff as opposed to privacy, but this is more or less a combination of the two. So the regulators, in addition to the things we see in the news about data breaches, like a bad actor coming in and doing something, are also looking at third-party data sharing without the consent of the consumer. That is kind of what the SEC is looking at, and also the FTC. So that's something definitely to look out for. Dan, you have a comment?
Dan: Yeah. Well, I just wanted to build on this by commenting. Yes, Jeff, it definitely seems like it flows very thematically from what I was just discussing, because here's yet another instance where you have a company or an entity that is just one step away from the larger, more trusted entity or entities that the consumers are doing business with. And, you know, in the one case I was talking about Apple and Google; in this case, we're talking about, in many cases, major app providers, and yet here is App Annie, just one step away, doing something that's not trustworthy.
And so it's just another instance of what we've got to remember as privacy professionals, consumers have to remember, tech companies have to remember: no matter how good our policies are, no matter how strong our protections are, or at least we think they are, they're only as strong as the protections and the agreements we have with the third parties we're working with as well. And, you know, as someone who, when I first got involved in privacy, the very first thing I was assigned to do was work on data flows and data mapping, this is near and dear to my heart, because it's always been part of what I look at. But it just serves as yet another good real-life reminder.
These are the sorts of things we need to be paying attention to.
Debbie: What do you think, Peter?
Peter: Yeah. I mean, it kind of comes back to the last story: third-party due diligence is so critical for companies. And I looked at this article a bit more in-depth. It was 2014 to 2018 that they'd been doing this, right? Like, that's a long time.
And App Annie, they support a lot of major mobile applications. Like, I know they're really popular in the mobile gaming space, but Pinterest, LinkedIn, they've got some big customers. And to full-on, I mean, from what I understood, they were manipulating the estimates and the analytics for some of these companies, which is just a big no-no in my books.
So again, I think third-party due diligence is critical. I think it's something that a lot of companies need to focus on regardless of their size. So yeah.
Debbie: Yeah, it's a big ball of wax. You know, especially people who are developing apps, a lot of times they're very focused on, you know, look at these cool features, all the stuff that we can do. And just because you can technologically do something doesn't mean that you should do it, legally or ethically.
Italian data authority seeks clarifications on privacy from Facebook over its new smart glasses
Debbie: Peter, you have something about the Italian data authorities seeking clarification on privacy from Facebook over these new smart glasses.
Peter: Yeah. So, I mean, you just said that headline right there. I found it on euronews.com; it's also on the Data Drop as well.
But I'm sure this was pretty controversial when it came out a week ago. Facebook partnered with Ray-Ban to come out with, I call them spy glasses, but they really are spy glasses: cameras embedded right into the actual Ray-Ban frames. And I'm a big fan of Ray-Ban. I was pretty shocked to see this myself, and I think really the only way they tell users that they're actually recording is a little red light on the glasses themselves. And someone asked Facebook about it, you know: so what if someone just covers up the red light with a piece of tape or something? And I think they responded, well, that's against our terms of service, don't worry about that.
I only wish it were that easy for people not to worry about, but it is extremely creepy. And yeah, I mean, it's definitely not privacy by default or privacy by design. I mean, it's far from it, but this is Facebook, unfortunately. Yeah, not the biggest fan of this at all.
Debbie: Yeah. You know, this was definitely going to happen anyway, right? We knew that, especially with people who are using, like, the Oculus headsets or whatever; you can't really walk down the street with those. So this was inevitable. This was going to happen. And, you know, I think a lot of these companies are seeing how far they can push it.
And because, especially in the US, there just aren't a lot of guardrails, you know, for this stuff, they can kind of just throw things out there and see what happens. But the Italian data authority is seeking clarification, and I think they're not even the only regulator in Europe that's asking for clarification on this.
So what do you think, Dan?
Dan: You know, this just strikes me as really on a whole other level beyond your typical smart home device or your typical Internet of Things device, because more often than not, regardless of what the privacy concerns might be about IoT devices and smart home devices, they're usually in the primary user's home or adjacent to their home in some way, shape, or form.
And that individual is, at least on some level, signing off themselves on what level of privacy they are or are not agreeing to. This is just different, because, you know, it affects not mainly the privacy of the person wearing the glasses, but presumably whomever they encounter walking down the street, dining at a restaurant, wherever it is they may be.
And I think, Peter, beyond the example you gave, I actually saw an example where Facebook was asked, well, what's to stop someone from using this, say, in a public restroom? And again, they said, don't worry about it, that's against our terms of service. Yeah. Okay. But, you know, there's no way to actually then physically stop someone, or technologically stop someone, from using their smart glasses in a public restroom.
So there are, I think, huge concerns with this product that go well above and beyond what we're used to even addressing. I mean, I really kind of had to laugh, because, you know, the word I'll use to describe Facebook's responses is chutzpah. I want to use a different term, but I'm not going to for our family-friendly podcast.
But my thought was, they have a lot of, you know what, and I don't mean that in a positive way in this case. So yeah, this really blew me away, that they've even introduced this product, and it will be interesting to see. They might get away with it here in the States, just because we all know how, outside of California, the US tends to operate, but I don't think Europe's gonna have much patience for this.
Peter: Yeah, a hundred percent agree. And Debbie, to your point, this was inevitable. But Dan, I think you're right. This is going to be a no-fly in Europe, and I'm hoping it's eventually going to be a no-fly in the US as well. We already saw it with the Nest doorbells, right? With the cameras on them, there were some privacy issues that happened in Europe. I think we're going to see the exact same thing with this. I don't think it's going to be around for, hopefully, very long. But again, we'll also have to see how individuals in the US respond to this, because it's extremely creepy, to me anyway.
Debbie: I love Peter. He's so optimistic. Yeah. This thing is going all the way to the end, all the way to the finish line. This is kind of their entrée into the metaverse. The metaverse is supposed to be the internet around you all the time, always on. And actually, I did a video a couple of days ago about video surveillance with audio capability.
So this falls into that, where there are a lot of different laws and regulations around what you can do with video and what you can do with audio, depending on where you are, you know, the jurisdiction. So I think it's almost like saying, okay, you buy these glasses, you do this thing, and if you get in trouble, it kind of sucks to be you, in a way, because once they make the sale, it doesn't really matter to them how else you use it, in terms of legality.
Employers hold too much power over information
Debbie: All right. So, Daniel, we have you next for "employers hold too much power over information".
Dan: Yeah. I thought this was just the perfect article to discuss with fellow members of the Information Ownership Network. The World Economic Forum has raised concerns that we are living in a time where companies increasingly track and rate their employees based on big data that they compile. And, you know, this isn't even so much, I mean, the related article touches on things like, you know, when you're a new employee, companies will begin to track certain kinds of data for any employee. But it goes beyond that: in some countries there's actually health data tracking, and in other cases it's purely performance-based. Employees can be tracked and rated and either retained or, in some cases, even dismissed based on data compiled by their employer. And that's concerning in and of itself, but this article is specifically focusing on the fact that this is data about you as an employee that determines whether or not you're considered a good employee, and you have no way to actually transport said data with you from employer to employer.
So it's really an employee-specific DSAR we're talking about here, for all intents and purposes. And, you know, I think we're used to talking about portability requests from service providers; we're used to talking about data portability requests from medical providers. But the concept of getting your data in a portable format from an employer, because this is data that is about you as an employee...
That's something I haven't seen a whole lot of focus on. And, you know, basically what's happening is that the World Economic Forum recognizes that, especially over the last couple of years since COVID, the volume of personal data that employers keep on their employees has just increased substantially, to the point where this is actually a specific area that we need to start focusing on as DSAR-able, and we haven't before. So again, I just found that especially interesting. It hasn't been something I've focused on so much; you know, working for a smaller company, it might not affect me directly, but I see how it could be a concern. I think it's something we could expect to see again, starting with our friends in Europe.
But then moving beyond that it could be very much an area to highlight in the next wave of privacy regulations.
Debbie: Right. And also there isn't as much transparency around that, depending on where you live. So in the US, employees have very little power over stuff like that. And, you know, I think there were articles that came out a couple of weeks ago, I think about Uber, and ride-sharing in general: what do you do when the algorithm is your boss? You know, so that's definitely a concern.
Dan: And I can't recall which of the major delivery services it was, but some employees had actually been docked for, for example, looking in their side-view mirror while driving, because that was, you know, determined to have made them a slightly less efficient driver.
So that's absolutely concerning, especially, you know, for those of us who are on the road next to those people and probably would prefer they be looking in their side-view mirror at times.
Peter: Yeah. I mean, I think, Debbie, correct me if I'm wrong, but employee data is in the scope of CPRA, which is coming in January 2022. Is it in scope? Sorry, mainly a question for you.
Debbie: A little bit, not as much as you think.
Peter: Yeah. I'd definitely like to see, you know, the US pick up on that. In Europe, privacy is treated like a human right, right? That's how it's looked at. And in the US, I think it's more like a consumer right. Now, I'm in Canada, and I feel like we're kind of in a happy medium here, where we think it's a human right as well, to some degree, but I think it depends on what context you're speaking of.
But definitely, I think, like Daniel said, I would like to see, you know, data requests, or, you know, access requests, be in scope for some of this new legislation we're seeing emerge across the US states. I think it will be very critical, because it's also about the employees as well. And I think, you know, even with Apple, I think recently there was a new story that came out where Apple employees are required to link their personal Apple IDs to their work devices.
Who knows what's happening there as well? So I definitely think there's big room for improvement here. And as people start going back to work in offices, I think the amount of data being collected will just increase. It's increased working from home, for example, right? We're all kind of connecting over camera right now, but when you go to the office, I think with COVID there's a need for temperature checks.
That's all going to tie back to your name and your employee ID, and they're just going to be expanding the scope of data they're collecting. So I think we need to do that much more to protect that data and give the rights back to the individuals.
UAE announces new federal data law
Debbie: Jeff, you want to talk about what's happening in the United Arab Emirates?
Jeff: Sure. Yeah. Thanks, Debbie. The story that I've got here is that the UAE has announced a new federal data law. And the big story here is not so much that they've announced a new federal law, even though it would be the first of its kind for the UAE; currently, their data regulations sort of differ based upon the different free trade zones that they have.
But what I really want to talk about here is that, you know, Asia is really sort of picking up its game in terms of privacy laws. The UAE actually right now, I think, has like four different privacy regulators, at least that I know about; there are probably even more, even for just a small state. But if you look across Asia, right, China just passed this huge new PIPL legislation. South Korea is getting an update to their legislation.
Singapore just updated their law. India, which is sort of close to there, you know, is getting ready to update their legislation. And we really just have to look across the globe. Everything is really changing with privacy. It's not just Asia, but I guess I just wanted to point out to people that...
The world is really moving forward with privacy legislation, and we really need to be paying attention, not just to our own country here in the United States, but worldwide. Everybody is upping their game in privacy, and we need to be paying a lot of attention. Dan, what do you think?
Dan: I was going to say, that said, could it, should it, will it be the wake-up call that the US federal government finally needs? That nations that are not known to respect their residents' privacy at all, such as China, such as the UAE, are moving ahead, far ahead of the US, in terms of privacy regulation? I would think if anything could be a wake-up call, it should be that, because all of a sudden it also becomes a political issue.
Should it be? Maybe not. But, you know, whatever party gets ahead of it can say, look, our adversaries (and not that the UAE and China are necessarily adversaries, though in the case of China, in some ways they're thought of as one) are actually moving ahead of us in this important area that, you know, is somewhat of a focus for people everywhere.
Maybe it's going to finally be that kick in the pants that we've needed. You know, one can only hope.
Peter: Yeah, I totally agree. I mean, this is now becoming a global topic and a global issue, right? And, you know, in India and parts of Asia, you now see privacy regulations.
And I heard, I think it was last year, Gartner came up with a report, and they said that by 2023, 65% of the world's population will be covered by, you know, modern privacy laws in the vein of GDPR. I think more and more, as time goes on, I'm seeing more jurisdictions, whether they're big or small, like, you know, the United Arab Emirates, which, to Jeff's point, is not a major state.
But I think that's a very conservative forecast by Gartner, like, being only 65%. When you have, you know, China and India, and you have US states popping up laws everywhere, with California, the biggest state, already having a law in effect, I think you're hopefully going to see it be above 65% by 2023.
We can only hope, but it's exciting to see these laws come into place, all modeling and following, you know, similar steps to the GDPR and giving privacy rights back to consumers. And the legislation proposed in the United Arab Emirates, I think that's also very deep in scope, in being for individuals, not just consumers. So very excited to see that as well.
Jeff: I think we need to do an analysis of that, of what actual percentage of the population is now covered by modern data privacy laws. So I think we could probably update those numbers.
Debbie: I agree with that. I don't know. I thought when the US Privacy Shield was invalidated, that would be kind of the US wake-up call that we needed to sort of do something, and really, I'm stunned that nothing significant has happened as a result of that. So, you know, I don't want to throw cold water on anyone's ideas or, you know, wishes for that, but I'm hoping that will be the case.
But I think in the US, we're very much motivated by commerce. So if it becomes a significant impediment to commerce, I think that's where we will see more movement.
What do you think Peter?
Peter: I'm also starting to think that, you know, the more we see state laws introduced in the US, the harder it will be to pass a federal law, because I'm concerned the feds will just say, "Hey, look, this is a state issue and the states are taking care of it; this is not on us."
So as much as I'm in favor of the state-level legislation, that they're actually doing something about this and giving a damn, I don't know, I think it might make things harder in the long run.
Dan: Well, I believe Google just went ahead and, you know, put in a very forward request for a federal privacy law.
So when you say commerce, yeah, that might be what it takes, right? When we have the largest companies in the US start appealing to both the executive and the legislature, saying we want federal privacy laws, that might be what it takes to get Congress to listen. We shall see.
Debbie: I'm hoping that something will happen. And I do see, you know, obviously, the states are definitely taking charge here. I dunno, in some ways I'm dismayed, because I feel like what's happening right now in the US is that some people are saying, well, we have laws on the books for some of this stuff, so let's try to use that and maybe try to stretch it somehow.
But what we're seeing in these other countries is that they're saying, you know, we need agencies to deal with exactly this particular issue; we need a strategy on a federal or national basis to deal with this. And we're not seeing that same type of movement here in the US, in terms of strategy and sort of making a federal umbrella of laws.
Dan: And to that point, in California, I heard Mactaggart very clearly say he's concerned about a federal law that might supersede California's law and actually be more watered down than what they have now. So there is the other side of it: the handful of states that already have regulations in place may or may not themselves be on board with a federal law, depending on what is in it.
So there are multiple layers to this, and I think the drama is only just getting started.
Debbie: I agree with that completely. I think the drama is only getting started. So buckle your seat belts. It's going to be a bumpy ride, I believe so.
Debbie: Well, thank you once again to our guests: Daniel Knapp of Red Clover Advisors, Jeff Jockisch of PrivacyPlan, and Peter Barbosa of Opsware Data. We'd also like to invite you to check out the iOWN community, the Data Collaboration Alliance, and the Data Drop podcasts. Thank you!
The Data Drop is a production of the Data Collaboration Alliance, a nonprofit dedicated to advancing meaningful data ownership and democratized IT.
Data privacy, data protection, and compliance professionals should check out iOWN, the Information Ownership Network. The Alliance also provides students and mid-career professionals with free learning in data-centric skills via the Data Collaboration University. To learn more about the alliance and these initiatives visit datacollaboration.org.