
The Data Drop Panel: June 2021

This month, our host and self-confessed ‘data protection contrarian’ Carey Lening takes a deeper dive into some of the most important, concerning, and downright fascinating data privacy and data protection items covered by the Data Drop News podcast in recent weeks.


Carey: Hey, my name is Carey, I'm the data protection contrarian, and I want to welcome you all to the inaugural episode of the Data Drop Panel, hosted by the Data Collaboration Alliance. Each month I'll be sitting down with privacy pros, data mavens, transformational technologists, and anyone I can find off the street to take a deep dive into the pressing issues around personal data, including data protection, transparency, and, perhaps most importantly, ownership and control.


So you might be asking yourself, "Okay, what's the Data Collaboration Alliance and why should I listen to yet another privacy podcast?" Well, the Data Collaboration Alliance is a Toronto-based nonprofit dedicated to building a future where data is fully controlled by its rightful owners, guided by #AccessNotCopies.


The Alliance does research into data ownership, control, and best practices, and also offers some really cool free training on the data collaboration approach. We also produce the Data Drop podcast, which offers listeners a weekly news roundup, plus this monthly panel.


We hope to make it worth your while by offering insightful analysis, good verbal repartee, and discussion of pressing topics in privacy. Think of this as a non-partisan version of William F. Buckley's Firing Line. And there I go, dating myself. So with that, let's go ahead and introduce today's guests.


First up, we have Dan DeMers. He's based out of Toronto, Canada. He is the President of the Data Collaboration Alliance and the CEO and Co-founder of Cinchy, the Dataware platform.


Next up, we have the lovely Sarah Clarke, who's based out of (and I know I'm going to say it wrong) Teesside, Northeast England. Sarah is a data protection and cybersecurity governance specialist working for her own firm, Infospectives Limited.


Next up is Jeff Jockisch from Port St. Lucie, Florida in the US. Jeff is the CEO of PrivacyPlan where he does independent data privacy research and creates privacy-centric data sets.


So that's the panel, and we have three topics for today. First up, we're going to talk about vaccine passports, then a little bit about Apple's new app tracking tool, and finally, if we have time, the latest of Facebook's many data breaches.


The privacy challenges of vaccine passports

Carey: So the US, EU, UK, and numerous other countries are all rushing ahead now that we have vaccines, and they want to get to a functional system for tracking who has and hasn't been vaccinated. Proponents of these so-called vaccine passports argue that certifications will help reopen economies safely for those who are fully vaccinated, but many folks, especially in the privacy sector, have concerns about scope creep and privacy abuses. A few days ago, the UK announced that it would be rolling out a digital vaccine certificate via the NHS app, which is already widely used to arrange doctor appointments and includes access to patient medical records.


Meanwhile, the EU has announced plans for a slightly more subdued version called the Digital Green Certificate, which tracks vaccine status and whether someone has tested negative for COVID-19. And in the US, of course, there are IBM's efforts with various states, including New York, to develop a digital health pass built on the blockchain.


So the European Data Protection Supervisor states that a vaccine passport must not allow for and must not lead to the creation of any sort of centralized database of personal data at an EU level. But beyond that, there hasn't been a lot of guidance and there certainly isn't a lot of legislation around this issue.


So I have a couple of questions for you panelists. First things first, and come on, let's be honest here, how many of you are going to sign up for a digital vaccine passport?


Sarah: Well, I'm going to kick off and say that unless I need it - unless I need it to travel, and depending on where it creeps into in terms of controlling access - I will attempt to avoid it. When the UK COVID app was pushed onto the Google and Apple framework so that it was decentralized, I was incredibly supportive of it. The people involved were people I knew were good people. They were sharing impact assessments. It wasn't this evangelical zeal of let's just get it done; there was some constructive information. I haven't got that with the intent to gather the data centrally through this specific process.


And it's not that they haven't already got our NHS records. I have absolutely no issue with a direct purpose that has a public health interest. What I don't have is clarity about limiting it to that.


Carey: And the NHS app in particular - I've read up on it, and it's a little concerning how much overlap they plan to include.


And, you know, there are some kind of weird things that I've heard about biometric data and other things, and that one strikes me as a little dicey. But what about something more like the EU passport, the Digital Green Certificate, where it's literally just yes or no: I've been vaccinated, or if I haven't been vaccinated, I've tested negative for COVID-19?


Sarah: If it's limited to those purposes, if the data are minimized for the purpose. It's all about the purpose, and it's all about making sure it stays that way, that it's secure while it's happening that way, and that we are very clear about all of those things.


And it's more than just assurances: it's contractual, and the third parties are also covered off in those same ways. Then I would have far less issue in that case.


Dan: Yeah, and for me it's a very similar perspective. My only real concern is: does this set a precedent that the world rushes into, one that facilitates the eventual expansion of scope to the point where, you know, what about other vaccines?


What about kids proving that they've been vaccinated at school, and does it stop there? Does it go beyond that and cover your entire medical history? That's where we just need to be really, really careful. But for myself, as a one-time use for a very specific issue - and potentially even setting a bit of a precedent for such issues, which don't happen all that often - I think that's the only way this is actually going to work.


Carey: Yeah, no, I agree. What about you, Jeff? What do you think?


Jeff: Well, I don't love the idea of a digital version of a passport for COVID. I do think there are reasons to do it, and there are certainly a lot of reasons not to. Scope creep is a definite concern. But on the other end of the spectrum, people that are banning COVID passports - I've got a problem with some of that, right?


And I think if you're banning passports because you're trying to look strong for your political base, you're probably just a demagogue trying to support the same people that won't wear masks now, or perhaps won't get vaccines for any kind of illness. So I've got a real problem with that. But in terms of supporting vaccine passports, I think there are certainly some reasons to do it.


And I would definitely be in support of a vaccine passport if it were administered by a nonprofit agency that I trust. If it had to be a centralized database, I'd be much happier if it were in that kind of organization. And frankly, I'd be much happier if it were a paper passport rather than a digital one.


Carey: The old-school way of doing things. That's what they did before and during the 1918 pandemic.


Jeff: I mean, it's worked with yellow cards, you know, the yellow cards. So why can't we try that, if we're going to have to do this? And the other thing I think we need to be really careful of is, if we're going to put vaccine passports in place, we need to be very sure there are rules against discrimination, or for people who need to have access to essential services.


Carey: I completely agree. In fact, I think the discriminatory aspects are very interesting, and they really aren't being discussed as much as they should be. I have a very firm position on vaccines, but I don't think it's necessarily right, by the same token... there are concerns out of Israel, for instance, where they've cut services and access off to people who can't or won't get vaccinated.


Jeff: Yeah. And I'm not trying to say that I want to support the behavior of people who won't get vaccinated. But I think there are probably reasons that some people can't get vaccinated - a valid reason, a life-threatening illness, maybe religious reasons or something. I'm not going to try to decide who has a valid reason or not, but I think there are valid reasons.


Carey: So here's one as a follow-up, Jeff, because I can't get away with a blockchain topic without pinging you about it. What do you think about the IBM digital health pass and putting this information on the blockchain?


I am deeply skeptical, because I find everything on the block... okay, not everything. A lot of things on the blockchain are sort of like the old adage: what happens when you add digital to problem X? Well, then you have digital problem X, right? So in my opinion, sometimes adding the blockchain just means you have a new problem on the blockchain. Prove me wrong.


Jeff: I like it in theory but you're right. In practice, I don't know whether it would work, so I would have to look into it more and I don't really know that all the details are out there yet that would make me a believer.


Carey: Should we put this on Cinchy, Dan?


Dan: I'm not sure. What I would say, though, is that the eventual standard of Zero Copy Integration would be very appropriate for this; whether it's on Cinchy or not is a separate thing. But my view is that the only way to ever get control over data - be it your vaccine records or anything else - is for the owner of that data to have control over it.


And for that data to be treated as if it had value, like other assets, like intellectual property and money, where you can't create copies of it. Because it's the copying that creates the risk. So this would definitely be in scope for Zero Copy Integration.


Carey: So my last question. Sarah, do you think it's possible to have a user-centric or user-focused vaccine passport initiative, since you're skeptical of the current crop of efforts? I guess the question is: how do we apply vaccine passports, or vaccine record-keeping generally, fairly and ethically?


Sarah: Well, I'm not an app designer. But there are incredibly creative people designing software in this space, and I don't see why we can't have a decentralized record if it's just the fact of vaccination or not.


Why does it need to be linked to your unique identity? Certainly, people could be given a unique code associated with confirmation of their vaccination status, one that is only put back together with anything resembling a medical record in the protected backend systems that don't have to be shared with third parties.


The check is simply: is this a valid pairing of an alphanumeric string and a positive statement of vaccination? And if I, with my little bit of technical knowledge - enough to be dangerous - can say that, and I've got people like Jeff nodding along suggesting it's probably feasible, and from what I understood from Terence Eden's kind sharing around the NHS COVID app (I haven't spoken to him personally), why not? Why shouldn't we? And they specifically rejected the COVID app as a candidate for the vaccine passport, because it didn't allow them to get centralized storage of data.
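
To picture what Sarah is describing, here is a minimal sketch of a verifier-side check - purely illustrative and not drawn from any real passport scheme. The verifier only ever handles an opaque code and a yes/no answer, while anything resembling a medical record stays in a protected backend. The VaccinationRegistry type and the sample codes are hypothetical.

```swift
import Foundation

// Hypothetical protected backend: maps opaque, randomly issued codes to a bare
// vaccination status. No name, date of birth, or medical history leaves this registry.
struct VaccinationRegistry {
    private let records: [String: Bool]  // code -> vaccinated (or recently tested negative)

    init(records: [String: Bool]) {
        self.records = records
    }

    // The only question a verifier gets to ask:
    // "is this a valid pairing of an alphanumeric string and a positive statement of vaccination?"
    func isValid(code: String) -> Bool {
        records[code] ?? false
    }
}

// A verifier scans a code from someone's phone or paper card and gets back yes or no.
let registry = VaccinationRegistry(records: ["7KQ2-M9XA-41BC": true])
print(registry.isValid(code: "7KQ2-M9XA-41BC"))  // true  -> admit
print(registry.isValid(code: "UNKNOWN-CODE"))    // false -> fall back to other checks
```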


Carey: I am genuinely deeply worried about some of the UK stuff, but that's a different conversation. I don't want to get into another Brexit debate.


Apple's App Tracking Transparency Tool (ATT)

Carey: So we'll move on to topic number two, Apple's new app tracking tool. A couple of weeks ago, Apple released its new App Tracking Transparency (ATT) privacy protection framework as part of its latest iOS 14.5 release.


I don't have an iPhone, so I follow this but don't really have a dog in this fight, I guess. The aim of ATT is to give users direct control over how downloaded apps track them across websites and other third-party applications. If users opt out of tracking, developers are blocked from accessing the user's Identifier for Advertisers (IDFA), a unique Apple identifier.


And that's what allows that sort of tracking mechanism to occur. ATT lets users opt out device-wide, which is kind of interesting, or on a per-app basis. It also requires developers to refrain from sharing information with data brokers, which I did not know until I started looking into it.
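
For developers, the mechanics come down to a single authorization prompt. Here's a rough sketch of the flow, assuming iOS 14.5+ and an NSUserTrackingUsageDescription string in the app's Info.plist: the app has to ask before reading the IDFA, and gets nothing useful if the user declines.

```swift
import AppTrackingTransparency
import AdSupport

// Ask for tracking permission once the app is active. Until the user says yes,
// the IDFA comes back as all zeros and cross-app tracking is off the table.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // User opted in: the advertising identifier is available.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking allowed, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // User opted out (or tracking is disabled device-wide): no IDFA,
            // and Apple's policy also bars sharing other data with brokers.
            print("Tracking not allowed")
        @unknown default:
            print("Unknown tracking status")
        }
    }
}
```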


So there's no question that ATT gives users a tremendous amount of transparency and control, and it follows along with Apple's other initiatives, like the privacy nutrition labels they're doing, and what Google is doing there. But of course, you can't have any good thing without lots of criticism.


Many people, including media outlets, various internet companies, and advertising folks, are complaining. Most notably, Facebook ran a huge anti-ATT campaign complaining that ATT would destroy small businesses who rely on Facebook advertising to promote their offerings.


And, you know, generally bring about the downfall of man. There have also been a couple of potential lawsuits floating around and a few antitrust complaints filed, so people are genuinely concerned. Still, privacy pros have lauded ATT as a good next step, and there are definitely signs that users are actually getting a lot out of this.


And may finally be able to get off the treadmill of always-on tracking. In fact, some reports even suggest that opt-in rates for this tracking are as low as 4-13%, which is potentially a huge impact on third-party advertising.


So the first question for you all: Apple controls nearly 20% of the worldwide mobile market, according to IDC. Do you think this will force a similar transparency effort, or hasten Google's own Privacy Sandbox initiative? Or do you think it will encourage app developers to change their tune about how they develop apps and the kinds of things they track? Let's start with you, Dan.


Dan: First of all, I think the trend towards this is actually inevitable and highly predictable. The business model of the past number of decades - where I'm going to trap and monetize your data - has its days numbered. It's just a question of when; it is simply the inevitable future.


I think what this is actually going to do is drive awareness in the general population that this is possible, and also give better transparency into how it worked even before this. You'll find consumer demand will shoot through the roof, where people will expect this not only of other mobile devices but of devices in general, to the point where I would anticipate this eventually maturing into a standard, such that the connected oven you buy in the future has mechanisms that enable you to have control over this.


Carey: Well, the pop-up window for what your connected oven is going to be sharing with the world would probably horrify me, to be honest. But it would definitely be a good step, no question!


Dan: Well, if you think about it, it's going to know when you're home. It's going to know what you eat. It's going to know what it needs to know to tell you when dinner's done.


That's going to go through the internet, and that information could be correlated with other information. Then if you use the oven, you could blame the person who buys it and turns it on for basically giving it access to this information, because they're putting food into the oven, and therefore now the oven knows about them.


But is it really their fault? You can't inspect every device that you interact with. You need to be able to trust the machines that you interact with. So whether it's your mobile phone or your oven or your coffee maker, you need to be able to trust that. And right now you can't.


So I think the enforcement of this as a consumer-demanded standard will become the future. This, I think, is just the beginning.


Carey: So it sounds to me like you're saying that this is actually going to encourage good behavior.


I'm forever skeptical, because it's always one of those situations where you come up with a new set of rules and someone comes up with a new way to break them, like fingerprinting or any of the other weird on-device identifiers. So Sarah, do you think this is actually going to lead to a change in behavior, or do you think it might just end up in lots and lots of litigation and appearances by Facebook, Apple, and all those other guys before various competition authorities in Europe or other regulatory bodies?


Sarah: So I like the awareness-raising. The challenge I would raise, though, is that I don't see this 97% failure to opt in as any more representative of choice than people previously being opted in invisibly and failing to opt out, because it's all happening in that first fear-of-missing-out moment.


That "I want access to this thing, I want to have this conversation" moment, where you interact with the new app: you've downloaded it, you want to use it, you've seen a notification, you want to get online. So I think it's a good thing because, as I have on my banner on Twitter, it's "privacy and security by default and design."


And this is the "by default" stuff. Also, since I've been so rigorous opting out of everything on the internet, making my life an absolute misery going through 75 pages of E-bay third-parties and what I have to agree on there. I've got such a vanilla webpage these days when it's a nice third party, I am actually opting back into first party cookies because I want to see what they're advertising and what their latest deals are.


So there is a positive side to this, but I do have to question, until we actually have situation-relevant, just-in-time notifications of what you're potentially sharing at a particular point in time, how much awareness there can really be.


And until we've got some kind of collective bargaining available to people around the use of their digital lives and identities, I also don't see a way forward. I don't see that we can lean on individuals to cause the step change necessary. The third point I'd like to make is: how many more data points do we think these companies actually need to put together a reliable profile of us, to use for whatever purpose they choose?


It's what they've already got that bothers me in many ways. Data subject rights, if they're ever enforced the way they're hyped, and that point I made about potential future collective action on this front - that's where the change comes from.


Carey: That makes sense. So, Jeff, what are your predictions on the impact of these efforts?


You know, it's not just Apple. ATT gets a lot of buzz and a lot of news, but these initiatives, as Dan mentioned, are coming. These are things people are already starting to think about on the horizon and already developing tools around. I mentioned Google's Privacy Sandbox.


I know Mozilla is doing a few things in this area too, so this is not new. But what do you think the impacts will be? What will these efforts actually do to companies that make their living through advertising and tracking, and then selling data to data brokers? Because I know you're big on data brokers.


Jeff: So I think data brokers are eventually in for a rude awakening, and also a lot of the ad tech middlemen, right? I think Facebook's probably wrong that small, medium-sized, and large businesses are going to see much of an impact from this.


They've got enough data to be able to target, and target pretty well. I don't think they're really going to see any drop in conversion rates; there's plenty of research that points to this. A lot of the research pointing the other direction is from those ad tech middlemen, who say you need the kind of behavioral targeting they provide.


Carey: Or it's from Facebook.


Jeff: Yeah. I mean, there's certainly contradictory evidence, but my assessment is that it's not going to have a whole lot of impact. I think the bigger picture here is that the tech platforms have a whole lot of control over this situation - whether that's the Apple ecosystem, the Google ecosystem, or other ecosystems that influence this - and to some extent they have more control than governments have over the direction of privacy right now. And while there are probably some concerns in terms of antitrust and how much power they wield, they are having a positive impact on consumer privacy right now, and I think that's a net good for us, at least as a first step.


And then we've got to figure out how we reel them in eventually. But I think the silver lining here is that their interests are not aligned. Look at Apple versus Google, for instance: Apple doesn't make its money on ad tech and Google does. Amazon doesn't really make most of its money on ad tech - it's mostly product sales - even though they do make money on ad tech. And among the other big tech giants, Microsoft doesn't really make its money on ad tech either. So you can really play some of those players off against each other in these coming battles over privacy.


Facebook's massive data leak

Carey: So, the data of over 533 million Facebook users was leaked in April 2021 and posted for free on an online hacking forum. Yay. The leak included phone numbers, Facebook IDs, full names, gender, relationship status, locations, birth dates, bios, and potentially even email addresses.


And this came from Facebook, and that's the problem. It's kind of personal to me because I was caught up in this breach, despite the fact that I no longer had a Facebook account when my account details became part of it. I'm a little pissed, because I deleted my account. I made sure it was gone, but Facebook didn't delete it. More importantly, they had my phone number - not because I wanted to supply it, but because I wanted to turn two-factor authentication on, and they pinky swore they were only using it for security. Guess not.


So that was a problem, and compounding it is the fact that Facebook's approach has been to downplay the breach, which is number 11 by my count - and I actually did the count. And that's not including the breaches at WhatsApp and Instagram, which are also owned by Facebook. In this case, they're classifying the breach as just a regular old scrape of old public data.


They refuse to notify data subjects directly, and have even been kind of jerky to the various supervisory authorities in the US and in Ireland, where I'm based. So, questions for the panel. First off, a quick one: were any of you also affected?


Jeff: Yup, I was.


Sarah: I wasn't. I think I sorted my account in time, or had managed to avoid adding my own phone number.


Dan: And I haven't actually checked. So I need to check. I deleted mine a long time ago and I was a fool to assume that it was deleted.


Carey: Well, I had to have one because, for a period of time, I worked at Facebook, and you have to have a Facebook account when you work at Facebook. I had this crazy idea in my head that I was going to change the world. Anyway, that didn't work. So, Facebook isn't alone, unfortunately, in this new approach of casting these kinds of breaches, these leaks of data, as scrapes and not a formal data breach.


And I think they're doing that in part for legal reasons, because there's no law that says you necessarily have to report a data scrape. But while I think it's partly regulatory and legal, it also seems to be a way of shifting liability and limiting the severity and impact of the breach itself. And they're kind of trying to condition us, I think, to make these things seem normal. Like, "yeah, it just sort of happens."


I don't know. Am I wrong? Am I overly paranoid? What do you think, Sarah?


Sarah: I'm inclined to believe what you believe, Carey, actually. I come at this from a very different position than a lot of the developer types I mix with, who tend to be stateside and tend to have an attitude of "what are you bleating about? It's public data, so it's fair game." And that isn't the point. I've always come at it from the direction that I provide data to people for a particular purpose, to enable a particular thing as part of my interaction with them.


The argument is always that, yes, I'm the product. I knew that. I accounted for what I was getting from them in the exchange. I didn't account for them making it easy for third parties to take it and do what they liked with it. And it isn't as if they were helpless in this: they could have closed the gap in 2019, when there was a gap to close.


And the rules within the GDPR are that you need to have controls commensurate with the risk, which is usually expressed as being in control in line with your own policies, because your policies should be an embodiment of the controls necessary to mitigate the prevailing risks.


They knew this was a possibility because they've had numerous similar things happen before, and numerous people had already reported this particular breach in this context. So there is a lack of duty of care, or a lack of care about their duty. One or the other.


Carey: What do you think, Dan?


Dan: So sadly, I think your point earlier about how this is maybe conditioning people to assume this is just normal is actually true, because it isn't a one-off - meaning this is not going to stop happening. It's going to continue to happen. And as more and more of our inner monologues are digitized, persisted, and copied, there's going to be more and more availability of, and access to, data that we don't want out there.


It's an unstoppable phenomenon. The only way is to basically change how we think about data, the whole access-not-copies thing. That's the only way. Even that doesn't eliminate it, but it dramatically minimizes it, because in this scenario you entrusted one particular organization with a copy.


And it's that copy that was breached and that's the problem. That's the only way that this will ever stop.


Carey: Right. And like I said, I did a timeline as part of a bit of research with Digital Rights Ireland, because they're planning on filing a mass action - they don't call them class actions over here, they call them mass actions - against Facebook for this very thing. We've just looked into the data, and it's horrifying: the number of copies, the number of cases where this data was shared with third parties who had insufficient security controls or no security controls at all.


And they left their databases open on Elasticsearch or in an Amazon bucket somewhere. You're like, what the hell? What is going on? Why are there no controls? Why is there no enforcement? Why are we in this dance again? I think the multiple copies, and the fact that it's way too easy to just share information in that kind of old-school way, is really the thing.


Dan: And even if organizations wanted to minimize that, there are technological constraints on whether they can. Even if people knew how many times your phone number or your email address or your net worth is actually copied, not only across organizations but even within an organization... I used to work in financial services, at big banks with, you know, 10,000 systems. How many of those don't need to know anything about their employees or customers? Most of them do. So it's not that all data is copied everywhere, but every piece of data is copied in many places within an organization and across organizations, and every one of those copies is just waiting for someone to access it.


There's no way to assure protection of you know thousands, hundreds of thousands, millions of copies of my name or my phone number or my address. It's impossible.


Carey: What do you think, Jeff?


Jeff: I mean, I think Dan's right. It's a problem with copies of the data. Right after the Facebook breach, I started getting more phone spam, more SMS spam, more email spam. And I know it's Facebook, but how the heck do you prove that? You can't. And to Dan's point, I give an example in my data broker presentation where students have signed up for a sweepstake to win $10,000 and submit some information about what schools they want to go to. In that one example, their data gets copied to five different organizations, including three data brokers, just from one interaction.


I mean I'm sure there's some disclosure that they click yes on. But it doesn't state that it's going to be copied five times rather than just once. But that happens every day with every piece of information we share with somebody because there's no control over the copies.


Dan: You know how they talk about all this growth in the volume of data? Guess what that is.


Carey: It's just extra copies of the data all over the place. The 8,000 versions of the same Excel spreadsheet. Right. Well, yeah, if you can get rid of the Excel spreadsheet, my friend, you're doing God's work.


Anyway, my last question on this topic before we wrap up: of course, this led to renewed interest in a regulatory response. The Irish DPC actually responded very quickly this time, because it turns out three of the people in the Data Protection Commission in Ireland had their accounts breached. Oops. Same thing with the European Data Protection Board. Good times. It's great that they're starting investigations and launching things, but in practice this has happened before as well.


There have been multiple investigations brought by the ICO, by the US FTC, and by various countries all over the world, and nothing has really happened. There was that FTC action that actually led to something meaningful. But my question to you is: do you think it'll change this time?


Do you think anything positive will come out of this to actually smack Facebook into changing its policies? Or is this just Groundhog Day and we're going to repeat ourselves again?


Jeff: No, I don't think anything's going to change in a major way, though I do think some state legislators are starting to wake up to the fact that their data breach legislation is completely inadequate. Not one of the 50 states in the United States can really go after Facebook for this breach, because it doesn't meet the data requirements: financial data, government ID, or anything like that. Is it public data? Is it classified as PII? Almost all of those definitions are not met, based on how those statutes are written, and they're old and outdated.


Carey: I agree. Sarah, what do you think?


Sarah: Well, the FTC's $5 billion settlement was caveated with indemnifying them against any breaches that happened before, I think it was, the 12th of June 2019. So they're being a bit vague about which breach these data were from; I suspect that may have had something to do with it. I think the changes will come by proxy: if somebody stops them acquiring the next best alternative to keep acquiring data, or impacts their intermediate layer of brokers, because at the moment that is a self-sustaining machine.


There's enough data seeping around to serve that intermediary role that it's really just a self-fulfilling prophecy in terms of revenue generation. If that ceases to be a useful way to operate, then maybe we'll see a change, who knows. But I can't see where it's coming from right this second.


Carey: Dan, as the chairman of this established and lovely organization, I'm going to leave you with the last word.


Dan: So sadly, until the technology changes to make it possible, I don't see this changing anything. In fact, as I said earlier, it's going to get increasingly frequent, and society will have to decide whether to accept or reject that.

 

Carey: I wanted to wrap up and just talk really quickly about the initiative we're working on here. And I of course wanted to thank all of you for coming and being on our inaugural podcast. A lot of us in this room are working on a project called the Information Ownership Network, or iOWN. It's a little bit of a sandbox, a little bit of a toolkit, and part professional networking organization that will hopefully be a little bit nicer than LinkedIn.


And more importantly, it uses the DCA Dataware platform, which was contributed by our partners at Cinchy. So, yay. It's a really cool tool, and the dataware platform gives us a lot of user ownership and control over what we create. Since we're still in the starting phases of the iOWN initiative, we're always looking for fresh ideas on what to build and on what a world of zero copies means.


And on what full data control might actually look like, and how to continue to advance the privacy tech landscape for the better. So, to all of our listeners: we'd love to hear from any privacy, compliance, technology, or security pros, or anybody who's passionate about this subject. We would love to hear your thoughts about how we could use a tool like the dataware platform to do this.


And if you have any thoughts, please feel free to check us out at datacollaboration.org/iown. Thank you all so much again.

 

This video is also available as a podcast. Subscribe at datacollaboration.org/datadrop. The Data Drop is a production of the Data Collaboration Alliance, a nonprofit dedicated to advancing meaningful data ownership and democratizing IT.


Data privacy, data protection, and compliance professionals should check out iOWN, the Information Ownership Network.


The Alliance also provides students and mid-career professionals with free learning in data-centric skills via our Learning Zone. To learn more about the Alliance and these initiatives, visit datacollaboration.org.


