The Data Drop Panel for April 2022
Host Heidi Saas and special guests take a deep dive into the noteworthy, concerning, and downright fascinating stories featured in recent episodes of the Data Drop News podcast.
Pro tip: you can listen to The Data Drop Panel on your phone by subscribing to our podcast.
About The Data Drop
The Data Drop podcast is a production of the Data Collaboration Alliance, a nonprofit dedicated to advancing meaningful data ownership and global Collaborative Intelligence.
Join The Data Collaboration Community
The Data Collaboration Community is a data-centric community where professionals, nonprofits, and researchers join forces to collaborate on datasets, dashboards, and open tools in support of important causes. Learn more.
Heidi Saas: Hi, I'm Heidi Saas, and I live in the Washington, DC area. I'm a member of the community at the Data Collaboration Alliance. Welcome to the Data Drop Panel. Each month, we gather some leading data and privacy professionals to hear about the news stories that stood out for them over the past month or so. In the fast-paced world of data privacy, it's always interesting to hear what's raising the eyebrows and curling the fists of the practitioners. I should note that all of the stories we'll feature today have been included in our podcast, which delivers a five-minute privacy news roundup every other week.
This month on the Data Drop Panel, we have three guests: Cat Coode, data privacy consultant and data protection officer at Binary Tattoo; Jeff Jockisch, CEO of PrivacyPlan; and Chris McLellan, Director of Operations at the Data Collaboration Alliance.
And we're going to get started with the first article, with Cat: Google Analytics to stop logging IP addresses and sunset old versions in privacy standards overhaul. What are your thoughts on that, Cat?
Cat Coode: So I was here a few months ago talking about Google Analytics, and now we're back talking about Google Analytics. Some data protection agencies in Europe came down and said, we do not want Google Analytics in Europe anymore, because it is taking private information and shipping it over to the US. For people who are unfamiliar, there has always been a bit of contention around the IP address.
Is it personal information or is it not? Is it personal information if it's combined with other information? So, because Google does not want to lose its foothold in the world, it is removing the IP address from Google Analytics. It is sunsetting the tool we all know as Universal Analytics, which I didn't know was what it was called, but apparently it is, and it is moving to GA4, which is Google Analytics 4, which has all sorts of privacy and security embedded in it,
so that companies can continue to use the analytics information and not breach the regulations.
Chris McLellan: Poor Google Analytics, right? It had a pretty peaceful existence for 20 years or so, and now it's a political football. This came out of the Privacy Shield review and the Schrems lawsuits. I think it was Austria and France where the authorities first raised the alarm bells and declared Google Analytics counter to GDPR, and therefore potentially illegal.
Cat Coode: Yeah, and even right down to the location. When we talk about data and privacy by design, we always talk about obfuscating data and not collecting specific data that you don't need. So they are finally using what they're calling data-driven attribution modeling to determine general locations rather than specific locations, which, again, is what privacy by design says they should have been doing from the beginning.
Jeff Jockisch: The important thing here is to understand that this means marketers can't track your location as easily, and they also can't connect you across devices as easily. That's really what they're most concerned about. And location data is probably one of the most sensitive pieces of information about you.
And while an IP address is not directly your location, it is very much indirectly your location. It can be inferred very easily in many, many cases.
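For readers curious what "removing the IP address" can look like in practice, here is a minimal Python sketch of IP coarsening in the style Google has described for its anonymizeIp feature: zero the last octet of an IPv4 address, or the last 80 bits of an IPv6 address. The exact mechanics inside GA4 are not public, so treat this purely as an illustration of the technique.

```python
import ipaddress

def anonymize_ip(ip: str) -> str:
    """Coarsen an IP address before logging, so it maps to a
    neighborhood-sized network rather than a single machine."""
    addr = ipaddress.ip_address(ip)
    if addr.version == 4:
        # Keep the /24 network, drop the host octet.
        network = ipaddress.ip_network(f"{ip}/24", strict=False)
    else:
        # Keep the /48 prefix, drop the last 80 bits.
        network = ipaddress.ip_network(f"{ip}/48", strict=False)
    return str(network.network_address)

print(anonymize_ip("203.0.113.42"))                 # -> 203.0.113.0
print(anonymize_ip("2001:db8:85a3::8a2e:370:7334")) # -> 2001:db8:85a3::
```

The coarsened address still supports the rough, city-level attribution Cat mentions, while no longer pointing at one household.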
Chris McLellan: Yeah. I read the summary of the court's decision, which is that it's just too easy to re-identify someone with that information plus all the other data in Google Analytics.
Therefore, when that data reappears in the United States, if you're a US-based website publisher, then, because Google falls under a certain act, that data could potentially be subpoenaed or requested. And that's the big fear there. I was asked to provide some commentary on what US organizations, or US-based website publishers, ought to do about this.
I broke it into two halves. One was: look into how you're managing your data pipelines today; look into privacy-enhancing technologies that might help you minimize and anonymize, and just generally be a better curator of data. But being from the Data Collaboration Alliance, I of course also had to say that, thinking long term, this is just the thin edge of a wedge, and you need to start really thinking about how you're building and buying new technology, because silos and copies are the enemies of control.
This problem only gets worse and worse, at an exponential pace, over time. And I think you're going to see these countervailing forces of increasingly strict data protection regulations and ever-growing data proliferation come up on this panel and elsewhere over time.
Heidi Saas: Google was at the IAPP conference. They were there, and they built a cage. Like an actual cage. I posted some pictures of myself and some friends trying to get out of the Google cage, with the big signs everywhere that said Google wants to make you safer. So they're definitely trying to do the Apple thing, where privacy becomes cover for anti-competitive behavior.
Yeah, Apple was there too; Tim Cook was pretty angry as well. It shows that they're listening, but they're not quite listening to the whole argument. I listened to Max Schrems recently, and he said even Google Fonts has identifiers in it. So even the fonts served by Google act as IP trackers.
That's great, but what about Google Fonts? You keep nipping away pieces of Google, and Google has got to make some changes to stay in these markets, because their practices keep being found unlawful. So what's the next thing they're going to do? This looks like the introduction of some of the changes they're going to make.
Are they for the better? I don't know. I think we still have the issue with the FISA court and redressability, so I think that's going to be the bigger issue, rather than them not tracking the IP address. So, the next topic is Jeff's: scraping data from LinkedIn profiles is legal, the appellate court ruled this week. Jeff, what do you think?
Jeff Jockisch: Yeah, actually this case went to the Supreme Court and was pushed back down to the Ninth Circuit after the Supreme Court had made a ruling in Van Buren, which was a case about the Computer Fraud and Abuse Act.
This is really a complicated issue. The Computer Fraud and Abuse Act is essentially a law that was passed quite a while ago to stop people from hacking into computers. Initially it was a criminal law, and it was later amended to become something you could also sue under for civil damages.
It essentially says that whoever intentionally accesses a computer without authorization, or exceeds authorized access, and obtains information from a protected computer, can be punished if the conduct involves interstate or foreign communication. And so obviously there's a lot of legal nuance: is it intentional access?
Is it authorized or unauthorized? Is it a protected computer? All of those things make it very complicated as to what's actually going on. So now, instead of just covering whether somebody is breaking into a computer, the Computer Fraud and Abuse Act is being used to try to stop people from scraping websites, which is really what this new case is about. hiQ Labs scraped a bunch of information from LinkedIn and wanted to reuse it themselves. But the problem in this particular case was that they scraped a bunch of personal data, your data and my data, about us and our jobs and our job history and things like that.
So the real question is: what does that mean? In this particular case, the court essentially said, yeah, it's okay for them to scrape all that data, because Chris and Heidi and Cat and I made all that data public, and LinkedIn didn't really protect it other than telling people, no, no, no, you shouldn't access it.
So everything's good, hiQ, you can do this. That's the ruling. The problem, of course, is that this effectively says it's now okay to violate data privacy and just scrape whatever data you want. And that's going to cause some big issues.
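As background to the hiQ discussion: scraping "public" profile data is technically trivial, which is part of why the ruling matters. Here is a minimal, self-contained sketch using only Python's standard-library HTML parser; the page markup and the "profile-field" class are invented for illustration, not LinkedIn's actual markup.

```python
from html.parser import HTMLParser

class ProfileScraper(HTMLParser):
    """Collect the text of elements tagged with a (hypothetical)
    'profile-field' class, the way a scraper harvests public data."""
    def __init__(self):
        super().__init__()
        self._capture = False
        self.fields = []

    def handle_starttag(self, tag, attrs):
        if ("class", "profile-field") in attrs:
            self._capture = True

    def handle_endtag(self, tag):
        self._capture = False

    def handle_data(self, data):
        if self._capture and data.strip():
            self.fields.append(data.strip())

# Example public page markup (invented for illustration).
page = """
<div class="profile-field">Jane Doe</div>
<div class="profile-field">Data Analyst at Example Corp</div>
<div class="bio">Long free-text bio...</div>
"""

scraper = ProfileScraper()
scraper.feed(page)
print(scraper.fields)  # -> ['Jane Doe', 'Data Analyst at Example Corp']
```

A few dozen lines like these, pointed at millions of public pages, is essentially the activity the court ruled on.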
Heidi Saas: They said that because LinkedIn didn't put the gates up,
everybody must want it to be public. Right. But making it public, does that mean you also want it to be scraped and then reused and resold for other purposes? That's not the same thing. As a LinkedIn user, that's not what I meant when I set my settings. To their credit, LinkedIn has said they're going to do more technologically to try to prevent this kind of thing, but the data's already out there.
So yeah, I was surprised it came down that way as well. What about you guys?
Chris McLellan: I have a personal stake in this because I put up a LinkedIn post about it with a couple of thoughts and opinions, which, I'm happy to say, kicked off a really interesting multithreaded conversation on LinkedIn. It all got very meta really quickly. There were a lot of interesting opinions in there, ranging from, isn't this putting the cat amongst the pigeons, to, why LinkedIn? They have a vested interest in not having data shared, because they want all of it.
And that was an interesting perspective, probably not incorrect. But my perspective was, let's give credit where it's due. Like with Apple, you could always argue that privacy is good for business, and if privacy is good for business, isn't that a good thing? I think so. In the case of LinkedIn, and their parent company Microsoft of course, they're on the right side of history here.
They're trying to do the right thing, whatever the reason, in terms of giving their members or users more control. But the real issue, from a data ownership perspective, which is what I always try to bring to the table, is: shouldn't users have control over access to their data, even within a third-party app like LinkedIn?
If that were the case, the consents and access controls would be managed by the end user. And if that were the case, wouldn't courts find it much more difficult to make a commercial decision about what really is a question of privacy and human rights? So I guess some fine day, which is exactly the mission of the Data Collaboration Alliance, when users have control of their information, I think courts will find it more difficult to make decisions like this on commercial, not personal, grounds.
Cat Coode: I think this is another one of those "just because you can doesn't mean you should" cases, because the data is there and it's available. And I've never seen a terms of service that says, hey, if you're a third party using LinkedIn, you can see the data, but you're not allowed to scrape it and reuse it.
We don't have that as LinkedIn members. But to me, this is the same as being on a beach in a bathing suit when someone takes a picture. It's a public beach, but does that give them the right to take that picture and then resell it or redistribute it? Again, it's about the context we shared it in.
But I don't know how you fix it. I can't see a technological way to block people from taking data that's there for sharing.
Heidi Saas: All right, Chris, you're going to talk about a major study that finds consumers becoming data capitalists, willing to trade personal info. Who did that study?
Chris McLellan: The Global Data and Marketing Alliance.
I'm not really sure who they are, and I brought this story to the table because I thought we could have some fun both with what it represents and with who's behind the study. I haven't taken a close look into the Global Data and Marketing Alliance, but I think Jeff has, and he can bring that up in a second. The thrust is that this recently published study
says that four years of government regulation and media coverage about data self-sovereignty have had a material effect on the way people view their data, i.e., they're much more likely to contribute it to third parties and causes. Part of the context here was the pandemic, revealing your vaccine status, your vaccine passport, and all that sort of thing.
It's an interesting study to put out. For example, it says that in 2018 only 26% of respondents characterized themselves as "data unconcerned," and that has now risen to 31% in 2022, the thesis being that, thanks to the pandemic and other things in the news, we're all a bit more willing to share our data.
But I think Jeff brought something up in the green room to this meeting. I think I'd hand over to him to to bring up sure. Chris well I think that the headline for me is.
Jeff Jockisch: It's probably that, you know, that data broker says that people more willing to share, share data. It's not necessarily that, that, that this market research is wrong.
I think probably within the context of what they're, they're exploring. They're probably. That people may be more willing to share, but the context that I would use is this the, the data pure, sorry, the data pragmatists. And I forget that the. The framework that they're using there, but essentially the same framework that Alan Weston used I guess like 30 or 40 years ago when he originally sort of explored this idea and Alan Weston was a data privacy researcher really I guess the first one.
He devised the notice-and-consent framework that we still use today, and he was a big influencer in the world of privacy. But there were some things wrong with the original research that he did, things that came to light later in his life.
I think the behavioral economists have figured out what was wrong with his studies. I don't know whether those same flaws apply to these new studies, but I think it probably needs to be looked at, and I just have some concerns with how we handle that and how we represent those three groups.
Heidi Saas: Awesome. Cat, did you have anything to add? Are you looking forward to this BC commissioner's ruling either way? It says a BC commissioner is mulling over a privacy code for children. Well, it's got to be good if it's for the children, right?
It should be good.
Cat Coode: We don't have anything for the children. The UK just released a privacy code for children, and that was sort of the impetus. The US has COPPA, the Children's Online Privacy Protection Act. Canada doesn't actually have an equivalent; we have a bunch of hazy language in all of our regulations that sort of says, if kids are under 14, you should be getting consent from a parent, but it's not specified.
We don't have specifics around how data can be used and abused. So COPPA, for anyone who's unfamiliar, says that data from children under 13 cannot be taken or used; between 13 and 17 it can be used, but it has to be aggregated and anonymized; and over 18, you can use the data. What's interesting is that BC, the province of British Columbia, has its own privacy protection act.
A lot of the provinces do, and again, it doesn't cover children, so the commissioner is looking to go into that. But one of the things they're looking at as well is nudging. For people unfamiliar with nudging, it is one of the privacy harms, where software is built so that the "yes" is big and easy to see and the "no" is somewhere small in the corner.
It drives you to say yes. Even things like YouTube videos that continue to play automatically, anything that pushes you to keep using something. This privacy commissioner is now saying nudging is a harm to everyone, but a specific harm to children. So they're looking to include nudging in this privacy code for children as a real risk and harm.
And so companies wouldn't be able to deploy it or use it as much as they're using it now.
Heidi Saas: I think it's fantastic. Behavioral economics has been used against us for so long; it's about time we figure out how it works and put it in the regulations. Dark patterns are everywhere.
Jeff Jockisch: I think that's great.
Chris McLellan: Thanks. I was just going to raise the point that I'm glad Cat chose this story, because, I'm not sure how others feel, but I don't feel that my privacy online has been lost to the ages entirely. I've still got life to live and things I'm going to do in the future that I would like to protect and have control over.
But I think it's right to focus a lot of the attention in the privacy sector, in priv-tech and beyond, on children, who arguably have their whole lives ahead of them and all their data to protect. I don't think we can do enough to focus on people who are entering the digital world at
birth, possibly before birth, if you think about the data collected prenatally. So I'm always pleased that Cat chose this story and highlighted that issue. I don't know how others feel about that; a lot of people say, for me, my privacy's gone,
so let's focus on the kids. I don't believe my privacy is gone, but I do believe equal emphasis needs to be given to children, because they don't always have the same voice that adults or business people do.
Jeff Jockisch: I'm very much in agreement with that, Chris. Plus, I also think it's a good way to introduce privacy legislation.
If you can get it in for kids first, it's a way to introduce that type of legislation to the world, and then hopefully you can expand some of it to the broader population. I don't know how well that's going to work, but I think it's a good start. It's hard to get dark patterns into the broader privacy legislation landscape, so getting it in for kids first,
I think, is an easier way.
Heidi Saas: Oh, yeah. And we're not making it up; there's science behind it. It's behavioral economics, right? So I think this is a pretty good segue into the next story Jeff has, which is a new campaign in the UK to rename cookies as "data collectors" to highlight kids' privacy online.
Jeff Jockisch: Yeah, exactly. This is really the same kind of story. As Cat said before, the UK has been really great with the new children's code; they're probably the strongest right now on protecting kids' data privacy. They've done some excellent work there, and this is another step in that kind of action.
I'm really pretty impressed with it. They're also doing some really good stuff with AI and the children's code, which is pretty amazing. But in terms of this idea of renaming cookies as data collectors, I've been a big proponent of this on a broader scale, so I really love the idea of going after it in the children's venue first, because, like I said, it's a first step.
And it could actually happen here, because the idea of calling these trackers "cookies" is a misnomer, and I think it's why they're so widely accepted. Will you take these cookies? All of this cookie consent stuff sounds like a benign practice, but it's not; it's where you allow tracking to happen.
It's just a crazy thing that we're allowing people to do. It's akin to the whole idea of having privacy policies or privacy notices for websites, when those privacy notices aren't really about privacy. They're about telling you all the different things they're going to do to track you.
They rarely ever tell you how much privacy you have; it's the exact opposite. And cookies aren't about giving you anything good; they're about all the things they're going to do to surveil you. So why are we giving them nice names instead of calling them what they are?
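Since the panel keeps stressing that a "cookie" is really a persistent tracking identifier, here is a small Python sketch using the standard library's http.cookies module to show what a third-party tracking cookie looks like under the hood. The cookie name and ad domain are invented for illustration.

```python
from http.cookies import SimpleCookie

# A "cookie" is an identifier the server asks your browser to
# persist and send back on every visit, so you can be recognized.
cookie = SimpleCookie()
cookie["_visitor_id"] = "a1b2c3d4"
cookie["_visitor_id"]["domain"] = ".ads.example.com"   # third-party tracker domain
cookie["_visitor_id"]["max-age"] = 60 * 60 * 24 * 365  # persists for a year
cookie["_visitor_id"]["path"] = "/"

# The Set-Cookie header the server would send:
print(cookie["_visitor_id"].OutputString())
```

Nothing in that header is a treat for the user; it is a durable label attached to the browser, which is the point behind the "data collector" renaming campaign.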
Cat Coode: Yeah, well, I've done a lot of work with kids around this. I used to call it digital safety; now I call it online reputation, because for kids it's not about safety, it's about what they're putting online and what they're sharing. I think there's a broad misunderstanding around kids. People say, well, they have all these public Instagram accounts and public TikTok accounts,
so they don't care about privacy. They do care about privacy. What they have is a lack of understanding of who has access to their information. So, exactly what Jeff said, it's the tracking they don't like. When a young teenage girl sets up an Instagram account with a bunch of bikini shots, it's not because she wants everyone to see the shots; it's because she has about ten people in mind who she's hoping will actually look at her account.
I often tell parents that if their teenage daughters are doing that, have a middle-aged friend comment, hey, saw your photo, great bikini, and that gets the photo taken down really quickly. Kids aren't thinking of the reach; it's not that they want everyone to see their stuff.
So I love this. I love the fact that they're changing the name. Kids don't want to be tracked. They just want to be able to share what they want to share with the group of people they want to share it with.
Heidi Saas: I hate data brokers. I call them data brokers, and I tell my children about data brokers. My kids understand everything about it.
We were playing Clue, and that's how I taught them to play, by explaining relational data to them. And after we had that discussion of relational data, they whipped me pretty good at Clue, so I kind of had that coming. But it's good to talk to your kids about these things and find these kinds of examples.
And I'm not the only one; Professor Solove just published a paper this week that also uses the example of Clue and relational data. So it's important to talk to kids about what's happening online, because you're not dealing with just a machine; there are other people on the other side of the machine, and what are those people doing?
It's all about raising awareness. And yeah, call them what they are: data collectors, data brokers. I've got a couple of names that aren't fit for podcasts. But Chris, you've got something to say about it?
Chris McLellan: Yeah, those are all excellent points, and it's a nice continuation from the previous story about children.
I lived in the UK for a long time, and I'm surprised they weren't called biscuits instead of cookies, since that's their name for cookies. Anyway, it's just another signpost in what I hope is the beginning of the end for the cookie entirely. First of all, as this story suggests, let's call them what they are, which is data collectors and surveillance.
And then, secondly, similar to Google Analytics in our previous story, the cookie has had a pretty quiet existence for a few decades now. But these days not many months go by without a review of these little pieces of software and a fundamental rethink of what they mean to privacy and to wider society, particularly children.
So I think it's a great move. I hope others follow suit, and I'm interested to see the future of the cookie. Data collection, data aggregation, the entire landscape seems to be in upheaval right now, in a good way, and the cookie could be just another casualty, like Google Analytics before it.
Heidi Saas: So we've got one more story to talk about here, because it seems pretty obvious: your personal data is exposed to hackers, and it's an alarming report revealing that your mobile apps are not protecting your info. I don't know who really thought that wasn't happening, but yeah, what do you guys have to say about that?
Chris McLellan: I think the thrust of the story was that there's no malware or anything you need to sideload, download, or engage with for hackers to get at your phone data. They can simply do it through the browser. And I suppose this is another one of those "is nothing sacred?" stories.
Well, apparently not. It just continues to speak to the need for society to move towards an end game where there's less data, it's used with intent, the rightful owners of information have control of it, and there's far less third-party data collection than there is today, which is a hallmark of Web 2.0. Because, as it turns out, you can't even access the web at all without being surveilled or hacked in some way.
The other thing, tying back to our earlier story, is something I read recently about the screen scraping associated with LinkedIn: most screen scraping actually happens against cached copies of webpages, so it goes completely undetected. If I scrape LinkedIn directly, they can detect that and bring it into the realm of a court case. But in fact, most screen scraping happens against Google's page cache, which nobody looks at except people trying to scrape data; it's essentially a cache of webpages.
So I thought I'd just bring that up at this point, to file under "nothing sacred" in the browser. It's interesting, and sad in a way, that it's so easy.
Heidi Saas: I guess you want to talk about sideloading? Anybody want to talk about sideloading? Jeff, you had something to say?
Jeff Jockisch: No, not really on sideloading.
Heidi Saas: I was thinking maybe we should explain it, since we want to talk to people and tell them what sideloading is, for people who don't know.
Chris McLellan: Sure. It's the ability to download an app to your phone outside the walled garden of an app store, like the Apple App Store or Google Play. That's a whole different subject and kettle of fish, as they say.
But I guess the point is, most people would associate that activity with being a security risk to the information on your mobile computing device, versus just simply opening a browser. As it turns out, though, simply opening a browser on your phone is enough to expose a lot of information to a well-motivated hacker, to put it that way.
It's disturbing, and I don't know enough about this specific story to know what the answer is. Browsers are the portal to the web, and I don't see any replacement for the browser imminent. Possibly some of the newer browsers offer more protection; this story is more related to Chrome,
I'm not sure.
Jeff Jockisch: I did want to say one thing. As we move away from third-party data collection, as Apple and Google try to cut it off, we're moving to a first-party data collection model where brands, for instance, collect the data more directly.
There's already a big move for these brands to start doing things like browser fingerprinting and other ways to collect that deep data about you more directly, through exactly what Chris is talking about: grabbing the data through the mobile operating system or your desktop operating system directly.
They're trying to figure out exactly who you are, because Google and Apple won't tell them that information. Those are fairly nefarious ways to collect it, and there's evidence that it's happening on something like 60 to 80% of websites when you visit them.
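Browser fingerprinting, as Jeff describes it, boils down to combining browser attributes into a stable identifier that works without any cookie at all. A simplified Python sketch of the idea; the attribute set here is illustrative, and real fingerprinting scripts collect dozens of signals (canvas rendering, installed fonts, audio stack, and more):

```python
import hashlib

def fingerprint(attrs: dict) -> str:
    """Hash a set of browser attributes into a stable identifier.
    The same attributes always yield the same ID, so a visitor can
    be recognized across visits without storing anything on the device."""
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visitor = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) ...",
    "screen": "1920x1080",
    "timezone": "America/New_York",
    "language": "en-US",
}

print(fingerprint(visitor))  # same attributes -> same ID, no cookie needed
```

Because the identifier is recomputed from the device itself on every visit, clearing cookies does nothing, which is why fingerprinting is considered far more invasive than the trackers it replaces.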
Cat Coode: I'm finding this funny from my perspective, because it's come full circle.
I worked at BlackBerry, which was RIM, leading the architecture team for the handheld, and we literally designed the operating system so that it would not allow downloadable apps to take your data. That was the whole premise of how we did it. Which is also why we lost market share, because the iPhone came out and you could download all this cool stuff and it just worked.
It just connected with your contacts, it just connected with your calendar, and it just worked. This is always my origin story: people ask, how did you get into privacy? And I'm like, because I realized people didn't understand privacy when they jumped into the iPhone. But now we've come back around: now they're locking it down again.
Now they're saying, no, every time you download an app, you have to actually consent to have the data pulled out of the native apps. It's just interesting to me that people didn't realize that making the data available meant having the data taken; the correlation wasn't there.
Heidi Saas: That's crazy. Yeah, people are lazy, right? I just want it to work. So, well, Tim Cook had some interesting things to say about sideloading, but first he decided to tell us all the positive things Apple is doing for privacy, and then he went off on the sideloading rant, and it was heated.
He took us to church for a minute. It was wild, and people in the room loved it for some reason. And the whole time I'm sitting there thinking, does anybody else see that this is just cover for anti-competitive behavior? Because there are technological ways he could still provide the same safety, security, and privacy controls with sideloading. It's not an all-or-nothing kind of thing. How about building tools that do the things we want?
We just want them to work, and we don't want to be tracked and surveilled and harassed and hacked. So yeah, I thought that was an interesting way to go about it. No, I don't trust my phone a bit. It's listening to me right now; it's going to tell all kinds of people what we talked about here today,
whether I want it to or not, I'm pretty sure. So that's just where I am on the level of trust with all of these technologies. I don't trust any of it. I'm zero trust.
Chris McLellan: Well, if I could put my DCA hat back on, I would say that the root cause of all this is a lack of data control for end users and developers alike, due in part to copies and copy-based data integration.
The real solution, to get to the root cause, is that we have to build apps differently. That's a big part of what we're trying to do at the Data Collaboration Alliance: make people more aware that there are new ways, and it's not only decentralized apps and blockchain. There are other technologies making it possible
to rewire how we build apps and to decouple data from code. That's a really fundamental principle of the framework we're advancing in Canada, soon to become a national standard, called Zero-Copy Integration. Just a little shout-out there for one of our causes. But we're going to keep having this conversation
20 years from now, call it Web3, call it web-whatever, or the future of apps, unless we address the real issue underlying all of this: the root cause of how we've combined apps and data. Every app creates a data silo, those silos need to be connected, and they're connected by copies, and copies equal chaos,
when all we want is control. I always equate privacy with control, and I use the terms fairly interchangeably. What I want is control over who, what, when, where, and why sees my data. I want that as an organization, and I want it as an individual. Today's app architecture will never fully support that.
So, something to think about.
Heidi Saas: That's right. Do we have any last points to make before we wrap it up? Awesome. Well, this has been great. It's been lovely to see everybody, and thanks for joining the panel.
And that's a wrap, everyone. Thanks again to our guests: Cat Coode of Binary Tattoo, Jeff Jockisch of PrivacyPlan, and Chris McLellan of the Data Collaboration Alliance. I'd also like to invite listeners to check out the free community at the Data Collaboration Alliance. We're a vibrant group of data-savvy professionals who collaborate on open datasets and build free tools for important causes. It's all done within a zero-copy environment that offers our members unprecedented control of their information.
Our new community experience is launching soon, so sign up and become a founding member today. Visit datacollaboration.org.