Naïm Alexandre Antaki
Partner
Co-head, National Artificial Intelligence Group | Head of the Corporate and Commercial Law Group - Montréal
Naïm: Thank you everyone for joining our podcast today. My name is Naïm Antaki. I'm a partner at Gowling WLG in Montreal, in business and tech law, and privacy is something that I always have to deal with, with my colleagues, as part of our global tech team. Today we are so blessed and lucky to have two wonderful people to speak with us about this important topic, Luigi Bruno and Sherry Truong. Luigi, would you like to introduce yourself, just to give the audience a little bit of an idea of the very interesting path that you've had in privacy and tech generally?
Luigi: Yes, absolutely. So first of all, thanks for the invite, Naïm. It's a pleasure to be here with you today. As Naïm said, my name is Luigi Bruno. I'm originally Italian but a little bit of everywhere around the world. I'm a Senior Engineer and privacy technology leader in the Group Privacy Operations team at IKEA in Sweden, and I'm also doing a doctorate in AI and law at McGill University in Montreal. I have a bit of an unusual background because I studied both law and computer science. When I'm not too busy doing other things I also try to publish research papers. I'm currently publishing a research paper on post-quantum encryption, how it affects privacy law and how regulators should try to mitigate its effects. In my previous life, before joining IKEA and before doing a PhD, I advised several large multinational organizations on data protection, privacy and information security, as a freelancer and as a consultant while working for Deloitte Switzerland in its cyber risk team.
Naïm: Thank you so much, Luigi. Sherry, it would be wonderful to hear also your very interesting path so far.
Sherry: Hi. Thank you for having me. I'm very excited to be part of this amazing podcast with everyone here. So a little bit about me. I graduated from UC Hastings in California and I went in-house doing a lot of privacy work. I started off at a digital media conglomerate doing their pre-GDPR work and then moved on to an open source platform that was acquired by Microsoft. So I was handling the M&A privacy integration for them, and then, previous to my current job, I joined Twilio, which is a telecommunications platform, doing a lot of their global risk and remediation and privacy work. I just recently joined Asana, which is a project management platform, and I'm looking forward to seeing how this company will continue to grow. Privacy is a very interesting industry to be in nowadays with all the changes. So I find that at any place that I go the challenges are always new, and we'll see what the future holds.
Naïm: Wonderful, and Sherry, even though this is a technology podcast it would be great to hear where you are speaking to us from today?
Sherry: I'm in San Francisco, California. We just returned to office which is very exciting. First week in and I love being on this podcast with everybody. It's such a great international community and privacy is very, very much rooted in international law so it's always great to get interesting perspectives from everyone.
Naïm: Thank you so much, Sherry, and Luigi, where are you based today?
Luigi: At my home base at the moment. It's a bit less famous than the Bay Area. I'm in Malmö, Sweden. It's exactly on the other side of a tiny strait called the Öresund Strait from Copenhagen. So that's where we are. It's a very international place. I think in Malmö over 200 different languages are spoken and it's home to several companies. Obviously IKEA, but nearby you have Volvo, you have several large and very innovative companies. So privacy is certainly a crucial aspect of the business life around here.
Naïm: Wonderful. I'm actually in Montreal, which also is a wonderful platform for different languages and cultures. So I think I will start, if it's okay with you, because this is such an international panel, before delving into maybe some legal aspects of privacy. Let's start with cultural approaches to privacy based on your path, and I'll start with Sherry if it's okay with you. How do you feel people look at privacy, generally, and do you feel like this is something that has changed since you started in the field a number of years ago?
Sherry: I think that's a great question. I think privacy is becoming more of an issue for more companies, especially in the US, as more State laws start being passed. There's talk of a US Federal law. We don't know when/if that's going to happen but it is at the forefront now, at least for a lot of people, with certain regulations coming through. CCPA being one of them, the California laws that are coming into play, CPRA coming out as well. So I think companies, especially those based in Silicon Valley, are very, very concerned about privacy and how it's going to affect them. I think we're starting to see a shift in the industry in hiring. There's a lot of privacy professionals out there waiting to be scooped up and I think that we're starting to really see companies take privacy seriously. Especially as fines are coming down in Europe I think it's really starting to, not put people in a panic, but they're taking it a lot more seriously than they did before. I think that for smaller companies and startups it's a bit more difficult because they lack a lot of the resources that big tech obviously has, but I think it's very important to see that being privacy forward doesn't necessarily mean that you have to have a team of 15 privacy lawyers behind you to implement some formal privacy program. A lot of it is thinking about how you build privacy into your product by privacy by design. How you want to implement privacy values or ethos into your company culture. Thinking forward as to how technology is going to change and how your product is going to be able to keep up with those changes. I think all of that is leading to a lot of discussion about privacy being woven into the culture of every single company, at least every one that I've talked to, and part of that is going to be an interesting exercise in seeing whether that will translate into resources, into more robust privacy teams. You'll start to see privacy being marketed as part of the company brand. I've seen large tech companies with huge billboards about privacy and being privacy forward and gaining customer trust. So I think that that's probably going to be the lay of the land as more regulations are being passed and people are starting to see customers ask, "What are you doing with my data? Where is this going?" and that's really going to be a forcing function, I think, for privacy moving forward.
Naïm: Thank you so much, Sherry. Luigi, since you've been in Montreal, you're from Italy, and you're in Sweden right now, do you see some maybe differences in approaches? Again, just how people approach privacy depending on where they are, from a cross-cultural standpoint?
Luigi: Yes, so I think there is a bit of an asymmetry among European countries in the first place, and also between Europe and North America. When it comes to the European landscape, certainly I think European countries, and European companies, are a bit more mature when it comes to operationalizing privacy because Europe was obviously the first major jurisdiction to push forward with the GDPR. It was a big game changer when it comes to the ... landscape, in terms of privacy. When I worked in Switzerland I noticed that operationalizing privacy for Swiss companies was a very natural thing to do because Switzerland is based on trust and privacy for your customers. We can go back 1,000 years with the first banks being set up there and that's the culture. You don't disclose what you do. You keep whatever your customers are doing very secret and very private. So I think that the transition towards privacy has been very natural for Swiss companies, and I'm noticing the same here in Scandinavia, where I think the quid pluris, to use the Latin expression, is that Scandinavian companies are based on values and these values are actually the pillars of whatever these companies do. If I think about the case of IKEA, for instance, it's pretty public knowledge that IKEA is based on values that were set several years ago, when living conditions in Sweden were different, and now that the company is so big it can actually have a big impact on the world. So whatever they're doing is based on these values, and I think having these values, which tend to protect not only the workers but also customers and the environment and society, has been a very good basis for privacy. Not only to be upheld from a compliance perspective, the so-called paper compliance, but also to be internalized, operationalized and then seamlessly embedded within the organization. Then, moving across the pond, I think in North America, because the US has been so dominant in the North American business landscape for so long and there are so many different jurisdictions within the US, I have to build on what Sherry was saying. It's only now, with so much State privacy regulation, that we start to see a bit of a change in the scenario, and I think Canada is following, right? Besides the larger, internationally oriented companies in Canada, that obviously have to deal with California, have to deal with the major States like New York or with Europe and Asia, and that inevitably need to be compliant, right? Because if you want to cater to your customers in Europe you need to be compliant with the GDPR. I think we're going in a direction where the landscape is becoming more mature, also in Canada, and this is also being reflected in the new regulation that's coming along. If we look at Bill 64 in Quebec we see that there is clearly a phenomenon of legal transplantation from the GDPR.
Naïm: Thank you so much, Luigi, and if you don't mind I'd like to pick up on the very interesting, I guess, balance that you talked about, which is the importance of values, and I think, Sherry, you were talking about this also, and the distinction between values versus strict compliance. In my experience I truly feel like compliance is only the very first step and is typically not enough for a company. Right? Yes, definitely you want to be compliant, but at the same time there's a real emphasis being put on trust with your employees, trust with your suppliers, trust with your customers. This leads, I think, to trying to do more, and as you said Sherry early on, being privacy forward in a world where laws have to be drafted in a general way, if I can put it this way, being technology neutral. Turning those principles into action can sometimes be difficult. I think this is where I'd like to turn to, if it's okay, because I think that it's one thing to talk about the laws and some type of harmonization. Luigi, you were talking about Bill 64 in Quebec, which is still in draft stage, and obviously there is also the Federal privacy legislation, which is also in draft stage right now in terms of being overhauled, and there are some, what I will call, adaptations. Because if you're a company, my understanding is that at the end of the day you need to be able to operationalize all of this, and you can't do it differently State by State, Province by Province, country by country. At the end of the day you need to have some type of common denominator. I would be grateful to hear your experiences in maybe how to turn this variety of privacy legislation in different jurisdictions into how do we launch this project, or how do we grow this project, from a very pragmatic standpoint. What's the best way to go about it, thinking about, forgive me the jargon, but thinking about your stakeholders. Your board of directors. Your tech team. The other legal teams, your insurance and your sales team. Maybe what have been some of the things that you have learned, or how have you learned to navigate with these different stakeholders? Sherry, I'll start with you.
Sherry: I think, depending on the size of the company, one of the first things that I like to do is have conversations with internal stakeholders, particularly for smaller companies, about where do you want to land on the spectrum of privacy, and I say that because personally I think it's important for a company to have some sort of privacy mission statement in mind. It doesn't have to necessarily be written down. It's great if it's formalized, but very rarely will you see a smaller company say, "This is what we're doing. We're sticking to it." because it doesn't really enable business. Like you said, operationalizing privacy, and when privacy laws are constantly changing, what you operationalize one day might not be compliant the next day, which is the great thing about privacy. You always have to constantly be iterating, making things better, and so baking privacy into the culture, into the product, into how you do business, is really important because that's going to inform how you face these challenges as these changes continue to grow. It's only going to become more complicated as technology becomes more complicated. You're moving into, as Luigi said, AI/ML, and the questions are just going to become murkier. Your solutions are going to become more ad hoc and how do you prepare for that? I think one of the things that I find in companies that struggle is that they don't really take a stance on where they fall on that privacy spectrum. You have to be willing to take a stance, wherever it is that you fall, right? I'm not here to make a judgment on how closely you align with "I don't really care about privacy" versus "I care so much about privacy that it's the only thing that matters to me." Those are very extreme positions to take, but I think that there is something to be said about being able to pinpoint where you are because that is what helps drive your decision making. We get presented with really difficult decisions that balance being maybe a little more conservative on privacy and business needs. Sometimes, in order to service our customers, that's a very difficult balance to keep. Those are the considerations that we always take in-house, and the values play into that, because if you don't know where you stand you're sitting there going, "I don't know if I should go and make a choice that's better for the business or better for privacy." You're making inconsistent choices and that usually equates to inconsistent policies, inconsistent internal frameworks, and it confuses internal stakeholders because why are you giving on this and not that? Why are we not taking this stance? Then there are 5 or 6 exceptions to those. Taking a stance provides a sense of clarity for everyone moving forward in the right direction because, if you say we're going to be a leader in privacy in our industry, that means you're going to have to potentially be a little bit more conservative on certain things that fall on the privacy side, which are going to affect your bottom line. If you're willing to do that and you're willing to say this is the plan that we're going with, we're staking our reputation on it, this is our customer trust, we're making privacy part of our brand, X, Y and Z, and it's consistent, then not only are internal stakeholders clear but you're probably more likely to get resources to support that if it's a company choice that you've socialized out. Then to your customers as well.
They see a consistent stance on how you're treating their data, what your policies, your external facing policies, are like. Regulators get to see that they're really trying to be more privacy forward, privacy focused, gaining customer trust, they're understanding the balance between themselves and the customer and that might buy you some goodwill. So I think that values really play into it because it helps your decision making and it really just gives people a sense of wow, this company really fits in this section of the privacy spectrum. If I say, not to throw certain names out there, if I say Facebook, Google, Apple, people sort of have ideas of where they fall in the spectrum. It's like that for a reason. They've made consistent choices when it comes to privacy, when it comes to security, when it comes to data usage and retention and sharing. So I think that's a very real thing for a company to think about and I hope that people will take the time to really build that out, and encourage their stakeholders to really take a moment and pause and decide kind of where they want to be, because that can really, really affect what the company looks like 10 years down the road.
Naïm: Thank you, Sherry. There's lots to think about there in terms of not just thinking about privacy as a value but how it interacts with and relates to the other values of the company. Luigi, your thoughts, and anything else you'd like to add on this topic, which is the operationalization of privacy, making it real, and navigating through the different stakeholders. I'm sure you'll have a lot to say on that.
Luigi: Yes, I think Sherry touched on two points that are really key to all of this. The first one is that operationalizing privacy is not a one-time effort. It's actually an ongoing, dynamic effort. I like to think of it in terms of having processes that allow a company to say, "Okay, now we're building this product. We think we have privacy by design, but if tomorrow the regulation changes we're going to be able to quickly, promptly, without harming our business, without harming our customers, go into our privacy by design and change whatever needs to be changed to keep our privacy compliance up to the new regulatory standard." I think, especially for large companies, building these processes for existing products must be hard. I can only imagine how difficult it must be, for example, for companies that build very complex software like operating systems. It must be really complex, but for startups I find that this journey must start very early on. Sometimes I have chats with startups and they tell me, "Yeah, we're testing this AI algorithm but we don't know if we can market it." Then why are you even testing it if you don't think you can market it, right? If you think it's not even going to pass the very simple test, that you don't even have a legal basis to process the data, why are you even doing that? Because, knowing that, first it's going to cost you money, because you're never going to be compliant, so you're probably going to be bleeding in terms of compliance costs, and even potentially fines, and second, I think it's wasting your time. I think innovation also needs to be very privacy oriented. If you're thinking of developing your product you need to do so knowing that your product needs to be built with privacy in mind and with privacy in its bones, and then at some point down the line, if you're a startup, you probably need to be able to very quickly and in a very agile way change your product so that it can stay very privacy oriented. The larger the company, I think, the more the issue at stake becomes having the right resources that can combine privacy knowledge with technology knowledge. Also, I think, the right processes that can efficiently get into the nitty gritty of the products. It can be a technology product. It can be any sort of product which can potentially have a privacy impact, and then swiftly say, "Okay. Now we need to change this. How do we do it? Do we have a process?" ... this person does this, we do that, and the company keeps being compliant and it doesn't hurt the bottom line.
Naïm: Thank you, Luigi, and you're right. There are some choices that you have to make, whether you are a startup or you are a large company, from a privacy standpoint. If I can put it maybe in a different light, when people try to disrupt an industry they need to have insights. Insights means information. Here, obviously in privacy, what the law and I guess compliance efforts tend to focus on is personal information. So information about individuals who need the protection of the law. But when you're thinking about these insights and disruption and all of this, sometimes there's also another type of information which is as important, or even more important sometimes, to your own company, which is business information. In that frame of mind, when you think about compliance it really only deals with one item, which is personal information. Definitely important. Laws are there in order to protect people because they would not be able to protect themselves otherwise. But, again, compliance: if you are in compliance with privacy laws it doesn't mean that you've done everything that you needed to do, internally as a company, to protect yourself and to protect what may be the most important to you. I guess, just to share a little bit from my own experience working with startups and working with large technology companies and dealing sometimes with public procurement and others, I really feel that people are asking these questions much, much earlier. I've had the chance to teach as a guest lecturer on artificial intelligence and the law at McGill for the past few years, and the business people in the room and the tech people in the room, all of their questions relate to privacy and they go straight to what you were talking about, Luigi, which is can I use this data set or not? They're very much, I guess, in tune with the fact that they have to deal with this. I would say, and I don't know, Sherry, maybe how you've seen this, but what I've seen also is that previously people tended to rely on very general statements, perhaps in contractual terms, to say, "I'm not doing this. You're the service provider, you're the supplier, so take appropriate means in order to protect this." and that was enough. But, both from a public sector standpoint and a private sector standpoint, I see much more due diligence on the privacy side to make sure that the whole supply chain is solid from that standpoint. I'm not sure how you've seen it on your end, Sherry.
Sherry: I think a big chunk of privacy risk for any company, especially for most SaaS companies, is that not everything is built in-house. It's impossible. You're using vendors, and building privacy and risk assessments into your procurement life cycle is very, very important because you have data centers housing your data, you're using sub-processors, you're using other processors and now, especially with the new SCCs coming out, and even before then, a lot of the obligations that you have, in terms of doing due diligence and maintaining your compliance, flow down to the processors that you use because they touch the customer's data. I think that you're starting to see things being a lot more formalized. There's a lot more transfer impact assessments being done. A lot more privacy assessments being done because I think in a post-Schrems world you kind of have to be very, very aware of your total exposure in terms of enterprise liability. That includes vendors. So you're going to see longer questionnaires. You're going to see requests for supplementary measures. You're going to see, I think, a lot more scrutiny of security exhibits and security policies, and I think it's a way for everyone to understand that you can't just push liability off. People are starting to be more particular about the vendors that they take on. I will tell you right now, as part of the procurement life cycle, if a sales person or an internal stakeholder says, "I want to use this vendor." and the first thing I do is I go and I click on the privacy policy and it hasn't been updated in 3 years, I immediately am suspicious. I'm like, I don't know if we should be using this vendor, because if you're telling me that in the past 3 years, in the crazy world of privacy, nothing in your business has changed enough for you to have to update your privacy policy, I'm not going to believe you, first of all, and it's really going to raise a lot of questions on my end about whether or not this is an appropriate vendor. Even if they have all the right contractual language, because for me it's not just about you signed it so now I can just ignore it. If the reality is that you very obviously haven't operationalized a lot of the privacy compliance, and it's so apparent that it shows on your website within a 30 second glance, that poses a really big risk to us, because we're entrusting our customers' data to this vendor, and what if something happens to them? So I think part of that is really being careful about doing that due diligence. It's very important. You can't just rely on contracts anymore. Luckily I find that a lot of security teams and a lot of privacy teams are working cross-functionally to really tighten up those gaps in vendor assessments because they're realizing that it's becoming a big issue. What you're also starting to see is a lot of companies going for more certifications, because I think it just automatically provides a level of security, reputational security and compliance security, to the vendor's customers: "Look, we take this seriously. We understand that this is only going to become more stringent." Very rarely are security regulations and privacy regulations going to become more lax. Right? It's just going to get tougher and tougher, and so, much like companies that take a privacy forward approach with their customers, vendors that take a ... more privacy forward and security forward approach tend to be the ones that get that goodwill from their customers.
I'm telling you, come renewal time, a lot of times I'm like is it really worth it to keep this vendor on and we have to really have that discussion with our internal business teams. We've had to drop some vendors because they either just weren't being compliant, or they weren't being forthcoming about their practices, where it was very clear that a lot of things were lacking and we couldn't risk that for ourselves.
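To make the kind of pre-screening Sherry describes more concrete, here is a minimal, hypothetical sketch of how a procurement team might automate a first pass over vendor records. The field names, thresholds and the VendorRecord structure are assumptions for illustration only, not any company's actual due-diligence process.

```python
# Illustrative sketch only: a hypothetical first-pass vendor screening step that
# flags stale privacy policies, missing DPAs and missing certifications for a
# human reviewer. All field names and thresholds are assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class VendorRecord:
    name: str
    policy_last_updated: date   # e.g. taken from a questionnaire answer
    has_dpa: bool               # signed data processing agreement on file
    certifications: list[str]   # e.g. ["ISO 27001", "SOC 2"]

def screen_vendor(v: VendorRecord, today: date, max_age_years: int = 3) -> list[str]:
    """Return human-readable flags for a privacy/security reviewer to follow up on."""
    flags = []
    age_days = (today - v.policy_last_updated).days
    if age_days > max_age_years * 365:
        flags.append(f"Privacy policy last updated about {age_days // 365} years ago - ask why.")
    if not v.has_dpa:
        flags.append("No DPA on file - required before any personal data is shared.")
    if not v.certifications:
        flags.append("No security certifications listed - request supplementary measures.")
    return flags

# Example usage with invented data
vendor = VendorRecord("ExampleCRM", date(2019, 5, 1), has_dpa=True, certifications=[])
for flag in screen_vendor(vendor, today=date(2022, 6, 1)):
    print(flag)
```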
Naïm: I hear you on that, because I think indeed people have moved away from the one paragraph that was supposed to be enough, but they use contracts in a very different way now, with different representations and warranties or different obligations, in order to make it very clear who is responsible for what and what level of privacy, or sometimes security, needs to be in place. Oftentimes contracts are there also to help take care of some, and I want to be very careful in how I phrase this, I don't want to say holes in the law, but maybe some choices that have been made by certain jurisdictions not to go as far in terms of regulatory approach. Take something as simple as data breaches, for example. Luigi and Sherry, you'll both remember a few years ago people would think about the US and say, "Oh my God. There's 50 different ways of dealing with this." and in Canada it was, "Well, actually the law is about to change but hasn't changed yet." Thank God it has changed now, but it's things like this where I think that operationally, in your contracts, you need to make sure that if there is a data breach, or some other security or privacy aspect that you need to think about, then whether your vendor is based in the US, in Canada, in Europe or in Asia, people know how to deal with it. There's a harmonization from a contractual standpoint on this.
Sherry: Yeah, and just to jump in here on the contract standpoint, another point that I think is really important is how fast and how much more complicated privacy has become from a contractual standpoint, from pre-GDPR to now. I think everyone was scrambling to even figure out what a DPA looked like around GDPR time. When do we sign it? What does it look like? What are these model clauses? What are they going to look like? How do they apply? I don't know. Every company had to try to figure out how they wanted to do it for themselves. Are we a controller? Are we a processor? Are we a controller for certain data and a processor for only this type of data? Those types of questions were really difficult to figure out 5 years ago and not a lot of people wanted to sign DPAs, and if you look back at what those DPAs looked like versus what they look like now, they can look wildly different. At one point companies were saying, "I'm not going to sign anything with you unless you give me uncapped liability for data breaches." Now you're seeing a more nuanced conversation about those types of clauses, about limitations of liability and responsibility and reps and all that stuff, because people are starting to understand more what privacy liability looks like, what privacy risks look like and what you as a company can actually agree to. I think that that's a very important note to make because it's only going to become, like I said, more complicated. Now the DPAs are more specific, with more exhibits. The new standard contractual clauses that were adopted by the European Commission are coming out and now people have to figure out what modules they want to use, how it's going to apply to the company, what you can say yes to and what you can't, and I think that is going to be a driving force as well, because now companies understand, "I'm actually really locked in to these very specific clauses." because there is more clarity about what a company is expected to be able to show, in terms of compliance, and to be able to provide, as a service, that aligns with the DPA that they're signing or any other agreement. It sounds very much like common sense, but 5 years ago it was very, very unclear what that would look like. The more guidance we get, the more specific your contractual clauses are going to get as to what falls under certain obligations that are written into these contracts.
Naïm: Very well put, Sherry. If you don't mind I'm going to challenge you on something that you said a little bit earlier, where you were saying that it would be quite rare for jurisdictions to tone down privacy or security obligations. I think you're absolutely right and there has to be some type of standard. That having been said, what at least I see from my end is a lot of conversation between corporations or businesses, academics and regulators on what should come next and how to adapt some of these, sometimes what may seem to be very cumbersome, obligations on companies, so that it works. So if you think about Canada, and I'm sure everyone knows this, while we have the internet and we're very tech minded, some of the smaller businesses, and no one needed the pandemic, don't get me wrong, but with the pandemic they were really thrust into a whole new world of I have to go on that digitalization path right now. I have to right away think about all of these issues that I've never thought about. I think a bit like the approach that was taken on data breaches, Federally, where the idea was to take some very good and important principles from Europe, and obviously some from the US and some from Canada, and blend them in the Canadian fashion. I think that with privacy, what I'm currently seeing both at the Quebec level and at the Federal level is a little bit of the same approach, meaning we do want to have a level or standard that is closer to Europe but maybe not all the way there, because I think that goes back to culture. Luigi, I think you mentioned that in Europe people have had to deal with privacy for many years. So they're used to all of this. Now if you thrust all of this upon Canadian companies it may be too much for them and they may just throw their hands up and say, "I give up. This is just too much for me to deal with so I'm just not going to deal with it at all." I think that sometimes it's important to bring companies along in stages, focusing maybe on what is most important or most crucial, and not necessarily worrying about all the things that the GDPR deals with, because you may not be in Europe, but focusing, for your European operations or to the extent that the GDPR actually applies to you, on knowing that you need to take one step up. And, understanding that even if this may be cumbersome, once you've gone through it you may be closer to compliance in other jurisdictions, eventually. Luigi, we were talking about innovation and privacy, and you were talking about this very interesting paper that you're working on right now, or maybe that you have worked on and that's going to be published. I'm going to take this in a broader way. At which point do you feel that technology enhances privacy and at which point do you feel that it actually hurts it? Based on some of the edge technology or, I would say, latest developments in technology. We're talking about post-quantum, but there's AI, there's a whole lot of different topics there. Just to get your general thoughts on this.
Luigi: I think it's a very good question. We could have a totally separate podcast just to talk about this because it's really broad and requires a lot of ideas to be thrown around. I think, certainly, technology can enhance privacy. I think about, for instance, how it can automate processes. For instance, let's say you have a repository of how you do privacy by design. If you've done assessments for privacy by design, then you can have an automated tool, for instance, that has intelligence on regulatory developments and automatically flags which product you developed needs to be reviewed in light of a development. Otherwise, if you're a global company, you would probably have to have a dedicated person just to be on top of things. This obviously is efficient economically but it's also efficient when it comes to managing privacy and operationalizing privacy. On the other hand, when does technology harm privacy? I think technology can always harm privacy. It really depends on what we do with technology. As I mentioned in the paper, we can have quantum computing, which is fantastic if we actually get to the point where it becomes mainstream, and widely adopted, and cost effective, and doesn't require a team of scientists just to turn on. We can get to the point where we foster spaces for exploration. We would probably have much better drug development, more efficient, cheaper, etcetera, but at the same time you can also have malicious actors that decide to use the technology with bad intent. Inevitably that leads to privacy harms, and this I find to be a bit controversial in the legal world. There are still people that think that privacy and security are two different things. They are, but inevitably they are linked, and I think that to have very good privacy one needs to keep security in mind, and at the same time, right now, one cannot do security without doing privacy. It's like wanting to set up a security operations center without taking into consideration what kind of data can be logged and what kind of data cannot be logged. Inevitably you need to think about it. I'm very positive as to what changes and what enhancements technology will bring to privacy, but I also think that this will inevitably impact the way professionals approach privacy, because there will be a need to develop further skills and there will be a need for additional professional profiles, let's call them, that are able to be privacy technologists.
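As one way to picture the kind of automation Luigi describes, here is a minimal sketch of a registry of privacy-by-design assessments that gets matched against an incoming regulatory change. The data structures, tags and function names are illustrative assumptions, not a description of any real tool or of IKEA's processes.

```python
# Minimal sketch: tag each privacy-by-design assessment with jurisdictions and
# data categories, then flag the ones touched by a regulatory change. All names
# and example records are invented for illustration.
from dataclasses import dataclass

@dataclass
class Assessment:
    product: str
    jurisdictions: set[str]      # where the product is offered, e.g. {"EU", "CA-QC"}
    data_categories: set[str]    # e.g. {"contact", "location", "biometric"}

@dataclass
class RegulatoryChange:
    jurisdiction: str
    affected_categories: set[str]
    summary: str

def assessments_to_review(registry: list[Assessment], change: RegulatoryChange) -> list[Assessment]:
    """Flag every assessment that touches the changed jurisdiction and data categories."""
    return [
        a for a in registry
        if change.jurisdiction in a.jurisdictions
        and a.data_categories & change.affected_categories
    ]

registry = [
    Assessment("Store app", {"EU", "CA-QC"}, {"contact", "location"}),
    Assessment("Loyalty program", {"EU"}, {"contact", "purchase history"}),
]
change = RegulatoryChange("CA-QC", {"location"}, "New consent rules for geolocation")
for a in assessments_to_review(registry, change):
    print(f"Review needed: {a.product}")   # -> Review needed: Store app
```

In practice the interesting part is the "intelligence on regulatory developments" feeding such a registry; the matching step itself stays this simple.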
Naïm: Thank you so much, Luigi, and you're right. We don't always have that full team that can take a look at this, but those are clearly important issues. I guess, if you don't mind, I'll add a little bit to this. If you think about Blockchain, for example, some people were talking at some point about zero knowledge types of solutions. If you think about artificial intelligence, do you always need big data? Do you always need all of this information or not, or can you deal with smaller sets? Or can you deal with synthetic data sets? It's quite interesting to see how people continue to approach this. The seemingly simple approach is to say, "Well, make sure you never gather any personal information." But I don't know to what extent that is, you know, it's nice to have but I don't think you can ever really get there. Especially considering that with technology right now, a certain piece of information here, combined with another piece of information that you may have in a completely different part of your company, together may help to identify an individual even if each piece by itself may not. I think it's not just the person who collects the information who needs to think about this; it applies as you're storing it, processing it, all the way to disposal, and thinking about your supply chain. I think this is where it's a bit of a hybrid approach. Maybe you need the information at the beginning, but then how fast and how well can you get rid of the personal information, to a certain extent, while still being able to gather insights? I'm going to throw out another word, just to get your thoughts or reactions on it to the extent that you've seen it in the course of your research or in the course of work, both Sherry and Luigi. What are your thoughts on the concept of Federated Learning? Again, to the extent you've heard about it.
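Before Luigi's answer, a small aside on the re-identification point Naïm just raised: attributes that look harmless on their own (postal code, birth year, job title) can single someone out once combined. The toy k-anonymity check below counts how many records share each combination of quasi-identifiers; groups smaller than k are a re-identification risk. The data and column names are invented purely for illustration.

```python
# Toy k-anonymity check: find quasi-identifier combinations shared by fewer
# than k records, i.e. combinations that could single out an individual.
from collections import Counter

records = [
    {"postal": "H2X", "birth_year": 1985, "role": "engineer"},
    {"postal": "H2X", "birth_year": 1985, "role": "engineer"},
    {"postal": "H2X", "birth_year": 1990, "role": "lawyer"},   # unique combination
]

quasi_identifiers = ("postal", "birth_year", "role")

def risky_groups(rows, keys, k=2):
    """Return quasi-identifier combinations shared by fewer than k records."""
    counts = Counter(tuple(r[key] for key in keys) for r in rows)
    return {combo: n for combo, n in counts.items() if n < k}

print(risky_groups(records, quasi_identifiers))
# {('H2X', 1990, 'lawyer'): 1} -> this combination points to a single person
```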
Luigi: I think Federated Learning, as I was just saying 2 seconds ago, has a lot of pros and cons. I think there are two ways to look at it. One is from a purely technical perspective, which is that clearly distributing the workload across several devices enables you to do it more efficiently, with fewer centralized resources and so forth. On the other hand, at the same time, it allows you to decentralize the processing of data in a way that can somehow further protect the way data is processed and have a ... guarantee for privacy. There are projects out there that seem to be interesting. For example, the Federated Learning of Cohorts by Google, which they say should be replacing ... I had a look into that and, from what I've seen on their deep talk pages, it seems an interesting project. There are also people that are clearly skeptical of this. With every technology there are people that are, by principle, skeptical of everything, and there are also people that probably know much more than I do about Federated Learning and so they have a basis for being skeptical. But without getting into the details of how Federated Learning works and so forth, I think this is just my two cents on it.
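To make the federated learning idea concrete, here is a bare-bones federated averaging (FedAvg) sketch: each client improves a shared model on its own data and only the model parameters travel back to be averaged, never the raw records. This is a toy illustration with a linear model and invented data; it is not how any production system, or Google's FLoC proposal, is actually implemented.

```python
# Toy FedAvg: three clients each hold private data; only model weights are
# exchanged and averaged, the raw records never leave the clients.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: a few gradient steps on its private data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)   # gradient of mean squared error
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):                      # three clients, data stays local
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

global_w = np.zeros(2)
for _ in range(10):                     # each round: local training, then averaging
    updates = [local_update(global_w, X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients])
    global_w = np.average(updates, axis=0, weights=sizes)

print("learned:", global_w.round(2), "true:", true_w)
```

The privacy caveat Luigi hints at is that the exchanged parameters can still leak information about the underlying data, which is why skeptics push for additions such as differential privacy or secure aggregation on top of the basic scheme.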
Naïm: Thank you so much, Luigi. I'm very grateful to both of you for this wonderful conversation. It's so difficult sometimes, you know, we've all been stuck in COVID, but this feels to me as if we were just sitting at a table, notwithstanding the fact that Sherry's in the Bay Area, Luigi's in Europe and I'm in Montreal. It's wonderful to be able to share these thoughts and pick your brains, because it's so rare to have the chance to do that, to pick your brains on some of these important issues. Are there any other thoughts or questions that you think we should discuss today? If not, I think we're going to keep in mind, Luigi, your thought that there may be another podcast we need to think about, on whether technology helps privacy or whether it can sometimes hurt it. There are so many different sub-aspects of privacy that are important. One of the things that I'm grateful for, too, is understanding that both of you, whether it's IKEA or Asana, you're really thinking about privacy in terms of the values that are associated with your company. It is part of the value proposition, and I think it's clear for your two companies, but also for other companies, that privacy is not just something to comply with, something that you have to deal with, but something that can actually help you go above and beyond what's strictly required and bring value. Bring value and bring trust. In a world like ours, which is more and more digital, even though I still want to walk into an IKEA store although I'm very grateful for everything that IKEA does digitally, and Sherry, with Asana, I think it just enhances conversations that you can have, ideally with teams internally, and gives you more of a way to deal with things using so many different tools. So I see it as an adjunct or an enhancement of what you can do in person, in a certain way. I think that in this digital world we need to hopefully continue to help some of the companies that may not have had this experience over the past few years and are still being thrust into a digital world and having to think about privacy. Also for people in the whole supply chain to think: if my customer is really thinking about privacy, I should make sure that it is a key part of my proposal to them. From a sales standpoint, if privacy becomes one of your sales' strong points, then it's great. I'm keeping in mind, Sherry, what you were saying. If you're looking at a vendor, and the vendor hasn't changed their privacy policy on their website in the past 3 years, then there's a question mark there, because the laws have changed, and even if the laws hadn't changed, it would mean their business hasn't changed at all. That would be a little bit surprising. So, again, thank you so much for this very interesting discussion on privacy and I hope that we have the chance to continue our conversation another time.
Sherry: Thanks for having me. I think this was a great conversation. There are only more conversations to be had in the future as privacy laws get more complicated and things change as the wind blows, as we like to say. I really want, and I hope, that companies get excited about the fact that they really do have an opportunity to tap into an area that is really at the forefront of law and technology. Oftentimes the law is so far behind technology. By the time it catches up on one front, the technology has greatly outpaced it, and I think privacy is one of the areas where the law is really changing in step with how technology is changing as well, and how that looks. I really want companies, small, medium, large, to feel empowered that there is no one size fits all solution. I can't take what I did at a startup, or a digital media company, to a late stage startup that's an open source platform that's getting acquired by big tech, to another platform that does something completely different with their data. Every company has a different approach. Like you said with values, every company has different values. Every company has a different risk appetite. So you really get a chance to make something that's very unique and very specific to you. If that doesn't sound exciting I don't know what else to tell you. It's why I'm in privacy. Even within a team, I've been so fortunate to work with great privacy lawyers and other security teams and governance teams and all that, and to really be able to see here's how we all look at things and where we can align and where we don't align and all that. It's always a fun exercise, and to be able to do that, build, re-build, make better, improve and really influence how privacy is being done in tech is something that I find really rewarding and really exciting, and I hope that more companies will invest in that.
Naïm: Thank you so much, Sherry. Luigi?
Luigi: I really want to echo what Sherry said. So thank you so much for having me. It's been a pleasure to reconnect with you, Naïm, and with Sherry. I hope we get the chance to do this again at some point, hopefully in front of a nice warm cup of coffee. I completely stand by what Sherry said, and it's exciting. We're part of a field that's growing and it's meaningful because, at the end of the day, privacy is a human right. So we're part of an effort to actually make sure that this human right is respected and people can keep their data private. I've been so fortunate. Wherever I've worked in the field of privacy, be it at Deloitte or at the companies that I advised when I was freelancing or now at IKEA, I've always been surrounded by incredible people that are truly passionate about what they do and have really, really had a meaningful, positive impact in the field of privacy, be they lawyers or engineers or architects. It's been an incredible journey and ultimately it also led me to meet many people that have been really, really great even in my academic life. Sometimes I just reach out to people, share my papers, and I get great comments, great insights, and then you forge relationships. I'm really keen on further exploring the intricacies of the relationship between technology and privacy and seeing where that leads us in our journey to make the privacy world a better place, but also to make the overall digital world a better place.
Naïm: Thank you so much, Luigi, and Sherry again. To the audience, I hope that you've enjoyed this podcast and that you will continue to follow us for other great podcasts that Gowling has on its website. If you wish, feel free to reach out to us and we'll let you know when the next great conversation on technology and privacy takes place. Hopefully live with you or, if not, in the comfort of your own home. Thank you again.
We are now living in a new world of technology and data, which are essential to both a business' operations and its success. However, as innovations move forward the challenges that these assets present will only increase.
In this podcast, Naïm Alexandre Antaki, Gowling WLG partner, member of the firm's Blockchain & Smart Contracts and FinTech Groups, co-head of the firm's Artificial Intelligence Group and co-head of the Montréal Tech Group; Sherry Truong, Privacy Counsel at Asana; and Luigi Bruno, Senior Engineer (Privacy Technology Leader) in Group Privacy Operations at IKEA and Doctoral Candidate at the Faculty of Law at McGill University, discuss privacy law - specifically cyber security and data protection.