Brent J. Arnold
Partner
On-demand webinar
ROSE DEMELO JOHNSON: Take two. Good morning, everyone. It's my pleasure to welcome everyone joining us here today, both in person and online, to learn more about defending your business against cyber crime and cyber warfare, and more importantly, what to do if you fall victim to such an attack. This is a joint presentation between Gowlings and ScotiaMcLeod.
So my name is Rose DeMelo Johnson, and I'm a partner here at Gowling WLG, and I'm one of the co-moderators of today's discussion. A little bit about me and why I wanted to be here today. I help my clients in developing and implementing estate plans to achieve their personal and business objectives. In doing so, my clients provide me with lots of information about themselves, their family, their business, and it's all very valuable to them.
So I wanted to be able to educate my clients about what to do to prevent an attack of such information. And if they fall victim to such an attack, what to do to minimize the damage to their finances, their reputation, their business reputation, and let's be real, their mental health because disclosure of information can be quite devastating to businesses and individuals. My co-moderator is Jonathan Rigby, and I'll let Jonathan tell you a little bit more about himself and Scotia.
JONATHAN RIGBY: Perfect. Thanks, Rose. Yeah. We're thrilled to be here with you today and especially to be in such a beautiful space. And for those online, you're missing out. It's a gorgeous spot here. As Rose said, I'm Jonathan Rigby. I'm with ScotiaMcLeod, the umbrella brand of Scotia Wealth Management, Waterloo Office.
And I've been doing this for 28, 29 years, and I head up a team of 10. There's a bunch of us here today. And what our team does is anticipate and plan for our clients' life stages. And that means different things to different people, depending on what stage of life they're at. For us, planning is a massive part of what we do. That's effectively the foundation of everything.
And from that, it leads into conversations around estate planning, succession, insurance-related things. Obviously portfolio management is a big, big piece of what it is that we do, but it is all encompassing. And just same thing with Rose, we get to know our clients deeply through real deep conversations with them, understanding them, what's important to them, what's important to their families, and what stressors are.
And these stressors can be all different things at different points in people's lives. And one of those stressors is the topic that we're going to be speaking about today, because whether it's a breach from a business owner's perspective or a personal situation of somebody getting scammed, it's a tough situation to be going through. And that's why we're fortunate to have partners like Gowlings to rely on for their expertise when it comes to these things.
So protecting our clients is an important thing, and providing education to them is something that we do regularly. And so yeah. We're thrilled to be here. Thank you.
ROSE DEMELO JOHNSON: So before we get started, just a few housekeeping items. For those of you here in person, there are washrooms just outside of the board room. Please continue to help yourself to coffee and oatmeal. The session is designed to be more of a discussion format. So please feel free to raise your hand during the presentation or for those of you online to submit a question in the chat function, and we will be monitoring that.
And I think if you want to wait till the end of the presentation to ask a question, again, feel free. There'll be some free time at the end to answer questions. So now on to today's presentation. We are pleased to have with us today Brent Arnold, Jasmine Samra, and Alycia Riley. So I'm going to have the honor of introducing Brent.
So Brent is a partner practicing as a commercial litigator out of our Toronto office. Brent has a specialization in cyber security and leads the firm's commercial litigation technology subgroup. When Brent is not appearing in all levels of court, dealing with contractual shareholder software and e-commerce disputes to name but a few, or when he's not co-authoring texts and publications on cybersecurity and data protection, or when he's not chairing various committees focused on cybersecurity and data privacy, he's probably trying to grab a nap. But he is assisting clients with cyber risk analysis and cyber breach coaching. We are very fortunate to have someone with Brent's expertise within the firm and happy to have him share his knowledge with you today.
Alycia. Alycia is a lawyer practicing in privacy and employment law, specializing in the technology industry. She manages diverse employment matters at all stages of the litigation process and maintains a robust solicitor practice. She's recognized as a Certified Information Privacy Professional by the International Association of Privacy Professionals. Alycia advises clients on a broad range of privacy and cybersecurity issues, including privacy compliance, data protection, and privacy program management.
So Jonathan doesn't feel left out.
JONATHAN RIGBY: No. I'm fortunate I get to introduce Jasmine. Jasmine Samra is a counsel in Gowling's Toronto office. She's a Certified Information Privacy Professional by the International Association of Privacy Professionals. I've never said that before. Jasmine advises companies on privacy compliance and data protection issues and helps organizations develop privacy compliance programs and privacy and social media policies. She has extensive experience with requests under Canada's Access to Information Act and provincial freedom of information legislation, assisting clients in protecting third-party business information under those laws. And she helps clients respond to data breaches and other privacy-related incidents. And I guess my first question to Jasmine is, why do I still keep getting all those duct cleaning calls?
ROSE DEMELO JOHNSON: So over to you guys.
JASMINE SAMRA: Well, thank you for the lovely introduction. We're excited to be here. So we're going to start off, and Brent is going to be talking about the most exciting stuff. We're going to give you the lay of the land on where we are in privacy in Canada right now, just to give you an understanding.
So we're going to first start off with what we have currently, and then we're going to discuss some of the upcoming privacy reform that's occurring in Canada. So Canada is a federal state, with powers divided between the federal government and the provinces. Federal law governs the collection of personal information by businesses unless it's superseded by equivalent provincial laws.
So currently, we have three provinces that have their own equivalent privacy legislation: British Columbia, Alberta, and Quebec. Quebec has recently gone through major privacy reform, where the bulk of the new requirements came into force last year. And federally, we've got the Personal Information Protection and Electronic Documents Act, which we refer to as PIPEDA. That applies federally, as well as in any provinces that don't have substantially similar legislation. So I'm going to turn it over to Alycia to give us an overview of what's happening currently from a federal perspective.
ALYCIA RILEY: Thank you, Jasmine. So Jasmine introduced PIPEDA, which as she mentioned, is the overarching privacy legislation in Canada. More recently, in November 2020, the federal government introduced Bill C-11, which was supposed to be a complete overhaul of our existing privacy framework. Unfortunately, Bill C-11 never made it past the preliminary stages because there was a federal election. And so it died on the table.
But more recently in June 2022, there was a resurrection of Bill C-11 with the introduction of Bill C-27 in the House of Commons. And so this is part of a global overall push to strengthen privacy regulations internationally. Of course, the European GDPR is considered the gold standard. And as Jasmine mentioned, we now have Law 25 in Quebec as well, which is a lot more onerous and much more closely aligned with that legislation.
JASMINE SAMRA: Great. So we've got the Digital Charter Implementation Act, excuse me, it's a mouthful, which creates three new pieces of legislation. The first one is the Consumer Privacy Protection Act, which would replace part one of PIPEDA. This would apply to private sector organizations in Canada that collect, use, and disclose personal information in the course of commercial activity, and to those that transfer that information across provincial or national borders. And it would also apply to federally regulated entities like banks and federal works, undertakings, and businesses.
So the second act, the Personal Information and Data Protection Tribunal Act, establishes an administrative tribunal to hear appeals of certain decisions made by the OPC, which is our federal regulator, and to impose administrative monetary penalties if there's a contravention of the proposed new legislation, the CPPA. I'm just going to turn it over to Alycia to talk about the Artificial Intelligence and Data Act.
ALYCIA RILEY: All right. So with respect to AIDA, I mean, that's not why you're here today. We're talking about protecting your businesses from a cybersecurity attack. But certainly, AI is what's on everybody's mind these days. I don't think you could go a day without hearing about it in the news. And certainly, with the degree of transformation that we're seeing in the economy, it's definitely worth a quick mention.
So this particular legislation was not introduced in C-11. It's brand new under Bill C-27, and it would be the first of its kind in Canada to regulate the development and deployment of AI systems. In terms of how that looks, it applies to those organizations carrying out what's called a regulated activity. And that means, and I'll quote from this so I don't misread it: (a) the processing or making available for use of any data relating to human activities for the purpose of designing, developing, or using an AI system; or (b) designing, developing, or making available for use an AI system, or managing its operations.
So it's a really, really broad definition, and there's a lot of language in there that even leaves us scratching our heads. OK, let's break that down. What does it mean to manage the operations of an AI system? What counts as "any data relating to human activities"? So there's a lot that needs to be further explored in this legislation.
And so once AIDA is enacted, or if it's enacted, because we'll talk about its status a little later on, it's going to impose regulatory requirements on AI systems generally. But more importantly, it's going to focus on what we call high-impact systems. And if you look at the original version of the bill online, it's a very broad definition again, and there are a lot of questions about it, because it just says anything that falls within regulation. In other words: go look somewhere else once this is enacted, and we'll tell you what it is.
And so more recently, during the course of committee and debate, there's been a bit of a broadening of the scope, so we have a better sense of what that means. A high-impact system could be one used for employment or recruitment purposes. That can include biometric information, or anything related to the moderation of content found online.
So they're starting to build it out a bit more, which is helpful. But overall, the critical reception to AIDA has been, is it fair to say, not great? Yeah. It has not been great. So where we're at with it right now is that the committee is done taking witness statements. It's going to be moving into a clause-by-clause consideration of the bill next month.
And there's been a lot of advocates and witnesses testifying, saying, is this the right way to deal with it? Are we being too hasty? And on the other hand, this is changing exponentially and rapidly day by day. We need to have something in place. So it's been this tough balancing act between those two sides and trying to find something that is broad enough to recognize the principles that we need to protect while being specific enough to actually give businesses guidance that they need.
So all of this is to say TBC, and you're going to see a lot more about this. But right now, everything is still pretty up in the air. And I'm going to pass it back to you. Yeah.
JASMINE SAMRA: So I'm going to start talking about some of the major changes that the CPPA introduces. The big ticket item is fines. We're finally going to have fines in Canada, similar to the GDPR; actually, some of the fines are higher than under the GDPR. We're going to be seeing administrative monetary penalties of up to the higher of 3% of gross global revenue or $10 million, and increased fines for more serious contraventions of the law, up to the higher of 5% of gross global revenue or $25 million.
And so these would be things like an organization knowingly failing to report a breach of a security safeguard, knowingly failing to keep records on breach of security safeguards, as well as unlawful attempts to re-identify de-identified information. So those are some of the things that will incur these major fines. There are also new auditing and order-making powers for the OPC, as well as a brand new right of private action against an organization for damages for loss due to a contravention of the act.
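The two penalty tiers just described (the higher of a revenue percentage or a fixed dollar floor) can be sketched as a toy calculation. This is purely illustrative and not legal advice: the function name and structure are our own invention, and the figures are simply the caps stated above, applied mechanically.

```python
def cppa_penalty_cap(gross_global_revenue: float, serious: bool = False) -> float:
    """Return the maximum penalty under the tiers described in the discussion.

    Standard administrative monetary penalties: higher of 3% of gross
    global revenue or $10 million. Serious contraventions: higher of 5%
    of gross global revenue or $25 million.
    """
    if serious:
        return max(0.05 * gross_global_revenue, 25_000_000)
    return max(0.03 * gross_global_revenue, 10_000_000)


# For a company with $1 billion in gross global revenue, the percentage
# dominates: 3% is $30 million, and 5% is $50 million for serious cases.
print(cppa_penalty_cap(1_000_000_000))
print(cppa_penalty_cap(1_000_000_000, serious=True))

# For a company with $100 million in revenue, 3% is only $3 million,
# so the $10 million floor applies instead.
print(cppa_penalty_cap(100_000_000))
```

The point of the "higher of" structure is that smaller organizations cannot treat the percentage as a modest cost of doing business: the fixed floor keeps the maximum exposure meaningful regardless of revenue.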
Next slide, please. Thank you. So other than the fines, we're now going to dive into some of the other changes. PIPEDA is based on the 10 fair information principles, which are still codified in the CPPA. And a lot of the changes we're seeing are not completely brand new; they're either case law or guidance we've already had from the OPC. So the first major change is that minors' personal information now constitutes sensitive personal information. While currently it is only implied that children's information is sensitive, now we have it codified, and there are potentially rigorous new requirements with respect to children's information.
We've got some new exceptions to consent. Currently, we only have one way to collect, use, or disclose personal information, and that's with consent, either express or implied. But now we have a couple of exceptions. So we've got an exception for certain business activities, such as an activity related to a product or service, an activity related to information systems or networks, or an activity necessary for the safety of a product or service.
There's also a new authority for collecting or using personal information without consent for legitimate interests and if that outweighs any potential adverse effect on an individual. And there's an assessment that has to be completed to fulfill certain conditions. We also have a new definition for de-identified personal information, as well as anonymization.
So this is very exciting for the folks in privacy, because anonymization is outside the scope of the new act, and it's also a method companies can use to dispose of personal information. Unfortunately, there is a bit of debate about the definition of anonymization. Currently, the definition requires that the modification to the information be irreversible and permanent, and that it prevent direct or indirect re-identification of the individual. That's a very high standard: with technology developing as fast as it is, it's impossible to say with 100% certainty that information could never be re-identified. So I'm going to turn it over to Alycia to talk about some of the other changes.
ALYCIA RILEY: So some of the other changes being proposed under Bill C-27 include minors' personal information now automatically being considered sensitive personal information, which means a higher standard will be imposed with respect to protecting that information, how you can use it, and retention periods.
So generally speaking, even without Bill C-27-- oh, sorry. We skipped forward one slide. Part of the privacy framework as an existing standard is that we need to consider retention periods for information generally. So businesses cannot be holding on to personal information that's no longer relevant, doesn't serve a purpose within the organization, and happens to be 15 years old.
And so with these new inputs, in addition to the existing fair information principles, it's really going to prompt a closer examination of what your existing privacy program is and how is it that you're going to modify it not only for compliance but to minimize risk of liability if you do have a cyber breach.
With respect to security safeguards, again, this is part of a business's obligation under the fair information principles: to safeguard the data it keeps about others. It's not sufficient to just contract that out to somebody else. The accountability principle says that ultimately, the person or business collecting the data is the one responsible for it.
So one of the standard considerations for security safeguards is with respect to multifactor authentication. So that's going to be a standard under Bill C-27 going forward, so that businesses are taking reasonable measures to authenticate an individual's identity. Oops. Sorry, Brent. Not yet.
OK. So new codes of practice and certification programs. This ties into what we've been seeing a lot in the news in terms of voluntary codes of conduct. Ultimately, it's better to have these established codes of practice because that way, there's a consistent standard that everybody is being held to, as opposed to an opt-in method. And then there's going to be some new individual rights under the CPPA as well. So there's the right of disposal, the right to be informed of automated decision making, and the right to mobility.
OK. So as I mentioned earlier, the status of Bill C-27: it is currently still in committee. It's been there for a while. The committee has conducted, I want to say, 70 or 80 days' worth of testimony. It has been a very long road for the committee. But they've finished up with their witnesses, and they invited final recommendations and consultations, which were due on March 1st. So that's all been done. And now in April, they're going to move into a clause-by-clause analysis. After that's done, that's probably when we're going to get the latest draft of the bill, and we'll see just how different it looks. It's probably going to be a complete overhaul.
And if you want to learn more about the development of Bill C-27, we have a ton of resources available for you online. We've published lots of articles about it. We've just got amendments that have been proposed by the minister. So for anything else that you want to read up on, feel free to visit our website, and of course, we're going to continue to post about it. So stay tuned. So now, I'm going to pass it over to my colleague Brent Arnold to talk about cybersecurity.
BRENT ARNOLD: Thank you very much. Hello. Yes. Three centimeters is the magic distance from the microphone. Any questions so far? Because that's a lot to bite off in one go. I see one at the back. No?
AUDIENCE: No, no, no. It was just, you're good on camera.
BRENT ARNOLD: All right. I've never been told that before. So, as an aside: if it seems weird to you that we tacked a bill about artificial intelligence onto a bill about privacy, you're not the only ones who think that. And in fact, as this has gone through readings and committee, one of the constant criticisms has been: why are we rolling out a five-page law on the most important and transformative technology in human history and stapling it to the back of a privacy bill that's ready to actually go? And the answer-- well, there hasn't been an answer. So part of why this has spent so long in committee is that they've been dealing with two entirely different, unconnected regimes at the same time. It should be fascinating to see how that works out.
So I'm here to talk about the scary part of this. As C-27 has been working its way through the legislature, another bill has been working through as well. It's called C-26. And C-26 is a cybersecurity law as opposed to a privacy law. And I paused there to emphasize that because we've never had anything like this in Canada at the federal level.
The difference between cybersecurity and privacy, essentially, is that privacy is about safeguarding individuals' rights, your right to privacy, obviously. Cybersecurity is about protecting organizations and, by extension, protecting individuals. So it's about imposing obligations on organizations to make sure they're taking the measures they need to take to protect basically themselves and everybody who depends on them.
In privacy laws, and even C-27 is no exception, you see a brief mention of obligations to keep information secure, but it's really as simple as that. It's usually a couple of sentences. It's a duty, but it's not fleshed out in any detail. C-26 plugs some of those holes. And I should say, you'll get these slides. As we've been going through some of these things, I know the print is a bit small, but it's meant to be read on your computer. So you'll have that, and obviously, if you have any questions, you can follow up with us.
So speaking of bills that do more than one apparently unconnected thing at once: C-26 does a couple of things. First, it amends portions of the federal Telecommunications Act. Second, it enacts the Critical Cyber Systems Protection Act to provide a framework for the protection of critical cyber systems that are vital to national security or public safety and that are operated as part of, and this is the language of the bill, a work, undertaking, or business within the legislative authority of Parliament.
Why does this last bit matter? Why is that in the definition? Any guesses? It's to tie the bill to the federal government's constitutional authority to do anything about this in the first place. So it's a cybersecurity bill, but the federal government is limited in scope to what it can do under its constitutional powers, and the rest falls to the provinces. So this is the federal government's attempt to deal with what it's legally allowed to deal with.
The bill uses the term vital, but it means critical infrastructure, which is the way you hear it discussed everywhere else on the planet. Can you advance to the next slide, please? Thanks. So let's talk about the first piece of this, which probably applies to the fewest people in the room or at home, except by extension, because it's intended to protect you.
This is intended to empower the government to compel action by telecommunications service providers specifically, including forcing them to terminate service agreements or to stop using certain products and services in their vertical. Functionally, what this is really about is the fact that we had a debate in this country for a very long time, and frankly too long, as to whether or not a certain company out of China was going to be allowed to participate in building Canada's 5G network.
We came to that decision a lot later than most of our global partners, and nobody benefits from that decision taking too long, because if you let it take too long, you end up in a situation where stuff's already installed. Equipment has already been purchased. So this act gives the government the specific power to say, nope, you're not going to work with them, or nope, you've got to rip all that stuff out, if they think there's a security issue. And if that seems unconnected to the next part, it is, except that this is all globally under the theme of security. Next slide, please.
Under the CCSPA, as I'll be referring to it by acronym, vital organizations have a set of new obligations. And as an aside, we'll see in a minute that there isn't one regulator dealing with this. Unlike the privacy bill we just talked about, there isn't a single regulator responsible. What the bill does is create a legal regime of responsibilities for organizations across a set of sectors, which we'll get to in a second, and then put the responsibility for administering those obligations on the given regulator for each sector. This will make sense in a second, I promise.
So the organizations have to create and file a cybersecurity program with the appropriate regulator. They have to disclose any material changes to the regulator. They have to take reasonable steps to mitigate cyber risks. They have to keep records of the steps they take, and they have to report cyber breaches.
That's a different obligation than the duty to report under our privacy laws. There's a legal threshold, which you saw on the smaller but very colorful slide a few slides back, governing when you have to report a privacy breach under the privacy legislation. This is a different threshold. We don't need a real risk of significant harm or anything like that. It's just: was there a breach?
So actually, back up for one second. The bottom line is this: this is a show-your-homework kind of law, right? It's essentially saying, you've got to have something in place, and we're going to assess whether or not it's good enough. Now, this is going to hit organizations differently. The really big organizations, and we'll see in a minute when we get to the sectors, we're talking about the biggest institutions in this country, are already doing this. Not all of them are doing it as well as we would like to see, but they're already doing this.
So they are essentially going to be asked to explain what they've already done, and then they're going to be put in the position of having the government tell them whether it thinks that's good enough. We'll come back to that in a second. I can tell you, though, that in these industries, in these verticals, there are lots of mid-sized and small players, right? These are complex machines with lots of different cogs in them. Many of those companies are not ready for this. And I can tell you that because I'm one of the people that mops up when they get hit by a cyber attack, and we discover that they've done almost no planning. Next slide, please.
So as I said before, the power to administer this law and these obligations is going to be passed along these different regulators. This is, by the way, this is a good idea because these regulators know their business. If you're in one of these industries, you probably have your own beefs with your particular regulator. But at least they understand how your business works.
I would say that's better than the situation we have with the artificial intelligence bill that you just heard about, where you essentially have responsibility for artificial intelligence in the abstract, administered by one regulator. In this case, you have administrative responsibility for cyber sitting with a regulator that understands the industry well.
This is closer to what we see in terms of the way that artificial intelligence is regulated in the US, where it's a sector by sector approach, putting the power in the hands of the regulators that know the industry and know the people they're regulating. Next please.
So the bill has teeth. But you can see these teeth are sharper if you're a smaller organization, right? They can impose fines on individuals. So again, there's two things. One, the dollar amounts are big. And two, they're not just going after companies. This is not going to be a cost of doing business tariff on businesses that decide not to do this. They're going to come for the officers and directors of those organizations if they're not doing their jobs.
And you see here, $1 million per individual, $15 million per organization. They can also issue compliance orders and conduct audits. The bill also imposes responsibility on those organizations, to be named later, we don't know who they are yet, for ensuring that their vendors and supply chains operate securely.
And this is something we see in privacy law as well. There is an expectation in privacy law that you're not doing your business alone; you depend on a supply chain, vendors, and so forth. And you are supposed to, under privacy law, impose contractual obligations on the companies you do business with to make sure they are protecting the data when you share it with them.
Very similar principle here. You need to make sure, if you're getting components or equipment that goes into your equipment, or if you are, let's say, running a nuclear power plant, that whatever companies are providing you with the materials you're using, the software, all the rest of it, are secure as well. And it's on you to make sure that they are. Next slide, please.
So how do we feel about this bill? The civil rights groups don't like it, and for good reason. And I haven't gotten into all the details on this; we're giving you the short version, because there's only so much you can do in an hour. But this bill, how do I put this? It creates a sort of shadowy system of accountability, where you're reporting to the government, the government is taking steps, and so forth, and the government is also allowed to share the information it gets from your organization about its cybersecurity preparedness. That's information companies don't like to share, for good reason, right?
Essentially it's like saying, well, here's all the steps we've taken. Here's a copy of the key to the front door, and on, and on, and on. You don't want that information to be shared with anybody if you're that organization, and you probably don't want to share with the government because guess who gets hacked a lot? The government, even recently.
But the government is also allowed to pass this along to law enforcement. They're allowed to share it with their other partners, other countries, and their intelligence networks. Well, in the interest of global security, our government does business with governments that you may not want to give information about your cybersecurity posture to, especially if, and I know this is the most extreme example, you're running a nuclear power facility. There are certain countries that we work with, perhaps reluctantly, sharing intelligence and so forth, that you may not want to know the details of how you keep your nuclear power plant secure.
So, bringing it back to the civil rights groups: they don't like that there's this whole process where we can't really see, where businesses and individuals can't challenge, how their information is being shared, what's being done with it, and so forth. The companies don't like that they have to share it in the first place, especially the big companies that are already doing a good job of this. And they would tell you, I suspect, that they know more about it than the federal government does, because they've been doing it a while.
And it's been attacked by business groups as too punitive, imposing unnecessary compliance costs on those already mature organizations who again have been working towards this for 20 years, let's say, probably more in most cases. Next slide, please.
So how's this bill coming along? This is a really important piece of legislation from a national security perspective, right? This is the thing that's going to protect you from having your internet shut off by cyber attacks. It's going to protect your nuclear power plant from being shut down. It's going to stop the banks from being frozen and unable to process your money. These are all things that we need working every day, or society collapses. Seems important, right? Seems like a bill you'd want to pass quickly, right?
So how are we doing? First reading, 2022. Second reading, almost a full year later. We've had a grand total of two committee meetings. When's third reading coming? When are we getting to clause by clause and so forth? No, no. So that is the urgency with which this important piece of legislation is being treated. And apologies if it sounds like I have strong opinions about this. Next slide, please. Any questions on any of that so far? Yeah.
AUDIENCE: Why the difference? Why is, I guess, C-26 being treated differently than C-27?
BRENT ARNOLD: So the folks at home, the question is I think why is it taking so long compared to C-27? Yeah. Yeah, yeah, yeah. I'm giving the short version. Political priorities. And the good news here, and Canadians love this, is no matter who you vote for, no matter what party you support, there's someone for you to be mad at because the government is responsible for getting this on the agenda and deciding how quickly it moves through parliament, and it has not been making it a priority. Other things have been ripped from the headlines and made priorities instead.
When it gets to committee, you get to blame the opposition because they've been filibustering to complain about things like why did Trudeau invoke the Emergencies Act when people started showing up and putting hot tubs in the middle of Sparks Street? And also, what's the deal with everyone stealing cars in Canada? Why aren't we doing anything about that? So these committee meetings that were supposed to be dealing with this bill have been hijacked to deal with other completely unrelated priorities and basically grandstanding. As I said, no matter who you vote for, somebody is wrong. Is this you? Yes.
JASMINE SAMRA: Thanks, Brent. So now, I thought it'd be an interesting time just to give you guys an overview of privacy breach requirements before Brent jumps into the cybersecurity stuff. So we've got three mandatory regimes for breach reporting: federally under PIPEDA; Alberta, which was the first, in 2010, to introduce mandatory breach reporting; and recently, as of last year, Quebec.
BC has breach reporting requirements, but they're voluntary. So I'm just going to focus on the mandatory breach reporting requirements. So what does that mean? So a breach of security safeguards under PIPEDA or a confidentiality incident in Quebec occurs when there is a loss of, unauthorized access to, or unauthorized disclosure of personal information resulting from the breach of an organization's security safeguards.
So when that occurs, what you have to do is take a look and see if there is what we call a RROSH, which means a real risk of significant harm, or in Quebec, a risk of serious injury. And so the test for that is whether the breach creates a real risk of significant harm based on the sensitivity of the personal information.
So the more sensitive the information is, the more likely you are going to have to report. And whether there is a probability that the personal information will be or may be misused. So a lot of times, if there's a breach due to a phishing attack or some other sort of cyber attack, a lot of times, folks have a malicious intent. So those types of breaches are normally reportable. So this just kind of sets out and compares the three breach regimes. Next slide.
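The two-factor test described above, sensitivity of the information plus the probability it will be misused, can be sketched as a toy decision helper. Everything here is illustrative: the category names and the simple rule are assumptions for the example, and a real RROSH determination is fact-specific and needs legal advice.

```python
def breach_is_reportable(sensitivity: str, likely_misuse: bool) -> bool:
    """Illustrative sketch of the 'real risk of significant harm' (RROSH) test.

    Mirrors only the two factors named in the talk: how sensitive the
    personal information is, and whether misuse is probable. Category
    names below are invented for the example.
    """
    sensitive = sensitivity in {"financial", "health", "government_id"}
    # Per the talk: breaches with malicious intent (e.g. phishing attacks)
    # are normally reportable regardless of category.
    return sensitive or likely_misuse
```

For instance, a phishing-driven breach of even low-sensitivity data would come out reportable here, matching the point that malicious intent usually tips the scale.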
And then on this slide, I've just put together kind of the notification requirements. So you not only have to notify the regulator. You also have to notify individuals in those jurisdictions. And we've just laid out kind of the nuanced differences between the different regimes. And under PIPEDA and in Quebec, there's also a mandatory record-keeping requirement.
So even if you have a breach and you determine there is no real risk of harm, you have to keep a record of that information. The regulator can ask for that at any time. And under PIPEDA, federally, you need to keep that information for two years, and under Quebec, you need to keep that for five years. And now I'm going to turn it over to Brent to tell us some cyber stories and talk about incidents he's been involved with.
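The retention periods just mentioned, two years federally under PIPEDA and five years under Quebec's regime, lend themselves to a small sketch. The regime keys and the naive year arithmetic are assumptions for illustration, not a compliance tool.

```python
from datetime import date

# Breach-record retention periods named in the talk:
# PIPEDA (federal): 2 years; Quebec: 5 years. Keys are illustrative.
RETENTION_YEARS = {"pipeda": 2, "quebec": 5}

def retention_deadline(breach_record_date: date, regime: str) -> date:
    """Earliest date a breach record could be destroyed under the named
    regime (naive year math; does not handle Feb 29)."""
    years = RETENTION_YEARS[regime]
    return breach_record_date.replace(year=breach_record_date.year + years)
```

The practical point is that the record must exist even for non-reportable breaches, since the regulator can ask for it at any time within that window.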
BRENT ARNOLD: We'll talk about the next slide. So I appreciate these are a bit small when you're looking at them on the screen, all of those-- oh, question at the back.
AUDIENCE: Yeah. One quick question from online, and you're going to have to forgive me. I hope I make sense. It's a question on Bill C-26, and it said, that in the sectors of transport, finance, nuclear, et cetera, defense was not there. And why was that or why is that?
BRENT ARNOLD: I think because they don't want-- it's a good question. Let me back up for a second and say-- I should have said we don't know who the organizations are yet that it applies to. We know what sectors they're in, but there's a schedule that's to be populated later. So later on, organizations are going to find out, surprise, you have these obligations. No idea when that will happen after the bill is passed.
I suspect the answer as to why defense isn't in there is that it's something that they consider too top secret to go through the same regime as all these other things. I mean, there's accountability there through different structures. But that's just a guess. And because we're only two committee meetings in on this, it's not a subject that's been explored, although it may be.
I should say, on the slides that you've seen, the colorful ones with lots of text on them, all of that stuff is available on the Gowlings website. Just go to the privacy landing page. And if you sign up for our privacy newsletter, you'll get this content delivered right to you.
Back to cybersecurity. So how bad is it? It's bad. You see here, this is a Statistics Canada study talking about the impact of cyber crime on Canadian business, and this is a couple of years old now. Under one fifth of Canadian businesses were impacted by security incidents in 2021. Canadian businesses reported, and that's important, spending over $10 billion on cybersecurity just in 2021. Impacted businesses spent more to prevent and detect cybersecurity incidents than they had in the past, and Canadian businesses are implementing formal policies for cybersecurity.
Now this is self-reporting, right? This is just like a lot of other kinds of crime. It's vastly underreported. And I can tell you that because I have clients that tell me they don't want to report it. And for the most part, there isn't an obligation. You have your reporting obligations under the privacy law. But if you have a breach that doesn't manage to hit that test of real risk of significant harm, there's no real obligation to report it to law enforcement or to government. We encourage reporting, incidentally, but you can't force someone to do it.
So that's just the ones we know about. Next slide, please. We're jumping around a little. I apologize for that. So what do you do to prevent this? And then perversely, we're going to talk about what happens when an attack happens anyway. But what do you do to prevent it? One, you want to verify your technical and physical information security measures meet legal obligations. Physical security gets overlooked in all this.
I had an interesting case of a company that was a lender. So people were applying for loans. They would send in their applications electronically. The company had great policies about how information is dealt with in the office, right? No USB sticks, no sending stuff out by email, no accessing stuff remotely. It's pretty locked down.
Guess what kind of technology can get around those technical safeguards. This. A notebook and a pen. And that's what happened. Somebody answered an ad and applied for a job with the intention of stealing information while working in that position. And that's what they did. They sat down at their computer and wrote out information from loan applications into a notebook that they took home every day.
The person got caught because she was dating somebody who was also a criminal, who was engaged in credit card fraud. And when the police raided their apartment, they found the notebook. So again, and I've dealt with situations. I sometimes get brought in to help with companies where let's say, a small company, tech company is working on an angel round of investing or something, and the company that's spending the money wants to know how safe is my investment.
So we'll go through all the questions about what are your technical measures. And then I'll say, so where are the servers? And they're like, they're in that closet that doesn't have a door or doesn't have a lock on it, that anyone in the company can just walk into. So the physical security is very important too. And I'm emphasizing that today just because we don't talk enough about it.
You want to get breach assistance lined up in advance. I would say, and of course, the lawyer would say this, but there's other people that would tell you the same. You want to have legal counsel in place, preferably legal counsel that has done this before, because they're going to be the ones you want sort of running the crisis response if the breach happens, in part because, anyone want to guess?
The magic pixie dust of privilege. That's right. You want the lawyers to retain the other vendors in the space. You want them reporting to the lawyers, so that you can frame all this as being done in the service of providing legal advice, which it is. And that gives you at least a chance, largely untested in Canadian law, of protecting privilege over your communications with your forensics vendors, your crisis communications professionals, and others in this sort of matrix of first responders that work together.
You need something called an incident response plan and the related policies in place. Anyone know what an incident response plan is? I know you do. An incident response plan is your playbook in case a breach happens. So literally, you open it to the first page, and it tells you: who do I call first? What do I do first? We had to revise a lot of these during COVID because a lot of these plans start with get everybody together in a room. That doesn't work well if everyone's remotely deployed because they're all a biohazard.
So you need to update these plans very regularly, not least because people come and go from an organization. One of the things the plan identifies is who needs to be around the table. Who gets called first? And if that person is not there anymore, you're starting off with chaos right away.
So the incident response plan is crucial. And the policies sort of lock down how is data dealt with on a day to day basis in a way that's secure, in a way that's mindful of cybersecurity? And if I take a step back, I might for time-- do I have time to blather on about this or not? Excellent.
So cybersecurity is different from IT. Cybersecurity is different from IT. And I emphasize that because when I get brought in to deal with a breach, often they say, yeah, our IT guys are dealing with it. But cybersecurity professionals are trained differently. It's a different way of thinking. And cybersecurity isn't just about tools and tech. It's about process, and it's about planning.
So if you're looking at things from a cybersecurity perspective, you're not just thinking, do I have a firewall in place? Do I have anti-spam? Do I have virus protection? And on, and on, and on. You're also thinking about, let's assume a breach happens. What are they going to be able to access if they get in this way? How do I prevent them from getting this other information? How do I segregate all that information, so that if a threat actor gets in, the amount of damage they can cause is as minimal as possible?
One of the ways you do that is you segregate your data. You apply what's called the principle of least privilege, so that people who don't have any business having access to sensitive information don't have it. Because the threat actors are going to get in however they can. And if that means somehow getting their hands on the credentials, the login and the password, of somebody who's relatively low level, and that person has access to everything, they've got access to everything.
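Least privilege, as described above, can be sketched as a simple role-to-segment mapping: each role can reach only the data segments its job requires, so one stolen low-level credential does not expose everything. The role and segment names here are invented for illustration.

```python
# Minimal sketch of the principle of least privilege. Role and segment
# names are illustrative assumptions, not from any real system.
ROLE_ACCESS = {
    "clerk": {"ticketing"},
    "hr": {"ticketing", "employee_records"},
    "finance": {"ticketing", "payment_data"},
}

def can_access(role: str, segment: str) -> bool:
    """Deny by default: unknown roles and unlisted segments get nothing."""
    return segment in ROLE_ACCESS.get(role, set())
```

The design point is the deny-by-default posture: a compromised "clerk" credential reaches the ticketing segment and nothing else, which is exactly the damage-limitation Brent describes.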
So you see, it's a mindset. It's a way of thinking about organizing data. That's not IT's job. And the policies are what set all that out in detail. You need to conduct regular data breach exercises. We call these tabletop exercises. And that's when you get people together, and you go through the incident response plan. It's like a really depressing board game. But this is where you find out whether your plan still works, whether there are aspects of your business that are new and aren't covered by the plan, whether you need to replace some of the people that are in there, whether you've got everyone around the table, all the departments, that need to be there or not.
Cyber insurance is important if you can get it. That's getting harder now because it was relatively new in the Canadian market. Insurance companies weren't exactly sure how to price it. So it was available for a while, reasonably priced. And then among other things, we had an absolute outbreak, a bloom of cyber attacks during the period of COVID that has not slowed down.
So the insurance companies have been hemorrhaging money over cyber coverage, which means now it's more expensive, it's harder to qualify for, and there's more exceptions and exclusions than there used to be. In the US, one insurance company actually sued one of its insureds. Why? Because when they applied for the insurance, the question on the application, and I'm simplifying this a bit but not by much, was: is there multifactor authentication? Yes, they have multifactor authentication. They didn't have it on everything though, and they didn't have it on the system that got compromised. That was the way the threat actor got into the system. So the insurance company said, well, you misrepresented this, and we want our money back, and we're terminating the policy. And that's what happened. We haven't seen that happen in Canada yet, but I suspect it may be a matter of time.
Again, and we talked about this before. You want to pass those privacy obligations and data protection obligations downstream to the companies you do business with via clauses in your contracts. And this is something that a law firm that has experience in this area can help you with. Next slide, please.
So just so you have a sense of this, I won't ask who's had the pleasure of going through a data breach before, but this is broadly how this works. Now this looks like it's a sequence of things. But in reality, there's a lot of overlap. A lot of this stuff has to happen simultaneously, and you start at the beginning.
Let's talk about what's in there. One, you have to stop the bleeding. You have to identify the nature of the breach and contain it. You need to contact your insurer and make sure that they know because if you don't, and you've probably had this experience in other circumstances with insurance. If you don't notify your insurer, you might find yourself off coverage.
And the thing to know about cyber insurance is it tells you who you're allowed to use to help you. I've had the situation of being brought in to deal with the breach where we were several days into it. The IT department had brought in a cyber forensic vendor to help them get back on their feet. They called us to do the legal work on it. And I said, have you looked at your insurance policy? And like no, no. We should do that. And they read the insurance policy. And it said different law firm. Different forensics vendor. And I don't know whether they were off coverage, but they technically could have been found to be off coverage at that point.
So make sure that they're involved at the beginning. You want to get your data forensics firm in, and again, you've identified them in advance. So this is easy. You've already got the contracts in place. You don't have to spend time drafting those or revising them while the breach is happening.
And you may want to bring in crisis communications, particularly if you're in a situation where the fact that you've been hit by a breach is obvious to the public, right? Your website is down. You can't respond to email. Your phone lines aren't working, that sort of a thing.
You need to investigate. You have to figure out, how did this breach happen? What was the cause? Because you can't stop it and seal it if you don't know the cause. And incidentally, the privacy regulators, if you have to report, are going to ask you these questions. How did this happen? How do you know it's not going to happen again?
You need to preserve the evidence because there could be litigation downstream. I recently helped defend a class action in this space. It affected six million Canadians. And the evidence is important on this because you want to be able to show we had reasonable steps in place. The good news for you, if you have the pleasure of going to court over a data breach, is the fact that you are hit by a data breach doesn't mean that you are liable for that necessarily. Doesn't mean you did anything wrong.
And this happened. It's a perfectly plausible outcome that a judge looking at what you had in place would say, that's reasonable. Particularly if you were hit by a very sophisticated threat actor. Let's say the Russian state or the North Korean army, which happens. Nobody expects your mid-sized business to have cyber security so good it can deflect an army, an actual army of people in a bunker in North Korea who spend all day doing sophisticated hacks. So it has to be reasonable.
So that's where your evidence is important. You need to figure out who's affected because you may have notification obligations. You may also want to do what you can to protect them. You have to figure out what's our potential exposure? Notification and message management. There's two pieces to this. One, you might have to report to privacy commissioners and notify affected individuals.
Again, that's if you've hit that threshold where the privacy law requires you to report it. And you might decide to tell people anyway because you want to protect them, and it may just be good customer relations or client relations to do it. And remediation. You need to look after the people who are affected because the court looking at this, if it does go to court, is going to treat this the way they treat product liability. Did you act responsibly, quickly, and did you take care of the people who were harmed by this? So it's the same as when you do a product recall.
What does that look like? It means, among other things, you're providing free credit monitoring services through Equifax or TransUnion for the people who are affected so that they can protect themselves. And you need to plug the holes in your cybersecurity. But you can kind of see, you don't leave some of these things until they come up at a stage down the road. You're thinking about this all the way through, and your legal counsel should be thinking about all of this, all the way through, right from the beginning. Next slide, please.
So the last bit. Lessons learned and continuous improvement. And again, this is something that courts and regulators expect to see. It's not enough to say, huh, it happened. What are you going to do? You need to be able to say, we fixed this, and here's what we've done to make sure that it doesn't happen again in this exact way, and hopefully nothing else happens in some other way.
You need to document the changes and improvements you've made, update your incident response plan and your policies if they didn't cover this particular scenario. And as I said before, you need to be able to show what you did to fix this, what you put in place, how you've looked after the people who were affected by it. That's what you want to do to put yourself in the best possible position in case you have to answer for what's happened here and what you've done about it. Next slide. That's it. Any questions on that part or anything else we've covered?
AUDIENCE: I have a question. Could you tell us a bit about how you might navigate things when it's like not necessarily a data breach for personal information but your client's information?
BRENT ARNOLD: So the question is how do you deal with a breach where it's not a scenario where personal information of your customers or clients or your employees is affected, but it's just corporate information. And the answer is you deal with it largely the same way. The only piece that you back out of this is the notification to affected individuals and the reporting to the privacy commissioners, because those legal obligations aren't triggered. But all the rest of this, you still have to do.
And depending on what the data is, you might still have to worry about a lawsuit because for instance, let's say you're a vendor in an industrial sort of vertical, and you're providing parts, and you've got intellectual property, and specifications, and trade secrets belonging to the company that you're providing stuff to. All that stuff may end up disclosed.
And there's a couple of ways that can happen. One, you might be the victim of an attack where that's the goal. And real quick, if the attack is from Russia, it's probably just thugs. Criminals trying to either just harass for the sport of it because they don't like that you supported Ukraine, or they're trying to steal money. That's the usual scenario. That's what we see with the ransomware attacks.
If it's North Korea, they are stealing money to fund their missile program. So we dealt with a breach a few years back where several hundred million dollars in crypto was stolen. And it turned out it was, again, the North Korean army with the goal of funding the missile program. If it's China, it's a state-directed attack to steal intellectual property that is then handed over to Chinese industry and reverse engineered to give them a competitive advantage. And we have seen that happen before. That's what happened to Nortel, we now know, in the early aughts.
So you can end up in the same place regardless of how this happens because it could be a targeted attack on your intellectual property, or it could be a ransomware attack where they steal your data and, again, their goal is money. But their leverage against you is that they're going to dump it on the dark web if you don't pay. So even though they're not giving that to your competitors, they've made it publicly available for anybody.
So the way that we deal with this, pretty much the same. And sometimes, the difference is the sophistication of the attackers because we don't have time to talk about the whole ransomware ecosystem, but there's a scale of sophistication of attackers from schmucks in basements who buy stuff off the dark web or participate in ransomware as a service.
And they're just sort of low skilled attackers using off the shelf stuff they get on the dark web. And at the other far end, you've got sophisticated state actors participating in the attacks. So your threat could be anywhere on that spectrum. It's a long-winded answer to what seems like a simple question. At the back.
AUDIENCE: Yes. Question from online. Under C-27 item 2, what is considered a legitimate interest for a business activity? An example would be helpful.
BRENT ARNOLD: I bet it would. Yeah. This is a broad exclusion.
JASMINE SAMRA: The answer is we don't know yet. So the whole purpose is to be able to balance. So you'll have to do an impact assessment. So basically, you have to look at what you want to do, what the adverse impact on individuals might be, and what mitigating strategies you have in place. And then based on that kind of balancing test, you would determine whether it would be in a legitimate interest to proceed with such activity. So a lot of secondary things like marketing may fall into that bucket. We have yet to see if that is going to be the case.
BRENT ARNOLD: So our considered opinion, our answer to the question, is: it depends.
AUDIENCE: I got a question about breach. Just wondering what you're seeing in the market, we've never had-- our company has never had a breach. We prepare for one as best we can be. But we have had vendors and customers that have had breaches in the last 24 months.
We try to build it into our contracts. I've been dealing with privacy and security for the last 10 years, and one of the issues we ran into last year is we had a client that wasn't forthcoming to us about their breach, and they had access to our system. So we've revamped our contracts now to essentially build disclosure obligations into them: if they're on our SaaS solution and they have situations happen, they have to tell us.
We try to put it in as much as possible. I haven't seen that before just because it hasn't been a concern. The concern has always been the data and just notification of the data, and it's always been vendor-facing. But we've been reexamining all our partnership agreements and vendor agreements to make sure that this sort of provision is there. And our big thing is reputational. Like, we don't want to be caught in your shitstorm, so to speak, and have our name brought into it, that, oh, this platform is part of the breach, even though we weren't affected by it type thing. So just wondering if you've seen that in the industry, and what you would recommend for companies that are associated with breached companies.
BRENT ARNOLD: So the short answer is yes, I have seen that. I'm dealing with it on a file right now, and it's a mess. The question for those at home, have I seen, and how do you deal with the scenario of a company in your supply chain? A vendor, let's say, who has access to your platform because you're to some degree tethered together and integrated so that you can do the business that you do together.
They get hit. They are tightlipped about it. They're not forthcoming. And you have to worry because you're still connected to them in this way, and you don't know whether or not that threat can-- you didn't say this, but this is what I'm worried about. That threat can transfer to you because the threat actor may have access to you by virtue of the connection that you have to this company through the platform.
So yes. I have seen that. And we are now seeing, and in fact, we're doing this work. Revising contracts to deal with that scenario, so that yes. They have to report it to you. And it's a different threshold contractually. That's something that comes up a lot in data breaches.
One of the first things I ask the clients once we get things locked down is what do your contracts say you're supposed to do? Because some of those contracts, the newer ones, will have provisions that say you have to notify immediately about any breach that affects our stuff or potentially affects our stuff.
That's a lower threshold for having to report than the privacy law provides for, right? There's no harm threshold built into that. It just says if there's a risk, you have to tell us, and you have to tell us immediately. Often this doesn't come up until several weeks into the breach. So if the client hasn't mapped out those obligations in advance, and frankly, they don't have time to worry about that when they're dealing with the immediate issues to do with the attack, often that part gets mopped up later.
So yeah. You as the partner in this relationship, you want to have clauses in your contract that require them to tell you. You want a low threshold, so that they have to tell you even if it's a seemingly innocuous breach. Not every time somebody tries to tag them, because that's happening all the time, but any time that they potentially get access to your data. You want audit rights, so that you can ask questions and kick the tires, to say, what are you actually doing? Because often, we see companies give assurances about what they're going to have in place, and they don't actually do it.
Too many companies treat these things as a tick the box exercise in compliance the way they treat other kinds of legislation, for instance. So yeah. You want to build those rights into your contracts. You're absolutely right that it's not something you're going to see in an older contract. But the new reality is you need that.
And as we saw here, either from a privacy perspective or with respect to cybersecurity laws, you may have an obligation to do that, to sort of pass that on. You want indemnification. You want other ways to guarantee you'll be made whole if they're the ones responsible for you taking a hit on all this. So the short answer is yes. I'm seeing it. We should be seeing more of it, and it's a real problem. Yeah.
AUDIENCE: Just to clarify, like I get the vendor side. But we're looking at it customer-wise too with our customers. Like we get access to our platform. You have a breach, and that breach infiltrates our platform because of your breach. Like so it's customer-facing. They haven't gone so far as to ask for an indemnity yet, but I am putting like notification up the chain from the customer back to the vendor type thing.
BRENT ARNOLD: So the clarification is what about where the business relationship here isn't vendor supplier, the vendor purchaser, it's your company and its customers. And what if the customer, let's assume it's a business customer, that the attack happens on the customer side. That's interesting. No, I haven't seen much along those lines.
It's very hard to do if your customers are individuals, not least because they're probably not even reading their contracts. But yeah. If it's a well-documented relationship where you've negotiated a contract with a sophisticated commercial entity that's your customer, I don't see why you wouldn't build that in. And in fact, I think it's in their interest and yours that it is built in there. Any other questions? At the back?
AUDIENCE: Yep. Another one from online. Bill C-27 a to b. I hope you know that because I certainly don't know exactly which one that is. But the question is if a business is using AI for resumes, is the business now required to inform the individual that they are using automated decision-making up front?
ALYCIA RILEY: OK. So it's not passed yet. So there's not a standing requirement under the bill. But what's going to be required is there's going to have to be an assessment to see how they're using that information. More recently though actually, under the Employment Standards Act in Ontario, there's currently legislation that will require employers to disclose if they're using AI in recruiting systems. So we're seeing some development on the provincial side, and that's actually more likely to come into effect sooner than anything we're going to see on the federal side.
And just one thing I wanted to mention about cybersecurity because we've talked a lot about how do you protect your systems internally. And Brent, you talked on this earlier, but training your employees. This is where I'm putting my employer hat on. A lot of cybersecurity attacks, they happen because of phishing, right? The employee just thinks, oh, yeah. I've worked with that client before, and they click on this link. And now you've opened the door to your systems. That happens a lot.
So to the extent that you can, have these procedures in place, do the planning, but then make sure that everybody knows what the plan is and not just those who are going to be dealing with the crisis incident if and when it happens.
AUDIENCE: And on that note actually, we have a question. And the question is can you describe the minimum level of breach at which point you need to report, and is every instance of phishing something you have to report?
BRENT ARNOLD: So before I tackle that, let me just say thank you for reminding me to address the training issue. And don't just make this a once a year thing where somebody does a 10-minute sort of online exercise that they can then repeat over and over until they finally get it right or cheat on it.
This needs to be built into the culture, and it needs to be very regular because most organizations have people turn over quickly, right? Like people come and go. And you really need to instill a culture of being careful about this stuff. Again, this is another area where companies get themselves into trouble when they treat it as a tick-the-box thing. So yeah, absolutely.
I would go further than my colleague to say it doesn't just happen a lot. Most of the breaches I deal with happen because some person has been compromised in some way. Somebody clicks on something they shouldn't. Somebody is careless with their login credentials, and they get picked up. Somebody uses the same dumb password for everything. So when somebody manages to get your Netflix password, they now also have your work computer password. And on, and on, and on, and on.
In terms of what's the minimum threshold at which you'd have to report, if you mean under privacy law, we've been talking about federal privacy law. There are also provincial privacy laws that deal with other areas on this. The bottom line is if it doesn't reach that threshold where we think that that information has been accessed in a way that gives rise to that real risk of significant harm, you don't have to report it.
But that's not as simple as it seems, and this is where getting some legal advice is a good idea. Some commissioners take the position that your data has been accessed by the threat actor even if it's only encrypted and isn't exfiltrated. So just to be clear, the cyber attacks we see these days, the ransomware attacks, are two-stroke attacks, right? They get into your system. They copy a bunch of stuff, and they download it. And then they encrypt your files. So they have two threats against you. One, we're going to dump your stuff on the dark web or sell it to your competitors, and/or two, we're never going to help you recover your files. So we have your stuff, but you don't have your stuff.
Some privacy regulators take the position that even if they don't get your data and they don't know what your data is, all they've managed to do is make it unusable to you, they've still accessed it, and there's still a real risk of significant harm to the individuals. I'm not sure I agree with that, but it doesn't matter what I think. That's what they think.
So whether we've crossed that legal threshold isn't always a straightforward question, and this is where knowing the decisions that have been coming out from the privacy commissioners is important, and it's important to keep an eye on that. So the answer is, it depends. Another question at the back.
AUDIENCE: Yeah. There's one more. Are phone numbers and addresses for customers classified as personal information the same as bank information?
JASMINE SAMRA: It depends on whether that's business contact information. Under PIPEDA, there's a carve-out for business contact information if it's used in the context of that individual's job. So that would be out of scope.
BRENT ARNOLD: Yeah, and there are degrees of sensitivity. And the problem with this is that it's not just a question of whether this kind of information, this piece of information, meets the threshold. There's some that does. Health information, banking information. Those for sure get you over that threshold of real risk of significant harm.
But for other things, it's a question of this piece of information by itself isn't useful, but this piece with these two other pieces means I can identify that it's that guy whose information I have. Now we've crossed the threshold. So it's a holistic exercise, and that makes it really messy to deal with when you're dealing with a data breach and trying to figure out who's been affected. And it doesn't help that the thresholds are different in different jurisdictions. So it might matter whether the person I was pointing at is in Europe, or in Alberta, or in Maine, because those are all different laws with many different thresholds.
JASMINE SAMRA: Yeah. And I just want to add that we still have record-keeping requirements under PIPEDA. So for instance, if an employee loses a laptop that had some financial information on it, but you were able to cut off access, you have control of it now. That's not reportable, but you still have to keep a record of it. So all these breaches have to be recorded. The privacy commissioner can come in and say, I want to see all your records, and they'll go through them to see if you're assessing these situations correctly.
AUDIENCE: So a related question then on that. Can you explain to us the importance of a privacy officer within an organization?
JASMINE SAMRA: So in Canada, you are required to have a privacy officer or someone designated as a privacy officer. In Quebec, the person with the highest authority in the organization, essentially the president or CEO, is automatically the privacy officer if no one else is designated. So you need someone who will oversee the privacy compliance of the organization as well as be able to answer questions from the public. Individuals have the right to access and correct their personal information. So you need to have a system to do that. And usually, the privacy officer is in charge of and responsible for that.
BRENT ARNOLD: Have we utterly worn you down and depressed you, or are there any other questions? These are good questions by the way.
AUDIENCE: We work in the defense industry. So what we're seeing now is the Canadian Centre for Cyber Security is putting out a program, and they want to start implementing that by December 2024. So there's a whole bunch of requirements that are going to be pushed out to contractors, regulating information that flows through the cybersecurity or information systems that you might have. So in your opinion, do you think that's going to inform what happens with the other critical sectors when they're going through C-26?
BRENT ARNOLD: So first of all, that's a very helpful comment because it reminds me of something we didn't cover. We have someone here who's in the defense sector, reminding me that there's guidance coming down the pike from the Canadian Centre for Cyber Security, which, if you don't know about it, learn about it. The Canadian Centre for Cyber Security is the federal agency that's responsible for this stuff. And they put out incredibly good guidance, and training, and certification programs for free. So they're very valuable for mid-sized and smaller businesses in particular.
But the question was: particular guidance is going to be coming down as obligations for the defense sector, and is that going to inform what's done with the other sectors under Bill C-26? To the extent that it's applicable and it's sort of generalized, I think you should expect that. And there's a more general comment I would make. If there's guidance out there that's publicly available and easy to find, you should probably assume that it's going to outline the standard to which you're going to be held in the event of a breach, because what a lawsuit like that is going to look like, for the moment at least, is probably not breach of a statute. It's going to be negligence.
And as the lawyers in the room and at home will know, negligence is informed by: is there a duty of care? Is there a standard of care? What informs the standard of care, in part, will be what's industry standard and, I would say, what the government is telling you you should be doing and should know about. So if it's there at this very-- and they advertise now. So I think we all have to assume that you're going to be assumed to know about that stuff. So that's a more general comment.
So yeah. Spend some time getting to know the Canadian Centre for Cyber Security. And if you're an organization that sort of deals with your cybersecurity yourselves, consider joining the CCTX, which is the Canadian Cyber Threat Exchange. This is a membership organization where companies, and actually government as well, work with law enforcement, sharing threat information 24/7, so that you benefit from the collective knowledge of what these other companies are seeing out in the field and what fixes they're applying. So those are two other things. And I think over time, we're going to see that membership in organizations like that may well become table stakes for showing a court that you've behaved responsibly.
I don't know if there are-- any other questions? And if there aren't, let me just say. This doesn't have to be the end of the conversation. You're going to get these slides, and at the end of those slides are our bios with our contact information. We love talking about this stuff. This is the fun part of our job. Speaking for myself anyway. So feel free to reach out at any time. All right. Thank you.
JONATHAN RIGBY: Thank you, Brent. On behalf of Rose and our speakers today, we really want to thank you all for being here. And for those that are online, we appreciate it. Slides will be sent out, and questions can be sent through and answered by the specialists. We really appreciate it all. Thanks again, and we hope you have a great day.
In the digital realm, Canadian businesses face emerging threats from cyber criminals and hostile foreign governments. As the cyber security landscape evolves, new challenges arise that may affect the security and privacy of your organization.
This on-demand webinar discusses the latest cyber threats and regulatory developments affecting businesses like yours. Gain valuable insights and learn practical strategies to protect your business in the cyber world.
NOT LEGAL ADVICE. Information made available on this website in any form is for information purposes only. It is not, and should not be taken as, legal advice. You should not rely on, or take or fail to take any action based upon this information. Never disregard professional legal advice or delay in seeking legal advice because of something you have read on this website. Gowling WLG professionals will be pleased to discuss resolutions to specific legal concerns you may have.