Blog
October 3, 2024

5 Questions with Legal Expert, Sarah Jarman

Sarah Jarman has more than ten years of legal experience. We caught up with her to ask five questions about the current compliance landscape and the most important legal issues facing organisations right now.

Sarah Jarman has more than ten years of legal experience across multiple industries. Currently VP of Legal Compliance and Group DPO at Emplifi, Sarah has previously held positions at Boeing and Rolls-Royce. She has a keen interest in how generative AI can impact privacy rights, and her knowledge of the global data protection landscape is second to none.

1) What are the biggest challenges organisations face right now in achieving and maintaining data privacy and compliance?

Clarity and cost are probably the most common. It’s not possible to read one act or regulation and achieve compliance; it’s an ecosystem, and it involves changing the way business has traditionally been done, along with the mentalities that go with it. It also involves knowing what data is where (often in historical systems), and how can you know what you don’t know?

The cost to achieve and then maintain compliance can range from hiring suitably qualified and experienced personnel who are specialists in their field, through to the technical cost of implementing the tools and the time it takes to pursue what may never be 100% compliance. Planning early and monitoring proactively can significantly help; however, it’s also important to feed practical challenges back to the regulators.

2) How is Generative AI impacting data privacy right now? How are compliance laws evolving to keep up?

It’s impacting it both positively and negatively. We see the negative in people who are really excited and pour all kinds of data in without realising the impact it may have, or in bad actors increasing their use of Gen AI, for example for vishing (voice phishing).

However, on the positive side, we have also seen some promising advancements in areas such as medicine and the reduction of online abuse. The great thing about privacy is that the data protection frameworks introduced over the last few years are offering a crutch for AI management while the regulations catch up, because there are so many similarities in the tools and skills required.

The laws are evolving, and there are many greater minds than mine working on them, with incredible philosophical debates behind them! However, “laws” is very much plural, because the areas to be covered are so wide: from the debate over whether liability should be fault-based or strict, to how a new piece of legislation will interact with pre-existing, partially overlapping areas such as discrimination or copyright.

There are (expectedly) differing approaches geographically too. For example, the EU is taking a risk-based approach to AI whereas China is taking a rights-based approach. It’s good that there is debate, and indeed that debate should be continuous. That said, I cannot say how excited I get when there is a consensus approach that avoids significant differences between jurisdictions, because for organisations just trying to go about their business, it often means they can do just that!

Recent examples are the APRA in the US, the second attempt at such a federal approach (unveiled on a Sunday, so maybe they share my excitement), and Bill C-27 in Canada. These are hopefully symptoms of the realisation that a certain degree of harmony is needed, so that data subjects can be protected and businesses can operate within the legal, or even social, remit - the alternative being a scattergun approach that hinders business and leaves gaps for poorly informed subjects - rather inconvenient.

3) We talk a lot about building a Human Firewall - a security conscious workforce that can help minimise risks to data security. Would you consider compliance to be everyone’s responsibility?

Yes, but to varying degrees. Everyone is the first line of defence because, according to the 2023 FBI IC3 report (which reflects a very typical landscape in other jurisdictions as well), phishing/social engineering is still the leading cause of security breaches.

However, while that therefore means some governance and training is required to achieve that first line, it doesn’t mean everyone needs to be a covert cybersecurity hero by night with a matching cape. It does mean they need to know what to look out for and how to deal with it, including who to contact. The hero is always welcome though.

4) What are the critical considerations for third-party data sharing and vendor management to ensure compliance with data privacy laws? How can organisations effectively assess the privacy practices of their third-party partners?

The exact considerations, and therefore the assessments, can differ dramatically depending on the amount and type of data, coupled with the challenge of how you can truly know that an organisation is doing what it says it will do.

There will therefore be an element of trust at some point, so it’s important to start with some general due diligence first.

A transfer impact assessment and a data protection agreement are very useful assessment tools (regardless of any legislative or regulatory need for them), because you look at where the data is going, how it’s stored, who has access, for what purpose, how long it needs to be kept, the type of data (which informs what rights may attach to it), and so on, and match that to the organisation you are screening.

To make this ‘effective’ in practice rather than merely declaratory, you can reserve a right of audit; however, looking for ISO certifications or similar independent third-party verifications can save both parties time and effort. If that is the choice, though, a clause obliging the vendor to maintain the certification, with notification of any non-conformances, is ideal. It then needs to be monitored regularly, according to what is appropriate in that situation.

Ultimately, any risk identified that cannot be mitigated through the above should be entered transparently into the centralised risk register, so that the business can make an informed decision and factor it into business continuity plans, incident response plans, liability to third parties, or any other area that could be affected if the risk crystallised.

In order to do the above, the governance/legal/privacy, procurement and IT/security functions need to be joined up and suitably resourced, so that the organisation has a clear view before the data sharing takes place.

5) With the constant evolution of technology and data practices, how do you stay up-to-date with emerging trends and changes in data privacy laws?

I will quite happily sit down to read about emerging trends, even on a Sunday, and when driving I’ll tune into what is coming down the line! I do that through a plethora of sources, and realise I probably need a word with myself.

However, there aren’t really surprises when something becomes law, because although different jurisdictions have different processes, you can see what is coming in the form of bills, white papers, jurisprudence, etc., and often the rationales behind them, which I find very useful for understanding the aforementioned ecosystem as a whole.

One of the most effective approaches, I find, is to know the key players in the industry and their aims; for example, in the UK, Lord Holmes is doing some amazing work in AI, as is Elizabeth Denham, who focuses on children’s privacy. It definitely requires a passion to stay up to date!
