Looking ahead: Macro events that will impact the data privacy field in 2021



Written by Sabrina Palme, CEO at Palqee Technologies



Adapting the PESTEL framework, a tool commonly used in business strategy, we consider a range of macro events and how they may influence the world of data privacy in the foreseeable future.



2020 has been eventful, to say the least. In a year that has brought a lot of change for many of us, the privacy profession has not been spared. Quite the opposite. As whole countries went into working-from-home mode and governments issued new responses to COVID-19 in the form of restrictions and health and safety measures on a daily basis, companies and data privacy professionals faced a range of challenges. From preserving privacy rights when working from home while keeping productivity up, to rapidly implementing new technologies and policies to continue business operations, privacy professionals had to act fast.


But it has not all been COVID-19. Other events such as the Black Lives Matter (BLM) movement, Schrems II and Brexit have stirred up the world of data privacy just as much, and it seems there won't be a moment to catch a breath any time soon.


The last year has shown us that data privacy touches every aspect and every layer of society. Working in the field of data privacy management means one has to be extremely versatile in order to manage fast-changing environments.


However, merely reacting to events is usually not the most efficient, cost-effective or sustainable route. A business that is prepared for different eventualities will be able to navigate the ship more safely through stormy times, and this is also true for data privacy. While it is not possible to prepare for all eventualities (who can tell when and how a pandemic will break out?), there are hints and signs we can observe and analyse to help us prepare and be ready when it comes down to it.


Looking over to our neighbours in the business strategy department, they are no strangers to this approach. They know that thorough analysis and monitoring of environmental factors that may have a profound impact on a business can help in formulating a company's strategy.


Privacy professionals are in a similar position. We have now reached a stage where it is no longer merely about achieving compliance with data privacy regulations. Companies are realizing the potential that lies within personal data management and how much macro-environmental factors can influence their operations.


Commonly used in business strategy, a PESTEL analysis helps to analyse and monitor these macro-environmental factors. The letters stand for Political, Economic, Social, Technological, Environmental and Legal. Some organizations have adapted the tool to better fit their needs, adding other dimensions such as Ethical.


Below we have taken the PESTEL framework and applied it to the field of data privacy to get an overview of how different factors may influence and impact data privacy, with the goal of helping us prepare. We also replaced Environmental factors with Ethical factors. While environmental events such as a pandemic can happen, they're hardly predictable, and an agenda for dealing with disastrous events should be part of a company's general emergency plan.


The assessment below is not exhaustive. Rather, it highlights some of the main expected developments in the medium term (6+ months) that are likely to have an impact on the work of privacy professionals across industries. If you have insights or input you think are missing, please share them with us to make the analysis more thorough.


A lot of the factors fit into more than one segment. We placed them where we see they carry the greatest weight. The factors within each category aren't sorted in any particular order. Further, this analysis focuses mainly on factors that can impact the field of data privacy in the EU and the United Kingdom.


PESTEL Analysis for the data privacy field


POLITICAL FACTORS


What? The European Commission unveiled new rules for data governance, including a 'data altruism' clause under which individuals and companies can consent to sharing the data they generate for the common good. The aim is to build an open yet sovereign single market for data and unlock the potential of big data for emerging technology. You can read the Q&A on the data governance regulation here.


Why we care: This is an exciting topic, as it shows an attempt to foster innovation in emerging technology while respecting the right to data privacy. Companies that wish to participate should start thinking about how they'll engage with their customers so those customers can make an informed decision on sharing their (anonymised) data. Further, they should assess whether the company already has the capabilities and tools in place that allow it to manage this sort of data sharing, something most companies will probably need time to prepare for.


What? Biometric tracking of COVID-19 vaccination could leave privacy measures by the wayside, and the data could be used by governments for mass surveillance or by private companies for targeted advertising.


Why we care: While EU citizens are protected from this scenario to a certain extent thanks to the GDPR, other governments may act differently, including (possibly) the United Kingdom, which is known for mass surveillance of its citizens in the name of safety. Looking outside the EU, we see countries such as China that have implemented mandatory tracking tools, and even though the pandemic there has seemed to be under control since March, the application is still mandatory and in use. Philosophers and thought leaders such as Yuval Noah Harari warned early on about continued mass surveillance in a post-pandemic world (read the article in Time magazine here), but the true implications for data privacy are yet to be seen.


SOCIAL FACTORS


What? Awareness of data privacy continues to rise, likely leading to customers who are more demanding about how companies engage with them regarding the management of their data.


Why we care: As a result of raised consumer awareness around data privacy, businesses will have to combine purpose with profits and build trust-based relationships in order to stay competitive. It is also expected that companies that cater to the data privacy concerns of their customers will see higher customer retention rates.


TECHNOLOGICAL FACTORS


What? Challenges surrounding biased algorithms are unlikely to be resolved in 2021.


Why we care: This falls under a broader issue: ensuring data subjects don't fall victim to systematic discrimination through automated processing activities. While biased algorithms continue to be a big issue specifically for AI and machine learning innovation, discrimination through automated processing is already happening at large, especially in the financial and insurance sectors. As we know, the GDPR gives automated processing activities special attention and VIP treatment, which tightly links data privacy to the issue of biased algorithms. With the Black Lives Matter movement having brought a lot of attention to the issue, companies are well advised to revisit all of their automated processing activities, analyse if and how these could treat certain people unfairly, and fix what they find. Businesses that fail to do so are likely to lose their appeal in the market.


What? Emerging technologies such as Artificial Intelligence and Blockchain will continue to mature, with special focus and support from the EU for innovation in the Industrial Internet of Things (IIoT), Future Mobility and Smart Health.


Why we care: Industries that have historically been more traditional are expected to be disrupted quite heavily. As a privacy professional, keeping an eye on the different technologies out there and observing their stance on data privacy can help when the board needs to make informed decisions about adopting new technologies to stay with (or ahead of) the curve.


LEGAL FACTORS


What? The CJEU's Schrems II and Privacy International rulings indicate that legal disputes between privacy advocates and companies will continue to have an impact on the data privacy sector as a whole.


Why we care: This year has taught governments and businesses that they should not be under the illusion that once a decision under the GDPR has been made, it will stay that way forever. A very active community of data privacy ambassadors continues to push for their rights under the GDPR. And rightly so! For companies, however, that means the easiest route doesn't always work out in the long run. The UK thought it was almost a given that it would achieve adequacy status post-Brexit, but this is now up in the air after the latest CJEU ruling on government surveillance. The Schrems II ruling invalidated the Privacy Shield framework, which businesses used for transatlantic data transfers to and from the US. This goes to show that in data privacy, going the extra mile initially, e.g. when signing that first contract with a vendor, creating a first ROPA and implementing privacy-by-design principles, is likely to pay off later.


What? Following the Schrems II ruling, the European Commission proposed new Standard Contractual Clauses (SCCs) for international data transfers, including specific provisions dealing with government access requests. Once the European Commission adopts the final version, there will be a transitional period of one year during which companies can still rely on the existing SCCs for the performance of contracts.


Why we care: This gives companies that have used the existing SCCs in any of their contracts time to revisit and prioritise them one by one. It is also important to double-check that no agreement has left out the SCCs where they should have been implemented. The transition period is a known quantity companies can prepare for.


What? Regulators continue to struggle with clarity on how to regulate emerging technology. In the EU, varying rules and legislation are being implemented at country level, meaning emerging technology companies need to observe legislation across jurisdictions if they want to operate in the EU as a whole.


Why we care: This point is somewhat connected to the technological factor of AI innovation. Companies that operate in several EU countries and are thinking about adopting or implementing emerging technology in their business need to consider which legislation applies aside from the GDPR and how it may differ from country to country. Further, some emerging tech innovations may not be accessible in the EU until proper legislation has been implemented.


ECONOMIC FACTORS


What? Small and medium-sized businesses have been on the losing end of GDPR compliance, as they're particularly affected by the costs of complying with the regulation.


Why we care: This is both an opportunity for privacy professionals to tailor their services to the needs of SMEs and a risk for SMEs as data protection authorities turn their attention increasingly towards them. SMEs that are non-compliant won't be treated differently by the authorities, and it's expected that we'll see more fines handed out to them in 2021.


What? Germany announced the drafting of a law that will permit driverless vehicles in regular operation, which would make it the first country in the world to have driverless cars on its streets. The law is intended to take effect in 2022.


Why we care: The draft may offer insight and inspiration into how Smart Cities in general could be regulated from a data privacy and security perspective. Companies working in or connected to this field will get a first glimpse of the standards they'll need to fulfil in order to sell their products and services in the EU.


What? The EU plans to have fully deployed 5G networks after 2021.


Why we care: Implementation of 5G networks will bring both good and bad for data privacy. 5G enables the use of fully anonymised authentication techniques to secure end-to-end communication, greatly reducing the risk of hacking. On the other hand, it will also enable large-scale data collection, from hyper-accurate location information to health data, and the direct connection of billions of IoT-enabled devices through massive machine-to-machine communication. Companies should start to consider how they will adapt their processes and policies to this new network.


ETHICAL FACTORS


What? The role of data privacy and security professionals will continue to evolve into data ethics.


Why we care: Assessing new vendors, processes and solutions is getting more complex.

Agreeing on what personal data is processed and what security measures are in place isn't enough anymore. Other aspects such as accessibility, diversity, neutrality and equality need to be assessed as well. As data ethics gains momentum as a basis for sound business decisions, fulfilling "legal compliance" won't be enough. Companies will have to go beyond regulatory requirements, demonstrating integrity and ethical behaviour and promoting the right company culture. Having an ethics specialist or even an ethics board is still an exotic exception for most companies, and skilled people are rare, too. The "next best thing" companies can turn to are privacy and security professionals, adding yet another skill to the profession's qualification requirements. Privacy and security professionals who can offer services not just in data privacy but also in ethics are likely to have a competitive edge over others.



