The coming years will bring more data protection regulation, for example through NIS2 and the AI Act. Laying a solid foundation now is therefore crucial for how well your company can keep up. This article gives you an overview of the new regulations and our take on why risk assessments will form a solid foundation for future compliance.
Since 2018, many have loved to hate GDPR. However, we have learned enough to make peace with it. There is now sufficient legal practice and guidance to incorporate GDPR’s top 10 requirements into the compliance structure at a reasonable level. And the very structure of working with GDPR offers several advantages that you can apply when the many new provisions hit everything from entire sectors to specific types of companies in the coming years.
You should therefore consider GDPR the first part of a massive uplift in EU digital regulation, under which businesses must mature in the coming years to protect both individuals and society.
NIS2 to strengthen supply security and robustness across the EU
The Directive on Network and Information Security, NIS2, must be implemented in national law by 17 October 2024 and requires companies in 15 sectors to consider the societal risk of IT security breaches in their operations. All public administration is also covered.
The regulations aim to ensure a high level of supply security and stability in critical societal areas such as energy, food, and medicine throughout the EU.
Just over 1,000 companies in Denmark are expected to be directly affected by the regulations. In addition, suppliers to the affected organisations will be drawn in: they must be able to take part in the initial risk assessment and in the subsequent contingency planning when security incidents occur. Suppliers who cannot do this will fall behind competitively. Suppliers in particular can therefore benefit from understanding the NIS2 language and mindset at an early stage if they want to offer services, whether directly or as subcontractors.
The AI Act focuses on AI and the associated risks
The AI Act is a new set of regulations aimed at reducing the risks associated with the use of AI. The AI Act distinguishes between prohibited uses of AI and high-risk systems.
For high-risk systems, a risk assessment and a comprehensive risk management system must be in place. In the high-risk category, you will find assessments of, among other things, creditworthiness, social status, and health. These areas touch directly on individuals' personal and social circumstances, which is why compliance with the regulations must be demonstrable.
So why is GDPR a good starting point for future compliance requirements?
GDPR states that the processing of personal data requires the implementation of appropriate security measures.
For companies that have worked with data processing agreements and third-country transfers, it is a familiar challenge to determine when the documentation is sufficient to demonstrate that appropriate security has been implemented.
The solution is to prepare a risk assessment based on all the facts collected when a company concludes a data processing agreement.
So, if your company has worked with risk assessments under GDPR, you will have a great deal of structure to lean on when preparing risk assessments under the many new regulations that are coming.
If you have not yet translated the work with data processing agreements and security descriptions into actual risk assessments, now is a good time to start.
Your simple framework for a good risk assessment
There are various methods for conducting risk assessments. However, we have good experience with the classic 'consequence times probability' approach, which in broad terms involves answering the following questions:
- Which personal data are processed?
- Which security measures are applied?
- Which threats do we face, both in general and specifically? (The general cyber threat level has been very high for some time.)
- What are the consequences for the individual?
- What is the likelihood of the consequences occurring?
If your company collects the facts thoroughly, you can, with the right process, create a simple and clear framework for conducting risk assessments that also covers, for example, societal risk (NIS2) and AI high-risk systems (AI Act).
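To make the method concrete, here is a minimal sketch in Python of a 'consequence times probability' scoring. The 1–5 scales, the risk-level thresholds, and the example threats are illustrative assumptions of ours, not requirements from GDPR, NIS2, or the AI Act; adapt them to your own facts and risk appetite.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    """One row in a consequence-times-probability risk assessment."""
    threat: str        # the scenario being assessed
    consequence: int   # impact on the individual/society, 1 = negligible .. 5 = severe (assumed scale)
    probability: int   # likelihood of that consequence, 1 = rare .. 5 = almost certain (assumed scale)

    @property
    def score(self) -> int:
        # The classic method: risk = consequence x probability
        return self.consequence * self.probability

    @property
    def level(self) -> str:
        # Illustrative thresholds; set these according to your own risk appetite
        if self.score >= 15:
            return "high"
        if self.score >= 8:
            return "medium"
        return "low"

# Hypothetical example threats, for illustration only
risks = [
    Risk("Ransomware encrypts the customer database", consequence=5, probability=3),
    Risk("Misdirected e-mail containing health data", consequence=4, probability=2),
    Risk("Sub-processor outage delays deliveries", consequence=3, probability=3),
]

# Sort highest risk first so the overview reads top-down for management
for r in sorted(risks, key=lambda r: r.score, reverse=True):
    print(f"{r.level.upper():6} (score {r.score:2}) {r.threat}")
```

The value lies not in the code itself but in the discipline it enforces: every threat gets an explicit consequence and probability, and the resulting scores make the prioritisation transparent when you share the assessment with management, partners, or customers.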
A great advantage arises if you use risk assessments commercially, for example by sharing the results in a non-confidential version with partners and customers. You can also use risk assessments when reporting to the board or management and as part of the decision-making basis for the company's activities. These are two very good reasons for getting started with risk assessments.
Get help translating regulations into practical solutions
Has your company not yet translated the work with data processing agreements and security descriptions into actual risk assessments? Then contact us for a no-obligation conversation about how you can get risk assessments that are both accessible and commercially usable.