
Italy’s New AI Law: A Boost for Healthcare Research?


Italy has recently enacted its own “Artificial Intelligence Act”, set to take effect on October 10, 2025.

You might be wondering: Did we really need another layer of AI regulation? That was our initial reaction, too. But a closer look reveals that the Italian AI Law introduces several interesting provisions, especially in the healthcare sector, that could facilitate research for both public and private entities. Here are some highlights:

1. Healthcare Data Processing Based on Public Interest

The law explicitly recognizes that the processing of health-related personal data by:

  • Public or private non-profit entities,
  • Research hospitals (IRCCS),
  • Private entities collaborating with the above for healthcare research,

is of “substantial public interest.” This significantly expands the scope of Article 9(2)(g) of the GDPR, offering a clearer legal basis for processing sensitive data in research contexts.

2. Secondary Use of Data

The law introduces a simplified regime for the secondary use of personal data without direct identifiers. In particular:

  • No new consent required, as long as data subjects are informed (even via a website).
  • Automatic authorization unless blocked by the Data Protection Authority within 30 days of notification.

This provision applies only to the entities mentioned above, so its scope is limited; nonetheless, it significantly strengthens the framework for non-profit research projects.

3. Freedom to Anonymize, Pseudonymize and Synthesize

Under Article 8(4) of the AI Law, processing data for the purposes of anonymization, pseudonymization, or synthetic data generation is always permitted, provided the data subject is informed. This is a major step forward in enabling privacy-preserving AI research.
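For illustration only (the law and the forthcoming AGENAS guidelines do not prescribe any particular technique): pseudonymization in the GDPR sense means replacing direct identifiers so that records can no longer be attributed to a person without additional information kept separately. A minimal sketch in Python, using a keyed HMAC so that linkage across datasets is preserved while re-identification requires the separately stored key:

```python
import hashlib
import hmac

def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (a pseudonym).

    Unlike a plain hash, a keyed HMAC cannot be reversed or re-computed
    by anyone who does not hold the secret key, which must be stored
    separately from the pseudonymized dataset.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The same patient ID always maps to the same pseudonym, so records
# can still be linked across datasets for research purposes.
key = b"keep-this-key-separate-from-the-data"
p1 = pseudonymize("patient-12345", key)
p2 = pseudonymize("patient-12345", key)
assert p1 == p2                                       # deterministic: linkage preserved
assert p1 != pseudonymize("patient-12345", b"other")  # key-dependent: no re-identification without the key
```

Note that pseudonymized data remain personal data under the GDPR; only properly anonymized data fall outside its scope.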

4. Guidelines and Governance

The law delegates the creation of technical guidelines to:

  • AGENAS – for anonymization and synthetic data generation.
  • Ministry of Health – for processing health data in research, including AI applications.

It also establishes a national AI platform at AGENAS, which will act as the data controller for personal data collected and generated within the platform.


Final Thoughts

While the GDPR aimed to support research, its implementation often created legal uncertainty and operational hurdles. Italy’s AI Law appears to address some of these gaps, offering a more pragmatic and enabling framework for healthcare research.

Happy GDPR-compliant Xmas and a prosperous new year!

Winter recess is about to start. While we will all be recharging our batteries to tackle the challenges of the upcoming 2025, the GDPR will not go on vacation, and will thus never be out-of-office!

Check out the following tips that the Italian Data Protection Authority has recently issued in order to avoid threats to your privacy rights during the upcoming vacations:

  • Are you receiving plenty of virtual greetings and commercial offers? Be careful with them, even if they are apparently sent by a friend or relative: they may contain viruses or obscure links, or conceal phishing attempts. Not all presents may be welcome.
  • Have you taken good family pictures that you wish to share on your social networks? Don’t forget to ask for the consent of everyone depicted. Is your grandpa going to provide his consent as well?
  • Have you filmed your children’s Christmas pageant? Keep it to yourself! You would need the consent of everyone depicted before publishing (including parental consent in the case of minors).
  • Do you wish to download a Christmas-themed app on your smartphone? Choose carefully: check the publisher and the reviews. You may inadvertently be downloading the Grinch’s!
  • Are you going away on a trip? Don’t share too much information or too many pictures on social media about your time off, your house and your vehicles, as they may attract thieves. Only Santa Claus should be allowed to break in without your consent!
  • Are you connecting to your hotel’s or restaurant’s Wi-Fi? Ask the staff about its security: it may not be sufficiently protected.
  • Have you bought any “smart” presents for your little nephews? Check whether they collect personal data from their users; if so, make sure that the data cannot be used to harm them in any way.

Our own additional tips: rest, enjoy good food, spend time with your loved ones, and get ready for 2025! We wish you happy holidays and a healthy and successful new year.

Gitti and Partners Life Sciences Team

Processing Health Data: the Most Recent Amendment to Italian Privacy Code

The Italian “Privacy Code” (Legislative Decree No. 196/2003), which governs data protection in Italy together with the European GDPR, has recently been amended.

Law No. 56/2024, further implementing the National Recovery and Resilience Plan, amended section 110 of the Privacy Code, which deals with the processing of health-related data for the purposes of medical, biomedical or epidemiological scientific research.

Section 110 provides that consent of the data subject for the processing of health-related data for the purpose of medical, biomedical or epidemiological scientific research is not required when:

  • the research is carried out on the basis of legal provisions or European Union law, when processing is necessary for scientific research or statistical purposes, provided that an impact assessment is carried out pursuant to sections 35 and 36 of the GDPR; or
  • informing the data subject is impossible or involves a disproportionate effort, or would render impossible or seriously jeopardise the attainment of the purposes of the research.

In such cases – before the latest amendment – the data controller had to:

1) take appropriate measures to protect the rights, freedoms and interests of the data subject;

2) obtain a favorable opinion of the competent ethics committee; and

3) consult the Italian Data Protection Authority prior to processing.

The obligation to consult the Italian Data Protection Authority has now been repealed. Thus, there is no need to apply for the Authority’s clearance prior to processing health-related data (in those cases where consent of the data subject is not required under section 110 of the Privacy Code). 

This amendment may have a significant impact especially on retrospective studies for which informing data subjects is particularly burdensome. The data controller will, in fact, be able to proceed without the Authority’s permission. Nonetheless, the data controller will still have to comply with specific guarantees and ethical rules issued by the Authority – as specified by the amended section 110.

On the one hand, the amended section 110 seems to favor accountability and to soften the procedural requirements for processing health data for research purposes, making the overall procedure quicker. When it comes to the “secondary use” of health data, this accountability-based approach should be considered strong enough to protect the data and should be welcomed, as it moves in the same direction as the European Health Data Space, which aims to provide a reliable and efficient system for the re-use of health data in areas such as research and innovation.

On the other hand, though, the Italian Data Protection Authority has already issued some interim guarantees, specifying that data controllers – when processing health data relating to deceased or non-contactable subjects – must carry out and publish an impact assessment pursuant to section 35 of the GDPR and notify it to the Authority. It remains to be seen how the amendment will be handled by the Authority in practice: the effects of the simplification provided by the new version of section 110 may be diminished if the guarantees set forth by the Authority generate equally burdensome procedures.

Processing of personal and health data through apps and online platforms aimed at connecting HCPs and patients: the new digest of the Italian DPA

In March 2024, the Italian Data Protection Authority (“Italian DPA”) issued a new digest (“Digest”) on the processing of personal data, including health data pursuant to section 9 of the GDPR, carried out through platforms, accessible via apps or web pages (“Platforms”), that aim to facilitate the connection between healthcare professionals (“HCPs”) and patients.

The use of such Platforms poses high risks to the protection and security of patients’ personal data, and in particular health-related data, given that the latter are subject to an enhanced protection regime set forth by section 9 of the GDPR. 

The Digest summarizes the applicable data protection rules and defines the roles of the parties, as well as the legal bases, applicable to (i) the processing of users’ personal data by Platform owners; (ii) the processing of HCPs’ personal data by Platform owners; and (iii) the processing of patients’ health data by Platform owners and by HCPs.

Additional guidance is provided as to:

  • The need for the Platform owner to carry out (and periodically update) a data protection impact assessment (DPIA) pursuant to section 35 GDPR, since the use of Platforms entails “high risk” processing of personal data: this kind of processing meets the criteria issued by the European Data Protection Board for identifying processing operations subject to a mandatory DPIA;
  • Which information notices should be provided, by whom and to whom, as well as the contents that such notices should have in each case, according to sections 13 and 14 GDPR;
  • The specific rules applicable to cross-border data transfers and transfers to third countries.

Lastly, the Digest includes a list of the most common technical and organizational measures adopted by data controllers to meet the GDPR requirements, such as encryption, verification of the qualifications of the HCPs seeking to enroll on the Platform, strengthened authentication systems, and monitoring systems aimed at preventing unauthorized access or loss of data.
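Purely by way of illustration (the Digest does not prescribe any specific implementation), “strengthened authentication” typically starts with storing salted, deliberately slow password hashes rather than plain-text credentials. A minimal sketch using only the Python standard library:

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative work factor; tune to your hardware

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted, slow hash suitable for storing credentials."""
    salt = os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong password", salt, digest)
```

In practice this would be one layer among those the Digest lists; multi-factor authentication and access monitoring would sit on top of it.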

The Digest should be welcomed by Platform owners, as it provides a reliable and complete legal framework that can be followed to set up a Platform in a way that complies with GDPR principles.

AI and Healthcare: Recommendations by the Italian Data Protection Authority

The use of Artificial Intelligence in healthcare continues to grow, with the market poised to reach 188 billion by 2030. It also raises many concerns.

The Italian data protection authority (Garante) has recently issued recommendations based on 10 points, which can be found here.

The Garante particularly insists on:

  1. Human in the loop: a human being must be involved in the control, validation or modification of the automated decision;
  2. No algorithmic discrimination: trustworthy AI systems should reduce mistakes and avoid discrimination caused by inaccurate processing of health data;
  3. Data quality: health data must be correct and up to date, and the representation of data subjects must correctly reflect the population;
  4. Transparency: the data subject must be able to know that decisions are based on automated processing and must receive information on the logic adopted so as to be able to understand it (easier said than done!). The Garante also requires that at least an excerpt of the Data Protection Impact Assessment be published.

Other recommendations are not surprising for anyone familiar with the GDPR:

  • Profiling and decisions based on automated processing must be expressly allowed by Member States’ laws.
  • The principles of privacy by design and privacy by default obviously play a big role in healthcare AI systems.
  • Roles of controller and processor must be correctly allocated: in particular, the public administration must ensure that external entities processing data are appointed as data processors.
  • A Data Protection Impact Assessment must be carried out and any risks must be evaluated.
  • Integrity, security and confidentiality of data must be ensured.

Striving for genuine transparency in connection with very complex and rapidly evolving algorithms is not going to be an easy task for data controllers. Similarly, understanding how AI works in a healthcare setting is not going to be simple for patients.

Garante vs. ChatGPT: Latest Developments

1. An Order to Stop ChatGPT

On March 30, 2023 the Italian Data Protection Authority (“Garante”) issued an order by which it temporarily banned the ChatGPT platform (“ChatGPT”) operated by OpenAI LLC (“OpenAI”). The Garante in fact regards ChatGPT as infringing Articles 5, 6, 8, 13 and 25 of the GDPR. In particular:

  • No Information.  OpenAI does not provide any information to users, whose data is collected by OpenAI and processed via ChatGPT;
  • No Legal Basis.  There is no appropriate legal basis in relation to the collection of personal data and their processing for the purpose of training the algorithms underlying the operation of ChatGPT;
  • No Check of Users’ Age.  OpenAI does not provide for any verification of users’ age in relation to the ChatGPT service, nor any filter preventing use by users aged under 13.

On this basis, the Garante immediately banned the use of ChatGPT, and OpenAI blocked access to ChatGPT from Italy.

2. Measures Offered by OpenAI

On April 11, 2023, in light of the willingness expressed by OpenAI to put in place measures to protect the rights and freedoms of ChatGPT users, the Garante issued a new order, which opened the possibility of re-assessing ChatGPT if OpenAI adopts the following measures:

  1. to draft and publish an information notice to data subjects, which should be linked so that it can be read before the registration;
  2. to make available, at least to data subjects connecting from Italy, a tool to exercise their rights to (i) object to the processing, (ii) obtain rectification, insofar as such data have been obtained from third parties, or (iii) obtain the erasure of their personal data;
  3. to change the legal basis of the processing of users’ personal data for the purpose of algorithmic training, by removing any reference to contract and instead relying on consent or legitimate interest;
  4. to include a request to all users connecting from Italy to go through an “age gate” and to submit a plan for the deployment of age verification tools; and
  5. to promote a non-marketing-oriented information campaign by May 15, 2023 on all the main Italian mass media, the content of which shall be agreed upon with the Italian Authority.

OpenAI has until April 30, 2023 to comply (and until May 31, 2023 to prepare a plan for age verification tools). The objections by the Garante have been echoed by other European Union data protection authorities. The European Data Protection Board will attempt to resolve the dispute within two months and has launched a dedicated task force on ChatGPT “to exchange information on possible enforcement actions conducted by data protection authorities”.

Italian Transparency Act: the Opinion of the Italian Data Protection Authority

The Italian Data Protection Authority has issued its opinion on the data protection implications of the new information duties imposed on employers by legislative decree 104/2022.

On August 13, 2022, legislative decree 104/2022 (“Transparency Act”) entered into force. It provides for a new set of mandatory information that the employer must communicate to its employees at the time of onboarding. On January 24, 2023, the Italian Data Protection Authority (“Garante”) issued its opinion on the compliance of these new information duties with the relevant data protection legislation.

In particular, the Garante focused on the mandatory communication that, according to section 4, paragraph 8 of the Transparency Act, the employer must give to the employees if any “decision or monitoring automated system is used for the sake of providing information which is relevant for the hiring, management or termination of the employment relationship, for the assignment of tasks and duties, or for the surveillance, evaluation and fulfillment of contractual duties by the employee”. The Garante has stated that:

  • GDPR Sanctions Apply in case of Breach.  The implementation of any decision or monitoring automated system must be made in compliance and within the limits set forth by the applicable labor law provisions, and in particular law 300/1970. Such labor law provisions, which allow the implementation of automated systems only if certain conditions occur, must be deemed as providing “more specific rules to ensure the protection of the rights and freedoms in respect of the processing of employees’ personal data in the employment context” (as per section 88, paragraph 2, of the GDPR), and thus non-compliance with them may lead to administrative fines pursuant to section 83 of the GDPR.
  • Data Protection Impact Assessment (“DPIA”).  The employer, who is subject to the duty of accountability, must assess beforehand whether the relevant processing is likely to result “in a high risk to the rights and freedoms of natural persons”, and thus requires a preliminary data protection impact assessment under section 35 of the GDPR. In this regard, the Garante has clarified that the data subjects (i.e., employees) should be deemed “vulnerable”, and that the processing of their data with automated systems is very likely to meet the conditions that make the DPIA mandatory according to the guidelines on the DPIA issued by the Article 29 Working Party on April 4, 2017.
  • Compliance with the “privacy by default” and “privacy by design” principles.  Employers must implement appropriate technical and organizational measures and integrate the necessary safeguards into the processing so as to protect the rights of data subjects (privacy by design). Moreover, the controller must ensure that, by default, only personal data necessary for the specific purpose of the processing are processed (privacy by default), and should therefore refrain from collecting personal data not strictly related to that purpose.
  • Update of the record of processing activities (“ROPA”).  The employer must record the processing of data through automated systems in its ROPA.

Need any further assistance on the matter? Don’t hesitate to reach out to us!

Data Protection Day 2021: What You May Have Missed (while busy celebrating Data Protection Day)

This year’s celebrations for Data Protection Day may have been a bit toned down. But you may still have been so busy celebrating that you missed a couple of news items from the (data privacy) world.

First, the EDPB’s Guidelines 01/2021 on Examples regarding Data Breach Notification are out and open for comments until March 2nd. The document can be used as a very practical guide for anyone involved in data processing activities. It is aimed at helping data controllers decide how to handle data breaches and what factors to consider during risk assessment. The Guidelines reflect the experience of the European supervisory authorities since the GDPR became applicable, and they are full of cases and examples which make them, admittedly, a practice-oriented, case-based guide for controllers and processors. So, are you curious to know what to do in case of a ransomware attack with backup but without exfiltration in a hospital? Or in case of a credential stuffing attack on a banking website? Or are you “just” trying to figure out what to do in case of mistakes in postal mail? Then check out the guidelines!

Meanwhile, in Italy, the Italian Data Protection Authority gave its favourable opinion on the proposed reform of the Italian Registro Pubblico delle Opposizioni, a service designed to protect data subjects whose telephone numbers are publicly available but who do not wish to receive unsolicited direct marketing calls. The Italian Data Protection Authority nevertheless specified that the service, essentially based on a list of express dissents, only applies to marketing activities carried out by human operators and cannot be extended to automated calls. In doing so, the Authority confirmed that marketing activities carried out through automated systems must be subject to stricter measures and always require express consent, given their highly invasive nature. So: Humans 1, Automated Calling Machines 0.

Italy’s First Multi-Million GDPR Sanctions

Before last week, the Italian Data Protection Authority (“DPA”) had applied only one (modest) GDPR sanction, which placed Italy at the bottom of the list of EU countries by number and value of GDPR sanctions.

In addition to the great differences in numbers and figures – for example, compared with the soon-to-leave UK (sanction amounts in Euro: Italy 30k vs. UK 315mln+) or Spain (number of sanctions: Italy 1 vs. Spain 43) – it is interesting to note that, until last Friday, the most active European DPAs (UK, France, Germany, Spain) tended to target big players in the private sector (e.g. British Airways, Marriott International, Google), as opposed to Italy’s focus on websites affiliated with a political party and run through the platform named Rousseau.

Last Friday, however, this scenario changed significantly. The Italian DPA issued a press release announcing two GDPR sanctions against Eni Gas e Luce, a fully-owned subsidiary of Italy’s State-controlled multinational oil and gas company, Eni S.p.A., for Euro 8.5 million and Euro 3 million.

The first sanction of Euro 8.5 million was imposed for unlawful processing in connection with telemarketing and tele-selling activities. The inspections and inquiries had been carried out by the authorities in response to numerous alerts and complaints that followed GDPR D-Day.

Violations included: advertising calls made without consent or despite data subjects’ refusal, absence of technical and organisational measures to take into account the instructions provided by data subjects, excessive data retention periods, and acquisition of prospective customers’ personal data from third parties that had not obtained consent.

The second sanction of Euro 3 million relates to unsolicited contracts for the supply of electricity and gas. Many individuals complained that they had learned about their new contracts only upon receiving the termination letter from their previous supplier or the first electricity bill from Eni Gas e Luce. Complaints included alleged incorrect data and false signatures.

About 7,200 consumers were affected. The Italian DPA also underlined the role of third-party contractors, acting on behalf of Eni Gas e Luce, in perpetrating the violations.

Both decisions are quite significant: for the very first time, the Italian DPA has set out its indications and illustrated its approach to data processing and violations by large companies operating in the private sector within the GDPR regulatory framework.