Tag Archives: GDPR

Italy’s New AI Law: A Boost for Healthcare Research?


Italy has recently enacted its own “Artificial Intelligence Act”, set to take effect on October 10, 2025.

You might be wondering: Did we really need another layer of AI regulation? That was our initial reaction, too. But a closer look reveals that the Italian AI Law introduces several interesting provisions, especially in the healthcare sector, that could facilitate research for both public and private entities. Here are some highlights:

1. Healthcare Data Processing Based on Public Interest

The law explicitly recognizes that the processing of health-related personal data by:

  • Public or private non-profit entities,
  • Research hospitals (IRCCS),
  • Private entities collaborating with the above for healthcare research,

is of “substantial public interest.” This significantly expands the scope of Article 9(2)(g) of the GDPR, offering a clearer legal basis for processing sensitive data in research contexts.

2. Secondary Use of Data

The law introduces a simplified regime for the secondary use of personal data without direct identifiers. In particular:

  • No new consent required, as long as data subjects are informed (even via a website).
  • Automatic authorization unless blocked by the Data Protection Authority within 30 days of notification.

This provision applies only to the entities mentioned above, so its scope is limited; it nonetheless significantly strengthens the framework for nonprofit research projects.

3. Freedom to Anonymize, Pseudonymize and Synthesize

Under Article 8(4) of the AI Law, processing data for the purposes of anonymization, pseudonymization, or synthetic data generation is always permitted, provided the data subject is informed. This is a major step forward in enabling privacy-preserving AI research.
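
In practice, pseudonymization typically means replacing direct identifiers with surrogates before the data reach the research team, while keeping the re-identification key under separate, secure control. The snippet below is a minimal illustrative sketch, not a technique prescribed by the AI Law or by AGENAS guidelines; the key handling, field names and record structure are hypothetical.

```python
# Minimal, illustrative sketch of pseudonymization via keyed hashing (HMAC-SHA256).
# The key, record structure and field names are hypothetical examples, not a
# method mandated by the Italian AI Law or by AGENAS guidelines.
import hmac
import hashlib
import secrets

# In practice the key would live in a secure key-management system, kept
# separate from the pseudonymized dataset (this separation is what keeps the
# data "pseudonymized" rather than anonymized).
PSEUDONYMIZATION_KEY = secrets.token_bytes(32)

def pseudonymize(identifier: str, key: bytes = PSEUDONYMIZATION_KEY) -> str:
    """Replace a direct identifier with a stable keyed hash."""
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"fiscal_code": "RSSMRA80A01H501U", "diagnosis": "E11.9"}  # hypothetical
research_record = {
    "subject_id": pseudonymize(record["fiscal_code"]),  # surrogate identifier
    "diagnosis": record["diagnosis"],                    # clinical data kept for research
}
print(research_record)
```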

4. Guidelines and Governance

The law delegates the creation of technical guidelines to:

  • AGENAS – for anonymization and synthetic data generation.
  • Ministry of Health – for processing health data in research, including AI applications.

It also establishes a national AI platform at AGENAS, which will act as the data controller for personal data collected and generated within the platform.


Final Thoughts

While the GDPR aimed to support research, its implementation often created legal uncertainty and operational hurdles. Italy’s AI Law appears to address some of these gaps, offering a more pragmatic and enabling framework for healthcare research.

Happy GDPR-compliant Xmas and a prosperous new year!

Winter recess is about to start. While we will all be recharging our batteries for the challenges of 2025, the GDPR will not go on vacation: it is never out-of-office!

Check out the following tips, recently issued by the Italian Data Protection Authority, to help you protect your privacy rights during the upcoming holidays:

  • Are you receiving plenty of virtual greetings and commercial offers? Be careful with them, even if they appear to be sent by a friend or relative: they may contain viruses, suspicious links or phishing attempts. Not all presents are welcome.
  • Have you taken nice family pictures that you wish to share on your social networks? Don’t forget to ask for the consent of everyone depicted. Is your grandpa going to give his consent as well?
  • Have you filmed your children’s Christmas pageant? Keep it to yourself! You would need consent from everyone depicted before publishing it (including from their parents in the case of minors).
  • Do you wish to download a Christmas-themed app on your smartphone? Choose carefully: check the developer and the reviews. You may inadvertently be downloading the Grinch’s app!
  • Are you going away on a trip? Don’t share too much information or too many pictures on social media about your time off, your house or your vehicles, as this may attract thieves. Only Santa Claus should be allowed to break in without your consent!
  • Are you connecting to your hotel’s or restaurant’s Wi-Fi? Ask the staff about its security: it may not be protected well enough.
  • Have you bought any “smart” presents for your little nephews and nieces? Check whether they collect any personal data from their users. If so, make sure the devices cannot harm them in any way.

Our own additional tips: rest, enjoy good food, spend time with your loved ones, and get ready for 2025! We wish you happy holidays and a healthy and successful new year.

Gitti and Partners Life Sciences Team

New Guidelines on Web Scraping

Pursuant to Article 57(1)(b) of the GDPR, on May 20, 2024 the Italian Data Protection Authority (“Italian DPA”) adopted guidelines [LINK] on web scraping, with the aim of providing guidance to operators of websites and online platforms, acting in Italy as data controllers of personal data made available online to the public.

Web scraping is defined by the Italian DPA as the massive collection of personal data from the web for the purpose of training generative artificial intelligence models. Whenever this phenomenon involves the collection of information traceable to an identified or identifiable natural person, a data protection issue arises concerning the identification of an appropriate legal basis for the processing of such data.

According to the guidelines, the lawfulness of web scraping must be assessed on a case-by-case basis. Personal data are made available on the web as a result of a primary-level processing by operators of online platforms acting as data controllers. Only then may third parties – often web robots or “bots” – gather such data for different purposes while scraping the web. This is why the Italian DPA addresses its guidelines to operators of online platforms: they are, in fact, in the best position (i) to evaluate how data are used after being scraped from their platforms and (ii) to implement measures on their platforms that may prevent or mitigate web scraping carried out for the purpose of training algorithms.

Possible precautions or enforcement actions identified by the Italian DPA are the following:

  • Creation of restricted areas, which can only be accessed after registration. In this way, certain personal data would be removed from public availability;
  • Inclusion of ad hoc clauses in the terms of service of the online platform expressly prohibiting the use of web scraping techniques;
  • Monitoring network traffic to detect any abnormal flow of data and adopting limits as countermeasures;
  • Direct intervention on bots (e.g. inserting CAPTCHA checks on websites, or monitoring log files to block undesirable users).

Such measures should be adopted by the data controller after an independent assessment – in compliance with the accountability principle, which increasingly appears to govern new data protection legislation and strategies. At any rate, the Italian DPA acknowledges that, albeit useful, none of these measures can be expected to entirely prevent web scraping from happening.  
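
By way of illustration only, the traffic-monitoring and log-file measures mentioned above can start from something as simple as flagging clients whose request rate looks abnormal. The sketch below is a hypothetical example: the common log format, the 60-second window and the 100-request threshold are our own assumptions, not parameters indicated by the Italian DPA.

```python
# Illustrative only: flag IPs whose request rate in a web-server access log
# exceeds a threshold, as a crude signal of possible scraping activity.
# The log format, window and threshold are assumptions for this sketch,
# not values suggested by the Italian DPA.
import re
from collections import defaultdict
from datetime import datetime

LOG_PATTERN = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\]')  # client IP + timestamp
WINDOW_SECONDS = 60
THRESHOLD = 100  # requests per window considered "abnormal" in this sketch

def suspicious_clients(log_lines):
    """Return IPs exceeding THRESHOLD requests within any WINDOW_SECONDS window."""
    hits = defaultdict(list)
    for line in log_lines:
        match = LOG_PATTERN.match(line)
        if not match:
            continue
        ip, raw_ts = match.groups()
        ts = datetime.strptime(raw_ts.split()[0], "%d/%b/%Y:%H:%M:%S")
        hits[ip].append(ts)

    flagged = set()
    for ip, times in hits.items():
        times.sort()
        start = 0
        for end, t in enumerate(times):
            # shrink the window until it spans at most WINDOW_SECONDS
            while (t - times[start]).total_seconds() > WINDOW_SECONDS:
                start += 1
            if end - start + 1 > THRESHOLD:
                flagged.add(ip)
                break
    return flagged

# Example usage with a hypothetical access.log file:
# with open("access.log") as f:
#     print(suspicious_clients(f))
```

Flagged addresses could then be rate-limited, blocked or challenged with a CAPTCHA, in line with the measures listed above.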

Processing Health Data: the Most Recent Amendment to the Italian Privacy Code

The Italian “Privacy Code” (Legislative Decree No. 196/2003), which governs data protection in Italy together with the European GDPR, has recently been amended.

Law No. 56/2024, further implementing the National Recovery and Resilience Plan, amended section 110 of the Privacy Code, which deals with the processing of health-related data for the purposes of medical, biomedical or epidemiological scientific research.

Section 110 provides that consent of the data subject for the processing of health-related data for the purpose of medical, biomedical or epidemiological scientific research is not required when:

  • the research is carried out on the basis of legal provisions or European Union law, when processing is necessary for scientific research or statistical purposes, provided that an impact assessment is carried out pursuant to sections 35 and 36 of the GDPR; or
  • informing the data subject is impossible or involves a disproportionate effort, or would render impossible or seriously jeopardise the attainment of the purposes of the research.

In such cases – before the latest amendment – the data controller had to:

1) take appropriate measures to protect the rights, freedoms and interests of the data subject;

2) obtain a favorable opinion of the competent ethics committee; and

3) consult the Italian Data Protection Authority prior to processing.

The obligation to consult the Italian Data Protection Authority has now been repealed. Thus, there is no need to apply for the Authority’s clearance prior to processing health-related data (in those cases where consent of the data subject is not required under section 110 of the Privacy Code). 

This amendment may have a significant impact especially on retrospective studies for which informing data subjects is particularly burdensome. The data controller will, in fact, be able to proceed without the Authority’s permission. Nonetheless, the data controller will still have to comply with specific guarantees and ethical rules issued by the Authority – as specified by the amended section 110.

On the one hand, the amended section 110 seems to favor accountability and to soften the procedural requirements for processing health data for research purposes, making the overall procedure quicker. When it comes to the “secondary use” of health data, the accountability approach should be considered strong enough to protect the data and should be favorably welcomed, as it moves in the same direction as the European Health Data Space – which intends to provide a reliable and efficient system for the re-use of health data in areas such as research and innovation.

On the other hand, though, the Italian Data Protection Authority has already issued some interim guarantees, specifying that data controllers – when processing health data relating to deceased subjects or subjects who cannot be contacted – must carry out and publish an impact assessment pursuant to section 35 of the GDPR and notify it to the Authority. It remains to be seen how the amendment will be handled by the Authority in practice: the simplification introduced by the new version of section 110 may be diminished if the guarantees set forth by the Authority generate equally complex procedures.

Processing of personal and health data through apps and online platforms aimed at connecting HCPs and patients: the new digest of the Italian DPA

In March 2024, the Italian Data Protection Authority (“Italian DPA”) issued a new digest (“Digest”) on the processing of personal data, whether or not concerning health data pursuant to section 9 of the GDPR, carried out through platforms accessible via apps or web pages (“Platforms”) that aim to facilitate the connection between healthcare professionals (“HCPs”) and patients.

The use of such Platforms poses high risks to the protection and security of patients’ personal data, and in particular health-related data, given that the latter are subject to an enhanced protection regime set forth by section 9 of the GDPR. 

The Digest seeks to summarize the applicable data protection rules and defines the roles of the parties, as well as the legal bases, applicable to (i) the processing of users’ personal data by Platform owners; (ii) the processing of HCPs’ personal data by Platform owners; and (iii) the processing of patients’ health data by the Platform owner and by the HCPs.

Additional guidance is provided as to:

  • The need for the Platform owner to carry out (and periodically update) a data protection impact assessment (DPIA) pursuant to section 35 GDPR, since the use of Platforms entails “high risk” processing of personal data, as this kind of processing meets the criteria set out by the European Data Protection Board for identifying processing operations subject to the duty to perform a DPIA;
  • Which information notices should be provided, by whom and to whom, as well as the contents that such notices should have in each case, according to sections 13 and 14 GDPR;
  • The specific rules applicable to cross-border data transfers and transfers of data to third countries.

Lastly, the Digest lists the technical and organizational measures most commonly adopted by data controllers to meet the GDPR requirements, such as encryption, verification of the qualifications of the HCPs seeking to enroll on the Platform, strengthened authentication systems, and monitoring systems aimed at preventing unauthorized access or loss of data.
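
Purely as an illustration of the “encryption” measure referred to in the Digest (the library, field names and storage layout below are our own assumptions; the Digest does not prescribe a specific algorithm or tool), health-related fields could be encrypted at the application layer before being stored by the Platform:

```python
# Illustrative sketch: application-layer encryption of a health-related field
# before storage, using the widely available "cryptography" package.
# Field names and storage layout are hypothetical; the Digest does not
# prescribe a specific algorithm or library.
from cryptography.fernet import Fernet  # pip install cryptography

# In production the key would live in a key-management service, not in code.
key = Fernet.generate_key()
cipher = Fernet(key)

booking = {
    "patient_email": "mario.rossi@example.com",                       # kept in clear for login
    "reason_for_visit": cipher.encrypt(b"cardiology consultation"),   # health data, encrypted at rest
}

# Only components holding the key (e.g. the HCP-facing service) can decrypt:
print(cipher.decrypt(booking["reason_for_visit"]).decode())
```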

The Digest should be welcomed by Platform owners, as it now provides a reliable and comprehensive legal framework that can be followed in order to set up a Platform in a way that is compliant with GDPR principles.

A New European Digital Identity

On March 26, 2024 the Council adopted a new framework for a European digital identity (eID).

Background. In June 2021, the Commission proposed a framework for an eID that would be available to all EU citizens, residents, and businesses via European digital identity wallets (EDIWs). The new framework amends the 2014 regulation on electronic identification and trust services for electronic transactions in the internal market (eIDAS Regulation No. 910/2014), which laid the foundations for safely accessing public services and carrying out transactions online and across borders in the EU. According to the Commission, the revision of the regulation is needed because only 14% of key public service providers across all Member States allow cross-border authentication with an e-identity system.

Entry into Force.  The revised regulation will be published in the EU’s Official Journal and will enter into force 20 days after its publication. The regulation will be fully implemented by 2026.

Digital Wallets.  Member States will have to offer citizens and businesses digital wallets that will be able to link their national digital identities with proof of other personal attributes (e.g., driving license, bank account). Citizens will be able to prove their identity simply using their mobile phones.

EU-wide Recognition.  The new EDIWs will enable all citizens to access online services with their national digital identification, which will be recognised throughout the EU. Uses of EDIWs include: opening a bank account, checking in at a hotel, filing tax returns, storing a medical prescription, and signing legal documents.

The Right to Digital Identity.  The fundamental purpose of the regulation is to establish the right to a digital identity for Union citizens and to enhance their privacy.

Main features of EDIWs.  According to the new regulation:

• the use of EDIWs shall be voluntary, and wallets shall be provided directly by a Member State, under its mandate, or subject to its recognition;

• EDIWs shall enable the user to (1) securely request, store, delete and share person identification data and authenticate to relying parties; (2) generate pseudonyms and store them in encrypted form; (3) access a log of all transactions and report to the national authority any unlawful or suspicious request for data; (4) sign or seal documents by means of qualified electronic signatures; and (5) exercise the right to data portability.

Privacy.  Privacy will be safeguarded through different technologies, such as cryptographic methods that make it possible to validate whether a given statement based on the person’s identification data is true without revealing the data on which that statement is based. Moreover, EDIWs will have a dashboard embedded into their design to allow users to request the immediate erasure of any personal data pursuant to Article 17 of Regulation (EU) 2016/679.
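
The regulation does not prescribe a specific cryptographic scheme. Purely to illustrate the idea of validating a statement without revealing the other data held in the wallet, the sketch below shows a simplified salted-hash selective disclosure of a single attribute; this is our own toy example, not the eIDAS/EDIW technical architecture, and real deployments would rely on more advanced techniques (such as zero-knowledge proofs) to prove predicates like “over 18” without revealing the attribute itself.

```python
# Simplified illustration of selective disclosure: the wallet commits to each
# attribute with a salted hash; later it reveals only one attribute (plus its
# salt), and the verifier checks it against the published commitment.
# This is NOT the actual eIDAS / EDIW specification, just a toy example.
import hashlib
import secrets

def commit(value: str) -> tuple[str, str]:
    """Return (salt, commitment) for a single attribute value."""
    salt = secrets.token_hex(16)
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return salt, digest

# Wallet side: commit to all identity attributes (hypothetical values).
attributes = {"name": "Maria Rossi", "birth_year": "1990", "licence": "B"}
salts, commitments = {}, {}
for attr_name, attr_value in attributes.items():
    salts[attr_name], commitments[attr_name] = commit(attr_value)

# The commitments (but not the values) are shared with the relying party.
# To prove only that the user holds a category-B driving licence, the wallet
# discloses just that attribute and its salt:
disclosed = ("licence", attributes["licence"], salts["licence"])

# Verifier side: recompute the hash and compare it with the commitment.
attr_name, value, salt = disclosed
expected = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
assert expected == commitments[attr_name]  # statement verified; other data never revealed
```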

GDPR Cross-Border Complaints: a New Regulation Proposal Attempts to Harmonize the Procedural Rules Among the Member States

On July 4, 2023, the European Commission issued a proposal for a new EU regulation laying down additional procedural rules aimed at ensuring better and more uniform enforcement of the GDPR among the Member States, especially with regard to the handling of cross-border complaints (“Proposal”).

The Proposal was inspired by the findings of reports issued by the European Commission and the European Data Protection Board on the status of the application of the GDPR among the Member States. Such reports stressed the need to make the handling of cross-border complaints more efficient and more uniform across the EU, since the proceedings followed by local data protection authorities (“LDPAs”) differ in design and may thus lead to divergent application of the GDPR provisions.

The main features of the Proposal may be summarized as follows:

  • Submission and handling of cross-border complaints: The Proposal aims at removing the existing differences among the procedural rules applied by different LDPAs, notably with regard to how complaints on cross-border issues should be filed and what content they should have. In this respect, a template for the filing of cross-border complaints – including a standard, pre-determined set of information to be provided – has been drafted. The Proposal further specifies procedural rules for the rejection of complaints in cross-border cases and clarifies the roles and rights of the lead LDPA and of any other concerned LDPAs. A system of amicable settlement of complaints is also encouraged.
  • Procedural rights of parties under investigation: The Proposal further aims at harmonizing and strengthening the rights of defence in the course of cross-border investigations and proceedings. Specifically, the Proposal recognizes an extended right of the parties to be heard at key stages of the proceedings and requires the creation of an administrative file, together with the parties’ right of access to it.
  • Tools for cooperation between LDPAs: New tools have been designed to ease the building of consensus between the involved LDPAs on the main features of cross-border proceedings from their preliminary phase, so that recourse to the (time-consuming) dispute resolution mechanism provided by section 65 GDPR is limited to a few exceptional cases. LDPAs called to handle a cross-border complaint are required to provide the other involved LDPAs with a “summary of key issues”, in which the main findings of fact and the legal grounds underlying each complaint are set out. Concerned LDPAs will be able to provide their views on such summary and to raise “relevant and reasoned objections”, in which case a specific fast-track procedure is designed to ensure that disagreements among LDPAs are settled at the beginning of the process.
  • Acceleration of cross-border proceedings: Lastly, the Proposal, by imposing strict deadlines, aims to prevent undue delays within the proceedings.

At the moment it is still unclear whether the Proposal will be officially adopted and become a binding regulation. It has certainly been welcomed by the European Data Protection Board and by the European Data Protection Supervisor, and it may be a good opportunity to level the differences among Member States and make the proceedings more efficient.

GDPR Turns 5, and Trans-Atlantic Data Flow Remains a Headache

Happy birthday to the GDPR, which turned 5 years old on May 25, 2023! Is the European Union (and, given the Brussels effect, perhaps the entire world) a better place than before the GDPR? This is a difficult question. Surely companies now pay far more attention to data protection. And one of the reasons why companies have attempted to comply (100% compliance appears to be an unachievable goal!) is the possibility of being sanctioned with “administrative fines up to 20 000 000 EUR, or in the case of an undertaking, up to 4 % of the total worldwide annual turnover of the preceding financial year, whichever is higher“.

Clearly, while the very same GDPR language applies throughout the EU, data protection legislation is not yet harmonised, not even 5 years after its entry into force. About 30 articles of the GDPR allow Member States to depart from it. Interpretations of the Regulation also vary, so in many areas uniformity has given way to diversity (which, in this case, is not ideal). Additionally, enforcement of the GDPR is entirely decentralized, and data protection authorities have different views, differing resources and different strategies. A list of GDPR sanctions is regularly updated, and since the Meta decision of May 22, 2023, the Irish Data Protection Commission has emerged as the champion. While it was previously criticized for being “too cozy with Big Tech”, it has issued the highest sanction ever, along with strong measures ordering Meta to stop further transfers of personal data from the EU to the US and to bring its processing of data already transferred to the US into compliance with the GDPR.

The problem, once again, stems from the trans-Atlantic data flow from the EU to the US, and from the concern that such EU personal data is subject to surveillance in the US, without any redress system for EU citizens. (Incidentally, thousands of companies, like Meta, may have the same problem.) The US and EU have yet to reach an agreement that would allow a safe flow of data (although there are hopes that progress will be achieved by July). Further, there is no guarantee that the European Court of Justice will not strike down any such new arrangement, as it did in the past (twice). Meanwhile, the post-GDPR world appears to be pushing strongly towards data localization (or a “sovereign cloud”), making data flows out of the EU to non-“adequate” countries very scary.

GARANTE VS. CHATGPT: LATEST DEVELOPMENTS

1. An Order to Stop ChatGPT

On March 30, 2023 the Italian Data Protection Authority (“Garante”) issued an order by which it temporarily banned the ChatGPT platform (“ChatGPT”) operated by OpenAI LLC (“OpenAI”). The Garante in fact regards ChatGPT as infringing Articles 5, 6, 8, 13 and 25 of the GDPR. In particular:

  • No Information.  OpenAI does not provide any information to users, whose data is collected by OpenAI and processed via ChatGPT;
  • No Legal Basis.  There is no appropriate legal basis in relation to the collection of personal data and their processing for the purpose of training the algorithms underlying the operation of ChatGPT;
  • No Check of Users’ Age.  OpenAI does not provide for any verification of users’ age in relation to the ChatGPT service, nor any filters preventing use by users aged under 13.

Against this background, the Garante immediately banned the use of ChatGPT, and OpenAI blocked access to ChatGPT for users located in Italy.

2. Measures Offered by OpenAI

On April 11, 2023, in light of the willingness expressed by OpenAI to put in place measures to protect the rights and freedoms of ChatGPT users, the Garante issued a new order, which opened up the possibility of re-assessing ChatGPT if OpenAI adopts the following measures:

  1. to draft and publish an information notice to data subjects, which should be linked so that it can be read before registration;
  2. to make available, at least to data subjects connecting from Italy, a tool to exercise their right to (i) object, (ii) obtain rectification, insofar as such data have been obtained from third parties, or (iii) obtain the erasure of their personal data;
  3. to change the legal basis of the processing of users’ personal data for the purpose of algorithmic training, by removing any reference to contract and instead relying on consent or legitimate interest;
  4. to include a request to all users connecting from Italy to go through an “age gate” and to submit a plan for the deployment of age verification tools; and
  5. to promote a non-marketing-oriented information campaign by May 15, 2023 on all the main Italian mass media, the content of which shall be agreed upon with the Italian Authority.

OpenAI has until April 30, 2023 to comply (and until May 31, 2023 to prepare a plan for age verification tools). The Garante’s objections have been echoed by other European Union data protection authorities. The European Data Protection Board will attempt to resolve the dispute within two months and has launched a dedicated task force on ChatGPT “to exchange information on possible enforcement actions conducted by data protection authorities”.