Italy’s New AI Law: A Boost for Healthcare Research?


Italy has recently enacted its own “Artificial Intelligence Act”, set to take effect on October 10, 2025.

You might be wondering: Did we really need another layer of AI regulation? That was our initial reaction, too. But a closer look reveals that the Italian AI Law introduces several interesting provisions, especially in the healthcare sector, that could facilitate research for both public and private entities. Here are some highlights:

1. Healthcare Data Processing Based on Public Interest

The law explicitly recognizes that the processing of health-related personal data by:

  • Public or private non-profit entities,
  • Research hospitals (IRCCS),
  • Private entities collaborating with the above for healthcare research,

is of “substantial public interest.” This significantly expands the scope of Article 9(2)(g) of the GDPR, offering a clearer legal basis for processing sensitive data in research contexts.

2. Secondary Use of Data

The law introduces a simplified regime for the secondary use of personal data without direct identifiers. In particular:

  • No new consent required, as long as data subjects are informed (even via a website).
  • Automatic authorization unless blocked by the Data Protection Authority within 30 days of notification.

This provision applies only to the entities mentioned above, so its scope is limited; even so, it significantly strengthens the framework for nonprofit research projects.

3. Freedom to Anonymize, Pseudonymize and Synthesize

Under Article 8(4) of the AI Law, processing data for anonymization, pseudonymization, or synthesization is always permitted, provided the data subject is informed. This is a major step forward in enabling privacy-preserving AI research.

4. Guidelines and Governance

The law delegates the creation of technical guidelines to:

  • AGENAS – for anonymization and synthetic data generation.
  • Ministry of Health – for processing health data in research, including AI applications.

It also establishes a national AI platform at AGENAS, which will act as the data controller for personal data collected and generated within the platform.


Final Thoughts

While the GDPR aimed to support research, its implementation often created legal uncertainty and operational hurdles. Italy’s AI Law appears to address some of these gaps, offering a more pragmatic and enabling framework for healthcare research.

Happy GDPR-compliant Xmas and a prosperous new year!

Winter recess is about to start. While we’ll all be resting, GDPR will not!

While we will all be recharging our batteries to tackle the challenges of 2025, the GDPR will not go on vacation, and will thus never be out-of-office!

Check out the following tips, recently issued by the Italian Data Protection Authority, to help you avoid threats to your privacy rights during the upcoming vacations:

  • Are you receiving plenty of virtual greetings and commercial offers? Be careful with them, even if they appear to come from a friend or relative: they may contain viruses, suspicious links or phishing attempts. Not every present is welcome.
  • Have you taken good family pictures that you wish to share on your social networks? Don’t forget to ask for the consent of everyone depicted. Is your grandpa going to give his consent as well?
  • Have you filmed your children’s Christmas pageant? Keep it to yourself! You would need consent from everyone depicted before publishing it (including from parents, in the case of minors).
  • Are you planning to download a Christmas-themed app on your smartphone? Choose carefully, and check the developer and the reviews. You may inadvertently download the Grinch’s!
  • Are you going away on a trip? Don’t share too much information or too many pictures on social media about your time off, your house and your vehicles, as this may attract thieves. Only Santa Claus should be allowed to break in without your consent!
  • Are you connecting to your hotel’s or restaurant’s Wi-Fi? Ask the staff about its security: it may not be sufficiently protected.
  • Have you bought any “smart” presents for your little nephews? Check whether they collect any personal data from their users. If so, make sure the data cannot be used to harm them in any way.

Our own additional tips: rest, enjoy good food, spend time with your loved ones, and get ready for 2025! We wish you happy holidays and a healthy and successful new year.

Gitti and Partners Life Sciences Team

Your Face at the Airport: the EDPB Weighs in on Face Boarding

As you wander around an airport waiting to travel for the summer, you may notice that your image is captured by various devices. This process, known as facial recognition or “face boarding”, was recently addressed by the EDPB (https://www.edpb.europa.eu/edpb_it), which issued Opinion no. 11/2024 (https://www.edpb.europa.eu/our-work-tools/our-documents/opinion-board-art-64/opinion-112024-use-facial-recognition-streamline_en), pursuant to article 64 of the GDPR, on the processing of data obtained in airports through facial recognition to streamline passenger flow.

The EDPB assessed the compatibility of such data processing with:

  • article 5(1)(e) and (f) of the GDPR on storage limitation and integrity and confidentiality;
  • article 25 of the GDPR on privacy by default and privacy by design;
  • article 32 of the GDPR on security of processing.

The opinion takes into account four different scenarios:

  • Scenario 1: Storage of an enrolled biometric template – which is a set of biometric features stored in a database for future authentication purposes – only in the hands of the passenger.

Enrolment consists in each passenger who has consented to the processing recording, on their own device, the biometric template and the ID necessary for the processing. Neither the passenger’s ID nor their biometric data is retained by the airport operator after the enrolment process.

The passenger is authenticated when going through specific checkpoints at the airport (equipped with QR scanners and cameras), through the use of a QR code produced by the passenger’s device, where the biometric template is stored.

The EDPB opinion concludes that such processing could in principle be considered compatible with articles 5(1)(f), 25 and 32 of the GDPR (nonetheless, appropriate safeguards must be implemented, including an impact assessment).

  • Scenario 2: centralized storage of an enrolled biometric template in an encrypted form, stored in a database within the airport premises and with a key solely in the passenger’s hands.

The enrolment is controlled by the airport operator and consists in generating an ID and biometric data encrypted with a key/secret. The database is stored within the airport premises, under the control of the airport operator, while the individual-specific encryption keys/secrets are stored only on the passenger’s device.

Passengers are authenticated when going through specific checkpoints, equipped with a control pod, a QR scanner and a camera. The passenger’s data are sent to the database to request the encrypted template, which is then checked locally on the pod and/or user’s device.

The opinion concludes that such processing could in principle be considered compatible with articles 5(1)(e) and (f), 25 and 32 of the GDPR, subject to appropriate safeguards. In fact, the intrusiveness of such processing through a centralized system can be counterbalanced by the involvement of the passengers, who retain control of the key to their encrypted data.

  • Scenario 3: centralized storage of an enrolled biometric template in a database within the airport, under the control of the airport operator and Scenario 4: centralized storage of an enrolled biometric template in a cloud, under the control of the airline company or its cloud service provider.

The enrolment is done either in a remote mode or at airport terminals.

At the airport passengers go through dedicated control pods equipped with a camera. Biometric data is sent to the centralized database or to the cloud server – where the matching of the data is processed. The biometric matching is only performed when the passengers present themselves at pre-defined control points at the airport, but the data processing itself is done in the cloud or in centralized databases.

The EDPB considers that the use of biometric data for identification purposes in large central databases, as in Scenarios 3 and 4, interferes with the fundamental rights of data subjects and could entail serious consequences. As such, Scenarios 3 and 4 are not compatible with article 25 of the GDPR, because they imply searching for passengers within a central database by processing each biometric sample captured. Moreover, taking into account the state of the art, the measures envisaged in these Scenarios would not ensure an appropriate level of security under article 5(1)(f) of the GDPR.

In conclusion, the EDPB regards with suspicion the processing (through a matching-and-authentication process) of passengers’ biometric templates when it takes place in centralized storage tools (databases or clouds). In the EDPB’s view, this increases risks to the security of the data, requires the processing of far more data, and does not leave passengers in control of their data.

Processing Health Data: the Most Recent Amendment to Italian Privacy Code

The Italian “Privacy Code” (Legislative Decree No. 196/2003), which governs data protection in Italy together with the European GDPR, has recently been amended.

Law No. 56/2024, further implementing the National Recovery and Resilience Plan, amended section 110 of the Privacy Code, which deals with the processing of health-related data for the purposes of medical, biomedical or epidemiological scientific research.

Section 110 provides that consent of the data subject for the processing of health-related data for the purpose of medical, biomedical or epidemiological scientific research is not required when:

  • the research is carried out on the basis of legal provisions or European Union law, when processing is necessary for scientific research or statistical purposes, provided that an impact assessment is carried out pursuant to sections 35 and 36 of the GDPR; or
  • informing the data subject is impossible or involves a disproportionate effort, or would render impossible or seriously jeopardise the attainment of the purposes of the research.

In such cases – before the latest amendment – the data controller had to:

1) take appropriate measures to protect the rights, freedoms and interests of the data subject;

2) obtain a favorable opinion of the competent ethics committee; and

3) consult the Italian Data Protection Authority prior to processing.

The obligation to consult the Italian Data Protection Authority has now been repealed. Thus, there is no need to apply for the Authority’s clearance prior to processing health-related data (in those cases where consent of the data subject is not required under section 110 of the Privacy Code). 

This amendment may have a significant impact especially on retrospective studies for which informing data subjects is particularly burdensome. The data controller will, in fact, be able to proceed without the Authority’s permission. Nonetheless, the data controller will still have to comply with specific guarantees and ethical rules issued by the Authority – as specified by the amended section 110.

On the one hand, the amended section 110 seems to favor accountability and to soften the procedural requirements for processing health data for research purposes, making the overall procedure quicker. When it comes to the “secondary use” of health data, the accountability approach should be considered strong enough to protect data and should be welcomed, as it moves in the same direction as the European Health Data Space – which intends to provide a reliable and efficient system for the re-use of health data in areas such as research and innovation.

On the other hand, though, the Italian Data Protection Authority has already issued some interim guarantees, specifying that data controllers – when processing health data relating to deceased or non-contactable subjects – must carry out and publish an impact assessment pursuant to section 35 of the GDPR, notifying it to the Authority. It remains to be seen how the amendment will be handled by the Authority in practice: the effects of the simplification introduced by the new version of section 110 may be diminished if the guarantees set forth by the Authority generate equally elaborate procedures.

Processing of personal and health data through apps and online platforms aimed at connecting HCPs and patients: the new digest of the Italian DPA

In March 2024, the Italian Data Protection Authority (“Italian DPA”) issued a new digest (“Digest”) relating to the processing of personal data, whether or not concerning health data pursuant to section 9 of the GDPR, carried out through platforms accessible via apps or web pages (“Platforms”) that aim to facilitate connection between healthcare professionals (“HCPs”) and patients.

The use of such Platforms poses high risks to the protection and security of patients’ personal data, and in particular health-related data, given that the latter are subject to an enhanced protection regime set forth by section 9 of the GDPR. 

The Digest seeks to summarize the applicable data protection rules and defines the roles of the parties, as well as the legal bases, applicable to (i) the processing of users’ personal data by Platform owners; (ii) the processing of HCPs’ personal data by Platform owners; and (iii) the processing of patients’ health data by Platform owners and by the HCPs.

Additional guidance is provided as to:

  • The necessity for the Platform owner to carry out (and periodically update) a data protection impact assessment (DPIA) pursuant to section 35 GDPR, since the use of Platforms entails “high risk” processing of personal data: this kind of processing automatically meets the criteria set out by the European Data Protection Board for identifying processing activities subject to a mandatory DPIA;
  • Which information notices should be provided, by whom and to whom, as well as the content that such information notices should have in each case, according to sections 13 and 14 GDPR;
  • The specific rules applicable to cross-border data transfers and data transfer to third countries.

Lastly, the Digest includes a list of the most common technical and organizational measures adopted by data controllers to meet the GDPR requirements, such as encryption, verification of the qualifications of HCPs seeking to enroll on the Platform, strengthened authentication systems, and monitoring systems aimed at preventing unauthorized access or data loss.

The Digest should be most welcome to Platform owners, as it now provides a reliable and complete legal framework that can be followed to set up a Platform in compliance with GDPR principles.

GDPR Cross-Border Complaints: a New Regulation Proposal Attempts to Harmonize the Procedural Rules Among the Member States

On July 4, 2023, the European Commission issued a proposal for a new EU regulation laying down additional procedural rules aimed at ensuring better and uniform enforcement of the GDPR among the Member States, especially with regard to the handling of cross-border complaints (“Proposal”).

The Proposal was inspired by the findings of reports issued by the European Commission and the European Data Protection Board on the status of the application of the GDPR among the Member States. Such reports stressed the need to make the handling of cross-border complaints more efficient and consistent across the EU, since the proceedings followed by local data protection authorities (“LDPA”) have been found to be designed differently and may thus lead to diverging applications of the GDPR provisions.

The main features of the Proposal may be summarized as follows:

  • Submission and handling of cross-border complaints: The Proposal aims at removing the existing differences among the procedural rules applied by different LDPAs, namely with regard to how complaints on cross-border issues should be filed and what content they should have. In this respect, a template for the filing of cross-border complaints – including a standard pre-determined set of information to be provided – has been drafted. The Proposal further specifies procedural rules for the rejection of complaints in cross-border cases and clarifies the roles and rights of the lead LDPA and of any other concerned LDPAs. A system of amicable settlement of complaints is also encouraged.
  • Procedural rights of parties under investigation: The Proposal further aims at harmonizing and strengthening the rights of defence in the course of cross-border investigations and proceedings. Specifically, the Proposal recognizes an extended right of the parties to be heard at key stages of the proceedings and imposes the creation of an administrative file and the parties’ rights of access to it.
  • Tools for cooperation between LDPAs: New tools have been designed to ease the building of consensus between the involved LDPAs on the main features of cross-border proceedings from their preliminary phase, in order to limit recourse to the (time-consuming) dispute resolution mechanism provided by section 65 GDPR to a few exceptional cases. LDPAs that are called to handle a cross-border complaint are required to provide the other involved LDPAs with a “summary of key issues”, where the main findings of fact and legal grounds underlying each complaint are set out. Concerned LDPAs will be able to provide their views on such summary and to raise “relevant and reasoned objections”, in which case a specific fast-track procedure is designed to ensure that disagreements among LDPAs are settled at the beginning of the process.
  • Acceleration of cross-border proceedings: Lastly, the Proposal, by imposing strict deadlines, aims to prevent undue delays within the proceedings.

At the moment it is still unclear whether the Proposal will be officially adopted and become a binding regulation. Certainly, it has been welcomed by the European Data Protection Board and by the European Data Protection Supervisor, and it may be a good opportunity to level the differences among Member States and make the proceedings more efficient.

GARANTE VS. CHATGPT: LATEST DEVELOPMENTS

1. An Order to Stop ChatGPT

On March 30, 2023 the Italian Data Protection Authority (“Garante”) issued an order by which it temporarily banned the ChatGPT platform (“ChatGPT”) operated by OpenAI LLC (“OpenAI”). The Garante in fact regards ChatGPT as infringing Articles 5, 6, 8, 13 and 25 of the GDPR. In particular:

  • No Information.  OpenAI does not provide any information to users, whose data is collected by OpenAI and processed via ChatGPT;
  • No Legal Basis.  There is no appropriate legal basis in relation to the collection of personal data and their processing for the purpose of training the algorithms underlying the operation of ChatGPT;
  • No Check of User’s Age.  OpenAI does not provide for any verification of users’ age in relation to the ChatGPT service, nor any filter preventing use by those aged under 13.

In light of the above, the Garante immediately banned the use of ChatGPT, and OpenAI blocked access to ChatGPT for users located in Italy.

2. Measures Offered by OpenAI

On April 11, 2023, in light of the willingness expressed by OpenAI to put in place measures to protect the rights and freedoms of ChatGPT users, the Garante issued a new order, which opened the possibility of re-assessing ChatGPT if OpenAI adopts the following measures:

  1. to draft and publish an information notice to data subjects, which should be linked so that it can be read before the registration;
  2. to make available, at least to data subjects connecting from Italy, a tool to exercise their right to (i) object to the processing, (ii) obtain rectification of their personal data, insofar as such data have been obtained from third parties, or (iii) obtain the erasure of their personal data;
  3. to change the legal basis of the processing of users’ personal data for the purpose of algorithmic training, by removing any reference to contract and instead relying on consent or legitimate interest;
  4. to include a request to all users connecting from Italy to go through an “age gate” and to submit a plan for the deployment of age verification tools; and
  5. to promote a non-marketing-oriented information campaign by May 15, 2023 on all the main Italian mass media, the content of which shall be agreed upon with the Italian Authority.

OpenAI has until April 30, 2023 to comply (and until May 31, 2023 to prepare a plan for age verification tools). The objections raised by the Garante have been echoed by other European Union data protection authorities. The European Data Protection Board will attempt to resolve the dispute within two months and has launched a dedicated task force on ChatGPT “to exchange information on possible enforcement actions conducted by data protection authorities”.

Italian Transparency Act: the Opinion of the Italian Data Protection Authority

The Italian Data Protection Authority has issued its opinion on the data protection implications of the new information duties imposed on employers by legislative decree 104/2022.

On August 13, 2022, legislative decree 104/2022 (“Transparency Act”) entered into force. It provides for a new set of mandatory information that the employer must communicate to its employees at the time of their onboarding. On January 24, 2023, the Italian Data Protection Authority (“Garante”) issued its opinion on the compliance of these new information duties with the relevant data protection legislation.

In particular, the focus of the Garante was centered on the mandatory communication that, according to section 4, paragraph 8 of the Transparency Act, the employer must give to the employees if any “decision or monitoring automated system is used for the sake of providing information which is relevant for the hiring, management or termination of the employment relationship, for the assignment of tasks and duties, or for the surveillance, evaluation and fulfillment of contractual duties by the employee”. The Garante has stated that:

  • GDPR Sanctions Apply in case of Breach.  The implementation of any decision or monitoring automated system must be made in compliance and within the limits set forth by the applicable labor law provisions, and in particular law 300/1970. Such labor law provisions, which allow the implementation of automated systems only if certain conditions occur, must be deemed as providing “more specific rules to ensure the protection of the rights and freedoms in respect of the processing of employees’ personal data in the employment context” (as per section 88, paragraph 2, of the GDPR), and thus non-compliance with them may lead to administrative fines pursuant to section 83 of the GDPR.
  • Data Protection Impact Assessment (“DPIA”).  The employer, who is subject to the duty of accountability, must assess beforehand whether the relevant processing is likely to result “in a high risk to the rights and freedoms of natural persons”, and thus requires a preliminary data protection impact assessment under section 35 of the GDPR. In this regard, the Garante has clarified that the data subjects (i.e., employees) should be deemed “vulnerable”, and that the processing of their data with automated systems is very likely to meet the conditions that make the DPIA mandatory according to the guidelines on the DPIA issued by the WP29 on April 4, 2017.
  • Compliance with the “privacy by default” and “privacy by design” principles.  Employers must implement appropriate technical and organizational measures and integrate the necessary safeguards into the processing so as to protect the rights of data subjects (privacy by design). Moreover, the controller shall ensure that, by default, only personal data which are necessary for the specific purpose of the processing are processed (privacy by default), and should thus refrain from collecting personal data that are not strictly related to the specific purpose of the relevant processing.
  • Update of the register of processing activities (“ROPA”).  The employer must record the processing of data through automated systems in its ROPA.

Need any further assistance on the matter? Don’t hesitate to reach out to us!