Tag Archives: privacy

Happy GDPR-compliant Xmas and a prosperous new year!

Winter recess is about to start. While we’ll all be resting, GDPR will not!

While we will all be recharging our batteries to tackle the challenges of 2025, the GDPR will not go on vacation, and will thus never be out of office!

Check out the following tips that the Italian Data Protection Authority has recently issued in order to avoid threats to your privacy rights during the upcoming vacations:

  • Are you receiving plenty of virtual greetings and commercial offers? Be careful with them, even if they are apparently sent by a friend or relative: they may contain viruses, suspicious links or hidden phishing attempts. Not all presents are welcome.
  • Have you taken good family pictures that you wish to share on your social network? Don’t forget to ask for the consent of all depicted individuals. Is your grandpa going to provide his consent as well?
  • Have you filmed your children’s Christmas pageant? Keep it to yourself! You would need the consent of all depicted individuals before publishing (including their parents’ consent in the case of minors).
  • Are you wishing to download any specific Christmas-related app on your smartphone? Choose carefully, and check the developer and the reviews. You may inadvertently be downloading the Grinch’s app!
  • Are you going away for a trip? Don’t share too much information and pictures on your social media about your time off, your house and your vehicles, as it may attract thieves. Only Santa Claus shall be allowed to break in without your consent!
  • Are you connecting to your hotel’s or restaurant’s Wi-Fi? Ask the staff about its security: the network may not be sufficiently protected.
  • Have you bought any “smart” presents for your little nephews? Check whether they collect any personal data from their users. If so, make sure that they cannot harm them in any way.

Our own additional tips: rest, enjoy good food, spend time with your loved ones, and get ready for 2025! We wish you happy holidays and a healthy and successful new year.

Gitti and Partners Life Sciences Team

Can Corporate E-mail Accounts Be Used in Case of Litigation?

With an order of July 17, 2024, the Italian Data Protection Authority (“DPA”) has fined Selectra
S.p.A. Euro 80,000 for unlawful processing of personal data. The case originates from an
agent’s claim that Selectra (i) had maintained his email account active after the termination of
his collaboration with the company; (ii) had used a specific software (MailStore) to back up the
contents of his email account for three years; (iii) had used his data in a judicial proceeding, in
which he was accused, along with other individuals, of business secrets misappropriation and
further unlawful conduct.


The DPA reaffirmed various key principles, applicable to employees and self-employed
personnel:


– The DPA offered important guidance on the balance between the right to defense and the right to privacy. According to the DPA, accessing personal data to protect one’s rights in court is permissible only if proceedings are already pending before the court or there is a realistic prospect of bringing the claim.


– Corporate email accounts cannot be used as archives. It is a company’s duty to introduce suitable document management systems capable of archiving documents; employees’ and collaborators’ email accounts cannot be used for such purposes.


– Personnel must be provided with an information notice which clarifies what is processed, on which basis and how. Selectra, instead, had backed up corporate email accounts, with the possibility of retaining their contents for three years after termination of the employment/collaboration contract, without providing any information to its employees and collaborators.


The DPA concluded that the right to privacy cannot be sacrificed in pursuit of abstract and indeterminate protection purposes. Incidentally, the DPA emphasized again that it is forbidden to use tools that monitor employees’ activity in breach of Article 4, L. 300/1970 (Italian Statute of Workers’ Rights), which permits systems for remote employee monitoring only for production, organizational, labour and safety needs, and only after an agreement with trade unions. Selectra, instead, using the MailStore software, was able to trace meticulously, even long afterwards, the activities carried out by its employees, in breach of the Italian Statute of Workers’ Rights.

Your Face at the Airport: the EDPB Weighs in on Face Boarding

As you wander around an airport waiting to travel for the summer, you may notice that your image is captured by various devices. This process, known as facial recognition or “face boarding”, has recently been the subject of an opinion by the EDPB (https://www.edpb.europa.eu/edpb_it), which issued Opinion no. 11/2024 (https://www.edpb.europa.eu/our-work-tools/our-documents/opinion-board-art-64/opinion-112024-use-facial-recognition-streamline_en), pursuant to article 64 of the GDPR, on the processing of data obtained in airports using facial recognition to streamline passenger flow.

The EDPB assessed the compatibility of such data processing with:

  • article 5(1)(e) and (f) of the GDPR on storage limitation and integrity and confidentiality;
  • article 25 of the GDPR on privacy by default and privacy by design;
  • article 32 of the GDPR on security of processing.

The opinion takes into account four different scenarios:

  • Scenario 1: Storage of an enrolled biometric template – which is a set of biometric features stored in a database for future authentication purposes – only in the hands of the passenger.

Enrolment consists in recording – by each passenger who has consented to such processing – the biometric template and ID necessary for the processing, on the passenger’s device. Neither the passengers’ ID, nor their biometric data are retained by the airport operator after the enrolment process.

The passenger is authenticated when going through specific checkpoints at the airport (equipped with QR scanners and cameras), through the use of a QR code produced by the passenger’s device, where the biometric template is stored.

The EDPB opinion concludes that such processing could in principle be considered compatible with articles 5(1)(f), 25 and 32 of the GDPR (nonetheless, appropriate safeguards must be implemented, including an impact assessment).
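As a rough illustration of Scenario 1, the following Python sketch models device-held enrolment and a checkpoint-side match. The function names, the feature-vector representation of the template and the distance threshold are all hypothetical simplifications for illustration, not part of the EDPB opinion:

```python
import math

def enroll(biometric_features):
    """Enrolment (Scenario 1): the template is kept only in a wallet on
    the passenger's device; the airport operator retains nothing."""
    # Hypothetical simplification: the template is just a feature vector.
    return {"template": biometric_features}

def qr_payload(wallet):
    # At the gate, the device exposes the stored template via a QR code.
    return wallet["template"]

def authenticate(live_capture, payload, threshold=0.1):
    """Checkpoint-side match: compare the live camera capture with the
    device-held template; no biometric data is stored afterwards."""
    distance = math.dist(live_capture, payload)
    return distance <= threshold

# Example: a close live capture matches, a different face does not.
wallet = enroll([0.11, 0.52, 0.33])
print(authenticate([0.12, 0.51, 0.33], qr_payload(wallet)))
print(authenticate([0.90, 0.10, 0.50], qr_payload(wallet)))
```

The point the EDPB values here is visible in the structure: nothing persists outside the passenger's own wallet.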

  • Scenario 2: centralized storage of an enrolled biometric template in an encrypted form, stored in a database within the airport premises and with a key solely in the passenger’s hands.

The enrolment is controlled by the airport operator and consists in generating the ID and biometric data, which are encrypted with a key/secret. The database is stored within the airport premises, under the control of the airport operator. Individual-specific encryption keys/secrets are stored only on the individual’s device.

Passengers are authenticated when going through specific checkpoints, equipped with a control pod, a QR scanner and a camera. The passenger’s data are sent to the database to request the encrypted template, which is then checked locally on the pod and/or user’s device.

The opinion concludes that such processing could in principle be considered compatible with articles 5(1)(e) and (f), 25 and 32 of the GDPR, subject to appropriate safeguards. In fact, the intrusiveness of such processing through a centralized system can be counterbalanced by the involvement of the passengers, who retain control of the key to their encrypted data.
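The key design point of Scenario 2 – ciphertext at the airport, decryption key only on the passenger's device – can be sketched as follows. This is a toy model: the XOR cipher merely stands in for a real authenticated cipher such as AES-GCM, and all names are illustrative assumptions:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher for illustration only; a real deployment
    # would use an authenticated cipher (e.g. AES-GCM).
    return bytes(b ^ k for b, k in zip(data, key))

def enroll(template: bytes):
    """Scenario 2: the airport database stores only ciphertext; the
    key stays solely in the passenger's hands."""
    key = secrets.token_bytes(len(template))
    airport_db = {"encrypted_template": xor_bytes(template, key)}
    passenger_device = {"key": key}
    return airport_db, passenger_device

def checkpoint_match(airport_db, passenger_device, live_template: bytes):
    # The encrypted record is fetched and decrypted locally at the pod
    # with the passenger-held key, then compared to the live capture.
    decrypted = xor_bytes(airport_db["encrypted_template"],
                          passenger_device["key"])
    return decrypted == live_template
```

Without the device-held key, the centrally stored record is unreadable, which is precisely the counterbalance the EDPB points to.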

  • Scenario 3: centralized storage of an enrolled biometric template in a database within the airport, under the control of the airport operator and Scenario 4: centralized storage of an enrolled biometric template in a cloud, under the control of the airline company or its cloud service provider.

The enrolment is done either in a remote mode or at airport terminals.

At the airport passengers go through dedicated control pods equipped with a camera. Biometric data is sent to the centralized database or to the cloud server – where the matching of the data is processed. The biometric matching is only performed when the passengers present themselves at pre-defined control points at the airport, but the data processing itself is done in the cloud or in centralized databases.

The EDPB considers that the use of biometric data for identification purposes in large central databases, as in Scenarios 3 and 4, interferes with the fundamental rights of data subjects and could entail serious consequences. As such, Scenarios 3 and 4 are not compatible with article 25 of the GDPR because they imply searching for passengers within a central database by processing each biometric sample captured. Also, taking into account the state of the art, the measures envisaged in such Scenarios would not ensure an appropriate level of security under article 5(1)(f) of the GDPR.

In conclusion, the EDPB regards with suspicion the processing (through a matching-and-authentication process) of passengers’ biometric templates when it happens in centralized storage tools (databases or clouds). The EDPB considers that this increases risks to the security of the data, requires the processing of much more data and does not leave passengers in control of their data.

New Guidelines on Web Scraping

Pursuant to Article 57(1)(b) of the GDPR, on May 20, 2024 the Italian Data Protection Authority (“Italian DPA”) adopted guidelines [LINK] on web scraping, with the aim of providing guidance to operators of websites and online platforms, acting in Italy as data controllers of personal data made available online to the public.

Web scraping is defined by the Italian DPA as the massive collection of personal data from the web for the purpose of training generative artificial intelligence models. Specifically, whenever such phenomenon involves the collection of traceable information – linked to an identified or identifiable natural person – a data protection issue arises with reference to the identification of an appropriate legal basis for the processing of such data.

According to the guidelines, the assessment of the lawfulness of web scraping must be carried out on a case-by-case basis. Personal data are made available on the web as a result of a primary level processing by operators of online platforms as data controllers. Only then, third parties – often web robots or “bots” – may gather such data for different purposes while scraping the web. This is the reason why the Italian DPA addresses its guidelines to operators of online platforms: they are, in fact, the only ones able i) to more easily evaluate how data are used after being scraped from their platforms and ii) to implement measures on their platforms that may prevent or mitigate web scraping activity for purposes of training algorithms.

Possible precautions or enforcement actions identified by the Italian DPA are the following:

  • Creation of restricted areas, which can only be accessed after registration. In this way, certain personal data would be removed from public availability;
  • Inclusion of ad hoc clauses in the terms of service of the online platform expressly prohibiting the use of web scraping techniques;
  • Monitoring network traffic to detect any abnormal flow of data and adopting limits as countermeasures;
  • Direct intervention on bots (e.g. insertion on websites of CAPTCHA checks or monitoring log files to block undesirable users).  

Such measures should be adopted by the data controller after an independent assessment – in compliance with the accountability principle, which increasingly appears to govern new data protection legislation and strategies. At any rate, the Italian DPA acknowledges that, albeit useful, none of these measures can be expected to entirely prevent web scraping from happening.  

Processing Health Data: the Most Recent Amendment to Italian Privacy Code

The Italian “Privacy Code” (Legislative Decree No. 196/2003), which governs data protection in Italy together with the European GDPR, has recently been amended.

Law No. 56/2024, further implementing the National Recovery and Resilience Plan, intervened on section 110 of the Privacy Code, which deals with the processing of health-related data for the purposes of medical, biomedical or epidemiological scientific research.

Section 110 provides that consent of the data subject for the processing of health-related data for the purpose of medical, biomedical or epidemiological scientific research is not required when:

  • the research is carried out on the basis of legal provisions or European Union law, when processing is necessary for scientific research or statistical purposes, provided that an impact assessment is carried out pursuant to sections 35 and 36 of the GDPR; or
  • informing the data subject is impossible or involves a disproportionate effort, or would render impossible or seriously jeopardise the attainment of the purposes of the research.

In such cases – before the latest amendment – the data controller had to:

1) take appropriate measures to protect the rights, freedoms and interests of the data subject;

2) obtain a favorable opinion of the competent ethics committee; and

3) consult the Italian Data Protection Authority prior to processing.

The obligation to consult the Italian Data Protection Authority has now been repealed. Thus, there is no need to apply for the Authority’s clearance prior to processing health-related data (in those cases where consent of the data subject is not required under section 110 of the Privacy Code). 

This amendment may have a significant impact especially on retrospective studies for which informing data subjects is particularly burdensome. The data controller will, in fact, be able to proceed without the Authority’s permission. Nonetheless, the data controller will still have to comply with specific guarantees and ethical rules issued by the Authority – as specified by the amended section 110.

On the one hand, the amended section 110 seems to favor accountability and to soften the procedural requirements for processing health data for research purposes, making the overall procedure quicker. When it comes to “secondary use” of health data, the accountability approach should be considered strong enough to protect the data and is to be favorably welcomed, as it moves in the same direction as the European Health Data Space – which intends to provide a reliable and efficient system for the re-use of health data in areas such as research and innovation.

On the other hand, though, the Italian Data Protection Authority has already issued some interim guarantees, specifying that data controllers – when processing health data relating to deceased subjects or subjects who cannot be contacted – must carry out and publish an impact assessment, pursuant to section 35 of the GDPR, and notify it to the Authority. It remains to be seen how the amendment will be handled by the Authority in practice: the effects of the simplification provided by the new version of section 110 may be diminished if the guarantees set forth by the Authority generate equally articulate procedures.

Processing of personal and health data through apps and online platforms aimed at connecting HCPs and patients: the new digest of the Italian DPA

In March 2024, the Italian Data Protection Authority (“Italian DPA”) issued a new digest (“Digest”) on the processing of personal data, including health data pursuant to section 9 of the GDPR, carried out through platforms, accessible via apps or web pages (“Platforms”), that aim to facilitate the connection between healthcare professionals (“HCPs”) and patients.

The use of such Platforms poses high risks to the protection and security of patients’ personal data, and in particular health-related data, given that the latter are subject to an enhanced protection regime set forth by section 9 of the GDPR. 

The Digest seeks to summarize the applicable data protection rules that may be followed, and defines the roles of the parties, as well as the legal bases, applicable to (i) the processing of users’ personal data by Platform owners; (ii) the processing of HCPs’ personal data by Platform owners; and (iii) the processing of patients’ health data by the Platform owner and by the HCPs.

Additional guidance is provided as to:

  • The necessity for the Platform’s owner to carry out (and periodically update) a data protection impact assessment (DPIA) pursuant to section 35 GDPR, since the use of Platforms determines a “high risk” processing of personal data: such processing automatically meets the criteria issued by the European Data Protection Board for identifying processing operations subject to the duty to perform a DPIA;
  • Which information notices should be provided, by whom and to whom, as well as the contents that such notices should have in each case, according to sections 13 and 14 GDPR;
  • The specific rules applicable to cross-border data transfers and data transfer to third countries.

Lastly, the Digest includes a list of the most common technical and organizational measures taken by data controllers to meet the GDPR requirements, such as encryption, verification of the qualifications of the HCPs that seek to enroll in the Platform, strengthened authentication systems, and monitoring systems aimed at preventing unauthorized access or loss of data.

The Digest should be most welcome to Platform owners, as it provides a reliable and complete legal framework that may be followed in order to set up a Platform in a way that is compliant with the GDPR principles.

A New European Digital Identity

On March 26, 2024 the Council adopted a new framework for a European digital identity (eID).

Background. In June 2021, the Commission proposed a framework for an eID that would be available to all EU citizens, residents, and businesses via a European digital identity wallet (EDIW). The new framework amends the 2014 regulation on electronic identification and trust services for electronic transactions in the internal market (eIDAS regulation n. 910/2014), which laid the foundations for safely accessing public services and carrying out transactions online and across borders in the EU. According to the Commission, the revision of the regulation is needed since only 14% of key public service providers across all Member States allow cross-border authentication with an e-Identity system.

Entry into Force.  The revised regulation will be published in the EU’s Official Journal and will enter into force 20 days after its publication. The regulation will be fully implemented by 2026.

Digital Wallets.  Member States will have to offer citizens and businesses digital wallets that will be able to link their national digital identities with proof of other personal attributes (e.g., driving license, bank account). Citizens will be able to prove their identity simply using their mobile phones.

EU-wide Recognition.  The new EDIWs will enable all citizens to access online services with their national digital identification, which will be recognised throughout the EU. Uses of EDIWs include: opening a bank account, checking in at a hotel, filing tax returns, storing a medical prescription, and signing legal documents.

The Right to Digital Identity.  The fundamental purpose of the regulation is to establish the right to a digital identity for Union citizens and to enhance their privacy.

Main features of EDIWs.  According to the new regulation:

• the use of EDIWs shall be voluntary, and EDIWs shall be provided directly by a Member State, or under its mandate or recognition;

• EDIWs shall enable the user to (1) securely request, store, delete, share person identification data and to authenticate to relying parties; (2) generate pseudonyms and store them encrypted; (3) access a log of all transactions and report to the national authority any unlawful or suspicious request for data; (4) sign or seal by means of qualified electronic signatures; (5) exercise the rights to data portability.

Privacy.  Privacy will be safeguarded through different technologies, such as cryptographic methods that make it possible to validate whether a given statement based on the person’s identification data is true without revealing the data on which that statement is based. Moreover, EDIWs will have a dashboard embedded into the design to allow users to request the immediate erasure of any personal data pursuant to Article 17 of Regulation (EU) 2016/679.
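A simplified illustration of the underlying idea – being able to check a statement about an identity attribute without exposing the data up front – is a hash commitment. Note that this is a plain commitment scheme, far simpler than the selective-disclosure and zero-knowledge techniques the regulation contemplates, and every name below is illustrative:

```python
import hashlib
import secrets

def commit(attribute: str):
    """Wallet side: commit to an identity attribute without revealing it.
    Only the digest is shared; the nonce stays in the wallet."""
    nonce = secrets.token_hex(16)
    digest = hashlib.sha256(f"{nonce}:{attribute}".encode()).hexdigest()
    return digest, nonce

def verify(digest: str, nonce: str, claimed_attribute: str) -> bool:
    """Relying-party side: once the holder discloses the attribute and
    nonce, check them against the earlier commitment. Until disclosure,
    the digest reveals nothing usable about the attribute."""
    recomputed = hashlib.sha256(
        f"{nonce}:{claimed_attribute}".encode()
    ).hexdigest()
    return recomputed == digest
```

Real wallet schemes go further, proving predicates such as "over 18" without disclosing the attribute at all; the sketch only shows the commit-then-verify shape.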

GDPR Cross-Border Complaints: a New Regulation Proposal Attempts to Harmonize the Procedural Rules Among the Member States

On July 4, 2023, the European Commission has issued a proposal for a new EU regulation laying down additional procedural rules aimed at ensuring a better and uniform enforcement of the GDPR among the Member States, especially with regard to the handling of cross-border complaints (“Proposal”).

The Proposal has been inspired by the findings of the reports issued by the European Commission and the European Data Protection Board concerning the status of the application of the GDPR among the Member States. Such reports stressed the need to make the handling of cross-border complaints more efficient and levelled across the EU, since the proceedings followed by local data protection authorities (“LDPA”) have been found to be differently designed and may thus lead to different application of the GDPR provisions.

The main features of the Proposal may be summarized as follows:

  • Submission and handling of cross-border complaints: The Proposal aims at removing the existing differences among the procedural rules applied by different LDPAs, namely with regard to how complaints on cross-border issues should be filed and which contents they should have. In this respect, a template for the filing of cross-border complaints – including a standard pre-determined set of information to be provided – has been drafted. The Proposal further specifies procedural rules for the rejection of complaints in cross-border cases and clarifies the roles and rights of the lead LDPA and of any other concerned LDPAs. A system of amicable settlement of complaints is also encouraged.
  • Procedural rights of parties under investigation: The Proposal further aims at harmonizing and strengthening the rights of defence in the course of cross-border investigations and proceedings. Specifically, the Proposal recognizes an extended right of the parties to be heard at key stages of the proceedings and imposes the creation of an administrative file and the parties’ rights of access to it.
  • Tools for cooperation between LDPAs: New tools have been designed to ease the building of consensus between the involved LDPAs on the main features of cross-border proceedings from their preliminary phase, in order to limit recourse to the (time-consuming) dispute resolution mechanism provided by section 65 GDPR to a few exceptional cases. LDPAs that are called to handle a cross-border complaint are required to provide the other involved LDPAs with a “summary of key issues”, where the main findings of fact and legal grounds underlying each complaint are set out. Concerned LDPAs will be able to provide their views on such summary and to raise “relevant and reasoned objections”, in which case a specific fast-track procedure is designed to ensure that disagreements among LDPAs are settled at the beginning of the process.
  • Acceleration of cross-border proceedings: Lastly, the Proposal, by imposing strict deadlines, aims to prevent undue delays within the proceedings.

At the moment it is still unclear whether the Proposal will be officially adopted and become a binding regulation. Certainly, it has been welcomed by the European Data Protection Board and by the European Data Protection Supervisor, and it may be a good opportunity to level out the differences among Member States and make the proceedings more efficient.

Garante vs. ChatGPT: Latest Developments

1. An Order to Stop ChatGPT

On March 30, 2023 the Italian Data Protection Authority (“Garante”) issued an order by which it temporarily banned the ChatGPT platform (“ChatGPT”) operated by OpenAI LLC (“OpenAI”). The Garante in fact regards ChatGPT as infringing Articles 5, 6, 8, 13 and 25 of the GDPR. In particular:

  • No Information.  OpenAI does not provide any information to users, whose data is collected by OpenAI and processed via ChatGPT;
  • No Legal Basis.  There is no appropriate legal basis in relation to the collection of personal data and their processing for the purpose of training the algorithms underlying the operation of ChatGPT;
  • No Check of User’s Age.  OpenAI does not provide for any verification of users’ age in relation to the ChatGPT service, nor any filters prohibiting its use by users aged under 13.

As a result, the Garante immediately banned the use of ChatGPT, and OpenAI blocked access to ChatGPT from Italy.

2. Measures Offered by OpenAI

On April 11, 2023, in light of the willingness expressed by OpenAI to put in place measures to protect the rights and freedoms of ChatGPT users, the Garante issued a new order, which opened the possibility of re-assessing ChatGPT if OpenAI adopts the following measures:

  1. to draft and publish an information notice to data subjects, which should be linked so that it can be read before the registration;
  2. to make available, at least to data subjects who are connected from Italy, a tool to exercise their right to (i) object, (ii) obtain a rectification, insofar as such data have been obtained from third parties, or (iii) the erasure of their personal data;
  3. to change the legal basis of the processing of users’ personal data for the purpose of algorithmic training, by removing any reference to contract and instead relying on consent or legitimate interest;
  4. to include a request to all users connecting from Italy to go through an “age gate” and to submit a plan for the deployment of age verification tools; and
  5. to promote a non-marketing-oriented information campaign by May 15, 2023 on all the main Italian mass media, the content of which shall be agreed upon with the Italian Authority.

OpenAI has until April 30, 2023 to comply (and until May 31, 2023 to prepare a plan for age verification tools). The objections by the Garante have been echoed by other European Union data protection authorities. The European Data Protection Board will be attempting to solve the dispute within two months and has launched a dedicated task force on ChatGPT “to exchange information on possible enforcement actions conducted by data protection authorities”.

Italian Transparency Act: the Opinion of the Italian Data Protection Authority

The Italian Data Protection Authority has issued its opinion on the data protection implications of the new information duties imposed on employers by legislative decree 104/2022.

On August 13, 2022, legislative decree 104/2022 (“Transparency Act”) entered into force. It provides for a new set of mandatory information that the employer must communicate to its employees at the time of their onboarding. On January 24, 2023, the Italian Data Protection Authority (“Garante”) issued its opinion on the compliance of such new information duties with the provisions of the relevant data protection legislation.

In particular, the focus of the Garante was centered on the mandatory communication that, according to section 4, paragraph 8 of the Transparency Act, the employer must give to the employees if any “decision or monitoring automated system is used for the sake of providing information which is relevant for the hiring, management or termination of the employment relationship, for the assignment of tasks and duties, or for the surveillance, evaluation and fulfillment of contractual duties by the employee”. The Garante has stated that:

  • GDPR Sanctions Apply in case of Breach.  The implementation of any decision or monitoring automated system must be made in compliance and within the limits set forth by the applicable labor law provisions, and in particular law 300/1970. Such labor law provisions, which allow the implementation of automated systems only if certain conditions occur, must be deemed as providing “more specific rules to ensure the protection of the rights and freedoms in respect of the processing of employees’ personal data in the employment context” (as per section 88, paragraph 2, of the GDPR), and thus non-compliance with them may lead to administrative fines pursuant to section 83 of the GDPR.
  • Data Protection Impact Assessment (“DPIA”).  The employer, who is subject to the duty of accountability, must assess beforehand whether the relevant processing is likely to result “in a high risk to the rights and freedoms of natural persons”, and thus requires a preliminary data protection impact assessment under section 35 of the GDPR. In this regard, the Garante has clarified that data subjects (i.e., employees) should be deemed “vulnerable”, and that the processing of their data with automated systems is very likely to meet the conditions that make the DPIA mandatory according to the guidelines on the DPIA issued by the WP 29 on April 4, 2017.
  • Compliance with the “privacy by default” and “privacy by design” principles.  Employers must implement appropriate technical and organizational measures and integrate the necessary safeguards into the processing so as to protect the rights of data subjects (privacy by design). Moreover, the controller shall ensure that, by default, only personal data which are necessary for the specific purpose of the processing are processed (privacy by default), and should therefore refrain from collecting personal data that are not strictly related to the specific purpose of the relevant processing.
  • Update of the register of processing activities (“ROPA”).  The employer must indicate the processing of data through automated systems within his/her ROPA.

Need any further assistance on the matter? Don’t hesitate to reach out to us!