
Processing of personal and health data through apps and online platforms aimed at connecting HCPs and patients: the new digest of the Italian DPA

In March 2024, the Italian Data Protection Authority (“Italian DPA”) issued a new digest (“Digest”) on the processing of personal data, including health data pursuant to Article 9 of the GDPR, carried out through platforms, accessible via apps or web pages (“Platforms”), that aim to facilitate connections between healthcare professionals (“HCPs”) and patients.

The use of such Platforms poses high risks to the protection and security of patients’ personal data, and in particular health-related data, given that the latter are subject to the enhanced protection regime set forth by Article 9 of the GDPR.

The Digest summarizes the applicable data protection rules and defines the roles of the parties, as well as the legal bases, applicable to (i) the processing of users’ personal data by Platform owners; (ii) the processing of HCPs’ personal data by Platform owners; and (iii) the processing of patients’ health data by Platform owners and by the HCPs.

Additional guidance is provided as to:

  • The necessity for the Platform owner to carry out (and periodically update) a data protection impact assessment (DPIA) pursuant to Article 35 GDPR, since the use of Platforms entails “high risk” processing of personal data: this kind of processing automatically meets the criteria issued by the European Data Protection Board for identifying processing operations subject to a mandatory DPIA;
  • Which information notices should be provided, by whom and to whom, as well as the content such notices should have in each case, according to Articles 13 and 14 GDPR;
  • The specific rules applicable to cross-border data transfers and to transfers of data to third countries.

Lastly, the Digest lists the technical and organizational measures most commonly adopted by data controllers to meet the GDPR requirements, such as encryption, verification of the qualifications of HCPs seeking to enroll in a Platform, strengthened authentication systems, and monitoring systems aimed at preventing unauthorized access or loss of data.

The Digest should be welcomed by Platform owners, as it provides a reliable and complete legal framework that may be followed to set up a Platform in a way that is compliant with the GDPR principles.

GDPR Cross-Border Complaints: a New Regulation Proposal Attempts to Harmonize the Procedural Rules Among the Member States

On July 4, 2023, the European Commission issued a proposal for a new EU regulation laying down additional procedural rules aimed at ensuring better and more uniform enforcement of the GDPR among the Member States, especially with regard to the handling of cross-border complaints (“Proposal”).

The Proposal was inspired by the findings of reports issued by the European Commission and the European Data Protection Board on the status of the application of the GDPR among the Member States. Such reports stressed the need to make the handling of cross-border complaints more efficient and more consistent across the EU, since the proceedings followed by local data protection authorities (“LDPA”) differ in design and may thus lead to divergent application of the GDPR provisions.

The main features of the Proposal may be summarized as follows:

  • Submission and handling of cross-border complaints: The Proposal aims at removing the existing differences among the procedural rules applied by different LDPAs, in particular with regard to how complaints on cross-border issues should be filed and what content they should have. In this respect, a template for the filing of cross-border complaints – including a standard pre-determined set of information to be provided – has been drafted. The Proposal further specifies procedural rules for the rejection of complaints in cross-border cases and clarifies the roles and rights of the lead LDPA and of any other concerned LDPAs. A system of amicable settlement of complaints is also encouraged.
  • Procedural rights of parties under investigation: The Proposal further aims at harmonizing and strengthening the rights of defence in the course of cross-border investigations and proceedings. Specifically, the Proposal recognizes an extended right of the parties to be heard at key stages of the proceedings and imposes the creation of an administrative file and the parties’ rights of access to it.
  • Tools for cooperation between LDPAs: New tools have been designed to ease consensus-building between the involved LDPAs on the main features of cross-border proceedings from their preliminary phase, in order to limit recourse to the (time-consuming) dispute resolution mechanism provided by Article 65 GDPR to a few exceptional cases. LDPAs called to handle a cross-border complaint are required to provide the other involved LDPAs with a “summary of key issues”, where the main findings of fact and the legal grounds underlying each complaint are set out. Concerned LDPAs will be able to provide their views on such summary and to raise “relevant and reasoned objections”, in which case a specific fast-track procedure is designed to ensure that disagreements among LDPAs are settled at the beginning of the process.
  • Acceleration of cross-border proceedings: Lastly, the Proposal, by imposing strict deadlines, aims to prevent undue delays within the proceedings.

At the moment it is still unclear whether the Proposal will be officially adopted and become a binding regulation. Certainly, it has been welcomed by the European Data Protection Board and by the European Data Protection Supervisor, and it may be a good opportunity to level the differences among Member States and make the proceedings more efficient.

GARANTE VS. CHATGPT: LATEST DEVELOPMENTS

1. An Order to Stop ChatGPT

On March 30, 2023, the Italian Data Protection Authority (“Garante”) issued an order temporarily banning the ChatGPT platform (“ChatGPT”) operated by OpenAI LLC (“OpenAI”). The Garante regards ChatGPT as infringing Articles 5, 6, 8, 13 and 25 of the GDPR. In particular:

  • No Information.  OpenAI provides no information notice to users or to the data subjects whose data is collected by OpenAI and processed via ChatGPT;
  • No Legal Basis.  There is no appropriate legal basis for the collection of personal data and its processing for the purpose of training the algorithms underlying the operation of ChatGPT;
  • No Check of User’s Age.  OpenAI does not provide for any verification of users’ age in relation to the ChatGPT service, nor any filter preventing use by children under 13.

Given the above, the Garante immediately banned the use of ChatGPT, and OpenAI blocked access to ChatGPT from Italy.

2. Measures Offered by OpenAI

On April 11, 2023, in light of the willingness expressed by OpenAI to put in place measures to protect the rights and freedoms of the users of ChatGPT, the Garante issued a new order, which opened the possibility of re-assessing ChatGPT if OpenAI adopts the following measures:

  1. to draft and publish an information notice to data subjects, linked so that it can be read before registration;
  2. to make available, at least to data subjects connecting from Italy, a tool to exercise their rights to (i) object to the processing, (ii) obtain rectification of their personal data, insofar as such data have been obtained from third parties, or (iii) obtain erasure of their personal data;
  3. to change the legal basis of the processing of users’ personal data for the purpose of algorithmic training, by removing any reference to contract and instead relying on consent or legitimate interest;
  4. to include a request to all users connecting from Italy to go through an “age gate” and to submit a plan for the deployment of age verification tools; and
  5. to promote a non-marketing-oriented information campaign by May 15, 2023 on all the main Italian mass media, the content of which shall be agreed upon with the Italian Authority.

OpenAI has until April 30, 2023 to comply (and until May 31, 2023 to prepare a plan for age verification tools). The objections by the Garante have been echoed by other European Union data protection authorities. The European Data Protection Board will attempt to solve the dispute within two months and has launched a dedicated task force on ChatGPT “to exchange information on possible enforcement actions conducted by data protection authorities”.

Italian Transparency Act: the Opinion of the Italian Data Protection Authority

The Italian Data Protection Authority has issued its opinion on the data protection implications of the new information duties imposed on employers by Legislative Decree 104/2022.

On August 13, 2022, Legislative Decree 104/2022 (“Transparency Act”) entered into force. It provides for a new set of mandatory information that the employer must communicate to its employees at the time of their onboarding. On January 24, 2023, the Italian Data Protection Authority (“Garante”) issued its opinion on the compliance of such new information duties with the relevant data protection legislation.

In particular, the Garante focused on the mandatory communication that, according to section 4, paragraph 8 of the Transparency Act, the employer must give to employees if any “decision or monitoring automated system is used for the sake of providing information which is relevant for the hiring, management or termination of the employment relationship, for the assignment of tasks and duties, or for the surveillance, evaluation and fulfillment of contractual duties by the employee”. The Garante stated that:

  • GDPR Sanctions Apply in Case of Breach.  The implementation of any decision or monitoring automated system must be carried out in compliance with, and within the limits set forth by, the applicable labor law provisions, and in particular Law 300/1970. Such labor law provisions, which allow the implementation of automated systems only if certain conditions are met, must be deemed to provide “more specific rules to ensure the protection of the rights and freedoms in respect of the processing of employees’ personal data in the employment context” (as per Article 88(2) of the GDPR), and thus non-compliance with them may lead to administrative fines pursuant to Article 83 of the GDPR.
  • Data Protection Impact Assessment (“DPIA”).  The employer, who is subject to the duty of accountability, must assess beforehand whether the relevant processing is likely to result “in a high risk to the rights and freedoms of natural persons”, and thus requires a preliminary data protection impact assessment under Article 35 of the GDPR. In this regard, the Garante has clarified that the data subjects (i.e., employees) should be deemed “vulnerable”, and that the processing of their data with automated systems is very likely to meet the conditions that make the DPIA mandatory according to the guidelines on the DPIA issued by the Article 29 Working Party on April 4, 2017.
  • Compliance with the “privacy by default” and “privacy by design” principles.  Employers must implement appropriate technical and organizational measures and integrate the necessary safeguards into the processing so as to protect the rights of data subjects (privacy by design). Moreover, the controller shall ensure that, by default, only personal data which are necessary for the specific purpose of the processing are processed (privacy by default), and should therefore refrain from collecting personal data that are not strictly related to the specific purpose of the relevant processing.
  • Update of the record of processing activities (“ROPA”).  The employer must record the processing of data through automated systems in its ROPA.

Need any further assistance on the matter? Don’t hesitate to reach out to us!

Google Analytics under Scrutiny by Italian Data Protection Authority

The second issue of our summer series focuses on the recent decision by the Italian Data Protection Authority, which affects all users of the Google Analytics services in Italy, as well as other similar services that entail the transfer of users’ personal data to the United States.

Read our slides to understand what actions are available to you.

Check Your Website’s Compliance with New Rules on Cookies

The Italian Data Protection Authority’s new guidelines for the processing of cookies are in force. Does your website comply? Find out if the answer is yes (or if you need adjustments) through the Q&A below.

On January 9, 2022, the new guidelines for processing of cookies and other online tracking instruments issued by the Italian DPA have officially entered into force. Take this test to check if you are already compliant.

Q: What kind of cookies are you currently using on your website?

A: The Italian DPA has divided the cookies currently in use into three categories:

  • Technical cookies: cookies strictly necessary to a service provider for the provision of a service requested by the user.
  • Profiling cookies: cookies used to create clusters of users by associating them with specific actions or behavioral patterns. Such cookies are mainly aimed at tailoring the delivery of services to the user in an increasingly personalized way, as well as at carrying out targeted advertising activity.
  • Analytic cookies: cookies aimed at evaluating the effectiveness of the services offered or at measuring user “traffic” on the website, by memorizing users’ online activities within the website. These cookies are mainly provided by third-party suppliers.

Q: What should I do in case I use TECHNICAL COOKIES?

A: Technical cookies are not subject to any prior consent by users. This means that you just need to provide users with a specific cookie policy notice containing the details set forth by Article 13 of the GDPR. Such notice may also be contained in a specific section of your general privacy policy.

Q: What should I do in case I use PROFILING COOKIES?

A: Profiling cookies may be used only upon prior consent by the users. You may obtain users’ consent by implementing a cookie banner that pops up on your website as soon as users access your page.

Q: What should I do in case I use ANALYTIC COOKIES?

A: Analytic cookies can be processed without any consent by users only if they do not allow any identification (direct identification – i.e. “singling out” – of the person concerned should not be achieved), and if they are used for the production of aggregate data only. Otherwise, they need to be expressly authorized.

Usually, analytical cookies are provided by third parties. In such case, you must provide, within your cookie policy notice, an updated list of all the third party cookies that are implemented within your website.
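The three-category regime described in the answers above can be sketched as a small decision function. This is a hypothetical model for illustration only, not an implementation prescribed by the Italian DPA; the names (`ConsentState`, `may_set_cookie`) and the boolean flags are assumptions made for the sketch:

```python
# Hypothetical sketch of the cookie-consent rules described above
# (not an official implementation of the Italian DPA guidelines).

from dataclasses import dataclass


@dataclass
class ConsentState:
    """Consent collected from the user via the banner ("opt-in")."""
    profiling: bool = False   # explicit opt-in required
    analytics: bool = False   # explicit opt-in required unless anonymized


def may_set_cookie(category: str, consent: ConsentState,
                   anonymized_aggregate: bool = False) -> bool:
    """Decide whether a cookie of the given category may be set.

    - technical cookies: always allowed (no prior consent needed);
    - profiling cookies: only after an explicit opt-in;
    - analytic cookies: without consent only if they do not allow
      identification and produce aggregate data only.
    """
    if category == "technical":
        return True
    if category == "profiling":
        return consent.profiling
    if category == "analytics":
        return anonymized_aggregate or consent.analytics
    raise ValueError(f"unknown cookie category: {category}")


# Before any consent: only technical cookies (and anonymized analytics).
fresh = ConsentState()
assert may_set_cookie("technical", fresh)
assert not may_set_cookie("profiling", fresh)
assert may_set_cookie("analytics", fresh, anonymized_aggregate=True)
assert not may_set_cookie("analytics", fresh)

# After an explicit opt-in (a "positive act"), profiling cookies may be set.
opted_in = ConsentState(profiling=True)
assert may_set_cookie("profiling", opted_in)
```

Note how no implicit path ever flips a flag to `True`: consent exists only if the user's positive act recorded it, mirroring the ban on scroll-down and other implied-consent mechanisms.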

Q: How do I collect consent by users, when mandatory?

A: You may set up a cookie banner that will pop up on your website when users access your page.

Q: How to draft a cookie banner?

A: First and foremost, cookie banners must be user-friendly and immediately visible. The dimensions of the banner must be neither too small nor too large relative to the type of device used. Their wording must also be simple and easy to understand. In addition, cookie banners must contain a link to the cookie policy notice. No profiling cookies can be implemented before the user’s consent; only technical cookies may be pre-implemented.

Q: Do I have to grant users the possibility to modify their choices?

A: Yes, the website must always include a specific section allowing users to modify their initial choices.

Q: Can I obtain consent by users in other ways?

A: Consent by the user must be free and unambiguous, but there is no mandatory way to obtain it: you may implement your own system, in accordance with the accountability principle set forth by the GDPR, so long as consent is unambiguous and given through a positive act of the user (“opt-in”). No form of implicit consent is acceptable.

Q: Can I propose the banner again in case the user has declined consent?

A: The excessive and redundant use of banners requesting consent is not allowed – except for certain specific exceptions – since this may bring the user to give consent for the sole purpose of interrupting the pop-up of the banner.

Q: What about “cookie walls” and “scroll down”?

A: Don’t use them! A “cookie wall” is a mechanism by which users who deny consent are prevented from accessing the website entirely. A “scroll down” system infers the user’s consent from continued browsing of the website without any express choice regarding cookies. Neither cookie walls nor scroll-down systems are compliant, since they are not aimed at obtaining an express consent by the user.

All clear? If not, reach out to us!

Facial Recognition Technology: Are We Close to a Turning Point?

When people think about facial recognition technology (“FRT”), they immediately imagine the use of their faces to unlock their smartphones. But this technology is far more complicated, useful and potentially dangerous.

First, it is important to understand the differences among “facial detection”, “facial characterization”, “facial identification” and “facial verification”. Such terms have been defined by the non-profit organization Future of Privacy Forum (https://fpf.org/wp-content/uploads/2019/03/Final-Privacy-Principles-Edits-1.pdf) as follows:

  • Facial detection simply distinguishes the presence of a human face and/or facial characteristics without creating or deriving a facial template.
  • In facial characterization the system uses an automated or semi-automated process to discern a data subject’s general demographic information or emotional state, without creating a unique identifier tracked over time.
  • Facial Identification is also known as “one-to-many” matching because it searches a database for a reference matching a submitted facial template and returns a corresponding identity.
  • The last one, facial verification, is called “one-to-one” verification because it confirms an individual’s claimed identity by comparing the template generated from a submitted facial image with a specific known template generated from a previously enrolled facial image.
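The difference between “one-to-many” identification and “one-to-one” verification can be illustrated with a toy sketch, in which facial templates are modeled as plain feature vectors. The vectors, the distance metric and the threshold are hypothetical choices for illustration; real systems use learned embeddings and calibrated thresholds:

```python
# Toy illustration of one-to-one verification vs one-to-many identification
# over facial templates, modeled here as small feature vectors (hypothetical
# values; real FRT systems use high-dimensional learned embeddings).

import math


def distance(a, b):
    """Euclidean distance between two facial templates."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def verify(submitted, enrolled, threshold=0.5):
    """One-to-one: confirm a claimed identity against one enrolled template."""
    return distance(submitted, enrolled) <= threshold


def identify(submitted, database, threshold=0.5):
    """One-to-many: search a whole database for the closest matching template."""
    best_id, best_d = None, float("inf")
    for identity, template in database.items():
        d = distance(submitted, template)
        if d < best_d:
            best_id, best_d = identity, d
    return best_id if best_d <= threshold else None


db = {"alice": [0.1, 0.9], "bob": [0.8, 0.2]}   # enrolled templates
probe = [0.12, 0.88]                            # submitted facial template

assert verify(probe, db["alice"])          # one-to-one: claimed identity confirmed
assert identify(probe, db) == "alice"      # one-to-many: database-wide search
assert identify([5.0, 5.0], db) is None    # no enrolled template within threshold
```

The privacy implications differ accordingly: verification compares against a single template the subject already enrolled, while identification scans every enrolled person in the database for each probe.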

There are many possible uses of facial recognition. In the private sector FRT may be used to keep track of employees’ time and attendance, identify shoppers’ patterns inside stores, implement smart homes, etc. In the public sector, FRT may be used to monitor protests, identify suspects in security footage, check claimed identities at borders, etc.

This relatively new technology brings, alongside a wide range of possible applications, significant concerns regarding privacy, accuracy, race and gender disparities, data storage and security, and misuse. For instance, depending on the quality of the images compared, people may be falsely identified. In addition, in its current state, FRT is less accurate when identifying women compared to men, young people compared to older people, and people of color compared to white people. Privacy is certainly another concern: without strong policies it is unclear how long these images might be stored, who might gain access to them or what they can be used for; not to mention that this technology makes it far easier for government entities to surveil citizens and potentially intrude into their lives (see “Early Thought & Recommendations Regarding Face Recognition Technology”, first report of the Axon AI and Policing Technology Ethics Board, https://www.policingproject.org/axon-fr).

Once the possible implementations and the related risks are understood, the worldwide lack of regulation becomes even more surprising.

Within the European Union, the General Data Protection Regulation obviously applies to FRT. Furthermore, “Guidelines on Facial Recognition” have been released on January 28, 2021 by the Consultative Committee of the Council of Europe with regard to automatic processing of personal data (https://rm.coe.int/guidelines-on-facial-recognition/1680a134f3). This latter document includes:

  • Guidelines for legislators and decision-makers;
  • Guidelines for developers, manufacturers and service providers;
  • Guidelines for entities using FRT;
  • Rights of data subjects.

When it comes to Italy, particular attention has been drawn by several decisions of the Italian Data Protection Authority on the topic. Recognizing the innovative potential of FRT as well as the risks it poses to individual rights, the Authority has adopted a more permissive approach to private-sector uses of FRT, while issuing stricter decisions with regard to the use of FRT by public authorities. For instance, the Authority allowed the use of FRT by police forces for the purpose of identifying individuals among archived images, but prohibited real-time surveillance using the same technology (see https://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/9040256 and https://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/9575877). On the other hand, the Authority allowed an airport to implement FRT for the purpose of improving the management of passenger flows, so long as images of individuals were not stored (see https://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/8789277).

The European Commission, in light of the complexity of the situation and the need for strong and harmonised legislative action, presented on April 21, 2021 its “Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence” (https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52021PC0206). This Proposal was already the subject, on June 18, 2021, of an EDPB-EDPS joint opinion (https://edpb.europa.eu/our-work-tools/our-documents/edpbedps-joint-opinion/edpb-edps-joint-opinion-52021-proposal_en), in which the two bodies called for a general ban on the use of FRT for:

  • Automated recognition of human features in publicly accessible spaces;
  • Categorization of individuals into clusters according to ethnicity, gender, etc., based on biometric features;
  • Inference of individuals’ emotions.

What the European Commission is doing exemplifies a more widespread attitude of legislators towards artificial intelligence in general and FRT in particular. These technologies are increasingly present in our lives and are constantly evolving. Consequently, there is growing demand, from both public and private actors, for clear rules to govern this new technology and ensure that individual rights are safeguarded. Hopefully the situation will become clearer in the coming months and years.

Flavio Monfrini / Michele Galluccio

Web Cookies’ Processing: New Guidelines by the Italian DPA

On June 10, 2021 the Italian DPA officially issued new guidelines for the processing of cookies and other online tracking instruments. The newly-issued guidelines are aimed at ensuring compliance with the principles set forth by the GDPR, as well as with the recent contributions of the European Data Protection Board. The new guidelines complement and update the previous ones issued in 2014.

The new provisions mainly concern how consent is acquired and the information to be provided to data subjects. In particular:

  • consent by the user must be given in accordance with the principles of freedom and unambiguousness. Accordingly, the use of methods that do not comply with such principles, such as “scroll-down” and “cookie walls”, is unlawful, and any consent so obtained is void;
  • the “cookie banner” must comply with the “privacy by design” and “privacy by default” principles, as resulting from article 25 of the GDPR. Consequently, simplified manners for the obtainment of the consent are allowed only to the extent that they comply with some pre-determined requirements;
  • “analytic cookies” can be processed without any consent by users only if they do not allow any identification (direct identification of the person concerned should not be achieved), and if they are used for the production of aggregate data only. Otherwise, they need to be expressly authorized;
  • information to be provided to the users must be specific and comply with articles 12 and 13 of the GDPR.

Data controllers now have a six-month term (expiring in December 2021) to adopt the measures necessary to comply with such guidelines.

The full text of the measure can be found at the following link: https://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/9677876.

New Data Transfer Standard Contractual Clauses Approved by the EU Commission

On June 4, 2021 the EU Commission approved new standard contractual clauses (“SCC”), which are deemed to provide appropriate safeguards within the meaning of Article 46(1) and (2)(c) of the GDPR.

The new SCC are updated in line with the GDPR and with the opinions expressed during the consultation phase (including those of the European Data Protection Board and the European Data Protection Supervisor), and take into account the recent Schrems II judgment of the Court of Justice.

There are two different sets of SCC: (i) clauses for use between controllers and processors in the EU/EEA (or otherwise subject to the GDPR), and (ii) clauses for data transfers to controllers or processors established outside the EU/EEA (and not subject to the GDPR).

The new SCC promise “more flexibility for complex processing chains, through a ‘modular approach’ and by offering the possibility for more than two parties to join and use the clauses”.

If you or your company are using the old SCC, you have a transition period of 18 months.