
Garante vs. ChatGPT: Latest Developments

1. An Order to Stop ChatGPT

On March 30, 2023, the Italian Data Protection Authority (“Garante”) issued an order temporarily banning the ChatGPT platform (“ChatGPT”) operated by OpenAI LLC (“OpenAI”). The Garante regards ChatGPT as infringing Articles 5, 6, 8, 13 and 25 of the GDPR. In particular:

  • No Information.  OpenAI does not provide any information notice to users or to the other data subjects whose data it collects and processes via ChatGPT;
  • No Legal Basis.  There is no appropriate legal basis in relation to the collection of personal data and their processing for the purpose of training the algorithms underlying the operation of ChatGPT;
  • No Check of User’s Age.  OpenAI does not provide for any verification of users’ age in relation to the ChatGPT service, nor any filter preventing use by children under 13.

In light of the above, the Garante imposed an immediate temporary ban on the use of ChatGPT, and OpenAI blocked access to the service from Italy.

2. Measures Offered by OpenAI

On April 11, 2023, in light of OpenAI’s stated willingness to put in place measures protecting the rights and freedoms of ChatGPT users, the Garante issued a new order, which opened the possibility of re-assessing ChatGPT provided that OpenAI adopts the following measures:

  1. to draft and publish an information notice to data subjects, linked so that it can be read before registration;
  2. to make available, at least to data subjects connecting from Italy, a tool to exercise their rights to (i) object to the processing, (ii) obtain rectification of their personal data insofar as such data have been obtained from third parties, and (iii) obtain erasure of their personal data;
  3. to change the legal basis of the processing of users’ personal data for the purpose of algorithmic training, by removing any reference to contract and instead relying on consent or legitimate interest;
  4. to require all users connecting from Italy to go through an “age gate” and to submit a plan for the deployment of age-verification tools; and
  5. to promote a non-marketing-oriented information campaign by May 15, 2023 on all the main Italian mass media, the content of which shall be agreed upon with the Italian Authority.

OpenAI has until April 30, 2023 to comply (and until May 31, 2023 to prepare a plan for age-verification tools). The Garante’s objections have been echoed by other European Union data protection authorities. The European Data Protection Board will attempt to resolve the dispute within two months and has launched a dedicated task force on ChatGPT “to exchange information on possible enforcement actions conducted by data protection authorities”.

Italian Transparency Act: the Opinion of the Italian Data Protection Authority

The Italian Data Protection Authority has issued its opinion on the data protection implications of the new information duties imposed on employers by Legislative Decree 104/2022.

On August 13, 2022, Legislative Decree 104/2022 (the “Transparency Act”) entered into force. It provides for a new set of mandatory information that the employer must communicate to its employees at the time of their onboarding. On January 24, 2023, the Italian Data Protection Authority (“Garante”) issued its opinion on the compliance of these new information duties with the relevant data protection legislation.

In particular, the Garante focused on the mandatory communication that, under section 4, paragraph 8 of the Transparency Act, the employer must give to employees if any “decision or monitoring automated system is used for the sake of providing information which is relevant for the hiring, management or termination of the employment relationship, for the assignment of tasks and duties, or for the surveillance, evaluation and fulfillment of contractual duties by the employee”. The Garante stated that:

  • GDPR Sanctions Apply in case of Breach.  The implementation of any decision or monitoring automated system must comply with, and remain within the limits set forth by, the applicable labor law provisions, in particular Law 300/1970. Such labor law provisions, which allow the implementation of automated systems only if certain conditions are met, must be deemed to provide “more specific rules to ensure the protection of the rights and freedoms in respect of the processing of employees’ personal data in the employment context” (as per Article 88, paragraph 2, of the GDPR), and thus non-compliance with them may lead to administrative fines pursuant to Article 83 of the GDPR.
  • Data Protection Impact Assessment (“DPIA”).  The employer, who is subject to the duty of accountability, must assess beforehand whether the relevant processing is likely to result “in a high risk to the rights and freedoms of natural persons” and thus requires a preliminary data protection impact assessment under Article 35 of the GDPR. In this regard, the Garante has clarified that the data subjects (i.e., employees) should be deemed “vulnerable”, and that the processing of their data with automated systems is very likely to meet the conditions that make the DPIA mandatory according to the guidelines on the DPIA issued by the Article 29 Working Party on April 4, 2017.
  • Compliance with the “privacy by default” and “privacy by design” principles.  Employers must implement appropriate technical and organizational measures and integrate the necessary safeguards into the processing so as to protect the rights of data subjects (privacy by design). Moreover, the controller must ensure that, by default, only personal data which are necessary for the specific purpose of the processing are processed (privacy by default), and should therefore refrain from collecting personal data that are not strictly related to the specific purpose of the relevant processing.
  • Update of the record of processing activities (“ROPA”).  The employer must record the processing of data through automated systems in its ROPA.

Need any further assistance on the matter? Don’t hesitate to reach out to us!

Google Analytics under Scrutiny by Italian Data Protection Authority

The second issue of our summer series focuses on the recent decision by the Italian Data Protection Authority, which affects all users of the Google Analytics services in Italy, as well as other similar services that entail the transfer of users’ personal data to the United States.

Read our slides to understand what actions are available to you.

Check Your Website’s Compliance with New Rules on Cookies

The Italian Data Protection Authority’s new guidelines for the processing of cookies are in force. Does your website comply? Find out if the answer is yes (or if you need adjustments) through the Q&A below.

On January 9, 2022, the new guidelines for processing of cookies and other online tracking instruments issued by the Italian DPA have officially entered into force. Take this test to check if you are already compliant.

Q: What kind of cookies are you currently using on your website?

A: The Italian DPA has divided the cookies currently in use into three categories:

  • Technical cookies: these are the cookies strictly necessary for a service provider to deliver a service requested by the user.
  • Profiling cookies: these are the cookies used to create clusters of users by associating them with specific actions or behavioral patterns. They are mainly aimed at delivering services to the user in an increasingly personalized way, as well as at carrying out targeted advertising.
  • Analytic cookies: these are the cookies used to evaluate the effectiveness of the services offered or to measure user “traffic” on the website by recording users’ online activity within the website. They are mainly provided by third-party suppliers.

Q: What should I do in case I use TECHNICAL COOKIES?

A: Technical cookies are not subject to any prior consent by the users. This means that you just need to provide users with a specific cookie policy notice containing the details set forth by article 13 of the GDPR. Such notice may also be included as a specific section of your general privacy policy.

Q: What should I do in case I use PROFILING COOKIES?

A: Profiling cookies may be used only upon prior consent by the users. You may obtain users’ consent by implementing a cookie banner that pops up on your website as soon as users access your page.

Q: What should I do in case I use ANALYTIC COOKIES?

A: Analytic cookies can be used without any consent by users only if they do not allow any identification (no direct identification – i.e. “singling out” – of the person concerned can be achieved) and if they are used for the production of aggregate data only. Otherwise, they must be expressly authorized by the user.

Usually, analytic cookies are provided by third parties. In that case, you must provide, within your cookie policy notice, an updated list of all the third-party cookies implemented on your website.
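To give a concrete, purely illustrative idea of what “aggregate data only” can look like in practice, the sketch below shows two techniques often used to reduce identifiability: truncating the IP address before storage and counting events per page rather than per visitor. Function names are hypothetical and this is not a method prescribed by the Garante; whether a given analytics setup actually avoids identification must be assessed case by case.

```typescript
// Illustrative sketch only: keep analytics aggregate and non-identifying by
// (1) truncating the IP address before storage and (2) counting per page,
// not per visitor. All names here are hypothetical.

// Drop the last octet of an IPv4 address so the stored value no longer
// points to a single device (e.g. "203.0.113.42" -> "203.0.113.0").
function truncateIp(ip: string): string {
  const parts = ip.split(".");
  return parts.length === 4 ? [...parts.slice(0, 3), "0"].join(".") : ip;
}

// In-memory aggregate counters: one number per page path, nothing per visitor.
const pageViews = new Map<string, number>();

function recordPageView(path: string): void {
  // Note what is *not* stored: no cookie ID, no full IP, no fingerprint.
  pageViews.set(path, (pageViews.get(path) ?? 0) + 1);
}

// Only aggregate figures ever leave this module.
function report(): Array<{ path: string; views: number }> {
  return [...pageViews.entries()].map(([path, views]) => ({ path, views }));
}
```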

Q: How do I collect consent by users, when mandatory?

A: You may set up a cookie banner that will pop up on your website when users access your page.

Q: How to draft a cookie banner?

A: First and foremost, cookie banners must be user-friendly and immediately visible. The dimensions of the banner must be neither too small nor too big relative to the type of device used. The wording must also be simple and easy to understand. In addition, cookie banners must contain a link to the cookie policy notice. No profiling cookies may be set before the user has given consent; only technical cookies may be pre-implemented.
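As a purely illustrative example of this opt-in logic, here is a minimal sketch of a banner that pre-loads only the technical cookie recording the user’s choice and injects profiling or analytics scripts only after an explicit positive act by the user. All names (CONSENT_COOKIE, loadProfilingScripts, the script URL) are hypothetical, and a real banner would of course also need to satisfy the graphic and wording requirements described above.

```typescript
// Minimal illustrative sketch: load profiling scripts only after explicit opt-in.
// All identifiers and the script URL are hypothetical.

const CONSENT_COOKIE = "cookie_consent"; // technical cookie storing the user's choice

type ConsentChoice = "accepted" | "rejected";

function readConsent(): ConsentChoice | null {
  const match = document.cookie.match(new RegExp(`(?:^|; )${CONSENT_COOKIE}=([^;]*)`));
  return match ? (decodeURIComponent(match[1]) as ConsentChoice) : null;
}

function saveConsent(choice: ConsentChoice): void {
  // Recording the choice itself is a technical operation: no prior consent needed.
  document.cookie = `${CONSENT_COOKIE}=${choice}; path=/; max-age=${60 * 60 * 24 * 180}`;
}

function loadProfilingScripts(): void {
  // Third-party profiling/analytics tags are injected only here,
  // i.e. only after an explicit positive act ("opt-in") by the user.
  const script = document.createElement("script");
  script.src = "https://example.com/hypothetical-profiling-tag.js";
  document.head.appendChild(script);
}

function showBanner(): void {
  const banner = document.createElement("div");
  banner.innerHTML = `
    Cookies are used on this site. <a href="/cookie-policy">Cookie policy</a>
    <button id="accept-cookies">Accept</button>
    <button id="reject-cookies">Continue without accepting</button>`;
  document.body.appendChild(banner);

  banner.querySelector("#accept-cookies")!.addEventListener("click", () => {
    saveConsent("accepted");
    loadProfilingScripts();
    banner.remove();
  });
  banner.querySelector("#reject-cookies")!.addEventListener("click", () => {
    saveConsent("rejected"); // no implicit consent: rejection is recorded, nothing else is loaded
    banner.remove();
  });
}

// On page load: only the technical consent cookie may exist; profiling stays gated.
const choice = readConsent();
if (choice === "accepted") {
  loadProfilingScripts();
} else if (choice === null) {
  showBanner();
}
```

A separate “cookie preferences” section of the website would then let users revisit this choice at any time (see the next question).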

Q: Do I have to grant users the possibility to modify their choices?

A: Yes, a specific section of the website must always be provided so that users can modify their initial choices at any time.

Q: Can I obtain consent by users in other ways?

A: Consent by the user must be free and unambiguous, but there is no single mandatory way to obtain it: you may implement your own system, in accordance with the accountability principle set forth by the GDPR, so long as consent is unambiguous and given through a positive act of the user (“opt-in”). No form of implicit consent is acceptable.

Q: Can I propose the banner again in case the user has declined consent?

A: The excessive and redundant use of banners requesting consent is not allowed – save for certain specific exceptions – since it may induce the user to give consent for the sole purpose of stopping the banner from reappearing.

Q: What about “cookie walls” and “scroll down”?

A: Don’t use them! A “cookie wall” is a mechanism by which the user’s refusal of consent prevents them from accessing the website entirely. A “scroll down” system infers the user’s consent from the mere continuation of browsing, without any express choice regarding cookies. Neither cookie walls nor scroll-down systems are compliant, since neither is aimed at obtaining the user’s express consent.

All clear? If not, reach out to us!

Facial Recognition Technology: Are We Close to a Turning Point?

When people think about facial recognition technology (“FRT”), they immediately imagine the use of their faces to unlock their smartphones. But this technology is far more complicated, useful and potentially dangerous.

First, it is important to understand the difference among “facial detection”, “facial characterization”, “facial identification” and “facial verification”. Such terms have been defined by the non-profit organization Future of Privacy Forum (https://fpf.org/wp-content/uploads/2019/03/Final-Privacy-Principles-Edits-1.pdf) as follows:

  • Facial detection simply distinguishes the presence of a human face and/or facial characteristics without creating or deriving a facial template.
  • In facial characterization the system uses an automated or semi-automated process to discern a data subject’s general demographic information or emotional state, without creating a unique identifier tracked over time.
  • Facial identification is also known as “one-to-many” matching because it searches a database for a reference matching a submitted facial template and returns a corresponding identity.
  • Facial verification, finally, is known as “one-to-one” matching because it confirms an individual’s claimed identity by comparing the template generated from a submitted facial image with a specific known template generated from a previously enrolled facial image (the difference between the two is sketched below).
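The following sketch is purely conceptual and does not reflect any vendor’s actual system: it assumes that faces have already been converted into numeric templates (embeddings) by some upstream model not shown here, and compares templates with cosine similarity against a hypothetical threshold, simply to illustrate how “one-to-one” verification differs from “one-to-many” identification.

```typescript
// Conceptual sketch only: facial templates are represented as numeric vectors
// (embeddings) produced by an upstream model not shown here.
type FaceTemplate = number[];

// Cosine similarity between two templates: 1.0 = same direction, 0 = unrelated.
function similarity(a: FaceTemplate, b: FaceTemplate): number {
  const dot = a.reduce((sum, ai, i) => sum + ai * b[i], 0);
  const norm = (v: FaceTemplate) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

const MATCH_THRESHOLD = 0.8; // hypothetical decision threshold

// One-to-one ("facial verification"): does the submitted template match the
// single template previously enrolled for the claimed identity?
function verify(submitted: FaceTemplate, enrolled: FaceTemplate): boolean {
  return similarity(submitted, enrolled) >= MATCH_THRESHOLD;
}

// One-to-many ("facial identification"): search a whole database of enrolled
// templates and return the identity of the best match, if any exceeds the threshold.
function identify(
  submitted: FaceTemplate,
  database: Map<string, FaceTemplate>
): string | null {
  let bestId: string | null = null;
  let bestScore = MATCH_THRESHOLD;
  for (const [id, template] of database) {
    const score = similarity(submitted, template);
    if (score >= bestScore) {
      bestScore = score;
      bestId = id;
    }
  }
  return bestId; // null means no identity in the database matched
}
```

The privacy implications differ accordingly: verification only compares the submitted template against the single template enrolled for the claimed identity, whereas identification requires building, retaining and searching a database of templates covering many individuals.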

There are many possible uses of facial recognition. In the private sector FRT may be used to keep track of employees’ time and attendance, identify shoppers’ patterns inside stores, implement smart homes, etc. In the public sector, FRT may be used to monitor protests, identify suspects in security footage, check claimed identities at borders, etc.

This relatively new technology brings, besides a wide range of possible implementations, significant concerns regarding privacy, accuracy, race and gender disparities, data storage and security, and misuse. For instance, depending on the quality of the images compared, people may be falsely identified. In addition, in its current state, FRT is less accurate when identifying women compared to men, young people compared to older people, and people of color compared to white people. Privacy is certainly another concern: without strong policies it is unclear how long these images might be stored, who might gain access to them or what they can be used for; not to mention that this technology makes it far easier for government entities to surveil citizens and potentially intrude into their lives (see “Early Thought & Recommendations Regarding Face Recognition Technology”, first report of the Axon AI and Policing Technology Ethics Board, https://www.policingproject.org/axon-fr).

Once the possible implementations and the related risks are understood, the worldwide lack of regulation becomes even more surprising.

Within the European Union, the General Data Protection Regulation obviously applies to FRT. Furthermore, “Guidelines on Facial Recognition” were released on January 28, 2021 by the Consultative Committee of the Council of Europe Convention for the protection of individuals with regard to automatic processing of personal data (https://rm.coe.int/guidelines-on-facial-recognition/1680a134f3). This document includes:

  • Guidelines for legislators and decision-makers;
  • Guidelines for developers, manufacturers and service providers;
  • Guidelines for entities using FRT;
  • Rights of data subjects.

When it comes to Italy, the topic has drawn particular attention through several decisions of the Italian Data Protection Authority. Recognizing the innovative potential of FRT as well as the risks it poses to individual rights, the Authority adopted a more permissive approach regarding the private sector’s use of FRT, while issuing stricter decisions with regard to its use by public authorities. For instance, the Authority allowed the use of FRT by police forces for the purpose of identifying individuals among archived images, but prohibited real-time surveillance using the same technology (see https://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/9040256 and https://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/9575877). On the other hand, the Authority allowed one airport to implement FRT for the purpose of improving the management of passenger flows, so long as images of individuals were not stored (see https://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/8789277).

In light of the complexity of the situation and the need for strong and harmonised legislative action, on April 21, 2021 the European Commission presented its “Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence” (https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52021PC0206). On June 18, 2021, the Proposal was already the subject of a joint opinion by the EDPB and the EDPS (https://edpb.europa.eu/our-work-tools/our-documents/edpbedps-joint-opinion/edpb-edps-joint-opinion-52021-proposal_en), in which they called for a general ban on the use of FRT for:

  • Automated recognition of human features in publicly accessible spaces;
  • Categorization of individuals into clusters according to ethnicity, gender, etc., based on biometric features;
  • Inference of individuals’ emotions.

The European Commission’s initiative is one example of a broader attitude of legislators worldwide towards artificial intelligence in general and FRT in particular. These technologies are ever more present in our lives and are constantly evolving. Consequently, there is a growing demand, from both public and private actors, for clear rules to govern them and to ensure that individual rights are safeguarded. Hopefully, the picture will become clearer over the coming months and years.

Flavio Monfrini / Michele Galluccio

Web Cookies’ Processing: New Guidelines by the Italian DPA

On June 10, 2021 the Italian DPA officially issued new guidelines for the processing of cookies and other online tracking instruments. The newly issued guidelines are aimed at ensuring compliance with the principles set forth by the GDPR, as well as with the recent contributions of the European Data Protection Board. They complement and update the previous guidelines issued in 2014.

The new provisions mainly concern how consent is acquired and the information to be provided to data subjects. In particular:

  • consent by the user must be given in accordance with the principles of freedom and unambiguousness. Accordingly, methods that do not comply with such principles, such as “scroll-down” and “cookie wall” mechanisms, are unlawful and any consent so obtained is void;
  • the “cookie banner” must comply with the “privacy by design” and “privacy by default” principles resulting from article 25 of the GDPR. Consequently, simplified methods for obtaining consent are allowed only to the extent that they meet certain pre-determined requirements;
  • “analytic cookies” can be used without any consent by users only if they do not allow any identification of the person concerned and if they are used for the production of aggregate data only; otherwise, they must be expressly authorized;
  • the information to be provided to users must be specific and comply with articles 12 and 13 of the GDPR.

Data controllers now have a six-month term (expiring in December 2021) to adopt the measures necessary to comply with the new guidelines.

The full text of the measure can be found at the following link: https://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/9677876.

New Data Transfer Standard Contractual Clauses Approved by the EU Commission

On June 4, 2021 the EU Commission approved new standard contractual clauses (“SCC”), which are deemed to provide appropriate safeguards within the meaning of Article 46(1) and (2)(c) of the GDPR.

The new SCC are aligned with the GDPR, reflect the opinions expressed during the consultation phase (including those of the European Data Protection Board and the European Data Protection Supervisor), and take into account the recent Schrems II judgment of the Court of Justice.

There are two different sets of SCC: (i) one for use between controllers and processors subject to the GDPR and (ii) one for transfers of personal data from controllers or processors in the EU/EEA (or otherwise subject to the GDPR) to controllers or processors established outside the EU/EEA (and not subject to the GDPR).

The new SCC promise “more flexibility for complex processing chains, through a ‘modular approach’ and by offering the possibility for more than two parties to join and use the clauses”.

If you or your company are using the old SCC, you have a transition period of 18 months.

Personal Data of Deceased People: Clear Indications by the Italian Data Protection Authority

Access to personal data concerning deceased people can be both a necessity and an issue, especially for their heirs. How is such access currently regulated under Italian law (Legislative Decree no. 196/2003), as amended after the GDPR?

The Italian Data Protection Authority, in its effort to combine data protection rules with clarity, recently issued an outline of Article 2-terdecies of Legislative Decree no. 196/2003.

  • Who is entitled to access? Whoever (i) has an interest of their own; (ii) acts in the interest of the deceased person (who is the “data subject” pursuant to data protection laws); (iii) acts as the deceased’s agent (mandatary); or (iv) acts for family reasons deserving protection.
  • To whom should the request to access data be addressed? The request should be addressed to the relevant Data Controller (i.e., the natural or legal person, public authority, agency or other body, either private or public, which determines the purposes and means of the processing of personal data), also through the Data Processor (i.e., the natural or legal person, public authority, agency or other body which processes personal data on behalf of the Data Controller), where appointed.
  • Which information may be requested? (i) Access to the personal data of the deceased person; (ii) the purposes of the processing; (iii) which data have been communicated and to which recipients; (iv) the retention period; (v) the origin of such data; and (vi) whether the data are subject to automated decision-making (Articles 15-22 of the GDPR).
  • Do you have to pay to access data? No, it is free (unless the request is manifestly unfounded or excessive).
  • Are there any exceptions or limits? Yes, access is not possible where it is prohibited (i) by law or (ii) by the data subject, through an express and unequivocal declaration addressed to the Data Controller. However, even in the latter case, such a declaration cannot prejudice third parties exercising economic (patrimonial) rights arising from the death of the data subject.
  • Do you have to motivate your request? No.
  • How long does it take to get feedback on your request? A maximum of one month from your request, except in certain particular cases provided for by the GDPR.
  • What can you do if your request is refused or receives no feedback? You may turn to the Italian Data Protection Authority or to the competent court.

Access to data concerning deceased people seems to be quite easy in theory. However, balancing patrimonial rights of heirs and assessing “express and unequivocal” declarations of the deceased may prove to be more complex in practice.

Data Protection Day 2021: What You May Have Missed (while busy celebrating Data Protection Day)

This year’s celebrations for Data Protection Day may have been a bit toned down. But you may still have been so busy celebrating that you missed a couple of news items from the (data privacy) world.

First, the EDPB’s Guidelines 01/2021 on Examples regarding Data Breach Notification are out and open for comments until March 2nd. The document can be used as a very practical guide for whoever is involved in data processing activities. It is aimed at helping data controllers decide how to handle data breaches and which factors to consider during risk assessment. The Guidelines reflect the experiences of the European supervisory authorities since the GDPR became applicable and are full of cases and examples, which makes them, by their own admission, a practice-oriented, case-based guide for controllers and processors. So, are you curious to know what to do in case of a ransomware attack with backup but without exfiltration in a hospital? Or perhaps in case of a credential stuffing attack on a banking website? Or are you “just” trying to figure out what to do in case of mistakes in post and mail? Then check out the guidelines!

Meanwhile, in Italy, the Italian Data Protection Authority gave its favourable opinion on the proposed reform of the Italian Registro Pubblico delle Opposizioni, a service designed to protect data subjects whose telephone numbers are publicly available but who do not wish to receive unsolicited direct marketing calls. The Authority nevertheless specified that such service, essentially based on a list of express objections, only applies to marketing activities carried out by human operators and cannot be extended to automated calls. By doing so, the Italian Data Protection Authority confirmed that marketing activities carried out through automated systems must be subject to stricter measures and always require express consent, given their highly invasive nature. So: Humans 1, Automated Calling Machines 0.

COVID-19 Infects Smart Working and Data Protection Rules

The unfortunate spread of COVID-19 throughout Italy led to some interesting legislative measures.

Smart Working

Under a Decree of the Prime Minister adopted on March 1, 2020, employers may have their employees work remotely even without the written individual agreements mandated by Law no. 81/2017.

  • Remote or “smart” working is not mandatory. It is up to the employer, given its responsibility for the organization of the working activity, to decide whether or not to adopt remote working both for employees who work in areas at risk and for employees who live in such areas but work outside.
  • Secondly, for the next six months the principle of consent, on which remote working is ordinarily based, is waived: the employer will be able to arrange this method of working “even in the absence of individual agreements”. If the employee refuses, disciplinary sanctions may be applied. Conversely, the employee may not use smart working without a specific instruction from the employer.
  • With regard to formal requirements, no specific written form is needed: an e-mail or a verbal arrangement may be sufficient.

During this period, smart working will be considered a health and safety measure at work, and employers should provide the relevant IT tools to allow employees to work remotely.

Moreover, last February, before the outbreak of the COVID-19 crisis, Regione Lombardia had already launched a campaign making public funds available to employers that had never implemented smart-working plans. Employers can submit their applications from April 2, 2020 until December 15, 2021, subject to availability of the subsidies. We can assist employers in defining the relevant plan.

Data Protection

Ordinance no. 630, adopted on February 3, 2020 as an emergency measure to counter the coronavirus, has been approved by the Italian Data Protection Authority. Somewhat surprisingly, it lowers the protection of individuals in light of the public interest.

More specifically, the Italian Data Protection Authority pointed out that, pursuant to Article 9 of the GDPR, certain personal data may be legitimately processed for reasons of public interest in the area of public health – particularly in the case of serious cross-border threats to health – provided that appropriate measures are in place to protect the rights of the individuals concerned, with specific regard to professional secrecy.

In light of the above and considering the ongoing COVID-19 crisis, the measures taken allow personal mobile communication and geolocation data to be analysed in order to trace connections and contacts among individuals. However, the decision does not set forth specific safeguards to protect the rights of the individuals concerned.