Tag Archives: Italian Data Protection Authority

GARANTE VS. CHATGPT: LATEST DEVELOPMENTS

1. An Order to Stop ChatGPT

On March 30, 2023, the Italian Data Protection Authority (“Garante”) issued an order temporarily banning the ChatGPT platform (“ChatGPT”) operated by OpenAI LLC (“OpenAI”). The Garante regards ChatGPT as infringing Articles 5, 6, 8, 13 and 25 of the GDPR. In particular:

  • No Information.  OpenAI provides no information to users whose data it collects and processes via ChatGPT;
  • No Legal Basis.  There is no appropriate legal basis for the collection of personal data and its processing for the purpose of training the algorithms underlying the operation of ChatGPT;
  • No Check of Users’ Age.  OpenAI does not perform any verification of users’ age in relation to the ChatGPT service, nor does it apply any filter prohibiting use by users aged under 13.

Accordingly, the Garante immediately banned the use of ChatGPT, and OpenAI blocked access to ChatGPT from Italy.

2. Measures Offered by OpenAI

On April 11, 2023, in light of the willingness expressed by OpenAI to put in place measures to protect the rights and freedoms of ChatGPT users, the Garante issued a new order, which opened the possibility of re-assessing ChatGPT if OpenAI adopts the following measures:

  1. to draft and publish an information notice to data subjects, which should be linked so that it can be read before the registration;
  2. to make available, at least to data subjects connecting from Italy, a tool to exercise their rights to (i) object, (ii) obtain rectification, insofar as such data have been obtained from third parties, or (iii) obtain the erasure of their personal data;
  3. to change the legal basis of the processing of users’ personal data for the purpose of algorithmic training, by removing any reference to contract and instead relying on consent or legitimate interest;
  4. to include a request to all users connecting from Italy to go through an “age gate” and to submit a plan for the deployment of age verification tools; and
  5. to promote a non-marketing-oriented information campaign by May 15, 2023 on all the main Italian mass media, the content of which shall be agreed upon with the Italian Authority.

OpenAI has until April 30, 2023 to comply (and until May 31, 2023 to prepare a plan for age verification tools). The Garante’s objections have been echoed by other European Union data protection authorities. The European Data Protection Board will attempt to resolve the dispute within two months and has launched a dedicated task force on ChatGPT “to exchange information on possible enforcement actions conducted by data protection authorities”.

Italian Transparency Act: the Opinion of the Italian Data Protection Authority

The Italian Data Protection Authority has issued its opinion on the data protection implications of the new information duties imposed on employers by legislative decree 104/2022.

On August 13, 2022, legislative decree 104/2022 (the “Transparency Act”) entered into force. It provides for a new set of mandatory information that employers must communicate to their employees at the time of onboarding. On January 24, 2023, the Italian Data Protection Authority (“Garante”) issued its opinion on the compliance of these new information duties with the relevant data protection legislation.

In particular, the Garante focused on the mandatory communication that, according to section 4, paragraph 8 of the Transparency Act, the employer must give to employees if any “decision or monitoring automated system is used for the sake of providing information which is relevant for the hiring, management or termination of the employment relationship, for the assignment of tasks and duties, or for the surveillance, evaluation and fulfillment of contractual duties by the employee”. The Garante has stated that:

  • GDPR Sanctions Apply in Case of Breach.  The implementation of any decision or monitoring automated system must be made in compliance with, and within the limits set forth by, the applicable labor law provisions, and in particular law 300/1970. Such labor law provisions, which allow the implementation of automated systems only if certain conditions are met, must be deemed to provide “more specific rules to ensure the protection of the rights and freedoms in respect of the processing of employees’ personal data in the employment context” (as per section 88, paragraph 2, of the GDPR), and thus non-compliance with them may lead to administrative fines pursuant to section 83 of the GDPR.
  • Data Protection Impact Assessment (“DPIA”).  The employer, who is subject to the duty of accountability, must assess beforehand whether the relevant processing is likely to result “in a high risk to the rights and freedoms of natural persons”, and thus requires a preliminary data protection impact assessment under section 35 of the GDPR. In this regard, the Garante has clarified that the data subjects (i.e., employees) should be deemed “vulnerable”, and that the processing of their data with automated systems is very likely to meet the conditions that make the DPIA mandatory according to the guidelines on the DPIA issued by the Article 29 Working Party on April 4, 2017.
  • Compliance with the “privacy by default” and “privacy by design” principles.  Employers must implement appropriate technical and organizational measures and integrate the necessary safeguards into the processing so as to protect the rights of data subjects (privacy by design). Moreover, the controller shall ensure that, by default, only personal data which are necessary for the specific purpose of the processing are processed (privacy by default), and should therefore refrain from collecting personal data that are not strictly related to the specific purpose of the relevant processing.
  • Update of the register of processing activities (“ROPA”).  The employer must record the processing of data through automated systems in its ROPA.

Need any further assistance on the matter? Don’t hesitate to reach out to us!

Data Protection Day 2021: What You May Have Missed (while busy celebrating Data Protection Day)

This year’s celebrations for Data Protection Day may have been a bit toned down. But you may still have been so busy celebrating that you missed a couple of news items from the (data privacy) world.

First, the EDPB’s Guidelines 01/2021 on Examples regarding Data Breach Notification are out and open for comments until March 2nd.  The document can be used as a very practical guide by anyone involved in data processing activities. It is aimed at helping data controllers decide how to handle data breaches and which factors to consider during risk assessment. The Guidelines reflect the experiences of the European supervisory authorities since the GDPR became applicable, and they are full of cases and examples which make them, admittedly, a practice-oriented, case-based guide for controllers and processors. So, are you curious to know what to do in case of a ransomware attack with backup but without exfiltration in a hospital? Or perhaps in case of a credential stuffing attack on a banking website? Or are you “just” trying to figure out what to do in case of mistakes in post and mail?  Then check out the Guidelines!

Meanwhile, in Italy, the Italian Data Protection Authority gave its favourable opinion on the proposed reform of the Italian Registro Pubblico delle Opposizioni, a service designed for the protection of data subjects whose telephone numbers are publicly available but who do not wish to receive unsolicited direct marketing calls from an operator. Nevertheless, the Italian Data Protection Authority specified that this service, essentially based on a list of express dissents, only applies to marketing activities carried out by human operators and cannot be extended to automated calls. By doing so, the Italian Data Protection Authority confirmed that marketing activities carried out through automated systems must be subject to stricter measures and always require express consent, given their highly invasive nature. So: Humans 1, Automated Calling Machines: 0.

Italy’s First Multi-Million GDPR Sanctions

Before last week, the Italian Data Protection Authority (“DPA”) had applied only one (modest) GDPR sanction, which placed Italy at the bottom of the list of EU Countries by number and value of GDPR sanctions applied.

In addition to the great differences in numbers and figures – for example, compared with soon-to-leave UK (sanctions’ amounts in Euro: Italy 30k vs. UK 315mln+) or Spain (number of sanctions: Italy 1 vs. Spain 43) – it is interesting to note that, until last Friday, the most active European DPAs (UK, France, Germany, Spain) tended to target big players in the private sector (e.g. British Airways, Marriott International, Google), as opposed to Italy’s attention to websites affiliated with a political party and run through the platform named Rousseau.

Last Friday, however, this scenario changed significantly. The Italian DPA issued a press release announcing two GDPR sanctions, of Euro 8.5 million and Euro 3 million, applied to Eni Gas e Luce, a fully-owned subsidiary of Italy’s State-controlled multinational oil and gas company, Eni S.p.A.

The first sanction, of Euro 8.5 million, was imposed for unlawful processing in connection with telemarketing and tele-selling activities. The inspections and inquiries had been carried out by the authorities in response to several alerts and complaints that followed GDPR D-Day.

Violations included: advertising calls made without consent or despite data subjects’ refusal; absence of technical and organisational measures to take into account the instructions provided by data subjects; excessive data retention periods; and the acquisition of personal data of prospective customers from third parties which had not obtained consent.

The second sanction, of Euro 3 million, relates to unsolicited contracts for the supply of electricity and gas. Many individuals complained that they had learned about their new contracts only upon receipt of the termination letter from the previous supplier or of the first bill from Eni Gas e Luce. Complaints included alleged incorrect data and false signatures.

About 7,200 consumers were affected. The Italian DPA also underlined the role of third-party contractors, acting on behalf of Eni Gas e Luce, in perpetrating the violations.

Both decisions are quite significant: for the very first time, the Italian DPA has illustrated its approach to data processing and violations by large companies operating in the private sector within the GDPR regulatory framework.

Don’t Forget to Close E-mail Accounts of Employees who Leave. And Happy Holidays!

The Italian Data Protection Authority has recently reiterated what to do when an employee leaves the company, i.e.:

  • Close down email accounts attributable to the former employee;
  • Adopt automatic response systems indicating alternative addresses to those who contact the mailbox; and
  • Introduce technical measures to prevent the display of incoming messages to unauthorized subjects.

The automatic forwarding of emails to colleagues of the former employee amounts to a breach of data protection principles, which require the employer to protect the confidentiality even of former workers.

In the case decided by the Authority, the e-mail account had remained active for over a year and a half after the end of the employment relationship, and was deactivated only after a formal complaint filed by the worker.

Our life sciences team at Gitti and Partners wishes you a relaxing Christmas break and a 2020 full of happy innovation, useful technology and interesting legal developments!

May 25, 2018: Did You Survive the GDPR D-Day?

Last May 25 the GDPR became applicable. It was hard not to notice, given the inundation of emails that everyone received, as well as the clear signs of burnout in the eyes of GDPR experts.

Here are my personal top 3 takeaways from that experience:

  • The flood of data protection emails received on May 25 showed me how my data had been disseminated all over the place and archived for a really long time. I had some recollection of only a few of those who wrote to me to share their most recent privacy policy (and to remind me how deeply, deeply they care about privacy!), since many may have bought, inherited or simply collected my data a long time ago. It reminded me that data subjects’ rights are an empowering tool, which I intend to use more frequently in the future.


  • The Law (capital “L”) showed its full might and power on May 25, something which surprised even those, like me, who work with legal requirements all day every day. Look at what companies do when you threaten a 4% fine on their worldwide turnover! (Incidentally, this reminded me why politics is important and why people who are indifferent to politics are wrong: this stuff does make a difference in our lives).


  • The Italian authorities (mostly the government and parliament) lost yet another opportunity to be helpful to citizens. We had been waiting for a national data protection law for months, but no such law was enacted before May 25. Until that happens, Italians are supposed to assess, for each and every provision of the Data Protection Code, whether or not it conflicts with the GDPR. How practical.

2017 New Year’s Privacy Resolution: Road to Compliance with the New European Privacy Framework

The year 2017 has already brought us some exciting change. The beginning of the year is also the perfect time for appraisals of the past and resolutions for the near future. Whether we see it as a welcome enhancement of personal data rights or simply as another burdensome European set of requirements, 2016 delivered the new European General Data Protection Regulation (Regulation EU 2016/679, “GDPR”). Already, 233 days have passed since the GDPR entered into force, and 498 days are left until the new Regulation starts to apply on May 25, 2018. Roughly one third of the time given to comply with the new regulatory framework has already gone by. Perhaps, then, the beginning of 2017 is a good chance to ask ourselves what has already been done in the first 233 days and what still needs to be done in the next 498 days in order not to miss the May 2018 deadline.

The GDPR imposes a much more burdensome level of compliance requirements on companies acting as data controllers and data processors.

Some of them require the assessment and preparation of organizational and implementing measures that need to be put in place well in advance of May 2018.

  • Data controllers and data processors must, in the cases provided for by the GDPR, appoint a data protection officer (“DPO”). The controller and the processor shall ensure that the DPO is involved, properly and in a timely manner, in all issues which relate to the protection of personal data. They shall support the DPO in performing his/her tasks by providing the resources necessary to carry out those tasks, access to personal data and processing operations, and the means to maintain his/her expert knowledge. The controller and processor shall also ensure that the DPO does not receive any instructions regarding the exercise of those tasks. Furthermore, the DPO shall not be dismissed or penalized by the controller or the processor for performing his/her tasks and shall report directly to the highest management level of the controller or the processor.
  • Data protection by design and by default will have to be implemented. The data controller: (i) both at the time of the determination of the means for processing and at the time of the processing itself, must “implement appropriate technical and organizational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimization, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of [the GDPR] and protect the rights of data subjects” and (ii) “to implement appropriate technical and organizational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed”.
  • A data protection impact assessment must be carried out where the processing is likely to result in a high risk to the rights and freedoms of natural persons. Such impact assessment must contain: a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller; an assessment of the necessity and proportionality of the processing operations in relation to the purposes; an assessment of the risks to the rights and freedoms of data subjects; and the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with the GDPR.
  • Data controllers must guarantee the effectiveness of the data subject’s right to be forgotten and right to portability. This requires an assessment of the adequacy of the technical and organizational instruments currently available and, possibly, their improvement. More specifically, data controllers must be able to fulfill: (i) in relation to the right to be forgotten, their obligation to “take reasonable steps, including technical measures, to inform controllers which are processing the personal data that the data subject has requested the erasure by such controllers of any links to, or copy or replication of, those personal data”; (ii) as regards the right to portability, their obligation to allow data subjects to effectively exercise their right to “receive the personal data concerning him or her, which he or she has provided to a controller, in a structured, commonly used and machine-readable format and have the right to transmit those data to another controller”.
  • Data controllers shall notify personal data breaches to the relevant supervisory authority without undue delay and, where feasible, not later than 72 hours after having become aware of them. This requires controllers to prepare appropriate notification forms, as well as organizational measures guaranteeing adequate resources to complete the task.
  • The mandatory content of the written contract between the data controller and the data processor requires a revision of all such contracts. They shall include, inter alia, the obligations of the processor to: process the personal data only on documented instructions from the controller, including with regard to transfers of personal data to a third country or an international organization; ensure that persons authorized to process the personal data have committed themselves to confidentiality or are under an appropriate statutory obligation of confidentiality; delete or return all the personal data to the controller after the end of the provision of services relating to processing, including copies; make available to the controller all information necessary to demonstrate compliance with the obligations under GDPR; allow for and contribute to audits, including inspections, conducted by the controller or another auditor mandated by the controller.
  • Information notice forms currently in use will need to be revised. In fact, information to be provided to data subjects must include, inter alia: the contact details of the DPO; the legal basis for the processing; the fact that the controller intends to transfer personal data to a third country or international organization and the existence or absence of an adequacy decision by the Commission; the period for which the personal data will be stored, or if that is not possible, the criteria used to determine that period; the existence of the right to data portability; the existence of the right to withdraw consent at any time for processing based on consent; the existence of the right to lodge a complaint with a supervisory authority; the existence of automated decision-making, including profiling.
  • Data controllers and data processors must keep a record of processing activities under their responsibility. Records kept by data controllers shall contain all of the following information: the name and contact details of the controller and, where applicable, the joint controller, the controller’s representative and the DPO; the purposes of the processing; a description of the categories of data subjects and of the categories of personal data; the categories of recipients to whom the personal data have been or will be disclosed, including recipients in third countries or international organizations; where applicable, transfers of personal data to a third country or an international organization, including the identification of that third country or international organization and the documentation of suitable safeguards; where possible, the envisaged time limits for erasure of the different categories of data; and, where possible, a general description of the technical and organizational security measures. Records kept by data processors shall include: the name and contact details of the processor or processors and of each controller on behalf of which the processor is acting, and, where applicable, of the controller’s or the processor’s representative, and the DPO; the categories of processing carried out on behalf of each controller; where applicable, transfers of personal data to a third country or an international organization, including the identification of that third country or international organization and the documentation of suitable safeguards; and, where possible, a general description of the technical and organizational security measures. Data controllers and data processors shall therefore dedicate and organize resources so as to be able to start keeping such records.
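For readers who maintain such records electronically, the controller-side fields listed above can be sketched as a simple data model. The sketch below is purely illustrative: the class name, field names and sample values are our own assumptions, not an official schema or a legal template.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative sketch of one controller-side ROPA entry, mirroring the
# fields listed above (name and contact details, purposes, categories of
# data subjects, data and recipients, transfers, time limits, security
# measures). All names and values are hypothetical.
@dataclass
class ControllerRecord:
    controller_name: str
    controller_contact: str
    purposes: List[str]
    data_subject_categories: List[str]
    personal_data_categories: List[str]
    recipient_categories: List[str]
    dpo_contact: Optional[str] = None                    # where applicable
    third_country_transfers: List[str] = field(default_factory=list)
    transfer_safeguards: Optional[str] = None            # where applicable
    erasure_time_limits: Optional[str] = None            # "where possible"
    security_measures: Optional[str] = None              # "where possible"

record = ControllerRecord(
    controller_name="Acme S.p.A.",
    controller_contact="privacy@acme.example",
    purposes=["management of the employment relationship"],
    data_subject_categories=["employees"],
    personal_data_categories=["contact data", "payroll data"],
    recipient_categories=["payroll provider"],
)
print(record.controller_name)
```

A processor-side record would carry the analogous fields listed above for processors (each controller served, categories of processing, transfers, security measures).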

All this may appear daunting. Nevertheless, 498 days are more than enough to take all the necessary steps, if we let one of our New Year’s resolutions be to walk the road to GDPR compliance in good time.

Is Privacy Really a Fundamental Right?

Privacy of individuals is framed as a fundamental right in the European Union. Indeed, the new European Union Regulation no. 2016/679 reiterates this in the very first of its recitals.

Yet, it is clear to everyone that such “fundamental” nature is regularly questioned by various factors, and particularly:

  • Technological progress, coupled with people’s growing addiction to smartphones, which allows the collection of an astonishing amount of personally identifiable information and leads to large banks of intrusive data; and
  • Security threats that prompt governments to closely monitor citizens’ behavior.

Once upon a time, courts were called to decide how to balance conflicting rights. These days, the act of balancing privacy against other interests has become much more common and lies in the hands of a variety of subjects, such as data controllers, who must carry out a data protection impact assessment under Section 35 of EU Regulation no. 2016/679, and data protection authorities, who provide both general guidelines and specific advice.

A couple of recent decisions by the Italian Data Protection Authority have led me to believe that the Authority is readier than before to accept that there are justified limits to the right to privacy:

  • On July 14, 2016, the Italian Data Protection Authority decided that a bank is allowed to analyze behavioral/biometric information regarding its customers (such as mouse movements or pressure on the touch screen) as a measure to fight identity theft and internet banking fraud. Of course, a number of limitations were set by the Authority, in addition to the consent of the customer/data subject, such as specific safety measures, purpose and time limitations, and the segregation of customer names from the bank’s IT provider.
  • On July 28, 2016, the same Authority granted its favorable opinion on the use of face recognition software at the Olimpico stadium during soccer games, in order to check that the data on the ticket and the face of the person actually attending the event correspond. Provided that strong security measures are used and that the processing is carried out by police forces, the processing was deemed necessary.

A tougher stance, instead, has been adopted by the Italian Data Protection Authority in cases of processing for marketing purposes, as in this decision, for example. (I note, however, that the code of conduct applying to data processing for the purposes of commercial information, which will enter into force on October 1, 2016, blessed by the Italian Data Protection Authority, continues to allow the sending of commercial communications to individuals whose personal data is included in public listings, even without the data subject’s express consent.)

Balancing rights and interests is inherent to law and justice. It remains to be seen, considering the obvious (and absolutely reasonable) limitations to which the right to privacy is subject, whether it will continue to make sense to frame it as a “fundamental” right.

Medical Apps and the Law, Part II – Medical Apps: Helpful or Harmful?

A BOOMING MARKET. The idea of running software with healthcare uses on a mobile device has been discussed since as early as 1996[1]. However, the phenomenon has assumed explosive proportions in recent years, thanks to the spreading of an “app mentality” among health care professionals and consumers, and its potential, given cloud computing, social networks and big data analytics, may be yet to be realized. According to a March 2014 BCC report, this growth will continue in the coming years[2]. App stores offering thousands of medical apps also confirm the trend, with about 97,000 mobile health apps available in 62 app stores according to a Research2Guidance market report from last year. Hardware manufacturers are certainly not immune to the medical app fervor: for example, the new Gear 2 Neo smartwatch, launched by Samsung on April 11, 2014 in 125 countries, incorporates a heart rate sensor.


ACCORDING TO THE EU COMMISSION, MEDICAL APPS AND E-HEALTH HAVE GREAT POTENTIAL.  What is the view of the authorities on this phenomenon? The potential of apps makes them app enthusiasts; the reality of the app market worries them. The European Commission believes in medical apps, which can be leveraged to eliminate barriers to smarter, safer, patient-centred health services. Further, digital health could also be a promising way to cut Member States’ budgets[3] while – in the words of the Commission – “putting patients in the driving seat”[4]. The reality of the app market, however, does not necessarily boost patient empowerment. In fact, the Commission noted that there are substantial risks connected with the way apps are currently marketed: information to consumers is not clear, the trader’s contact details are not easy to find, and the use of the term “free” is often misleading[5].


ENFORCEMENT ACTION BY THE ITALIAN DATA PROTECTION AUTHORITY. On September 10, 2014, the Italian Data Protection Authority issued a warning regarding the data protection risks inherent in medical apps (“Medical Apps: More Transparency Is Needed On Data Use”), promising future sanctions. The Authority found insufficient information provided to users prior to installation, as well as the processing of excessive data. The survey conducted by the Italian Data Protection Authority involved a total of 1,200 apps and the findings were striking: (i) barely 15% of them provided meaningful privacy notices; and (ii) in 59% of the apps reviewed the Authority found it hard to locate pre-installation privacy notices. The stance taken by the Italian Data Protection Authority echoes Opinion 02/2013 of the “Article 29 Data Protection Working Party”, which had identified: lack of transparency; lack of free and informed consent; poor security measures; and disregard for the principle of purpose limitation, which requires processing of personal data only for specific and legitimate purposes.


CONSENT IN WRITTEN FORM: A REQUIREMENT PECULIAR TO ITALIAN LAW.  Italian legislation includes a couple of additional requirements which could kill the medical app market. We note, however, that they were not mentioned by the Italian Data Protection Authority in its September 10, 2014 warning, so it is unclear whether there is any appetite for enforcing them. In addition to a specific authorization by the Data Protection Authority, typically substituted by a general authorization such as this, Section 23 of the Data Protection Code requires that consent to the processing of sensitive data, such as health data, be given in written form – a requirement which is not satisfied by a mere “click” on the smartphone, but only by a digital or qualified electronic signature in accordance with Italian legislation. This obstacle could be overcome only when (and if) the proposed EU Data Protection Regulation enters into force and repeals the existing Italian Data Protection Code: consent to the processing of sensitive data will have to be “freely given, specific, informed and explicit” and the controller will bear the burden of proving such consent, but consent in written form would no longer be required.

[1] “Regulation of health apps: a practical guide”, d4Research, January 2012, citing material from the Conference of the American Medical Informatics Association Fall Symposium of 1996.

[2] “This market is expected to grow to $2.4 billion in 2013 and $21.5 billion in 2018 with a compound annual growth rate (CAGR) of 54.9% over the five-year period from 2013 to 2018”.

[3] “In Italy, overall savings from the introduction of ICTs in the Health Sector are estimated to be around 11.7% of National health expenditure (i.e., €12.4 billion). Savings from digital prescriptions alone are estimated to be around €2 billion”. European Commission Memo of December 7, 2012 “eHealth Action Plan 2012-2020: Frequently Asked Questions”.

[4] It should be noted that, while the Commission is a fervent proponent of eHealth (see also the recent Green Paper on mHealth), there are strong limitations to its actions, given its lack of competence in healthcare delivery and financing, which is entirely up to Member States. The effectiveness of eHealth solutions in Europe requires the commitment of Member States to implement organizational changes that make patient-centric eHealth solutions an integral part of their healthcare systems, a task that each Member State is pursuing with varying degrees of commitment. A March 24, 2014 press release by the European Commission, commenting on two European surveys on the use of eHealth (including Electronic Health Records, Health Information Exchange, Tele-health and Personal Health Records), showed that many critical issues still exist: lack of penetration, lack of interoperability, and lack of regulatory certainty, to name a few.

[5] The focus of the Italian Antitrust Authority has so far been on game apps, rather than medical apps: it, too, found that apps were misleadingly presented to users as free, while they were not.