MAKING HEADLINES


GREECE IMPOSES CRIMINAL SANCTIONS FOR GDPR OFFENCES

dpofficer · October 20, 2020 · 1 min read



Greece has introduced criminal sanctions for GDPR offences, with penalties of up to 20 years’ imprisonment, covering six key infringements:

Infringement 1: Unauthorised data processing (e.g., unauthorised access, copying, deletion, or transmission).

Criminal sanction: Up to one year’s imprisonment.

Infringement 2: A DPO violating their confidentiality duty in order to gain a benefit or cause harm.

Criminal sanction: Up to one year’s imprisonment.

Infringement 3: Unauthorised data processing and further transmission to other unauthorised individuals.

Criminal sanction: Up to five years’ imprisonment.

Infringement 4: Unauthorised data processing of sensitive data.

Criminal sanction: One to five years’ imprisonment and a monetary penalty of up to EUR 100,000.

Infringement 5: Unauthorised data processing in order to gain a benefit or cause harm, of value greater than EUR 120,000.

Criminal sanction: Five to ten years’ imprisonment.

Infringement 6: Unauthorised data processing that relates to the functioning of the Greek state or national security.

Criminal sanction: Five to twenty years’ imprisonment and a monetary penalty of up to EUR 300,000.
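
For compliance teams that track exposure programmatically, the schedule above can be captured as a simple lookup table. A minimal sketch in Python; the key names and field layout are our own shorthand, not terminology from the Greek law:

    # Summary of the sanction tiers above as a lookup table. Key names and
    # structure are illustrative shorthand, not terms from the Greek statute.
    SANCTIONS = {
        "unauthorised_processing":    ("up to 1 year", None),
        "dpo_confidentiality_breach": ("up to 1 year", None),
        "onward_transmission":        ("up to 5 years", None),
        "sensitive_data":             ("1-5 years", 100_000),
        "gain_or_harm_over_eur_120k": ("5-10 years", None),
        "state_or_national_security": ("5-20 years", 300_000),
    }

    def describe(infringement: str) -> str:
        """Return a human-readable worst case for an infringement key."""
        prison, fine_eur = SANCTIONS[infringement]
        suffix = f" plus a fine of up to EUR {fine_eur:,}" if fine_eur else ""
        return f"{prison} imprisonment{suffix}"

    print(describe("sensitive_data"))
    # -> 1-5 years imprisonment plus a fine of up to EUR 100,000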

courtesy of WSGR

AUSTRALIA’S CONSUMERS INCREASINGLY CONCERNED ABOUT DATA MISUSE

dpofficer · October 20, 2020 · 3 min read



New research from OpenText reveals that more than two in five (44%) Australian consumers would pay more to do business with an organisation that is committed to protecting their personal data, compared with 40 per cent of consumers globally.

The new data – from a survey of 1,000 Australian respondents – highlights public uncertainty and distrust around how organisations handle their personal data. Just seven per cent of the Australian public place trust in the ability of organisations to keep their personal data safe or private.

This is despite increasingly stringent standards for data privacy as new regulations emerge worldwide, including the 2018 introduction of the EU General Data Protection Regulation (GDPR). Severe GDPR infringements can result in fines of up to €20m or four per cent of a company’s total global annual turnover, whichever is higher.
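
To make the “whichever is higher” mechanics concrete, here is a minimal sketch of the cap calculation; the turnover figure is an invented example:

    # Severe GDPR infringements: the cap is EUR 20m or 4% of total global
    # annual turnover, whichever is higher (Art. 83(5) GDPR).
    def max_gdpr_fine(annual_turnover_eur: float) -> float:
        return max(20_000_000.0, 0.04 * annual_turnover_eur)

    # An invented example: a company with EUR 2bn global turnover faces a
    # cap of EUR 80m rather than the EUR 20m floor.
    print(f"EUR {max_gdpr_fine(2_000_000_000):,.0f}")  # -> EUR 80,000,000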

Getting to grips with data privacy

The majority (80%) of Australians do not know how many organisations use, store or have access to their personal data, including their email addresses, contact numbers and bank details. Yet 70 per cent of Australian consumers are very or somewhat aware of the laws that protect their personal data, compared with 73 per cent globally.

In fact, almost a third (30%) of Australian consumers say they would proactively get in touch with an organisation to see how it is using their personal data or to check if it is storing their personal data in a compliant manner. More than one in ten (14%) have already done so at least once.

“The COVID-19 crisis has accelerated the pace of digital transformation, as companies have moved to remote work and digital customer experiences,” said Lou Blatt, Senior Vice President and CMO at OpenText. “Digital is now central to almost every business interaction – generating more data for companies to manage and secure. This shift coupled with increased consumer data privacy expectations means organisations are now under pressure to ensure that their data privacy solutions can scale appropriately for this digital-first era.”

Taking responsibility for data privacy

Three in five (60%) Australian consumers feel they know how to keep their own data private and secure on apps, email accounts and social media platforms, from using privacy settings to turning off geolocation. However, one-fifth (20%) believe keeping their data private and secure on apps, email accounts and social media is the responsibility of the app or company in question.

Just one in ten (9%) Australian consumers believe we are already at the point where every business is meeting its legal obligations to keep customer data private - fewer than in Spain (17%), Germany (13%) and France (11%) - while almost a quarter (23%) of the Australian public believe this is a long way off or will never happen.

“Beyond potential fines, any organisation that fails to comply with data privacy laws risks losing the trust of their customers,” said Albert Nel, Vice President Asia Pacific, OpenText. “Leaders must leverage technology that not only provides visibility into how they capture and secure data, but also allows them to respond rapidly to customers’ requests on how their personal data is being processed, collected, and used. By investing in comprehensive privacy management solutions that automate and integrate an organisation’s privacy policies with data privacy and protection principles, organisations can satisfy regulatory requirements, reduce the risk of reputational harm, and maintain customer trust.”

Methodology

This research, commissioned by OpenText, was conducted through Google Surveys from April to May 2020. A total of 12,000 consumers were anonymously surveyed globally, across the UK, France, Germany, Spain, Canada, Australia and Singapore.

The Australian research polled 1,000 respondents to offer a snapshot of consumer perspectives on data privacy during the coronavirus crisis.

courtesy of OPENTEXT

AEGEAN MARINE PETROLEUM FINED €150,000

dpofficer · October 19, 2020 · 1 min read



Aegean Marine Petroleum Network Inc. failed to inform data subjects that their data would be processed and stored on its servers. Moreover, the company failed to implement the necessary technical measures to secure the processing of such large amounts of data, and failed to enforce a separation between the relevant software and the data stored on the servers. As a result, companies outside the Aegean Marine Petroleum Group had access to these servers and, implicitly, to the personal data of data subjects, which they copied from the servers.

  • Country: Greece
  • Authority: Hellenic Data Protection Authority (HDPA)
  • Fine: €150,000
  • Organization fined: Aegean Marine Petroleum Network Inc.
  • Articles violated: Art. 5, Art. 6 and Art. 32 GDPR
  • Type: Failure to comply with data processing principles


CHINA’S UNEXPECTED QUEST TO PROTECT DATA PRIVACY

dpofficer · September 11, 2020 · 16 min read



Late in the summer of 2016, Xu Yuyu received a call that promised to change her life. Her college entrance examination scores, she was told, had won her admission to the English department of the Nanjing University of Posts and Telecommunications. Xu lived in the city of Linyi in Shandong, a coastal province in China, southeast of Beijing. She came from a poor family, singularly reliant on her father’s meagre income. But her parents had painstakingly saved for her tuition; very few of her relatives had ever been to college.

A few days later, Xu received another call telling her she had also been awarded a scholarship. To collect the 2,600 yuan ($370), she needed to first deposit a 9,900 yuan “activation fee” into her university account. Having applied for financial aid only days before, she wired the money to the number the caller gave her. That night, the family rushed to the police to report that they had been defrauded. Xu’s father later said his greatest regret was asking the officer whether they might still get their money back. The answer - “Likely not” - only exacerbated Xu’s devastation. On the way home she suffered a heart attack. She died in a hospital two days later.

An investigation determined that while the first call had been genuine, the second had come from scammers who’d paid a hacker for Xu’s number, admissions status, and request for financial aid. For Chinese consumers all too familiar with having their data stolen, Xu became an emblem. Her death sparked a national outcry for greater data privacy protections.

Only months before, the European Union had adopted the General Data Protection Regulation (GDPR), an attempt to give European citizens control over how their personal data is used. Meanwhile, Donald Trump was about to win the American presidential election, fuelled in part by a campaign that relied extensively on voter data. That data included details on 87 million Facebook accounts, illicitly obtained by the consulting firm Cambridge Analytica. Chinese regulators and legal scholars followed these events closely.

In the West, it’s widely believed that neither the Chinese government nor Chinese people care about privacy. US tech giants wield this supposed indifference to argue that onerous privacy laws would put them at a competitive disadvantage to Chinese firms. In his 2018 Senate testimony after the Cambridge Analytica scandal, Facebook’s CEO, Mark Zuckerberg, urged regulators not to clamp down too hard on technologies like face recognition. “We still need to make it so that American companies can innovate in those areas,” he said, “or else we’re going to fall behind Chinese competitors and others around the world.”

In reality, this picture of Chinese attitudes to privacy is out of date. Over the last few years the Chinese government, seeking to strengthen consumers’ trust and participation in the digital economy, has begun to implement privacy protections that in many respects resemble those in America and Europe today.

Even as the government has strengthened consumer privacy, however, it has ramped up state surveillance. It uses DNA samples and other biometrics, like face and fingerprint recognition, to monitor citizens throughout the country. It has tightened internet censorship and developed a “social credit” system, which punishes behaviors the authorities say weaken social stability. During the pandemic, it deployed a system of “health code” apps to dictate who could travel, based on their risk of carrying the coronavirus. And it has used a slew of invasive surveillance technologies in its harsh repression of Muslim Uighurs in the northwestern region of Xinjiang.

This paradox has become a defining feature of China’s emerging data privacy regime, says Samm Sacks, a leading China scholar at Yale and New America, a think tank in Washington, DC. It raises a question: Can a system endure with strong protections for consumer privacy, but almost none against government snooping? The answer doesn’t affect only China. Its technology companies have an increasingly global footprint, and regulators around the world are watching its policy decisions.

November 2000 arguably marks the birth of the modern Chinese surveillance state. That month, the Ministry of Public Security, the government agency that oversees daily law enforcement, announced a new project at a trade show in Beijing. The agency envisioned a centralised national system that would integrate both physical and digital surveillance using the latest technology. It was named Golden Shield. Eager to cash in, Western companies including American conglomerate Cisco, Finnish telecom giant Nokia, and Canada’s Nortel Networks worked with the agency on different parts of the project. They helped construct a nationwide database for storing information on all Chinese adults, and developed a sophisticated system for controlling information flow on the internet - what would eventually become the Great Firewall. Much of the equipment involved had in fact already been standardised to make surveillance easier in the US - a consequence of the Communications Assistance for Law Enforcement Act of 1994.

Despite the standardised equipment, the Golden Shield project was hampered by data silos and turf wars within the Chinese government. Over time, the ministry’s pursuit of a singular, unified system devolved into two separate operations: a surveillance and database system, devoted to gathering and storing information, and the social-credit system, which some 40 government departments participate in. When people repeatedly do things that aren’t allowed - from jaywalking to engaging in business corruption - their social-credit score falls and they can be blocked from things like buying train and plane tickets or applying for a mortgage.

In the same year the Ministry of Public Security announced Golden Shield, Hong Yanqing entered the ministry’s police university in Beijing. But after seven years of training, having received his bachelor’s and master’s degrees, Hong began to have second thoughts about becoming a policeman. He applied instead to study abroad. By the fall of 2007, he had moved to the Netherlands to begin a PhD in international human rights law, approved and subsidized by the Chinese government.

Over the next four years, he familiarised himself with the Western practice of law through his PhD research and a series of internships at international organisations. He worked at the International Labor Organisation on global workplace discrimination law and the World Health Organisation on road safety in China. “It’s a very legalistic culture in the West - that really strikes me. People seem to go to court a lot,” he says. “For example, for human rights law, most of the textbooks are about the significant cases in court resolving human rights issues.” Hong found this to be strangely inefficient. He saw going to court as a final resort for patching up the law’s inadequacies, not a principal tool for establishing it in the first place. Legislation crafted more comprehensively and with greater forethought, he believed, would achieve better outcomes than a system patched together through a haphazard accumulation of case law, as in the US.

After graduating, he carried these ideas back to Beijing in 2012, on the eve of Xi Jinping’s ascent to the presidency. Hong worked at the UN Development Program and then as a journalist for the People’s Daily, the largest newspaper in China, which is owned by the government.

Xi began to rapidly expand the scope of government censorship. Influential commentators, or “Big Vs” - named for their verified accounts on social media - had grown comfortable criticising and ridiculing the Chinese Communist Party. In the fall of 2013, the party arrested hundreds of micro-bloggers for what it described as “malicious rumour-mongering” and paraded a particularly influential one on national television to make an example of him. The moment marked the beginning of a new era of censorship.

The following year, the Cyberspace Administration of China was founded. The new central agency was responsible for everything involved in internet regulation, including national security, media and speech censorship, and data protection. Hong left the People’s Daily and joined the agency’s department of international affairs. He represented it at the UN and other global bodies and worked on cybersecurity cooperation with other governments. By July 2015, the Cyberspace Administration had released a draft of its first law. The Cybersecurity Law, which entered into force in June of 2017, required that companies obtain consent from people to collect their personal information. At the same time, it tightened internet censorship by banning anonymous users - a provision enforced by regular government inspections of data from internet service providers.

In the spring of 2016, Hong sought to return to academia, but the agency asked him to stay. The Cybersecurity Law had purposely left the regulation of personal data protection vague, but consumer data breaches and theft had reached unbearable levels. A 2016 study by the Internet Society of China found that 84% of those surveyed had suffered some leak of their data, including phone numbers, addresses, and bank account details. This was spurring a growing distrust of digital service providers that required access to personal information, such as ride-hailing, food-delivery, and financial apps. Xu Yuyu’s death poured oil on the flames.

The government worried that such sentiments would weaken participation in the digital economy, which had become a central part of its strategy for shoring up the country’s slowing economic growth. The advent of GDPR also made the government realise that Chinese tech giants would need to meet global privacy norms in order to expand abroad. Hong was put in charge of a new task force that would write a Personal Information Protection Specification (PIPS) to help solve these challenges. The document, though nonbinding, would tell companies how regulators intended to implement the Cybersecurity Law. In the process, the government hoped, it would nudge them to adopt new norms for data protection by themselves.

Hong’s task force set about translating every relevant document they could find into Chinese. They translated the privacy guidelines put out by the Organisation for Economic Cooperation and Development and by its counterpart, the Asia-Pacific Economic Cooperation; they translated GDPR and the California Consumer Privacy Act. They even translated the 2012 White House Consumer Privacy Bill of Rights, introduced by the Obama administration but never made into law. All the while, Hong met regularly with European and American data protection regulators and scholars.

Bit by bit, from the documents and consultations, a general choice emerged. “People were saying, in very simplistic terms, ‘We have a European model and the US model,’” Hong recalls. The two approaches diverged substantially in philosophy and implementation. Which one to follow became the task force’s first debate.

At the core of the European model is the idea that people have a fundamental right to have their data protected. GDPR places the burden of proof on data collectors, such as companies, to demonstrate why they need the data. By contrast, the US model privileges industry over consumers. Businesses define for themselves what constitutes reasonable data collection; consumers only get to choose whether to use that business. The laws on data protection are also far more piecemeal than in Europe, divvied up among sectoral regulators and specific states.

At the time, without a central law or single agency in charge of data protection, China’s model more closely resembled the American one. The task force, however, found the European approach compelling. “The European rule structure, the whole system, is more clear,” Hong says. But most of the task force members were representatives from Chinese tech giants, like Baidu, Alibaba, and Huawei, and they felt that GDPR was too restrictive. So they adopted its broad strokes - including its limits on data collection and its requirements on data storage and data deletion - and then loosened some of its language. GDPR’s principle of data minimisation, for example, maintains that only necessary data should be collected in exchange for a service. PIPS allows room for other data collection relevant to the service provided.

PIPS took effect in May 2018, the same month that GDPR finally took effect. But as Chinese officials watched the US upheaval over the Facebook and Cambridge Analytica scandal, they realised that a nonbinding agreement would not be enough. The Cybersecurity Law didn’t have a strong mechanism for enforcing data protection. Regulators could only fine violators up to 1,000,000 yuan ($140,000), an inconsequential amount for large companies. Soon after, the National People’s Congress, China’s top legislative body, voted to begin drafting a Personal Information Protection Law within its current five-year legislative period, which ends in 2023. It would strengthen data protection provisions, provide for tougher penalties, and potentially create a new enforcement agency. After Cambridge Analytica, says Hong, “the government agency understood, ‘Okay, if you don’t really implement or enforce those privacy rules, then you could have a major scandal, even affecting political things.’”

The local police investigation of Xu Yuyu’s death eventually identified the scammers who had called her. It had been a gang of seven who’d cheated many other victims out of more than 560,000 yuan using illegally obtained personal information. The court ruled that Xu’s death had been a direct result of the stress of losing her family’s savings. Because of this, and his role in orchestrating tens of thousands of other calls, the ringleader, Chen Wenhui, 22, was sentenced to life in prison. The others received sentences between three and 15 years.

Emboldened, Chinese media and consumers began more openly criticising privacy violations. In March 2018, internet search giant Baidu’s CEO, Robin Li, sparked social-media outrage after suggesting that Chinese consumers were willing to “exchange privacy for safety, convenience, or efficiency.” “Nonsense,” wrote a social-media user, later quoted by the People’s Daily. “It’s more accurate to say [it is] impossible to defend [our privacy] effectively.”

In late October 2019, social-media users once again expressed anger after photos began circulating of a school’s students wearing brainwave-monitoring headbands, supposedly to improve their focus and learning. The local educational authority eventually stepped in and told the school to stop using the headbands because they violated students’ privacy. A week later, a Chinese law professor sued a Hangzhou wildlife zoo for replacing its fingerprint-based entry system with face recognition, saying the zoo had failed to obtain his consent for storing his image.

But the public’s growing sensitivity to infringements of consumer privacy has not led to many limits on state surveillance, nor even much scrutiny of it. As Maya Wang, a researcher at Human Rights Watch, points out, this is in part because most Chinese citizens don’t know the scale or scope of the government’s operations. In China, as in the US and Europe, there are broad public and national security exemptions to data privacy laws. The Cybersecurity Law, for example, allows the government to demand data from private actors to assist in criminal legal investigations. The Ministry of Public Security also accumulates massive amounts of data on individuals directly. As a result, data privacy in industry can be strengthened without significantly limiting the state’s access to information.

The onset of the pandemic, however, has disturbed this uneasy balance. On February 11, Ant Financial, a financial technology giant headquartered in Hangzhou, a city southwest of Shanghai, released an app-building platform called AliPay Health Code. The same day, the Hangzhou government released an app it had built using the platform. The Hangzhou app asked people to self-report their travel and health information, and then gave them a color code of red, yellow, or green. Suddenly Hangzhou’s 10 million residents were all required to show a green code to take the subway, shop for groceries, or enter a mall. Within a week, local governments in over 100 cities had used AliPay Health Code to develop their own apps. Rival tech giant Tencent quickly followed with its own platform for building them.

The apps made visible a worrying level of state surveillance and sparked a new wave of public debate. In March, Hu Yong, a journalism professor at Beijing University and an influential blogger on Weibo, argued that the government’s pandemic data collection had crossed a line. Not only had it led to instances of information being stolen, he wrote, but it had also opened the door to such data being used beyond its original purpose. “Has history ever shown that once the government has surveillance tools, it will maintain modesty and caution when using them?” he asked.

Indeed, in late May, leaked documents revealed plans from the Hangzhou government to make a more permanent health-code app that would score citizens on behaviors like exercising, smoking, and sleeping. After a public outcry, city officials cancelled the project. That state-run media had also published stories criticising the app likely helped.

The debate quickly made its way to the central government. That month, the National People’s Congress announced it intended to fast-track the Personal Information Protection Law. The scale of the data collected during the pandemic had made strong enforcement more urgent, delegates said, and highlighted the need to clarify the scope of the government’s data collection and data deletion procedures during special emergencies. By July, the legislative body had proposed a new “strict approval” process for government authorities to undergo before collecting data from private-sector platforms. The language again remains vague, to be fleshed out later - perhaps through another nonbinding document - but this move “could mark a step toward limiting the broad scope” of existing government exemptions for national security, wrote Sacks and fellow China scholars at New America.

Hong similarly believes the discrepancy between rules governing industry and government data collection won’t last, and the government will soon begin to limit its own scope. “We cannot simply address one actor while leaving the other out,” he says. “That wouldn’t be a very scientific approach.”

Other observers disagree. The government could easily make superficial efforts to address public backlash against visible data collection without really touching the core of the Ministry of Public Security’s national operations, says Wang, of Human Rights Watch. She adds that any laws would likely be enforced unevenly: “In Xinjiang, Turkic Muslims have no say whatsoever in how they’re treated.”

Still, Hong remains an optimist. In July, he started a job teaching law at Beijing University, and he now maintains a blog on cybersecurity and data issues. Monthly, he meets with a budding community of data protection officers in China, who carefully watch how data governance is evolving around the world.
courtesy of MIT TECHNOLOGY REVIEW

CHANGES TO JAPAN’S DATA PRIVACY LAW ECHO EUROPE’S GDPR

dpofficer · September 11, 2020 · 4 min read



Japan has made changes to its 2005 Act on the Protection of Personal Information (APPI), bringing the law closer in line with the EU’s General Data Protection Regulation (GDPR). The latest tweaks, announced this month, cover data breach reporting and the use of facial recognition data gathered from devices such as security cameras. Breaches should now be reported using an official form, rather than by mail or fax, as before. When processing image data, the intended use should be stated immediately, while the methods and privacy measures used in processing those images should be made clear. These additions follow hard on the heels of more significant changes, which will mean tighter controls on the international transfer of data from 2022, helping to bring the law further in line with GDPR.

“Japan has a robust data privacy law with many similarities to the GDPR,” Scott Warren, a partner in the Tokyo office of law firm Squire Patton Boggs, tells The Daily Swig. “Indeed, Japan is the only country in Asia to have exchanged joint adequacy findings with the EU, finding the laws roughly equivalent.”

Warren added: “What I find interesting is the ways the laws diverge. For example, Japan does not have a breach notification obligation, nor significant penalties on entities failing to meet the standards.

“Japan has recently passed an amendment to the law to rectify some of these and other items, including increasing penalties up to $946,000 - but it will take well over a year for it to be fully implemented.”

Cross-border transfers

While in its current form the APPI applies to any organization obtaining personal information from data subjects located in Japan, this hasn’t been enforceable on foreign businesses. Now, though, they will have to provide reports concerning the processing of Japanese residents’ personal information – and can be penalized if they fall short. In addition to the move towards reporting via a specific web form, there is also a new requirement for all breaches to be reported to the victim and the Personal Information Protection Commission (PPC). It’s not yet clear whether all breaches will need to be reported, but major incidents or those that violate the rights of subjects almost certainly will.

Expanding individual rights

In a GDPR-like move, data subjects will now have the right to request access to their data, and to ask for it to be corrected or deleted, where there’s the possibility that their rights or legitimate interests have been breached. This also applies to short-term data – previously, the data had to have been held for six months or more. Currently, there’s no need for a data subject to give their explicit consent when data is transferred to a third party. This, though, is set to change, and permission will become opt-in. Further, if data has already been transferred on an opt-out basis, it cannot now be transferred to a third party without permission. Any organization receiving data will have to conform to APPI standards.

Increased penalties

Organizations that violate these rules now face a potential fine of ¥100 million ($942,000), while falsifying a report to the PPC will cost ¥500,000 ($4,708). Meanwhile, any individual found responsible for a breach could face a fine of up to ¥1 million ($9,420) and a year in prison. The move brings Japan to the forefront of Asian data protection legislation, says Warren, along with Korea, which has had strong data protection laws for years. “Elsewhere, we have seen a number of countries pass new data privacy laws, which have various GDPR-type elements in them, though rarely as strenuous,” he says. “Thailand’s new law is similar in many respects, though its implementation has been delayed. Vietnam’s new law takes various elements of the GDPR, but includes data localization requirements similar to China’s Cybersecurity Law.” He adds: “I fear many countries in Asia have a way to go.”
courtesy of PORTSWIGGER

GLOBAL PRIVACY TRENDS

FINES & PROSECUTIONS


BULGARIAN DPA FINES BANK €500,000 FOR DATA BREACH

dpofficer · August 31, 2019 · 2 min read



On 28 August 2019, pursuant to Art. 87, para. 3 of the Law on Personal Data Protection, Ventsislav Karadjov, Chairman of the Commission for Personal Data Protection, issued DSK Bank EAD with a Penal Order for violation of Art. 32(1)(b) of Regulation (EU) 2016/679, in connection with the misappropriation of, and third-party access to, the personal data of a total of 33,492 bank customers contained in 23,270 credit records, along with the data of an indefinite number of related third parties (including spouses, ascendants and descendants, sellers, and guarantors). The sanction imposed amounts to BGN 1,000,000 (€500,000).

Following a one-month audit, it was found that, in the course of its activity, DSK Bank EAD, as a data controller, did not implement appropriate technical and organisational measures and did not ensure the ongoing confidentiality, integrity, availability and resilience of its systems and services for processing the personal data of individuals - customers of the Bank and related third parties. The exposed data, which circulated as far afield as across the European Union, included names, citizenship, personal identification numbers, and permanent or current addresses; copies of identity cards, together with the height and eye colour data recorded on them; all personal data contained in tax documents certifying the income and health status of the borrowers and related third parties, including health information (some credit files contained TELC declarations of reduced working capacity); payment account numbers; and the registration numbers and dates of notarised deeds, together with the signatures on them.
courtesy of CPDP

SWEDISH SCHOOL FINED OVER €18,000 FOR GDPR BREACH

dpofficer · August 21, 2019 · 2 min read



For the first time, the Data Inspectorate of Sweden (the Swedish supervisory authority) has issued a penalty against a high school for violating the rules of the General Data Protection Regulation (GDPR).

A high school in Skellefteå had been testing the use of facial recognition technology to record student class attendance. The trial ran for three weeks and affected 22 students. The Data Inspectorate examined the trial and found that the High School Board in Skellefteå had handled sensitive personal data in violation of the GDPR.

“The High School Board in Skellefteå has violated several of the provisions of the General Data Protection Regulation, resulting in the issuing of a penalty fine,” says Lena Lindgren Schelin, Director General of the Data Inspectorate. The fine issued was SEK 200,000 (over €18,000), which averages out to a little over €845 per student.

“Facial recognition technology is in its infancy, but developing fast. We therefore see a great need to create clarity about its implications,” says Lena Lindgren Schelin.

In its decision, the Data Inspectorate found that facial recognition camera surveillance of the students in their everyday environment was an intrusion on their integrity, and that attendance monitoring can be done in other ways that do not violate the students’ rights under the GDPR.
courtesy of DATAINSPEKTIONEN

COMPANY FINED €150,000 BY THE HELLENIC DPA

dpofficer · July 31, 2019 · 3 min read



The Hellenic Data Protection Authority, acting on a complaint, conducted an ex officio investigation into the lawfulness of the processing of personal data of the employees of the company ‘PRICEWATERHOUSECOOPERS BUSINESS SOLUTIONS SA’ (PWC BS). According to the complaint, the employees were required to provide consent to the processing of their personal data. The DPA considered that PWC BS, as the controller:

i. has unlawfully processed the personal data of its employees contrary to the provisions of Article 5(1)(a) indent (a) of the GDPR since it used an inappropriate legal basis.

ii. has processed the personal data of its employees in an unfair and non-transparent manner contrary to the provisions of Article 5(1)(a) indent (b) and (c) of the GDPR giving them the false impression that it was processing their data under the legal basis of consent pursuant to Article 6(1)(a) of the GDPR, while in reality it was processing their data under a different legal basis about which the employees had never been informed.

iii. although it was responsible in its capacity as the controller, it was not able to demonstrate compliance with Article 5(1) of the GDPR, and that it violated the principle of accountability set out in Article 5(2) of the GDPR by transferring the burden of proof of compliance to the data subjects.

The Hellenic DPA, after ascertaining the infringements of the GDPR, decided to exercise the corrective powers conferred on it under Article 58(2) of the GDPR by imposing corrective measures, ordering the company, in its capacity as the controller, within three (3) months:
  • to bring the processing operations of its employees’ personal data as described in Annex I submitted by the company into compliance with the provisions of the GDPR;
  • to restore the correct application of the provisions of Article 5(1)(a) and (2) in conjunction with Article 6(1) of the GDPR in accordance with the grounds of the decision;
  • to subsequently restore the correct application of the rest of the provisions of Article 5(1)(b)-(f) of the GDPR insofar as the infringement established affects the internal organisation and compliance with the provisions of the GDPR taking all necessary measures under the accountability principle.
Moreover, as the above corrective measure is not sufficient in itself to restore compliance with the GDPR provisions infringed, the Hellenic DPA considered that, based on the circumstances identified in this case and under Article 58(2)(i), an additional effective, proportionate and dissuasive administrative fine should be imposed in accordance with Article 83 of the GDPR, which amounts to one hundred and fifty thousand Euros (EUR 150,000.00).
courtesy of DPAGR

PROCESSOR GETS 9 YEARS IN PRISON

dpofficer · July 19, 2019 · 2 min read



Harold Thomas Martin III, a former National Security Agency contractor who was accused of, and later pled guilty to, stealing over 50TB of NSA data, was sentenced today to nine years in prison.

Martin was arrested in October 2016 after the FBI raided his home and found documents that he'd been taking home for years, without authorization. Files were found on his computer and in his car. Some of the documents were labeled top secret and contained information about NSA infrastructure and tools, but there were also documents about the Central Intelligence Agency (CIA), US Cyber Command, and the National Reconnaissance Office (NRO).

NO EVIDENCE MARTIN SHARED ANY OF THE FILES

Martin worked as a contractor for various US government agencies for 20 years, between 1996 and 2016. At the time of his arrest, Martin was working for Booz Allen Hamilton, the same company where Edward Snowden had worked. He had Top Secret and Sensitive Compartmented Information (SCI) clearances, which allowed him to handle sensitive government documents and files -- but only at work, not at home. Prosecutors described the cache of government documents discovered at Martin's home as "breathtaking" in scale. Martin's lawyers said he only took documents home to study and become better at his job, not with any intention to sell government secrets.

NO SHADOW BROKERS CONNECTION WAS EVER PROVEN

At the time of his arrest, the Washington Post reported that Martin was the main suspect in the government's investigation into the Shadow Brokers, a group of internet hackers who started leaking NSA files and hacking tools on the internet in the summer of 2016. In a different report, the New York Times suggested Martin was under investigation for leaking documents to WikiLeaks. US authorities indicted Martin in February 2017, and he signed a plea agreement admitting his guilt in March 2019. Prosecutors never proved that Martin had any connections to the Shadow Brokers or WikiLeaks.

As part of the guilty plea, prosecutors said they wouldn't seek more than seven years in prison. The judge wasn't bound by the plea agreement, however, and sentenced Martin to nine years in prison, including time served, plus three years of supervised release. Martin is 54.
courtesy of ZDNET

ESTATE AGENCY FINED £80,000 FOR FAILING TO KEEP TENANTS’ DATA SAFE

dpofficer · July 19, 2019 · 2 min read



The Information Commissioner’s Office (ICO) has fined a London estate agency £80,000 for leaving 18,610 customers' personal data exposed for almost two years.

The security breach happened when Life at Parliament View Ltd (LPVL) transferred personal data from its server to a partner organisation and failed to switch off an ‘Anonymous Authentication’ function. This failure meant access restrictions were not implemented, allowing anyone going online full access to all the data stored there between March 2015 and February 2017. The exposed details included personal data such as bank statements, salary details, copies of passports, dates of birth and addresses of both tenants and landlords.

During its investigation, the ICO uncovered a catalogue of security errors and found that LPVL had failed to take appropriate technical and organisational measures against the unlawful processing of personal data. In addition, LPVL only alerted the ICO to the breach when it was contacted by a hacker. The ICO concluded this was a serious contravention of the Data Protection Act 1998, which has since been replaced by the GDPR and the Data Protection Act 2018. Steve Eckersley, Director of Investigations at the ICO, said:
“Customers have the right to expect that the personal information they provide to companies will remain safe and secure. That simply wasn’t the case here.

“As we uncovered the facts, we found LPVL had failed to adequately train its staff, who misconfigured and used an insecure file transfer system and then failed to monitor it. These shortcomings have left its customers exposed to the potential risk of identity fraud.

“Companies must accept that they have a legal obligation to both protect and keep secure the personal data they are entrusted with. Where this does not happen, we will investigate and take action.”
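
The ICO does not name the file transfer software involved here, but the class of misconfiguration - anonymous authentication left enabled on an internet-facing file transfer service - can be checked with a few lines of Python's standard ftplib. A minimal sketch, assuming an FTP-style service; the hostname is a placeholder, and such checks should only ever be run against servers you own:

    # Minimal check for anonymous FTP access, the class of misconfiguration
    # at issue in this case. Hostname is a placeholder.
    from ftplib import FTP, error_perm

    def allows_anonymous_login(host: str, timeout: float = 10.0) -> bool:
        """Return True if the FTP server accepts an anonymous login."""
        try:
            with FTP(host, timeout=timeout) as ftp:
                ftp.login()  # no arguments -> anonymous login attempt
                return True
        except error_perm:   # server refused the anonymous credentials
            return False

    if __name__ == "__main__":
        print(allows_anonymous_login("ftp.example.com"))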
courtesy of ICO UK

COUNTRY PRIVACY TRENDS