California has passed a sweeping privacy law that gives consumers the right to demand that their data be deleted and to bar companies from selling their data, without losing access to services or being charged a higher price for exercising those rights.

The bill, passed today by the state’s legislature and quickly signed by Gov. Jerry Brown, affects all companies that do business in the state and collect data. It requires those businesses to disclose what information they store, why they collect it, and with which third parties it is shared.

For data breaches, consumers may be able to sue for up to $750 per violation, while the state attorney general can sue for up to $7,500 per intentional violation of privacy. For both consumer and state lawsuits, companies must first be given 30 days to fix the problem.

The act takes effect Jan. 1, 2020.

The legislature barreled the act from introduction to passage in a matter of days, as a stricter citizens’ initiative with a similar approach was headed for the November ballot. That initiative would have let consumers sue for as much as five times more per violation.

California often acts on technology, privacy, and environmental issues in advance of other states and the federal government, and this measure could serve as a catalyst for other states to pass similar or identical laws.

A number of tech giants strongly opposed the initiative and the legislative measure, although individual companies and groups representing them articulated few reasons. A Google executive said the act would have unintended consequences, but didn’t enumerate possibilities. A cellular operator trade group, the CTIA, said state-specific rules would confuse consumers and stifle innovation, especially if other states pile on.

Many technology companies have faced criticism both over their disclosures about what data they collect and how, and over their actions when they discover privacy flaws or data breaches.

However, the California act will affect any business with customers in California that meets one or more of the following tests: grosses at least $25 million annually; handles the personal information of 50,000 or more people, households, or devices; or makes at least half of its annual revenue from selling personal information.
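As a rough illustration of how those three tests combine (a sketch only; the function, field names, and example figures below are my own, not the statute’s), the applicability check amounts to a simple “one or more of”:

```python
# Hypothetical sketch: does a business fall under the act's thresholds?
# Field names and structure are illustrative assumptions, not statutory terms.
from dataclasses import dataclass


@dataclass
class BusinessProfile:
    annual_gross_revenue_usd: float              # total annual gross revenue
    personal_info_records_handled: int           # people, households, or devices
    revenue_from_selling_personal_info_usd: float


def is_covered(b: BusinessProfile) -> bool:
    """Return True if the business meets one or more of the three tests."""
    revenue_test = b.annual_gross_revenue_usd >= 25_000_000
    volume_test = b.personal_info_records_handled >= 50_000
    sale_share_test = (
        b.annual_gross_revenue_usd > 0
        and b.revenue_from_selling_personal_info_usd / b.annual_gross_revenue_usd >= 0.5
    )
    return revenue_test or volume_test or sale_share_test


# A small firm that earns most of its revenue selling personal data is still covered.
print(is_covered(BusinessProfile(2_000_000, 10_000, 1_200_000)))  # True
```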

The landmark bill has elements in common with the General Data Protection Regulation (GDPR) that the European Union imposed on its member states and some affiliates in late May. The GDPR roiled many websites and advertising networks, despite the long advance notice of its effective date, leading some media companies to block access to E.U. readers.

Unlike the GDPR, however, the California measure doesn’t require opt-in permission to collect information, nor does it provide a right to opt out of collection short of complete deletion. And rather than requiring proactive disclosure, the Consumer Privacy Act requires consumers to request their information, which must then be provided.

The ballot initiative that spurred the fast passage of this bill was the work of housing developer Alastair Mactaggart, who had contributed $3 million as of June 23 for signature gatherers and other expenses. However, Amazon, Facebook, Google, Microsoft, Uber, and other tech companies planned to spend as much as $100 million opposing the initiative if it had reached the ballot. Mactaggart said he’d withdraw the initiative if the legislature crafted a bill with protections sufficiently similar to his.

The Consumer Privacy Act gives businesses a loophole to coax consumers to share their data. But that loophole, providing consumers with financial incentives, may be costly.

California Goes Beyond GDPR With New Data Privacy Law… Really? GDPR Watchdog thinks the new law is more like a “GDPR Lite”

Bill Bonney, CISA, Author of “CISO Desk Reference Guide” and Programs Director for the ISACA San Diego Chapter

This week, in my home state of California, the state legislature passed, and the governor signed, AB 375, officially known as the California Consumer Privacy Act of 2018. The legislation will take effect January 1, 2020. The good news for privacy professionals is that this bill resembles in many ways the European Union’s General Data Protection Regulation (GDPR). Much of the same data classification, business logic, and tracking of consent and preferences developed to comply with the GDPR should translate to California law.
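As a rough sketch of how that translation might look in practice, a GDPR-style consent record could simply grow a “do not sell” flag and a deletion-request flag alongside purpose-based consent. Everything below (names, fields, structure) is an assumption for illustration, not something either law prescribes:

```python
# Hypothetical sketch of a unified consent/preference record that extends
# GDPR-style consent tracking with AB 375-style preferences. Illustrative only.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict


@dataclass
class PrivacyPreferences:
    consumer_id: str
    gdpr_consent: Dict[str, bool] = field(default_factory=dict)  # purpose -> consented?
    do_not_sell: bool = False          # AB 375-style opt-out of sale
    deletion_requested: bool = False   # right-to-delete request flag
    updated_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def opt_out_of_sale(self) -> None:
        """Record a consumer's request that their data not be sold."""
        self.do_not_sell = True
        self.updated_at = datetime.now(timezone.utc)


prefs = PrivacyPreferences("user-123", gdpr_consent={"personalized_ads": False})
prefs.opt_out_of_sale()
print(prefs.do_not_sell)  # True
```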

However, there are some key differences, which I will highlight below.

A little background and a race against time
While work on AB 375 began in February 2017, its passage yesterday was a direct response to current events. The legislation lists the recently disclosed actions of Cambridge Analytica among its raisons d’être, and a ballot measure, also called the “California Consumer Privacy Act,” was designed to push the bill along. The measure had overwhelming popular support, and June 28 was the last day it could be pulled from the ballot.

With the passage of AB 375, Alastair Mactaggart, chairman of Californians for Consumer Privacy and the major force behind the ballot measure, announced that the measure would be pulled, as was previously promised if the bill passed. The bill and the ballot measure were very similar, but by passing the bill, the California Legislature preserved its right to amend the law going forward and limited consumers’ rights of redress to breaches as opposed to all violations.

Taking GDPR a few steps further
There are several key differences between AB 375 and GDPR. The major ones are the right of consumers to sell their personal information (and, by explicit reference in section 1798.125(b), the right of a business to offer incentives to consumers to allow their information to be collected and sold) and, under section 1798.115, the consumer’s right to direct a business that sells the consumer’s information to disclose: a) what it is collecting; b) what it is selling; and c) what it is transferring for other business uses.
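To make that three-part disclosure concrete, a response to such a request might be little more than three lists of information categories. The structure and category names below are illustrative assumptions, not statutory language:

```python
# Hypothetical sketch of a section 1798.115-style disclosure response:
# the categories of personal information collected, sold, and transferred
# for other business uses. Category names are illustrative only.
disclosure_response = {
    "categories_collected": ["identifiers", "geolocation", "browsing history"],
    "categories_sold": ["identifiers", "browsing history"],
    "categories_transferred_for_business_uses": ["identifiers"],
}

for bucket, categories in disclosure_response.items():
    print(f"{bucket}: {', '.join(categories)}")
```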

The right to offer incentives is a huge leap forward in that it allows firms to offer something (not necessarily money) in exchange for the resale of a consumer’s personal data, but it also establishes ownership rights in a whole new way. It’s one thing to control the use of one’s data; it’s quite another to allow that use only with compensation. It will be very interesting to see the market (consumers and data collectors) set the price. How much is your information worth?

California rightly excludes these obligations, under section 1798.145, in cases where none of the covered activity takes place in California and no individuals in California are involved at the time of data collection.

What’s next
As an information security professional, I have always used California (SB 1386), Massachusetts (201 CMR 17.00), Nevada (N.R.S. § 603A.010) and Texas (Texas Medical Records Privacy Act) as my state regulatory privacy proxies. I will immediately add AB 375 to that list and predict that the consumer backlash to the events and disclosures of 2016-2018 will cause other states to pick up where California has left off.

Texas cancer center faces $4.3M fine for data breaches

Federal health officials have ordered the University of Texas MD Anderson Cancer Center to pay a $4.3 million fine for data breaches that stemmed from its failure to secure health records.

The Houston Chronicle reports the U.S. Department of Health and Human Services announced Monday that MD Anderson’s failure to encrypt health records violated the 1996 patient privacy law known as the Health Insurance Portability and Accountability Act.

The case involves three incidents in 2012 and 2013 when the center’s devices were either stolen or lost, potentially compromising the health records of 35,000 people. The Office for Civil Rights’ investigation into the data breaches found the center didn’t fully encrypt all of its devices during that time.

MD Anderson had argued that it wasn’t subject to encryption requirements because the health information involved was being used for research.

GDPR Watchdog – What about all those BIG DATA tech companies?

Companies such as Google, Facebook, Amazon, and Microsoft, as well as trade bodies such as the Data Marketing Association and the Interactive Advertising Bureau, all poured money into the “Committee to Protect California Jobs” in an effort to keep the bill from becoming law.

Expect them to vigorously fight for concessions that would weaken the bill leading up to 2020, says Jason Kint, CEO of Digital Content Next.

“The duopoly will fight like mad to amend this thing into their interests,” Kint says. “Facebook will take a back seat to Google because Facebook is so toxic to any privacy discussion right now. And like we’re seeing with GDPR, enforcement including antitrust scrutiny of the duopoly matters, otherwise Google determines the rules and wins the game.”

Facebook and Google are using “Dark Patterns” to fight against GDPR in Europe, according to the Norwegian Consumer Council

Facebook, Google and Microsoft push users away from privacy-friendly options on their services in an “unethical” way, according to a report by the Norwegian Consumer Council.

It studied the privacy settings of the firms and found a series of “dark patterns”, including intrusive default settings and misleading wording.

The firms gave users “an illusion of control”, its report suggested.

Both Google and Facebook said user privacy was important to them.

The report – Deceived by Design – was based on user tests which took place in April and May, when all three firms were making changes to their privacy policies to be in compliance with the EU’s General Data Protection Regulation (GDPR).

Illusion

It found examples of:

  • privacy-friendly choices being hidden away
  • take-it-or-leave it choices
  • privacy-intrusive defaults with a longer process for users who want privacy-friendly options
  • some privacy settings being obscured
  • pop-ups compelling users to make certain choices, while key information is omitted or downplayed
  • no option to postpone decisions
  • threats of loss of functionality or deletion of the user account if certain settings are not chosen

For example, Facebook warns anyone who wishes to disable facial recognition that doing so means that the firm “won’t be able to use this technology if a stranger uses your photo to impersonate you”.

The report concluded that users are often given the illusion of control through their privacy settings when, in reality, they are not getting it.

“Facebook gives the user an impression of control over use of third party data to show ads, while it turns out that the control is much more limited than it initially appears,” the report said.

“And Google’s privacy dashboard promises to let the user easily delete data, but the dashboard turns out to be difficult to navigate, more resembling a maze than a tool for user control,” it added.

Microsoft received praise for giving equal weight to privacy-friendly and unfriendly options in its set-up process in Windows 10.

The consumer watchdog concluded: “The combination of privacy-intrusive defaults and the use of dark patterns nudge users of Facebook and Google, and to a lesser degree Windows 10, towards the least privacy-friendly options to a degree that we consider unethical.

“We question whether this is in accordance with the principles of data protection by default and data protection by design, and if consent given under these circumstances can be said to be explicit, informed and freely given.”

Shortly after GDPR came into force in May, Google and Facebook were accused of breaking the law by privacy group noyb.eu, set up by activist Max Schrems.

It complained that people were not being given a free choice when it came to choosing new privacy settings.

MUST READ: Why US PRIVACY SHIELD is NOT in Compliance with GDPR
