Privacy law is a mess

The title says it all – what should smaller companies do to comply with privacy laws?

California has now finalized the California Consumer Privacy Act (CCPA), Cal. Civ. Code §§ 1798.100 to 1798.199 – well, at least for now (please note that this link does not reflect all of the amendments to the law as of the posting of this article). It goes into effect on January 1, 2020. Regulations under it will not be issued until December at the earliest and are likely to change over time. While it is a net gain for California consumers, it is a complex law with many incidental effects and traps for the unwary business. How does a small business deal with this mess? Before we address that, let’s cover some background:

Why is CCPA important?

The CCPA is important because so many businesses do business with California consumers that California law becomes the “highest common denominator” – meaning that, instead of trying to comply with disparate laws in 50 states, a business can target compliance with the most onerous law (typically California’s, in the pro-consumer sense) and hope that such compliance will also satisfy the other states’ laws. This does not always work – for example, Illinois has a much harsher biometric privacy law than California, and New York has very detailed personal information protection laws and rules as well, particularly in the financial/banking sector. So, a slight modification of the above strategy is to target the “top 3” laws (i.e., California, Illinois, and New York) and again hope for the best in other states. And finally, there is the modified “top 3” strategy of adding compliance with the EU’s General Data Protection Regulation (GDPR).

What many larger companies have done is simply target compliance with the GDPR worldwide, on the assumption that it is the most onerous pro-privacy law. However, the CCPA has provisions that differ from, and add to, the GDPR – for example, its regulation of businesses that sell personal information, and of those that broker such sales, has no direct counterpart under the GDPR.

How does this affect small business?

Many small businesses may think they are not subject to these privacy laws because they do not “do business” in a particular state, or in the European Union. Those are interesting questions – whether a state can constitutionally hold a remote business liable under its privacy law when that business interacts with the state’s residents solely electronically, and whether GDPR regulators can assess and enforce fines against a small US business that has no offices, employees, or other contacts in the EU. The practical problem, however, is that many small businesses contract with larger businesses that are subject to those laws and regulators, and those contracts will require the small business to comply with those laws indirectly. This is particularly true in heavily regulated industries such as banking and health care, where regulators in effect force their member banks to impose liability on third-party vendors.

In addition, when a small business goes to sell or merge – typically with a larger entity – the acquirer may apply these rules and regulations in analyzing the level of data risk it will carry post-transaction. If the small business has not thought about basic data privacy and data security, this can negatively impact the value of the business.

So, what should a small business do (and what are the key provisions of the CCPA)?
  1. First, do not ignore remote state laws like the CCPA or the GDPR. Someone in the organization should be assigned the responsibility, and be given reasonable time, to make a true assessment of the data privacy and security risks of the company. Ideally that person would have a “C” designation (CIO, CTO, CPO, etc.) and be incentivized to diligently complete such tasks. The business should neither marginalize nor minimize that role.
  2. Second, do a data inventory – what data is the business collecting? Why? Does it really need it? Where is it collecting this data from (the web, forms, manual entry, data harvesting/scraping, third-party lists, etc.)? What agreements are there with those data sources (this includes terms of service)? Is this information personal information? What type of personal information is it (i.e., is it sensitive personal information)? Basically, this is a review of all inbound data flows (see the illustrative sketch after this list).
  3. Third, determine where the collected data is being disclosed or shared. This can be the critical step – because if the information being collected is personal information, and particularly if it is sensitive information, there can be significant impacts if there is a data breach. Do adequate agreements cover that data exchange? Is the data encrypted? Should it be? Has any audit or review of the recipient been done to determine the adequacy of their data protection systems? Basically, this step involves tracing all outbound data flows, and determining the business need for each disclosure and the risk level it presents.
  4. Fourth, assess the computer systems used to capture, store, and transmit data, to determine weaknesses in security where a data breach could occur. Computer security is hard, period. It is much harder today, when it is not just your own computer security that matters, but the security of every link in the data disclosure chain.
  5. Fifth, consider what tools to use to address risk. Do you throw technology at the issue, like better intrusion prevention and detection systems, unified threat management devices, etc.? Do you hire experts, which may be costly? Both? Do you have proper agreements in place? Indemnity? Does the downstream recipient have insurance? Are you sure about that? Do you have insurance? Do your contracts require insurance? Be especially careful with insurance – a good insurance broker will be up on all the various changes in insurance policies that claim to cover “cyber liability” (a generic term that is meaningless without specific context). You simply cannot determine what insurance you need if you have not performed steps 1-4 above.
  6. Sixth, engage in continuous review and management. Hackers are not static – they evolve – so your systems must be maintained and modified to address new threats and issues, be updated and patched, and be monitored for threats. See item 1 – this is why a dedicated C-level person and/or team needs to be in place to really address these issues.
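To make steps 2 and 3 concrete, below is a minimal, illustrative sketch (in Python) of what a simple data inventory record might look like. It is not based on any particular product or legal standard – every field name, category, and example flow is an assumption chosen purely for illustration:

# Illustrative data-inventory record for steps 2 and 3 above.
# All field names, categories, and example flows are assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class DataFlow:
    name: str                   # e.g. "newsletter signup form"
    source: str                 # where the data comes from (web form, vendor list, scraping, ...)
    categories: List[str]       # e.g. ["email", "name", "salary"]
    sensitive: bool             # is any of it sensitive personal information?
    purpose: str                # why the business collects it
    shared_with: List[str]      # downstream recipients (processors, vendors, brokers)
    agreement_in_place: bool    # is there a written data processing/sharing agreement?
    encrypted_in_transit: bool  # is the outbound transfer encrypted?

inventory = [
    DataFlow("newsletter signup", "website form", ["email", "name"],
             sensitive=False, purpose="marketing",
             shared_with=["email service provider"],
             agreement_in_place=True, encrypted_in_transit=True),
    DataFlow("tenant applications", "manual entry", ["name", "salary", "passport image"],
             sensitive=True, purpose="tenant screening",
             shared_with=["credit check vendor"],
             agreement_in_place=False, encrypted_in_transit=False),
]

# Flag the highest-risk outbound flows: sensitive data shared without a
# written agreement or without encryption in transit.
for flow in inventory:
    if flow.sensitive and flow.shared_with and not (flow.agreement_in_place and flow.encrypted_in_transit):
        print(f"REVIEW: {flow.name} shares sensitive data with {flow.shared_with} "
              f"(agreement={flow.agreement_in_place}, encrypted={flow.encrypted_in_transit})")

Even a spreadsheet capturing these same columns will answer most of the who, what, where, and why questions discussed below.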

The key provisions of CCPA as they relate to small businesses are below:

  1. It only applies to certain “businesses” – namely, a business that has annual gross revenues in excess of $25,000,000; or that alone or in combination annually buys, receives for the business’s commercial purposes, sells, or shares for commercial purposes the personal information of 50,000 or more consumers, households, or devices; or that derives 50 percent or more of its annual revenues from selling consumers’ personal information.
  2. A consumer has a right to an accounting – essentially, a consumer can request that a business disclose to the consumer the categories and specific pieces of personal information the business has collected.
  3. A consumer has the right to request that a business delete any personal information about the consumer which the business has collected from the consumer, subject to several exceptions.
  4. A consumer has the right to an accounting of personal information that has been transferred by the business to third parties, subject to several exceptions. These rights are enhanced if the business “sells” that personal information.
  5. A consumer has the right to opt out of the sale of his or her personal information. It is important to note that the law does not force a business to offer a service to a consumer who opts out – in other words, a business can condition its service on the right to sell the information. However: (a) a business cannot sell the personal information of consumers under 16 (if it knows they are under 16) without an express opt-in, and for consumers under 13, the parent or guardian must opt in; and (b) a business cannot discriminate against a consumer who exercises any of their rights under the law.
  6. A business must provide two methods of contacting the business to exercise these rights – one of which is a toll-free number, unless the business transacts business solely online and has a direct relationship with the consumer, in which case the online business need only provide an email address for such requests. There are affirmative disclosure requirements for websites of businesses that are subject to the law. Businesses that sell personal information have additional affirmative disclosure requirements for their websites.
  7. A consumer whose personal information has been breached now has an affirmative damage remedy. Previously there was uncertainty in the law as to whether actual damage or harm would have to be shown to recover, or just the risk of future harm. In general the cases have held that actual harm is required, but vary in what they view as “actual harm.”
  8. Significant daily penalties can be assessed for non-compliance after a 30-day notice period.

The above is only a general overview. However, some of those rights – for example, the right to an onward transfer accounting, the right to delete information, and the right to opt out – present not only legal compliance issues, but significant technical hurdles. For many small businesses, their systems were not designed with these rights in mind, and/or they have so many disparate systems in which data is duplicated that it might be hard or nearly impossible to comply. If a small business runs through the above checklist and gets a handle on the who, what, where, when, and why questions, it will be easier to then assess the “how hard to comply” question.

For more information or assistance in data security and privacy law compliance, please contact Mike Oliver.

Misconfigured Server costs firm £80,000

Question: How do you cost your company £80,000 with one relatively small computer error?

(Short) Answer: You misconfigure an FTP (file transfer protocol) server . . . and forget and leave it running.

This was the lesson Life at Parliament View Limited recently learned when the Information Commissioner’s Office (https://ico.org.uk) fined it £80,000 for violating the 7th principle of the Data Protection Act 1998 (“DPA”). See https://ico.org.uk/media/action-weve-taken/mpns/2615396/mpn-life-at-parliament-view-limited-20190717.pdf. The ICO could have fined it £500,000 (the maximum under that act) – but chose to impose only 16% of the maximum fine.

What happened? Life at Parliament needed to mass transfer personal data – though not particularly sensitive data (note 1) – to a data processor, and chose to use an FTP server. They intended to use a feature of this server to require a username and password, but the technicians misunderstood the server documentation from Microsoft, and ended up putting the server in Anonymous Authentication mode. In addition, “The FTP server was further misconfigured in that whilst approved data transfers were encrypted, personal data transmitted to non-approved parties was not. As such, transfers of personal data over FTP to non-approved parties had the potential to be compromised or intercepted in transit.” (Though not explained in the opinion, this was likely a fallback setting that allowed the server to transmit over a non-encrypted channel if the receiving party did not have a secure channel available.) The server was left in this condition for just shy of two years. Computer logs showed over 500,000 anonymous data requests. Eventually a hacker (well, really a person with ordinary computer skill who located the open FTP server) who had obtained the data began extorting Life at Parliament.
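This kind of misconfiguration is straightforward to check for. Below is a minimal sketch, using Python’s standard ftplib module, that tests whether an FTP server accepts anonymous logins; the hostname is a placeholder, and a check like this should only be run against servers you own or are authorized to test:

from ftplib import FTP, error_perm

def allows_anonymous(host: str, timeout: float = 10.0) -> bool:
    """Return True if the FTP server at `host` accepts an anonymous login."""
    try:
        with FTP(host, timeout=timeout) as ftp:
            ftp.login()  # login() with no arguments attempts the standard anonymous login
            return True
    except error_perm:
        return False  # the server refused the anonymous login

if __name__ == "__main__":
    host = "ftp.example.com"  # placeholder hostname
    if allows_anonymous(host):
        print(f"{host} accepts anonymous FTP logins -- review its configuration.")
    else:
        print(f"{host} rejected anonymous login.")

Had an equivalent check, or even a periodic review of the server’s authentication settings, been in place, the problem would likely have been caught in hours rather than years.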

While the failure of basic computer security is plain in this case, it is noteworthy that ICO also found the following violations:

  1. Post-configuration of the server, LPVL failed to monitor access logs, conduct penetration testing, or implement any system to alert LPVL of downloads from the FTP server, which would have facilitated early detection and containment of the breach (see the log-review sketch after this list);
  2. Failure to provide staff with adequate and timely training, policies or guidance, either in relation to setting up the FTP server, or to information handling and security generally.
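Even basic log review would have surfaced the 500,000-plus anonymous requests long before the extortion attempt. The sketch below is illustrative only – the log file name and the field layout (a W3C-style, space-separated access log with client IP, username, method, and status columns) are assumptions, and real FTP server logs will need the positions adjusted:

import sys

def flag_anonymous_downloads(log_path: str) -> None:
    """Print any successful downloads made by the anonymous FTP user."""
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line_no, line in enumerate(log, start=1):
            if line.startswith("#") or not line.strip():
                continue  # skip header/comment lines and blanks
            fields = line.split()
            if len(fields) < 7:
                continue  # not a data row in the assumed layout
            # Assumed column order: date time client-ip username method uri status ...
            _date, _time, client_ip, username, method, uri, status = fields[:7]
            if username.lower() == "anonymous" and method.upper() == "RETR" and status.startswith("2"):
                print(f"line {line_no}: anonymous download of {uri} by {client_ip}")

if __name__ == "__main__":
    flag_anonymous_downloads(sys.argv[1] if len(sys.argv) > 1 else "ftp-access.log")

A script like this, run daily and wired to an alert, is a modest investment compared to an £80,000 fine.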

ICO has been very active in the general data protection space and issuing fines, and this decision – while an easy one in light of the poor computer security practices – is telling because ICO found secondary violations in post implementation failures to detect and train.

The same tendency is happening in the US – the FTC and state attorneys general are increasing their oversight of data protection, and several states (e.g., California with the CCPA) are enacting new data protection and data oversight requirements. While the FTC has had some wins (see a recent order against an auto dealer software provider – no fine, but a consent order – where unencrypted data was exposed for 10 days: https://www.ftc.gov/news-events/press-releases/2019/06/auto-dealer-software-provider-settles-ftc-data-security), and at least one major setback in its efforts against LabMD (http://media.ca11.uscourts.gov/opinions/pub/files/201616270.pdf), it is likely that government regulators will start going after companies that have engaged in less egregious data security violations but nevertheless have lax training or monitoring, and will probably also pursue smaller businesses that may not have the resources to maintain a robust security system and training.

For more information on our data security and privacy practice contact Mike Oliver.

_______________________

(note 1): Per the ICO decision: “The types of personal data potentially compromised included names, phone numbers, e-mail addresses, postal addresses (current and previous), dates of birth, income/salary, employer details (position, company, salary, payroll number start date, employer address & contact details), accountant’s details (name, email address & phone number). It also contained images of passports, bank statements, tax details, utility bills and driving licences of both tenants and landlords.”