Target Reaches Settlement Agreement with 47 States for Data Privacy Breach

By John Weaver

Target has agreed to pay $18.5 million to settle a lawsuit involving 47 states and the District of Columbia related to a 2013 cyberattack that affected the data privacy of more than 41 million customers. The hackers gained access to Target's customer service database, capturing full names, phone numbers, email addresses, payment card numbers, credit card verification codes, and other sensitive data from those customers.


Artificial Intelligence Owes You an Explanation

When an A.I. does something, you should be able to ask, “Why?”

By John Weaver

As published on Slate.com (May 2017)

My family has grown very attached to our Amazon Echo, particularly for music. We can access Prime Music by asking Alexa for an artist, song, or station. Even my young kids can navigate the verbal interface to request “Can’t Stop the Feeling” from the movie Trolls or the soundtrack to the musical Hamilton.

As part of the smart speaker’s artificial intelligence, the program picks up on our tastes and preferences, so when we simply say “Alexa, play,” the device will queue up suggested tracks. In theory, what it picks should have some obvious relationship to music we chose ourselves. And songs it selects usually do. Usually.

But recently, Alexa considered our diet of kids’ music, show tunes, the Beatles, the Rat Pack, and Pink Martini, and decided to cue up … Sir Mix-a-Lot.


BYOD: Has Your Company Addressed Its Privacy and Data Security Risks?

By Cameron G. Shilling

(co-authored by: Colleen Karpinsky Cone, VP Talent & Culture, DYN)

As published in ACC Docket (October 2015)

Bring your own device, or BYOD, presents significant privacy and data security risks to companies. To reduce these risks, businesses should implement appropriate written data security and information use policies and procedures, before a disaster occurs.

BYOD appeals to both companies and their employees. Employees prefer to select the type of mobile device they want to use for business and personal purposes. Companies use BYOD to avoid some or all of the costs of purchasing and supporting mobile devices for employees, and to simplify processes when employees are hired and when they depart.

When an employee uses a personal device to perform work and access the data systems of the company, valuable business information accumulates on the device. The presence of that data on the device is a security risk if the device is lost, stolen, or compromised, and privacy concerns can arise if the company needs to access the device to recover its data. These issues should be properly addressed in written data security and information use policies.

Several state and federal laws require companies to implement security measures to safeguard sensitive information. The Massachusetts and California data security laws and the Health Insurance Portability and Accountability Act, or HIPAA, are good examples. These laws require encryption of “data in motion,” such as data transported on mobile devices, laptops, and USB drives, and data transmitted electronically by email and in other ways. BYOD companies often do not encrypt data in motion on employee-owned mobile devices, and devastating data breaches have resulted from the loss, theft, and compromise of such devices.

Mobile device management, or MDM, is currently a good technology solution for encryption of business data on personal devices. Not only is MDM commercially available and technologically viable, it also provides companies with other benefits, such as the ability to monitor an employee’s remote business activities, and to remotely erase company data from lost and stolen devices and from the devices of departing employees.

Encryption technology also is readily available for laptops and business email systems; dual authentication virtual private networks, or VPNs, provide employees with encrypted access to company systems from offsite; and secure portals and similar technologies can be implemented for the encrypted transmission of large amounts of data. In short, encrypting sensitive data on personal mobile devices and during data transmission, like email, is no longer optional under data security laws.

Privacy concerns with personal devices present equally serious issues. The company data that accumulates on such devices mixes with personal data of the employee. Because the employee owns the device, the company does not have unfettered access to its data on the device, particularly for disgruntled and departed employees, and even cooperative employees can have legitimate concerns about handing over their personal devices to corporate officials. Also, the company has little (if any) control over apps that employees download to their personal mobile devices, and malicious apps pose threats to company data on the device and can provide access through the device to company servers and other data stores.

In addition to these difficult personnel issues, recovering business data from a personal mobile device can be a legal minefield. An unauthorized interception of an electronic communication, such as an email or text sent to a personal email account or cellphone number connected to the device, can violate the federal Electronic Communications Privacy Act. Likewise, unauthorized access to stored electronic communications, such as an employee’s Facebook, LinkedIn, or other social media account, can violate the Stored Communications Act.

Beyond those two federal statutes, an employee also may assert a common law claim that the company’s intrusion into the employee’s personal device violates the employee’s legitimate expectation of privacy. In 2014, the U.S. Supreme Court recognized in Riley v. California that, as a society, we have developed a strong sense that the data on our personal mobile devices is private. The Court explained its reasoning as follows:

Modern cell phones are not just another technological convenience. With all they contain and all they may reveal, they hold for many Americans ‘the privacies of life’ …. The fact that technology now allows an individual to carry such information in his hand does not make the information any less worthy of the protection for which the Founders fought.

Sound information use policies and technology practices are the best solutions to avoid data privacy problems. A company should clearly notify its employees in its information use policy that the company owns its business data, and that employees cannot have any expectation of privacy with respect to their possession or use of it. A company also should notify its employees that the company has a right to access employee-owned devices to recover business data, and the company should establish parameters in its policy for doing so. And, company IT personnel need to be properly trained to avoid intentional and inadvertent violations of the federal statutes mentioned above when accessing personal devices.

BYOD is not likely to subside – if anything, its prevalence will increase. Companies that foster this practice should address the privacy and data security concerns of BYOD, by implementing appropriate written data security and information use policies and by adopting sound technology practices, like MDM and encryption.

What Does It Really Take To Be Data Security Compliant?

By Cameron G. Shilling

As published in NH Bar News (12/20/2016)

Most businesses know (or should know by now) that they must comply with state and federal data security laws and regulations. But business leaders often are unaware of what it really takes to do so. That is understandable. Data security seems complex, and technology consultants and vendors rarely try to demystify it for their customers.

Data security is just like any other legal or business risk management issue. The risk is managed through a process of collaboration between business leaders, information technology professionals, and qualified legal counsel. The process involves the following steps:

  1. Perform a risk assessment of the business’ physical, technological and administrative systems using the requirements and standards of applicable laws.
  2. Generate a report that identifies areas of non-compliance and risk, including a prioritization and chronological plan for remediation.
  3. Remediate vulnerabilities that can feasibly and financially be fixed within a reasonable amount of time.
  4. Create a written data security plan tailored to the procedures of the business.
  5. Train employees about data security compliance generally and the business’ procedures under the written data security plan.
  6. Perform periodic reassessments, including sub-assessments if new or different physical, technological or administrative systems are adopted.

Step 1 – the risk assessment – involves identifying the information a business has that is legally protected, for example, under state data security laws or under federal laws or regulations such as HIPAA, the Gramm-Leach-Bliley Act, or SEC or FCC regulations. The information is then mapped through its lifecycle (e.g., from receipt and creation, through use and transmission, to disposal and destruction), and areas of non-compliance or risk are identified using the legal requirements and standards of applicable laws and regulations.

This is a highly collaborative process between the leaders of the business, competent IT professionals (inside or outside the business, or both), and legal counsel experienced with this area of the law and qualified to understand technological and physical security matters.

Step 2 – the report – flows naturally from the areas of non-compliance and risk identified in the assessment. Priority is assigned to items that are relatively easy to remedy, that do not comply with applicable law, or that entail significant risk, and a timeline is created for addressing the issues.

Step 3 – the remediation – is the process of identifying and implementing solutions to the vulnerabilities identified during the assessment and in the report. Remediating vulnerabilities often depends on the availability of technological or physical systems, and budgetary constraints of the business. It is common for a business to need 12-18 months to properly address all of the vulnerabilities identified in an initial data security risk assessment.

Step 4 – the written plan – is a policy created from the information gathered during the risk assessment and the remedies implemented or anticipated for the vulnerabilities. A plan created in the absence of a comprehensive risk assessment is a pure shot in the dark, and does not comply with state or federal law or accepted practice. No two data security plans are the same because no two businesses are the same, and there is no competent boilerplate form.

Step 5 – the training – is an integral component of data security compliance. Employees handle protected data on a daily basis, and thus need to be taught about data security generally as well as the business’ specific procedures as set out in the written plan. Likewise, properly trained employees know better how to avoid breaches, how to recognize an actual or potential breach, and how to properly respond in such circumstances.

Step 6 – the reassessment – is required and natural for any business committed to data security. Reassessments are used to address vulnerabilities from new or different technological, physical or administrative systems or external threats. Also, as a business becomes data security aware, it frequently identifies previously unknown vulnerabilities and adopts remedies that enhance security beyond the measures implemented after the initial risk assessment and report.

Data security is not something that can or should be overlooked simply because a business does not understand how to become compliant. Just like any other risk management issue, security is accomplished through business leaders, IT professionals, and qualified counsel working collaboratively to implement an established process under applicable law.

Know the Law: Who is Liable for Data Breach?

By Ramey D. Sylvester

As published in the Union Leader (12/19/2016)

Q. My company handles a lot of sensitive customer information (medical, financial, biographical) and has relationships with third party service providers that have access to the information. Can my company be held liable to our customers for my service provider’s mishandling of that data?

A.  Bad news first. Not only may your company be liable to your customers, but it may also have to engage in costly notification and disclosure efforts, and may be subject to governmental auditing and penalties, all due to your service provider’s mishandling of your customers’ sensitive information.

In today’s computer- and cloud-based business world, customer data can be accessed, and is often stored, by a company’s service providers, or “vendors.” Vendors providing services such as software as a service (SaaS), payment processing, accounting, document destruction, and external IT all commonly have access to, and store, sensitive information about their clients’ customers. Even your office supply delivery company, cleaning service, and building maintenance company have access to your customer information and could cause a breach, either knowingly or accidentally.

Depending on the privacy laws and regulatory requirements your company is subject to, you may be required to ensure that vendors are equipped to properly secure your sensitive customer data. Regardless, your company will be responsible for your vendors’ failure to maintain the confidentiality of your customer data and for choosing to work with a vendor that is not data security compliant. Should your vendor suffer a data breach, your company will be on the hook for customer notification requirements, governmental investigations, and penalties, in addition to any customer legal action.

So what can you do to minimize these risks? Establish a vendor management program to assess your vendors’ ability to handle sensitive customer data. If a vendor will be handling sensitive customer data, make sure that the vendor has a data security policy and data breach response plan. Further, require the vendor to carry cyber insurance that will cover the costs of a data breach, and have the vendor sign a data security agreement that requires it to maintain the confidentiality of the customer data, requires it to indemnify your company for unauthorized disclosures of customer data, and establishes auditing rights so your company can verify that the vendor is maintaining its data security standards.

The bottom line is that since your company will be responsible for the mistakes of your vendors, you should take appropriate legal steps to protect your company and your customers.

Know the Law: Some Data Collecting Requires Disclosures

By Kevin Lin

As published in the Union Leader (12/5/2016)

Q. My website allows customers to create user accounts, saves their contact information and tracks their purchases to suggest new items they may want to buy. Are there any disclosures I need to make about my customer data collection?

A.  One regulation governing consumer data collection is CalOPPA, a California statute that seeks to improve the transparency of a company’s data privacy practices. A New Hampshire business is subject to CalOPPA if it gathers personal information online about any California resident.

This information includes first and last names, addresses, emails, telephone numbers, and other similar information. Since most online footprints are nationwide and it is often difficult to differentiate California residents from other customers, businesses should simply comply with CalOPPA to avoid unknowing violations.

CalOPPA requires that a business post its privacy policy on its website identifying exactly what consumer information is collected and with whom that information is shared. The law also requires that the privacy policy inform consumers about the process for reviewing and requesting changes to any information collected, and that it specify how consumers will be notified of changes to the policy. Additionally, the most recent amendments require the privacy policy to detail how the business will respond to web browser “do not track” signals.

Violations of CalOPPA are enforced through California’s Unfair Competition Law. A company that does not comply with CalOPPA may be subject to penalties of up to $2,500 for each violation. With respect to mobile applications, the penalty is assessed each time the application is downloaded by a California resident.
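Because the penalty for a noncompliant mobile application can be assessed per download, the exposure scales linearly with an app's California user base. The sketch below is purely illustrative arithmetic (the download counts are hypothetical, and actual penalties are discretionary and capped at $2,500 per violation, not automatic):

```python
# Illustrative only: hypothetical download counts showing how
# CalOPPA's per-violation penalty can compound for a mobile app,
# since each download by a California resident may count as a
# separate violation.

PENALTY_PER_VIOLATION = 2_500  # statutory maximum, in dollars

def max_exposure(california_downloads: int) -> int:
    """Maximum potential penalty if an app's privacy policy is noncompliant."""
    return california_downloads * PENALTY_PER_VIOLATION

# Even a modest regional user base adds up quickly:
for downloads in (100, 1_000, 10_000):
    print(f"{downloads:>6} CA downloads -> up to ${max_exposure(downloads):,}")
```

At the statutory maximum, even 10,000 California downloads would translate into $25 million of theoretical exposure, which is why compliance is far cheaper than enforcement.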

In 2012, the California Attorney General informed hundreds of noncomplying companies (including those outside of California) that they would be fined if they did not bring their mobile applications into compliance. More recently, California Attorney General Kamala D. Harris released a new tool for consumers to report noncomplying websites, mobile applications and online services.

Given the rise in enforcement and the potential risk of exposure, it is crucial that all New Hampshire companies review their privacy policies to ensure compliance with CalOPPA.

Know the Law: Who is Liable for Chip-Based Credit Card Fraud?

By Cameron G. Shilling (originally published 11/23/2015)

As published in the Union Leader (9/14/2015)

Q.  More and more of my customers are paying with credit cards that have chips in them.  Do I need a chip-based credit card reader?

A.  Credit card companies – not retailers or consumers – have historically absorbed the liability for fraudulent credit card transactions.  That will change on October 1, 2015.  If your business does not use EMV equipped card readers to process credit cards that utilize the new chip technology, then your business – not the credit card company – will be liable for fraudulent transactions.

The credit card industry in the United States has been transitioning for the last several years to cards that utilize embedded chips, in addition to the older magnetic stripe technology.  The reason is that the vast majority of credit card fraud occurs when numbers are “skimmed” as a card’s magnetic stripe is “swiped” through a card reader.  Target, Home Depot, and TJX are just a few examples of companies that recently suffered such breaches, affecting hundreds of millions of consumers.

Retailers outside the United States began transitioning many years ago to chip technology, which is called “EMV.”  Outside of this country, about 70% of all credit card readers employ EMV technology, compared to the relatively negligible adoption of EMV domestically.  As a result, the approximately $10 billion of annual domestic credit card fraud accounts for nearly half of global fraudulent credit card transactions, even though only about one quarter of all credit card transactions worldwide occur in the United States.
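A quick back-of-the-envelope check makes the disparity in those figures concrete. Using the article's approximate shares (U.S. fraud is about half of global card fraud, while the U.S. handles only about a quarter of global card transactions), the per-transaction fraud intensity in the U.S. works out to roughly three times that of the rest of the world:

```python
# Rough arithmetic based on the approximate shares cited above;
# these are illustrative figures, not precise industry data.

us_fraud_share = 0.50        # U.S. share of global fraudulent card transactions
us_transaction_share = 0.25  # U.S. share of global card transactions

# Fraud per unit of transaction volume, U.S. vs. rest of world:
us_rate = us_fraud_share / us_transaction_share              # 2.0
row_rate = (1 - us_fraud_share) / (1 - us_transaction_share) # ~0.67

print(f"U.S. fraud intensity vs. rest of world: {us_rate / row_rate:.0f}x")
```

In other words, a dollar transacted on a U.S. magnetic-stripe reader was, by these rough numbers, about three times more likely to be fraudulent than a dollar transacted abroad on EMV hardware.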

On October 1, 2015, there will be a change to the rules that major credit card companies apply to retailers and other credit card processors.  If fraudulent transactions occur using cards with chips, and the retailers/processors did not use EMV equipped card readers, then the retailers/processors – not the credit card companies – are liable for the fraudulent transactions.  By contrast, if a retailer/processor uses an EMV reader to process a chip equipped card, the credit card company is liable.  Also, credit card companies remain liable for fraudulent transactions using credit cards equipped only with a magnetic stripe and not the chip technology.

Because about 40% of credit cards in the United States presently have embedded chips, domestic retailers and credit card processors face significant potential liability for fraudulent transactions.  As a result, if your business processes credit card transactions, you should promptly convert to EMV enabled credit card readers.

Employee Communications with Attorney on Company Owned Accounts Are Not Privileged

By Cameron G. Shilling (originally published 10/18/2013)

Emails, texts and other communications that an employee has with an attorney using a company account may not be privileged, according to the most recent decision on the issue from a court in Delaware. That state has now joined a growing list of others (Arizona, California, Florida, Idaho, Illinois, New Jersey, New York, Oklahoma, Pennsylvania, Texas, Washington, and West Virginia) where courts have found that an employee waived privilege by communicating with an attorney on a company email account.  However, some courts distinguish between an employee’s communications on a company account, and communications on a personal account using a company electronic device. Thus, a company should ensure that its technology use policy covers both company accounts and devices, and that the company can permissibly review communications between an employee and an attorney before doing so.

The Delaware court adopted a well-recognized four-part test to determine if an employee waived privilege by using a company account or device to communicate with an attorney:

  • Did the company have a policy informing employees that personal communications on company accounts or devices are not private?
  • Did the company monitor and review, or inform employees that it may monitor and review, such personal communications?
  • Did the company have a right and ability to access company accounts and devices?
  • Did the company notify employees, or was the employee otherwise aware, of the company’s policy?

In addition to the states listed above, courts in other states (including Connecticut, the District of Columbia, Maryland, Minnesota, and Kansas) have applied this four-part test, but found on the facts of the particular case that the company did not satisfy each element of the test, and thus that the employee did not waive privilege.

While courts are relatively settled on applying that test to company accounts, there is a split of authority concerning a company’s right to review an employee’s communications with an attorney using a personal account accessed on a company device.  For example, an employee may communicate with an attorney on a webmail account (such as Gmail or Yahoo!) using a company computer, laptop, tablet (e.g., iPad), or smartphone (e.g., iPhone, Droid or Blackberry).  The company may be able to recover such communications from the device if the webmail account was configured to create a backup file on the device, or if the webmail data can be forensically extracted from the “residual” space of the hard drive.

Courts in three states (New Jersey, Massachusetts, and Washington) found that such webmail communications remain privileged.  Two other courts (New York and Washington) disagreed.  They found that, while the differences between a company email account and a personal webmail account accessed on a company device may affect the outcome under the four-part test, the test still should be applied to determine whether the employee waived privilege.

An employee’s communications on company accounts and devices (including with attorneys) can be a treasure trove of valuable evidence.  To ensure that the company has the best possible right to review such communications, it should adopt a technology use policy that appropriately informs employees that all data created, stored, sent or received on a company account or device is the property of the company, and may be monitored and reviewed by the company at any time and for any reason, and therefore that employees cannot expect any such data to be private or confidential from the company.  The policy should be sent to all employees, and each employee should acknowledge that he or she received, reviewed, and will comply with it.  Companies should behave in accordance with the policy, and refrain from doing anything that may lead an employee to expect privacy with respect to such data.  Finally, when a company encounters employee communications with an attorney on a company account or device, it should ensure that it has the right under applicable law to review such communications before doing so.

Employers Liable Under Stored Communications Act for Accessing Employee Facebook and Gmail Accounts

By Cameron G. Shilling (originally published 9/25/2013)

Employers frequently access and review data created or stored by employees on company-owned electronic devices, such as computers, laptops, tablets (iPad), and cellphones (iPhone, Droid and Blackberry).  Well-crafted technology and social media policies specifically authorize employers to do so.  But, if not careful, employers can step over the line between permissible conduct and conduct that violates the federal Stored Communications Act (SCA).  The line between permitted and unlawful conduct is not always apparent, so employers need to be aware of the SCA and seek counsel before accessing or reviewing an employee’s electronic communications.

Company-owned electronic devices are treasure troves of evidence of employee misconduct, particularly where employees use the devices to access personal email (Gmail, Yahoo!, etc.) or social media (Facebook, Google+, Twitter, Flickr, etc.).  Employers feel justifiably entitled to access and review data created and stored on such devices, particularly where employees are instructed that the company owns the devices and has the right to monitor the data, and that employees have no right to privacy.  As a general rule, the law supports employers here.

But the SCA imposes some limits on employers.  And, as a few recent cases demonstrate, it is all too easy for employers to step over the line and violate the federal law.

In Deborah Ehling v. Monmouth-Ocean Hospital Service Corp., the employer terminated the employee based (in part) on posts she made on Facebook.  The court conducted a rigorous analysis and determined that the SCA protects Facebook posts, as long as the posts are limited to friends and are not on the person’s public Facebook pages.  As the court explained,

“when it comes to privacy protection, the critical inquiry is whether Facebook users took steps to limit access to the information on their Facebook walls” and the “privacy protection provided by the SCA does not depend on the number of Facebook friends that a user has.”

Although the employee’s Facebook posts were protected, the employer did not violate the SCA because it received the posts through a person authorized to access them: one of the employee’s co-workers, who was her Facebook friend, gave them to the employer.  However, as this court and others have recognized, an employer violates the SCA if it obtains an employee’s private Facebook posts by other means, such as (1) using a password retrieved from the hard drive of the employee’s company-owned electronic device or from a keystroke logger installed on the device, (2) accessing the account by using the employee’s company-owned device where the password populates automatically, (3) creating a fictitious person on Facebook to friend the employee, or (4) pressuring co-workers to divulge the employee’s Facebook posts.  In those circumstances, access to the Facebook posts would not be authorized under the SCA.

In another case, Sandi Lazette v. Verizon Wireless, the employee returned her company-owned Blackberry to her employer, but did not properly disconnect her Gmail account from it before doing so.  Over the next 18 months, her supervisor read 48,000 emails sent to that account, some of which were quite personal.  The court in that case (like many other courts) found that email stored in webmail accounts (like Gmail) is protected by the SCA, at least while the email resides unread on the servers of the service provider.

The employer made several unsuccessful arguments to avoid liability.  For example, the court rejected the argument that the supervisor was accessing only the company-owned Blackberry, recognizing that he was actually using that device to access an account on the Gmail servers.  However, an employer does not violate the SCA if it recovers an employee’s personal emails that are stored on a company-owned device, such as when the data is in a backup file or recovered from the “residual” space of a hard drive.  The court also rejected the employer’s argument that the employee had impliedly consented to the employer’s review of her Gmail by not properly disconnecting the account.  While consent need not be explicit, the court recognized that,

“Negligence is … not the same as approval, much less authorization.  There is a difference between someone who fails to leave the door locked when going out and one who leaves it open knowing someone will be stopping by.”

Technology presents legitimate opportunities for employers to monitor their employees.  It also presents potential pitfalls, some of which are not apparent.  Employers should continue to harvest valuable information from company-owned electronic devices, but also need to become aware of the SCA and seek counsel before accessing or reviewing employee electronic communications.