Dark (Data) Clouds on the Horizon

There have been many definitions of “Dark Data” over the last couple of years, including: unstructured, unclassified, untagged, unmanaged, and unknown electronic data resident within an organization’s enterprise. Most of these definitions center on unstructured data residing in the enterprise. But with the advent of BYOD and employees’ use of personal clouds, this definition should be expanded to include any corporate-owned data, no matter where it resides.

Dark data, especially dark data stored outside of the company’s infrastructure (and outside its awareness that the data even exists), is an obvious liability for eDiscovery response, regulatory compliance, and corporate IP security.

Is BYOC a good idea?

Much has been written on the dangers of “Bring Your Own Device” (BYOD), but little has been written on the dangers of “Bring Your Own Cloud” (BYOC), otherwise known as personal clouds. Employees now have free cloud storage from many vendors that gives them access to their content no matter where they are. These same personal clouds also provide automatic syncing of desktop folders and the ability to share specific documents or even entire folders. They offer a fantastic use model for individuals to upload their personal content for backup, sharing, and remote availability. In the absence of any real guidance from employers, employees have also begun to use these personal clouds for both personal and work purposes.

The problem arises when corporate-owned data is moved up to personal clouds without the organization’s approval or awareness. Besides the obvious problem of potential theft of corporate IP, effective eDiscovery and regulatory compliance become impossible. Corporate data residing in personal clouds becomes a “Dark Cloud” to the organization: corporate data residing in repositories outside the organization’s infrastructure, management, or knowledge.

Dark Clouds and eDiscovery

Organizations have been trying to figure out what to do with huge amounts of dark data within their infrastructure, particularly when anticipating or responding to litigation. Almost everything is potentially discoverable in litigation if it pertains to the case, and searching for and reviewing GBs or TBs of dark data residing in the enterprise can push the cost of eDiscovery up substantially. But imagine the GBs of corporate dark data residing in employee personal clouds that the organization has zero awareness of… Is the organization still responsible for searching for it, securing it, and producing it? Depending on who you ask, the answer is yes, no, and “it depends.”

In reality, the correct answer is “it depends.” It will depend on what the organization did to try to stop employee dark clouds from existing. Was a policy prohibiting employee use of personal clouds with corporate data in place? Were employees alerted to the policy? Did the organization try to audit and enforce the policy? Did the organization use technology to stop access to personal clouds from within the enterprise, and did it use technology to stop the movement of corporate data to personal clouds (content control)?

If the organization can show intent and actions to ensure dark clouds were not available to employees, then the expectation of dark cloud eDiscovery search may not exist. But if dark cloud due diligence was not done and/or documented, all bets are off.

Regulatory Compliance and Dark Clouds

Employee personal clouds can also end up becoming the repository of sensitive data subject to regulatory security and privacy requirements. Personally identifiable information (PII) and protected health information (PHI) under the control of an organization are subject to numerous security and privacy regulations that, if not followed, can trigger costly penalties. Inadvertent exposure can occur as employees move daily work product up to their personal clouds to continue work at home or while traveling. One problem is that many employees are not trained on recognizing and handling sensitive information: what constitutes sensitive information, how it should be secured, and the liabilities to the organization if it is leaked. A related problem is the lack of understanding of how insecure personal clouds, and the devices used to access them, can be. Take, for example, an employee who accesses their personal cloud from a coffee shop over an unsecured Wi-Fi connection. A hacker can gain access to the laptop via the unsecured Wi-Fi connection, access the personal cloud folder, and browse the personal cloud through that connection (a password would not be required because most users opt to auto-sign in to their cloud accounts).

As with the previous eDiscovery discussion, if the organization has not taken reasonable steps to ensure sensitive data cannot be leaked (even inadvertently by an employee), it leaves itself open to regulatory fines and more.

Reducing the Risk of Dark Clouds

The only way to eliminate the risk associated with dark clouds is to stop corporate data from leaving the security of the enterprise in the first place. This outcome is almost impossible to guarantee without adopting draconian measures that most business cultures would rebel against, but there are several measures an organization can employ to at least reduce the risk:

  • First, create a use policy to address what is acceptable and not acceptable behavior when using organization equipment, infrastructure and data.
  • Document all policies and update them regularly.
  • Train employees on all policies – on a regular basis.
  • Regularly audit employee adherence to all policies, and document the audits.
  • Enforce all breaches of the policy.
  • Employ systematic security measures across the enterprise:
    • Don’t allow employee personal devices access to the infrastructure (i.e., no BYOD)
    • Stop employee access to personal clouds – in many cases this can be done systematically by blocking specific ports or domains
    • Employ systematic enterprise access controls
    • Employ enterprise content controls – these are software applications that control access to individual content based on the actual content and the user’s security profile.
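As an illustration of the network and content controls listed above, here is a minimal sketch of an egress filter that refuses connections to known personal-cloud domains. The domain list and function name are hypothetical examples for illustration, not a vendor blocklist or a complete security control:

```python
# Sketch of a simple egress filter for known personal-cloud domains.
# The domain list below is an illustrative assumption, not an
# exhaustive or authoritative blocklist.

PERSONAL_CLOUD_DOMAINS = {
    "dropbox.com",
    "drive.google.com",
    "onedrive.live.com",
    "box.com",
    "icloud.com",
}

def is_blocked(hostname: str) -> bool:
    """Return True if the hostname matches a blocked personal-cloud domain."""
    host = hostname.lower().rstrip(".")
    return any(host == d or host.endswith("." + d) for d in PERSONAL_CLOUD_DOMAINS)

print(is_blocked("www.dropbox.com"))  # True
print(is_blocked("example.com"))      # False
```

In practice this kind of check would live in a proxy or firewall policy rather than application code; the point is that the control is a straightforward domain match, which is why it can be applied systematically.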

Employee dark clouds are a huge liability for organizations and will become more so as attorneys become more educated on how employees create, use, store and share information.


Law Firms, HIPAA and the “Minimum Necessary Standard” Rule

The HIPAA Omnibus Rule became effective on March 26, 2013. Covered entities and Business Associates had until September 23, 2013 to become compliant with the entirety of the law, including the security rule, the privacy rule and the breach notification rule. Law firms that do business with a HIPAA-regulated organization and receive protected health information (PHI) are considered a Business Associate (BA) and subject to all regulations, including the security, privacy and breach notification rules. These rules are very prescriptive in nature and can impose additional procedures and additional costs on a law firm.

Under HIPAA, there is a specific rule covering the use of PHI by both covered entities and Business Associates called the “Minimum Necessary Standard” rule, found at 45 CFR 164.502(b) and 164.514(d). The HIPAA Privacy Rule and minimum necessary standard are enforced by the U.S. Department of Health and Human Services Office for Civil Rights (OCR). Under this rule, law firms must develop policies and procedures that limit PHI uses, disclosures and requests to those necessary to carry out the organization’s work, including:

  • Identification of persons or classes of persons in the workforce who need access to PHI to carry out their duties;
  • For each of those, specification of the category or categories of PHI to which access is needed and any conditions appropriate to such access; and
  • Reasonable efforts to limit access accordingly.
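The three policy elements above amount to a role-based access check: map each class of workers to the PHI categories they need, then deny everything else. The roles and categories in this sketch are hypothetical illustrations, not drawn from the regulation or any actual firm policy:

```python
# Illustrative "minimum necessary" access check. Role names and PHI
# categories are hypothetical examples for this sketch only.

ROLE_PHI_ACCESS = {
    "litigation_attorney": {"medical_records", "billing_records"},
    "paralegal": {"billing_records"},
    "it_support": set(),  # duties require no PHI access at all
}

def may_access(role: str, phi_category: str) -> bool:
    """Allow access only to the PHI categories a role needs for its duties."""
    return phi_category in ROLE_PHI_ACCESS.get(role, set())

print(may_access("paralegal", "billing_records"))   # True
print(may_access("it_support", "medical_records"))  # False
```

Note the default for an unknown role is an empty set, so access is denied unless the policy explicitly grants it, which mirrors the “reasonable efforts to limit access” requirement.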

The minimum necessary standard is based on the theory that PHI should not be used or disclosed when it is not necessary to accomplish a particular task. The standard generally requires law firms to take reasonable steps to limit the use or disclosure of PHI to the minimum necessary to represent the healthcare client. The Privacy Rule’s minimum necessary requirements are designed to be flexible enough to accommodate the various circumstances of any covered entity.

The first thing firms should understand is that, as Business Associates subject to HIPAA through their access and use of client data, firms are subject to the Minimum Necessary Standard, which requires that when a HIPAA-covered entity or a business associate (law firm) of a covered entity uses or discloses PHI or when it requests PHI from another covered entity or business associate, the covered entity or business associate must make “reasonable efforts to limit protected health information to the minimum necessary to accomplish the intended purpose of the use, disclosure, or request.”

Law firm information governance professionals need to be aware of this rule and build it into their healthcare client related on-boarding processes.

You Don’t Know What You Don’t Know

The Akron Legal News this week published an interesting editorial on information governance. The story by Richard Weiner discussed how law firms are dealing with the transition from rooms filled with hard-copy records to electronically stored information (ESI), which includes firm business records as well as huge amounts of client eDiscovery content. The story pointed out that ESI flows into the law firm so quickly and in such huge quantities that no one can track it, much less know what it contains. Law firms are now facing an inflection point: change the way all information is managed, or suffer client dissatisfaction and client loss.

The story pointed out that “in order to function as a business, somebody is going to have to, at least, track all of your data before it gets even more out of control – Enter information governance.”

There are many definitions of information governance (IG) floating around but the story presented one specifically targeted at law firms: IG is “the rules and framework for managing all of a law firm’s electronic data and documents, including material produced in discovery, as well as legal files and correspondence.” Richard went on to point out that there are four main tasks to accomplish through the IG process. They are:

  • Map where the data is stored;
  • Determine how the data is being managed;
  • Determine data preservation methodology;
  • Create forensically sound data collection methods.

I would add several more to this list:

  • Create a process to account for and classify inbound client data such as eDiscovery and regulatory collections.
  • Determine those areas where client information governance practices differ from firm information governance practices.
  • Reconcile those differences with client(s).

As law firms transition to mostly ESI for both firm business and client data, they will need to adopt IG practices and processes to account for and manage these different requirements. Many believe this transition will eventually lead to the incorporation of machine learning techniques into IG, enabling law firm IG processes to have a much more granular understanding of the actual meaning of the data, not just whether it’s a firm business record or part of a client eDiscovery response. This will in turn enable more granular data categorization of all firm information.

Iron Mountain has hosted the annual Law Firm Information Governance Symposium, which has directly addressed many of these topics around law firm IG. The symposium has produced “A Proposed Law Firm Information Governance Framework,” a detailed description of the processes to consider as law firms adopt an information governance program.

Emails considered “abandoned” if older than 180 days

The Electronic Communications Privacy Act – Part 1

It turns out that those 30-day email retention policies I have been putting down for years may… actually be the best policy.

This may not be a surprise to some of you, but the government can access your emails without a warrant by simply providing a statement (or subpoena) asserting that the emails in question are relevant to an ongoing federal case – criminal or civil.

This disturbing fact is legally justified through the misnamed Electronic Communications Privacy Act of 1986, otherwise known as 18 U.S.C. §§ 2510–22.

There are some stipulations on the government gaining access to your email:

    • The email must be stored on a server or in remote storage (not on an individual’s computer). This obviously targets Gmail, Outlook.com, Yahoo Mail and others, but what about corporate email administered by third parties, Outlook Web Access, remote workers who VPN into their corporate email servers, PSTs saved on cloud storage…
    • The emails must have already been opened. Does Outlook auto-preview affect the state of “being read”?
    • The emails must be over 180 days old if unopened

The ECPA (remember, it was written in 1986) starts with the premise that any email (electronic communication) stored on a server longer than 180 days had to be junk email and abandoned. In addition, the assumption is that if you opened an email and left it on a “third-party” server for storage, you were giving that “third party” access to your mail and giving up any privacy interest you had – which in reality is happening with several well-known email cloud providers (see their terms and conditions). In 1986 the expectation was that you would download your emails to your local computer and then either delete them or print out a hard copy for record keeping. So the rules put in place in 1986 made sense: unopened email less than 180 days old was still in transit and could be secured by the authorities only with a warrant (see below); opened email, or mail stored for longer than 180 days, was considered non-private or abandoned, so the government could access it with a subpoena (an administrative request) – in effect, simply by asking for it.
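That 1986-era decision logic is simple enough to write down. This is a deliberately simplified sketch of the statute’s original warrant/subpoena distinction as described above, for illustration only (not legal advice, and not an account of how courts apply the rule today):

```python
# Sketch of the ECPA's original logic for when stored email could be
# obtained with a mere subpoena rather than a warrant. Simplified
# reading for illustration only; not legal advice.

def legal_process_required(opened: bool, stored_days: int) -> str:
    """Return the minimum legal process under the statute's 1986 logic."""
    if not opened and stored_days <= 180:
        return "warrant"   # still "in transit": full warrant protection
    return "subpoena"      # opened, or stored > 180 days: treated as abandoned

print(legal_process_required(opened=False, stored_days=90))   # warrant
print(legal_process_required(opened=True, stored_days=10))    # subpoena
print(legal_process_required(opened=False, stored_days=200))  # subpoena
```

The sketch makes the oddity obvious: the protection turns entirely on two facts (opened or not, age in days) that say nothing about how private the content actually is.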

Warrant versus Subpoena: (from Surveillance Self-Defense Web Site)

To get a warrant, investigators must go to a neutral and detached magistrate and swear to facts demonstrating that they have probable cause to conduct the search or seizure. There is probable cause to search when a truthful affidavit establishes that evidence of a crime will probably be found in the particular place to be searched. Police suspicions or hunches aren’t enough — probable cause must be based on actual facts that would lead a reasonable person to believe that the police will find evidence of a crime.

In addition to satisfying the Fourth Amendment’s probable cause requirement, search warrants must satisfy the particularity requirement. This means that in order to get a search warrant, the police have to give the judge details about where they are going to search and what kind of evidence they are searching for. If the judge issues the search warrant, it will only authorize the police to search those particular places for those particular things.

Subpoenas are issued under a much lower standard than the probable cause standard used for search warrants. A subpoena can be used so long as there is any reasonable possibility that the materials or testimony sought will produce information relevant to the general subject of the investigation.

Subpoenas can be issued in civil or criminal cases and on behalf of government prosecutors or private litigants; often, subpoenas are merely signed by a government employee, a court clerk, or even a private attorney. In contrast, only the government can get a search warrant.

With all of the news stories about Edward Snowden and the NSA over the last year, this revelation brings up many questions for those of us in the eDiscovery, email archiving and cloud storage businesses.

In future blogs I will discuss these questions and others, such as how this affects “abandoned” email archives.

Cloudy, with a chance of eDiscovery

In the last year there have been numerous articles, blogs, presentations and panels discussing the legal perils of “Bring Your Own Device” or BYOD policies. BYOD refers to the policy of permitting employees to bring personally owned mobile devices (laptops, tablets, and smartphones) to their workplace, and to use those devices to access privileged company information and applications. The problem with BYOD is company access to company data housed on the device. For example, how would you search for potentially relevant content on a smartphone if the employee wasn’t immediately available or refused to give the company access to it?

Many organizations have banned BYOD as a security risk as well as a liability when involved with litigation.

BYOC Equals Underground Archiving?

Organizations are now dealing with another problem, one with even greater liabilities. “Bring your own cloud” or BYOC refers to the availability and use by individuals of free cloud storage space available from companies like Microsoft, Google, Apple, Dropbox, and Box.net. These services provide specific amounts of cloud storage space for free.

The advantage of these services is the ability to move and store work files so they are immediately available from anywhere – at home or while traveling. This means employees no longer have to copy files to a USB stick or, worse, email work files as attachments to their personal email accounts. The disadvantage of these services is that corporate information can easily migrate away from the organization with no indication it was ever copied or moved – otherwise known as “underground archiving.” This also means that potentially responsive information is not protected from deletion or available for review during eDiscovery.

Stopping employee access to outside public clouds is a tough goal and may negatively affect employee productivity unless the organization offers something just as good that it can manage and access as well. For example, several companies I have talked to over the last year have begun offering Dropbox accounts to employees with the understanding that the company has access to them for compliance, eDiscovery or security reasons, all the while providing the employee the advantages of a cloud account.

The other capability organizations should research about these cloud offerings is their ability to respond to legal hold and eDiscovery search. Questions to consider include: Does the organization have the ability to search across all company-owned accounts for specific content? What type of search is offered – keyword, concept? Can the organization view the contents of documents without changing the document metadata? Can the organization place a “stop” on deletions by employees at any time?

Organizations need to be aware of and adapt to these cloud services and be thorough in addressing them.

For Corporate counsel:
  1. Be aware these types of cloud storage services exist for your employees.
  2. Think about offering these cloud services to employees under the organization’s control.
  3. Create a use policy addressing these services. Either forbid employees from setting up and using these services from any work location or on company-owned equipment, or, if they are allowed, be sure employees acknowledge these accounts can and will be subject to eDiscovery search.
  4. Audit the policy to ensure it is being followed.
  5. Enforce the policy if employees are not following it.
  6. Train the employees on the policy.
  7. Document everything.
For employees:
  1. Understand that if you set up and use these services from employer locations, on employer equipment, and with company ESI, all content in that account – personal or company related – could be subject to eDiscovery review.
  2. Ask your organization what the policy is for employee use of cloud storage.
  3. If you use these services for work, only use them with company content, not personal files.
  4. Be forthcoming with any legal questioning about the existence of these services you use.
  5. Do not download any company ESI from these services to any personal computer; this could potentially open up that personal computer to eDiscovery by corporate counsel.
For opposing counsel:

Be aware of these services and ask the following questions during discovery:

  1. Do any of your employees utilize company sanctioned or non-sanctioned public cloud storage services?
  2. Do you have a use policy which addresses these services?
  3. Does the policy penalize employees for not following this use policy?
  4. Do you audit this use policy?
  5. Have you documented the above?

These cloud services are an obvious productivity tool for employees to utilize to make their lives easier as well as more productive. All involved need to be aware of the eDiscovery implications.

Tolson’s Three Laws of Machine Learning

Much has been written in the last several years about Predictive Coding (as well as Technology Assisted Review, Computer Aided Review, and Craig Ball’s hilarious Super Human Information Technology). This automation technology, now heavily used for eDiscovery, relies on “machine learning,” a discipline of artificial intelligence (AI) that automates computer processes that learn from data, identify patterns and predict future results with varying degrees of human involvement. This iterative machine training/learning approach has catapulted computer automation to unheard-of and scary levels of potential. The question I get a lot (I think only half joking) is “when will they learn enough to determine we and the attorneys they work with are no longer necessary?”

Is it time to build in some safeguards to machine learning? Thinking back to the days I read a great deal of Isaac Asimov (last week), I thought about Asimov’s The Three Laws of Robotics:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Following up on these robot safeguards, I came up with Tolson’s Three Laws of Machine Learning:

  1. A machine may not embarrass a lawyer or, through inaction, allow a lawyer to become professionally negligent and thereby unemployed.
  2. A machine must obey instructions given it by the General Counsel (or managing attorney) except where such orders would conflict with the First Law.
  3. A machine must protect its own existence through regular software updates and scheduled maintenance as long as such protection does not conflict with the First or Second Law.

I think these three laws go a long way toward putting eDiscovery automation protections into effect for the legal community. Other machine learning laws that others have suggested are:

  • A machine must refrain from destroying humanity
  • A machine cannot repeat lawyer jokes…ever
  • A machine cannot compliment opposing counsel
  • A machine cannot date legal staff

If you have other Machine Learning laws to contribute, please leave comments. Good luck and live long and prosper.

Discovering Dark Data

Dark data, otherwise known as unstructured, unmanaged, and uncategorized information, is a major problem for many organizations (and many don’t even know it). Many organizations don’t have the will, systems or processes in place to automatically index and categorize their rapidly growing unstructured dark data, and instead rely on employees to manually manage their own information. This reliance is a no-win situation because employees have neither the incentive nor the time to actively manage their information, so dark data continues to pile up all over the organization. This accumulation of dark data has several obvious problems associated with it:

  • Dark data consumes costly storage space and resources – Most medium to large organizations provide terabytes of file share storage space for employees and departments to utilize. Employees drag and drop all kinds of work-related files (and personal files like photos, MP3 music files, and personal communications) onto these shares, as well as PSTs and workstation backup files. The vast majority of these files are unmanaged and are never looked at again by the employee or anyone else.
  • Dark data consumes IT resources – Personnel are required to perform nightly backups and DR planning, and IT personnel must find or restore files employees could not locate.
  • Dark data masks security risks – File shares act as “catch-alls” for employees. Sensitive company information regularly finds its way to these repositories. These file shares are almost never secured, so sensitive information like personally identifiable information (PII), protected health information (PHI), and intellectual property can be inadvertently leaked.
  • Dark data raises eDiscovery costs – Organizations find themselves trying to figure out what to do with huge amounts of dark data, particularly when they’re anticipating litigation. Almost everything is discoverable in litigation if it pertains to the case and reviewing GBs or TBs of dark data can push the cost of eDiscovery up substantially.

Dark Data…It’s a Good Thing

Many organizations have begun to look at uncontrolled dark data growth and reason that, as Martha Stewart used to say, “it’s a good thing.” They believe they can run big data analytics on it and discover really interesting things that will help them market and sell better. This strategy misses the point of information governance, which is defined as:

“a cross-departmental framework consisting of the policies, procedures and technologies designed to optimize the value of information while simultaneously managing the risks and controlling the associated costs, which requires the coordination of eDiscovery, records management and privacy/security disciplines.”

Data has risks associated with it as well as cost beyond its daily cost of storage. Let’s consider the legal implications of dark data.

Almost everything is discoverable in litigation if it’s potentially relevant to the case. The fact that tens or hundreds of terabytes of unindexed and unmanaged content are sitting on file shares means those terabytes of files might contain relevant content, so they may have to be reviewed for relevancy in a given legal case. That fact can add hundreds of thousands or millions of dollars of additional cost to a single eDiscovery request. For example, according to a CGOC survey in 2012, on average 1% of data is subject to legal hold, 5% is subject to regulatory retention and 25% has some value to the business, leaving 69% with no real legal, regulatory or business reason to be kept. So for a given 20 TB file share, on average 1%, or 200 GB, is potentially relevant to a given eDiscovery request. 200 GB of content can conservatively hold 2 million pages that might have to be reviewed to determine relevancy to the case. These same 2 million pages would cost $1.5 million to review using standard manual review processes. The big question that has to be asked is how many of those 2 million pages were irrelevant to the business and should not have been kept? Applying the same 69% number from the survey mentioned above: 2 million docs × 69% = 1.38 million docs that should have been deleted and would never have had to be reviewed for the case.
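The arithmetic above can be checked with a short script. The pages-per-GB figure and the implied per-page review rate below are back-calculated assumptions from the article’s round numbers (10,000 pages/GB and $0.75/page), not survey data:

```python
# Working through the CGOC-survey arithmetic from the paragraph above.
# Pages/GB and $/page are assumptions back-calculated from the
# article's round numbers, not figures from the survey itself.

file_share_gb = 20_000   # 20 TB file share
legal_hold_pct = 0.01    # ~1% subject to legal hold (CGOC 2012)
no_value_pct = 0.69      # ~69% with no legal/regulatory/business value

relevant_gb = file_share_gb * legal_hold_pct   # 200 GB
pages = relevant_gb * 10_000                   # ~10,000 pages/GB -> 2,000,000
review_cost = pages * 0.75                     # ~$0.75/page -> $1,500,000
needless_pages = pages * no_value_pct          # ~1,380,000 pages

print(f"{relevant_gb:.0f} GB in scope, {pages:,.0f} pages, "
      f"${review_cost:,.0f} review cost, "
      f"{needless_pages:,.0f} pages that need never have been kept")
```

Even with these rough inputs, the conclusion is hard to escape: roughly two thirds of the review spend goes to documents that had no reason to exist at the time of the request.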

Ask your GC if uncontrolled and unmanaged dark data growth is a “good thing”…

Dark data equals higher discovery costs, so make dark data visible so that you can find it, manage it, and act on it.