Ask the Magic 8-Ball: “Is Predictive Defensible Disposal Possible?”


The Good Ole Days of Paper Shredding

In my early career, shred days were commonplace: a scheduled annual activity where the company directed all employees to comb through their paper records and determine what should be disposed of. At the government contractor I worked for, we wheeled our boxes out to the parking lot to a very large truck with huge industrial shredders in the back. Once the boxes of documents were shredded, we walked the remains over to a second truck, a burn truck, where we, as records custodians, verified that all of our records were destroyed. These shred days were a way to collect, verify and, yes, physically shred all the paper records that had gone beyond their retention period over the preceding year.

The Magic 8-Ball says Shred Days aren’t Defensible

Nowadays, this type of activity carries negative connotations and is much riskier. Take, for example, the recent case of Rambus v. SK Hynix. U.S. District Judge Ronald Whyte in San Jose reversed his own prior ruling from 2009, in which he had issued a judgment against SK Hynix awarding Rambus Inc. $397 million in a patent infringement case. In his reversal this year, Judge Whyte ruled that Rambus Inc. had spoliated documents in bad faith when it hosted company-wide “shred days” in 1998, 1999, and 2000. Judge Whyte found that Rambus could have reasonably foreseen litigation against Hynix as early as 1998, and that Rambus therefore engaged in willful spoliation during the three “shred days” (a finding of spoliation can also be based on inadvertent destruction of evidence). Because of this spoliation ruling, the Judge reduced the prior award from $397 million to $215 million, a cost to Rambus of $182 million.

Another well-known example of sudden retention/disposition policy activity that caused unintended consequences is the Arthur Andersen/Enron case. As the Enron investigation was getting underway, Enron’s accounting firm, Arthur Andersen, sent some of its employees an email reminding them of the firm’s document retention policy, under which documents no longer needed were to be destroyed.

This email was a key reason why Arthur Andersen ceased to exist shortly after the case concluded. Arthur Andersen was charged with, and found guilty of, obstruction of justice for shredding thousands of documents and deleting emails and company files that tied the firm to its audit of Enron. Less than a year after that email was sent, Arthur Andersen surrendered its CPA license on August 31, 2002, and 85,000 employees lost their jobs.

Learning from the Past – Defensible Disposal

These cases highlight the need for a true information governance process that includes a defensible disposal capability. In these instances, an information governance process would have been capturing, indexing, applying retention policies to, and protecting content on litigation hold, and disposing of content beyond the retention schedule and not on legal hold… automatically, based on documented and approved, legally defensible policies. A documented and approved process that is consistently followed and has proper safeguards goes a long way with the courts toward showing good-faith intent to manage content and to protect content subject to anticipated litigation.

To successfully automate the disposal of unneeded information in a consistently defensible manner, auto-categorization applications must be able to conceptually understand the meaning of unstructured content, so that only content meeting your retention policies, regardless of language, is classified as subject to retention.

Taking Defensible Disposal to the Next Level – Predictive Disposition

A defensible disposal solution that can conceptually understand content meaning, and that incorporates an iterative, human-supervised training process such as “train by example,” provides accurate, predictive retention and disposition automation.
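
To make this concrete, below is a minimal sketch of such a “train by example” workflow. It assumes a scikit-learn-style Python toolkit; the exemplar documents, retention categories, and confidence threshold are hypothetical illustrations, not any vendor’s product.

    # Minimal "train by example" retention classification sketch.
    # Hypothetical exemplars and categories; not a real product's API.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    # Exemplar documents labeled by a records manager ("train by example").
    train_docs = [
        "Master services agreement, renewal terms and termination clauses",
        "Q3 invoice for laboratory equipment, payment due in 30 days",
        "Lunch plans for Friday, see you at noon",
    ]
    train_labels = ["contract-7yr", "finance-3yr", "transitory"]

    vectorizer = TfidfVectorizer(stop_words="english")
    model = LogisticRegression(max_iter=1000)
    model.fit(vectorizer.fit_transform(train_docs), train_labels)

    def classify(doc, review_threshold=0.7):
        """Assign a retention category, deferring to a human when unsure."""
        probs = model.predict_proba(vectorizer.transform([doc]))[0]
        best = probs.argmax()
        if probs[best] < review_threshold:
            return "needs-human-review"  # routed into the supervised workflow
        return model.classes_[best]

    print(classify("Statement of work amendment extending the contract term"))

With only three exemplars the model is unsure of almost everything, so most documents route to a human reviewer; as reviewers confirm or correct predictions and the model retrains, fewer documents need human attention. That iterative loop is the essence of the approach.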

Moving away from manual, employee-based information governance to automated information retention and disposition, with truly accurate (95 to 99%) and consistent meaning-based predictive information governance, will provide the defensibility organizations require today to keep their information repositories current.

Predicting the Future of Information Governance


Information Anarchy

Information growth is out of control. The compound annual growth rate for digital information is estimated at 61.7%. According to a 2011 IDC study, 90% of all data created in the next decade will be unstructured. These facts make it almost impossible for organizations to capture, manage, store, share and dispose of this data in any way that meaningfully benefits the organization.

Successful organizations run on, and depend on, information. But information is valuable to an organization only if you know where it is, what’s in it, and what is shareable; in other words, only if it is managed. In the past, organizations have relied on end-users to decide what should be kept, where, and for how long. In fact, 75% of data today is generated and controlled by individuals. In most cases this practice is ineffective and causes what many refer to as “covert” or “underground” archiving: individuals keeping everything in their own unmanaged local archives. These underground archives effectively lock most of the organization’s information away, hidden from everyone else in the organization.

This growing mass of information has brought us to an inflection point: get control of your information to enable innovation, profit and growth, or continue down the current path of information anarchy and choke on your competitors’ dust.


Choosing the Right Path

How does an organization ensure this inflection point is navigated correctly? Information governance. You must get control of all your information by employing proven processes and technologies that allow you to create, store, find, share and dispose of information in an automated and intelligent manner.

An effective information governance process optimizes overall information value by ensuring the right information is retained and quickly available for business, regulatory, and legal requirements. This process reduces regulatory and legal risk, ensures needed data can be found quickly and is secured for litigation, reduces overall eDiscovery costs, and provides structure to unstructured information so that employees can be more productive.

Predicting the Future of Information Governance

Predictive Governance is the bridge across the inflection point. It combines machine-learning technology with human expertise and direction to automate your information governance tasks. Using this proven human-machine iterative training capability, Predictive Governance is able to accurately automate the concept-based categorization, data enrichment and management of all your enterprise data to reduce costs, reduce risks, enable information sharing and mitigate the strain of information overload.

Automating information governance so that all enterprise data is captured, granularly evaluated for legal requirements, regulatory compliance, or business value, and stored or disposed of in a defensible manner is the only way for organizations to move to the next level of information governance.

Finding the Cure for the Healthcare Unstructured Data Problem


Healthcare information and records continue to grow with the introduction of new devices and expanding regulatory requirements such as the Affordable Care Act, the Health Insurance Portability and Accountability Act (HIPAA), and the Health Information Technology for Economic and Clinical Health (HITECH) Act. In the past, healthcare records consisted mostly of paper forms or structured billing data, which were relatively easy to categorize, store, and manage. That has been changing as new technologies enable faster and more convenient ways to share and consume medical data.

According to an April 9, 2013 article on ZDNet.com, by 2015, 80% of new healthcare information will be composed of unstructured information; information that’s much harder to classify and manage because it doesn’t conform to the “rows & columns” format used in the past. Examples of unstructured information include clinical notes, emails & attachments, scanned lab reports, office work documents, radiology images, SMS, and instant messages.

Who or what is going to actually manage this growing mountain of unstructured information?

To ensure regulatory compliance and the confidentiality and security of this unstructured information, the healthcare industry will have to 1) hire many more professionals to manually categorize and manage it, or 2) acquire technology to do it automatically.

Consider the first option: the cost of having people manually categorize and manage unstructured information would be prohibitively expensive, not to mention slow, and it would expose private patient data to even more individuals. That leaves the second option: information governance technology. Because of the nature of unstructured information, a technology solution would have to:

  1. Recognize and work with hundreds of data formats
  2. Communicate with the most popular healthcare applications and data repositories
  3. Draw conceptual understanding from “free-form” content so that categorization can be accomplished at an extremely high accuracy rate
  4. Enable proper access security levels based on content
  5. Accurately retain information based on regulatory requirements
  6. Securely and permanently dispose of information when required

An exciting emerging information governance technology that can actually address the above requirements uses the same next-generation approach the legal industry has adopted: proactive information governance based on conceptual understanding of content, machine learning, and iterative “train by example” capabilities.
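
A rough sketch of how the six requirements above might hang together is shown below. Every function here is a simplified stand-in built on Python’s standard library; the retention rules and category names are hypothetical, not a real product’s API.

    # Hypothetical skeleton of an unstructured-data governance pipeline.
    import mimetypes
    from datetime import date, timedelta

    RETENTION = {
        "clinical-note": timedelta(days=365 * 10),  # hypothetical 10-year rule
        "transitory": timedelta(days=30),
    }

    def extract_text(path):
        # Stand-in extractor; real systems use format-specific connectors
        # to hundreds of formats and repositories (reqs 1 and 2).
        with open(path, encoding="utf-8", errors="ignore") as f:
            return f.read()

    def classify(text):
        # Stand-in for conceptual categorization (req 3); a real engine
        # would use a trained, meaning-based model, not a keyword test.
        return "clinical-note" if "patient" in text.lower() else "transitory"

    def ingest(path):
        fmt, _ = mimetypes.guess_type(path)           # req 1: recognize format
        category = classify(extract_text(path))       # req 3: categorize content
        expires = date.today() + RETENTION[category]  # req 5: retention rule
        # reqs 4 and 6 (content-based access control and secure,
        # permanent disposal) would hook in here and at expiration.
        return {"path": path, "format": fmt, "category": category,
                "expires": expires}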

The lifecycle of information


Organizations habitually over-retain information, especially unstructured electronic information, for all kinds of reasons. Many organizations simply have not addressed what to do with it, so they fall back on relying on individual employees to decide what should be kept, for how long, and what should be disposed of. At the opposite end of the spectrum, a minority of organizations have tried centralized enterprise content management systems but found them difficult to use; employees work around them and keep huge amounts of data locally on their workstations, on removable media, in cloud accounts, or on rogue SharePoint sites used as “data dumps” with little or no records management or IT supervision. Much of this information is transitory, expired, or of questionable business value. Because of this lack of management, information continues to accumulate. This build-up raises the cost of storage as well as the risk associated with eDiscovery.

In reality, as information ages, its probability of re-use, and therefore its value, shrinks quickly. Fred Moore, founder of Horison Information Strategies, wrote about this concept years ago.

Figure 1 below shows that as data ages, its probability of reuse drops very quickly even as the amount of saved data rises. Once data has aged 10 to 15 days, the probability of it ever being looked at again approaches 1%, and as it continues to age that probability approaches, but never quite reaches, zero (figure 1, red shading).

Contrast that with the fact that a large part of any organizational data store has little or no business, legal or regulatory value. The Compliance, Governance and Oversight Council (CGOC) conducted a survey in 2012 showing that, on average, 1% of organizational data is subject to litigation hold, 5% is subject to regulatory retention, and 25% has some business value (figure 1, green shading). This means that approximately 69% of an organization’s data store has no legal, regulatory or business value and could be disposed of without consequences.

The average employee conservatively creates, sends, receives and stores 20 MB of data per business day. This means that after about 15 calendar days (roughly 11 business days) they have accumulated 220 MB of new data; after 90 days, 1.26 GB; and after three years, 15.12 GB. So how much of this accumulated data needs to be retained? Referring again to figure 1 below, the blue-shaded area represents the information that probably has no legal, regulatory or business value according to the 2012 CGOC survey. At the end of three years, the amount of retained data from a single employee that could be disposed of without adverse effect to the organization is 10.43 GB. Now multiply that by the total number of employees and you are looking at some very large data stores.
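
The arithmetic behind these figures is easy to verify. Here is a small sketch, using the 20 MB per business day rate and the CGOC survey percentages from the text, and counting 1 GB as 1,000 MB:

    # Worked arithmetic for the accumulation figures above (a sketch;
    # the rate and percentages come from the text, not from measurement).
    MB_PER_BUSINESS_DAY = 20
    BUSINESS_DAYS = {
        "~15 calendar days": 11,
        "~90 calendar days": 63,
        "three years": 252 * 3,
    }

    # 1% legal hold + 5% regulatory + 25% business value = 31% retained
    disposable_share = 1 - (0.01 + 0.05 + 0.25)  # = 0.69

    for label, days in BUSINESS_DAYS.items():
        total_mb = MB_PER_BUSINESS_DAY * days
        print(f"{label}: {total_mb / 1000:.2f} GB accumulated, "
              f"{total_mb * disposable_share / 1000:.2f} GB disposable")
    # three years -> 15.12 GB accumulated, 10.43 GB disposable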

Figure 1: The Lifecycle of data

The above lifecycle of data shows us that employees really don’t need all of the data they squirrel away (its probability of re-use drops to about 1% at around 15 days), and, based on the CGOC survey, approximately 69% of organizational data is not subject to legal hold or regulatory retention and has no business value. The difficult piece of this whole process is how an organization can efficiently determine what data is not needed and dispose of it automatically…

As unstructured data volumes continue to grow, automatic categorization of data is quickly becoming the only way to get ahead of the data flood. Without accurate automated categorization, the ability to find the data you need, quickly, will never be realized. Better still, if data categorization can be based on the meaning of the content, not just a simple rule or keyword match, then highly accurate categorization, and therefore information governance, is achievable.

Total Time & Cost to ECA


A key phase in eDiscovery is Early Case Assessment (ECA), the process of reviewing case data and evidence to estimate risk, cost and time requirements, and to set the appropriate go-forward strategy for prosecuting or defending a legal case: should you fight the case or settle as soon as possible? Early case assessment can be expensive and time-consuming, and because of the time involved, may not leave you enough time to properly review evidence and create case strategy. Organizations are continuously looking for ways to get into the early case assessment process as quickly as possible, with the most accurate data, while spending the least amount of money.

The early case assessment process usually involves the following steps:

  1. Determine what the case is about, who in your organization could be involved, and the timeframe in question.
  2. Determine where potentially relevant information could be residing – storage locations.
  3. Place a broad litigation hold on all potentially responsive information.
  4. Collect and protect all potentially relevant information.
  5. Review all potentially relevant information.
  6. Perform a risk-benefit analysis on reviewed information.
  7. Develop a go-forward strategy.

Every year organizations continue to amass huge amounts of electronically stored information (ESI), primarily because few of them have systematic processes to actually dispose of electronic information; it is just too easy for custodians to hit the “save” button and forget about it. This ever-growing mass of electronic information means effective early case assessment can no longer be a strictly manual process. Software applications that can quickly find, cull down and prioritize responsive electronic documents must be utilized to give the legal team time to actually devise a case strategy.

Total Time & Cost to ECA (TT&C to ECA)

The real measure of effective ECA is the total time and cost consumed to get to the point of being able to create a go-forward strategy; total time & cost to ECA.

The most time-consuming and costly steps are the collection and review of all potentially relevant information (steps 4 and 5 above). To make the most informed decision on strategy, all responsive information should be reviewed to determine case direction and how best to proceed.

Predictive Coding for lower TT&C to ECA

Predictive Coding is a process that combines people, technology and workflow to find, prioritize and tag key relevant documents quickly, irrespective of keywords, to speed the evidence review process while reducing costs. Due to its documented accuracy and efficiency gains, Predictive Coding is transforming how Early Case Assessment (ECA), analysis and document review are done.

The same predictive coding process used in document review can be used effectively for finding responsive documents for early case assessment quickly and at a much lower cost than traditional methods.
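
For illustration, here is a minimal sketch of that iterative loop, using a scikit-learn-style Python toolkit as a stand-in for a review platform; the tiny corpus and the seed coding decisions are hypothetical.

    # Hypothetical predictive coding loop: code a seed set, train,
    # rank the rest so reviewers see likely-relevant documents first.
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    corpus = [
        "Board memo discussing the disputed licensing terms",
        "Cafeteria menu for the week of March 4",
        "Email thread negotiating royalty rates with the vendor",
        "Holiday party photos attached, enjoy!",
        "Draft complaint alleging breach of the license agreement",
    ]
    X = TfidfVectorizer().fit_transform(corpus)

    # Round 1: an attorney codes a small seed set (1 = relevant, 0 = not).
    seed_idx, seed_labels = [0, 1], [1, 0]
    model = LogisticRegression().fit(X[seed_idx], seed_labels)

    # Score the uncoded documents and surface the highest scores first.
    rest = [i for i in range(len(corpus)) if i not in seed_idx]
    scores = model.predict_proba(X[rest])[:, 1]
    for i in np.argsort(scores)[::-1]:
        print(f"{scores[i]:.2f}  {corpus[rest[i]]}")
    # In practice the loop repeats: attorneys code the top-ranked
    # documents, the model retrains, and the ranking sharpens.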


Figure 1: The time & cost to ECA timeline graphically shows what additional time can mean in the eDiscovery process

Besides the sizable reduction in cost, using predictive coding for ECA gives you more time to actually create case strategy using the most relevant information. Many organizations find themselves with little or no time to create case strategy before trial because of the time consumed just reviewing documents. Having the complete set of relevant documents earlier in the process gives you the most relevant data and the greatest amount of time to use it effectively.

Discoverable versus Admissible: aren’t they the same?


This question comes up a lot, especially from non-attorneys. One thought is that if something is discoverable, then it must be admissible, the assumption being that a Judge will not allow something to be discovered if it can’t be used in court. The other thought is that everything is discoverable if it pertains to the case, and therefore everything is admissible.

Let’s first address what’s discoverable. For good cause, the court may order discovery of any matter (content) that is not privileged and is relevant to the subject matter involved in the action. In layman’s terms: if it is potentially relevant to the case, you may have to produce it in discovery; in other words, anything and everything is potentially discoverable. All discovery is subject to the limitations imposed by FRCP Rule 26(b)(2)(C).

With that in mind, let’s look at the subject of admissibility.

In Lorraine v. Markel Am. Ins. Co., 241 F.R.D. 534, 538 (D. Md. 2007), the court started with the premise that the admissibility of ESI is determined by a collection of evidence rules “that present themselves like a series of hurdles to be cleared by the proponent of the evidence.” “Failure to clear any of these evidentiary hurdles means that the evidence will not be admissible.” Whenever ESI is offered as evidence, five evidentiary hurdles need to be cleared; the proponent must show that the evidence:

  • is relevant to the case
  • is authentic
  • is not hearsay pursuant to Federal Rule of Evidence 801
  • is an original or duplicate under the original writing rule
  • has probative value that is not substantially outweighed by the danger of unfair prejudice or one of the other factors identified by Federal Rule of Evidence 403.

Hearsay is defined as a statement made out of court that is offered in court as evidence to prove the truth of the matter asserted. Hearsay comes in many forms including written or oral statements or even gestures.

It is the Judge’s job to determine whether evidence is hearsay or credible. Three evidentiary rules help the Judge make this determination:

  1. Before being allowed to testify, a witness generally must swear or affirm that his or her testimony will be truthful.
  2. The witness must be personally present at the trial or proceeding in order to allow the judge or jury to observe the testimony firsthand.
  3. The witness is subject to cross-examination at the option of any party who did not call the witness to testify.

The Federal Rules of Evidence hearsay rule prohibits most statements made outside of court from being used as evidence in court. Looking at the three evidentiary rules mentioned above: a statement made outside the courtroom is usually not made under oath, the person making the statement is not present to be observed by the Judge, and the opposing party is not able to cross-examine the statement maker. This is not to say all statements made outside of court are inadmissible; Federal Rule of Evidence 801 provides several exclusions to the hearsay rule.

All content is discoverable if it is potentially relevant to the case and not deemed privileged, but discovered content may be ruled inadmissible if it is deemed privileged (e.g., doctor/patient communications), unreliable, or hearsay. You may be wondering how an electronic document can be considered hearsay. The hearsay rule refers to “statements,” which can be either written or oral. So, as with paper documents, in order to determine whether the content of an electronic document is hearsay or fact, the author of the document must testify under oath and submit to cross-examination so the court can determine whether the content is fact and can stand as evidence.

This legal distinction between fact and hearsay does not relieve the responding party from finding, collecting and producing all content that could be relevant to the case.

Next Generation Technologies Reduce FOIA Bottlenecks


Federal agencies are under increasing scrutiny over how they respond to Freedom of Information Act (FOIA) requests.

The Freedom of Information Act provides for the full disclosure of agency records and information to the public unless that information is exempted under clearly delineated statutory language. In conjunction with FOIA, the Privacy Act serves to safeguard the public interest in informational privacy by delineating the duties and responsibilities of federal agencies that collect, store, and disseminate personal information about individuals. Together, these procedures are meant to ensure that an agency fully satisfies its responsibility to the public to disclose information while simultaneously safeguarding individual privacy.

In February of this year, the House Oversight and Government Reform Committee opened a congressional review of executive branch compliance with the Freedom of Information Act.

The committee sent a six-page letter to Melanie Ann Pustay, Director of the Office of Information Policy at the Department of Justice (DOJ). In the letter, the committee questions why, based on a December 2012 survey, 62 of 99 government agencies had not updated their FOIA regulations and processes, an update required by Attorney General Eric Holder in a 2009 memorandum. In fact, the Attorney General’s own agency had not updated its regulations and processes since 2003.

The committee also pointed out that 83,000 FOIA requests were still outstanding as of the writing of the letter.

In fairness to the federal agencies, responding to a FOIA request can be time-consuming and expensive if technology and processes do not keep up with increasing demands. Electronic content can be anywhere: email systems, SharePoint servers, file systems, and individual workstations. Because content is spread around and not usually centrally indexed, enterprise-wide searches do not turn up all potentially responsive content. This means a much more manual, time-consuming process must be used to find relevant content.

There must be a better way…

New technology can address the collection problem of searching for relevant content across the many storage locations where electronically stored information (ESI) can reside. For example, an enterprise-wide search capability with “connectors” into every data repository (email, SharePoint, file systems, ECM systems, records management systems) allows all content to be centrally indexed, so that an enterprise-wide keyword search will find all instances of content containing those keywords. A more powerful capability to look for is the ability to search on concepts, a far more accurate way to find specific content. Searching for conceptually comparable content can speed up the collection process and drastically reduce the number of false positives in the result set, while finding many more of the keyword-deficient but conceptually responsive records. In conjunction with concept search, automated classification/categorization of data can reduce search time and raise accuracy.
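
One classic way to implement concept search is latent semantic analysis, which compresses keyword vectors into a shared “concept” space so that documents can match a query without sharing its words. Below is a hedged sketch assuming scikit-learn, with a hypothetical corpus and query.

    # Concept search sketch via latent semantic analysis (LSA).
    from sklearn.decomposition import TruncatedSVD
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    docs = [
        "Patient chart: hypertension diagnosed, blood pressure medication prescribed",
        "Quarterly budget figures for the radiology department",
        "Follow-up note: blood pressure still elevated, dosage increased",
    ]
    vec = TfidfVectorizer(stop_words="english")
    tfidf = vec.fit_transform(docs)

    svd = TruncatedSVD(n_components=2).fit(tfidf)  # compress terms to "concepts"
    doc_vecs = svd.transform(tfidf)

    query = svd.transform(vec.transform(["hypertension treatment"]))
    print(cosine_similarity(query, doc_vecs))
    # The follow-up note can score well despite never using the word
    # "hypertension", because "blood pressure" co-occurs with it elsewhere
    # in the corpus; that is the keyword-deficient match described above.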

The largest cost in responding to a FOIA request is the review of all potentially relevant ESI found during collection. Another technology, currently used by attorneys for eDiscovery, can drastically reduce the burden of reviewing thousands, hundreds of thousands, or millions of documents for relevancy and privacy: Predictive Coding.

Predictive Coding is the process of applying machine learning and iterative supervised learning technology to automate document coding and prioritize review. This functionality dramatically expedites the actual review process while dramatically improving accuracy and reducing the risk of missing key documents. According to a RAND Institute for Civil Justice report published in 2012, document review cost savings of 80% can be expected using Predictive Coding technology.

With the increasing number of FOIA requests swamping them, agencies are hard pressed to catch up on their backlogs. The next-generation technologies mentioned above can help agencies reduce their FOIA-related costs while decreasing their response times.

Healthcare Information Governance Requires a New Urgency


From safeguarding the privacy of patient medical records to ensuring every staff member can rapidly locate emergency procedures, healthcare organizations have an ethical, legal, and commercial responsibility to protect and manage the information in their care. Inadequate information management processes can result in:

  • A breach of protected health information (PHI) costing millions of dollars and ruined reputations.
  • A situation where accreditation is jeopardized due to a team member’s inability to demonstrate the location of a critical policy.
  • A premature release of information about a planned merger causing the deal to fail or incurring additional liability.

The benefits of effectively protecting and managing healthcare information are widely recognized but many organizations have struggled to implement effective information governance solutions. Complex technical, organizational, regulatory and cultural challenges have increased implementation risks and costs and have led to relatively high failure rates.  Ultimately, many of these challenges are related to information governance.

In January 2013, The U.S. Department of Health and Human Services published a set of modifications to the HIPAA privacy, security, enforcement and breach notification rules.  These included:

  • Making business associates directly liable for data breaches
  • Clarifying and increasing the breach notification process and penalties
  • Strengthening limitations on data usage for marketing
  • Expanding patient rights to the disclosure of data when they pay cash for care

Effective Healthcare Information Governance steps

Inadvertent or just plain sloppy non-compliance with regulatory requirements can cost your healthcare organization millions of dollars in regulatory fines and legal penalties. For those new to the healthcare information governance topic, below are some suggested steps that will help you move toward reduced risk by implementing more effective information governance processes:

  1. Map out all data and data sources within the enterprise
  2. Develop and/or refresh organization-wide information governance policies and processes
  3. Have your legal counsel review and approve all new and changed policies
  4. Educate all employees and partners, at least annually, on their specific responsibilities
  5. Limit data held exclusively by individual employees
  6. Audit all policies to ensure employee compliance
  7. Enforce penalties for non-compliance

Healthcare information is by nature heterogeneous. While administrative information systems are highly structured, some 80% of healthcare information is unstructured or free-form. Securing and managing large amounts of unstructured patient and business data is extremely difficult and costly without an information governance capability that allows you to recognize content immediately, classify content accurately, retain content appropriately and dispose of content defensibly.

Ineffective eDiscovery Processes Raise the Cost of Healthcare


Healthcare disputes arise for many reasons. Healthcare providers challenge payors’ claims policies, practices and actual payments. Health insurance beneficiaries and healthcare providers dispute coverage decisions by payors. Patients file malpractice claims when the end result of a medical procedure doesn’t meet their expectations. Healthcare disputes can lead to litigation, which in turn leads to eDiscovery. Healthcare eDiscovery can be complex and burdensome due to the myriad data formats used as well as the data security requirements imposed by federal and state regulations.

New healthcare information management requirements are changing the way healthcare organizations evolve their enterprise infrastructures as new regulatory requirements direct how information is created, stored, shared, referenced and managed. As new information governance technology is adopted and changes how patient and business records are utilized, healthcare providers as well as healthcare payors and suppliers will have to change and adapt how they respond to eDiscovery.

Healthcare eDiscovery Key Requirements and Recent Developments

The 2006 amendments to the Federal Rules of Civil Procedure (FRCP) established that all forms of ESI are potentially discoverable, and they apply to all legal actions filed in federal courts on or after December 1, 2006. Under the FRCP, any information potentially relevant to the case and not privileged, whether in paper or electronic format, is subject to an eDiscovery request. Many states have adopted the federal rules of civil procedure, in whole or in part, with respect to defining what’s discoverable when it comes to electronic data.

The eDiscovery process for the healthcare industry is the same as for any other industry, except that special care must be taken with patient data. Attorneys who handle protected health information (PHI) must be aware of the state and federal legal ramifications of being exposed to this type of information. Failure to handle PHI properly could lead to significant fines and damaged reputations.

Effective Healthcare eDiscovery steps

eDiscovery is a complex process that requires a multidisciplinary approach to successfully implement and manage. Healthcare organizations should consider the following activities to successfully prepare for eDiscovery.

  1. Establish a litigation response team with a designee from the legal, HIM, and IT departments
  2. Review, revise, or develop an organizational information management plan
  3. Identify the data owners or stewards within the organization
  4. Review, revise, or develop an enterprise records retention policy and schedule
  5. Audit compliance with the records retention policy and schedule
  6. Penalize non-compliance with the records retention policy and schedule
  7. Conduct thorough assessment of the storage locations for all data including back-up media
  8. Review, revise, or develop organizational policies related to the eDiscovery process
  9. Establish an organizational program to educate and train/retrain all management and staff on eDiscovery and records retention compliance

The eDiscovery process is equivalent to searching warehouses, waste baskets, file cabinets, home offices, and personal notes to find the “needle in the haystack” that will help prove the other side’s claims. Healthcare organizations are finding it especially difficult to respond to requests and review the huge amounts of data involved, due to healthcare-specific data formats and regulatory requirements around patient privacy.

The huge expense of information review during litigation coupled with the high risk of enforcement action by regulatory authorities drives many legal professionals to seek a more proactive, defensible and cost efficient approach.

Coming to Terms with Defensible Disposal, Part 1


Last week at LegalTech New York 2013 I had the opportunity to moderate a panel titled “Defensible Disposal: If it doesn’t exist, I don’t have to review it…right?” with an impressive roster of panelists: Bennett Borden, Partner and Chair of the eDiscovery & Information Governance Section, Williams Mullen; Clifton C. Dutton, Senior Vice President, Director of Strategy and eDiscovery, American International Group; John Rosenthal, Chair, eDiscovery and Information Management Practice, Winston & Strawn; and Dean Gonsowski, Associate General Counsel, Recommind Inc.

During the panel session it was agreed that organizations have been over-retaining ESI (which accounts for at least 95% of all data in organizations) even when it’s no longer needed for business or legal reasons. Another factor driving this over-retention was the fear of inadvertently deleting evidence, otherwise known as spoliation. In fact, an ESG survey published in December 2012 showed that the “fear of the inability to furnish data requested as part of a legal or regulatory matter” was the highest-ranked reason organizations chose not to dispose of ESI.

Other reasons cited included not having defined policies for managing and disposing of electronic information and, conversely, having retention policies that deliberately keep all data indefinitely (usually out of fear of spoliation).

One of the principal information governance gaps most organizations haven’t yet addressed is the difference between “records” and “information.” Many organizations have records retention/disposition policies to manage the official company records required to be retained under regulatory or legal requirements. But the documents and files that fall under legal hold and regulatory requirements amount to approximately 6% of an organization’s retained electronic data (1% legal hold and 5% regulatory).

Another interesting survey, published by Kahn Consulting in 2012, measured employees’ understanding of their information governance responsibilities. Only 21% of respondents had a good idea of what information needed to be retained or deleted, and only 19% knew how information should be retained or disposed of. In that same survey, only 15% of respondents had even a general idea of their legal hold and eDiscovery responsibilities.

The above surveys highlight the fact that organizations aren’t disposing of information in a systematic process, mainly because they aren’t managing their information, especially their electronic information, and therefore don’t know what to keep and what to dispose of.

An effective defensible disposal process is dependent on an effective information governance process. To know what can be deleted and when, an organization has to know what information needs to be kept and for how long based on regulatory, legal and business value reasons.

Over the coming weeks, I will address those defensible disposal questions and responses the LegalTech panel discussed. Stay tuned…