Monday, July 30, 2012

200,000 Hits

The Litigation Support and Technology News Blog has reached over 200,000 page views since last year. Thank you for reading and thank you for your comments.


Monday, July 23, 2012

Days Five and Six of a Predictive Coding Narrative: Deep into the weeds and a computer mind-meld moment




http://ow.ly/cqoln

An article by Ralph Losey, Esq. posted on the blog e-Discovery Team®.


The article is part 4 in a series examining a predictive coding test project that the author initiated as a training exercise, analyzing 699,082 emails and attachments from the Enron litigation.  The goal of the project was to locate evidence related to one specific issue: involuntary employee terminations. The article also cites the earlier installments in the series, "The first day of search is described in Day One of a Predictive Coding Narrative: Searching for Relevance in the Ashes of Enron. The second day is described in Day Two of a Predictive Coding Narrative: More Than A Random Stroll Down Memory Lane. The third and fourth days are described in Days Three and Four of a Predictive Coding Narrative: Where I find that the computer is free to disagree."  Links to each earlier article are provided by the author. 


The article provides tips on how to choose a responsible service provider, and also looks at specific steps taken during the test project.  It includes comments from predictive coding expert Joe White, along with some of the suggestions he provided to Mr. Losey as the test review unfolded.  Mr. Losey details each step he followed in the test process, and the continued iterations that took place as the computer learned from the designations assigned to documents he actually reviewed and tagged. The article provides interesting insight into the capabilities of the software involved in the predictive coding process, and the features available to enhance the accuracy of the ongoing review.



Friday, July 20, 2012

eDiscovery Case Law: Judge Scheindlin Says “No” to Self-Collection, “Yes” to Predictive Coding



http://ow.ly/cnnxB

An article by Doug Austin posted on the eDiscovery Daily blog. The article discusses an opinion by Judge Shira Scheindlin, and an article by eDiscovery expert Ralph Losey, Esq. Mr. Austin provides a link to the article by Ralph Losey, which appeared on Law Technology News, and is entitled "Judge Scheindlin Issues Strong Opinion on Custodian Self-Collection".

The article states, "Regarding the defendant’s question as to “why custodians could not be trusted to run effective searches of their own files, a skill that most office workers employ on a daily basis” (i.e., self-collect), Judge Scheindlin responded as follows:

“There are two answers to defendants' question. First, custodians cannot 'be trusted to run effective searches,' without providing a detailed description of those searches, because FOIA places a burden on defendants to establish that they have conducted adequate searches; FOIA permits agencies to do so by submitting affidavits that 'contain reasonable specificity of detail rather than merely conclusory statements.'”

The article further states, "...“The second answer to defendants' question has emerged from scholarship and caselaw only in recent years: most custodians cannot be 'trusted' to run effective searches because designing legally sufficient electronic searches in the discovery or FOIA contexts is not part of their daily responsibilities. Searching for an answer on Google (or Westlaw or Lexis) is very different from searching for all responsive documents in the FOIA or e-discovery context.”

“Simple keyword searching is often not enough: 'Even in the simplest case requiring a search of on-line e-mail, there is no guarantee that using keywords will always prove sufficient.' There is increasingly strong evidence that '[k]eyword search[ing] is not nearly as effective at identifying relevant information as many lawyers would like to believe.' As Judge Andrew Peck -- one of this Court's experts in e-discovery -- recently put it: 'In too many cases, however, the way lawyers choose keywords is the equivalent of the child's game of 'Go Fish' ... keyword searches usually are not very effective.'”"

The article provides further comments regarding the use of technologies such as predictive coding, and how they can increase the effectiveness of locating relevant information.

Panel Debunks Predictive Coding Myths



http://ow.ly/cnlvv

An article by Monica Bay posted on law.com on the LTN webpage.  The article examines comments made during sessions of the Virtual Corporate Counsel Forum, from the panel discussion entitled "Debunking the Myths about Predictive Coding".  The comments discussed by Ms. Bay are attributed to David Kessler, Esq. of Fulbright & Jaworski, and Howard Sklar, Esq. of Recommind.

The article states, "The two lawyers explained some of the basic concepts of technology-assisted review, such as how sampling is used to determine baseline responsiveness, and how prioritized review results in responsive review batches." "Sampling is used to determine baseline responsiveness," said Sklar. "At the beginning of the process, you are trying to get a key set of relevant documents. Then, you can 'interrogate' those documents with sophisticated technology" to establish which documents are most appropriate.

Once you have found those documents, then "you can train the system on relevant documents, so you can find 'more like this,'" Sklar said. The process takes "interaction between the computer and human feedback to define the relevant terms."
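As a rough illustration (not drawn from the article or Recommind's software), baseline responsiveness can be estimated by having a human review a simple random sample and computing a confidence interval on the observed proportion. The collection size, sample size, and normal-approximation interval below are illustrative assumptions:

```python
import math
import random

def baseline_responsiveness(population_size, sample_size, reviewer, seed=0):
    """Estimate the fraction of responsive documents ("richness") from a
    simple random sample of document IDs reviewed by a human."""
    rng = random.Random(seed)
    sample_ids = rng.sample(range(population_size), sample_size)
    responsive = sum(1 for doc_id in sample_ids if reviewer(doc_id))
    p = responsive / sample_size
    # Normal-approximation 95% confidence interval for a proportion.
    margin = 1.96 * math.sqrt(p * (1 - p) / sample_size)
    return p, max(0.0, p - margin), min(1.0, p + margin)

# Toy collection: pretend every 10th document is responsive (true richness 10%).
est, low, high = baseline_responsiveness(
    population_size=100_000,
    sample_size=1_500,
    reviewer=lambda doc_id: doc_id % 10 == 0,
)
print(f"estimated richness: {est:.3f} (95% CI {low:.3f}-{high:.3f})")
```

The point of the sketch is that a modest random sample gives a defensible baseline estimate before any training begins, which later quality-control measurements can be compared against.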

The article also provides a link to a PDF version of the PowerPoint presentation covered by the panelists.

The article also discusses the four steps of predictive coding: 

Step 1: Using predictive analytics to create review sets, with human review.

Step 2: System training on relevant documents (computer suggestions).

Step 3: Human review of computer suggestions: "adaptive identification cycles" (train, suggest, review).

Step 4: Statistical quality control validation.
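The four steps above can be sketched as a toy relevance-feedback loop. The corpus, issue terms, and term-weight scoring below are illustrative assumptions, not the panelists' software:

```python
import random
from collections import Counter

RELEVANT_TERMS = {"termination", "layoff"}      # the issue being searched for

def human_review(doc):
    """Stand-in for attorney review: relevant if the doc touches the issue."""
    return bool(doc & RELEVANT_TERMS)

def train(labeled):
    """Step 2: weight each term by how much more often it appears in
    relevant than in non-relevant reviewed documents."""
    weights = Counter()
    for doc, relevant in labeled:
        for term in doc:
            weights[term] += 1 if relevant else -1
    return weights

rng = random.Random(7)
background = ["budget", "meeting", "trade", "memo", "gas", "audit", "invoice"]
corpus = []
for _ in range(400):
    doc = set(rng.sample(background, 3))
    if rng.random() < 0.1:                      # ~10% of docs touch the issue
        doc.add(rng.choice(sorted(RELEVANT_TERMS)))
    corpus.append(frozenset(doc))

reviewed = {}                                   # doc index -> human label
for i in rng.sample(range(len(corpus)), 20):    # Step 1: random seed set
    reviewed[i] = human_review(corpus[i])

for _ in range(3):                              # Step 3: train/suggest/review
    weights = train((corpus[i], label) for i, label in reviewed.items())
    unreviewed = [i for i in range(len(corpus)) if i not in reviewed]
    suggestions = sorted(unreviewed,
                         key=lambda i: sum(weights[t] for t in corpus[i]),
                         reverse=True)[:40]
    for i in suggestions:                       # human reviews the suggestions
        reviewed[i] = human_review(corpus[i])

found = sum(reviewed.values())
total = sum(human_review(d) for d in corpus)
print(f"found {found} of {total} relevant docs "
      f"after reviewing {len(reviewed)} of {len(corpus)}")

# Step 4: statistical QC - sample the unreviewed pile and look for misses.
qc_sample = rng.sample([i for i in range(len(corpus)) if i not in reviewed], 50)
missed = sum(human_review(corpus[i]) for i in qc_sample)
print(f"QC sample of 50 unreviewed docs: {missed} relevant")
```

Each round's human labels re-weight the terms and steer the next batch of suggestions, which is the "adaptive identification cycle" (train, suggest, review) described in Step 3.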

In addition, seven myths about predictive coding were discussed: 
  1. Predictive coding is automated coding 
  2. Defensibility depends primarily on the technology
  3. Predictive coding is inherently risky
  4. Culling documents using predictive coding is risky
  5. Technology is replacing judgment
  6. 99% is reasonable and 95% is not
  7. Transparency is the answer

Thursday, July 19, 2012

Predictive Analytics and Big Data with Higher Costs to Boot



http://ow.ly/clyx8

An article by Stephen E. Arnold posted on the arnoldit.com website.

This article discusses "predictive analytics" and "big data," examining future technologies designed to apply predictive analytics techniques to big data.

The article examines a piece by Kroll Ontrack on the combination of predictive analytics and big data.  The referenced article, "Predictive Coding Helps Tackle Big Data," explains that as big data becomes more widespread, it will make the eDiscovery process more expensive. A link to the referenced article is provided in Mr. Arnold's article.

"Predictive coding could make big data more cost-effective just as it makes attorney fees lower:

“However, having a mushrooming quantity of data means that when an e-disclosure request is issued, it takes even longer to trawl through information, identify relevant documents and compare duplicates. With the increasing time it takes, legal costs can skyrocket, a worrying trend for businesses in the current climate where margins are already stretched thin. For this reason the introduction of predictive coding is likely to be popular as it leaves the legwork to a sophisticated algorithm, finding relevant documents which can then be reviewed more closely.”

Wednesday, July 18, 2012

ARMA International Maturity Model for Information Governance



http://ow.ly/cjO0P

The link provided above is to the 2012 ARMA Generally Accepted Recordkeeping Principles® (GARP®), available on the arma.org website.  A PDF version of the Maturity Model is provided at that website as well.

The Maturity Model is discussed on the ARMA website as follows, "The GARP principles identify the critical hallmarks of information governance, which Gartner describes as an accountability framework that "includes the processes, roles, standards, and metrics that ensure the effective and efficient use of information in enabling an organization to achieve its goals." As such, they apply to all sizes of organizations, in all types of industries, and in both the private and public sectors. Multi-national organizations can also use GARP® to establish consistent practices across a variety of business units."

The Maturity Model addresses the following levels of recordkeeping practices:
  • Level 1 (Sub-standard): This level describes an environment where recordkeeping concerns are either not addressed at all, or are addressed in a very ad hoc manner. Organizations that identify primarily with these descriptions should be concerned that their programs will not meet legal or regulatory scrutiny.
  • Level 2 (In Development): This level describes an environment where there is a developing recognition that recordkeeping has an impact on the organization, and that the organization may benefit from a more defined information governance program. However, in Level 2, the organization is still vulnerable to legal or regulatory scrutiny since practices are ill-defined and still largely ad hoc in nature.
  • Level 3 (Essential): This level describes the essential or minimum requirements that must be addressed in order to meet the organization's legal and regulatory requirements. Level 3 is characterized by defined policies and procedures, and more specific decisions taken to improve recordkeeping. However, organizations that identify primarily with Level 3 descriptions may still be missing significant opportunities for streamlining business and controlling costs.
  • Level 4 (Proactive): This level describes an organization that is initiating information governance program improvements throughout its business operations. Information governance issues and considerations are integrated into business decisions on a routine basis, and the organization easily meets its legal and regulatory requirements. Organizations that identify primarily with these descriptions should begin to consider the business benefits of information availability in transforming their organizations globally.
  • Level 5 (Transformational): This level describes an organization that has integrated information governance into its overall corporate infrastructure and business processes to such an extent that compliance with the program requirements is routine. These organizations have recognized that effective information governance plays a critical role in cost containment, competitive advantage, and client service.

eDiscovery Professionals: Order-taking or Consulting? – Introduction



http://ow.ly/cjMPR

An article by Jane Gennarelli posted on the eDiscovery Daily blog.

This blog post introduces a series of articles regarding eDiscovery professionals.  The series will examine eDiscovery professionals who provide consultative services, and also discuss the fact that some eDiscovery professionals act as "order takers" instead of consultants.

The article states, "In the best environments, I see litigators turning to the electronic discovery professionals in their firms for advice and guidance in handling discovery. Unfortunately, however, I too often see talented electronic discovery professionals – who have a wealth of knowledge and significant expertise – functioning as “order-takers”. And all too often, this means that electronic discovery work isn’t done as cost effectively as it could be, and the work product suffers."

The article further explains that future blog posts on this topic will discuss the following:
  • Order taking: why it happens and what problems and issues arise?
  • How do you make the shift from order-taking to consulting?
  • What are the characteristics of a good consultant?
  • Some handy consulting tips
  • What are the consulting opportunities in a typical case?