Thursday, May 31, 2012

2012 Am Law 200


http://www.americanlawyer.com/PubArticleTAL.jsp?id=1202553410404








The Second Hundred chalked up a healthy increase in revenue per lawyer in 2011, but costs kept gains in profits small. The Second Hundred's revenue per lawyer rose 5.3 percent and value per lawyer rose 4.7 percent, but profits per partner rose just 2.2 percent, less than The Am Law 100's 3 percent. The Second Hundred's total gross revenue in 2011 was $17.93 billion, a 2.7 percent gain outpaced by The Am Law 100's 5.3 percent gain. For the complete list of the Second Hundred, click the link above.

Hot Off The Press - Today's New NLRB Social Media Guidance



http://ow.ly/bgHFW

An article by Michael Schmidt posted on the Social Media Employment Law Blog of the law firm Cozen O'Connor.

This article looks at the just-released third memorandum providing guidance on social media regulation in the workplace, issued by the National Labor Relations Board.

The article states, "The first guidance memorandum on August 18, 2011 was focused primarily on adverse employment decisions based on employee social media activity, with a smaller discussion about the scope of social media policies. The second memorandum on January 24, 2012 also addressed adverse employment action, but contained a more detailed analysis of the do’s and don’ts of workplace policies.

Today’s third memorandum is devoted exclusively to the NLRB’s updated thoughts on seven employer social media policies on which the agency has recently issued administrative rulings". Links to the other two referenced memoranda are provided in the article.

The memorandum addresses seven cases that illustrate the limits of social media regulation. In six of the cases some form of the policy was upheld as lawful; in the seventh, the entire policy was deemed invalid.

The article by Mr. Schmidt provides bullet points illustrating which policy provisions the NLRB deems impermissibly overbroad and which it deems permissible.







Should we all be getting the Twitter 'jitters'? – Be careful what you say online



http://ow.ly/bgEpd

An article by Susan McLean and Alistair Maughan posted on the Morrison & Foerster website as a Client Alert.

This article is specific to UK law and looks at various UK laws that are being repurposed to regulate the use of Twitter and social media posts.

The article mentions the following UK laws and regulations, which have been used in connection with "Tweets" posted on Twitter, although such laws were not originally enacted for that purpose:


COMMUNICATIONS ACT 2003 (Article states: "In 2011, there were 2,000 prosecutions in the UK under section 127 of the Communications Act 2003.");
MALICIOUS COMMUNICATIONS ACT 1988;
CONTEMPT OF COURT ACT 1981;
SERIOUS CRIME ACT 2007; and
DEFAMATION ACT 1996

The article states, "As in other countries, a whole host of UK laws that were designed in an age before social media – even, in some cases, far before the Internet as we know it – are now being used to regulate digital speech. Digital speech, by its very nature, has permanent records that are easily searchable, making the police and the prosecution’s job much easier. Accordingly, these types of cases are only going to increase, and it will be interesting to see where the UK courts decide to draw the line between freedom of expression and the law."

Wednesday, May 30, 2012

U.S. Cross Border Ediscovery vs. EU Data Protection: Clash of the Titans



http://ow.ly/beWvp

An article by Monique Altheim posted on the EDiscovery Map blog.

This article provides insight into the conflict between EU privacy protections and US eDiscovery obligations. The article provides a link to the deck of PowerPoint slides the author used in a recent presentation at the Legal Tech West conference, a presentation subtitled "Clash of the Titans". The article also provides a link to presentations the author was involved in at the recent CPDP conference held in Brussels.

The article states, "The subtitle “Clash of the Titans” derives from the fact that on the one hand the U.S. has the broadest pre-trial civil litigation discovery procedure on earth, while on the other hand the EU has the most stringent data protection framework on the planet. Trying to collect and transfer terabytes of data, most of which contain personal components, in the EU, where data protection is a fundamental right and very heavily regulated, is indeed quite a challenge.
In this presentation, I analyzed the U.S. jurisprudence on the extra-territorial application of U.S. ediscovery obligations as well as the EU guidelines concerning personal data collected while conducting U.S. civil ediscovery in the EEA."

Among the slides included in the referenced slide deck was the following slide, which provides a foundation for the discussion about the inherent conflict between the EU and the US on this topic:



International e-discovery: Privacy and data protection across the globe



http://ow.ly/beVBJ

An article by Wayne Wong posted on the insidecounsel.com website.

This article examines differences in privacy rights throughout the globe, and looks at issues to keep in mind when dealing with eDiscovery matters in varied jurisdictions.

The article states, "More and more, companies with global operations—and the lawyers that represent them—are finding themselves enmeshed in legal matters around the world. Developing an ironclad international legal skill set begins by building a greater understanding of e-discovery, privacy and data protection laws across the globe."

The article goes on to examine rules that govern the following areas:  United States; England & Wales; EU; Canada; and Asia (APAC).




Tuesday, May 29, 2012

What Does Successful Information Governance Across Europe Look Like?



http://ow.ly/bdjQg

A link to a PDF document provided by the DLM Forum, offering tips on suggested best practices for information governance policies. The link provides a detailed overview of the DLM Forum and its recommendations regarding effective information governance practices. An illustration from the document is provided below:




Shining a Light into the Black Box of E-discovery Predictive Coding



http://ow.ly/bdhrl

An article by Matthew Nelson posted on law.com on the Corporate Counsel webpage.

This article discusses issues that relate to predictive coding, and provides some insight into the definition of the term, as well as the state of acceptance of the technology by the judiciary.

The article states, "In 2012, the wait for judicial guidance ended abruptly when not one, but three new predictive coding cases surfaced: Da Silva Moore v. Publicis Groupe; Kleen Products, LLC v. Packaging Corporation of America; and Global Aerospace Inc., v. Landow Aviation, LLP. In Da Silva Moore, Judge Andrew Peck even approved the use of predictive coding technology in “appropriate cases,” leaving some to believe the courthouse doors had been thrown open to unbridled use of the technology. Somehow, within weeks of the decision, the wheels of the predictive coding freight train locked up, leaving many wondering whether or not these new predictive coding cases provided clarity or merely added more confusion."

The author further states, "Predictive coding is a type of machine-learning technology that enables a computer to automatically predict how documents should be classified based on limited human input. The technology is exciting for corporate legal departments attempting to manage skyrocketing litigation budgets, because the ability to automatically rank and then “code” or “tag” electronic documents, based on criteria such as “relevance” and “privilege” has the potential to save companies millions in e-discovery costs. The savings are directly attributable to the fact that fewer dollars are spent paying lawyers to review every document before documents are produced to outside parties during discovery.

The main advantage for corporations is that a fraction of documents are reviewed which results in a fraction of the review costs. The process begins by feeding “relevance” and “privilege” decisions made by attorneys about a small number of case documents called a “seed set” into a computer system. The computer then relies on these “training” decisions to create an algorithm that ranks and codes the remaining documents automatically. The attorneys can then evaluate the accuracy of the computer’s automated decisions."
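The seed-set workflow described above maps closely onto ordinary supervised text classification. Below is a minimal sketch, assuming scikit-learn and wholly invented example documents and labels; it illustrates the general technique, not the system used in any of the cases discussed in the article.

```python
# Minimal sketch of a seed-set / predictive coding workflow (illustrative only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Seed set: documents an attorney has already coded (1 = relevant, 0 = not relevant)
seed_docs = [
    "Q3 pricing agreement with distributor, see attached terms",
    "Lunch order for the team offsite next Friday",
    "Draft amendment to the supply contract, privileged and confidential",
    "Fantasy football league standings",
]
seed_labels = [1, 0, 1, 0]

# Remaining collection to be ranked automatically
unreviewed_docs = [
    "Follow-up on contract pricing terms discussed last week",
    "Reminder: parking garage closed for maintenance",
]

vectorizer = TfidfVectorizer()
X_seed = vectorizer.fit_transform(seed_docs)

model = LogisticRegression()
model.fit(X_seed, seed_labels)  # "train" on the attorneys' seed decisions

X_unreviewed = vectorizer.transform(unreviewed_docs)
scores = model.predict_proba(X_unreviewed)[:, 1]  # relevance score for each document

for doc, score in sorted(zip(unreviewed_docs, scores), key=lambda p: -p[1]):
    print(f"{score:.2f}  {doc}")
# Attorneys would then sample these predictions to evaluate accuracy,
# add corrections to the seed set, and retrain.
```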

The article examines three recent cases that addressed the possible use of predictive coding: the Da Silva Moore case, the Kleen Products case, and the Global Aerospace case, each of which involved a request for some form of court approval of predictive coding.

The author goes on to write, "The turmoil surrounding the first batch of predictive coding cases has led some to question whether or not predictive coding technology is ready for mainstream adoption. Those on the conservative side will wait and see how the above cases unfold before testing the predictive coding waters. More progressive legal departments will capitalize on the lessons these cases illustrate, instead of being duped into thinking the future of predictive coding technology has been significantly tarnished or delayed.
 
Savvy practitioners will recognize these cases reveal that first-generation technologies lack the transparency and simplicity necessary to take predictive coding mainstream in the legal profession. They will also recognize that technological change and improvement designed to automate a complex new approach to document review is around the corner. Academic studies indicate that predictive coding technology truly can yield far superior results to manual review—at less cost—when managed properly. (See Maura R. Grossman and Gordon V. Cormack, "Technology-Assisted Review in E-Discovery Can Be More Effective and More Efficient Than Exhaustive Manual Review" [PDF], XVII, RICH. J.L. & TECH, 11 [2011].)" A link to the referenced article is provided.

Challenging Predictive Coding to Better Defend It



http://ow.ly/bdfA3

An article by Michael Roach posted on law.com on the LTN webpage.

This article provides some information from the Legal Tech West Conference of 2012, and looks at issues related to predictive coding technology.

The article states, "In a spirited and entertaining discussion, the speakers put to test predictive coding in electronic data discovery -- the process of taking a subset of a data collection, and having senior litigators mark certain documents as responsive or non-responsive and then the software "learns" from these decisions. They raised concerns not about the technology's viability, but about how the current controversies will affect its use in litigation. Key issues that arose included: 1) how industry and other promotional representations of the technology could negatively impact its widespread adoption, 2) potential fallout from the "computer-assisted coding" order from Magistrate Judge Andrew Peck of the U.S. District Court for the Southern District of New York in Monique da Silva Moore v. Publicis Groupe and MLS Group, and 3) musings on just who should be creating the seed sets used to "predict" which documents will be responsive".  Links are provided in the article to the referenced court orders.

In addition, the article goes on to provide comments from some of the thought leaders involved in the Legal Tech West discussion regarding the three topics set forth above.

Friday, May 25, 2012

Gartner: 2012 Magic Quadrant for E-Discovery Software




http://www.law.com/jsp/lawtechnologynews/PubArticleLTN.jsp?id=1202555971820&Gartner_Top_EDiscovery_Vendors_Eclipse_1B_in_2010_Revenue=&et=editorial&bu=LTN&cn=LTN_20120525&src=EMC-Email&pt=Law%20Technology%20News&kw=Gartner%3A%20Top%20E-Discovery%20Vendors%20Eclipse%20%241B%20in%202010%20Revenue&slreturn=1


Gartner, a technology research firm, has released its 2012 Magic Quadrant for E-Discovery Software. There was some movement at the top of the quadrant, with kCura and FTI Consulting moving out of the Leader quadrant. In addition, fewer large e-discovery companies qualified for this year's report compared to last year. This year's participants reached a combined revenue of $1 billion in 2010. Gartner, which previously predicted $1.5 billion in industry revenue by 2013, evaluated 21 companies for this report. That's 10 fewer than were evaluated last year, mostly because of industry consolidation and a lack of significant new players, the analysts explained.

Activating Your Information Management Shield



http://ow.ly/b9bVL

An article by Jim Shook posted on the EMC Source One Insider website.

This article provides a discussion regarding information governance practices within a corporate workplace.

The article states, "Good policies, with technology to enable and enforce them, can help insure that records and compliance information are retained for the right amount of time, while also enabling the deletion of stale and useless information which has outlived its retention period. Good information management processes insure that protected information is stored in the right place, operational efficiencies are enhanced by focusing on useful information and the e-Discovery process is easier and more efficient."

The article looks at two recent cases and uses them as examples to illustrate the need for good records management practices. The article states, "If your organization is looking for more reasons why good information management is valuable, two recent cases provide some great reasons:

  • If you have an information governance policy, it may help you to defeat a claim for sanctions even if data has been deleted; and
  • If you don’t have an information governance policy, and you delete data that was subject to compliance requirements, the lack of a policy can help to establish the bad faith necessary to award sanctions."
The referenced cases are the following:

In Danny Lynn Electrical & Plumbing, LLC v. Veolia Es Solid Waste Southeast, Inc., 2012 U.S. Dist. LEXIS 62510 (M.D. Ala. May 4, 2012), the plaintiff requested sanctions for the defendants’ alleged failure to properly implement a litigation hold. The defendants were not sanctioned, primarily because they had set up an email archive that the court felt showed they were not acting in bad faith.

In FDIC v. Malik, 2012 U.S. Dist. LEXIS 41178 (E.D.N.Y. Mar. 26, 2012), the court also considered a spoliation motion for the deletion of emails. The email messages related to a law firm’s prior representation of a mortgage company. In this matter the defendant was subject to sanctions, since the missing emails were also subject to a regulatory requirement beyond the litigation hold.

Links to both cases are provided in the article.

Metrics, Social Media, Magistrates, Monkeys and Mitigating Risk at CEIC 2012 in Las Vegas



http://ow.ly/b99m1

An article by Chris Dale on his blog the e-Disclosure Information Project.

This article discusses Guidance Software’s CEIC 2012, a recent conference held in Nevada dealing with eDiscovery and computer forensics issues, which bills itself as the leading conference on eDiscovery, cyber response, and digital investigations.

Mr. Dale's article discusses the meaning of the seemingly odd headline by stating, "In this case, the headline is an honest, if partial, summary of what lies below: metrics are the key to eDiscovery decision-making; social media is the fastest-growing source of potentially discoverable data; Magistrate Judges turn up to share their wisdom with us; monkeys appear twice, once as part of the question “who are you talking to?” and once in an echo of a recent post of mine about cross-border discovery and blocking statutes; risk mitigation is the theme which binds them all together."

The article goes on to provide insight into a number of topics addressed at the conference, and references many of the top eDiscovery experts, and provides comments about the information shared by such experts during the conference.  This is a comprehensive article that touches on several key topics that are being discussed within the litigation support and eDiscovery community at present.

The article discusses a panel session that addressed the need for metrics to measure eDiscovery costs, and states, "...companies should start collecting basic metrics – files per custodian, processing time, hosting costs and the like – to help understand where the money is going. That is a prerequisite for the search for alternative approaches and for making a case to the legal department and to budget-holders for internal investment. It seems obvious to say that the one is the prerequisite for the other, but if it was that obvious then we would not see the startling statistics which emerged last year from studies in both the US and the UK to the effect that many companies do not actually know what they are spending on eDiscovery."
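As a hypothetical illustration of what "collecting basic metrics" might look like in practice, the sketch below tracks a few of the figures the panel mentions; the field names and numbers are assumptions, not anything reported at the conference.

```python
# Illustrative per-matter eDiscovery metrics record (assumed fields and figures).
from dataclasses import dataclass

@dataclass
class MatterMetrics:
    matter: str
    custodians: int
    files_collected: int
    processing_hours: float
    hosting_cost_per_month: float
    review_cost: float

    def files_per_custodian(self) -> float:
        return self.files_collected / self.custodians

    def total_cost(self, months_hosted: int) -> float:
        return self.review_cost + self.hosting_cost_per_month * months_hosted

m = MatterMetrics("Acme v. Widget", custodians=12, files_collected=480_000,
                  processing_hours=96.0, hosting_cost_per_month=4_500.0,
                  review_cost=250_000.0)
print(f"{m.files_per_custodian():,.0f} files per custodian")
print(f"${m.total_cost(months_hosted=8):,.0f} total spend to date")
```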

The article addresses many other topics as well; developments such as predictive coding are discussed, of course, as is social media's impact on eDiscovery. As for the monkeys, and their connection to iPads, you will have to read the full article to make sense of that.





eDiscovery Cautionary Tales: Inadvertent Disclosure Leaves Naked Short Selling Practices Exposed



http://ow.ly/b98B0

An article by Doug Austin posted on the eDiscovery Daily Blog.

This article discusses litigation in which Overstock.com accused Goldman Sachs and Merrill Lynch of the practice of "naked short selling" in an alleged effort to lower Overstock's stock price.

The article references other articles about this litigation, and provides links to an unredacted piece of evidence that was apparently inadvertently produced by counsel for Goldman Sachs. The unredacted version of the document had allegedly been kept secret by the defendants, and will certainly cause some concern regarding the practice of naked short selling.

The article states, "Matt Taibbi of Rolling Stone provides a commentary regarding information contained in the filing (language warning!), as follows:

“Now, however, through the magic of this unredacted document, the public will be able to see for itself what the banks’ attitudes are not just toward the “mythical” practice of naked short selling (hint: they volubly confess to the activity, in writing), but toward regulations and laws in general.

“Fuck the compliance area – procedures, schmecedures,” chirps Peter Melz, former president of Merrill Lynch Professional Clearing Corp. (a.k.a. Merrill Pro), when a subordinate worries about the company failing to comply with the rules governing short sales.

We also find out here how Wall Street professionals manipulated public opinion by buying off and/or intimidating experts in their respective fields. In one email made public in this document, a lobbyist for SIFMA, the Securities Industry and Financial Markets Association, tells a Goldman executive how to engage an expert who otherwise would go work for “our more powerful enemies,” i.e. would work with Overstock on the company’s lawsuit.”"


Thursday, May 24, 2012


E-discovery: The relationship between cloud computing, e-discovery and privilege | Inside Counsel



 http://ow.ly/b7E6p

An article by Eric Hutz posted on the Legal Technology Today website.

This article looks at the use of cloud computing, and discusses the impact it has upon eDiscovery requirements, as well as the relationship to claims of privilege for specific communications.

The article mentions the following concerns that need to be taken into consideration when using cloud-based computing systems:


  • Unauthorized access to confidential information;
  • Notification policies for security breaches;
  • Jurisdictional issues: What rules govern? Is it the corporate place of business, the cloud provider's place of business, or alternatively the location of the servers?
  • Software licenses that might be required, and whether they permit cloud use;
  • Data backup, data encryption and data destruction;
  • Audits; and
  • Need for client's consent before storing or transmitting data.




Inside and Outside Counsel Need to Implement Stronger Social Media Policies



http://ow.ly/b7AZR

An article by Kevin L. Nichols posted on the eDiscovery Journal, examining social media use within corporate environments and providing tips on how to effectively regulate such use.

The article states, "Recently, a CFO from Francesca’s Holdings Corp, a Houston based company, was fired for wrongfully disclosing sensitive non-public information via social media. In general, it is difficult and nearly impossible to prohibit employees to have LinkedIn, Facebook, and Twitter accounts for personal use, let alone monitor them effectively. However, this incident involved an executive/board member of a publicly traded company, which raises further concerns about who and what employees may “share” with the public via Social Media. Companies, their in-house counsel, and their outside counsel need to develop complex strategies to minimize the exposure and liability that may exist due to their own negligence of protecting private sensitive information."

In addition, the article provides the following tips, and offers narrative around each of these points:
  1. Develop a Social Media Conduct Policy and Require All Employees to Execute It
  2. Create a Mandatory Training Program that Addresses Social Media
  3. Monitor Social Media Activity
  4. Use Search Engines to Independently Monitor Your Company/Client’s Brand
P.S. The link provided below offers actual social media policies that have been implemented by over 200 corporate entities, giving examples to anyone addressing this issue within their own company.

Wednesday, May 23, 2012

Law Firms Rally Around Information Governance



http://ow.ly/b5X5n

An article by Sean Doherty posted on law.com on the LTN webpage.

This article discusses information governance practices, and examines whether law firms are practicing what they preach to their clients.

The article states, "In the real world, law firms have a long way to go, and many are reaching out to the legal community to gather standards and best practices.

"Many law firms are already introducing information governance in some form or fashion, but what they lack is a platform for collaborating on industry-wide common principles and best practices," said Carolyn Casey, Esq., senior manager, legal vertical for Iron Mountain."  The article goes on to discuss a conference taking place this week focusing on the topic of information governance, and examining suggested best practices.


Myth and Reality about Predictive Coding



http://ow.ly/b5Tnf

An article by Jennifer Shockley posted on the ArnoldIT.com website on the Beyond Search blog.

This article looks at predictive coding technology, and provides a link to another article, "Will Predictive Coding Live Up to the eDiscovery Hype?", authored by Philip Favro.

Ms. Shockley states, "Reality is that Predictive Coding can’t exist without human’s providing the data, and then the program optimizes it. The process combines people, technology and workflow to find documents referencing keywords. The three basic components are:
  1. To predict utilizing predictive analytics.
  2. To code, utilizing a keyword to locate relevant documents.
  3. To process a proven workflow."

Tuesday, May 22, 2012

Automatic Deletion…A Good Idea?



http://ow.ly/b4qdn

An article by Bill Tolson posted on the eDiscovery 101 blog.

This article examines the practice of automatic deletion, which is routinely employed by many corporations. The article refers to a recent post by the same author examining "defensible deletion," which involves more of a managed process than automatic deletion.

The article states, "This brings up the subject of the Data Lifecycle. Fred Moore, Founder of Horison Information Strategies wrote about this concept years ago, referring to the Lifecycle of Data and the probability that the saved data will ever be re-used or even looked at again. Fred created a graphic showing this lifecycle of data.

Figure 1: The Lifecycle of data – Horison Information Systems

The above chart shows that as data ages, the probability of reuse goes down…very quickly as the amount of saved data rises. Once data has aged 90 days, its probability of reuse approaches 1% and after 1 year is well under 1%."

The author further provides concerns related to storing old electronic information, "For organizations, it’s a question of storage but more importantly, it’s a question of legal risk and the cost of eDiscovery. Any existing data could be a subject of litigation and therefore reviewable. You may recall in my last blog, I mentioned a recent report from the RAND Institute for Civil Justice which discussed the costs of eDiscovery including the estimate that the cost of reviewing records/files is approximately 73% of every eDiscovery dollar spent. By saving everything because you might someday need to reuse or reference it drive the cost of eDiscovery way up." A link to the recent blog referred to in the article is provided.

The author looks at possible ways in which to delete data, without violating legal and regulatory obligations.  The article states, "The key question to ask is; how do you get employees to delete stuff instead of keeping everything? In most organizations the culture has always been one of “save whatever you want until your hard disk and share drive is full”. This culture is extremely difficult to change…quickly. One way is to force new behavior with technology."
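As a concrete illustration of forcing new behavior with technology, the minimal sketch below (my own, not from the article) deletes files older than a retention window unless they fall under a legal hold; the 90-day window and the hold path are illustrative assumptions.

```python
# Illustrative technology-enforced retention sweep: files older than the
# retention window are deleted unless they sit under a legal-hold path.
import time
from pathlib import Path

RETENTION_DAYS = 90
LEGAL_HOLD_PATHS = {Path("/shares/finance/acme_litigation")}  # preserved regardless of age

def on_legal_hold(path: Path) -> bool:
    return any(hold == path or hold in path.parents for hold in LEGAL_HOLD_PATHS)

def sweep(root: Path, dry_run: bool = True) -> None:
    cutoff = time.time() - RETENTION_DAYS * 86400
    for path in root.rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff and not on_legal_hold(path):
            print(f"{'Would delete' if dry_run else 'Deleting'}: {path}")
            if not dry_run:
                path.unlink()

# Example (dry run first, so nothing is actually removed):
# sweep(Path("/shares"), dry_run=True)
```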

Reining In Discovery Costs Through Predictive Coding Programs



http://ow.ly/b4oLN

An article by Stephanie W. Yeung posted on the eDiscovery Alert website of the law firm Schnader.

This article looks at eDiscovery costs, and examines advancements in technology that may help control costs.  The article specifically discusses recent cases that utilized predictive coding technology.

The article states, "A recent study of electronic discovery costs shows that the total cost of production could range from $17,000 to $27 million, with a median of $1.8 million.(Footnote provided in the article) Because discovery expenses have the potential to explode, it often becomes the predominant focus and concern in litigation, demanding the lion’s share of the party’s resources. Recent advancements in e-discovery technology may provide a way to manage the costs, thus shifting the party’s resources and focus back to the legal issues in the litigation."

The article further states, "While the benefits of predictive coding are most dramatic in cases involving large amounts of reviewable electronic data, the savings in time and cost are universal."

Monday, May 21, 2012

eDiscovery Case Law: Another Case with Inadmissible Text Messages



http://ow.ly/b2Cmm

An article by Doug Austin posted on the eDiscovery Daily Blog.

This article examines the case of  Rodriguez v. Nevada, No. 56413, 2012 WL 1136437 (Nev. Apr. 5, 2012), in which text messages were ruled inadmissible due to a lack of supporting foundation.  A link to the case opinion is provided in the article.

The article follows up on a recent Pennsylvania state case, which also rejected text messages since there was no additional supporting evidence offered to substantiate the authenticity of the text messages.

The article states, "The court noted that “Text messages offer new analytical challenges when courts consider their admissibility. However, those challenges do not require a deviation from basic evidentiary rules applied when determining authentication and hearsay.” Further noting that “establishing the identity of the author of a text message through the use of corroborating evidence is critical to satisfying the authentication requirement for admissibility", the court concluded that when there has been an objection to admissibility of a text message, “the proponent of the evidence must explain the purpose for which the text message is being offered and provide sufficient direct or circumstantial corroborating evidence of authorship in order to authenticate the text message as a condition precedent to its admission”.

Since the state did not offer any corroborating evidence that the defendant authored 10 of the 12 text messages, those messages were ruled as inadmissible. The other two messages were deemed admissible and not considered to be hearsay because in those instances, the state was able to present bus surveillance video of the defendant participating in using the phone at the time those two messages were sent."

In Search of a Strategic Imperative for Managing Enterprise Content: Lessons from AIIM, ECM Vendors and the User Community



http://ow.ly/b2BHT

An article by Gary MacFadden posted on the enterprisecontentmanagement blog.

This article discusses information governance issues, and focuses on some of the various companies that offer technological solutions to issues that corporations must deal with while managing their electronic information.

The article states, "...personal devices and social media are radically changing work behaviors and technology adoption – for better or worse. Smartphones and tablets coupled with the rise in the use of social networks and other forms of cloud-based communication and collaboration pose challenges and opportunities for organizations that are struggling to balance privacy and security issues with the need to stay competitive, drive innovation and better serve their constituents."

In addition, the article further states,

"In the depths of the recent financial downturn, companies cut non-essential workers and adopted mostly siloed, departmental content management solutions to meet regulatory compliance and litigation support requirements and/or to reduce costs in accounting departments, customer service, logistics and other areas.

By all accounts, ECM solutions revenue, led by ediscovery, analytics and departmental ECM, grew steadily throughout the recession and, compared to manual processes, outmoded legacy systems or expensive service provider alternatives, these departmental solutions generally help organizations realize a substantial, often quick return on their investment (ROI).

However, IT and the business are now paying the price for reactively adopting technology. Lack of interoperability between ECM solutions and limited access across the enterprise to business-critical siloed content repositories increases complexity and risk, reduces ROI and saps productivity – not to mention making for a stultifying user experience. No wonder so many workers are disengaged."

The article goes on to discuss specific companies, such as Symantec, Recommind, and Nuix, which offer technological solutions to help tackle some aspects of the challenges associated with enterprise content management.

Friday, May 18, 2012

Cloud computing: a world-changing power



http://ow.ly/aZMc2

An article by Dr. Fang Binxing posted on the Guardian website.

This article defines and discusses "cloud computing" and provides insight into some of the impacts that the increased use of the cloud is having upon corporations.

The article states, "Cloud computing involves three key elements, namely, resource pooling, capability supply, and the service model. As a process, the cloud provider concentrates a mass of resources and seamlessly provides them for users."

The article goes on to discuss the three elements listed above, and also examines issues such as security and the economics of using the cloud.


Thursday, May 17, 2012

BREAKING NEWS! FBI’s Newly Announced Investigation of JPMorgan to Create More E-Discovery Issues



http://ow.ly/aYp06


An article by Mike Hamilton, J.D. posted on the E-Discovery Beat website of Exterro.

This article discusses the fact that the FBI will be investigating JPMorgan Chase based on possible criminal activity that has resulted in the recent loss of billions of dollars by the company.

The article states, "While the specific nature of the investigation remains unclear, the FBI and the Securities Exchange Commission (SEC) (under a prior ongoing investigation) are probing for information from JPMorgan. Along with these external pressures, JPMorgan is conducting its own internal investigation to examine their recent $2 billion dollar trading loss stemming from a string of trades on credit default swaps.

As JPMorgan treads its way through this tidal wave of regulatory pressure, its legal and compliance teams must remain acutely aware of the potential risk that is involved with properly conducting discovery in their own internal investigation as well as in the two other external investigations."

In addition, the article provides certain tips that should be used to ensure that any actions taken in order to fulfill eDiscovery obligations are legally defensible.  The article also mentions an upcoming webcast scheduled for June 7th on this topic.

The Nebulous Nature of eDiscovery




http://ow.ly/aYgxG


An article by Linda Sharp, Esq., MBA posted on the Modern Archivist blog.

This article discusses the case of Bradley B. Larsen v. Coldwell Banker Real Estate Corp. (C.D. Cal. Feb. 2, 2012), and looks at the question of what constitutes a "reasonable" eDiscovery request.

The article explains that the plaintiff's request for a second round of eDiscovery production, based on allegations that the defendant failed to comply with the first series of eDiscovery requests, was not supported by enough evidence of non-compliance by the defendant.

The article states, "Judge Goldman, citing the Sedona Conference Principles, deemed that the plaintiff must show that the defendant’s steps to preserve and produce relevant ESI were inadequate for a second round of production to be ordered. The plaintiff was unable to show enough facts to lead to this second round. Due to this lack of evidence, Judge Goldman dismissed the request.

Judge Goldman deemed that the collection of ESI performed by the defendant, which included over one-thousand man hours of processing, collecting, and reviewing ESI (costing in excess of $100,000), to be sufficient. This process produced nine-thousand pages of information of which the plaintiff could only cite two examples of information that might be missing."

The article goes on to mention that courts rely on "proportionality tests", but that it is still not an exact science as to what constitutes a sufficient response to a request for production.

eDiscovery Case Law: Inadmissibility of Text Messages Being Appealed



http://ow.ly/aYfMX

An article by Doug Austin posted on the eDiscovery Daily Blog.

This article examines an appeal filed in the case Commonwealth v. Koch, No. 1669-MDA-2010, 2011 Pa. Super. LEXIS 2716 (Sept. 16, 2011), in which a State court in Pennsylvania held that text messages were not admissible in the case, since the proper foundation was not offered to substantiate the authenticity of the evidence.

The article discusses a current appeal of the reference case, and states, "First, the justices will examine whether the text messages “were not offered for their truth” and were therefore admissible. The state questioned whether the Superior Court, in reversing a Cumberland County judge’s decision to admit the texts, had ruled against its own previous holding in another case and thusly created “uncertainty in the law.”
The high court is also tasked with reviewing the case in terms of Pa.R.E. 901, on “Requirement of Authentication or Identification.” According to the Tuesday allocatur grant, prosecutors asked the court to examine whether the Superior Court panel “misapprehended” Rule 901, again going against its own jurisprudence and again creating “uncertainty.”"

Wednesday, May 16, 2012

Fulbright Forum: International eDiscovery - When Cyber Workspaces Collide with U.S. Litigation



http://ow.ly/aWFYJ

The monthly Fulbright Forum series tackled international eDiscovery issues in May 2012, and video of the presentation is available via YouTube at the link above. The presentation provides PowerPoint slides with audio narration, and credits Lana Varney of Fulbright and John Unice of Bayer Corp.


5 Examples of Using eDiscovery To Find Evidence



http://ow.ly/aWEjs

An article by  David Kaufer posted on the Teris Sophisticated Litigation Support Blog.

This article provides 5 examples of specific litigation matters that involved some form of eDiscovery to locate relevant evidence.

The article provides narrative discussion of the following 5 cases, each of which had different issues that pertain to eDiscovery:

1. Oracle America v. Google
2. State of Oregon v. Urbina
3. Danny Lynn Electrical v. Veolia Es Solid Waste
4. Northington v. H & M International
5. United States v. Briggs
The article states, "Ediscovery is not just about finding supporting evidence which is used to prove or disprove a case; the electronic discovery process can also reveal if evidence has been destroyed or is missing. Sometimes the lack of evidence is just as elemental in winning a case as the evidence itself. The magic of ediscovery is that it addresses both sides of the evidence coin, while case histories are simultaneously building new guidelines regarding the responsibilities of companies to maintain proper electronic records."

Technology Assisted Review Backgrounder



http://ow.ly/aWDGV

A compilation of articles from various sources regarding topics related to predictive coding and/or technology assisted review, posted on the Unfiltered Orange website of Orange Legal Technologies and shared on Twitter by Rob Robinson.

This list provides a comprehensive collection of articles that should serve as a valuable resource regarding the topic of technology assisted review.

Tuesday, May 15, 2012

The Practitioners Role in eDiscovery 2.0



http://ow.ly/aVm4L

An article by Cat Casey posted on the Hudson Legal Blog.

This article looks at the impact of technology upon the legal profession, and discusses the use of predictive coding as opposed to a traditional attorney review process.

The article references the recent cases of Da Silva Moore and Global Aerospace, and provides links to both opinions. These recent developments provide insight into the courts' willingness to permit technology assisted review and advances such as predictive coding.

The article states, "In looking at the cases above, it is clear that there is a willingness of the bench to accept Technology Assisted Review (TAR). The key factor being the word “assisted”. Practitioners must take an active role in the implementation of these tools and use them in conjunction with fact driven case development to reap the rewards offered by predictive coding or any of the litany of other tools. Judicial approval lies insound processes and the intersection of man and machine, not merely in which widget is selected."

The article further looks at established workflows that are utilized during the predictive coding process. The article also mentions that these new developments impacting the attorney review process do not mean that the traditional manual review is dead, "There is no need to figuratively throw the baby out with the bath water, because traditional keyword searches coupled with Attorney Review can be highly effective to examine large amounts of data. The problems noted above arise when ill-conceived searches utilizing broad search terms that are over and or under inclusive lead to massive amounts of data being pushed to traditional manual review."



Was Samsung Deal a Watershed for use of Machine Translation in FTC Second Requests?



http://ow.ly/aVfe7

An article by Bob Ambrogi posted on the Catalyst Secure website.

This article discusses the recent approval of the use of machine translation in responding to an FTC Second Request issued during review of a prospective deal involving Samsung.

The article states, "...before the deal could be consummated, it had to await Second Request review by the FTC, a fast-track, discovery-like process that gives the FTC the documents and information it needs to evaluate whether the proposed transaction may violate antitrust laws.

One Million Pages to Translate

For the attorneys at Paul Hastings, the Second Request process was further complicated by one simple fact—the vast majority of the documents they had to produce were in Korean, but the Second Request process requires all documents to be produced in English. Based on the firm’s preliminary estimates, of roughly 350,000 documents they had collected, 200,000 were in Korean. At an estimated five pages per document, Paul Hastings estimated that they could be facing as many as a million pages of Korean-language text."

As the article outlines, the law firm Paul Hastings faced significant challenges associated with this Second Request; both time and cost were major hurdles. The firm relied upon machine translation to complete the response. The article states, "As the FTC was quick to note, the machine-translated documents did not have the same level of quality as the hand-translated materials. However, Paul Hastings took the position that the translations were sufficient to allow the agency to identify relevant materials for further review and to seek hand translation of particular documents as necessary. In the end, the Second Request process ran its course and the FTC allowed the merger to close.

In this case, at least, machine translation was sufficient to allow the process to reach its end. Not only that, but by using MT, Paul Hastings saved its client a huge expense."  The article provides further comments from Michael S. Wise, of Paul Hastings regarding the far-reaching implications of the Samsung deal.

P.S. The author states in closing, "The story of how lawyers can use technology to streamline e-discovery is not just about predictive coding. With machine translation, the story now has another chapter."


The Burden of E-discovery: Global Perspective



http://ow.ly/aVdtH

An article by Katey Wood posted on the ESG-Global website, providing insight into IT spending based on the results of a 2012 ESG survey, the "2012 IT Spending Intentions Survey".

The article states, "Analyzed by size of IT budget, organizations spending at least $50 million were three times as likely as those spending less than $5 million annually on IT products and services to label e-discovery a significant burden (see chart below). This could be because the sheer size and complexity of large enterprise IT environments makes e-discovery burdens that much more difficult, even with purpose-built solutions in place to address these legal requirements."

Graphs are provided in the article, revealing some interesting trends regarding eDiscovery spending and the burdens associated with eDiscovery obligations. The article also notes that Western European-based entities are finding eDiscovery obligations more burdensome than American-based organizations: "While North American organizations were slightly more likely than their Western European peers to classify e-discovery as a significant burden (17% vs. 16%), overall, 48% of Europeans find e-discovery activities to be an operational burden compared with 45% of Americans. By way of comparison, only 21% of organizations in Asia-Pacific considered e-discovery any kind of a burden at all."



Spoliation, Not Speculation: Good Faith in Evidence Destruction Claims



http://ow.ly/aVccL

An article by Chris Marzetti, Esq. posted on the Applied Discovery website.

This article discusses factors that the courts will take into account in deciding whether a motion for sanctions for spoliation should be granted.

The article states, "Courts may look to a number of things as evidence of good faith in discovery to counter an opposing party’s claim for sanctions. One court in Alabama recently pointed to a party’s use of an e-mail archiving system and of document review technology as measures sufficient to defeat a claim of spoliation."

The article cites the referenced Alabama case, "Danny Lynn Electrical & Plumbing, LLC v. Veolia Es Solid Waste Southeast, Inc., No. 2:09cv192-MHT (WO), 2012 U.S. Dist. LEXIS 62510 (M.D. Ala. May 4, 2012)" and provides a link to the opinion.  The court determined that spoliation sanctions should be granted only when there is intentional destruction or concealment of evidence.  The article further states, "The court found that the plaintiff’s argument rested upon not actual proof, but rather “speculation that critical evidence was in fact lost or destroyed.” It ruled there was no evidence the defendants permanently deleted any e-mails, aside from a few “accidentally deleted due to a computer virus or other unforeseen circumstance.” The court determined that the defendant’s use of a monthly archive system ensured the preservation of e-mail on backup tapes. Notably, the court also referred to the defendants’ purchase of “expensive document review technology” and an e-discovery database."


Monday, May 14, 2012

ILTA Insight Session - Predictive Coding: What is it?



http://ow.ly/aTjcF

A PowerPoint presentation by Epiq posted on the ILTA website as part of a continuing educational series.

The PowerPoint provides insight and information regarding "predictive coding" technology, and discusses the workflow associated with this type of technology. The presentation also provides information regarding the concepts of "recall" and "precision", which are vital metrics to measure in conjunction with an attorney review process.
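As a rough illustration of those two metrics, the sketch below (an assumed example, not taken from the Epiq presentation) computes recall and precision by comparing a predictive coding run against a sample of attorney-reviewed documents; the document IDs are hypothetical.

```python
# Illustrative recall/precision calculation for validating a predictive coding run.
def recall_precision(predicted_relevant: set, truly_relevant: set) -> tuple:
    true_positives = len(predicted_relevant & truly_relevant)
    recall = true_positives / len(truly_relevant)          # share of relevant docs the system found
    precision = true_positives / len(predicted_relevant)   # share of flagged docs actually relevant
    return recall, precision

# Hypothetical document IDs from a validation sample
predicted = {"DOC-001", "DOC-002", "DOC-003", "DOC-007"}
actual    = {"DOC-001", "DOC-003", "DOC-005", "DOC-007", "DOC-009"}

r, p = recall_precision(predicted, actual)
print(f"Recall: {r:.0%}  Precision: {p:.0%}")  # Recall: 60%  Precision: 75%
```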


eDiscovery Search and Collection in the Cloud



http://ow.ly/aThWb

An article by John Patzakis posted on the Next Generation eDiscovery Law & Tech Blog.

This article discusses eDiscovery practices associated with cloud computing services.

The article states, "The cloud means many things to many people, but I believe the real eDiscovery action (and pain point) is in Infrastructure as a Service (IaaS) cloud deployments (such as the Amazon cloud, Rackspace, or pure enterprise cloud providers such as Fujitsu). According to a recent PwC report, Cloud IaaS will account for 30% of IT expenditures by 2014. IaaS currently provides the means for organizations to aggressively store and virtualize their enterprise data and software, thus potentially spawning the same large data volumes and requiring the same critical search and eDiscovery requirements as traditional enterprise environments. Amazon Web Services, the leading IaaS cloud provider, reports in our discussions with them extensive customer eDiscovery requirements that are currently addressed by inefficient and manual means."

The article discusses the costly and time-consuming processes presently involved in collecting electronically stored information from a cloud-based computer system. The article promises to provide insight in future posts regarding a better method for handling eDiscovery obligations when the data is cloud-based.

Friday, May 11, 2012

More eDiscovery Cowbell?



http://www.complexdiscovery.com/info/2012/01/21/more-ediscovery-cowbell/

Joe Bartolo of this blog is pleased to be named one of this week's eDiscovery Cowbell honorees.

eDiscovery Case Law: Twitter Seeks to Succeed Where Defendant Failed



http://ow.ly/aQswu

An article by Doug Austin posted on the eDiscovery Daily Blog.

This article discusses a recent case in which Twitter objected to a court-ordered subpoena, and the three grounds Twitter relied upon in its motion to quash the subpoena.

The article states, "In People v. Harris, No. 2011NY080152 (N.Y. Crim. Ct.), Twitter opened its brief by indicating that section 2703(d) of the Stored Communications Act provides that “[a] court issuing an order pursuant to this section, on a motion made promptly by the service provider, may quash or modify such order, if. . . compliance with such order otherwise would cause an undue burden on such provider." Twitter identified three reasons that it argued the order compelling its production of the defendant’s information imposes an undue burden."

The article provides discussion of each of the three grounds, briefly summarized as follows:

  1. Twitter contended that the court’s ruling that the defendant has no proprietary interest in the requested information (and therefore no standing to challenge the subpoena) was in conflict with Twitter’s Terms of Service.
  2. Twitter also claimed that the order forces it to “violate federal law” and noted that the Stored Communications Act (SCA) has been held to violate the Fourth Amendment “to the extent that it requires providers to disclose the contents of communications in response to anything less than a search warrant.”
  3. Twitter also argued that “a criminal litigant cannot compel production of documents from a California resident like Twitter without presenting the appropriate certification to a California court, scheduling a hearing and obtaining a California subpoena for production.”
The article provides a link to another article on the same blog which addresses a portion of this issue related to the first rationale of Twitter's objection.  The article also provides citations of the statutes Twitter is relying upon, and case law cited by Twitter.

P.S.  This will be an important decision which will add some clarity as to the proper means that must be followed in order to legally obtain information from social media networks such as Twitter.  Will Twitter's 2nd argument regarding the requirement for a search warrant in order to comply with the SCA ring true?



Communication is King in E-Discovery Matters



http://ow.ly/aQryy

An article by Daniel Garrie posted on the Infosec Island website.

This article discusses the importance of communication within the context of the eDiscovery process.

The article states, "Today, a common vocabulary is certainly emerging making dialog between counsel and technologists both productive and effective. However, equally important is that lawyers engage their client’s technology team in dialog on e-discovery issues, because when counsel fails the fall out can be costly.

Ultimately the ramifications when a communication disconnect arises between legal and technology are penalties that can range from a mild public judicial admonishment to losing a case and incurring significant costs. A textbook example of such a communication breakdown is illustrated in Play Visions (Play Visions, Inc. v. Dollar Tree Stores, Inc., No. C09-1769 MJP (W.D. Wash. June 8, 2011))."

The article goes on to describe shortcomings by plaintiff's counsel: they failed to familiarize themselves with their client's document retention policies, did not participate in the identification of potentially relevant information, and failed to participate in identifying the custodians of that information.

The article describes the resulting sanctions against plaintiff's counsel, "This failure by counsel to communicate with their client’s in-house IT department resulted in the Court granting the Defendants motion for sanctions of $137,168.41 holding the Plaintiff and counsel jointly and severally liable.

Play Visions illustrates what happens when counsel fails to communicate to the client and client’s technology team about the proper management of discovery involving electronic records."

Thursday, May 10, 2012

Governance policies solve information paradoxes



http://ow.ly/aP55P

An article by Admire Moyo posted on the Web Enterprise Solutions website.

This article discusses the need for information governance practices to assist in managing the large volumes of electronically stored information that corporations must handle.

The article states, "Sound information governance policies can help organisations avoid the confusion that is being created by the increasing volumes of information they have to grapple with.

So said Debra Logan, research VP and analyst at Gartner, presenting yesterday, in Johannesburg, on how big data and extreme information are transforming business and IT.


“When it comes to information, we all have too much, but we almost never have enough. This is the information paradox. To solve the information paradox, organisations need to apply the principles of information governance,” she said."

The article goes on to further state, "Logan defined information governance as the specification of decision rights and an accountability framework to ensure the appropriate behaviour in the valuation, creation, storage, use, archival and deletion of information.

“This includes the processes, roles, standards and metrics that ensure the effective and efficient use of information in enabling an organisation to achieve its goals.”

Highlighting the paradoxical facts about information, Logan pointed out that, despite increases in volume and velocity of information, people seem to want more and more of it. On the other hand, she noted, as information creates more risk and requires more governance and policy to manage, enterprises seek to control it less and less."

'Pippins' and the Proportionality Debate



http://ow.ly/aP1QD

An article by Wayne Matus, John Davis and Peter Ostrovski published by the New York Law Journal, and posted on law.com on the LTN webpage.

This article discusses the need for proportionality with respect to requests for preservation and production of electronically stored information during litigation. The article looks at a specific case to help define the limits of preservation obligations, Pippins v. KPMG, No. 11 Civ 0377, 2012 WL 370321 (S.D.N.Y. Feb. 3, 2012).

The article states, "Courts are split on whether, and if so, how, a party may limit the scope of its preservation efforts in a given case commensurate with the likely significance of the information and the amount at issue. U.S. District Judge Colleen McMahon of the Southern District of New York recently added her views to the debate in Pippins v. KPMG, No. 11 Civ 0377, 2012 WL 370321 (S.D.N.Y. Feb. 3, 2012). The court endorsed the concept of proportionality, but pointedly refused to grant KPMG relief from full preservation activities, as a result of KPMG's perceived lack of cooperation in the discovery process and failure to demonstrate that the value of preservation was outweighed by the costs. This case provides useful lessons in evaluating and applying proportionality analysis to preservation obligations." A link to the case opinion is provided in the article.






ABA Program to Highlight E-Discovery Issues, Trends and Challenges



http://ow.ly/aP0c7

The link above provides information regarding an upcoming ABA event, to be held in New York City on May 18th, posted on the ABA website.

The provided information states, "Experts in trial practice will gather to examine potential changes to federal rules on electronic discovery, the challenges of protecting privileged information and the impact that technology will have on civil litigation during the American Bar Association’s Sixth Annual National Institute on E-Discovery. The conference, sponsored by the ABA’s Section of Litigation, is Friday, May 18, at the University Club of New York in New York City."  Links providing more information are provided in the ABA post.

U.S. Magistrate Judge John M. Facciola will be one of the panelists, and the information provided by the ABA states that the discussion will address such topics as: litigation hold; recent case law; updates on proposed Federal Rules Amendments; and attorney-client privilege within the context of electronically stored information.


5 Things Attorneys Need to Know About Social Media



http://ow.ly/aOZg4

An article courtesy of "JD Supra" posted on the Legal Technology Bytes website.

This article provides narrative regarding 5 different topics related to use of social media, and how it impacts the legal profession.

The 5 topics discussed are as follows:

1. Social Media Impacts Every Facet of Your Client’s Business and Your Business
2. Not All Social Media is the Same
3. Privacy Does Exist on Social Media
4. Ethics Rules Apply on Social Media
5. Social Media is Worth the Effort


Can Twitter Protect Your Data?



http://ow.ly/aOY4t

An article by Kendra Srivastava posted on the Mobiledia website.

This article discusses recent subpoenas issued to Twitter seeking information without a warrant, as part of criminal investigations against specific individual Twitter users.

The article states, "The social media company seeks to overturn a subpoena demanding account records for Malcolm Harris, a Twitter user arrested during last fall's Occupy protests."  The article goes on to further state, "Twitter also argues the subpoena violates its Terms of Service in requesting user records across state lines without a warrant.

This isn't the first time Twitter has been ensnared in legal matters on users' behalf and likely won't be the last. But although the company continues to advocate for account privacy, it faces increasing government opposition to its efforts.

The San Francisco-based social media company tried its best to shield the account information of key WikiLeaks suspects, moving to suspend subpoenas against them, but ultimately failed in its attempt.

U.S. District Judge Liam O'Grady in January ordered Twitter to hand over records, saying, "Petitioners knew or should have known that their IP information was subject to examination by Twitter.""


Wednesday, May 9, 2012

eDiscovery Trends: CGOC’s Information Lifecycle Governance Leader Reference Guide



http://ow.ly/aNlUz

An article by Doug Austin posted on the eDiscovery Daily Blog.

This article discusses information governance practices, and looks at ways to prevent sanctions for failure to preserve electronically stored information that may be requested during the discovery phase of litigation.

The author states, "With all of the recent attention on technology-assisted review and current case law related to that subject, it’s sometimes easy to forget that most sanctions are issued because of failure to preserve potentially responsive data. A sound Information Governance (aka Records Management) policy is the first step to enabling organizations to meet their preservation obligations by getting control of the data up front. Organizations such as EDRM and ARMA have focused on Information Governance and have even collaborated on a January report on Information Governance. Another organization focused on Information Governance is the Compliance, Governance and Oversight Council (CGOC). In the fall of 2010, CGOC issued its Information Governance Benchmark Report, which presented findings from their first survey of legal, records management (RIM) and IT practitioners in Global 1000 companies."  Links to the referenced reports are provided in the article, and there is also mention of a recent GGOC report focusing on the economics of eDiscovery.


The article further states, "For most organizations, information volume doubles every 18-24 months and 90% of the data in the world has been created in the last two years. In a typical company in 2011, storing that data consumed about 10% of the IT budget. At a growth rate of 40% (even as storage unit costs decline), storing this data will consume over 20% of the typical IT budget by 2014. Accumulating, storing and litigating data without value is simply no longer an economically viable proposition. The 36 page Information Lifecycle Governance Leader Reference Guide (written by Deidre Paknad and Rani Hublou) provides a program for operationalizing an effective defensive disposal program for expired data and overcome the barriers to do so."  The article provides a link that will require registration to download the referenced 36-page report.
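The budget arithmetic quoted above compounds quickly. The short Python sketch below is purely illustrative: it assumes simple compounding of the quoted 40% annual growth on a 10% budget share over the three years from 2011 to 2014, and ignores the declining storage unit costs the article mentions, which is why the figure actually cited is "over 20%" rather than the full compounded number.

# Illustrative only: simple compounding of the figures quoted in the article.
share_2011 = 0.10      # data storage as a share of the IT budget in 2011
annual_growth = 0.40   # quoted annual growth rate of stored data

share_2014 = share_2011 * (1 + annual_growth) ** 3   # three years of growth
print(f"{share_2014:.1%}")   # ~27.4% before accounting for falling storage unit costs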

The author describes the content of the Information Lifecycle report as being divided into 5 sections (not including the introduction and conclusion), as follows:

  • Defining Program Strategy
  • Setting Quantifiable Cost and Risk Reduction Goals 
  • Operationalizing the Strategy 
  • Program Leadership
  • Process Maturity and Management

P.S.  With the continuing convergence of eDiscovery, regulatory compliance, and information governance within corporations, this Information Lifecycle report can be a useful resource for companies currently addressing these issues.

Lawyers and Social Media: What could possibly go wrong?



http://ow.ly/aNl9h

An article by Thomas J. Watson published in Wisconsin Lawyer and posted on the State Bar of Wisconsin's website, www.wisbar.org.

This article discusses potential risks associated with attorneys' use of social media networks.

The article addresses various issues and states, "Although a social medium can be a cost-effective way to reach a large audience, you risk losing control of your message, creating unrealistic client expectations, inadvertently creating an attorney-client relationship, and running afoul of the rules of professional conduct.

The question is: How can you operate effectively and appropriately in a Web-based environment without leaving yourself vulnerable to a malpractice claim or a complaint to the Office of Lawyer Regulation?"



Tuesday, May 8, 2012

Random Sample Calculations And My Prediction That 300,000 Lawyers Will Be Using Random Sampling By 2022




http://ow.ly/aLTMu

An article by Ralph Losey, Esq. posted on his blog e-Discovery Team®.

This article discusses random sampling, and provides detailed analysis and future projections based on mathematical formulas supplied by the author.  Based on the author's projections, between 270,000 and 330,000 attorneys are expected to be using random samples to perform attorney review by 2022, along with some form of technology-assisted review after the samples have been tagged.

The author states, "Random sampling is still a rare exception in U.S. legal culture. And therein lies the problem, at least in so far as e-discovery quality control is concerned. Sampling now has a very low prevalence rate.

But those of us in the world of e-discovery are used to that. There are still very few full-time specialists in e-discovery. This is changing fast. It has to in order for the profession to cope with the exploding volume and complexity of written evidence, meaning of course, evidence stored electronically. We e-discovery professionals are also used to the scarcity of valuable evidence in any large e-discovery search. Relevant evidence, especially evidence that is actually used at trial, is a very small percentage of the total data stored electronically. DCG Sys., Inc. v. Checkpoint Techs, LLC, 2011 WL 5244356 at *1 (N.D. Cal. Nov. 2, 2011) (quoting Chief Judge Rader: only .0074% of e-docs discovered ever make it onto a trial exhibit list). Again, this is a question of low prevalence. So yes, we are used to that. See Good, Better, Best: a Tale of Three Proportionality Cases – Part Two; and, Secrets of Search article, Part Three (Relevant Is Irrelevant)."  Links to the other informative articles referenced by Mr. Losey are provided in his article.

The article goes on to state, "Assuming that by the year 2022 there are 1.5 Million lawyers (the ABA estimated there were 1,128,729 resident, active lawyers in 2006), I predict that 300,000 lawyers in the U.S. will be using random sampling by 2022. The confidence interval of 2% by which I qualified my prediction means that the range will be between 18% and 22%, which means between 270,000 lawyers and 330,000 lawyers. I have a 95% level of confidence in my prediction, which means there is a 5% chance I could be way wrong, that there could be fewer than 270,000 using random sampling, or more than 330,000."
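The arithmetic behind that range is straightforward; here is a minimal Python sketch using only the figures Mr. Losey quotes (1.5 million lawyers, a 20% point estimate, and a 2% confidence interval):

# Figures taken from Mr. Losey's prediction quoted above.
total_lawyers = 1_500_000   # assumed U.S. lawyer population in 2022
point_estimate = 0.20       # 300,000 of 1.5 million
interval = 0.02             # plus or minus 2% confidence interval

low = total_lawyers * (point_estimate - interval)    # 270,000
high = total_lawyers * (point_estimate + interval)   # 330,000
print(f"{low:,.0f} to {high:,.0f} lawyers")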

Mr. Losey provides some intriguing mathematical formulas to support his projections.  In addition, the author  provides rationale as to why the legal profession needs to further embrace random sampling during the discovery phase.  The article states, "For my purposes as an e-discovery lawyer concerned with quality control of document reviews, this explanation of near certainty is the essence of random probability theory. This kind of probabilistic knowledge, and use of random samples to gain an accurate picture of a larger group, has been used successfully for decades by science, technology, and manufacturing. It is key to both quality control and understanding large sets of data. The legal profession must now also adopt random sampling techniques to accomplish the same goals in large-scale document reviews."

The article goes on to discuss issues such as "prevalence," which refers to the percentage of relevant information within a larger corpus.  The article also provides links to RaoSoft's "calculator," which can be used to determine the number of documents that must be reviewed to achieve a valid sample size, based on the desired confidence level, confidence interval, and estimated prevalence.

Mr. Losey states, "Here is one way of expressing the basic formula behind most standard random sample size calculators:
n = Z² x p(1-p) ÷ I²

Description of the symbols in the formula:

n = required sample size

Z = confidence level (The value of Z in statistics is called the “Standard Score,” wherein a 90% confidence level=1.645, 95%=1.96, and 99%=2.577)

p = estimated prevalence of target data (richness)

I = confidence interval or margin of error

Putting the formula into words – the required sample size is equal to the confidence level squared, times (the estimated prevalence times one minus the estimated prevalence), then divided by the square of the confidence interval."
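For readers who want to experiment with the numbers, the quoted formula translates directly into a few lines of Python. This is a minimal sketch of the generic sample size formula, not Mr. Losey's or RaoSoft's own calculator; the default values below (95% confidence, a 2% confidence interval, and 50% prevalence) are common starting assumptions rather than anything prescribed by the article.

import math

# Standard Scores (Z) for the confidence levels cited in the article
Z_SCORES = {0.90: 1.645, 0.95: 1.96, 0.99: 2.577}

def sample_size(confidence=0.95, prevalence=0.50, interval=0.02):
    """Required sample size: n = Z^2 * p(1 - p) / I^2"""
    z = Z_SCORES[confidence]
    return math.ceil(z ** 2 * prevalence * (1 - prevalence) / interval ** 2)

print(sample_size())   # 2,401 documents at 95% confidence, +/- 2%, 50% prevalence

Note that a prevalence of 50% is the most conservative assumption, since p(1 - p) is largest at p = 0.5; lower prevalence estimates reduce the required sample size for the same confidence level and interval.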

P.S.  The conclusion section of the article provides a nice recap of the formulas relied upon by the author and the methods used to derive the anticipated results for the year 2022.