Monday, April 30, 2012
An article by Sophie Ross posted on the insidecounsel.com website.
This article discusses approaches being used to optimize eDiscovery and control costs.
The article states, "As the legal community continues to confront e-discovery, corporate legal departments are beginning to develop different methodologies to strategically manage the process. This operationalizing of e-discovery offers corporations and in-house counsel a number of benefits, including cost and time efficiencies. By creating some structure around the e-discovery process and proactively negotiating with vendors, in-house counsel can move away from unpredictable per-unit pricing models in exchange for more transparent fee structures."
The article goes on to outline the "piecemeal approach," compare it to the "holistic approach," and look at ways corporate law departments are dealing with their eDiscovery obligations.
Friday, April 27, 2012
An article by Luis Salazar published by the Daily Business Review and posted on law.com on the LTN webpage.
This article discusses predictive coding, and the recent order of Magistrate Judge Peck, which was upheld by Judge Carter in an order entered April 26th.
The article states, "There are a number of important points for legal professionals to take from the decision. First, in high-stakes, high-data litigation, the use of intelligent technology like predictive coding is inevitable, and failure to use it may ultimately be a strategic risk.
Second, quality control in the customizing of the predictive-coding search process to the specific case is critical to both its success and its permissibility as a discovery tool.
But in order for predictive coding to be used effectively, transparency between opposing counsel as to its use is indispensable. And, lawyers must embrace -- perhaps they even have a duty to embrace -- this technology as a critical tool in delivering cost-effective services to their clients."
An article by Pamela Lewis Dolan posted on the amednews.com website.
This article looks at recent surveys of healthcare providers regarding regulatory compliance and data security.
The article states, "
The Healthcare Information and Management Systems Society surveyed 250 senior health information technology and data security officers on behalf of Kroll Advisory Solutions, a risk-management firm whose services include data security and data-breach response. The officers reported that they were prepared to meet compliance regulations. On a scale of one to seven, with one being “not at all compliant” and seven being “compliant with all applicable standards,” respondents reported that they were an average of 6.64 in terms of meeting regulations set by the Centers for Medicare & Medicaid Services, a 6.62 for meeting HIPAA regulations, and a 6.41 for meeting state security laws.
However, evidence continues to mount that despite the compliance, health organizations, particularly physician practices, are vulnerable to data breaches. Verizon’s “2011 Data Breach Investigations Report” stated that small organizations, including physician practices, represented the largest number of data breaches in 2011. A previous Kroll report said physician practices were at risk for breaches because they are “the path of least resistance,” with basic security protections overlooked as practices focus on meeting regulatory requirements."
Thursday, April 26, 2012
This is a copy of the court order entered by Judge Andrew L. Carter, Jr. upholding the order of Magistrate Judge Andrew J. Peck, which compelled the parties to use predictive coding technology for the attorney review phase of the litigation and sought a protocol from the parties outlining the use of such technology. The plaintiffs in the case had objected to Judge Peck's order, seeking further expert testimony regarding the protocol for the use of predictive coding, as well as seeking to remove Judge Peck from the case.
The Order states, "Plaintiffs object to the February 8 discovery rulings, the ESI protocol, and the February 24 opinion and order, arguing, inter alia, that the predictive coding method contemplated in the ESI protocol lacks generally accepted reliability standards, that the use of such method violates Fed. R. Civ. P. 26 and Federal Rules of Evidence 702, that Judge Peck improperly relied on outside documentary evidence in his February 24 opinion and order, that MSLGroup’s expert is biased because the use of the predictive coding method will reap financial benefits for the company, that Judge Peck failed to hold an evidentiary hearing, and that he adopted MSLGroup’s version of the ESI protocol on an insufficient record. Plaintiffs request that the Court overturn the Magistrate Judge’s rulings because they are erroneous and contrary to law....Plaintiffs also submitted a letter requesting that Judge Peck recuse himself from the action, which Judge Peck denied on April 2, 2012, but allowed them to file a formal motion."
The Order fully upholds Judge Peck's earlier ruling, and further states, "The lack of a formal evidentiary hearing at the conference is a minor issue because if the method appears unreliable as the litigation continues and the parties continue to dispute its effectiveness, the Magistrate Judge may then conduct an evidentiary hearing. Judge Peck is in the best position to determine when and if an evidentiary hearing is required and the exercise of his discretion is not contrary to law. Judge Peck has ruled that if the predictive coding software is flawed or if Plaintiffs are not receiving the types of documents that should be produced, the parties are allowed to reconsider their methods and raise their concerns with the Magistrate Judge. The Court understands that the majority of documentary evidence has to be produced by MSLGroup and that Plaintiffs do not have many documents of their own. If the method provided in the protocol does not work or if the sample size is indeed too small to properly apply the technology, the Court will not preclude Plaintiffs from receiving relevant information, but to call the method unreliable at this stage is speculative."
An article by Mary Pat Gallagher published by the New Jersey Law Journal, and posted on law.com on the LTN webpage.
This article looks at recently proposed changes to eDiscovery rules for criminal matters in New Jersey, which have been made available in a report of a New Jersey Supreme Court Committee; a link to the report is provided in the article. The article states, "Many suggestions address electronic discovery, such as adding "electronically stored information" to the list of materials covered by discovery; allowing discovery to be made using "CD, DVD, e-mail, internet or other electronic means"; requiring Alcotest data to be provided in a readable digital database; establishing uniform fees and formats; and having the judiciary institute computer training courses for both judges and lawyers.
Electronic documents would have to be provided in PDF format. Those not in PDF or some other readily accessible format would have to be accompanied by a self-extracting computer program that would enable access.
A list or index would have to be supplied with discovery provided in electronic format so that the recipient would not have to open up every file to find out what was provided, which the committee called a "common complaint."" The article goes on to provide further obligations that are suggested for the prosecutors regarding what digital information they must produce to the defendants.
An article by Greg Buckles posted on the eDiscovery Journal website.
This article discusses changes in the eDiscovery service provider landscape. The article points out that, according to a keynote at a recent IPRO conference, the number of IPRO service providers has shrunk while the revenue they generate has grown.
The article states, "One interesting statistic from Jim King’s (IPRO CEO) keynote was a 17% decline in the number of IPRO service providers while the actual revenue they generated was up 12%. That’s right, the IPRO channel is shrinking (just like the eDiscovery market) but the volume is still growing. This resonated with the panelist perspectives on the increasing need to find alternatives to volume pricing ($/GB). Providers need to differentiate their offerings with standardized processes, project expertise, transparent invoicing and metric-rich reporting to survive the transformation from commodity vendor to managed service provider."
Wednesday, April 25, 2012
An article posted on the Orange Rag Law Blog, no author credit provided.
This article discusses a case involving Global Aerospace, and looks at a court order that requires the parties to utilize predictive coding technology for the attorney review phase of litigation.
The article states, "On Monday 23rd April, Judge James H. Chamblin of the Circuit Court of Loudoun County, Virginia, issued the first known order authorizing the use of predictive coding methods over the objections of opposing counsel. This ruling will allow the defendants in Global Aerospace Inc. v. Landow Aviation Limited Partnership, et al., No. 61040, and nine other consolidated cases, to use predictive coding technology to cull large quantities of electronically stored information (ESI), distilling them down to a reasonable volume."
The article further mentions, "“We were very pleased to be able to show the scientific accuracy of predictive coding to a court in a formal hearing setting,” said Dr. (Herbert) Roitblat. “Keyword searching seems to be perfectly acceptable to attorneys, even though several studies have focused on its inaccuracy. If keyword searching with 20 percent proven accuracy is okay, how can predictive coding with more than 90 percent demonstrable accuracy be unacceptable? I see this as the first step in that mental barrier coming down for lawyers.”
The court ordered: “Defendants shall be allowed to proceed with the use of predictive coding for purposes of the processing and production of electronically stored information. . . . This is without prejudice to a receiving party raising with the Court an issue as to completeness or the contents of the production or the ongoing use of predictive coding.”"
An article from the Daily Dot posted on the Mashable website.
This article discusses a recent criminal case in New York State, in which a subpoena was issued to Twitter seeking information posted on a Twitter account that may substantiate allegations of disorderly conduct during an Occupy Wall Street protest.
The article states, "“New York courts have yet to specifically address whether a criminal defendant has standing to quash a subpoena issued to a third-party online social networking service seeking to obtain the defendant’s user information and postings,” wrote Judge Matthew Sciarrino Jr. in his decision. “Nonetheless, an analogy may be drawn to the bank record cases where courts have consistently held that an individual has no right to challenge a subpoena issued against the third-party bank.”
“Twitter’s license to use the defendant’s tweets means that the tweets the defendant posted were not his,” the judge added." The article provides a link to the judge's decision.
Tuesday, April 24, 2012
An article by Stacey Higginbotham posted on the Gigaom website.
The article examines the impact that Unstructured Data is having upon the internet, and the author points out some areas of concern.
The article states, "Cheap computing and the ability to store a lot of data at a low cost have made the concept of big data a big business. But amid the frenzy to gather that data, especially unstructured information scraped from or accessed via crawling web sites, companies might be pushing the boundaries of polite (or ethical) behavior. They may also be stealing valuable IP. So is it stoppable and could the current solutions lead to the demise of the open web?"
The article looks at the trend of "scraping" of websites by companies that are trying to create databases to track information. The article states, "But pinging a web site to grab its information exacts a toll on the site, and an overzealous crawler or hundreds of sites gathering data at any one time could create problems for the crawled site. A bunch of rapid web crawls can look similar to a denial of service attack, simply because the site has to respond to so many requests."
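The crawl-load concern in the passage above can be made concrete with a small throttling sketch. This is illustrative only and not from the article; the `PoliteFetcher` class and its one-second default interval are assumptions, showing one common way a crawler spaces out requests to a single host so its traffic does not resemble a denial-of-service attack.

```python
import time
from urllib.parse import urlparse

class PoliteFetcher:
    """Track, per host, how long a crawler should wait before its next request."""

    def __init__(self, min_interval=1.0):
        self.min_interval = min_interval  # minimum seconds between hits to one host
        self.next_slot = {}               # host -> earliest polite fetch time

    def wait_time(self, url, now=None):
        """Return seconds to sleep before fetching url, and reserve the slot."""
        host = urlparse(url).netloc
        if now is None:
            now = time.monotonic()
        earliest = self.next_slot.get(host, now)
        delay = max(0.0, earliest - now)
        # Reserve the next polite slot for this host after the current fetch.
        self.next_slot[host] = now + delay + self.min_interval
        return delay
```

A real crawler would also honor robots.txt and back off on errors; this sketch captures only the request spacing that keeps a crawl from hammering one site.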
The article discusses recent trends where service providers block the ability to scrape their data, in order to keep more control and possibly even charge those entities that want access to certain data. The article states, "
Most businesses recognize that their value is in owning the end user and the end users’ data, whether or not the end user herself recognizes that. Twitter’s massive valuation isn’t based on its platform, it’s based on the information it has about the tweets hitting its servers every second.
Even a company like Yelp, which went public based on the content provided by users — content that it vigorously defended from Google’s indexing — is taking advantage of user-generated data to enrich itself. Protecting that asset from becoming scraped, commoditized and turned into revenue for others seems like a no-brainer."
An article by Mikki Tomlinson posted on the eDiscovery Journal website.
This article discusses pricing issues associated with eDiscovery service providers and asks whether lower prices are reducing the quality of services.
The article states, "One might argue that because of the advances in technology, the vendors are in a better position to continue to lower their prices, yet maintain their ability to provide quality services and profit. Perhaps that is so. However, I still feel a little uneasy. Let’s break it down. What does it take to manage a collection/processing/production project? People. Process. Technology.
People: This is not a “push this button and you are done” environment. Those involved in the collection/processing/production pieces need to have good, solid knowledge of what they are doing and why. Granted, labor loads are lightened by technology improvements that decrease the time needed for processing, loading, conversion, and production activities. However, the labor expertise required to handle ESI has not decreased. If anything, it has increased. So, while the amount charged by a service provider goes down, the cost of quality employees does not.
Process: Developing solid and efficient processes is key in eDiscovery. These processes are oftentimes the advantage that one service provider has over another. You can’t afford to not have good process. Again, while the amount charged by the service provider goes down, the effort in developing and carrying out solid and efficient processes does not.
Technology: This is where we perhaps have the most impact on pricing. As stated above, technology has certainly evolved in ways that make the entire process move faster. But has the service provider’s cost of technology really gone down enough to bring the prices down to $75 per GB from collection to review?"
The article then mentions that a $75-per-GB price model needs to be closely examined, as a true "apples to apples" comparison can sometimes reveal that a pricing model is not as advantageous as it seems.
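The "apples to apples" caution can be illustrated with a rough cost comparison. All figures, stage names, and volume factors below are hypothetical, not from the article; the sketch only shows why a single all-in $/GB rate and an itemized model must be compared on the volumes each line item actually bills against.

```python
def project_cost(gb_collected, model):
    """Estimate project cost under a per-GB pricing model (hypothetical numbers)."""
    cost = gb_collected * model.get("per_gb_all_in", 0.0)
    # Itemized models bill each stage on a different volume: processing may
    # expand archives, while hosting and production bill on culled data.
    for rate, volume_factor in model.get("stages", {}).values():
        cost += gb_collected * volume_factor * rate
    return cost

flat = {"per_gb_all_in": 75.0}
itemized = {"stages": {
    "processing": (40.0, 1.3),  # billed on expanded volume
    "hosting":    (15.0, 0.6),  # one month, billed on culled volume
    "production": (25.0, 0.1),  # billed on produced volume
}}
```

For a 100 GB collection the flat model totals $7,500 and the itemized model $6,350 for a single month of hosting; extend the hosting line item over the life of a long case and the itemized total can overtake the flat rate, which is exactly the kind of hidden difference a close comparison is meant to surface.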
Monday, April 23, 2012
The Evolving Trends of Social Media eDiscovery: Tidbits from the Masters Series 2012 in San Francisco
An article by Kevin L. Nichols regarding information provided at the recent Masters Conference in San Francisco regarding eDiscovery and social media.
The article states, "Further, social media’s role in litigation is intensifying drastically. Trial attorneys are using SM to research opposing parties and even potential jurors in real time via tablets, laptops, and smartphones in order to determine whether or not they would be receptive to the attorneys’ arguments based on their postings, pictures, and comments via LinkedIn, Facebook, and Twitter. Moreover, defamation, cyber-bullying, and employment litigation cases have become more and more prevalent since SM’s ascension as a household name and activity. Companies are openly using SM to weed out applicants. The presenters stated that 25% of companies research applicants using SM. Out of that 25%, 85% said that they were less likely to hire these individuals if their postings were unprofessional.
Companies need to be mindful of some of the hazards of conducting such investigations, such as Title VII of the Civil Rights Act of 1964 (race, religion, sex, ethnicity, sexual orientation), the Age Discrimination Employment Act of 1967, the Americans with Disabilities Act, the Fair Credit Reporting Act, and various insurance claims that may arise because of causes of action based on same. The advice given to companies by the panel was to take action. Companies should come up with policies and procedures regarding SM, develop a technological solution to prohibit use of SM at work, and monitor their own SM outlets to make sure that they defend their brand." Links are provided by the author to the various acts referenced in the article.
An article by Joshua Gilliland, Esq. on his blog the Bow Tie Law Blog.
This article discusses a situation in which a third-party provider, in this case a psychic services provider, held electronic communications belonging to a client; the plaintiff in the matter had received a request to produce data held by the third-party provider.
Ultimately, the plaintiff was required to produce most of the information on its own, and to seek from the third party only information solely within the third party's control. The court declined to analyze whether the Stored Communications Act would prevent the third party from producing the information, since the plaintiff had access to the information and could also consent to the production of the requested relevant information.
The article states, "I think Judge Maas got this case right on producing ESI highly comparable to social media information. Instead of propounding discovery on third parties with lengthy analysis of the Stored Communication Act, or compelling a producing party to surrender their login credentials to a requesting party, the burden should be on the producing party to review and produce relevant electronically stored information.
Discovery over email does not require passwords and login credentials being surrendered to a requesting party to review email messages at will. Moreover, cases involving the mirror imaging of hard drives do not allow a requesting party to review the entire contents of someone’s digital life. In most situations, the producing party can review for relevance or privilege.
Social media should be no different. Relevancy should not be ignored simply because of “friend requests” or Tweets."
An article by Ralph Losey, Esq. on his blog e-Discovery Team®.
This article looks at a Rand Study regarding eDiscovery expenses, and examines the findings that document review is the primary cost associated with the eDiscovery process.
The article states, "The Rand Corporation is a well-known and prestigious non-profit institution. Its stated charitable purpose is to improve policy and decision-making through research and analysis. It has recently turned its attention to electronic discovery. Rand concluded, as have I, and many others, that the primary problem in e-discovery is the high cost of document review. They found it constitutes 73% of the total cost of e-discovery. For that reason, Rand focused its first report on electronic discovery on this topic, with side comments on the issue of preservation. The study was written by Nicholas M. Pace and Laura Zakaras and is entitled Where The Money Goes: Understanding Litigant Expenditures for Producing Electronic Discovery. It can be downloaded for free, both a summary and the full report (131 pages). A nicely bound paper version can be purchased for a modest fee of $20." Links to the summary and full report are provided in Mr. Losey's article.
The article further states, "The Rand Report does more than just recommend the use of advanced technology, it actually endorses one particular type of technology, my friend Predictive Coding. That’s right, this prestigious, non-profit, independent group has reached the same conclusions that I have, and many, many others have (in fact, you would be hard pressed to find any bona fide expert to argue against the idea of predictive coding). It is now official. Predictive coding is the best answer we have to the problem of the high costs of e-discovery. Of course, there will be good faith debates for years to come on the best methods to use this new technology, and in what cases it is appropriate."
Friday, April 20, 2012
An article by Ross Cunningham appearing in Texas Lawyer, and also on law.com on the LTN webpage.
The article discusses ongoing eDiscovery obligations under Texas and federal law that extend beyond the initial meet and confer and the initial judge's conference.
The article states, "More often than not, in 2012, attorneys come to a Rule 26(f) conference prepared. They know a client's information technology infrastructure; they know that asking for backup tapes can sometimes be more of a curse than a blessing; and they even know that discussions about cost-shifting in production of information should occur on the front end of the case rather than the back end.
Despite all of this, attorneys continue to fall well short of the mark in completing the e-discovery process. Much like the dieter on New Year's Day, attorneys approach a new case with the best of intentions of searching for and producing ESI. They make the initial inquiry and make their initial good-faith production.
However, six months to a year into the process, most lawyers forget a key obligation. Under Texas and federal rules, all parties have an ongoing duty to supplement discovery responses. This means an ongoing duty to verify continued implementation of the litigation hold. This means an ongoing duty to search custodians for additional responsive information. This means an ongoing duty to update searches on servers, phones, and other ESI sources. A failure to supplement can mean missing out on key evidence -- or even the imposition of sanctions."
The article goes on to look at specific obligations that must be fulfilled by counsel in order to avoid possible sanctions.
An article by Ralph Losey, Esq. posted on his blog e-Discovery Team®.
The article discusses three cases that pertain to proportionality issues and examines objections to eDiscovery production requests.
The article states, "Three cases came out recently on proportionality, the key legal doctrine to discovery based on Federal Rules 26(b)(2)(C), 26(b)(2)(B)(iii), and 26(g)(1)(B). I-Med Pharma Inc. v. Biomatrix, 2011 WL 6140658 (D.N.J. Dec. 9, 2011) (GOOD); U.S. ex rel McBride v. Halliburton Co., 272 F.R.D. 235 (D.D.C. 2011) (BETTER); DCG Sys., Inc. v. Checkpoint Techs, LLC, 2011 WL 5244356 (N.D. Cal. Nov. 2, 2011) (BEST). Since all three cases embody proportionality, they are all good. But some are better looking than others.
The quality of the application of the doctrine in these cases is directly tied to the parties' timing. In the best case the issue was raised fast, even before discovery. It was raised in the 26(f) conference and 16(b) hearing." The "best case," the author points out, is the DCG Sys. matter.
The author looks at the I-Med Pharma case, and notes that the plaintiff was seeking relief from the defendant's request. The author questions why the plaintiff's counsel agreed to the keyword list provided by the defendant in that matter in the first place. Mr. Losey states, "...when I say keyword search sucks, as I did in Secrets of Search: Part One, this is the kind of search I am referring to: the blind guessing, Go Fish, linear kind with no quality controls. I am certainly not referring to the kind of iterative keyword search on steroids that we see in Kleen Products, which appears to be almost as good as predictive coding based hybrid multimodal methods."
The article goes on to further discuss I-Med by saying, "Plaintiff’s counsel finally woke up and discovered proportionality (here is where we get to the better late than never part), when the forensic expert searched the unallocated space of their client’s computer system and found 64,382,929 hits covering the equivalent of 95 Million pages of documents! Based on the complete failure to limit the search to custodians, or date, and the Go Fish type list of keywords, this result, in Judge Debevoise’s own words, should come as no surprise."
The article then goes on to further examine proportionality, and the author discusses examples provided by himself, and U.S. Magistrate Judge Facciola, during a recent CLE, where Mr. Losey used artwork as an example to illustrate proportionality, and Judge Facciola used classical music to illustrate the same point.
Thursday, April 19, 2012
A blog post by Rees Morrison, Esq. posted on the Law Department Management blog.
This article discusses eDiscovery costs, and examines recent survey results.
The article states, "In Met. Corp. Counsel, March 2012 at 16, an FTI consultant shares some findings from FTI’s interviews last fall with 31 in-house counsel. The topic was e-discovery and the participants were primarily from huge U.S. companies. He writes, “In spite of greater emphasis and attention on e-discovery, corporations still don’t have a concrete understanding of how much they spend year over year.” The consultant was also surprised that “most participants don’t yet have a line-item tracking system for all expense areas [of e-discovery].”"
The blog post goes on to provide some insight into why eDiscovery costs are not tracked more effectively.
An article by Barry Murphy posted on the eDiscovery Journal website.
This article discusses eDiscovery, and the impact that cloud computing is having upon preservation and processing of electronically stored information.
The article states "While companies embrace The Cloud for various business purposes, the ability to conduct eDiscovery on information stored in The Cloud tends to be an afterthought – less than 16% of respondents in eDJ’s survey last year reported creating an eDiscovery plan before moving data to The Cloud. This number is not surprising. eDiscovery is not exactly the sexiest topic in the world and, unless a company has been burned before, there is less urgency to prepare for it. In addition, many just assume that, as long as data is searchable, eDiscovery requirements are met. It is not as simple as that, unfortunately. Recently, though, eDJ was briefed on a product aimed at making eDiscovery of data stored in The Cloud possible in an efficient manner."
The article goes on to look at techniques being employed by eDiscovery service providers that are aimed at dealing with data that is based in the cloud.
Digital Evidence, Digital Investigations and E-Disclosure: A Guide to Forensic Readiness for Organizations, Security Advisers and Lawyers
A report by Peter Sommer posted on the iaac.org website.
This report discusses information that should be made part of a forensic readiness plan.
The report states, "Peter Sommer identifies the need for a Forensic Readiness Plan, closely related to a Disaster Recovery Plan. He highlights the importance of enterprises having sound plans to identify, collect and preserve digital evidence in forms that will prove robust against testing in legal proceedings. With this groundwork, he judges that directors and senior managers should be able to develop a corporate plan of action that meets the specific needs of their organisation."
Wednesday, April 18, 2012
An article by Sean Doherty posted on law.com on the LTN webpage.
The article discusses the Da Silva Moore case, in which U.S. Magistrate Judge Andrew J. Peck had requested a protocol from the parties regarding the use of predictive coding technology during the attorney review process.
The article states, "The e-discovery dispute in Monique da Silva Moore, et al. v. Publicis Group SA, et al. (Case No. 11-CV-1279), in the U.S. District Court for the Southern District of New York, took one step closer to a reality show on April 13, when plaintiffs filed a formal motion to recuse or disqualify Magistrate Judge Andrew Peck.
The case gained notoriety from Peck's approval of a protocol for e-discovery that included predictive coding technology using Recommind's Axcelerate product. After plaintiffs had previously submitted a letter to Peck requesting him to recuse himself from the case, they filed a motion (.pdf) with a supporting memorandum (.pdf), which included over 200 pages of exhibits." Links to the referenced .pdf files and the exhibits are provided in the article.
The essence of the plaintiffs' arguments is an allegation of bias against Judge Peck, since he has written articles and made remarks in public speeches advocating the use of predictive coding workflows. The plaintiffs allege that Judge Peck is biased toward the defendant's proposal seeking the use of a specific service provider's technology, and they claim that further expert testimony (and perhaps "Daubert" hearings) should be required to formalize the use of predictive coding by the parties.
An article by Stephen E. Arnold posted on the arnoldit.com website.
This article discusses "big data" and looks at results from a recent study by SAS. The article provides a link to a report entitled "Data Equity: Unlocking the Value of Big Data".
The article states, "The main point of the study is that every industrial sector will be forced to deal with big data. Okay, as news flashes go, this is not one which lit up the Beyond Search editorial team. We did notice a number of interesting charts. The one reproduced below shows how much uptake in big data occurs by industrial sector today and in 2017. The key point is that the numbers and bars show big data becoming a “big” deal."
An article by Nolan Goldberg, posted on the Privacy Law blog on the law firm Proskauer's website, discusses a new ABA Resolution urging U.S. courts to respect the privacy regulations of foreign jurisdictions when such regulations impact U.S. litigation.
The article states, "Litigants navigating the conflict between U.S. discovery obligations and foreign data protection laws have a new ally, the American Bar Association (“the ABA”). The ABA recently passed Resolution 103, which “urges” that:
[W]here possible in the context of the proceedings before them, U.S. federal, state, territorial, tribal and local courts consider and respect, as appropriate, the data protection and privacy laws of any applicable foreign sovereign, and the interests of any person who is subject to or benefits from such laws, with regard to data sought in discovery in civil litigation." A link to the full text of the resolution is provided in the article.
The article further states, "The ABA’s involvement with this issue is particularly timely, as it has recently become apparent that new data analytic technologies have weakened the effectiveness and reliability of anonymization, one of the primary mechanisms available to litigants to navigate cross border discovery conflicts. See e.g., The Practice of Law in the Age of Big Data, Nat. L. J., April 11, 2011." A link to the reference article is also provided by the author.
Tuesday, April 17, 2012
An article by Evan Koblentz posted on the EDD Update website.
The article discusses the delayed release of the official 2011 TREC Legal Track results, and states, "
Official results from the 2011 TREC Legal Track were scheduled for public release on March 15. "We've been promising it in the next week or two for the last four or five weeks," Legal Track coordinator and University of Waterloo professor Gordon Cormack told me today. Asked if the overview will be published before April is through, "I sincerely hope so," he said.
Unofficial results, dated Oct. 24, 2011, are already circulating but are not endorsed by TREC." A link to the TREC study is provided in the article.
An article by Greg Buckles posted on the eDiscovery Journal website.
This article discusses the recent case in which U.S. Magistrate Judge Andrew J. Peck requested a protocol from the parties for the use of predictive coding technology.
The article states, "Last week, Barry Murphy and I spoke with Paul Neale, plaintiff consultant/expert in Da Silva Moore and Kleen Products. Unlike some other sideline bloggers, I am going to resist the urge to debate the merits of matters still before the bench or allow my pulpit to be used to put forth either side’s agenda. Instead, I will continue to derive insight from the interesting technical and procedural elements exposed in both cases. In case you have been actually working instead of reading reams of filings, here is my overview:
Da Silva Moore – the parties agreed to use predictive coding to tackle ~3.3 million items, but disagree on how to measure and define relevance success.
Kleen Products – Defendants have invested heavily in a more ‘traditional’ Boolean search strategy that Plaintiffs want to replace with a TAR methodology.
So beyond the use of TAR for relevance determination, what do these matters have in common? Mr. Neale asserts that, “the plaintiffs in both cases are trying to incorporate methods to measure the accuracy of the defendant’s productions.” I agree in general, though I do believe that there is merit in transparent discussion and analysis of the chosen methodology and technology to assess any potential gaps or necessary exception categories. Both cases are stuck on methodology specifics when they should be defining measurement standards. Essentially, focus on the outcome over the process."
Monday, April 16, 2012
An article by Ralph Losey, Esq. on his blog e-Discovery Team®.
This article is part 2 in a series, and provides a link to part 1. The article looks at specific cases in which proportionality of the discovery requests was an issue being raised during the litigation. The article states, "Compared to I-Med Pharma and U.S. ex rel McBride, DCG Systems is the best of the lot. DCG Sys., Inc. v. Checkpoint Techs, LLC, 2011 WL 5244356 (N.D. Cal. Nov. 2, 2011) It is better than the rest because of timing. The issue of proportionality of discovery was raised in DCG Systems at the beginning of the case. It was raised at the 26(f) conference and 16(b) hearing as part of discovery plan discussions. That is what the rules intended. Proportionality protection requires prompt, diligent action."
The author further states, "A key lesson of these three cases is that timing is everything. Consider proportionality from the get go, and remember that it is not only based on the protective order rule, 26(b)(2)(C), it is based on the rule governing a requesting party’s signing a discovery request. I am talking about the Rule 11 of discovery, Rule 26(g)(1)(B)(iii)..."
The article goes on to provide a discussion of the model patent order. In addition, the article looks at other resources regarding the topic of proportionality, and looks at other precedent case law as well.
Friday, April 13, 2012
An article by David Braue posted on the stuff.co.nz website.
This article provides insight into certain risks associated with using a cloud computing services provider.
The article states, "Web-based services like Apple's iCloud; file-sharing site DropBox; Adobe's Photoshop Express; Microsoft's new Office 365 apps and Windows Live services; photo-sharing sites like Facebook, Yahoo's Flickr and Smugmug; and Google Docs and related apps - these, and hundreds of other online services, are built around the idea that the software providers will look after your data for you, forever.
Since they promise to look after that data, you assume they're backing it up and will always make it available to you - but what if they don't, or can't? What if, as in the case of file-sharing site Megaupload, they are unexpectedly closed down? Or, as now-defunct hosting provider Distribute.IT found out last year in dramatic style, what if they're hacked, data protection measures fall over, and countless gigabytes of crucial customer data is lost?" Links to the referenced examples are provided in the article.
An article by Erin Hendrix posted on the eLL (e Lessons Learned) blog site.
This article discusses the case of Synopsys, Inc. v. Ricoh Co., 661 F.3d 1361 (Fed. Cir. 2011). Ricoh sought to avoid having to share costs for an eDiscovery review platform, arguing that it was not necessary to the case; the plaintiff had agreed to share the costs of the production with Ricoh. In addition, Synopsys sought to recover the copying costs it had incurred for paper-based documents.
The article states, "Parties in modern litigation cannot hope to avoid costs associated with production of discovery by labeling electronic services as “modern conveniences.”
Despite the court’s pronouncement, Ricoh was saved by the fact that Synopsys had agreed to share the cost of Stratify (the parties agreed to use the Stratify platform as the means of production).
Synopsys also erred in the area of document production, but it was not as fortunate as Ricoh. The district court originally awarded Synopsys costs related to document copying costs. The issue was whether the costs claimed by Synopsys were actually associated with documents handed over to Ricoh. The court sided with other circuits which have held that “a list of costs and expenses must be adequately detailed, identifying the purpose of each expenditure.” Synopsys did not meet this burden. Many of its records were very vague, with some simply stating that the copies were for “document production.” The court vacated the district court’s award and remanded the issue. While the court did not ultimately come to a conclusion on this issue, it is clear that generic record keeping is not sufficient to uphold an award of costs under 28 U.S.C. §1920."
Thursday, April 12, 2012
The link above provides a recap of the Cowen Group's Sixth Annual Salary Survey for law firm litigation support professionals.
The survey result summary states, "2011 was a pivotal year for executives who are managing litigation support departments at major law firms. A handful of firms purchased next generation technologies and invested in new talent to support and operate these systems.
eDiscovery Practice Groups converged with litigation support departments, causing firms to invest in attorneys to manage technologists, cases, and client relationships. Some law firms struggled with how to offer competitive, value added support by evaluating widely varying service models and alternate billing arrangements. Our 2011 Market Landscape and Salary Survey show Directors, Firm-wide Managers, and eDiscovery Practice Group Heads have assumed elevated executive leadership roles—managing departments that spend millions of dollars on salaries alone, and in return realizing comparable revenue.
With our sixth annual Salary Report, The Cowen Group has focused on helping executives gain better insight into the human capital costs of running a litigation support department. In 2011, we asked firm leaders to share department headcount, compensation, billable hours, and billable rates. For the first time, we have attempted to size the marketplace, contextualize the ratio of salary to revenue, and help firm leadership gain clarity about their competitive future.
In 2011, law firms employed approximately 2,600 litigation support and eDiscovery professionals, spending $200 million on salaries and realizing $275 million in hourly revenue. 57% of firms anticipate adding staff in 2012, creating 375 positions—a market growth rate of 13.7%."
The survey provides detailed insight into salaries of various positions, and can certainly serve as a useful resource for anyone employed within the litigation support profession.
A blog post by Sharon Nelson posted on the Ride the Lightning blog site.
This article discusses the impact that predictive coding a/k/a technology assisted review might have upon the legal profession.
The article states, "Whether you call it technology-assisted review, predictive coding, or any one of its zillion names, we have been predicting that contract lawyers would take a hit.
Hat tip to Casey Horne for sending me a whimsical blog post by a contract attorney who has already been felled by technology. Blogger Rich Merritt entitles his post, "You've Been Replaced . . . By An Algorithm."" A link to the referenced blog post is provided in the article.
Wednesday, April 11, 2012
An article by Chuck Rothman posted on the eDiscovery Journal website.
This article, which is part 2 of a series, discusses processing and looks at what takes place during the de-duplication process. The article looks at emails, attachments, and Microsoft Office documents.
The author states, "...when conducting a native review, what format should the native email, without attachment, take? In many cases, it ends up being the email extracted from its container, with all the attachments still embedded! This leads to some issues:
1. From a cost perspective, the volume is nearly doubled – you pay for the email file including the size of all attachments, plus you pay for each extracted attachment.
2. If the review environment is web based (as many of the more modern ones are), the amount of time a reviewer waits for a record to appear on their screen is directly related to the size of the file being downloaded. If a very short email contains many attachments, a reviewer could wait 30 seconds or more before being able to review one or two lines of text. If a review database contains 100,000 such emails, that’s an extra 50,000 minutes, or 833 hours of review time that has been added solely because of the way the email was processed.
3. If producing native files, it is impossible to redact attachments to an email if the email is produced.
A more sensible way to process emails is to think of it as like a zip file – the email container contains an email body and zero or more attachments. Each attachment, as well as the email body, is extracted from the email container file, and the whole group is linked together."
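The "email as container" model the author recommends — body and attachments extracted as separate, linked records — can be sketched with Python's standard email library. The message text and the control number below are invented for illustration; a real pipeline would read .eml or .msg files from a collection:

```python
import email
from email import policy

# A tiny in-memory multipart message standing in for a real collected .eml file
raw = (
    b"From: a@example.com\r\nTo: b@example.com\r\n"
    b"Subject: Q2 report\r\nMIME-Version: 1.0\r\n"
    b"Content-Type: multipart/mixed; boundary=BOUND\r\n\r\n"
    b"--BOUND\r\nContent-Type: text/plain\r\n\r\nPlease see attached.\r\n"
    b"--BOUND\r\nContent-Type: application/pdf\r\n"
    b"Content-Disposition: attachment; filename=report.pdf\r\n\r\n%PDF-1.4 ...\r\n"
    b"--BOUND--\r\n"
)

msg = email.message_from_bytes(raw, policy=policy.default)

# Separate the body from each attachment, keeping a family link
family_id = "DOC-0001"  # hypothetical control number linking the family
body = msg.get_body(preferencelist=("plain",)).get_content()
attachments = [
    {"parent": family_id, "filename": part.get_filename()}
    for part in msg.iter_attachments()
]

print(body.strip())   # the short email body, reviewed on its own
print(attachments)    # each attachment becomes a separate, linked record
```

Reviewed this way, a two-line email loads as a two-line record, and each attachment is fetched (and can be redacted or produced) independently while the `parent` link preserves the family relationship.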
An article by Herbert L. Roitblat, Ph.D. posted on the Orca Tec blog Information Discovery.
This article discusses the plaintiffs' objections regarding the protocol requested by Magistrate Judge Andrew J. Peck in the Da Silva Moore case, which sought information regarding the proposed use of predictive coding technology during the attorney review.
This article states, "The current disagreement embodied in the challenge to Judge Peck's decision is not about the use of predictive coding per se. The parties agreed to use predictive coding, even if the Plaintiffs now want to claim that that agreement was conditional on having adequate safeguards and measures in place. Judge Peck endorsed the use of predictive coding knowing that the parties had agreed. It was easy to order them to do something that they were already intending to do.
Now, though, the Plaintiffs are complaining that Judge Peck was biased toward predictive coding and that bias somehow interfered with him rendering an honest decision. Although he has clearly spoken out about his interest in predictive coding, I am not aware of any time that Judge Peck endorsed any specific measurement protocol or method. The parties to the case knew about his views on predictive coding, and, for double measure, he reminded them of these views and provided them the opportunity to object. Neither party did. In any case, the point is moot in that the two sides both stated that they were in favor of using predictive coding. It seems disingenuous to then complain about the fact that he spoke supportively of the technology." A link to the plaintiff's memo objecting to Judge Peck's order is provided in the article.
An article by Karl Schieneman posted on the Doc Review MD Blog.
This article discusses the case of Da Silva Moore, in which Magistrate Judge Andrew J. Peck had requested a protocol from the parties regarding the use of predictive coding technology during the attorney review process.
The article states, "The common strand in both of these matters is offering some comfort to an adversary that most of the ESI that is related to a case is in fact being produced and the gaps in production are not intentional but are caused by search and retrieval limitations.
Both cases are about this same issue and are less about predictive coding than people realize.
The Da Silva Moore plaintiffs are comfortable that predictive coding works better than key word searching. What they are not comfortable with is how they can be sure they received the relevant documents and that is what is at the heart of their debate over the size of the sampling to be done to validate the process. The Kleen plaintiffs want predictive coding to be used because they know identifying key words in an antitrust case is really hard to do. They believe their best chance to find the relevant ESI they hope to find is to use some form of predictive coding across a wide number of data sources and custodians. The key theme here with both parties is a lack of comfort from parties receiving the ESI on what they are receiving and a desire to receive what they are entitled to."
Tuesday, April 10, 2012
An article by Richard Ruyle, posted on the Legal IT Professionals website.
This article examines some of the differences between a SQL database and a Flat-File database system, and provides some insight into which might be better for specific situations.
The article states, "Both SQL and file based systems are affected by data storage speed and capability. SQL Server is a fantastic and sometimes amazing solution; but if the requirements do not dictate the need, then a quality file based database is a much better choice, eliminating the complexity and the need for specialized hardware and personnel while performing equally as well. A decision should be based on features, technology, and the company that created the application instead of just the platform or data structure."
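The trade-off the author describes can be illustrated with Python's standard library, which ships both an embedded SQL engine (sqlite3) and simple flat-file handling (csv). This is a toy sketch of the two data-access styles, not a benchmark; the document IDs and tags are invented:

```python
import csv
import io
import sqlite3

rows = [("DOC-001", "responsive"), ("DOC-002", "privileged")]

# Flat-file approach: a delimited text file, scanned sequentially
buf = io.StringIO()
csv.writer(buf).writerows(rows)
buf.seek(0)
flat_hit = next(r for r in csv.reader(buf) if r[0] == "DOC-002")

# SQL approach: the same data, but indexed and queryable
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE docs (id TEXT PRIMARY KEY, tag TEXT)")
con.executemany("INSERT INTO docs VALUES (?, ?)", rows)
sql_hit = con.execute(
    "SELECT tag FROM docs WHERE id = ?", ("DOC-002",)
).fetchone()

print(flat_hit[1], sql_hit[0])  # both approaches find the same record
```

For two rows the approaches are indistinguishable; the article's point is that the choice should turn on the application's actual requirements (concurrency, volume, query complexity) rather than on the platform label.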
An article by Barry Murphy posted on the eDiscovery Journal website.
This article discusses information governance, and looks at approaches to managing corporate records and information.
The article states, "As electronic data grows exponentially, managing the risk that information poses is harder and harder. For every effort a company takes to safeguard information, employees create a workaround if that effort impinges on the velocity of information. In turn, those workarounds lead to a vicious circle of eDiscovery nightmares.
A major issue that comes up for companies is over-preservation when conducting legal holds. Many companies are now mature enough to create aggressive retention policies to undertake defensible disposition of information. That sounds good in theory, but can be difficult to implement in practice."
There is a link to an information governance survey in the article, which offers a chance to win $250 for participating.
Spoliation — Failure to Prove Evidence Ever Existed Precludes Sanction - Corporate “Best Practices” Do Not Represent Policy That, If Violated, Might Be Deemed Spoliative
An article posted on the Gregory P. Joseph Law firm website.
This article discusses the case of Gomez v. Stop & Shop Supermarket Co., 670 F.3d 395 (1st Cir. 2012). The plaintiff in a slip and fall claim sought spoliation sanctions against the defendant supermarket because the supermarket had a security camera system, yet no videotape of the plaintiff's fall was ever produced by the defendant.
The article states, "...the party urging that spoliation has occurred must show that there is evidence that has been spoiled (i.e., destroyed or not preserved). Tri-County Motors, Inc. v. Am. Suzuki Motor Corp., 494 F. Supp. 2d 161, 177 (E.D.N.Y. 2007).
The plaintiff falls woefully short of meeting these requirements. He relies on three facts to support his contention that the defendant destroyed a videotape of the accident: the defendant had a store security system that employed a series of cameras; the defendant had exclusive control over that system; and no videotape was produced during discovery. These facts are true but, without more, they are inadequate to show spoliation." The article goes on to provide other case law which held that there is no reasonable inference created that a security system in a store must have captured an incident that occurred within the store.
Monday, April 9, 2012
An article by Mark Michels posted on law.com on the LTN webpage.
This article provides information regarding discussions being held by the Federal Judicial Advisory Committee regarding possible new eDiscovery Civil Procedure Rules, to further revise the changes that took place in 2006.
The article states, "The Federal Advisory Committee on Rules of Civil Procedure met on March 22 and 23, 2012, in Ann Arbor, Mich., to consider, among many other issues, the proposed preservation amendments to the Federal Rules of Civil Procedure, a process which began at the May 10-11, 2010, "Duke Conference." Additionally, the Advisory Committee may consider whether it wishes to explore a rule imposing certain limitations on ESI discovery scope. (The meeting agenda and materials can be found in the article on the link posted above.) There are significant roadblocks to progress on these FRCP amendments. Assuming that these obstacles are overcome, however, the proposed amendments would not likely come into effect before December 2015." Links to the referenced meeting materials are provided in the article.
An article by Sandy Serkes posted on the Valora Technologies Blog.
This article discusses the terms "predictive coding" and "automated review", and provides definitions that show differences between the terms. The author provides definitions for 3 types of "automated review", only one of which is the equivalent of "predictive coding."
The article states, "Predictive coding, in which a topic-expert manually codes a "seed set" of documents (and the software follows suit) is a type of automated review. There are 2 other types.
A second approach to automated review is called Rules-Based Coding, in which a set of rules is created to direct how documents should be coded, very similar to a Coding Manual or a Review Memo that might be prepared for a group of on- or off-shore contract attorneys. The preparation of the Ruleset is typically done by some combination of topic experts, attorneys and technologists. The rules are run on the document population and it is evaluated, tweaked and run again until all parties are satisfied.
The third approach to automated review is called Present & Direct, in which software takes a first, unprompted assessment of the documents and puts forth a graphical representation (pretty charts and diagrams) of what the data contains. This is sometimes called Early Case Assessment or Data Visualization."
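The Rules-Based Coding approach described above — a ruleset drafted like a review memo, run against the population, then evaluated and tweaked — can be sketched in a few lines of Python. The rules and sample documents here are invented for illustration; a real ruleset would be far larger and iterated with the review team:

```python
# Each rule pairs a condition with a code, like an instruction in a review memo.
# First matching rule wins, mirroring how a reviewer applies a coding manual.
rules = [
    (lambda d: "merger" in d.lower() or "acquisition" in d.lower(), "responsive"),
    (lambda d: "attorney-client" in d.lower(), "privileged"),
]

def code_document(text, default="not responsive"):
    """Apply the ruleset in order; fall back to the default code."""
    for condition, code in rules:
        if condition(text):
            return code
    return default

docs = [
    "Board deck on the proposed merger timeline",
    "ATTORNEY-CLIENT privileged: litigation strategy memo",
    "Lunch menu for the Q3 offsite",
]

for d in docs:
    print(code_document(d))  # responsive / privileged / not responsive
```

The evaluate-tweak-rerun loop the author mentions corresponds to inspecting the coded output, adjusting the `rules` list, and running it over the population again until the parties are satisfied.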
After three years of flat to negative growth, 2011 was the year that the nation's 250 largest law firms started getting bigger again. Headcount among NLJ 250 firms was up a collective 2,132 lawyers, which represents growth of 1.7%. Below is a list of the Top 25; for the complete list, click on the above link.
Rank 2011   Rank 2010   Firm Name
1           1           Baker & McKenzie
5           5           Latham & Watkins
6           7           White & Case
22          18          Morrison & Foerster
23          21          McDermott Will & Emery
24          22          Ropes & Gray
25          31          Winston & Strawn
Thursday, April 5, 2012
An article by John Patzakis on the eDiscovery Law and Tech blog.
This article discusses certain issues related to eDiscovery processes that involve data which is based in cloud computing infrastructures.
The article states, "According to a recent PwC report, Cloud IaaS will account for 30% of IT expenditures by 2014. IaaS currently provides the means for organizations to aggressively store and virtualize their enterprise data and software, thus potentially spawning the same large data volumes and requiring the same critical search and eDiscovery requirements as traditional enterprise environments. Amazon Web Services, the leading IaaS cloud provider, reports in our discussions with them extensive customer eDiscovery requirements that are currently addressed by inefficient and manual means. So for purposes of this discussion, IaaS, which is essentially cloud for the enterprise and where there is a current significant eDiscovery challenge, is what we will focus on." The article promises to provide further information about how to process data that resides in the cloud, without the need for a time consuming export process.
An article by Charles Skamser posted on the eDiscovery Paradigm Shift blog.
This article discusses predictive coding technology, and the author examines some of the technology assisted attorney review services provided by Equivio. The author states, "Predictive coding has captured the imagination of the eDiscovery market. As a result, my daily conversations with members of the legal departments of the global 2000 and the eDiscovery professionals within the major law firms now routinely include questions about how predictive coding works, what predictive coding technologies are available and which predictive coding vendors I recommend."
The article further mentions, "In early 2009, Equivio made the gutsy decision to enter the unknown and highly under appreciated predictive coding wilderness with Relevance, its standalone predictive coding platform. Along with a handful of other predictive coding pioneers such as Recommind, Orcatec and Xerox, Equivio set out to convince the legal community that this new software with its complex mathematical algorithms and confusing statistical models could do a better (i.e. more statistically significant) job of identifying relevant documents than human reviewers. And, although I haven’t asked Equivio about the early financial returns on this bet, I would suspect that the initial missionary marketing efforts were tough and didn’t produce a financial return on their investment. However, Equivio didn’t give up. And, with the recent landmark court decision by Southern District of New York Magistrate Judge Andrew Peck on the Da Silva Moore case opening the flood gates for the legal tolerance for computer assisted review (aka predictive coding), they are now well positioned as one of the few legacy players in the predictive coding market."
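The "seed set" idea behind these tools — a topic expert codes examples, and the software scores the rest of the population against them — can be illustrated with a deliberately tiny sketch. This is pure illustration: none of the vendors named above publish their algorithms, and real systems use far more sophisticated statistical models than this bag-of-words centroid; all documents and the threshold are invented:

```python
import math
from collections import Counter

def vectorize(text):
    """Naive bag-of-words vector for a document."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Seed set coded by a (hypothetical) topic expert: (text, is_relevant)
seed = [("pricing agreement between competitors", True),
        ("quarterly pricing strategy fixed with rival", True),
        ("holiday party catering order", False),
        ("office supply restock request", False)]

# Build a centroid from the relevant seed documents
relevant = Counter()
for text, is_relevant in seed:
    if is_relevant:
        relevant += vectorize(text)

def predict(text, threshold=0.2):
    """Score an unreviewed document against the relevant centroid."""
    return cosine(vectorize(text), relevant) >= threshold

print(predict("draft agreement on pricing with competitors"))  # True
print(predict("catering invoice for the holiday party"))       # False
```

The sampling and measurement disputes discussed in the Da Silva Moore coverage above are, in effect, arguments over how to validate the output of this kind of scoring — how many documents to sample, and at what confidence level — rather than over the scoring itself.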
Tuesday, April 3, 2012
An article by Doug Austin posted on the eDiscovery Daily Blog.
This article provides information about the case Jacob v. Duane Reade, Inc., 11 Civ. 0160.
This article looks at waiver of privilege, and examines the order for the case which discusses the factors involved in determining if privilege has been waived.
The article states, "Magistrate Judge Theodore Katz of the US District Court for the Southern District of New York found that a privileged, two-page email that was inadvertently produced did not have to be returned and that the privilege had been waived because the producing party, Duane Reade, had failed to request its return in a timely manner. According to Defendants' counsel, the ESI production involved the review of over two million documents in less than a month; that review was accomplished with the assistance of an outside vendor and document review team."
The article goes on to further state, "In determining whether an inadvertent disclosure waives privilege, courts in the Second Circuit have adopted a middle of the road approach. Under this flexible test, courts are called on to balance the following factors: (1) the reasonableness of the precautions to prevent inadvertent disclosure; (2) the time taken to rectify the error; (3) "the scope of the discovery;" (4) the extent of the disclosure; and (5) an over[arching] issue of fairness.”
The Court ruled that the production of the email was inadvertent and that Duane Reade had employed reasonable precautions to prevent inadvertent disclosures (such as drafting lists of attorney names, employing search filters and quality control reviews). However, given the over two month time frame for the Defendants to request return of the email, the Court determined that the privilege was waived because the Defendants did not act “promptly to rectify the disclosure of the privileged email.”"
An article by Kevin Kelly posted on the e-Discovery Law Review website of Cozen O'Connor.
This article discusses the case In re Delta/Airtran Baggage Fee Antitrust Litigation, 2012 U.S. Dist. LEXIS 13462 (N.D. Ga. February 3, 2012), and provides information about the sanctions entered for eDiscovery abuses in that case.
The article discusses the fact that Delta had produced documents in two separate matters, both involving allegations of price fixing regarding baggage fees. The government was involved in one of the two matters and noticed that there were documents which had not been produced to the plaintiffs in the class action suit of private claimants. Delta then apparently admitted to certain errors made in the production for the class action matter, and found that certain electronic systems had not been searched.
The article states, "The investigation revealed two sources of unproduced documents. First, the contents of a number of custodian hard drives were never uploaded to Delta’s electronic information management program. Only documents uploaded to the program were reviewed for production. Second, during its investigation, Delta’s IT personnel found an unmarked box containing backup tapes of server information in the office that manages document discovery responses. The tapes contained documents that were relevant to the baggage fee litigation. Delta eventually released an additional 60,000 pages of documents to Plaintiffs.
As a result, Plaintiffs sought discovery sanctions from Delta. The court found that Delta was subject to sanctions for two violations. First, the court determined that Delta had failed to make a reasonable inquiry into the completeness of its discovery responses as it had certified pursuant to FRCP 26(g). The court indicated that Delta should have done a better job making sure that the IT department followed its instructions and produced the correct documents. Based on the violation of FRCP 26(g), the court awarded Plaintiffs reasonable expenses, including attorney’s fees, caused by Delta’s violation. Second, the court determined that Delta had breached its obligation to supplement its discovery requests pursuant to FRCP 26(e). According to the court, Delta should have been more diligent in searching such an obvious area for tapes related to the investigation." Attorney fees, costs and other monetary sanctions were issued due to Delta's errors.