Friday, March 30, 2012

Technology: The evolution of e-discovery orders



http://ow.ly/9YLw7

An article by James Hanft posted on the insidecounsel.com website.

This article discusses model orders in various jurisdictions, and looks at how they address eDiscovery.

The article states, "Early e-discovery rules mandated that the parties confer and agree on procedures. The problem was that court intervention was required to resolve roadblocks, leading to delays and uncertainty. The District of Delaware solved this dilemma by making the e-discovery rules a “default standard” that applies in the absence of agreement. The Federal Circuit and Texas Model Orders stepped back from this approach, appearing to tacitly acknowledge that there is no one-size-fits-all approach and instead made the rules discretionary by providing a model that may be adopted by the court.

The most prevalent forms of ESI for most businesses roughly come in three categories:
  • Shared business documents that typically reside on a central network or on individual computers, such as word processing, spreadsheets, presentations and occasionally separately saved emails, usually managed through a document management system
  • Emails and instant messages managed through a centralized email or IM server
  • All other types (e.g., accounting records, drawings, engineering documents, source code, graphics and pictures), typically generated and/or managed through a specialized program.

Recognizing the realities of modern business practices and that there are, at times, differences between what is requested by a litigant and what actually leads to useful admissible information, recent model orders treat email discovery separately and subject it to strict limitations. Litigants must now specifically propound email production requests, identify the custodian, search terms and time frame in the request, and are limited as to both the number of custodians and the number of search terms."

In addition, the author further points out, "One of the more interesting points of divergence is on the issue of proportionality and cost-shifting. The Federal Circuit model order stresses the shifting of costs for disproportionate ESI production requests and/or dilatory tactics. The Texas model order moved away from this and followed the more historical approach of leaving it within the purview of Fed. R. P. 37. This may be taken as an effort to control the cases involving non-practicing entities, where discovery burdens are typically one-sided, as opposed to competitor cases."

Two Worlds of eDiscovery: Ignorance and Denial



http://ow.ly/9YGwC

The link above is to the Organization of Legal Professionals Spring Journal for 2012.  The Journal contains several articles and interviews regarding issues related to technology and eDiscovery.  The journal includes an article entitled, "Why Predictive Coding Works", authored by Herb Roitblat, Ph.D.

Making Knowledge Exchange Work



http://ow.ly/9YFkX

An article on V. Mary Abraham's blog, Above and Beyond KM.

This article provides a link to a video that discusses Knowledge Management and looks at ideas shared via the Sustainable Learning Project and the Involved Project. Links to information about the two referenced projects are also provided.

The video offers the following recommendations:
  1. Design knowledge exchange into your work.
  2. Make sure you systematically represent the needs and priorities of everyone who’s likely to use your work.
  3. Make sure knowledge exchange is a two-way process.
  4. Create a safe space in which people can share opinions and existing knowledge, and generate new knowledge together.
  5. Deliver tangible outcomes that people involved in your work want as soon as possible.
  6. Create a culture of trust where everyone’s knowledge is valued and people stay engaged.
  7. Reflect and evaluate so you can refine your practice.

Thursday, March 29, 2012

An overview of state e-discovery rules



http://ow.ly/9Xk4E

An article by David Canfield posted on the insidecounsel.com website.

The article states, "The Federal Rules approach tackles e-discovery by listing ESI, in Rule 34, as a discoverable source of information, and also addresses ESI throughout rules 16, 26, 34 and 37. As noted by Thomas Y. Allman in his 2012 article “E-Discovery in Federal and State Courts after the 2006 Federal Amendments,” today:
  • 30 states follow the Federal Rules approach in whole or in part
  • Texas, Idaho and Mississippi follow an embodiment of the Texas rules approach
  • Other states take different approaches that don’t necessarily include legislative action."  A link to the referenced article by Thomas Allman is provided by the author. 

How to Secure the Cloud



http://ow.ly/9XfPa

An article by Jeffrey Roman posted on the bankinfosecurity.com website.

The article consists of an interview with David Rockvam regarding security issues related to cloud computing.

The article states, "'Instead of you running and managing that authentication solution or having your own certificates, why not go to a provider that puts that in the cloud for you,' Rockvam asks.

In an exclusive interview, Rockvam discusses:
  • How organizations are coping with BYOD;
  • Cloud computing, authentication and other security challenges;
  • The topics discussed at RSA Conference 2012."  

Wednesday, March 28, 2012

eDiscovery Trends: Three Years Later, “Deleted” Facebook Photos Still Online



http://ow.ly/9VYM8

An article by Doug Austin posted in the eDiscovery Daily Blog.

This article discusses Facebook's practices regarding deleted images, and addresses the fact that many photos deleted on Facebook still appear on Facebook's servers. The article provides a link to an Ars Technica article on this topic.

The article states, "As author (from Ars Technica) Jacqui Cheng notes, “There were plenty of stories in between as well, and panicked Facebook users continue to e-mail me, asking if we have heard of any new way to ensure that their deleted photos are, well, deleted. For example, one reader linked me to a photo that a friend of his had posted of his toddler crawling naked on the lawn. He asked his friend to take it down for obvious reasons, and so the friend did—in May of 2008. As of this writing in 2012, I have personally confirmed that the photo is still online, as are several others that readers linked me to that were deleted at various points in 2009 and 2010.” However, she noted that Facebook did delete her pictures after she did a story in 2010."

In addition, the article provides several links to other items illustrating examples of Facebook photos that were deleted by the users who posted them, but that still exist in Facebook's internal environment.

Da Silva Moore Fury


http://ow.ly/9VRj9


An article by Monica Bay on the EDD Update blog.

This article discusses the recent decision by U.S. Magistrate Judge Andrew J. Peck, and looks at some of the reaction to the plaintiffs' objections to the court order seeking a protocol for the use of predictive coding technology.

The article quotes various reactions, and provides links to other commentaries about the case.  One of the experts quoted is Craig Ball, Esq., who stated, "Where predictive coding is concerned, the devil is very much in the details. But their pairing of valid concerns with a sleazy personal attack on the judge has to be one of the dumbest moves to come down the e-discovery pike since Creative Pipe named its line of stolen garbage can designs FUVISTA (for F*** yoU VIctor STAnley). It puts the duh in Da Silva Moore."

Tuesday, March 27, 2012

Training of Predictive Coding Systems Fosters Debate



http://ow.ly/9U5eC

An article by Evan Koblentz posted on law.com on the LTN webpage.

This article discusses predictive coding technology, and other similar technology-assisted review processes.

The article states, "At issue is the method for measuring accuracy of the software's output. But a secondary debate may have wider long-term impact -- what's the best method for training predictive coding systems in the first place?

Training is the process of teaching the software, for every document review, which pieces of electronically stored information are responsive and which are not. It initially requires human input to determine relevancy, but at a certain point the computer takes over. A predictive coding system's final results can be more accurate than human reviewers, its advocates claim."

The article then examines some of the workflow processes followed by various service providers, and the methods used to provide more accurate results.  The article points out some of the limitations in the process, and the need for accurate input from the human coders who set up the process at the outset of a specific case.
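
To make the training process described above more concrete, here is a minimal sketch in Python. It uses a toy bag-of-words scoring model; the function names, seed documents, and scoring rule are all hypothetical illustrations, not any vendor's actual predictive coding engine.

```python
# Toy illustration of the "training" step: human reviewers label a seed set,
# the machine learns term weights from those labels, then scores unreviewed
# documents. Real systems use far more sophisticated classifiers.
from collections import Counter

def train(seed_set):
    """seed_set: list of (document_text, is_responsive) pairs labeled by reviewers."""
    responsive, non_responsive = Counter(), Counter()
    for text, is_responsive in seed_set:
        (responsive if is_responsive else non_responsive).update(text.lower().split())
    return responsive, non_responsive

def score(model, text):
    """A positive score suggests the document is responsive."""
    responsive, non_responsive = model
    return sum(responsive[w] - non_responsive[w] for w in text.lower().split())

# Hypothetical seed set coded by human reviewers
seed = [
    ("merger pricing agreement draft", True),
    ("pricing terms for the merger", True),
    ("lunch menu for friday", False),
]
model = train(seed)
print(score(model, "draft merger agreement") > 0)  # True  (likely responsive)
print(score(model, "friday lunch plans") > 0)      # False (likely not)
```

In practice the loop is iterative: the reviewers correct the machine's borderline calls, the model is retrained, and the cycle repeats until the output stabilizes.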

P.S.  Not referenced in the article are two important metrics that can be used to measure the effectiveness of any review: precision and recall. For a review to be truly effective, both the precision and recall rates must be considered.

Precision: the extent to which only responsive documents are captured during an attorney review.

Recall: the extent to which all responsive materials are captured in an attorney review.
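
As a concrete illustration of these two definitions (a generic sketch with hypothetical document counts, not tied to any particular review platform):

```python
# Precision and recall for a document review, per the definitions above.
def precision_recall(retrieved, responsive):
    """retrieved: set of doc IDs the review captured;
    responsive: set of truly responsive doc IDs in the collection."""
    true_positives = len(retrieved & responsive)
    precision = true_positives / len(retrieved) if retrieved else 0.0
    recall = true_positives / len(responsive) if responsive else 0.0
    return precision, recall

# Hypothetical review: 80 documents captured, of which 60 are truly
# responsive, out of 100 responsive documents in the whole collection.
retrieved = set(range(80))        # docs 0-79 captured by the review
responsive = set(range(20, 120))  # docs 20-119 are truly responsive
p, r = precision_recall(retrieved, responsive)
print(p, r)  # 0.75 0.6
```

Note the trade-off: a review can reach high precision by capturing very little (and missing responsive documents), or high recall by capturing nearly everything (and burying reviewers in non-responsive material); an effective review scores well on both.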

FTI Surveys In-House Counsel on Streamlining EDD



http://ow.ly/9U4jM

A blog post by Albert Barsocchini, Esq., appearing on the EDDupdate.com blog.

This item provides a link to a recent survey conducted by FTI; you must submit information on the site to obtain FTI's article regarding the survey's results.

The survey is entitled "Advice From Counsel: an Inside Look at Streamlining E-Discovery Programs."  Mr. Barsocchini's blog post lists 8 findings from the survey, including the increasing importance of "information governance," an approach that addresses eDiscovery concerns in the broader context of records management.

Monday, March 26, 2012

Peck hawked predictive coding out-of-court and had improper contacts with lawyer, plaintiffs allege



http://ow.ly/9T7d4

An article by ACEDS staff posted on the aceds.org website.


This article discusses the plaintiffs' objections to the order by U.S. Magistrate Judge Andrew J. Peck, in which he sought a protocol for the use of predictive coding in the Da Silva Moore v. Publicis Groupe case.  The article states that a portion of the plaintiffs' objections to the ruling focused on allegedly improper ex parte communications that took place between Ralph Losey, Esq., e-Discovery advisor for the defendants in that matter, and Judge Peck.

The article states, "Losey, a frequent speaker at e-discovery conferences and seminars, has been the defendants’ e-discovery advisor in the case since at least June 2011, when the parties held their first discovery telephone call.

The plaintiffs say that while the case has been in progress, Peck appeared jointly with Losey on January 18 in Charlotte, North Carolina, at an event called “E-Discovery Judges in Charlotte,” which was sponsored by e-discovery vendor Nova Office Strategies. Federal magistrate Judges John Facciola and Paul Grimm also participated there. “Predictive coding was discussed at length,” the plaintiffs assert in their court filing.

Peck and Losey, along with lawyer Maura Grossman, of Wachtell, Lipton, Rosen & Katz, in New York, appeared together in a January 30, 2012, panel at the annual LegalTech Conference in New York. The plaintiffs say Peck and Losey promoted predictive coding during the panel, “Man v. Machine: The Promise/Challenge of Predictive Coding and Other Disruptive Technologies,” which Symantec sponsored.

The next day, on January 31, Peck moderated a panel at LegalTech, “Hot Topics in E-Discovery: Point/Counterpoint Discussion with Craig Ball and Ralph Losey.” Losey endorsed predictive coding during this panel, according to the plaintiffs’ court filing. BIA sponsored the panel.

The same day, Peck spoke on another LegalTech panel, “Judicial Perspectives on Technology-Assisted Review,” in which he promoted predictive coding and noted that unnamed defendants in a case before him “must have thought [they] died and went to heaven.” Losey did not participate in this panel..."

P.S.  Joe Bartolo, one of the writers of this blog, was a participant in the referenced event in Charlotte.  Although predictive coding was certainly discussed, the issue here is whether Mr. Losey ever discussed the specific case with Judge Peck, and Mr. Losey denies doing so.  Mr. Losey is a published author and recognized expert in the area of eDiscovery.  Hence it seems Mr. Losey is certainly within his rights to participate in educational forums pertaining to issues such as predictive coding, including those in which judges also participate.

In the cloud, your data can get caught up in legal actions



http://ow.ly/9SHjM

An article by Thomas J. Trappler posted on the infoworld.com website.

This article discusses the use of cloud computing and certain possible pitfalls associated with utilizing a cloud service provider.

The article states, "With cloud computing, data from multiple customers is typically commingled on the same servers. That means that legal action taken against another customer that is completely unrelated to your business could have a ripple effect. Your data could become unavailable to you just because it was being stored on the same server as data belonging to someone else that was subject to some legal action. For example, a search warrant issued for the data of another customer could result in your data being seized as well."

P.S. The use of the cloud will continue to grow, but it is important to understand all of the risks associated with such use.  Certain corporations may need to have special environments created by their cloud service providers in order to avoid the commingling of data within cloud-based servers.

Friday, March 23, 2012

Should the ‘Daubert’ Standard Apply to Predictive Coding? We May Know Soon



http://ow.ly/9QiMZ

An article by Bob Ambrogi posted on the Catalyst eDiscovery Search Blog.

This article discusses the recent case Da Silva Moore v. Publicis Groupe, in which U.S. Magistrate Judge Andrew J. Peck had issued an order seeking a protocol from the parties for the use of predictive coding technology to be used during the attorney review phase of the litigation. U.S. District Judge Andrew L. Carter Jr. will weigh in on the issue, since on March 13th he entered an order granting plaintiffs’ request to submit additional briefing on their objections to Judge Peck’s order.

The article states, "...in the course of that opinion, Judge Peck made another significant ruling. He concluded that Federal Rule of Evidence 702 and the Supreme Court’s decision in Daubert v. Merrell Dow Pharmaceuticals do not apply to a court’s acceptance of a predictive-coding protocol.

Rule 702 and Daubert give trial judges the responsibility to act as “gatekeepers” to exclude unreliable scientific and technical expert testimony. Judge Peck reasoned that these did not apply to the Da Silva Moore case because no one was trying to put anything into evidence."

The article further states that Judge Carter will likely issue a ruling providing further clarification on this matter, "It seems unavoidable that any ruling he issues will address the core issue of the appropriateness of computer-assisted review, at least in this case. Most likely, he will also have to address this secondary issue of the applicability of Daubert. If he does, in fact, squarely address these issues–and regardless of whether he agrees with Judge Peck–his ruling will be yet another milestone for predictive coding."

Catalyst, CaseCentral Among EDD Vendors Targeted by 'Patent Troll'



http://ow.ly/9Q3oO

An article by Evan Koblentz posted on law.com on the LTN webpage.

This article discusses recent patent suits filed against eDiscovery service providers Catalyst and CaseCentral, among others, by an entity named Lone Star Document Management.

The article states, "Notable defendants include CaseCentral, which is currently being acquired by Guidance Software; Catalyst Repository Systems; Digital Reef; Gallivan, Gallivan & O'Melia; and Trial Solutions of Texas, which functions under the Cloud Nine Discovery banner. Other e-discovery defendants are Atalasoft, Breeze, Business Intelligence Associates, and Lexbe.

The U.S. patent is 6,918,082, Electronic Document Proofing System, filed in 1998 and granted in 2005. Its inventors are Jeffrey M. Gross and Matthew H. Parker, both of Brooklyn, New York. Neither could be reached for comment. A person at Lone Star's registered business address in Plano, Texas, said the location is a series of subcontracted suites and that Lone Star does not rent one, but rather has its mail forwarded to Gross. Attorneys at Dallas-based McDole, Kennedy & Williams, which represents Lone Star, did not return messages.

John Tredennick, CEO of Catalyst, said he intends to fight the suit -- but he recognizes that settling could be done for less than $100,000, while fighting could cost millions.

"We're studying it. I have not been served yet. So obviously we have to take a hard look at the patent and the complaint. But it looks to me like a case of patent trolls," he said, in Denver."

Flattened By Race Tires: The Third Circuit Limits What Types of E-Discovery Costs Are Recoverable by a Prevailing Party



http://ow.ly/9PREp

An article by Mike Zabel posted on the ediscoverylawreview blog of Cozen O'Connor.

This article discusses the recent case of Race Tires America, Inc. v. Hoosier Racing Tire Corp. et al., in which the U.S. Court of Appeals for the Third Circuit limited the recovery of eDiscovery costs that were awarded by the lower court.

The article states, "The Third Circuit reversed that District Court’s decision in Race Tires, as the appellate court opted for a limited, rather than expansive reading of § 1920(4). The court identified only two e-discovery costs that were recoverable in the case: 1) the conversion of native files to an ESI format which had been agreed upon by the parties, and 2) the scanning of physical documents to create digital duplicates. As a result, the court reduced the defendants’ award of costs to just over $30,000.

In the opinion, Judge Vanaskie emphasized the historical purpose of § 1920 and its statutory predecessors, and the “‘American rule’ against shifting the expense of litigation to the losing party.” The court cited Supreme Court precedent for the principle that § 1920 was intended to provide “rigid controls on cost-shifting in federal courts” and thus that the statute “defines the full extent of a federal court’s power to shift litigation costs absent express statutory authority.”

The article goes on to further explain, "Prior to the ESI era, Judge Vanaskie noted, there could also be a lengthy process involved in producing copies for discovery which included collecting, processing, and reviewing paper files for relevancy and privilege, and the costs of those activities were never taxable under the statute. Similarly, the court reasoned, the costs of “gathering, preserving, processing, searching, culling and extracting ESI” may be necessary expenses leading up to the production of ESI, but they cannot be considered the costs of “making copies.”

Several components of e-discovery do qualify as “making copies,” according to the Third Circuit. The court expressly approved of scanning paper documents into electronic form and transferring VHS tapes to DVD as taxable costs. Additionally, because the parties in Race Tires had agreed to produce ESI in TIFF format, the court allowed the defendants to recover the costs of converting non-TIFF electronic files into TIFF format. These recoverable costs represented roughly $30,000 worth of the defendants’ e-discovery bill, which totaled more than $367,000."

Thursday, March 22, 2012

689 Published Cases Involving Social Media Evidence (With Full Case Listing)



http://ow.ly/9OLvP

An article by John Patzakis posted on the x1Discovery.com blog.

This article provides information regarding 689 published cases in 2010 and 2011 that discuss evidence obtained from social media networks.  The article links to a spreadsheet with information regarding each case, along with links to the case opinions that are available.  This list can certainly serve as a useful resource.

A snapshot of the initial portion of the spreadsheet is provided below:

Update – Plaintiffs Attack Judge Peck’s Da Silva Moore Predictive Coding Order Again



http://ow.ly/9OmOL

An article by Brandon D. Hollinder on the ediscoverynewssource.blogspot.com blog.

This article discusses the ruling by U.S. Magistrate Judge Andrew J. Peck seeking a protocol for the use of predictive coding.

The article notes that the plaintiffs are objecting to Judge Peck's order.  It states, "Asking “that the court reject MSL’s use of predictive coding and require the parties come up with a new ESI Protocol,” plaintiffs warn that “Judge Peck sets a dangerous precedent that is likely to deter future litigants from even considering predictive coding, lest they be bound by a protocol that contains no measure of reliability.” Obviously, counsel is trying to persuade the court here (which I can certainly appreciate), but I strongly disagree with this point. As I recently discussed in a blog titled: You Cannot Unring a Bell – Judge Peck’s Da Silva Moore Opinion Will Continue to Be Influential Despite Objection (http://ediscoverynewssource.blogspot.com/2012/03/you-cannot-unring-bell-judge-pecks-da.html), regardless of the outcome of this particular objection, predictive coding will continue to be a hot topic, and litigants will use it to the extent it makes fiscal sense and produces reasonable results.
Interestingly, plaintiffs cite Kleen Prods., LLC v. Packaging Corp. of Am., No. 10 C 5711 (N.D. Ill) in support of their arguments. Kleen is the case where plaintiffs have asked Magistrate Judge Nan R. Nolan to order defendants to use predictive coding. Plaintiffs in the Da Silva Moore matter hold Judge Nolan’s decision to require full briefing, expert reports, and evidentiary hearing on the use of predictive coding in high regard when contrasted with Judge Peck’s relatively quick process and decision."

The article further states, "The outcome of the Da Silva Moore predictive coding dispute is now squarely in the hands of Judge Andrew Carter..."  Stay tuned, everyone.

Managing Information Risk and Archiving Social Media



http://ow.ly/9Oki0

An article by Ben Kerschberg posted on the Forbes.com website.

This article discusses the implications of social media use by employees, and the potential legal risk it creates for corporations.

The article states, "Social media has changed the face of business. Whether in product marketing, consumer branding, customer relations, and/or human resources, the benefits of corporate social media are beyond dispute. Yet mounting evidence shows that the risks are, too. Last week, Symantec released the results of an independent survey of 2,000 global enterprises across a variety of industries with a minimum of 1,000 employees. (Symantec confirmed that “[t]he respondents do not represent any kind of grouping of former or current Symantec customers.”) The survey results speak to the heterogeneous nature of the types of electronically stored information (“ESI”) stored during legal proceedings. See Evan Koblentz, Symantec: Files, Databases Overtake Email in E-Discovery, Law Technology News (Sept. 19, 2011). As part of the survey, respondents were asked the following question:

How frequently are the following documents requested in conjunction with a legal, compliance, or regulatory request for [ESI]?

Forty one percent (41%) indicated social media. To put that figure in perspective, consider that email, that ubiquitous element of our daily work lives, was indicated only 58% of the time. (Multiple answers were allowed.) See Information Retention and eDiscovery Survey, Global Findings (Symantec 2011)."

The article further states, "Gartner Group predicts that by the end of 2013, half of all corporate litigants “will be asked to produce material from social media websites for e-discovery.” Symantec’s empirical data suggests that the 50% mark will be reached far sooner."

The article goes on to discuss corporate policies that should be in place to regulate the use of social media by employees, in order to be in position to react when potential abuses arise.

Wednesday, March 21, 2012

What's Hot in E-Discovery?



http://ow.ly/9MTeH

An article by Sharon Nelson and John W. Simek posted on the slaw.ca website.

This article explores several "hot" topics in eDiscovery and provides narrative around each referenced topic.

The topics addressed include:


  • Machine Assisted Review;
  • Costs of Hosted Review;
  • Smartphones;
  • Social Media; and
  • Preserving Social Media and Website Evidence

P.S.  The authors of this blog would also add mobile devices to the discussion in the smartphone category.  In addition, another hot topic that merits discussion would be the prevailing party seeking cost recovery for eDiscovery expenses.


Social Media's Role in Police Investigations Is Growing



http://ow.ly/9MR18

An article by Roger Yu posted on the CIO-Today.com website.

This article discusses the growing use of social media networks as a tool for gathering evidence in criminal investigations. The article also discusses issues related to privacy rights associated with social media use.

The article states, "...The issue of properly handling social-media content is also igniting heated debates about privacy and the limits of the current law that spells out how police can legally retrieve personal data. Adding to the confusion is the reality that rules and logistics for obtaining private information are still not firmly established for many police departments. Fewer than half of all law enforcement agencies, 48.6%, have a social-media policy, according to a survey by the International Association of Chiefs of Police."

Tuesday, March 20, 2012

eDiscovery Case Law: Not So Fast On eDiscovery Cost Reimbursement



http://ow.ly/9LmAH

An article by Doug Austin posted on the eDiscovery Daily Blog.

This article discusses attempts by prevailing parties to recover eDiscovery expenses as part of their reimbursable litigation costs.

The article notes that a recent Southern District of New York decision awarding the prevailing party recovery of eDiscovery costs may be reversed.  The article goes on to state, "One of the emerging trends for 2011 was the growing number of cases where the prevailing party was awarded reimbursement of eDiscovery costs, including this case and this case. Another case of eDiscovery cost reimbursement reported in this blog was Race Tires Amer., Inc. v. Hoosier Racing Tire, Corp., No. 2:07-cv-1294, 2011 WL 1748620 (W.D. Pa. May 6, 2011), where U.S. District Judge Terrence F. McVerry in Pittsburgh ruled that the winning defendants in an antitrust case were entitled to reimbursement of more than $367,000 in eDiscovery costs."

The article goes on to provide a quote from the court opinion that sheds more light on this matter, "We conclude that of the numerous services the vendors performed, only the scanning of hard copy documents, the conversion of native files to TIFF, and the transfer of VHS tapes to DVD involved copying, and that the costs attributable to only those activities are recoverable under § 1920(4)’s allowance for the costs of making copies of any materials. Those costs total $30,370.42. We find that none of the charges imposed by DMS’s vendor are taxable, and that the award in favor of Hoosier should be reduced by $95,210.13, the difference between the electronic discovery vendors’ charges awarded by the District Court ($125,580.55) and the charges of Hoosier’s electronic discovery vendors we find taxable ($30,370.42).” {emphasis added}"

Can Technology-Assisted Review Co-Exist With Strategic Search?



http://ow.ly/9QmSb

An article by the editor of Metropolitan Corporate Counsel, consisting of an interview with Amanda Jones, senior search consultant at Xerox.

This article discusses the use of advanced techniques such as predictive coding during the attorney review phase of litigation.

The article states, "We don’t believe that technology-assisted review is a substitute for predecessor technologies and methodologies. Rather, it represents a new alternative that is appropriate for some use cases and most often can be complementary to other established techniques, such as keyword search. Discarding search entirely in favor of technology-assisted review would unnecessarily forfeit the many benefits search has to offer.

The goal in any e-discovery review is to identify as many responsive documents as possible, while reviewing as few non-responsive documents as possible, at a cost proportionate to the value of the case. Review efficacy is measured by the information retrieval metrics known as recall and precision. Recall represents the extent to which all responsive materials are captured in a review. Precision represents the extent to which only responsive documents are captured. Thus, recall is a measure of completeness, while precision is a measure of accuracy. While perfect recall and precision are impossible to attain, a strategic review will strive to attain high scores on both metrics simultaneously in order to ensure that clients maximize the return on their review investment."

Monday, March 19, 2012

New Methods for Legal Search and Review



http://ow.ly/9JQbB

An article by Ralph Losey, Esq. on his e-Discovery Team® blog.


This article discusses many developments in the legal profession that are related to technological advancements used to enhance the quality and speed of the attorney review process. 


The article provides references to many cases and other articles that shed light on changes in the standard used to gauge the effectiveness of the attorney review process.


The article states, "The profession is beginning to understand that in today’s world of Too Much Information, the production of all relevant information is a practical impossibility. See eg. Rowe Entm’t, Inc. v. William Morris Agency, Inc., 205 F.R.D. 421, 423 (S.D.N.Y. 2002)".

The article further states, "Instead, the goal should be production of as many highly relevant documents as is proportionate to the value and significance of the case. Rule 26(b)(2)(c) Federal Rules of Civil Procedure; Sedona Conference Commentary on Proportionality (2010); The Sedona Principles (2007), 2nd Principle:

When balancing the cost, burden, and need for electronically stored information, courts and parties should apply the proportionality standard embodied in Fed. R. Civ. P. 26(b)(2)(C) and its state equivalents, which require consideration of the technological feasibility and realistic costs of preserving, retrieving, reviewing, and producing electronically stored information, as well as the nature of the litigation and the amount in controversy."

Friday, March 16, 2012

FTC Looks to Link Do-Not-Track, Big Data Privacy Concerns; Seeks Solutions



http://ow.ly/9HlZF

An article by Boris Segalis and Nihar Shah posted on the infolawgroup.com website.

This article discusses the definitions of "Do Not Track" (DNT) as well as "Big Data".  The article also discusses government attempts to regulate DNT and Big Data.

The article states, "Big Data is poised to expand. The advent of the Smart Grid (which includes smart meters, smart appliances, electric/hybrid car charging stations and other elements of the utility infrastructure) will enable the collection of ever more precise and powerful information about consumer behavior. Again, the Smart Grid has the potential to boost the U.S. economy, but as the consumer information flows into Big Data, regulators will want the industry to play by the rules.

While Big Data is in flux, there are things data companies can do: understand how the company processes data, contractual and legal limitations on the data processing, best practices (including those gleaned from FTC guidance and White House and FTC reports) and enforcement risks, and implement privacy controls that are consistent with the organization’s business needs and risk comfort levels. We know that the departure point for FTC’s enforcement is privacy violations that the Commission perceives to be egregious. This should give some comfort to Big Data companies that strive to process personal data in a fair and transparent manner that they would not be the first door on which the FTC knocks.

Finally, while the DNT debate is raging, companies have at their disposal many existing options to be proactive in ensuring that their online privacy practices are fair and transparent in the eyes of regulators and consumer advocacy groups (e.g., BBB and NAI advertising opt-out programs, website analytics opt-outs and other tools). However the debate on DNT ultimately settles, companies can use these tools today to demonstrate their commitment to respecting consumers’ privacy choices."

5 Big Security Mistakes You're Probably Making



http://ow.ly/9Hbsl

An article by Roger Grimes posted on infoworld.com

The article discusses common security mistakes and provides discussion around the following five topics:


Security mistake No. 1: Assuming that patching is good enough
Security mistake No. 2: Failing to understand what apps are running
Security mistake No. 3: Overlooking the anomalies
Security mistake No. 4: Neglecting to ride herd on password policy
Security mistake No. 5: Failing to educate users about the latest threats


Thursday, March 15, 2012

Will Recent Court Approval of Computer-Assisted Document Review Spur Acceptance in Antitrust Investigations?



http://ow.ly/9GkcF

An article by Craig A. Waldman, Ryan C. Thomas, and Carmen G. McLean posted on the Jones Day website.

The article discusses the recent case in which U.S. Magistrate Judge Andrew J. Peck requested a protocol from the parties to use predictive coding technology for the attorney review. The article discusses the impact that this decision might have on antitrust cases.

The article states, "Despite the increasing burden of e-discovery, private litigants and parties before the U.S. antitrust agencies have been cautious about embracing new e-discovery technologies to assist in identifying what documents are responsive to discovery or government requests. The reasoning is simple: concern that the software will miss documents that are critical to the case. This skepticism now faces a growing body of evidence demonstrating that the historic approach – "linear" document-by-document review, perhaps aided by the use of keyword searches – is no better, and likely is less accurate, than computer-assisted review. A leading federal court has now endorsed this empirical evidence. Last month in Da Silva Moore v. Publicis Groupe, Magistrate Judge Andrew Peck of the Southern District of New York issued an important opinion involving the use of computer-assisted "predictive coding" in electronic document productions."

In addition the article further mentions, "Nevertheless, judicial acceptance of predictive coding is a first step along the path towards FTC and DOJ, not to mention courts and agencies in other contexts, accepting these and other software tools as valuable tools for e-discovery.

The FTC already has acknowledged the importance of these new technologies. In January 2012, the FTC published proposed revisions to the Commission's Rules (open for public comment until March 23). One area for revision is how to address electronic discovery. Citing the widespread use of electronic materials and the need to improve the efficiency of its investigations, the FTC stated, "Document discovery today is markedly different than it was only a decade ago . . . . Searches, identification, and collection all require special skills and, if done properly, may utilize one or more search tools such as advanced key word searches, Boolean connectors, Bayesian logic, concept searches, predictive coding, and other advanced analytics." The FTC also proposed additional "meet and confer" obligations, which could lead to transparency like that cited in Da Silva Moore. The DOJ has not ventured into this area publicly."

Technology-Assisted Review: What Should We Call This Market?



http://ow.ly/9FRfO

An article by Barry Murphy posted on the eDiscovery Journal website.

This article discusses Technology-Assisted Review (TAR), a technology also commonly referred to as predictive coding, and examines the question of what to call this emerging market.

The article asks readers to vote on what to call the new technology used during the attorney review process. The article states, "Jason Velasco had offered up PC-TAR as the term, while others have called it meaning-based coding, adaptive coding, predictive priority, transparent predictive coding, or automated document review. The bottom line is that, in the context of today’s advanced technological world, TAR is about using a combination of technology and people to actually speed, improve, and sometimes automate elements of the legal review process in such a way as to reduce costs and improve quality.

eDJ is working on a market overview report about technology-assisted review and we want your help in naming the category."

P.S.  Feel free to cast your vote by looking at the article and clicking on your preferred term.

The Future of the Review Attorney



http://ow.ly/9FPWA

An article by Albert Barsocchini, Esq., posted on the EDD Update website on the Corporate Counsel webpage.

The article discusses the future of the attorney review process, and looks at how technology provided through outside service providers is impacting law firms.

The article states, "What will become of the current field of attorneys that have become dependent on review projects to sustain their legal careers while they await more permanent employment? Will large firms continue to sustain massive contract review staffs, or will review specialists develop niche areas of skills, such as securities, or patent law?

Hard to predict, but events certainly point toward a contraction in the legal industry, with fewer jobs available to those who don't secure law firm employment upon graduation.

A problematic part of attorney review is the disparity between the function vendors provide for clients and the educational requirement expected of those who participate in the process, i.e. nobody went to law school to review documents on a computer all day."

Wednesday, March 14, 2012

Machine Learning For Document Review: The Numbers Don’t Lie



http://ow.ly/9Ehbj

An article by James Shook, Esq. posted on the Kazeon.com website.

This article discusses predictive coding technology, examines the limits of its use, and looks at two important recent cases.

The article states, "Perhaps more important, according to recent studies the predictive coding process is also more effective than human or keyword review. Unfortunately, it is difficult to determine the true accuracy of human review because opinions, even among experts, can vary on whether a document is relevant to a case. (Maura Grossman & Gordon Cormack, Technology-Assisted Review in E-Discovery Can Be More Effective and More Efficient Than Exhaustive Manual Review, XVII Rich. J.L. & Tech. 11 (2011) at 9). But the bulk of available information implies that machine coding is better. In fact, some studies put a human reviewer’s recall – the percentage of relevant documents actually located – at less than 50%. The use of basic keywords is even worse, dropping recall to about 25%. (Grossman/Cormack at 18-19). Some predictive coding studies indicate that the process is far more accurate, in the range of 70% recall (Grossman/Cormack at 36-37). Given the lower cost and speed, recall that’s even close to the human level should be acceptable. (Note that other measures, such as precision and F1 (the harmonic mean of recall and precision) – are also important in this process. For more information see Grossman/Cormack at 9)."  A link to the referenced article is provided by the author.
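The measures cited above are simple ratios, and a short sketch makes them concrete. The document counts below are hypothetical, chosen only to mirror the rough recall percentages the article quotes:

```python
def recall(found_relevant, total_relevant):
    """Share of all truly relevant documents that the review located."""
    return found_relevant / total_relevant

def precision(found_relevant, total_retrieved):
    """Share of the retrieved documents that were truly relevant."""
    return found_relevant / total_retrieved

def f1(p, r):
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

# A hypothetical collection containing 1,000 truly relevant documents.
# The counts loosely track the article's figures: human review ~50%
# recall, keyword search ~25%, predictive coding ~70%.
for method, found in [("human review", 500), ("keyword search", 250),
                      ("predictive coding", 700)]:
    print(f"{method}: recall = {recall(found, 1000):.0%}")
```

Note that a review could achieve perfect recall with terrible precision simply by retrieving everything, which is why F1, as the harmonic mean, penalizes an imbalance between the two measures.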

In addition, the article looks at two recent opinions: Magistrate Judge Andrew Peck's decision in Da Silva Moore v. Publicis, where the parties agreed on the use of predictive coding and the court sought a protocol from them, and Kleen Products LLC v. Packaging Corp. of America, in which Magistrate Judge Nan Nolan is being asked to compel the use of predictive coding over one party's objection.

Managing the Law Firm Risk Role in Outsourcing




http://ow.ly/9Egl6

An article posted on the 3 Geeks and a Law Blog website.

This article discusses the role of the law firm in managing a project which involves outsourced services.

The article discusses the need for the law firm to be involved in the management of services being outsourced to service providers.  The article states, "This 'supervision' challenge already exists with many third party providers, such as e-discovery vendors. In those circumstances, smart firms are always evaluating the providers, making sure their processes are adequate. LPOs are really on a higher plane in this regard, as their services tread much deeper into practicing law. With services like “contract drafting” and patent preparation, the evaluation and oversight of an LPO by a firm should be much deeper and hands-on."

Tuesday, March 13, 2012

Lessons Learned From Predictive Coding in 'Da Silva Moore'



http://ow.ly/9CKmS

An article by Rebecca N. Shwayri posted on law.com on the LTN webpage.

This article looks at the recent case of  Monique Da Silva Moore v. Publicis Groupe & MSL Group, Case No. 11-cv-01279 (S.D.N.Y. Feb. 24, 2012) in which U.S. Magistrate Judge Andrew J. Peck requested a protocol for the use of predictive coding technology.  A link to the court order is provided in the article.

The article states, "Predictive coding has many benefits -- it is cost-effective and can cut review time down to a few weeks. In contrast, manual review and keyword searches can cost up to $8.50 per document. In document-intensive cases, the costs can add up to millions of dollars and take many months to complete. Even more troubling, keyword searches can leave up to 80 percent of relevant ESI undiscovered. Predictive coding reduces the number of documents that need to be manually reviewed, which results in a significant reduction of e-discovery costs.

Unlike other search methods that are used throughout the legal industry, like keyword searches, predictive coding uses mathematical formulas that are derived from document coding choices made by experts, who are usually senior attorneys working on the case. The expert codes random sets of documents taken from the corpus of e-discovery and specifies the relevancy of the documents. The computer learns how the lawyer codes the documents and develops a formula for "relevancy." The formula is then applied to the entire document collection to locate all relevant documents in the case."  Footnotes are provided providing additional source materials.
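The seed-set workflow described above can be sketched in miniature. This is a deliberately crude, self-contained illustration: simple token weighting stands in for the proprietary statistical models that actual predictive-coding tools use, and all documents and labels are invented:

```python
from collections import Counter

def train_relevance_model(seed_docs, seed_labels):
    """Learn token weights from an expert-coded seed set: tokens seen in
    relevant documents gain weight, tokens seen in non-relevant documents
    lose weight."""
    weights = Counter()
    for doc, label in zip(seed_docs, seed_labels):
        for token in set(doc.lower().split()):
            weights[token] += 1 if label else -1
    return weights

def relevance_score(weights, doc):
    """Apply the learned 'relevancy formula' to an unreviewed document."""
    tokens = doc.lower().split()
    return sum(weights[t] for t in tokens) / max(len(tokens), 1)

# Seed set: a random sample the senior attorney has already coded
# (1 = relevant to the case, 0 = not relevant).
seed_docs = [
    "draft merger agreement between the parties",
    "pricing analysis for the proposed acquisition",
    "holiday party menu and schedule",
    "fantasy football league standings",
]
seed_labels = [1, 1, 0, 0]

weights = train_relevance_model(seed_docs, seed_labels)

# The formula is then applied to the rest of the collection.
corpus = [
    "revised merger pricing terms",
    "updated holiday menu ideas",
]
for doc in corpus:
    print(f"{relevance_score(weights, doc):+.2f}  {doc}")
```

In real tools, a human then reviews randomly selected sample sets of the machine-coded results to validate the model, as the article describes.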

The article goes on to discuss the importance of this ruling, as well as the rulings likely impact on future litigation.  The article further states, "Several factors motivated the court to endorse the use of predictive coding. In addition to the agreement between the parties, the court highlighted the 3 million documents to be reviewed, the superiority of computer-assisted review to the "available alternatives," the need for cost-effectiveness and proportionality under Rule 26(b)(2)(c), and the transparent process proposed by the defendant, MSL.

Citing the importance of transparency as an aspect of cooperation in the discovery process, the court explained that MSL made the court's decision easier because it agreed to provide plaintiffs counsel with all non-privileged documents and the issue tags coded for each document. The court hoped that parties in the future would utilize such a transparent process.

Finally, the court noted that while predictive coding technology can locate ESI much faster and at a fraction of the cost compared to traditional technologies, the technology is not a case of "machine replacing humans.""

Can These Plaintiffs Put the Keyword Horse Out to Pasture?


http://ow.ly/9CImC

An article by Jon Resnick, Esq. posted on the applieddiscovery.com website.

This article discusses the case of Kleen Products LLC v. Packaging Corp. of America, in which the plaintiffs have objected to the defendant's use of keyword searching, and want the court to compel a content based advanced analytics workflow.

The article states, "The defendants contended that the plaintiffs "provided no legitimate reason that this Court should deviate here from reliable, recognized, and established discovery practices" in favor of their "unproven" CBAA methods. Furthermore, the defendants asserted that they have "tested, independently validated, and implemented a search term methodology that is wholly consistent with the case law around the nation and that more than satisfies the ESI production guidelines endorsed by the Seventh Circuit and the Sedona Conference." According to the defendants' briefing, they produced more than one million pages of documents using their search methods during discovery. Thus, they expressed outrage that the plaintiffs would ask the court to "establish a new and radically different ESI standard for cases in this District."
The defendants cited a publication by The Sedona Conference from 2007, "Best Practices Commentary on the Use of Search & Information Retrieval Methods in E-Discovery," which quoted a 2004 federal district court opinion, saying "by far the most commonly used search methodology today is the use of 'keyword searches.'""

In addition, the article goes on to further state, "Fortunately for the plaintiffs, U.S. Magistrate Judge Andrew Peck’s opinion in Da Silva Moore v. Publicis Groupe came at the right time. Judge Peck's opinion is in harmony with the plaintiffs' proposition that computer-assisted review is superior to the alternatives. The plaintiffs went so far as to analogize the defendants' election of keyword searching to "choosing a horse as a mode of transportation . . . because it is the best available horse, even though technology has evolved and a superior form of transportation—the automobile—is now available."
Even so, the defendants may still carry the day. They rightly claimed that the plaintiffs "fail to show that the sophisticated search protocol used by Defendants is inadequate or that CBAA . . . is the only permissible means of searching electronic documents for potential production in this case." According to the defendants, the plaintiffs want the court "to support the unprecedented conclusion that search terms per se are an unacceptable methodology for locating potentially responsive ESI." Judge Peck's opinion even acknowledged that though keyword searches "are not overly useful," they can be when "well done and tested," and particularly so when added to predictive tagging and other methodology."

P.S. This opinion will be one to watch as it could lead to additional objections to the use of keyword searching, depending on the court's ruling.

Monday, March 12, 2012

Do Company Execs Know Sensitive Data When They See It? Many in IT Say No



http://ow.ly/9BeVY

An article by Roy Harris posted on CFO World.

This article discusses classifying and identification of corporate data that has been collected by IT for various purposes.

The article states, "Today’s companies, clearly very good at collecting data, seem “less savvy when it comes to how to classify and manage it.”

That’s the conclusion of a survey among 100 IT executives and others conducted by global consulting firm Protiviti, which finds that there is “limited or no understanding of the difference between sensitive information and other data” at nearly a quarter of the companies participating in its survey.

The report is titled “The Current State of IT Security and Privacy Policies and Practices." Its topics: how organizations classify and manage the data they accumulate; specifically how they ensure customer privacy when they handle sensitive data, and how they comply with federal and state privacy laws and regulations."  A link to the referenced survey and report are provided in the article.

Saturday, March 10, 2012

E-discovery in the cloud? Not so easy



http://ow.ly/9zABS

An article by Tam Harber posted on the itworld.com website.

This article discusses the impact that cloud computing has on eDiscovery.

The article states, "Many lawyers and IT staff "just assume if they put data in the cloud it's going to be at their fingertips, that it's inherently discoverable," says Barry Murphy, co-founder and principal analyst at eDJ Group Inc., a consulting firm specializing in e-discovery. "That's not necessarily the case."


The cloud has dramatically expanded the number of places where electronically stored information (ESI) can live. Under the Federal Rules of Civil Procedure (pdf), a party to litigation is expected to preserve and be able to produce ESI that is in its "possession, custody or control."

With cloud, those duties are split -- the ESI may not technically be in your possession anymore, and yet it's presumably under your control, says James M. Kunick, principal and chair of intellectual property and technology practice at law firm Much Shelist P.C."

Friday, March 9, 2012

The Importance of People and Process in Electronic Discovery



http://ow.ly/9yurU

An article by Daryl Shetterly posted on the e-Discovery Myth website of the law firm LeClair Ryan.

The article discusses issues that pertain to the planning and workflow process associated with the eDiscovery requirements associated with a specific litigation.

The article states, "More often than not the “eDiscovery problem” is really a communication problem, a planning problem or a failure to get the right people involved until it is too late to complete the task without Herculean effort that is disruptive to the organization. Everyone has their war stories, here are a few of the situations I hear of most often:
  • waiting too long to begin identifying sources of relevant data – resulting in a late start that makes each phase of the project a mad dash to production;
  • not properly preparing for the meet & confer process – resulting in a lost opportunity to fully utilize the cooperative potential of the process;
  • collecting documents too quickly – resulting in multiple collection requests to IT or the expense of using outside resources to re-collect once the scope of collection changes;
  • gross underestimation of the time and money required to complete the project – resulting in budget overruns and missed production deadlines;
  • using the wrong document review technology – resulting in inefficient and costly document review based on an inability to leverage technology properly; and
  • beginning document review too early – resulting in a costly re-review when the issues are clarified."
The article further states, "Both eDiscovery and project management are complex topics that are not mastered by attending a CLE or reading a few books. Quarterbacking the eDiscovery aspect of a case is best done by someone who has been doing so long enough to see the anomalies and to know how easily the budget or timeline can go off the rails."

Developing a comprehensive information management plan to facilitate eDiscovery - part 2



http://ow.ly/9ypQJ

An article by Tim Bovy posted on the inforiskawareness.co.uk website.

This article is part 2 of a series, and provides a link to the initial article in the series as well. The article discusses best practices associated with implementing an information management plan for a business.

The article discusses the fact that many businesses fail to effectively plan how to internally use the resources that they have.  The article states, "In its 2010 survey, for example, Symantec noted that "an information management gap exists between what enterprises realise they should do and what they actually do." Less than 46 percent of the 1,680 senior IT and legal executives that they surveyed in 26 countries had "a formal information plan". The consequences of this inertia account, at least in part, for the unsettling fact that "the average life expectancy of a company in the S&P 500 has dropped precipitously, from 75 years (in 1937) to 15 years in a more recent study." In today's business environment, organisations that do not see IT as the handmaiden of the business are bound to fail."

Thursday, March 8, 2012

Law School Rankings



Article by Vivia Chen

http://thecareerist.typepad.com/thecareerist/2012/03/law-school-ranking-the-international-edition.html


QS Quacquarelli Symonds, which ranks universities around the world, offers a handy chart of the world's top law schools. Here are the top 10, based on academic reputation, employers' views, and citations per faculty member:

1. Harvard University (United States)
2. University of Oxford (United Kingdom)
3. University of Cambridge (United Kingdom)
4. Yale University (United States)
5. Stanford University (United States)
6. University of California, Berkeley (UCB) (United States)
7. London School of Economics and Political Science (United Kingdom)
8. Columbia University (United States)
9. The University of Melbourne (Australia)
10. New York University (NYU) (United States)

N.Y. Appellate Division Continues to Press 'Zubulake' EDD Standard



http://ow.ly/9x2VL

An article by Marshall H. Fishman and Dana L. Post posted on law.com on the LTN webpage.

This article looks at recent New York State Appellate Division decisions that uphold eDiscovery requirements set forth in the well-documented Zubulake litigation.

The article states,

"In U.S. Bank v. GreenPoint Mortgage Funding, Inc., the First Department held that the producing party should bear the initial costs of "searching for, retrieving and producing discovery," but that lower courts may permit cost shifting based on the factors set forth in Zubulake, described below."  A footnote is provided in the article.

In addition, the article goes on to mention the cost shifting factors of the Zubulake case, which were referenced in the GreenPoint matter, "The court did, however, rule that the lower court had the discretion under CPLR Article 31 to order cost shifting between the parties by considering the following seven factors set forth in Zubulake:

(1) [t]he extent to which the request is specifically tailored to discover relevant information;
(2) [t]he availability of such information from other sources;
(3)[t]he total cost of production, compared to the amount in controversy;
(4) [t]he total cost of production, compared to the resources available to each party;
(5) [t]he relative ability of each party to control costs and its incentive to do so;
(6) [t]he importance of the issues at stake in the litigation; and
(7) [t]he relative benefits to the parties of obtaining the information."

Judge Peck’s Predictive Coding Opinion – reporting the reaction



http://ow.ly/9wZmg

An article by Chris Dale posted on his website the e-Disclosure Information Project.

This article discusses reaction to the recent decision by U.S. Magistrate Judge Andrew J. Peck, in which he sought a protocol from the parties to enable the use of predictive coding for the document review.

The article states, "In what is, I think, the only UK article thus far apart from mine, Charles Holloway of Millnet reckons, in an article headed In the jaws of ediscovery, that an English judge would take the same view as a US judge in similar circumstances. There are judges and judges of course, on both sides of the Atlantic, but I think Charles is right. Millnet know whereof they speak in this regard, having been involved in the only UK predictive coding case whose outcome has been written up publicly – see my article Two predictive coding case studies emphasise time and cost savings, which involves a US case involving Epiq Systems as well as Millnet’s UK one."  Links to the other referenced articles are included in Mr. Dale's article.

The article provides comments from several leading eDiscovery experts, and includes links to several other articles about this important topic.

Wednesday, March 7, 2012

The SEC and e-Discovery



http://ow.ly/9vr9L

An article by Wallis Hampton, Elizabeth Russo, and Canaan Himmelbaum posted on the DiscoveryResources.org website.

This article looks at the SEC and the way that they enforce eDiscovery production requests during investigations.

The article states, "Beginning with its 2001 Seaboard Report, the SEC has heavily emphasized the importance of cooperation in investigations while simultaneously underscoring the perils of being perceived as uncooperative. In 2004, Stephen Cutler, the SEC’s Director of Enforcement at the time, noted that the SEC “seeks to recognize, in its charging and sanctioning decisions (and in its decisions not to charge and not to sanction), efforts by companies to police themselves, report problems to the government and establish a solid culture of compliance.” More recently, the SEC has begun utilizing cooperation credits and deferred prosecution agreements as incentives, both of which historically have been limited to criminal prosecutions. If these carrots aren’t sufficient, the SEC also wields a number of powerful sticks, including the threat of obstruction charges and the risk of SEC action against attorneys themselves."  Footnotes are provided throughout the article, and certain specific cases are referenced as examples.

The article goes on to discuss certain SEC requirements, and provides information on several topics:
  • Review Platform. 
  • Scanned Collections. 
  • Metadata. 
  • Custodians. 
  • Preservation of ESI. 
  • Accounting Workpapers. 

Governance Professionals Left in the Dark About e-Discovery Costs




http://ow.ly/9vpBd

An article by Aarti Maharaj posted on the business2community.com website.

This article discusses the costs of eDiscovery, and looks at what corporate officers know about their own employers' eDiscovery expenses.

The article states, "According to a poll taken during this live event, an astounding 62 percent of corporate secretaries and general counsels are unaware of how much a company spends on its e-discovery; 25 percent, however, believe the cost at their own firm is less than $50,000.

‘In a case like this, it depends on the type of lawsuit companies are dealing with,’ adds Barnard. ‘Certainly, most companies have become savvy about having a good document retention program in place and e-discovery cases are going to increase in the next few years.’

When dealing with e-discovery cases, 41 percent of corporate officers feel that the biggest challenge is collecting and producing responsive documents that will be useful in litigation. Conducting relevant keyword document searches, determining which documents to retain for future e-discovery cases and limiting costs all tied at 16 percent."  A link to information from a webinar about this topic is also provided in the article. The article also references a recent model order in the Federal courts aimed at limiting the costs and scope of eDiscovery in certain types of litigation.

Tuesday, March 6, 2012

USING ALGORITHMS TO ADVANCE ACCURACY: A METADATA METHOD TO MARCH MADNESS


http://ow.ly/9tXgl

An article by Sekou Campbell, Esq. posted on the sportslaw.foxrothschild.com website.

This article discusses predictive coding technology, and looks at a similar algorithm based technology that is being used to rank sports teams.

The article states, "Southern District of New York magistrate judge Andrew Peck recently weighed in on the issue in his Opinion and Order, which cites heavily to his article, “Search, Forward: Will Manual Document Review and Keyword Searches be Replaced by Computer-Assisted Coding?” He lauds the process of computer-assisted coding as a way to find relevant documents more accurately and efficiently in high-volume discovery cases. Computer-assisted coding is a search based on a human review of a “seed set” of documents (a small subset of representative documents). Based on that “seed set” review, computer software creates an algorithm that analyzes the contents of the documents selected for data and metadata like time of creation, author, and program type. Once the algorithm is created, the computer can code documents for relevance based on how the “seed set” was searched (the existence of certain data in the context of certain metadata). On the back end, a human can test the relevance of randomly selected sample sets to ensure the algorithm produced an accurate search."  A link to the referenced article is provided by the author.

The article goes on to describe the new method that is being used to rank teams based on their performance, in ways that differ from just looking at the team's final scores.

New York's 1st Department Weighs In on ESI Preservation



http://ow.ly/9tUXX

An article by Mark A. Berman posted on the New York Law Journal, and also on law.com on the LTN webpage.

The article discusses two recent cases, VOOM HD Holdings LLC v. EchoStar Satellite LLC and Scarola Ellis LLP v. Padeh; links to both case opinions are provided in the article.

The cases are recent interpretations of New York State law as it relates to obligations requiring preservation of ESI.  In the Voom case, the suspension of normal document retention practices is at issue; in the Scarola case, an affidavit from an IT professional was required to state that no further responsive information was in the party's possession.

The Voom case was discussed as follows, "In Voom, the defendant had not implemented a litigation hold on ESI until after litigation had actually been commenced, and the "hold" did not suspend defendant's automatic deletion of emails, which automatically and permanently purged, after seven days, any emails sent and deleted by an employee from defendant's computer servers. It was not, however, until four months after the commencement of the lawsuit, and nearly one year after defendant was on notice of anticipated litigation, that defendant suspended the automatic deletion of relevant emails from its servers."

The article goes on to state, "Accordingly, the 1st Department affirmed the motion court's finding that defendant's "conduct, at a minimum, constituted gross negligence" and that "a negative, or adverse inference against [defendant] at trial was an appropriate sanction, rather than striking [defendant's] answer, since other evidence remained available to [plaintiff], including the business records of [defendant] and the testimony of its employees, to prove [plaintiff's] claims.""

In addition, the article mentions the Scarola case, "the court directed defendant to produce an affidavit from a system administrator or other similar computer systems specialist which stated there were no responsive documents and also detailed the search methods used." In response, defendant produced an affidavit from a computer engineer stating that he found no documents on the internal network or servers. The motion court found the affidavit to be insufficient and directed that a supplemental affidavit be provided by the computer engineer:
explaining which computers and system were searched, the date of the search, what kind and type of search or additional searches if necessary were performed, whether a search was made for other types of electronically stored documents other than emails, whether a search was made for deleted content, and what the origins were of the nine emails attached as exhibits in the opposition to the motion for summary judgment."

Monday, March 5, 2012

KMWorld 100 Companies That Matter in Knowledge Management



http://ow.ly/9sqlr

An article by Hugh McKellar published by KM World on the KMWorld.com website.

The article provides a list of 100 companies that provide technology or services aimed at assisting corporations with knowledge management functions.

The article states, "...because we were taking the lead in defining the knowledge management market, we created our first list of 100 Companies That Matter. We assembled a team of judges comprised of colleagues, analysts, system integrators, theorists, practitioners and a few select users to identify companies that were providing true solutions to real and profound problems and challenges.

Everyone involved in the judging process has his or her prejudices, of course, but we set those aside (for the most part) and assembled the list you see here, 100 companies whose products and services best meet the needs of our readership."  See the article for information about the 100 listed companies, and we offer congratulations to those entities.

P.S.   Special congratulations to kCura, Symantec and Stored IQ, which are named in the list of 100 and have all assisted the writers of the Litigation Support Technology and News Blog on past client-specific projects.

Ethics of Electronic Discovery – Part One



http://ow.ly/9sgT3

An article by Ralph Losey on his blog e-DiscoveryTeam®.

The article discusses ethical issues that are related to eDiscovery.  The article refers to a prior article from 2009 by Mr. Losey, "Lawyers Behaving Badly: Understanding Unprofessional Conduct in e-Discovery, 60 Mercer L. Rev. 983 (Spring 2009)".  A link to the referenced article is provided in Mr. Losey's current article.

The article looks at four factors that are vital to attorney ethics, as the topic pertains to eDiscovery, "There are four fundamental forces at work in e-discovery, which when considered together, explain most attorney misconduct:

(1) a general lack of technological sophistication,
(2) over-zealous attorney conduct,
(3) a lack of development of professional duties as an advocate, and
(4) legal incompetence."

The article also provides a discussion of six ABA rules that pertain to eDiscovery ethics:

"To summarize our review of the ABA Model Rules of Professional Conduct, six rules seems especially important to the field of e-discovery:
Rule 1.1 – Competence
Rule 1.3 – Diligence
Rule 1.6 – Confidentiality
Rule 3.2 – Expediting Litigation
Rule 3.3 – Candor Toward the Tribunal
Rule 3.4 – Fairness to Opposing Party and Counsel"


The article also provides various illustrations that shed further light on the author's points, such as the diagram below: