http://ow.ly/8Qnn2
An article by Ralph Losey posted on the e-Discovery Team® blog.
This article discusses other recent articles on discovery obligations and on methods for measuring the fees associated with the attorney review process.
The article states, "This time I direct you to an important article, Evaluation of Information Retrieval for E-Discovery, Artificial Intelligence and Law, 18(4), 347-386 (2011). It was written by leaders of TREC Legal Track and established giants in the field of legal search: Douglas W. Oard, Jason R. Baron, Bruce Hedin, David D. Lewis, and Stephen Tomlinson. They analyzed the now fully published test results of the experiments in 2008, and carefully examined the interactive task, topic 301, as the best test of competing legal search technologies. This task made use of a subject matter expert and an appeals process for quality control on relevance determinations. Four teams of experts participated in the test, two academic and two commercial. A well known e-discovery vendor won the test (scientists hate it when I put it that way). They won because they attained better precision and recall scores than the three other participants.
Now we come to the punch line, the winning vendor attained a recall rate of only 62%. That’s right, they missed 38% of the relevant documents. And they were the winner. Think about it. The other three participants in the scientific experiment attained recall rates of less than 20%! That’s right, they missed over 80% of the relevant documents."
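For readers unfamiliar with the metric, recall is the fraction of all truly relevant documents that a search actually retrieves, while precision is the fraction of retrieved documents that are actually relevant. A minimal sketch of the arithmetic follows; the document counts used are hypothetical illustrations, not figures from the study.

def recall(relevant_retrieved: int, total_relevant: int) -> float:
    """Fraction of all relevant documents that the search found."""
    return relevant_retrieved / total_relevant

def precision(relevant_retrieved: int, total_retrieved: int) -> float:
    """Fraction of retrieved documents that are actually relevant."""
    return relevant_retrieved / total_retrieved

# Example: if 6,200 of 10,000 relevant documents were found, recall is 0.62,
# i.e., 38% of the relevant documents were missed (the 62% figure is from the
# article; the underlying counts here are made up for illustration).
print(recall(6_200, 10_000))    # 0.62
print(precision(6_200, 8_000))  # 0.775 (retrieved-set size is hypothetical)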
The article provides a link to the referenced paper, as well as links to other useful information on this topic.