Monday, September 19, 2011
My Thoughts on Electronic Discovery's Per Gigabyte Pricing Model (Bob Krantz)
http://ow.ly/6yu7r
An article by Bob Krantz, posted on the EDD Blog Online website.
This blog post responds to a recent article predicting the possible demise of the per-GB pricing model. The author defends the traditional per-GB pricing model and argues for the logic of the per-GB approach.
As the blog post points out, "Technology is only one piece of the equation; if the technology is not used properly, or if the infrastructure is inadequate and there are not clear cut processes around how the technology will be incorporated any firm is going to be in trouble whether large or small. Technology is simply a tool for delivering results, just because I have a hammer in my hand doesn’t make me a carpenter."
The author goes on to point out, "The gigabyte is one of the few known measures when beginning the discovery process. Charging on the gigabyte enables companies to gain a better understanding of what the projected downstream costs will be. Through education and experience companies can leverage the lower cost filtering and analysis components to manage their costs moving downstream while interacting with their data early and often. This process allows companies to continue expanding the review set by applying what is learned through review back into the overall universe of case data to prioritize more data for review."
In addition, the following statement is made, "...companies will have the ability to evaluate the amount of data being hosted and managed by their service partner as well as the technologies built into their workflow and from there a fixed annual cost can be derived that will assist in sharing risk; but bear in mind whether directly stated or not this fixed cost will in some way involve some unit of measure and that measure will more than likely be the gigabyte."
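To make the argument concrete, the per-GB logic the author describes can be sketched as a simple projection: each discovery stage charges on the gigabytes entering it, and cheaper early stages (filtering, analysis) shrink the volume that reaches the expensive review stage. The stage names, rates, and reduction percentages below are invented for illustration only; they do not come from the article.

```python
# Hypothetical per-GB cost projection. All rates ($/GB) and reduction
# factors are made-up illustrative numbers, not figures from the article.

def project_costs(collected_gb, rates, reductions):
    """Apply each stage's per-GB rate to the volume entering that stage,
    then shrink the volume by the stage's reduction factor before it
    moves downstream. Returns a per-stage breakdown and the total."""
    volume = collected_gb
    total = 0.0
    breakdown = {}
    for stage, rate in rates.items():
        cost = volume * rate
        breakdown[stage] = round(cost, 2)
        total += cost
        # Filtering/analysis stages cull data before the next stage.
        volume *= (1 - reductions.get(stage, 0.0))
    return breakdown, round(total, 2)

# Example: 500 GB collected; early culling cuts what reaches review.
rates = {"processing": 50, "filtering": 25, "hosting": 15, "review": 300}
reductions = {"processing": 0.20, "filtering": 0.60}

breakdown, total = project_costs(500, rates, reductions)
print(breakdown, total)
```

The point the sketch illustrates is the author's: because the gigabyte is a known quantity at the outset, a company can estimate downstream spend up front and see how spending on low-cost culling early reduces the dominant review cost later.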