RVM Enterprises, Inc. eDiscovery

Blog Posts

How to Manage the Challenges of Change in Technology

A Q&A with RVM CTO Geoffrey Sherman

In our day-to-day lives, technology goes in and out of style every few years, from a PC operating system to the latest media platform. Fortunately, in most of these cases, upgrading to the newest technology can be accomplished with minimal expense and time.

At the enterprise level, however, changes in technology can pose big headaches, from compatibility with existing systems to retention for compliance, security, and, of course, cost.

Many companies will delay the move to new or changed technologies as long as they can, opting instead for patches and workarounds. But at some point, it doesn’t make sense to fight the tide any longer; change is inevitable.

Geoffrey Sherman is RVM’s Chief Technology Officer, responsible for overseeing and deploying information technology products and solutions used by both RVM’s internal workforce and its clients around the world. What did he have to say about managing the challenges of change?

 

What might drive a company to consider upgrading to a new platform or system?

There are a few things at play. First, a company should look at its needs and determine whether they are being met by its current technology solution. An aging system may have a negative impact on work product or be vulnerable to security flaws.

Even if the product still functions perfectly, the company’s business needs may have evolved to the point where they are no longer being met. Or, new products may enter the market that include features not available previously. Those features must then be weighed to see whether they provide enough value to warrant an upgrade.

What are a company’s considerations before transitioning to a new technology platform?

Once a company realizes that its technology is not meeting its needs, the first thing to consider is whether the problem can be addressed with something less impactful, such as a version update, edition change, or workflow change. Failing those solutions, a replacement product or technology may be in the best interest of the organization.

For RVM, a key consideration is the impact that this change will have on our user base. We also need to evaluate if this change will require heavy architectural changes. We are sensitive to whether this would be a full replacement for our current platform, or whether the two would co-exist in parallel.

What are some of the concerns with upgrades in general?

RVM’s goal is to be on the cutting edge of technology, while stopping short of the “bleeding edge.” This means that we approach any upgrade strategically to be sure that the product version change is well tested, often maintaining smaller test environments for major applications. However, even with planning and testing, not every upgrade will go smoothly.

There are ways to combat this. The most basic way is to have a maintenance plan in partnership with the technology vendor to handle minor problems as they arise. In addition to this, we firmly believe in recognizing the circumstances that warrant backing out of a given upgrade and having a mature process to revert to a known good state.

What steps can a company undertake to ensure a smooth transition to a new platform?

If you are reading this, platform change has become inevitable. There are many steps involved in an enterprise-level transition to a new platform. The key is gaining buy-in from the user base and testing early and often in manageable batches. This is crucial at the beginning for two reasons. First, it creates a sense of ownership. Future users that are invested early will apply a more critical eye and be more likely to contribute to the process, ensuring satisfaction with the result. Second, when users have a better grasp of a product and understand how it can meet their needs, there is a much higher rate of user adoption, which in turn improves return on investment.

Testing is just common sense for any technology rollout. We recognize that no solution is going to be perfect out of the box, so it is critical that we conduct rigorous tests to find and fix as many bugs as possible before we roll out the product to our clients.

How do you measure the success of a transition?

Before we attempt any transition, we develop a notion of what the success factors will be, and solicit client feedback afterward. We find that this creates a feedback loop resulting in lessons learned on both sides. Responses may come in the form of contemporaneous feedback (e.g., emails or phone calls about the product) or may be a more formalized approach using surveys. No matter what we hear, we make sure we are learning from the process.

Can you give an example of a successful migration that RVM implemented?

RVM recently transitioned to a SaaS-based email security offering. The transformation was smooth due to meticulous planning and testing, and the user community was pleased with the added functionality, clear documentation, and the fanatical level of care taken to validate that operations went as planned. This was a situation where we performed several days of IT-led training to ensure everyone knew the timeline of events and that their expectations were well set. It was such a smooth transition that afterward I had to walk the floor and chat with users to make sure they were all working as expected, since there were no service desk tickets logged.

What should organizations consider if they have a legacy litigation offering or are facing challenges with the current system?

RVM can reduce the burden of housing legacy systems and even offload some modern ones, allowing our clients to focus on the merits of their cases without worrying about upgrades, testing, and downtime. For example, RVM can migrate cases from Veritas’s Clearwell platform, which typically carries a large footprint of mostly static cases. RVM also specializes in migrating Relativity workspaces to eliminate the burden and upkeep of hosting Relativity in house. Such options offer clients the ability to put contingencies in place for service and support.

eDiscovery in O365 is Easy, But Still Best Left to the Experts

By Sean King, Chief Operations Officer

I admit it. I do my own taxes. I like the control I have in organizing my finances and filling out a ridiculously complex set of forms and fields. Honestly, I do my own taxes because the software that is available makes completing them much easier. But despite the “risk meter” shown by the software, every year after I complete the process and triple-check all of my information, I never feel confident I did it 100 percent correctly. I worry about the potential for an audit and the stiff penalties that accompany a failed audit.

A lot of technology that has rolled out in the last few years takes complex tasks and reduces them to everyday functions. With cloud-based solutions like Office 365, management of a company’s email, along with the legal functions that relate to data management and information governance, is becoming routine. As someone who has spent his career working with and around legal professionals, I wonder whether we realize the potential legal risk that presents.

Recently I moderated an RVM webinar, Office 365 – The Unseen Legal Risks, where we elaborated on some of those risks inherent in the implementation and use of Office 365. There is an expectation of compliance, process, and collaboration that has greatly expanded over the years as new technology, such as predictive coding and technology-assisted review (TAR), has become more acceptable in the mainstream, as noted in court opinions from matters like Da Silva Moore v. Publicis Groupe, 287 F.R.D. 182 (S.D.N.Y. 2012), or Winfield v. City of New York, 2017 U.S. Dist. LEXIS 194413 (S.D.N.Y. Nov. 27, 2017), where Judge Parker directed the City to use TAR instead of linear review. This trend is further complicated by cloud-based systems like Office 365.

As you may be aware, the heavy lifting in completing personal income taxes is the overall questionnaire. During this stage you enter in your family information, where you live, your W2 data, investments, etc. If you read or interpret the question wrong, the best tax calculator in the world won’t be able to help you.

Same thing with Office 365.

In Office 365, you need to create your rules and establish how your data will look. Companies that choose an “out of the box” setup may get a nasty surprise when it comes time to pull data for an investigation. One such example we learned about was how long Microsoft will store data. Your company policy may be to keep emails for one year, but if you’re not aware of your settings, you could be responsible for producing emails going back much farther.
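The gap between a stated retention policy and what a tenant actually preserves can be checked mechanically. A minimal sketch (the one-year policy and all dates are illustrative, not any particular tenant’s settings):

```python
from datetime import date, timedelta

# Hypothetical one-year retention policy; real tenants configure this
# through Office 365 retention settings, which may preserve mail longer.
RETENTION = timedelta(days=365)

def outside_policy(message_date: date, today: date) -> bool:
    """True if a message is older than the stated policy – i.e., mail the
    company may assume is gone but could still be sitting in the tenant,
    and therefore discoverable."""
    return today - message_date > RETENTION

today = date(2018, 6, 1)
print(outside_policy(date(2016, 3, 15), today))  # older than a year: True
print(outside_policy(date(2018, 1, 10), today))  # within the year: False
```

Anything for which this returns True is mail the company’s written policy says is gone but that a default-configured tenant may still produce.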

There are other legal concerns as well with using a product that proudly associates itself with “the cloud,” a term which suggests that the data is in an unknown location. This could present jurisdictional concerns or GDPR compliance issues, as your responsibility for producing data or protecting privacy may hinge on the country or region in which your data is being stored. Just because your data is not in the United States does not protect it from the U.S. courts, and being an American company does not mean that your U.K.-based data is exempt from GDPR compliance.

Another important concern is whether, like me and my taxes, companies are performing tasks that are perhaps best left to certified professionals. Typically, a company that receives a document request or subpoena will engage in a process overseen by a lawyer or outside counsel. With Office 365, though, it becomes easy for a company to bypass much of that process, believing that the risk is low. But is that enough? What if I misinterpret or do not understand the function of search or analytics in O365 and do not get the right results? Will I even know if it is right? Do I know what O365 is NOT giving me, and should? While it may seem easy, it may not be done correctly enough to meet discovery or evidentiary requirements.

RVM has written in the past about self-collection and the risks that it can entail. The logical interface and robust nature of O365 could lead even more companies down a road that we previously described as similar to driving with too little insurance: it may save in the short run, but in the long-term you’ll likely end up paying more.

Finally, Office 365 gives companies the ability to analyze and review documents. As a litigation support professional, I recognize the power and effectiveness of this kind of technology, as have the courts, which have started encouraging the use of analytics during document review. But in the hands of someone lacking the proper training, such a tool becomes highly ineffective, resulting in potentially deficient production that can negatively impact summary judgments. The key question, as we learned from Allan Johnson of Actium LLP, was whether you are able to speak to the results you achieved and the process used to get them. The best way to guarantee that is to ask your IT person or consultant about your O365 environment and to work with an experienced forensics professional familiar with O365.

We as professionals have a requirement and a duty to understand the technology that we use every day. I am concerned by the lack of understanding that companies exhibit about their Office 365 licensing, functionality, setup, and workflows. The courts will not accept ignorance as a legitimate rationalization for failing to meet the standards of legal competence, and most companies cannot afford the fallout from a negative ruling.

Doing your taxes on your own might be one thing; letting anyone do email collection and export might be a level of risk we should not take for granted.

 

Learning from Equifax: Preparing for a Cybersecurity Breach

by Danielle McGregor

 

On March 1, Equifax announced that its 2017 data breach, which had affected over 140 million Americans, had actually affected over 2 million MORE than previously reported. The breach was discovered on July 29, 2017, but wasn’t made known to the public until September 7, over a month later. For that month, information such as driver’s license numbers, birth dates, and social security numbers was exposed without consumers’ knowledge, and as a result Equifax and its reputation were excoriated.

Unfortunately, data breaches like this are not uncommon and, in fact, may be becoming the norm rather than the exception. We need to ask ourselves what we have done to prepare for a data breach and whether we have access to technologies that will allow us to respond quickly and efficiently.

Earlier this month, The Sedona Conference released an Incident Response Guide that addressed this scenario. Their guidance was for every company to have an Incident Response Plan in place for handling data breaches. This plan should be broad enough to cover any scenario, but provide key actionable details so that should the unthinkable happen in the middle of the night, everyone knows who to call first. The first step in the plan is to identify what format of data the organization has (e.g., digital, or paper) and where it is located.

RVM recommends that any protocol include three basic steps to minimize the risk to the public as well as mitigate any potential public relations fallout.

Step 1 – Determine the nature of the breach and fix it! This is easier said than done. The company’s IT department or technology experts have to determine what vulnerabilities may have given outsiders access to their information and plug the hole. This is important from a business perspective, but also very helpful when assuring the public that the problem will not be repeated in the future.

Step 2 – Engage a Data Forensic Team that can isolate the affected systems and collect the images of the breached data for your review and analysis. This will help to determine the extent of the damage of the breach by identifying affected parties, data sources, and the sensitivity of the information that was stolen.

Step 3 – Consult with legal counsel to determine what the law states regarding a duty to notify and who should be notified. Determining what type of information was accessed will provide guidance for who, outside of the governmental entities, will need to be notified. While transparency may open you up to scrutiny, it will also help to establish a level of trust with the authorities and the general public.

Most states require that notice be given “without unreasonable delay.” For example, New York State requires that consumer notice be given in the “most expedient time possible and without unreasonable delay.” (N.Y. Gen. Bus. Law § 899-AA, N.Y. State Tech. Law 208) However, some states have a specific date limitation. Vermont requires that consumer notice be made “in the most expedient time possible and without unreasonable delay, but no later than 45 days after discovery.” (9 V.S.A. § 2435)
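Vermont’s outer limit, at least, is simple date arithmetic. A minimal sketch (the discovery date is illustrative):

```python
from datetime import date, timedelta

def vermont_notice_deadline(discovery_date: date) -> date:
    """Latest consumer-notice date under 9 V.S.A. § 2435, which requires
    notice no later than 45 days after discovery of the breach."""
    return discovery_date + timedelta(days=45)

# Illustrative: a breach discovered on July 29, 2017
print(vermont_notice_deadline(date(2017, 7, 29)))  # 2017-09-12
```

The “most expedient time possible” standard still applies within that window, so the computed date is a ceiling, not a target.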

With so much on the line, time is of the essence, so it is critical to identify the affected information/data as quickly as possible. This is where analytics comes into play.

Simple linear review in these cases can take a lot of time, especially if the amount of data breached is large. By leveraging a variety of processes, analytics can help narrow down the data or identify what is in it.

After a data breach, analytics can help a company determine whether personally identifiable information (PII) was exposed and identify the documents in which this information is held. This is possible through a “facts first” approach: prioritizing what is known. What types of sensitive information could be accessed? What information is the most damaging?

RVM’s analytics team typically starts by identifying standard PII, which includes social security numbers, bank card numbers, etc. With the use of technology and our analytics experience, we can quickly identify documents that contain social security information and isolate those documents for review. After the breached data is identified, the next step is to determine whether it contained trade secrets or privileged information. When you know what you are looking for, analytics can help shorten the time spent on the search.
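This kind of pattern-based triage can be sketched with a couple of regular expressions. The patterns below are deliberately simplified illustrations; a production PII workflow validates matches (e.g., Luhn checks on card numbers) and handles many more formats:

```python
import re

# Simplified, illustrative patterns only: a dashed SSN and a 16-digit
# card number with optional space/dash separators.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def find_pii(text: str) -> dict:
    """Return the PII categories found in a document, so matching
    documents can be isolated for priority review."""
    return {name: pat.findall(text)
            for name, pat in PII_PATTERNS.items() if pat.search(text)}

doc = "Applicant SSN 123-45-6789, card 4111 1111 1111 1111 on file."
print(find_pii(doc))
# {'ssn': ['123-45-6789'], 'card': ['4111 1111 1111 1111']}
```

Running a scan like this across the breached corpus quickly separates documents that need immediate attention from those that can wait for a slower pass.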

A large-scale data breach can be a scary event for any organization, no matter the size. However, by adequately preparing for this likelihood and applying sound analytics, it is possible to mitigate the damages and maintain a positive relationship with stakeholders. In particular, companies with large volumes of sensitive data may do well to work with an advisor capable of developing a plan and implementing it.

While no company wishes to go through this ordeal, the important thing is to take the proper steps to minimize the likelihood of it happening again.

The “A” in AI Needs a Makeover

By Jeanne Somma, Director, Analytics and Managed Review

When I first joined RVM, I had a meeting with a law firm client that was supposed to focus on how RVM could use technology to cut costs for the firm and, ultimately, its clients. What it turned into, though, was a frank conversation about the firm’s reluctance to rely on technology without something more to back it up. The client put a name to that “something more,” calling it “legal legs.” His frustration was that technology companies are selling tech to lawyers, and that these tools don’t have the proper legal legs to warrant complete adoption.

At the time I found that to be an interesting and frustrating stance. There are case law, metrics, and a whole host of other things that I believed to be the legal legs necessary to back up the use of technology. I left that meeting with a sticky note that simply read “tech needs legal legs” and a fear that I wouldn’t be able to bridge the gap between what is possible when using technology and what is acceptable to the lawyers who use it.

It’s no surprise that many of the conversations and presentations at this year’s Legalweek centered on arguably the most polarizing technology, Artificial Intelligence (AI). The use of the term in the legal sector has grown exponentially, and at Legalweek that talk seems to reach a fever pitch. But even as session after session describes the amazing things that AI can do for the legal community, I wonder if the mass adoption promised by presenters is reasonable in the current environment. Put another way, we need to focus on evolving the conversation, rather than just evolving the technology.

One of the sessions I was excited to attend was The AI Bootcamp. There the conversation was more of the same, with unfailing optimism that not only could the technology solve the problems of the legal sector, but that the sector would soon embrace it for those reasons. The feeling in the room was one of wide agreement that AI would be adopted and would permeate the legal sector. That is, until the questions started.

The questions from the audience dissected the great divide between the technology’s capabilities and an attorney’s faith in the result being the same or better than if they had used the processes they are used to. With each comment from the practicing attorneys in the room, I was reminded more and more of that sticky note in my office – “legal legs.” The technology is ready, but it seems the lawyers may not be.

As I listened to the dialogue between the adopters and the more reticent participants, the real difference of opinion boiled down to defensibility. Some attorneys were finding it hard to rely on a system that made decisions using algorithms that were not easily articulated. These attorneys wanted to be able to tell a judge or opposing party that the results of an AI exercise would be the same or better than if they had done it without the technology. How can they do that when the machine is doing the thinking?

Looking at my notes and seeing the word “artificial,” I realized that that was my stumbling block. It’s the implication that the results are being driven by a machine, which is not accurate. The type of technology that we use across the legal sector – whether in contract analysis, legal research, or predictive coding – is meant to take human understanding and amplify those results in order to provide the user with an efficient way to get to their result. The process is the same – a human with the required intelligence on the subject must train the machine on their thought process and desired results. The machine then simply takes that knowledge and applies it across large data sets faster than the human could. The machine isn’t deciding anything that it wasn’t already told by the human. What it does do is amplify the human’s intelligence. In other words, people are still driving the results, but with assistance from the technology.
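That training-then-amplification loop can be sketched in miniature. The toy term-overlap scoring below is illustrative only, not any product’s actual algorithm; the point is that every signal the “model” has comes from the human-labeled examples:

```python
import re
from collections import Counter

def tokens(text):
    return re.findall(r"[a-z']+", text.lower())

def train(labeled_docs):
    """Human reviewers supply (text, is_relevant) examples; the 'model' is
    simply the terms that distinguish relevant from non-relevant docs."""
    rel, irr = Counter(), Counter()
    for text, is_relevant in labeled_docs:
        (rel if is_relevant else irr).update(tokens(text))
    return {t for t, n in rel.items() if n > irr.get(t, 0)}

def score(model, text):
    """Apply the human-derived signal across any number of documents."""
    words = tokens(text)
    return sum(1 for w in words if w in model) / max(len(words), 1)

# Tiny hypothetical seed set labeled by a human reviewer
seed = [
    ("merger agreement draft attached", True),
    ("please review the merger terms", True),
    ("lunch on friday?", False),
]
model = train(seed)
print(score(model, "comments on the merger agreement"))  # higher
print(score(model, "friday lunch plans"))                # lower
```

Nothing in `model` exists that a human did not put there; the machine only applies that judgment at a scale and speed the human could not.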

Framed as amplification rather than artificial, the process loses its mystique. We bring the real definition to light: a process that leverages human intellect to train a machine to do the resulting work more quickly. The result is the same because the human’s intelligence and input at the outset are the same. The ability to check the work is the same, too. The only thing that’s changed is the speed with which the process can be completed.

As technology and legal service providers, we need to change the conversation. We need to focus on the amplification power of AI solutions and the defensibility of relying on the human subject matter expert. Until we can show the legal community that AI is not artificial at all, we will continue to have this battle between capabilities and adoption.

I for one want to solve this problem – if only so I can finally throw out that sticky note.

Q&A with Jeanne Somma About Legalweek

On January 29, legal and IT professionals from all over the country will be heading to Legalweek, hosted by ALM.

For us at RVM, attendance at Legalweek is a must. Where else can you network, exchange ideas, and leverage the expertise of representatives from a large swath of the legal profession, including corporate counsel, law firms, corporate IT, and the myriad other professions that work together to provide eDiscovery services? We pride ourselves on delivering products and services that meet or exceed our customers’ expectations, so it is critical that we maintain an up-to-the-minute understanding of the landscape, which we can do at Legalweek.

To get a better understanding of Legalweek and why it’s so important to firms like RVM, we spoke with RVM Director of Analytics and Managed Review, Jeanne Somma.

Q: What makes Legalweek the “IT” place to be?
Legalweek is the perfect storm. It’s one of the biggest legal conferences in the U.S. and comes at the perfect time. I know that I’m always focused on growing professionally and finding ways to grow our business come January, and Legalweek really provides the right concentration of knowledge and technology to help me chart a course for the rest of the year. It’s also a conference that, if attended correctly, lets you tailor your experience to your needs. There are so many education programs, technology demonstrations, and chances to network that it’s like a live-action choose-your-own-adventure, eDiscovery style.
Q: What are you looking forward to seeing or hearing while you’re there?
Last June I joined RVM to head up the Analytics and Managed Review service lines. As part of that role I have been focused on making the best use of all of the analytics tools and processes we have in house – especially as it relates to offering our clients what I think of as the next-generation managed review process, meant to offer the most cost-effective and defensible experience in the market. That said, I am really looking forward to exploring what new analytics tools are out there and how analytics technology has grown since last year, so we can keep providing the most forward-thinking services to our clients. As data volumes grow and technology quickens its pace, we can’t afford to accept that we are good enough. We need to keep ourselves at the forefront, and having all this access to knowledge during the conference will really help achieve that.
Q: What do you see as RVM’s role at Legalweek?
RVM’s focus has always been on building relationships and providing outstanding customer service. We are excited to discuss the innovations RVM is rolling out in 2018 and really want to focus on having those discussions on a personal level. Our goal is always to give our customers an individualized and personal experience. So, while my colleagues and I are at the event we’ll be working on our connections – making time for existing relationships and making new ones – as well as improving our understanding of the issues in the market that affect our customers so we can provide more effective consultations. On the 29th (the first day of Legalweek) RVM will be hosting a private dinner with our leading corporate counsel and law firm eDiscovery clients to discuss the current state of eDiscovery and what we see happening in 2018.

Look for Jeanne and other members of the RVM team who will be on the ground at Legalweek to get their take on the show and on what 2018 holds in store for eDiscovery.

 

Key Considerations Before Migrating to Office 365

Companies of all sizes are preparing for their transition to the cloud. Office 365 (O365) will likely be a foundational part of that transition, particularly for small- and medium-sized businesses. The transition is certain: it’s no longer a question of if, but a matter of when businesses will make it.

For small- to medium-sized businesses, there are a number of things that must be considered, from internal processes to compliance. While the benefits of migrating to the cloud may be clear – lower operational costs, simplicity, scalability, redundancy, and easy mobile access – the risks are easily overlooked. We’ve compiled a list of things to consider before making the big move to the cloud.

    1. Data Protection
      Companies with highly sensitive data naturally have heightened security needs and would be wise to consider how comfortable they are with having all of their data stored on a public cloud server. While O365 is very secure – it maintains high standards for backup and encryption procedures – migrating entirely to the cloud effectively entrusts your data to a third party. A solution partner like RVM can help your organization adopt best practices such as minimizing the identity information copied to the cloud, establishing policies to block unauthorized access, and employing multifactor authentication and integrated device management. Industry-standard security parameters are available and can be customized to fit your organization’s requirements. Depending on the complexity of your environment, it may make sense to consider a hybrid solution where some mailboxes remain on premises as others move to the cloud. This allows you to test as you migrate.
    2. Compliance
      Many businesses today are bound by compliance requirements. While this may have prevented businesses from adopting the cloud in years past, it’s less of a hindrance now that the Financial Conduct Authority has approved cloud usage (including public cloud providers). That said, understanding your company’s compliance responsibilities should still be a consideration before migration. These may affect your company’s document retention and data export settings, should you need to produce documentation in response to a subpoena or compliance investigation. Your licensing package, volume of data, and software expertise impact how efficient or inefficient this endeavor can be.
    3. Litigation Readiness
      Often companies overlook the business need to be litigation ready. They look at solutions like O365 as a means to reduce their IT-related operational costs and forget that there may be an impact down the road, such as when faced with an SEC subpoena or civil litigation. When implementing O365, companies need to conduct analysis beyond email and consider additional impacts such as email archive solutions, integration with other business systems, and how to functionally use the platform to accomplish data exports or other recovery tasks. Companies often realize their inability to accomplish these things too late, when they are faced with subpoenas and document requests, and end up paying a lot of money to quickly fix what they already spent a lot of money to implement.
    4. Managing Accounts
      For small- or medium-sized businesses, finding a solution to automate the cumbersome process of setting up accounts across cloud apps is crucial to success. Tools that enable provisioning of users for all services can be difficult to implement – especially if you have custom or legacy apps that require complex configuration – but often pay off, as automated provisioning is typically the easiest way to add new users to Active Directory. There are a number of options for managing synchronization between Active Directory and O365, supporting third-party applications and single sign-on, and providing multiple accounts for multiple applications.
    5. Licensing
      O365 licensing includes many options, and different users will require different levels of access based on use case. A valuable asset of O365 is the ability to make the right toolsets available to the right users. The platform also enables administrators to track license consumption and availability, reducing costs and simplifying true-ups.
    6. Hands off – Patch Management & Control
      Moving to O365 means giving up control over elements such as the patch management process, software upgrades, and other administrative tasks that could previously be performed on premises. Many organizations use third-party utilities to manage their internal servers (Microsoft Exchange, Lync/Skype, and SharePoint), but utilities designed to be installed directly on a server won’t work with O365, as management is done through O365’s portal. One benefit of remote management is that Microsoft pushes out environment updates regularly, meaning that users will always be running the most recent tools.

Above all, there is no one right answer for all organizations. Each should take the time to consider all the factors mentioned above (and any others that are relevant to the company or industry) and weigh the pros and cons. Should your company elect to migrate to O365, it is critical that you do so strategically, and with consideration for the safety and security of your data. Hiring a company like RVM to oversee the migration can ensure a proper setup protecting your company from threats now and in the future.

Leading Technology Through Strategy

As 2017 comes to a close, we at RVM are taking stock of the changes we’ve seen this year and honing our strategies to remain at the forefront of analytics and technology application in eDiscovery in 2018. eDiscovery has undergone immense change as technology has evolved to tackle growing data sources and serve the needs of the attorneys wading through them. While that evolution has resulted in improved workflows, adoption of those workflows has thus far been slow.

Technology Options

There are myriad technology options – a seemingly unending list of interesting tools that promise to push our industry into the future. It would be easy to race right to artificial intelligence (AI) and push ourselves into the sphere of the futurists. However, as we discussed in our recent webinar “Demystifying Analytics, Automation, and Predictive Coding in eDiscovery,” there is no one-size-fits-all solution for the best application of technology and analytics, and the focus should be on the project process and goals – not the technology.

The webinar was designed to make attorneys comfortable with the many ways analytics can be used to accomplish a matter’s goals in the most efficient and – more importantly – defensible way. We also wanted to highlight that the courts are quickly adapting to these changes and embracing counsel’s use of technology, up to and including predictive coding. The most pertinent decisions are summarized in our webinar materials. Full versions of those cases can be found in the Sedona Conference TAR Case Law Primer.

Those thoughts were echoed in a recent article for LegalTech News entitled “eDiscovery Leaders Look to Methodology, Not AI, to Update Toolkits.”

Applying the Technology

The article cites industry experts who agree that parties have become more comfortable with the technical aspects of eDiscovery and seem more willing to utilize technology to accomplish their goals. They see adoption of technology-assisted review (TAR) and predictive coding on the rise, and the courts support this evolution. The continued and thoughtful use of technology will make for better case outcomes, but the process needs to match the goals. The article’s author, Ralph Losey, points out that “Software improvement by vendors should be a constant process, but that is usually beyond the direct control of lawyers. What we can control is the methodology.” We agree with this sentiment.

Our aim for 2018 is to continue to be on the cutting edge of technology application for our clients, by coupling it with strategic consulting in order to leverage the right technology and process to meet a client’s goals. Without the process, the technology will not succeed on its own.

It Pays to Use Formal Discovery

Preparing for litigation comes with a mountain of expenses and challenges, many of which are attributable to discovery. And as data volumes grow, so too do those discovery costs. Unfortunately, eDiscovery is often misunderstood by clients and rationalized to be more complicated than it needs to be.

In an effort to contain the rising tide of costs and perceived complexity, some litigants are undertaking “informal discovery” — a process that on its face seems like a cost-effective and ideal option. It allows for the exchange of key documents without the burden of production format, custodian tracking, or consideration for defensibility. In a common scenario, the client combs through their own inbox and sends the relevant emails to counsel.

Sounds like a good deal, right?

“Clients don’t like the idea of paying money for things that they believe they can do themselves,” says Greg Cancilla, Director of Forensics at RVM Enterprises. “Collecting data can seem more like a job for an intern than an eDiscovery and legal forensics firm.”

Although it might seem like a cost-effective approach, parties that engage this way may be in for trouble.

The Trouble with Informal Discovery

Common Missteps in Informal Discovery:
- Self-selection of relevant documents
- Self-collection of ESI
- Emailing documents to counsel as attachments
- Copying and pasting files to external media or an FTP site
- Producing ESI by (a) printing to hard copy or (b) converting the files to PDF
- Bates numbering documents individually

A major concern with informal discovery is the risk exposure regarding authentication of evidence and the potential extra time and costs one might incur to correct the collection of data. While eDiscovery providers have developed systems and technologies that enable them to work quickly and efficiently in an appropriate review environment, an informal approach does not offer those advantages. eDiscovery providers take the appropriate time and use the correct processes to collect data so that it can be done once, efficiently, and defensibly. With informal discovery, if further searches are warranted, the entire process may need to be repeated, adding undesirable costs and time.

Another issue is the likelihood of altering metadata. By using the “copy and paste” — or “foldering” — approach to data collection, you run the risk of modifying key dates such as last opened and last modified. This can make authentication problematic, and it makes it harder to sort and de-dupe files that have been modified, again adding to cost.
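The metadata risk is easy to demonstrate. The sketch below (standard-library Python, for illustration only; the file names are hypothetical) back-dates a file and then copies it the way a drag-and-drop “foldering” collection would, showing that the copy’s “last modified” date no longer matches the original’s:

```python
import os
import shutil
import tempfile
import time

# Create a source file and back-date its modification time,
# simulating a document last touched a year ago.
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "contract_draft.docx")
with open(src, "w") as f:
    f.write("contract draft")
year_ago = time.time() - 365 * 24 * 3600
os.utime(src, (year_ago, year_ago))

# A plain copy -- the "copy and paste" collection described above --
# stamps the destination with the time of the copy, not the original.
dst = os.path.join(workdir, "collected_copy.docx")
shutil.copy(src, dst)  # copies contents and permissions, not timestamps

# The copy's "last modified" date now reflects the collection,
# not the document's actual history.
metadata_changed = os.path.getmtime(dst) - os.path.getmtime(src) > 3600
print(metadata_changed)
```

A forensically sound collection, by contrast, preserves and documents these dates (in Python terms, even `shutil.copy2`, which carries timestamps over, is only a partial answer, since other metadata and a chain of custody still need to be recorded).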

The most important shortcoming of the informal method is the unnecessary risk of misstating the scope of the production of electronically stored information (ESI) (see Applied Underwriters, Inc. v. American Employer Group). In some circumstances, courts have held that self-identification and collection may not even be defensible.

According to Cancilla, “Self-collection puts all the responsibility on the custodian to determine what ESI is relevant. Foldering in particular can be troubling, as even well-intentioned clients may simply not realize that certain sources, a sent mail box for example, need to be included in the folder to be produced.” In today’s age of electronic information, it is important to note that relevant information is not just the substance of the document, but also the metadata — or surrounding information — of the document. FRCP Rule 34(b)(2)(E) advises that a party must produce documents “as they are kept in the usual course of business” or must “organize and label them to correspond to the categories in the request.” Informal discovery undermines that instruction.

Changes on the Horizon

Two proposed amendments to Federal Rule of Evidence 902, set to take effect on December 1, 2017, will significantly affect the collection of ESI and its admissibility. In addition to providing a structure for standardizing ESI collection, these amendments, 902(13) and 902(14), demand a stricter, more organized method of collection that is outside the scope of informal eDiscovery. Where the current version of Rule 902 allows for self-authentication of certain types of documents, the new additions allow for authentication of electronic evidence by an affidavit of a “qualified person” who can certify in writing that the document was obtained in compliance with the requirements of Rules 902(11) and (12).

“The new rules are changing everything,” continues Cancilla. “They don’t make any attempt to disincentivize self-collecting, but by making ESI gained through formal discovery ‘self-authenticating,’ the advantages are well worth any cost to work with the professionals.”

The new rules cover records that can be authenticated using a document’s hash values, which are assumed to be unique. For purposes of authentication, hash values are the backbone of the proof that Rule 902 requires, but not the only allowable method. As the Advisory Committee on Evidence notes, “[t]he rule is flexible enough to allow certifications through processes other than comparison of hash value, including by other reliable means of identification provided by future technology.”
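As a minimal illustration of what such a digital fingerprint is (standard-library Python; this is a generic sketch, not any particular forensic vendor’s tool), a hash value can be computed by digesting a file’s bytes:

```python
import hashlib

def file_fingerprint(path: str, algorithm: str = "sha256") -> str:
    """Return the hex hash value of a file, read in chunks so that
    large evidence files never need to fit in memory at once."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        # iter() with a sentinel keeps reading 64 KB blocks until EOF.
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Because identical bytes always produce the identical digest, while any alteration to the file changes it, comparing a produced copy’s hash against the value recorded at collection time is the kind of comparison a Rule 902(13) or 902(14) certification can rest on.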

As December draws closer, parties must consider the implications of these rule changes and how they may affect authentication in upcoming trials. If they wish to take advantage of the new rules, they must be prepared to track digital fingerprints on any new collection. If they don’t, they stand to spend more time and money authenticating their documents, including having their own in-house IT and network administration staff called to testify.

Says Cancilla, “Using the informal method of discovery is like driving with too little insurance: you’ll save money for a while, but if anything bad happens, you could wind up paying for it. Companies should remember that a well-documented and formalized data collection process is a small investment relative to the overall eDiscovery spend, but can significantly affect accuracy and defensibility.”

###

Greg Cancilla, EnCE, ACE is a Certified Computer Forensic Engineer and the Director of Forensics at RVM. He has performed countless digital forensics investigations since entering the field in 2003. Additionally, Greg has offered testimony in numerous cases, including presenting a key piece of evidence in Ronald Luri v. Republic Services, Inc., et al., which rendered the largest verdict in the State of Ohio’s history.