
RVM Top 5: Reasons to Revise Your Data Retention Policy

Do me a favor. Take a look at your records retention policy.

I’ll wait.

Did you do it? It looks fine, right? All the language is there, the dates and times are in place, the i’s are dotted and the t’s are crossed. It must be good enough.

But how well is it really going to serve you in the event of a legal hold, and is it costing you money simply by its own inefficiency?

To help you make that determination, we at RVM have compiled a list of reasons you may want to update your policy, weighing both the legal liabilities the policy represents and its cost to your business operations.

Here are RVM’s TOP 5 REASONS TO REVISE YOUR DATA RETENTION POLICY.

1. You have retained data old enough to sit on your board and vote on a data retention policy.
Preserving records is important, and different agencies will have different reporting requirements. That said, making a determination about how long to retain your data and sticking to it will save a lot of headaches.

2. Determining who is responsible for managing a legal hold turns into a game of “not it!” among your leadership team.
Data may have many owners, but in the event of a legal hold or investigation, there’s no time for disorganization. Determine ahead of time who will take responsibility for coordinating a response.

3. Your data storage bills are bigger and more complicated than your quarterly tax statements.
You’re paying for all that data you’re storing. So why do you want to pay for data that you will never need?

4. “Where is our data?” is really more of a rhetorical question than one for which you have good answers.
Part of developing a record retention policy is identifying the locations of all your data – an important exercise that can ease the burden of collection in the event of a legal hold.

5. The technology at the backbone of your policy, helping you maintain and organize your data, is Microsoft Excel.
Excel is a great program, but responding to a legal hold or investigation is serious business, and there are serious tools built for the job. The most efficient way to proceed is to use one of them.

It’s Time to Take Action Against IP Theft

Recently, Tesla CEO Elon Musk was forced to admit that his company was the victim of sabotage by one of its own employees. That employee, frustrated after being passed over for promotion, deployed damaging code to the company’s manufacturing system and shared large amounts of sensitive data with third parties.

Given the company’s desperate need to make progress following a string of negative announcements, the timing couldn’t have been worse.

Tesla’s situation, though perhaps one of the highest profile cases, is not new or unheard of. Companies quietly monitor their workflows and processes for any signs of IP theft or sabotage by disgruntled or even misinformed employees. Very often, it’s simply a case of those employees taking the work product that they created, believing that they have ownership. In other cases, an employee may copy large contact lists hoping to maintain and divert relationships to a new employer.

Whatever the theft, and whatever the motivation behind it, this particular crime is common and can cause a company not only financial loss, but also serious reputational damage and even litigation.

By some estimates, roughly 50 percent of employees take work product with them when they leave a company, and close to 40 percent attempt to leverage that work product on behalf of their new employer.

But what can we do about it?

Most companies leverage commonplace strategies, such as blocking access to online storage sites like Dropbox or disabling USB ports so that files cannot be moved to removable storage. The fact is that these methods are only a minor stumbling block for an employee intent on taking work product.

In the past, determining whether information was stolen required forensic work that consumed significant money, time, and resources. It is hard to measure ROI for a process like this, because you cannot assess the value of an event that may have been prevented, and you cannot assume the result before you commit the resources. Many companies struggle to see the value in building processes that protect their IP when those same resources could fund R&D, service line launches, shareholder rewards, or employee benefits.

Understanding this challenge and leveraging its forensics expertise, RVM created a tool – Tracer – to analyze computers and identify activities that may be affiliated with potential IP theft. It is designed to look for user behaviors (online and offline) that can indicate an employee’s ill intentions. The tool sweeps through a user’s activity, flagging suspect files and events and drawing attention to troubling patterns to guide an employer’s decisions.
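
To make that idea concrete, here is a toy sketch, in Python, of the kind of sweep such a tool might perform. The log format, field names, and thresholds are invented for illustration – this is not Tracer’s actual implementation.

```python
# Toy sketch: flagging file-activity patterns often associated with IP theft.
# The CSV layout (user, timestamp, action, destination) and the thresholds
# are hypothetical -- this is NOT the format or logic of RVM's Tracer tool.
import csv
from collections import defaultdict

BULK_THRESHOLD = 100  # off-device copies by one user in one day
SUSPECT_DESTINATIONS = {"usb", "dropbox", "personal_email"}

def flag_suspicious(log_path):
    daily_copies = defaultdict(int)  # (user, date) -> copy count
    alerts = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["action"] == "copy" and row["destination"] in SUSPECT_DESTINATIONS:
                key = (row["user"], row["timestamp"][:10])  # date part of ISO timestamp
                daily_copies[key] += 1
                if daily_copies[key] == BULK_THRESHOLD:
                    alerts.append(f"{row['user']} moved {BULK_THRESHOLD}+ files "
                                  f"to {row['destination']} on {key[1]}")
    return alerts

if __name__ == "__main__":
    for alert in flag_suspicious("activity_log.csv"):
        print(alert)
```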

But technology alone may not be enough to overcome the problem. Leveraging experts who can properly assess it and collaborate with a company to right-size the solution is a powerful next step. The best way for companies to protect their IP is to ask the hard questions regarding its value and be prepared to take action.

Tesla is a strong company with a stable revenue stream, and will likely weather this storm. Other companies may not be so fortunate.

Navigating Dangerous Waters: Using Technology to Sail Through Second Reviews

by Jonathan Alef

With the economy heating up and corporate balance sheets flush with cash, navigating the corporate environment has become akin to sailing through pirate-infested waters, as companies ride the tide, snapping up smaller entities through mergers and acquisitions. The bevy of activity seen thus far in 2018 is on pace to eclipse the record of $4.7 trillion set in 2015.

What the public sees in headlines and media coverage often belies the complexity of the work being done in the background. The FTC and DOJ, empowered by the Hart-Scott-Rodino Act (HSR), investigate large proposed mergers for potential antitrust issues, potentially triggering second requests. The large volumes of data and tight deadlines common in these proceedings make them an ideal setting for an analytics suite and technology assisted review (TAR) workflows.

A second request is much like it sounds. Parties to a transaction that meets certain thresholds must submit filings and a fee to the FTC and DOJ, which review and grant approval of the merger. If either agency believes that the proposed merger would likely harm competition, it issues a second request, giving the agency the opportunity to perform a deeper investigation. Second requests can cover anything and everything deemed necessary to ensure compliance with antitrust provisions.

In recent years, both agencies (the FTC in 2015 and the DOJ in 2016) amended their Model Second Request forms to allow for the expanded use of analytics and TAR workflows. This is a potential blessing, so long as you follow their prescribed procedures; most parties find it worthwhile to gain written approval before implementing these tools and processes, which can yield substantial time and cost savings over traditional linear review. Prior to the amendments, companies were required to review documents one by one for responsiveness, PII, and privilege, often necessitating an army of contract attorneys.

When reviews are augmented with technology, they can usually be completed in a fraction of the time and at a reduced cost. For background, there are two primary analytics functions that are helpful to employ when performing a review, especially when used together: email threading and similar content analytics. Email threading is primarily used to organize review content, but can also reduce duplicative content. Similar content analytics (conceptual searching, near-duplicate analysis, and clustering) identify the information most sought after by the requesting agency faster, so that it can be moved to the forefront of production. Compared to traditional keyword searching, these procedures offer significantly more flexibility, helping to produce results more quickly to meet deadlines.
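
As a rough illustration of the similar-content idea, the sketch below flags near-duplicate documents by the cosine similarity of their TF-IDF vectors, using scikit-learn. Commercial review platforms use far more sophisticated engines; the documents and the 0.8 threshold here are invented for the example.

```python
# Minimal sketch of "similar content" analytics: flagging near-duplicate
# documents by cosine similarity of TF-IDF vectors. Review platforms use
# more sophisticated engines; this only illustrates the core idea.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "Please review the attached Q3 supply agreement.",
    "Please review the attached Q3 supply agreement before Friday.",
    "Lunch order for the team meeting on Thursday.",
]

vectors = TfidfVectorizer().fit_transform(docs)
sims = cosine_similarity(vectors)

# Pair up documents whose similarity clears a (tunable) near-dupe threshold.
THRESHOLD = 0.8
for i in range(len(docs)):
    for j in range(i + 1, len(docs)):
        if sims[i, j] >= THRESHOLD:
            print(f"Docs {i} and {j} look like near-duplicates ({sims[i, j]:.2f})")
```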

TAR workflows incorporate machine learning: a team of subject matter experts reviews a small segment of the total data population and “feeds” its decisions to machine-learning algorithms. The machine can then apply that learning to the larger population, quickly identifying and classifying responsive documents. TAR is tremendously valuable, but it still requires careful vetting for privileged and PII content, and the final outcome must include the robust reporting of process and metrics required by the FTC and DOJ.
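
A minimal sketch of that core TAR loop, assuming scikit-learn and a handful of invented example documents, might look like the following. Real workflows add validation rounds, richer feature engineering, and the reporting the agencies require.

```python
# Hedged sketch of the core TAR loop: experts label a seed sample, a model
# learns from it, and predictions rank the remaining population.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

seed_docs = ["merger pricing discussion", "holiday party logistics",
             "competitor market share analysis", "fantasy football picks"]
seed_labels = [1, 0, 1, 0]          # 1 = responsive, as judged by the experts
population = ["pricing strategy memo", "parking pass renewal"]

vectorizer = TfidfVectorizer().fit(seed_docs + population)
model = LogisticRegression().fit(vectorizer.transform(seed_docs), seed_labels)

# Score the unreviewed population; high scores go to the front of the queue.
scores = model.predict_proba(vectorizer.transform(population))[:, 1]
for doc, score in zip(population, scores):
    print(f"{score:.2f}  {doc}")
```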

In the past decade, 575 mergers and acquisitions required a second request – about 3 percent of all eligible transactions, and more than triple the likelihood of having your taxes audited by the IRS in 2015. These companies were subjected to a painstaking process requiring countless hours of discovery and legal review. Meanwhile, trillions of dollars rested on the findings of those reviews, making technology-guided processes that whittle down and organize the review population critical. The smaller, richer review populations go a long way toward minimizing both the risk and the costs.

On the Ground in the EU: Key Takeaways on GDPR

On May 25, the General Data Protection Regulation (GDPR) took effect in the EU, and companies around the world have scrambled to demonstrate compliance. With so much on the line, many have been turning to vendors experienced in multinational cross-border cases to better meet the standards and requirements of the new regulations. RVM’s clients are no exception, often having terabytes of data stored in physical and cloud servers around the world.

RVM recently completed its first in-country data privacy review since GDPR went into force. Our team was contracted by a U.S.-based multinational corporation that required onsite privacy culling to meet some of the guidelines set out in the new regulations. Through the process, RVM forensic engineers collected and reviewed custodian emails and file data in country by performing searches based on relevance and date. The data was exported to native and load-file formats for upstream hosting and review in the United States.

To keep the work in compliance with GDPR, RVM coordinated throughout the project with local and outside counsel – including data protection officers – to ensure all documentation and agreements were in place.

 

DOCUMENT, DOCUMENT, DOCUMENT

There are a lot of moving pieces with GDPR, so it is important that all parties understand the prescribed rules and work hand in hand with the data protection officers to build a process that meets both the business and legal requirements. The more you can demonstrate in writing, the better. Some of the documentation, like data privacy agreements, should be in place before your team ever gets on site. Documenting each step in the process protects both the vendor performing the work and the client, and it can affect your ability to complete a project on time.

Avoid GDPR Fixation

There is no question that GDPR is new and important. However, the EU is not the only place with rules and laws governing data handling and privacy. Large projects may involve data stored in or moved between multiple countries and jurisdictions. Satisfying GDPR is important, but companies need to be aware of other regulations that may differ from or even supersede those of the EU. For this reason, it is critical to be in communication with client counsel, other data processors, and the data controller where you are working to ensure compliance in all relevant jurisdictions.

Ask Before You Move that Data

In this example, RVM’s experts were able to satisfy the GDPR requirements for data export to a third country when they ingested data that originated in the EU into a review platform in the United States. Under GDPR, data containing personal information cannot simply be transferred outside of the EU. It is critical to work with client counsel, other data processors, and the data controller to complete all expected processes and to identify and obtain consent where required.

The Necessary Evil of Search Terms

by A.J. Strollo

“Having lawyers or judges guess as to the effectiveness of certain
search terms is ‘truly to go where angels fear to tread.’”
Magistrate Judge Facciola,
United States v. O’Keefe, 537 F. Supp. 2d 14, 24 (D.D.C. 2008)

This statement was made 10 years ago, and the observation – particularly in light of the complexities of term syntax and what actually exists within data sets – has only proven more prescient. Search terms can seem fraught, if not outright risky. So why do we continue to rely on them?

Despite the concerns surrounding keywords, and even after all the recent technological gains, they remain the most common way to cull data for potential review and production. The reason for this is likely that they are familiar, and as we all know, the legal community can be slow to move away from the tried and true, particularly when the alternatives involve relinquishing control to machines.

It’s relatively easy to generate a proposed list of terms, run them against the data, and determine how many documents the terms capture. But knowing whether the terms actually capture information of interest is a different story. Along those lines, Magistrate Judge Facciola noted that whether the terms “will yield the information sought is a complicated question involving the interplay, at least, of the sciences of computer technology, statistics and linguistics.”  Id.

Facciola may have said this because of the way lawyers often use search results without substantive analysis. A common practice when running terms is to look at the volume of data returned, rather than the quality or effectiveness of the search. So, if the volume is significantly higher than expected, the lawyer may narrow the terms arbitrarily with the goal of reaching the “right” number of documents. How the “right” number was determined can be a mystery. These adjustments may yield fewer results, but they also risk eliminating necessary ones. Even when the practice is not entirely haphazard, it lacks defensibility, especially if parties are locked in a contentious battle over the scope of discovery.

For me – and I think Facciola would agree – a better focus than volume is the effectiveness of the terms, measured not by count alone but by the richness, or “relevancy rate,” of the potential review population.
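
For readers who want the arithmetic, a back-of-the-envelope richness estimate can come from a simple random sample. The sketch below is my own illustration, not a prescribed protocol: it reports the sampled relevancy rate with a normal-approximation 95 percent interval.

```python
# Richness estimate from a simple random sample, with a
# normal-approximation 95% confidence interval.
import math
import random

def estimate_richness(population_ids, sample_size, is_relevant):
    """is_relevant is the reviewers' judgment for a sampled document."""
    sample = random.sample(population_ids, sample_size)
    hits = sum(1 for doc_id in sample if is_relevant(doc_id))
    p = hits / sample_size
    margin = 1.96 * math.sqrt(p * (1 - p) / sample_size)  # 95% CI half-width
    return p, margin

# Hypothetical usage: review 400 random docs from a 50,000-doc hit population.
# p, m = estimate_richness(range(50_000), 400, reviewer_decision)
# print(f"relevancy rate ~ {p:.1%} +/- {m:.1%}")
```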

So how do we make keywords and search terms more effective and assuage the “fears of the angels?”

A big step is to perform substantive analysis of any search terms rather than relying on the commonly used guess-and-check method. When the starting point is a list of proposed terms from opposing counsel with an uncertain level of effectiveness, we must assess and refine those terms to increase the likelihood of capturing the most relevant documents. Borrowing concepts from basic statistical analysis, the process for vetting terms and suggesting revisions can be based on the results of a sample review. Terms are modified by targeting common false positive hits – hits on the term but not on the intended target – identified within the non-relevant documents from the sample.
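
One way to operationalize that false-positive analysis is sketched below: for each proposed term, tally how often its hits in the reviewed sample were judged non-relevant. The plain substring matching and the two sample documents are simplifications for illustration.

```python
# Sketch of sample-driven term vetting: terms with high false-positive
# rates in the reviewed sample are the first candidates for revision.
from collections import Counter

def term_report(sampled_docs, terms):
    """sampled_docs: list of (text, relevant_bool) pairs judged by reviewers."""
    hits, false_hits = Counter(), Counter()
    for text, relevant in sampled_docs:
        lowered = text.lower()
        for term in terms:
            if term in lowered:
                hits[term] += 1
                if not relevant:
                    false_hits[term] += 1
    for term in terms:
        if hits[term]:
            rate = false_hits[term] / hits[term]
            print(f"{term!r}: {hits[term]} hits, {rate:.0%} false positives")

sample = [("jacob smith signed off on the invoice", False),
          ("jacob francis redlined the contract", True)]
term_report(sample, ["jacob", "francis"])
```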

Imagine a fact pattern where the relevant discussions involve Jacob Francis and his interactions with a specific contract. Initial searches for Jacob OR Francis in documents that also contain the contract title or number would yield a substantial volume of documents, because both names are common. It’s easy to label this a bad term, but a lawyer’s analysis is helped much more by understanding why it is bad and how to make it better. Attorneys can do this by looking at the documents, which reveals that there are others at the company with Jacob or Francis in their names (e.g., Jacob Smith or John Francis), thus opening the door to an array of potential term revisions to minimize the number of documents returned. This is a good start, but the analysis does not end there.

Next, it is important to check actual document hits to ensure they are consistent with any assumptions. To do that, an attorney should draw and review an additional sample from the documents that were removed from the review population, to ensure the new terms are not missing potentially relevant content. Digging into these, the attorney may find that Jacob Francis had a nickname, “Jake,” which would not be captured after narrowing the terms from Jacob OR Francis to Jacob w/2 Francis. Continued analysis may also uncover references to the contract negotiation as “Project Apple” instead of the contract title or number.

Using this knowledge and adding or modifying the search to include “Project Apple” and “Jake” addresses these missing documents, avoiding potentially serious omissions. Additional considerations might include running “Project Apple” as a conceptual search rather than as a strict keyword, seeking documents that are similar in meaning but that do not necessarily share the same set of terms.
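
To make the recall problem tangible, here is a small sketch that mimics a “w/2” proximity operator and shows the “Jake”/“Project Apple” documents escaping the narrow terms until the list is widened. A real search engine’s proximity and stemming operators are more capable than this toy.

```python
# A narrow "Jacob w/2 Francis"-style search misses the "Jake" and
# "Project Apple" documents until the term list is widened.
import re

def within(text, a, b, n):
    """True if tokens a and b appear within n words of each other."""
    tokens = re.findall(r"\w+", text.lower())
    pos_a = [i for i, t in enumerate(tokens) if t == a]
    pos_b = [i for i, t in enumerate(tokens) if t == b]
    return any(abs(i - j) <= n for i in pos_a for j in pos_b)

docs = ["Jacob Francis approved the contract terms.",
        "Jake asked about Project Apple pricing."]

narrow = [d for d in docs if within(d, "jacob", "francis", 2)]
widened = [d for d in docs if within(d, "jacob", "francis", 2)
           or "jake" in d.lower() or "project apple" in d.lower()]
print(len(narrow), "of", len(docs), "captured by the narrow terms")
print(len(widened), "of", len(docs), "captured after adding Jake / Project Apple")
```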

The payoff of all this work is a more focused set of documents for review, reducing associated costs and concentrating the review team’s time on the documents that actually need review. Considering the alternative – reviewing countless volumes of data unnecessarily, or worse, discarding valuable documents – it’s clear that using keyword searches effectively is not only necessary, but beneficial.

How to Manage the Challenges of Change in Technology

A Q&A with RVM CTO Geoffrey Sherman

In our day-to-day lives, technology goes in and out of style every few years, from a PC operating system to the latest media platform. Fortunately, in most of these cases, upgrading to the newest technology can be accomplished with minimal expense and time.

At the enterprise level, however, changes in technology can pose big headaches, ranging from compatibility with existing systems to retention for compliance, security, and – of course – cost.

Many companies will delay the move to new or changed technologies as long as they can, opting instead for patches and workarounds. But at some point, it doesn’t make sense to fight the tide any longer; change is inevitable.

Geoffrey Sherman is RVM’s Chief Technology Officer, responsible for overseeing and deploying information technology products and solutions used by both RVM’s internal workforce and its clients around the world. What did he have to say about managing the challenges of change?

 

What might drive a company to consider upgrading to a new platform or system?

There are a few things at play. First, a company should look at its needs and determine whether they are being met by its current technology solution. An aging system may have a negative impact on work product or be vulnerable to security flaws.

Even if the product still functions perfectly, the company’s business needs may have evolved to the point where they are no longer being met. Or new products may enter the market with features not previously available. Those features must then be weighed to determine whether they provide enough value to warrant the upgrade.

What are a company’s considerations before transitioning to a new technology platform?

Once a company realizes that its technology is not meeting its needs, the first thing to consider is whether the problem can be addressed with something less impactful, such as a version update, edition change, or workflow change. Failing those solutions, a replacement product or technology may be in the best interest of the organization.

For RVM, a key consideration is the impact the change will have on our user base. We also need to evaluate whether it will require heavy architectural changes, and we are sensitive to whether it would fully replace our current platform or whether the two would co-exist in parallel.

What are some of the concerns with upgrades in general?

RVM’s goal is to be on the cutting edge of technology, while stopping short of the “bleeding edge.” This means that we approach any upgrade strategically to be sure the product version change is well tested, often maintaining smaller test environments for major applications. However, in spite of the planning and testing, not all upgrades will go as planned.

There are ways to combat this. The most basic way is to have a maintenance plan in partnership with the technology vendor to handle minor problems as they arise. In addition to this, we firmly believe in recognizing the circumstances that warrant backing out of a given upgrade and having a mature process to revert to a known good state.

What steps can a company undertake to ensure a smooth transition to a new platform?

If you are reading this, platform change has become inevitable. There are many steps involved in an enterprise-level transition to a new platform. The key is gaining buy-in from the user base and testing early and often in manageable batches. This is crucial at the beginning for two reasons. First, it creates a sense of ownership. Future users that are invested early will apply a more critical eye and be more likely to contribute to the process, ensuring satisfaction with the result. Second, when users have a better grasp of a product and understand how it can meet their needs, there is a much higher rate of user adoption, which in turn improves return on investment.

Testing is just common sense for any technology rollout. We recognize that no solution is going to be perfect out of the box, so it is critical that we conduct rigorous tests to find and fix as many bugs as possible before we roll out the product to our clients.

How do you measure the success of a transition?

Before we attempt any transition, we develop a notion of what the success factors will be, and solicit client feedback afterward. We find that this creates a feedback loop resulting in lessons learned on both sides. Responses may come in the form of contemporaneous feedback (e.g., emails or phone calls about the product) or may be a more formalized approach using surveys. No matter what we hear, we make sure we are learning from the process.

Can you give an example of a successful migration that RVM implemented?

RVM recently transitioned to a SaaS-based email security offering. The transformation was smooth thanks to meticulous planning and testing, and the user community was pleased with the added functionality, clear documentation, and the fanatical lengths taken to validate that operations went as planned. We performed several days of IT-led training to ensure everyone knew the timeline of events and had well-set expectations. It was such a smooth transition that afterward I had to walk the floor and chat with users to make sure they were all working as expected, since there were no service desk tickets logged.

What should organizations consider if they have a legacy litigation offering or are facing challenges with the current system?

RVM can reduce the burden of housing legacy systems and even offload some modern ones, allowing our clients to focus on the merits of their cases without worrying about upgrades, testing, and downtime. For example, RVM can migrate cases off Veritas’s Clearwell platform, which typically carries a large footprint of mostly static cases, and specializes in migrating Relativity workspaces to eliminate the burden and upkeep of hosting Relativity in house. Such options give clients the ability to put contingencies in place for service and support.

The “A” in AI Needs a Makeover

By Jeanne Somma, Director, Analytics and Managed Review

When I first joined RVM, I had a meeting with a law firm client that was supposed to focus on how RVM could use technology to cut costs for the firm and, ultimately, its clients. What it turned into, though, was a frank conversation about the firm’s reluctance to rely on technology without something more to back it up. The client put a name to that “something more,” calling it “legal legs.” His frustration was that technology companies are selling tech to lawyers without the legal legs to warrant complete adoption.

At the time I found that an interesting and frustrating stance. Case law, metrics, and a whole host of other things already exist – everything I believed to be the legal legs necessary to back up the use of technology. I left that meeting with a sticky note that simply read “tech needs legal legs” and a fear that I wouldn’t be able to bridge the gap between what is possible when using technology and what is acceptable to the lawyers who use it.

It’s no surprise that many of the conversations and presentations at this year’s Legalweek centered on arguably the most polarizing technology: Artificial Intelligence (AI). The use of the term in the legal sector has grown exponentially, and at Legalweek the talk seemed to reach a fever pitch. But even as session after session described the amazing things AI can do for the legal community, I wonder whether the mass adoption promised by presenters is reasonable in the current environment. Put another way, we need to focus on evolving the conversation, rather than just evolving the technology.

One of the sessions I was excited to attend was the AI Bootcamp. There the conversation was more of the same, with unfailing optimism that not only could the technology solve the problems of the legal sector, but that the sector would soon embrace it for those reasons. The feeling in the room was one of wide agreement that AI would be adopted and would permeate the legal sector. That is, until the questions started.

The questions from the audience dissected the great divide between the technology’s capabilities and attorneys’ faith that the result will be the same or better than what their accustomed processes would produce. With each comment from the practicing attorneys in the room, I was reminded more and more of that sticky note in my office – “legal legs.” The technology is ready, but it seems the lawyers may not be.

As I listened to the dialogue between the adopters and the more reticent participants, the real difference of opinion boiled down to defensibility. Some attorneys were finding it hard to rely on a system that made decisions using algorithms that were not easily articulated. These attorneys wanted to be able to tell a judge or opposing party that the results of an AI exercise would be the same or better than if they had done it without the technology. How can they do that when the machine is doing the thinking?

Looking at my notes and seeing the word “artificial,” I realized that that was my stumbling block. It’s the implication that the results are being driven by a machine, which is not accurate. The type of technology that we use across the legal sector – whether in contract analysis, legal research, or predictive coding — is meant to take human understanding and amplify those results in order to provide the user with an efficient way to get to their result. The process is the same – a human with the required intelligence on the subject must train the machine on their thought process and desired results. The machine then simply takes that knowledge and applies it across large data sets faster than the human could. The machine isn’t deciding anything that it wasn’t already told by the human. What it does do is amplify the human’s intelligence. In other words, people are still driving the results, but with the assistance from the technology.

Framed as amplification rather than artificial, the process loses its mystique. We bring the real definition to light: a process that leverages human intellect to train a machine to do the resulting work more quickly. The result is the same because the human’s intelligence and input at the outset are the same, and the ability to check the work is the same. The only thing that changes is the speed with which the process can be completed.

We need to change the conversation as technology and legal service providers. We need to focus on the amplification power of AI solutions and the defensibility of relying on the human subject matter expert. Until we can show the legal community that AI is not artificial at all, we will continue to have this battle between capabilities and adoption.

I for one want to solve this problem – if only so I can finally throw out that sticky note.

Q&A with Jeanne Somma About Legalweek

On January 29, legal and IT professionals from all over the country will be heading to Legalweek, hosted by ALM.

For us at RVM, attendance at Legalweek is a must. Where else can you network, exchange ideas, and leverage the expertise of representatives from a large swath of the legal profession, including corporate counsel, law firms, corporate IT, and the myriad other professions that work together to provide eDiscovery services? We pride ourselves on delivering products and services that meet or exceed our customers’ expectations, so it is critical that we maintain an up-to-the-minute understanding of the landscape, which we can do at Legalweek.

To get a better understanding of Legalweek and why it’s so important to firms like RVM, we spoke with RVM Director of Analytics and Managed Review, Jeanne Somma.

Q: What makes Legalweek the “IT” place to be?
Legalweek is the perfect storm. It’s one of the biggest legal conferences in the U.S. and comes at the perfect time. I’m always focused on growing professionally and finding ways to grow our business come January, and LegalTech really provides the right concentration of knowledge and technology to help me chart a course for the rest of the year. It’s also a conference that, if attended correctly, lets you tailor the experience to your needs. There are so many education programs, technology demonstrations, and chances to network that it’s like a live-action choose-your-own-adventure, eDiscovery style.
Q: What are you looking forward to seeing or hearing while you’re there?
Last June I joined RVM to head the Analytics and Managed Review service lines. In that role I have been focused on making the best use of all the analytics tools and processes we have in house – especially in offering our clients what I think of as the next-generation managed review process, meant to deliver the most cost-effective and defensible experience in the market. That said, I am really looking forward to exploring what new analytics tools are out there and how analytics technology has grown since last year, so we can keep providing the most forward-thinking services to our clients. As data volumes grow and technology quickens its pace, we can’t afford to accept that we are good enough. We need to keep ourselves at the forefront, and the access to knowledge during the conference will really help achieve that.
Q: What do you see as RVM’s role at Legalweek?
RVM’s focus has always been on building relationships and providing outstanding customer service. We are excited to discuss the innovations RVM is rolling out in 2018 and really want to focus on having those discussions on a personal level. Our goal is always to give our customers an individualized and personal experience. So, while my colleagues and I are at the event we’ll be working on our connections – making time for existing relationships and making new ones – as well as improving our understanding of the issues in the market that affect our customers so we can provide more effective consultations. On the 29th (the first day of Legalweek) RVM will be hosting a private dinner with our leading corporate counsel and law firm eDiscovery clients to discuss the current state of eDiscovery and what we see happening in 2018.

Look for Jeanne and other members of the RVM team who will be on the ground at Legalweek to get their take on the show and on what 2018 holds in store for eDiscovery.

 

Key Considerations Before Migrating to Office 365

Companies of all sizes are preparing for their transition to the cloud. Office 365 (O365) will likely be a foundational part of that transition, particularly for small- and medium-sized businesses. The transition is certain: it’s no longer a question of if, but of when.

For small- to medium-sized businesses, there are a number of things to consider, from internal processes to compliance. While the benefits of migrating to the cloud may be clear – lower operational costs, simplicity, scalability, redundancy, and easy mobile access – the risks are easily overlooked. We’ve compiled a list of things to consider before making the big move to the cloud.

    1. Data Protection
      Companies with highly sensitive data naturally have heightened security needs and would be wise to consider how comfortable they are having all of their data stored on a public cloud server. While O365 is very secure – it maintains high standards for backup and encryption – migrating entirely to the cloud means entrusting your data to a third party. A solution partner like RVM can help your organization adopt best practices such as minimizing the identity information copied to the cloud, providing policy to block unauthorized access, and employing multifactor authentication and integrated device management. Industry-standard security parameters are available and can be customized to fit your organization’s requirements. Depending on the complexity of your environment, a hybrid solution may be recommended, where some mailboxes remain on premises while others move to the cloud. This allows you to test as you migrate.
    2. Compliance
      Many businesses today are bound by compliance requirements. While this may have prevented businesses from adopting the cloud in years past, it’s less of a hindrance now that the Financial Conduct Authority has approved cloud usage (including public cloud providers). That said, understanding your company’s compliance responsibilities should still be a consideration before migration. These may affect your company’s document retention and data export settings, should you need to produce documentation in response to a subpoena or compliance investigation. Your licensing package, volume of data, and software expertise all affect how efficient or inefficient this endeavor can be.
    3. Litigation Readiness
      Companies often overlook the business need to be litigation ready. They look at solutions like O365 as a means to reduce IT-related operational costs and forget that there may be an impact down the road, such as when faced with an SEC subpoena or civil litigation. When implementing O365, companies need to conduct analysis beyond email and consider additional impacts such as email archive solutions, integration with other business systems, and how to functionally accomplish data exports or other recovery tasks. Companies often realize their inability to accomplish these things too late, when they are faced with subpoenas and document requests, and end up paying a lot of money to quickly fix what they already spent a lot of money to implement.
    4. Managing Accounts
      For small- or medium-sized businesses, finding a solution to automate the cumbersome process of setting up accounts across cloud apps is crucial to success. Tools that enable provisioning of users for all services can be difficult to deploy – especially if you have custom or legacy apps that require complex configuration – but often pay off, as provisioning is typically the easiest way to add new users to Active Directory. There are a number of options for managing synchronization between Active Directory and O365, supporting third-party applications and single sign-on, and providing multiple accounts for multiple applications.
    5. Licensing
      O365 licensing includes many options, and different users will require different levels of access based on use case. A valuable asset of O365 is the ability to make the right toolsets available to the right users. The platform also enables administrators to track license consumption and availability, reducing costs and simplifying true-ups; a minimal sketch of checking consumption programmatically appears after this list.
    6. Hands off – Patch Management & Control
      Moving to O365 means giving up control over elements such as the patch management process, software upgrades, and other administrative tasks that could previously be performed on premises. Many organizations use third party utilities to manage their internal servers (Microsoft Exchange, Lync/Skype and SharePoint), but utilities designed to be installed directly on a server won’t work with O365 – as the management is done through O365’s portal. One benefit of remote management is that Microsoft pushes out environment updates regularly, meaning that users will always be running the most recent tools.
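
As one example of the administrative visibility mentioned under Licensing above, the hedged sketch below reads license consumption from the Microsoft Graph API’s subscribedSkus endpoint. Obtaining the access token (via an Azure AD app registration with the appropriate read permissions) is assumed and out of scope here.

```python
# Hedged sketch of tracking O365 license consumption via the Microsoft
# Graph API's subscribedSkus endpoint.
import requests

ACCESS_TOKEN = "eyJ..."  # placeholder; obtain via your Azure AD app registration

resp = requests.get(
    "https://graph.microsoft.com/v1.0/subscribedSkus",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()

# Each SKU reports its purchased (enabled) and consumed license counts.
for sku in resp.json()["value"]:
    total = sku["prepaidUnits"]["enabled"]
    used = sku["consumedUnits"]
    print(f"{sku['skuPartNumber']}: {used}/{total} licenses consumed")
```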

Above all, there is no one right answer for all organizations. Each should take the time to consider the factors above (and any others relevant to the company or industry) and weigh the pros and cons. Should your company elect to migrate to O365, it is critical to do so strategically, with consideration for the safety and security of your data. Hiring a company like RVM to oversee the migration can ensure a proper setup, protecting your company from threats now and in the future.