
Key Questions for Evaluating Legal Tech CLM Vendors

By Steve Nunes, Product Marketing Manager

Choosing the right solution and partner for contract lifecycle management (CLM) is crucial.

That's especially true as vendors are making artificial intelligence (AI) claims – and right now, there are a lot of CLM providers doing exactly that.

In this post, we’ll share some of the questions you should ask during procurement so you can make an informed choice and future-proof your contract AI investment.

Evaluating CLM vendors

An in-depth assessment of CLM legal tech vendors is paramount for legal departments to ensure they partner with contract AI solutions providers that align with their specific needs and standards. If those vendors are promoting AI claims about their software, it becomes even more important to ask the right questions.

That’s not just because any CLM solution that makes AI claims should measure up to its promises. It’s because contract AI solutions need access to contract data – your contract data – to perform properly. And that contract data is sensitive and valuable.

Misuse of that data – whether intentional or unintentional – is a violation of trust and could have significant consequences. Poor vendor transparency about data privacy and security also makes it difficult for buyers to comply with their internal data governance policies. The bottom line: any contract AI provider that might wind up touching your corporate contract data should go through intense vetting, so you can be certain every possible safeguard is in place.

Another concern? AI washing is too common in the legal tech segment, as vendors make inflated or misleading claims about the AI capabilities of their products. An unwary purchaser may be stuck with a “solution” that doesn’t live up to its hype and proves to be a poor investment.

What are some of the elements to include when evaluating CLM vendors?

Company background and expertise

When assessing a vendor's background, key factors include how long it has been operating and its legal tech expertise. Evaluate whether the vendor has a proven track record backed by testimonials or case studies from prior customers using its contract AI solutions.

  • Time in business: This is a clue about vendor stability and industry experience.
  • Expertise in legal tech: Deeper specialization suggests a stronger grasp of legal complexities.
  • Expertise in AI: Are they an innovator and expert? Have they built their own AI, or are they just putting a GPT wrapper around a third-party tool?

Product features and capabilities

Product features play a pivotal role. Examine the software’s ability to address specific business and legal challenges, and look for scalability and customization options.

  • Specific business and legal challenges: Does the solution meet your organization’s unique requirements?
  • Scalability: Can it grow (or otherwise flex) with your organization’s evolving needs?

Security and compliance standards

Cybersecurity and data protection? They’re non-negotiable. Legal tech vendors must adhere to industry compliance standards and have robust security protocols to protect sensitive data.

  • Security protocols: Measures like encryption and de-identification.
  • Compliance: Vendor should adhere to stringent data privacy and security frameworks such as ISO 27001 and 27701.

Integration with current systems

Integration capabilities are essential to avoid disruptions. The vendor’s contract AI solutions should seamlessly integrate with your organization's existing software and databases.

  • Software compatibility: Check for compatibility with current tech stack.
  • Database connectivity: Ability to connect with existing databases.

Customer support and training

Post-deployment support is key to a successful implementation. Evaluate the level of customer support and the availability of training materials to ensure efficient product adoption.

  • Support availability: Types of support offered (e.g., 24/7, live chat, phone, customer success managers).
  • Training resources: Access to guides, webinars, and self-service training.

Pricing and payment structure

Organizations should request a detailed pricing structure, clarifying upfront costs, recurring payments, and any potential hidden fees. They need to understand the total cost of ownership and assess it against their budget.

  • Upfront costs
  • Recurring payments
  • Hidden fees (if any)
  • Training and support costs

Implementation timeline and process

A clear timeline is vital for seamless implementation. The vendor should provide a step-by-step process outline, specifying responsible parties for each phase.

  • Preparation: Gathering requirements, setting a data migration strategy.
  • Deployment: Planning and design, workflow-building, and integration with existing systems.
  • Training: Scheduling sessions for end-users.
  • Go-live: Actual software activation and rollout.

Post-implementation review

After implementation, your team should critically assess the performance of the contract AI solution against your initial objectives. Repeat this review at regular intervals to ensure ongoing value.

  • Immediate performance check: Within the first month after your implementation.
  • Regular interval reviews: Quarterly or semi-annual reviews to measure continuous improvement.

Protect The Business: 13 Questions to Ask Your Legal AI Vendor

Watch the on-demand demo →

Vendor evaluation questions about AI responsibility

When evaluating CLM vendors that make AI claims, here are the key questions to ask about data privacy, security, and IP protection.

Question 1: Have you built a proprietary AI and large language model (LLM) specifically for contracts, or do you rely on third-party LLM providers like OpenAI?

Why is this important? CLMs that offer a complete suite of proprietary, purpose-built AI — rather than merely building a “wrapper” around third-party AI tech — deliver critical advantages:

  • Accuracy: A contract-specific LLM can outperform generic LLMs, and the vendor can audit and improve its accuracy over time. Third-party AI is a “black box” that can’t be readily improved or audited by a vendor.
  • Security: Contract-specific LLMs reduce the need to send customer data outside of a secure, closed system, unlike vendors using third-party LLMs that share customer data among external systems and companies.
  • Control: Contract-specific LLMs let vendors keep delivering updates and innovations with control over scale, performance, and cost. CLMs using third-party LLMs are inherently built on less predictable foundations.

Did you know? Third-party LLMs are subject to restrictions – such as context-window and rate limits – on how much data they can process.
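
To put that in perspective: one common restriction is the model’s context window, which caps how much contract text can be processed in a single request. Below is a rough, illustrative way to check whether a long contract fits within an assumed context window, using the open-source tiktoken tokenizer – the encoding name and token limit are placeholders, not any vendor’s actual limits.

  # Illustrative only: estimate whether a contract's text fits an assumed context window.
  import tiktoken  # open-source tokenizer library

  ASSUMED_CONTEXT_WINDOW = 128_000  # placeholder limit, not a specific model's spec

  def fits_in_context(contract_text: str) -> bool:
      encoding = tiktoken.get_encoding("cl100k_base")  # a common general-purpose encoding
      token_count = len(encoding.encode(contract_text))
      print(f"Contract is roughly {token_count:,} tokens")
      return token_count <= ASSUMED_CONTEXT_WINDOW

  if __name__ == "__main__":
      with open("sample_contract.txt", encoding="utf-8") as f:
          print("Fits in one request:", fits_in_context(f.read()))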

Question 2: If you use a third-party LLM, which one? How long can that provider store our data? Can their staff access our data? Will our data be used to train AI models or for any purpose aside from provision of services?

Why is this important? Working with a CLM vendor that has a no-data-storage agreement with its third-party LLM provider can limit your exposure to third-party risk and prevent the provider from using your data to train publicly available models.

Question 3: If you use OpenAI, are you licensing that technology directly through OpenAI or via the Microsoft Azure OpenAI Service?

Why is this important? Using contract AI solutions through Microsoft’s Azure OpenAI service provides enterprise-grade security safeguards that are not available from OpenAI. Gartner reports that the Azure OpenAI Service is “significantly more enterprise-ready” in terms of risk and security than OpenAI.
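
For illustration, the distinction often comes down to where the model endpoint lives and whose governance applies to it. Below is a minimal sketch assuming the openai Python SDK (v1+); the endpoint, deployment name, and API version are placeholders rather than any particular vendor’s configuration.

  # Minimal sketch: the same SDK can call OpenAI directly or an Azure OpenAI deployment
  # that lives inside an Azure tenant. All values below are placeholders.
  import os
  from openai import OpenAI, AzureOpenAI

  # Direct OpenAI: requests go to OpenAI's public API endpoint.
  direct_client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

  # Azure OpenAI Service: requests go to a deployment inside an Azure resource,
  # governed by that tenant's enterprise security and compliance controls.
  azure_client = AzureOpenAI(
      api_key=os.environ["AZURE_OPENAI_API_KEY"],
      api_version="2024-02-01",  # placeholder API version
      azure_endpoint="https://example-resource.openai.azure.com",  # placeholder endpoint
  )

  response = azure_client.chat.completions.create(
      model="example-deployment",  # Azure uses a deployment name rather than a raw model name
      messages=[{"role": "user", "content": "Summarize the termination clause in plain English."}],
  )
  print(response.choices[0].message.content)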

Vendor evaluation questions about AI foundations

These questions for evaluating CLM vendors pertain to matters of scalability and implementation.

Question 4: Does your LLM use AI-augmented OCR to extract data from documents with non-standard text or aberrations? Can it accurately extract data if there are watermarks, blurry scans, handwritten text, creases, smudges, text from tables, and/or text with non-Latin characters?

Why is this important? Make sure a potential vendor’s CLM tool can ingest all of your contracts, so you avoid manual, fragmented contract imports. Successful processing of contract text is crucial for AI performance at every stage of the contract lifecycle.
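
One way to test this during an evaluation is to assemble a sample of your hardest scans, ask the vendor to run them through its ingestion pipeline, and spot-check the exported text for key terms you already know each document contains. The sketch below is a rough due-diligence aid of our own, not a vendor feature, and the file layout and names are assumptions.

  # Rough due-diligence aid: flag documents where the vendor's exported text is
  # missing terms you know appear in the original. Paths and file names are assumptions.
  import csv
  from pathlib import Path

  def spot_check(extracted_dir: str, expectations_csv: str) -> None:
      """Each row of expectations_csv: filename of exported text, expected term."""
      misses = []
      with open(expectations_csv, newline="", encoding="utf-8") as f:
          for filename, term in csv.reader(f):
              text = Path(extracted_dir, filename).read_text(encoding="utf-8", errors="ignore")
              if term.lower() not in text.lower():
                  misses.append((filename, term))
      print(f"{len(misses)} missing term(s) across the sample")
      for filename, term in misses:
          print(f"  {filename}: expected term not found: {term}")

  if __name__ == "__main__":
      spot_check("ocr_output", "expected_terms.csv")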

Question 5: Does your AI require “human in the loop” review and validation during contract ingestion?

Why is this important? Contract AI solutions that require a “human in the loop” demand additional time and resources during ingestion. For the fastest time-to-value, consider CLM products with pre-trained AI and self-sufficient ingestion and processing capabilities.

Question 6: If your product has generative AI features, what controls do you have to address scalability (latency, rate limits, costs, etc.)?

Why is this important? It is crucial to ensure that your CLM provider has controls in place so its generative AI features can operate at scale without issues such as latency or inflated costs as adoption and innovation continue to grow.
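
As a simple illustration of what one such control can look like, here is a generic client-side pattern: retrying a rate-limited request with exponential backoff and jitter so spikes in usage degrade gracefully instead of failing or inflating costs. The call_model function is a hypothetical stand-in, not any particular vendor’s API.

  # Generic illustration of a scalability control: exponential backoff with jitter
  # around a rate-limited generative AI call. call_model is a hypothetical stand-in.
  import random
  import time

  class RateLimitedError(Exception):
      """Raised when the backend signals that the caller is being rate limited."""

  def call_model(prompt: str) -> str:
      raise NotImplementedError("stand-in for a real model endpoint")

  def call_with_backoff(prompt: str, max_retries: int = 5) -> str:
      delay = 1.0
      for _ in range(max_retries):
          try:
              return call_model(prompt)
          except RateLimitedError:
              # Sleep with jitter so many concurrent requests don't retry in lockstep.
              time.sleep(delay + random.uniform(0, delay))
              delay *= 2  # double the wait after each rate-limited attempt
      raise RuntimeError("still rate limited after the maximum number of retries")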

In the next post in this series, we’ll cover more questions to ask when evaluating CLM vendors that make AI claims. Or watch our on-demand demo on the same topic.

See Evisort in action!

Test Evisort on your own contracts to see how you can save time, reduce risk, and accelerate business.

Get a demo