
Now Assist Series: Connecting GenAI to External LLMs

Effortlessly Integrate External LLMs into ServiceNow
5 min read | by Dor Vaknin | December 12, 2024

ServiceNow’s Now Assist brings the power of GenAI to enterprise workflows, enabling seamless automation, enhanced user assistance, and contextual insights. Designed to simplify complex tasks and improve overall efficiency, Now Assist is an invaluable tool for organizations aiming to deliver exceptional customer and employee experiences.

A Large Language Model (LLM) is a type of AI model trained on extensive datasets to understand and generate human-like language, making it well suited to tasks such as summarization, question answering, and content creation.

By default, Now Assist uses ServiceNow’s proprietary NowLLM, which is tailored for ServiceNow applications and provides excellent performance for most workflows. However, there are several scenarios where using an external LLM might be beneficial. External models, like OpenAI’s GPT or Google Bard/Gemini, often offer advanced capabilities such as better natural language understanding, multilingual support, or domain-specific expertise. They may also align better with an organization’s specific compliance requirements or offer more cost-efficient options for high-volume tasks. Some businesses might also require the ability to customize and fine-tune their AI models, which certain external LLMs or custom-built models can provide.

Exploring LLM Options and Their Strengths

There are several excellent LLM options to choose from. Depending on your business needs, you might consider one of the following:

  • OpenAI (GPT): Versatile for natural language tasks, ideal for creative and conversational AI.
  • Azure OpenAI: Enterprise-grade security with OpenAI models, great for compliance-focused organizations.
  • Amazon Bedrock: Access to multiple LLM providers with flexibility in integration.
  • Google Bard/Gemini (PaLM): Strong multilingual capabilities and advanced reasoning.
  • IBM WatsonX: Focused on explainability and privacy, suitable for regulated industries.
  • Aleph Alpha: Excels in contextual reasoning and supports European data sovereignty.

Building a custom LLM is another option if none of these meet your needs. This allows you to train the model on proprietary data, ensuring it aligns perfectly with your organization’s language, processes, and compliance requirements. While more resource-intensive, a custom LLM provides unmatched flexibility and domain-specific expertise.

How to Configure an External LLM in Now Assist

ServiceNow provides two primary methods to connect external LLMs using the Generative AI Controller:

  • The Pre-Built Spokes offer several connections to popular AI service providers, such as Azure OpenAI and Google Vertex AI. This is ideal for general-purpose use cases or when a proven LLM is sufficient for your workflows.
  • The Generic LLM Connector covers more specific requirements and is particularly useful when you need an industry-specific or custom LLM.

Steps to configure an external LLM with a Pre-Built Spoke

  1. Obtain the API credentials (API key, endpoint, etc.) for the external LLM provider. A quick way to verify these credentials is sketched after these steps.
  2. Configure Provider Connection:
    • Navigate to Connections & Credential Aliases and open an existing GenAI provider Alias.
    • Select the Create New Connection & Credential related link and insert your API Key to create a new Connection record.
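
Before wiring the key into a Connection record, it can be worth confirming that the endpoint and API key actually respond. The snippet below is a minimal sketch you could run from Scripts - Background using ServiceNow’s RESTMessageV2 API; the endpoint URL, authorization header, and payload shape are placeholders for whatever your provider documents, not part of the Now Assist configuration itself.

// Minimal sketch (assumed values): verify an external LLM endpoint and API key
// before configuring the spoke. Run from Scripts - Background.
var request = new sn_ws.RESTMessageV2();
request.setEndpoint('https://api.example-llm.com/v1/completions'); // hypothetical provider endpoint
request.setHttpMethod('post');
request.setRequestHeader('Content-Type', 'application/json');
request.setRequestHeader('Authorization', 'Bearer YOUR_API_KEY'); // auth scheme varies by provider
request.setRequestBody(JSON.stringify({
    prompt: 'Reply with the word OK.', // request body shape varies by provider
    max_tokens: 5
}));
var response = request.execute();
gs.info('Status: ' + response.getStatusCode());
gs.info('Body: ' + response.getBody());

A 200 status with a sensible completion in the body means the credentials are ready to store in the Credential record.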

Steps to configure the Generic LLM Connector

  1. Obtain the API credentials (API key, endpoint, etc.) for the external LLM provider.
  2. Configure Provider Connection:
    • Configure an API Key Credentials record
    • Configure a new Connections & Credential Aliases record
    • In the new Connections & Credential Aliases record, create a new Connection record under the Connections related list
      1. Choose the Credentials record created above
      2. Enter the LLM API endpoint in the Connection URL field
  3. Create Model Configuration:
    • Navigate to sys_generative_ai_model_config
    • Create a new Generative AI Model Configuration
    • Fill out the Model, Provider, and Connection and Credentials Alias fields
  4. Create Generative AI Configuration:
    • Navigate to sys_generative_ai_config
    • Create a new Generative AI Configuration
    • Choose the capability via the Definition field
    • Choose the previously created Model and fill out your Prompt Template. You can review the attributes available for the capability and reference them enclosed in double braces, e.g. {{capabilityAttribute}} (a small substitution example follows after these steps)
  5. Create a Generative AI Custom LLM Transformer (sys_generative_ai_custom_llm_transformer)
    • Use the provided code template and comments to write your transformer (a rough sketch is shown after these steps)
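
To make the {{...}} syntax in the Prompt Template concrete, here is a purely illustrative sketch of the substitution idea: placeholders in the template resolve against the capability’s attributes. The function, attribute names, and values below are made up for the example; the platform performs the real substitution for you.

// Illustrative only: how {{attribute}} placeholders in a prompt template resolve.
// The attribute names below are invented for the example.
function renderPromptTemplate(template, attributes) {
    return template.replace(/\{\{(\w+)\}\}/g, function (match, name) {
        return attributes.hasOwnProperty(name) ? attributes[name] : match;
    });
}

var rendered = renderPromptTemplate(
    'Summarize the following record for an agent:\n' +
    'Short description: {{shortDescription}}\n' +
    'Latest comments: {{comments}}',
    {
        shortDescription: 'VPN drops every 30 minutes',
        comments: 'Issue started after the latest client update.'
    }
);
// rendered now holds the prompt text that would be sent to the model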
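
Finally, a rough sketch of the kind of logic a custom LLM transformer contains: mapping the request produced by the Generative AI Controller into the provider’s payload, and mapping the provider’s response back into plain text for the capability. The function names, payload fields, and response path below are assumptions for illustration only; the real entry points and available variables come from the code template ServiceNow provides on the transformer record, so follow its comments.

// Sketch of a custom LLM transformer. Function names, payload fields, and the
// response path (choices[0].text) are hypothetical -- adapt them to the
// structure in ServiceNow's provided template and your provider's API docs.

// Request transformer: generic prompt -> provider-specific JSON body
function transformRequest(prompt) {
    return JSON.stringify({
        model: 'my-custom-model',   // assumed provider parameter
        prompt: prompt,
        max_tokens: 512,
        temperature: 0.2
    });
}

// Response transformer: provider JSON -> plain text for the capability
function transformResponse(responseBody) {
    var parsed = JSON.parse(responseBody);
    // Assumed response shape; many providers nest the completion differently
    if (parsed && parsed.choices && parsed.choices.length > 0) {
        return parsed.choices[0].text;
    }
    return '';
}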

Now Assist’s integration with external LLMs makes it a powerful tool for enhancing intelligence within the ServiceNow platform. By selecting the right LLM to meet your unique needs, you can maximize the value of your AI investments. With the steps outlined above and the right strategic approach, you can unlock the full potential of Now Assist and take your ServiceNow workflows to the next level. Remember, the AI industry is rapidly evolving, so it’s essential to stay current and periodically re-evaluate your LLM choices as new advancements emerge.

Interested in learning more? Check out the other posts in our Now Assist series, or reach out to chat@rapdev.io

Written by
Dor Vaknin
Born and raised by the Mediterranean Sea. He is a curious engineer experienced in ServiceNow and external integrations. Passionate about all things tech, exploring cultures and outdoor adventures.