A Flexible AI Integration Pattern for Ruby on Rails

You’re likely already sprinkling AI features into your Rails applications, right? At Craftwork, we’ve added a few power features. For instance, we built an assistant for our customer service and sales messaging platform. It drafts responses to customers, using context like ongoing project details, past customer messages, image descriptions, progress reports, and FAQs, to provide (human-in-the-loop) support.

A known challenge is that providers like OpenAI and Anthropic constantly ship new models, and the dust hasn’t really settled on their APIs. We've switched between OpenAI, Claude, and others several times in the last year as newer, more capable models have been released. That churn makes a flexible approach to AI integration, one that lets you swap providers easily, crucial.

We’ve iterated a few times on a flexible integration pattern that I’m sharing here in hopes you find it useful.

Here's a roadmap of where we're headed:

  • A single Ai::Service entry point for AI actions
  • Provider-specific client adapters
  • Prompt management with ERB templates
  • Offloading AI work to background jobs
  • Future improvements and considerations
One of the nice things about the LangChain framework in Python is that it provides a suite of provider-agnostic abstractions. While I’ve experimented with Langchain.rb, I felt our use case didn’t fit well enough with the solutions available in that gem. The Ai::Service class is a single entry point for all AI-related actions. It lets us switch between default models and providers without changing the underlying API client wrappers or hunting down every call site. Instead of scattering AI-related code throughout our codebase, we centralize it within Ai::Service, making it easier to reason about, test, and maintain.
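To make the "change defaults in one place" idea concrete, here's a hypothetical refinement (not our exact code; the Service shown later uses per-method keyword defaults) where each action's default provider lives in a single frozen hash:

```ruby
module Ai
  # Hypothetical: each action's default provider lives in one place, so
  # promoting a new model or provider is a one-line change rather than a
  # codebase-wide hunt. These defaults mirror the ones used in this post.
  DEFAULT_PROVIDERS = {
    describe_image: :open_ai,
    generate_review: :open_ai,
    chat: :claude,
    transcribe: :deepgram
  }.freeze

  def self.default_provider_for(action)
    DEFAULT_PROVIDERS.fetch(action)
  end
end
```

Swapping the whole chat feature to a new provider then touches one line instead of every caller.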

To use Ai::Service, we initialize it with a provider and a user. The provider is optional; without it, we fall back to the current default provider for a given action. The user lets us generate unique, personalized prompts for employees when they talk with customers.

service = Ai::Service.new(user: current_user, provider: :open_ai)
# or
service = Ai::Service.new(user: current_user)

The Ai::Service class exposes methods for common AI tasks, like image description, text generation, and chat messages. For example, to transcribe an audio file using Deepgram, we call the transcribe method:

transcription = service.transcribe(audio_url: 'https://example.com/call-recording.mp3')

Similarly, to describe an image or generate a message for a chat conversation, we can use the describe_image or chat methods, respectively:

image_description = service.describe_image(image_url: 'https://example.com/image.png')
response = service.chat(
  system_prompt: 'You are a helpful assistant',
  messages: conversation_history
)
It's important to note that while Ai::Service provides a unified interface for AI tasks, the actual implementation details are encapsulated within provider-specific adapters.

Here’s a simplified example of an Ai::Service:

module Ai
  class Service
    attr_reader :user, :provider

    def initialize(user: nil, provider: nil)
      @user = user
      @provider = provider
    end

    def describe_image(image_url:, provider: :open_ai)
      client = llm_client(provider)
      client.describe_image(image_url: image_url)
    end

    def generate_review(project:, provider: :open_ai)
      client = llm_client(provider)
      client.generate_review(project: project)
    end

    def chat(system_prompt:, messages:, provider: :claude)
      client = llm_client(provider)
      client.chat(system_prompt: system_prompt, messages: messages)
    end

    def transcribe(audio_url:, provider: :deepgram)
      client = llm_client(provider)
      client.transcribe(audio_url: audio_url)
    end

    private

    # An instance-level provider (passed to Service.new) wins over the
    # per-action default.
    def llm_client(action_default)
      case provider || action_default
      when :open_ai
        OpenAiClient.client(user: user)
      when :claude
        ClaudeClient.client(user: user)
      when :deepgram
        DeepgramClient.client(user: user)
      when :gemini
        GeminiClient.client(user: user)
      when :ollama
        OllamaClient.client(user: user)
      when :fake
        FakeClient.new(user: user)
      else
        raise "Unknown provider: #{provider || action_default}"
      end
    end
  end
end

You’ll notice the llm_client method returns client objects like Ai::OpenAiClient and Ai::ClaudeClient. Again, each client is an adapter that wraps the provider-specific logic and communication protocols, abstracting away the complexities of interacting with the various AI providers.

The primary role of these client models is to act as a bridge between our application and the respective AI provider's API.
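Because every adapter exposes the same method names, swapping one in is plain duck typing. Here's a hypothetical sketch of the FakeClient the Service's :fake branch refers to; the canned payloads are invented for illustration, but the shapes mirror what the real adapters return:

```ruby
module Ai
  # Hypothetical sketch of a FakeClient for tests. It implements the same
  # method names as the real adapters, so Ai::Service can use it
  # interchangeably without network calls; the canned payloads below are
  # made up for illustration.
  class FakeClient
    def initialize(user: nil)
      @user = user
    end

    def describe_image(image_url:)
      { description: "A placeholder description of #{image_url}" }
    end

    def chat(system_prompt:, messages:)
      { content: [{ type: "text", text: "A placeholder reply" }] }
    end

    def transcribe(audio_url:)
      { transcript: "A placeholder transcript of #{audio_url}" }
    end
  end
end
```

Tests can then exercise jobs and controllers end-to-end with provider: :fake.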

Here's an example of the Ai::ClaudeClient. Notice the generate_review method, which generates example customer reviews unique to each project that we use to inspire customers when asking them to write one:

module Ai
  class ClaudeClient < ApplicationClient
    BASE_URI = "https://api.anthropic.com".freeze

    def generate_review(project:, prompt_manager_class: PromptManager)
      @project = project
      prompt_manager = prompt_manager_class.new(binding)
      system_prompt = prompt_manager.render("generate_review/system")
      user_prompt = prompt_manager.render("generate_review/user")

      result = chat(
        model: "claude-3-opus-20240229",
        system_prompt: system_prompt,
        messages: [{
          role: "user",
          content: user_prompt
        }],
        max_tokens: 512
      )

      {
        review: result.dig(:content, 0, :text)
      }
    end

    # Continued...
  end
end

In this example, the generate_review method uses the client's chat method, which wraps the Claude API, to generate example reviews based on the system prompt, user prompt, and project information. If you’re wondering about PromptManager, sit tight; I’ll explain briefly!
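If you're curious what that chat call might look like underneath, here's a rough sketch against Anthropic's Messages API. Everything here is hypothetical (our real adapters inherit shared plumbing from ApplicationClient), and retries, timeouts, and error handling are elided:

```ruby
require "net/http"
require "json"

module Ai
  # Rough, hypothetical sketch of the HTTP layer a Claude adapter might
  # sit on top of. Endpoint, headers, and payload shape follow Anthropic's
  # Messages API; everything else is simplified for illustration.
  class ClaudeHttp
    BASE_URI = "https://api.anthropic.com".freeze
    API_VERSION = "2023-06-01".freeze

    def initialize(api_key: ENV["ANTHROPIC_API_KEY"])
      @api_key = api_key
    end

    def chat(model:, system_prompt:, messages:, max_tokens: 512)
      body = chat_body(model: model, system_prompt: system_prompt,
                       messages: messages, max_tokens: max_tokens)
      response = Net::HTTP.post(URI("#{BASE_URI}/v1/messages"), body.to_json, headers)
      JSON.parse(response.body, symbolize_names: true)
    end

    # Split out so the payload shape is testable without a network call.
    def chat_body(model:, system_prompt:, messages:, max_tokens:)
      { model: model, system: system_prompt, messages: messages, max_tokens: max_tokens }
    end

    private

    def headers
      {
        "x-api-key" => @api_key.to_s,
        "anthropic-version" => API_VERSION,
        "content-type" => "application/json"
      }
    end
  end
end
```

Note that Anthropic takes the system prompt as a top-level system field rather than as a message, which is exactly the kind of provider quirk these adapters hide.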

Prompt Management

Prompt engineering was hyped about a year ago.

The quality and structure of prompts can significantly impact performance and accuracy.

The Ai::PromptManager class and Ai::Promptable module are little utilities for rendering prompts from template files, similar to how Rails renders application views. I wanted to maintain the high-level structure of prompts while easily injecting context from ActiveRecord data.

Prompts are stored as ERB (Embedded Ruby) templates, one per task and role. For example, the system and user prompts for the "generate_review" task live in templates named generate_review/system and generate_review/user.
Here's an example of how the Ai::PromptManager can be used to render a prompt from a template file:

module Ai
  class ClaudeClient < ApplicationClient
    def generate_review(project:, prompt_manager_class: PromptManager)
      @project = project
      prompt_manager = prompt_manager_class.new(binding)
      system_prompt = prompt_manager.render("generate_review/system")
      user_prompt = prompt_manager.render("generate_review/user")

      # ... (rest of the method implementation)
    end
  end
end
In this example, the generate_review method initializes an instance of the PromptManager and uses the render method to retrieve the system and user prompts from their respective template files. The binding passed to the PromptManager allows for an experience similar to working with Rails views, where methods and instance variables are available in the prompts.
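Here's a minimal sketch of what such a PromptManager could look like. The on-disk layout (prompts directory, ".erb" suffix) is an assumption, and the real class presumably also handles partials:

```ruby
require "erb"

module Ai
  # Minimal hypothetical PromptManager: renders ERB templates using the
  # caller's binding, so the caller's instance variables and methods are
  # available inside the template, much like a Rails view.
  class PromptManager
    def initialize(caller_binding, prompts_dir: File.expand_path("prompts", __dir__))
      @caller_binding = caller_binding
      @prompts_dir = prompts_dir
    end

    def render(template_name)
      template = File.read(File.join(@prompts_dir, "#{template_name}.erb"))
      ERB.new(template, trim_mode: "-").result(@caller_binding)
    end
  end
end
```

The trim_mode: "-" option lets templates use <%- and -%> tags to suppress stray newlines around control flow, which keeps the rendered prompts tidy.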

Here’s an example of a prompt for generating customer message responses. Notice how you can write it much like a Rails view, but we’re emitting text instead of HTML:

<%= preamble %>

Frequently asked questions:

<% Faq.kept.each do |faq| %>
Question: <%= faq.question %>
Answer: <%= faq.answer %>
<% end %>

Customer Information:

<%= render "customer", partial: true, locals: { customer: conversation.account } %>

<% if progress_reports.any? %>
Today's date is <%= Date.today.strftime("%A, %B %d, %Y") %>.

The paint crew submits daily progress reports from the field at the end of each day's work.

Include details from the most recent progress report to keep the customer informed and engaged if possible.

<% progress_reports.each do |report| %>
Report about project "<%= report.project.title %>" from <%= report.created_at.strftime("%A, %B %d, %Y") %>:

<%= report.report_display %>
<% end %>
<% end %>
<%# continued... %>

Integrating with Background Jobs

Many tasks, like image analysis or text generation, introduce latency. To ensure a relatively smooth experience, I like to offload these tasks to background jobs.

Here's an example of how we might use the Ai::Service within a background job to generate image descriptions:

class AddImageDescriptionJob
  include Sidekiq::Worker
  sidekiq_options queue: :low, retry: false

  def perform(blob_id)
    blob = ActiveStorage::Blob.find(blob_id)
    return unless blob.image?
    return if blob.image_data_record.present?

    response = Ai::Service.new.describe_image(image_url: blob.url)
    description = response[:description]

    ImageDataRecord.create!(
      description: description,
      blob_id: blob_id,
      human_reviewed: false
    )
  end
end

Future Improvements and Considerations

While our pattern provides a solid foundation for sprinkling in AI features, there’s plenty of room to grow. Here are a few improvements I’d love to add:

  • A/B testing and experimentation for prompts
  • Function calling and tools (e.g. fetch the price from our model for “painting the walls in a large room”)
  • Provider-specific prompts
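To sketch the last idea, provider-specific prompts could be as simple as a resolver that prefers a provider-suffixed template and falls back to the shared one. This is entirely hypothetical, and the file-naming convention is invented:

```ruby
module Ai
  # Hypothetical resolver for provider-specific prompts: given a template
  # name, prefer "name.<provider>.erb" if it exists, otherwise fall back
  # to the shared "name.erb". The naming convention is an assumption.
  class PromptResolver
    def initialize(prompts_dir)
      @prompts_dir = prompts_dir
    end

    def resolve(name, provider:)
      ["#{name}.#{provider}.erb", "#{name}.erb"]
        .map { |file| File.join(@prompts_dir, file) }
        .find { |path| File.exist?(path) }
    end
  end
end
```

A PromptManager could call this before reading the template, so tuning a prompt for Claude wouldn't disturb the OpenAI version.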

With its modular, decoupled architecture, this pattern lets you take advantage of different providers while staying flexible as the landscape shifts.

While our flexible AI integration pattern provides a solid foundation for adding bits of AI to a web application, I’m not sold on it as the best approach for building a full-on agent or a product where AI is the core of the business.

I’d love to hear about your approach! How are you solving these problems?