Kevin Sylvestre

Introducing OmniAI

Exploring a variety of LLMs over the past few months has made it clear that each has interesting strengths and weaknesses. OpenAI offers a solid foundation but may not always be the right choice on cost or quality. Sometimes Mistral’s Le Chat or Anthropic’s Claude is the better fit. Sometimes the huge context window of Google’s Gemini is needed. Each LLM exposes a slightly different API and feature set, making it difficult to swap providers and compare results. Tools like LangChain gave Python developers the ability to experiment through a consistent API. OmniAI aims to offer a similar benefit to Rubyists.

Installation

Installing omniai works the same as installing any other gem:

gem install omniai

Provider-specific implementations (OpenAI / Google / Mistral / Anthropic) are packaged as separate gems:

gem install omniai-anthropic
gem install omniai-google
gem install omniai-mistral
gem install omniai-openai

For projects using Bundler, add the gems to the Gemfile instead, as sketched below.
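A minimal Gemfile sketch using the gem names from the install commands above; only the providers a project actually needs have to be listed:

source 'https://rubygems.org'

gem 'omniai'
gem 'omniai-openai' # or omniai-anthropic / omniai-google / omniai-mistral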

Configuration

Each OmniAI library is configured by setting an environment variable containing an API key. Using OpenAI? Set ENV['OPENAI_API_KEY']. Need Google? Set ENV['GOOGLE_API_KEY']. If required, each library is also configurable using a block syntax. For example, with Mistral:

require 'omniai/mistral'

OmniAI::Mistral.configure do |config|
  config.api_key = '...'
end
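With the key in the environment instead, no explicit configuration is needed. A minimal sketch, assuming the Mistral variable follows the same naming pattern as the OpenAI and Google examples above:

require 'omniai/mistral'

# Assumes ENV['MISTRAL_API_KEY'] is exported in the shell, mirroring OPENAI_API_KEY / GOOGLE_API_KEY.
client = OmniAI::Mistral::Client.new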

Usage

OmniAI provides a consistent API for requesting a chat completion. For example, with OpenAI’s ChatGPT:

require 'omniai/openai'

client = OmniAI::OpenAI::Client.new
completion = client.chat("What is the capital of Canada?")
completion.choice.message.content # "The capital of Canada is Ottawa."
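An API key can presumably also be passed directly when constructing a client. A minimal sketch, assuming the api_key keyword mirrors the config.api_key setting from the block syntax above:

require 'omniai/openai'

# The api_key keyword is an assumption mirroring config.api_key from the block configuration.
client = OmniAI::OpenAI::Client.new(api_key: '...')
completion = client.chat("What is the capital of Canada?")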

OmniAI also supports streaming for real-time responses. For example, with Google’s Gemini:

require 'omniai/google'

client = OmniAI::Google::Client.new
stream = proc do |chunk|
  print(chunk.choice.delta.content) # 'A', 'penny', 'saved', ...
end
client.chat("What is a great proverb for saving?", stream:)

OmniAI also covers more complex integrations, with combinations of system and user messages, custom models, custom temperatures, and more. For example, with Anthropic’s Claude:

require 'omniai/anthropic'

client = OmniAI::Anthropic::Client.new
completion = client.chat([
  { role: OmniAI::Chat::Role::SYSTEM, content: 'Only respond in rhyme.' },
  { role: OmniAI::Chat::Role::USER, content: 'How do you eat an elephant?' },
], model: OmniAI::Anthropic::Chat::Model::CLAUDE_OPUS, temperature: 0.9)
completion.choice.message.content # 'To devour an elephant my friend. One bite at a time is the trend.'
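Because every provider exposes the same chat interface, comparing models side by side becomes a loop. A minimal sketch using only the clients and accessors shown above:

require 'omniai/openai'
require 'omniai/anthropic'

PROMPT = 'What is the capital of Canada?'

# Each client responds to the same #chat interface, so providers are interchangeable.
[OmniAI::OpenAI::Client.new, OmniAI::Anthropic::Client.new].each do |client|
  completion = client.chat(PROMPT)
  puts completion.choice.message.content
end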

For more examples, check out the various projects (omniai, omniai-anthropic, omniai-google, omniai-mistral, and omniai-openai) on GitHub.

This article originally appeared on https://workflow.ing/blog/articles/introducing-omniai.