Today marks the release of OmniAI 2.0, a significant upgrade to our Ruby library that standardizes interactions with various LLM providers. Whether you're working with Anthropic, DeepSeek, Google, Mistral, or OpenAI, OmniAI offers a consistent interface that simplifies integrating AI into your Ruby applications.
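Swapping providers is as simple as swapping the client. A minimal sketch, assuming API keys are configured (e.g. via environment variables):

require 'omniai-anthropic'
require 'omniai-openai'

# The same chat interface works across providers:
OmniAI::Anthropic::Client.new.chat("Hello!").text
OmniAI::OpenAI::Client.new.chat("Hello!").text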
Prior to OmniAI 2.0, using tool calls while streaming was not possible. With this release, OmniAI automatically handles both tool-call parsing and processing through a straightforward API:
require 'omniai-anthropic'

weather_tool = OmniAI::Tool.new(
  proc { |location:, unit: "celsius"| "#{rand(20..50)}° #{unit} in #{location}" },
  name: "Weather",
  description: "Lookup the weather at a location in either celsius or fahrenheit",
  parameters: OmniAI::Tool::Parameters.new(
    properties: {
      location: OmniAI::Tool::Property.string(description: "e.g. Toronto"),
      unit: OmniAI::Tool::Property.string(enum: %w[celsius fahrenheit]),
    },
    required: %i[location]
  )
)

omniai = OmniAI::Anthropic::Client.new

omniai.chat(tools: [weather_tool], stream: $stdout) do |prompt|
  prompt.system "You are an expert in weather."
  prompt.user 'What is the weather in "London" in celsius and "Madrid" in fahrenheit?'
end
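Tools also continue to work without streaming. A brief sketch reusing the client and tool defined above:

# Non-streaming tool use: the response is returned once complete.
response = omniai.chat(tools: [weather_tool]) do |prompt|
  prompt.user 'What is the weather in "Toronto" in celsius?'
end
puts response.text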
Prior to OmniAI 2.0, developers working with streaming responses were required to manually aggregate chunks to construct the complete response. OmniAI 2.0 eliminates this burden by automatically handling response composition:
require 'omniai-openai'

client = OmniAI::OpenAI::Client.new

stream = ->(delta) { print(delta.text) } # called with chunks of text as generated by the LLM

response = client.chat(stream:) do |prompt|
  prompt.system("You are a helpful biologist with expertise in animals that responds with Latin names.")
  prompt.user do |message|
    message.text("What animals are in the attached photos?")
    message.url("https://images.unsplash.com/photo-1472491235688-bdc81a63246e?q=80&w=1024&h=1024&fit=crop&fm=jpg", "image/jpeg")
    message.url("https://images.unsplash.com/photo-1517849845537-4d257902454a?q=80&w=1024&h=1024&fit=crop&fm=jpg", "image/jpeg")
  end
end
response.text # the complete composed text; the response object also carries usage statistics and metadata
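That metadata is available on the same composed response. A brief sketch, assuming the response exposes a usage object with input_tokens, output_tokens, and total_tokens accessors:

# Inspect token usage on the composed response (accessor names assumed):
usage = response.usage
puts "input tokens: #{usage.input_tokens}"
puts "output tokens: #{usage.output_tokens}"
puts "total tokens: #{usage.total_tokens}"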
Lastly, OmniAI 2.0 introduces a new library integrating with DeepSeek, a recently popular LLM provider. With full support for the DeepSeek API, developers can now leverage a significantly cheaper LLM option:
require 'omniai-deepseek'
client = OmniAI::DeepSeek::Client.new
response = client.chat("What is the capital of Canada?")
response.text # => "The capital of Canada is Ottawa."
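The client can also be configured explicitly rather than relying on environment variables. A sketch, assuming omniai-deepseek follows the same configuration pattern as the other OmniAI provider gems:

# Explicit configuration (pattern assumed from the other OmniAI provider gems; the key is a placeholder):
OmniAI::DeepSeek.configure do |config|
  config.api_key = 'ds-...'
end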
For documentation and examples, be sure to check out our GitHub repositories: