ruby-openai-3.7.0/

ruby-openai-3.7.0/.gitignore

/.bundle/
/.yardoc
/_yardoc/
/coverage/
/doc/
/pkg/
/spec/reports/
/tmp/

# rspec failure tracking
.rspec_status

.byebug_history
.env
*.gem

ruby-openai-3.7.0/Rakefile

require "bundler/gem_tasks"
require "rspec/core/rake_task"

RSpec::Core::RakeTask.new(:spec)

task default: :spec

ruby-openai-3.7.0/CONTRIBUTING.md

## Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/alexrudall/ruby-openai. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [code of conduct](https://github.com/alexrudall/ruby-openai/blob/main/CODE_OF_CONDUCT.md).

ruby-openai-3.7.0/.github/dependabot.yml

version: 2
updates:
  - package-ecosystem: bundler
    directory: "/"
    schedule:
      interval: daily
    open-pull-requests-limit: 10
    ignore:
      - dependency-name: webmock
        versions:
          - 3.11.1
          - 3.11.3
      - dependency-name: rspec
        versions:
          - 3.10.0

ruby-openai-3.7.0/.github/ISSUE_TEMPLATE/bug_report.md

---
name: Bug report
about: Create a report to help us improve
title: ''
labels: ''
assignees: ''
---

**Describe the bug**
A clear and concise description of what the bug is.

**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error

**Expected behavior**
A clear and concise description of what you expected to happen.

**Screenshots**
If applicable, add screenshots to help explain your problem.

**Desktop (please complete the following information):**
 - OS: [e.g. iOS]
 - Browser [e.g. chrome, safari]
 - Version [e.g. 22]

**Smartphone (please complete the following information):**
 - Device: [e.g. iPhone6]
 - OS: [e.g. iOS8.1]
 - Browser [e.g. stock browser, safari]
 - Version [e.g. 22]

**Additional context**
Add any other context about the problem here.

ruby-openai-3.7.0/.github/ISSUE_TEMPLATE/feature_request.md

---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: ''
assignees: ''
---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Add any other context or screenshots about the feature request here.

ruby-openai-3.7.0/bin/console

#!/usr/bin/env ruby

require "bundler/setup"
require "openai"

# You can add fixtures and/or initialization code here to make experimenting
# with your gem easier. You can also use a different console, if you like.

# (If you use this, don't forget to add pry to your Gemfile!)
# require "pry"
# Pry.start

require "irb"
IRB.start(__FILE__)

ruby-openai-3.7.0/bin/setup

#!/usr/bin/env bash
set -euo pipefail
IFS=$'\n\t'
set -vx

bundle install

# Do any other automated setup that you need to do here

ruby-openai-3.7.0/CHANGELOG.md

# Changelog

All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [3.7.0] - 2023-03-25

### Added

- Allow the usage of proxy base URIs like https://www.helicone.ai/. Thanks to [@mmmaia](https://github.com/mmmaia) for the PR!

## [3.6.0] - 2023-03-22

### Added

- Add much-needed ability to increase the HTTParty timeout, and set the default to 120 seconds. Thanks to [@mbackermann](https://github.com/mbackermann) for the PR and to everyone who requested this!

## [3.5.0] - 2023-03-02

### Added

- Add Client#transcribe and Client#translate endpoints - Whisper over the wire! Thanks to [@Clemalfroy](https://github.com/Clemalfroy)

## [3.4.0] - 2023-03-01

### Added

- Add Client#chat endpoint - ChatGPT over the wire!

## [3.3.0] - 2023-02-15

### Changed

- Replace the ::Ruby::OpenAI namespace with ::OpenAI - thanks [@kmcphillips](https://github.com/kmcphillips) for this work!
- To upgrade, change `require 'ruby/openai'` to `require 'openai'` and change all references to `Ruby::OpenAI` to `OpenAI`.

## [3.2.0] - 2023-02-13

### Added

- Add Files#content endpoint - thanks [@unixneo](https://github.com/unixneo) for raising!

## [3.1.0] - 2023-02-13

### Added

- Add Finetunes#delete endpoint - thanks [@lancecarlson](https://github.com/lancecarlson) for flagging this.
- Add VCR header and body matching - thanks [@petergoldstein](https://github.com/petergoldstein)!

### Fixed

- Update File#upload specs to remove the deprecated `answers` purpose.

## [3.0.3] - 2023-01-07

### Added

- Add ability to run the specs without VCR cassettes using `NO_VCR=true bundle exec rspec`.
- Add Ruby 3.2 to CircleCI config - thanks [@petergoldstein](https://github.com/petergoldstein)!
- A bit of detail added to the README on DALL·E image sizes - thanks [@ndemianc](https://github.com/ndemianc)!

### Fixed

- Fix finetunes and files upload endpoints - thanks [@chaadow](https://github.com/chaadow) for your PR on this and [@petergoldstein](https://github.com/petergoldstein) for the PR we ultimately went with.

## [3.0.2] - 2022-12-27

### Fixed

- Fixed Images#generate and Finetunes#create, which were broken by a double call of to_json.
- Thanks [@konung](https://github.com/konung) for spotting this!

## [3.0.1] - 2022-12-26

### Removed

- [BREAKING] Remove deprecated answers, classifications, embeddings, engines and search endpoints.
- [BREAKING] Remove ability to pass engine to completions and embeddings outside of the parameters hash.

## [3.0.0] - 2022-12-26

### Added

- Add ability to set access_token via gem configuration.
- Thanks [@grjones](https://github.com/grjones) and [@aquaflamingo](https://github.com/aquaflamingo) for raising this and [@feministy](https://github.com/feministy) for the [excellent guide](https://github.com/feministy/lizabinante.com/blob/stable/source/2016-01-30-creating-a-configurable-ruby-gem.markdown#configuration-block-the-end-goal) to adding config to a gem.

### Removed

- [BREAKING] Remove ability to include access_token directly via ENV vars.
- [BREAKING] Remove ability to pass API version directly to endpoints.

## [2.3.0] - 2022-12-23

### Added

- Add Images#edit and Images#variations endpoints to modify images with DALL·E.

## [2.2.0] - 2022-12-15

### Added

- Add Organization ID to headers so users can charge credits to the correct organization.
- Thanks [@mridul911](https://github.com/mridul911) for raising this and [@maks112v](https://github.com/maks112v) for adding it!

## [2.1.0] - 2022-11-13

### Added

- Add Images#generate endpoint to generate images with DALL·E!

## [2.0.1] - 2022-10-22

### Removed

- Deprecate Client#answers endpoint.
- Deprecate Client#classifications endpoint.

## [2.0.0] - 2022-09-19

### Removed

- [BREAKING] Remove support for Ruby 2.5.
- [BREAKING] Remove support for passing `query`, `documents` or `file` as top-level parameters to `Client#search`.
- Deprecate Client#search endpoint.
- Deprecate Client#engines endpoints.

### Added

- Add Client#models endpoints to list and query available models.

## [1.5.0] - 2022-09-18

### Added

- Add Client#moderations endpoint to check OpenAI's Content Policy.
- Add Client#edits endpoints to transform inputs according to instructions.

## [1.4.0] - 2021-12-11

### Added

- Add Client#engines endpoints to list and query available engines.
- Add Client#finetunes endpoints to create and use fine-tuned models.
- Add Client#embeddings endpoint to get vector representations of inputs.
- Add tests and examples for more engines.

## [1.3.1] - 2021-07-14

### Changed

- Add backwards compatibility from Ruby 2.5+.

## [1.3.0] - 2021-04-18

### Added

- Add Client#classifications to predict the most likely labels based on examples or a file.

### Fixed

- Fixed Files#upload, which was previously broken by the validation code!

## [1.2.2] - 2021-04-18

### Changed

- Add Client#search(parameters:) to allow passing `max_rerank` or `return_metadata`.
- Deprecate Client#search with query, file or document parameters at the top level.
- Thanks [@stevegeek](https://github.com/stevegeek) for pointing this issue out!

## [1.2.1] - 2021-04-11

### Added

- Add validation of JSONL files to make it easier to debug during upload.

## [1.2.0] - 2021-04-08

### Added

- Add Client#answers endpoint for question/answer response on documents or a file.

## [1.1.0] - 2021-04-07

### Added

- Add Client#files to allow file upload.
- Add Client#search(file:) so you can search a file.
## [1.0.0] - 2021-02-01

### Removed

- Remove deprecated method Client#call - use Client#completions instead.

### Changed

- Rename 'master' branch to 'main' branch.
- Bump dependencies.

## [0.3.0] - 2020-11-22

### Added

- Add Client#completions to allow all parameters.

### Changed

- Deprecate Client#call.
- Update README.

## [0.2.0] - 2020-11-22

### Added

- Add method to use the search endpoint.

## [0.1.4] - 2020-10-18

### Changed

- Bump Rubocop to 3.9.2.
- Bump Webmock to 3.9.1.

## [0.1.3] - 2020-09-09

### Changed

- Add ability to change API version in the future.
- Fix README typos.

## [0.1.2] - 2020-09-09

### Added

- Add tests and cached responses for the different engines.
- Add issue templates.

### Changed

- Add README instructions for using the gem without dotenv.
- Add list of engines to README.

## [0.1.1] - 2020-09-08

### Added

- Run Rubocop on pulls using CircleCI.

### Changed

- Clean up CircleCI config file.

## [0.1.0] - 2020-09-06

### Added

- Initialise repository.
- Add OpenAI::Client to connect to OpenAI API using user credentials.
- Add spec for Client with a cached response using VCR.
- Add CircleCI to run the specs on pull requests.

ruby-openai-3.7.0/Gemfile.lock

PATH
  remote: .
  specs:
    ruby-openai (3.7.0)
      httparty (>= 0.18.1)

GEM
  remote: https://rubygems.org/
  specs:
    addressable (2.8.1)
      public_suffix (>= 2.0.2, < 6.0)
    ast (2.4.2)
    byebug (11.1.3)
    crack (0.4.5)
      rexml
    diff-lcs (1.5.0)
    dotenv (2.8.1)
    hashdiff (1.0.1)
    httparty (0.21.0)
      mini_mime (>= 1.0.0)
      multi_xml (>= 0.5.2)
    json (2.6.3)
    mini_mime (1.1.2)
    multi_xml (0.6.0)
    parallel (1.22.1)
    parser (3.2.1.1)
      ast (~> 2.4.1)
    public_suffix (5.0.1)
    rainbow (3.1.1)
    rake (13.0.6)
    regexp_parser (2.7.0)
    rexml (3.2.5)
    rspec (3.12.0)
      rspec-core (~> 3.12.0)
      rspec-expectations (~> 3.12.0)
      rspec-mocks (~> 3.12.0)
    rspec-core (3.12.0)
      rspec-support (~> 3.12.0)
    rspec-expectations (3.12.2)
      diff-lcs (>= 1.2.0, < 2.0)
      rspec-support (~> 3.12.0)
    rspec-mocks (3.12.3)
      diff-lcs (>= 1.2.0, < 2.0)
      rspec-support (~> 3.12.0)
    rspec-support (3.12.0)
    rubocop (1.48.1)
      json (~> 2.3)
      parallel (~> 1.10)
      parser (>= 3.2.0.0)
      rainbow (>= 2.2.2, < 4.0)
      regexp_parser (>= 1.8, < 3.0)
      rexml (>= 3.2.5, < 4.0)
      rubocop-ast (>= 1.26.0, < 2.0)
      ruby-progressbar (~> 1.7)
      unicode-display_width (>= 2.4.0, < 3.0)
    rubocop-ast (1.27.0)
      parser (>= 3.2.1.0)
    ruby-progressbar (1.13.0)
    unicode-display_width (2.4.2)
    vcr (6.1.0)
    webmock (3.18.1)
      addressable (>= 2.8.0)
      crack (>= 0.3.2)
      hashdiff (>= 0.4.0, < 2.0.0)

PLATFORMS
  ruby

DEPENDENCIES
  byebug (~> 11.1.3)
  dotenv (~> 2.8.1)
  rake (~> 13.0)
  rspec (~> 3.12)
  rubocop (~> 1.48.1)
  ruby-openai!
  vcr (~> 6.1.0)
  webmock (~> 3.18.1)

BUNDLED WITH
   2.4.5

ruby-openai-3.7.0/lib/openai/finetunes.rb

module OpenAI
  class Finetunes
    def initialize(access_token: nil, organization_id: nil)
      OpenAI.configuration.access_token = access_token if access_token
      OpenAI.configuration.organization_id = organization_id if organization_id
    end

    def list
      OpenAI::Client.get(path: "/fine-tunes")
    end

    def create(parameters: {})
      OpenAI::Client.json_post(path: "/fine-tunes", parameters: parameters)
    end

    def retrieve(id:)
      OpenAI::Client.get(path: "/fine-tunes/#{id}")
    end

    def cancel(id:)
      OpenAI::Client.multipart_post(path: "/fine-tunes/#{id}/cancel")
    end

    def events(id:)
      OpenAI::Client.get(path: "/fine-tunes/#{id}/events")
    end

    def delete(fine_tuned_model:)
      if fine_tuned_model.start_with?("ft-")
        raise ArgumentError, "Please give a fine_tuned_model name, not a fine-tune ID"
      end

      OpenAI::Client.delete(path: "/models/#{fine_tuned_model}")
    end
  end
end

ruby-openai-3.7.0/lib/openai/models.rb

module OpenAI
  class Models
    def initialize(access_token: nil, organization_id: nil)
      OpenAI.configuration.access_token = access_token if access_token
      OpenAI.configuration.organization_id = organization_id if organization_id
    end

    def list
      OpenAI::Client.get(path: "/models")
    end

    def retrieve(id:)
      OpenAI::Client.get(path: "/models/#{id}")
    end
  end
end

ruby-openai-3.7.0/lib/openai/compatibility.rb

module Ruby
  module OpenAI
    VERSION = ::OpenAI::VERSION

    Error = ::OpenAI::Error
    ConfigurationError = ::OpenAI::ConfigurationError
    Configuration = ::OpenAI::Configuration
  end
end
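The compatibility shim above works by re-exporting the new module's constants under the old namespace, so both names resolve to the very same objects. The following standalone sketch illustrates the pattern with stand-in modules (`NewLib` and `Legacy` are illustrative names, not part of the gem):

```ruby
# The "new" top-level module, standing in for ::OpenAI.
module NewLib
  VERSION = "3.7.0".freeze
  Error = Class.new(StandardError)
end

# The legacy namespace aliases the new constants rather than copying them,
# standing in for ::Ruby::OpenAI in compatibility.rb.
module Legacy
  module NewLib
    VERSION = ::NewLib::VERSION
    Error = ::NewLib::Error
  end
end

# Old references keep working, and object identity is preserved,
# so e.g. `rescue Legacy::NewLib::Error` still catches the new errors.
Legacy::NewLib::VERSION                     # => "3.7.0"
Legacy::NewLib::Error.equal?(NewLib::Error) # => true
```

Because the aliases are the same class objects, rescue clauses and case comparisons written against the old constant names behave identically to ones written against the new names.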
ruby-openai-3.7.0/lib/openai/files.rb

module OpenAI
  class Files
    def initialize(access_token: nil, organization_id: nil)
      OpenAI.configuration.access_token = access_token if access_token
      OpenAI.configuration.organization_id = organization_id if organization_id
    end

    def list
      OpenAI::Client.get(path: "/files")
    end

    def upload(parameters: {})
      validate(file: parameters[:file])

      OpenAI::Client.multipart_post(
        path: "/files",
        parameters: parameters.merge(file: File.open(parameters[:file]))
      )
    end

    def retrieve(id:)
      OpenAI::Client.get(path: "/files/#{id}")
    end

    def content(id:)
      OpenAI::Client.get(path: "/files/#{id}/content")
    end

    def delete(id:)
      OpenAI::Client.delete(path: "/files/#{id}")
    end

    private

    def validate(file:)
      File.open(file).each_line.with_index do |line, index|
        JSON.parse(line)
      rescue JSON::ParserError => e
        raise JSON::ParserError, "#{e.message} - found on line #{index + 1} of #{file}"
      end
    end
  end
end

ruby-openai-3.7.0/lib/openai/version.rb

module OpenAI
  VERSION = "3.7.0".freeze
end

ruby-openai-3.7.0/lib/openai/images.rb

module OpenAI
  class Images
    def initialize(access_token: nil, organization_id: nil)
      OpenAI.configuration.access_token = access_token if access_token
      OpenAI.configuration.organization_id = organization_id if organization_id
    end

    def generate(parameters: {})
      OpenAI::Client.json_post(path: "/images/generations", parameters: parameters)
    end

    def edit(parameters: {})
      OpenAI::Client.multipart_post(path: "/images/edits", parameters: open_files(parameters))
    end

    def variations(parameters: {})
      OpenAI::Client.multipart_post(path: "/images/variations", parameters: open_files(parameters))
    end

    private

    def open_files(parameters)
      parameters = parameters.merge(image: File.open(parameters[:image]))
      parameters = parameters.merge(mask: File.open(parameters[:mask])) if parameters[:mask]
      parameters
    end
  end
end

ruby-openai-3.7.0/lib/openai/client.rb

module OpenAI
  class Client
    def initialize(access_token: nil, organization_id: nil, uri_base: nil, request_timeout: nil)
      OpenAI.configuration.access_token = access_token if access_token
      OpenAI.configuration.organization_id = organization_id if organization_id
      OpenAI.configuration.uri_base = uri_base if uri_base
      OpenAI.configuration.request_timeout = request_timeout if request_timeout
    end

    def chat(parameters: {})
      OpenAI::Client.json_post(path: "/chat/completions", parameters: parameters)
    end

    def completions(parameters: {})
      OpenAI::Client.json_post(path: "/completions", parameters: parameters)
    end

    def edits(parameters: {})
      OpenAI::Client.json_post(path: "/edits", parameters: parameters)
    end

    def embeddings(parameters: {})
      OpenAI::Client.json_post(path: "/embeddings", parameters: parameters)
    end

    def files
      @files ||= OpenAI::Files.new
    end

    def finetunes
      @finetunes ||= OpenAI::Finetunes.new
    end

    def images
      @images ||= OpenAI::Images.new
    end

    def models
      @models ||= OpenAI::Models.new
    end

    def moderations(parameters: {})
      OpenAI::Client.json_post(path: "/moderations", parameters: parameters)
    end

    def transcribe(parameters: {})
      OpenAI::Client.multipart_post(path: "/audio/transcriptions", parameters: parameters)
    end

    def translate(parameters: {})
      OpenAI::Client.multipart_post(path: "/audio/translations", parameters: parameters)
    end

    def self.get(path:)
      HTTParty.get(
        uri(path: path),
        headers: headers,
        timeout: request_timeout
      )
    end

    def self.json_post(path:, parameters:)
      HTTParty.post(
        uri(path: path),
        headers: headers,
        body: parameters&.to_json,
        timeout: request_timeout
      )
    end

    def self.multipart_post(path:, parameters: nil)
      HTTParty.post(
        uri(path: path),
        headers: headers.merge({ "Content-Type" => "multipart/form-data" }),
        body: parameters,
        timeout: request_timeout
      )
    end

    def self.delete(path:)
      HTTParty.delete(
        uri(path: path),
        headers: headers,
        timeout: request_timeout
      )
    end

    private_class_method def self.uri(path:)
      OpenAI.configuration.uri_base + OpenAI.configuration.api_version + path
    end

    private_class_method def self.headers
      {
        "Content-Type" => "application/json",
        "Authorization" => "Bearer #{OpenAI.configuration.access_token}",
        "OpenAI-Organization" => OpenAI.configuration.organization_id
      }
    end

    private_class_method def self.request_timeout
      OpenAI.configuration.request_timeout
    end
  end
end

ruby-openai-3.7.0/lib/openai.rb

require "httparty"

require_relative "openai/client"
require_relative "openai/files"
require_relative "openai/finetunes"
require_relative "openai/images"
require_relative "openai/models"
require_relative "openai/version"

module OpenAI
  class Error < StandardError; end
  class ConfigurationError < Error; end

  class Configuration
    attr_writer :access_token
    attr_accessor :api_version, :organization_id, :uri_base, :request_timeout

    DEFAULT_API_VERSION = "v1".freeze
    DEFAULT_URI_BASE = "https://api.openai.com/".freeze
    DEFAULT_REQUEST_TIMEOUT = 120

    def initialize
      @access_token = nil
      @api_version = DEFAULT_API_VERSION
      @organization_id = nil
      @uri_base = DEFAULT_URI_BASE
      @request_timeout = DEFAULT_REQUEST_TIMEOUT
    end

    def access_token
      return @access_token if @access_token

      error_text = "OpenAI access token missing! See https://github.com/alexrudall/ruby-openai#usage"
      raise ConfigurationError, error_text
    end
  end

  class << self
    attr_writer :configuration
  end

  def self.configuration
    @configuration ||= OpenAI::Configuration.new
  end

  def self.configure
    yield(configuration)
  end
end

ruby-openai-3.7.0/lib/ruby/openai.rb

require_relative "../openai"
require_relative "../openai/compatibility"

ruby-openai-3.7.0/README.md

# Ruby OpenAI

[![Gem Version](https://badge.fury.io/rb/ruby-openai.svg)](https://badge.fury.io/rb/ruby-openai)
[![GitHub license](https://img.shields.io/badge/license-MIT-blue.svg)](https://github.com/alexrudall/ruby-openai/blob/main/LICENSE.txt)
[![CircleCI Build Status](https://circleci.com/gh/alexrudall/ruby-openai.svg?style=shield)](https://circleci.com/gh/alexrudall/ruby-openai)
[![Maintainability](https://api.codeclimate.com/v1/badges/a99a88d28ad37a79dbf6/maintainability)](https://codeclimate.com/github/codeclimate/codeclimate/maintainability)

Use the [OpenAI API](https://openai.com/blog/openai-api/) with Ruby! 🤖❤️

Generate text with ChatGPT, transcribe and translate audio with Whisper, or create images with DALL·E...

Check out [Ruby AI Builders](https://discord.gg/k4Uc224xVD) on Discord!

### Bundler

Add this line to your application's Gemfile:

```ruby
gem "ruby-openai"
```

And then execute:

    $ bundle install

### Gem install

Or install with:

    $ gem install ruby-openai

and require with:

```ruby
require "openai"
```

## Upgrading

The `::Ruby::OpenAI` module has been removed and all classes have been moved under the top level `::OpenAI` module. To upgrade, change `require 'ruby/openai'` to `require 'openai'` and change all references to `Ruby::OpenAI` to `OpenAI`.
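For codebases with many references, a throwaway rewrite script in this spirit can do the mechanical renaming (illustrative sketch only — `migrate_openai_source` is not part of the gem; run it against files yourself and review the changes under version control):

```ruby
# Rewrites pre-3.3 requires and constant references to the new namespace.
# Hypothetical helper for illustration; not shipped with ruby-openai.
def migrate_openai_source(code)
  code
    .gsub(%r{require ["']ruby/openai["']}, 'require "openai"')
    .gsub(/\bRuby::OpenAI\b/, "OpenAI")
end

old_source = <<~RUBY
  require "ruby/openai"
  Ruby::OpenAI.configure { |c| c.access_token = "..." }
RUBY

puts migrate_openai_source(old_source)
# require "openai"
# OpenAI.configure { |c| c.access_token = "..." }
```

Because the 3.3.0+ releases still ship a compatibility shim (`lib/openai/compatibility.rb`), the old constant names keep resolving during a gradual migration; the rewrite just removes the dependency on that shim.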
## Usage

- Get your API key from [https://beta.openai.com/account/api-keys](https://beta.openai.com/account/api-keys)
- If you belong to multiple organizations, you can get your Organization ID from [https://beta.openai.com/account/org-settings](https://beta.openai.com/account/org-settings)

### Quickstart

For a quick test you can pass your token directly to a new client:

```ruby
client = OpenAI::Client.new(access_token: "access_token_goes_here")
```

### With Config

For a more robust setup, you can configure the gem with your API keys, for example in an `openai.rb` initializer file. Never hardcode secrets into your codebase - instead use something like [dotenv](https://github.com/motdotla/dotenv) to pass the keys safely into your environments.

```ruby
OpenAI.configure do |config|
  config.access_token = ENV.fetch('OPENAI_ACCESS_TOKEN')
  config.organization_id = ENV.fetch('OPENAI_ORGANIZATION_ID') # Optional.
end
```

Then you can create a client like this:

```ruby
client = OpenAI::Client.new
```

#### Custom timeout or base URI

The default timeout for any OpenAI request is 120 seconds. You can change that by passing `request_timeout` when initializing the client. You can also change the base URI used for all requests, e.g. to use observability tools like [Helicone](https://docs.helicone.ai/quickstart/integrate-in-one-line-of-code):

```ruby
client = OpenAI::Client.new(
  access_token: "access_token_goes_here",
  uri_base: "https://oai.hconeai.com",
  request_timeout: 240
)
```

or when configuring the gem:

```ruby
OpenAI.configure do |config|
  config.access_token = ENV.fetch("OPENAI_ACCESS_TOKEN")
  config.organization_id = ENV.fetch("OPENAI_ORGANIZATION_ID") # Optional
  config.uri_base = "https://oai.hconeai.com" # Optional
  config.request_timeout = 240 # Optional
end
```

### Models

There are different models that can be used to generate text. For a full list, and to retrieve information about a single model:

```ruby
client.models.list
client.models.retrieve(id: "text-ada-001")
```

#### Examples

- [GPT-4 (limited beta)](https://platform.openai.com/docs/models/gpt-4)
  - gpt-4
  - gpt-4-0314
  - gpt-4-32k
- [GPT-3.5](https://platform.openai.com/docs/models/gpt-3-5)
  - gpt-3.5-turbo
  - gpt-3.5-turbo-0301
  - text-davinci-003
- [GPT-3](https://platform.openai.com/docs/models/gpt-3)
  - text-ada-001
  - text-babbage-001
  - text-curie-001

### ChatGPT

ChatGPT is a model that can be used to generate text in a conversational style. You can use it to [generate a response](https://platform.openai.com/docs/api-reference/chat/create) to a sequence of [messages](https://platform.openai.com/docs/guides/chat/introduction):

```ruby
response = client.chat(
  parameters: {
    model: "gpt-3.5-turbo", # Required.
    messages: [{ role: "user", content: "Hello!" }], # Required.
    temperature: 0.7,
  })
puts response.dig("choices", 0, "message", "content")
# => "Hello! How may I assist you today?"
```

### Completions

Hit the OpenAI API for a completion using other GPT-3 models:

```ruby
response = client.completions(
  parameters: {
    model: "text-davinci-001",
    prompt: "Once upon a time",
    max_tokens: 5
  })
puts response["choices"].map { |c| c["text"] }
# => [", there lived a great"]
```

### Edits

Send a string and some instructions for what to do to the string:

```ruby
response = client.edits(
  parameters: {
    model: "text-davinci-edit-001",
    input: "What day of the wek is it?",
    instruction: "Fix the spelling mistakes"
  }
)
puts response.dig("choices", 0, "text")
# => What day of the week is it?
```

### Embeddings

You can use the embeddings endpoint to get a vector of numbers representing an input. You can then compare these vectors for different inputs to efficiently check how similar the inputs are.

```ruby
client.embeddings(
  parameters: {
    model: "babbage-similarity",
    input: "The food was delicious and the waiter..."
  }
)
```

### Files

Put your data in a `.jsonl` file like this:

```json
{"prompt":"Overjoyed with my new phone! ->", "completion":" positive"}
{"prompt":"@lakers disappoint for a third straight night ->", "completion":" negative"}
```

and pass the path to `client.files.upload` to upload it to OpenAI, and then interact with it:

```ruby
client.files.upload(parameters: { file: "path/to/sentiment.jsonl", purpose: "fine-tune" })
client.files.list
client.files.retrieve(id: 123)
client.files.content(id: 123)
client.files.delete(id: 123)
```

### Fine-tunes

Upload your fine-tuning data in a `.jsonl` file as above and get its ID:

```ruby
response = client.files.upload(parameters: { file: "path/to/sentiment.jsonl", purpose: "fine-tune" })
file_id = JSON.parse(response.body)["id"]
```

You can then use this file ID to create a fine-tune model:

```ruby
response = client.finetunes.create(
  parameters: {
    training_file: file_id,
    model: "text-ada-001"
  })
fine_tune_id = JSON.parse(response.body)["id"]
```

That will give you the fine-tune ID. If you made a mistake you can cancel the fine-tune model before it is processed:

```ruby
client.finetunes.cancel(id: fine_tune_id)
```

You may need to wait a short time for processing to complete. Once processed, you can use list or retrieve to get the name of the fine-tuned model:

```ruby
client.finetunes.list
response = client.finetunes.retrieve(id: fine_tune_id)
fine_tuned_model = JSON.parse(response.body)["fine_tuned_model"]
```

This fine-tuned model name can then be used in completions:

```ruby
response = client.completions(
  parameters: {
    model: fine_tuned_model,
    prompt: "I love Mondays!"
  }
)
JSON.parse(response.body)["choices"].map { |c| c["text"] }
```

You can delete the fine-tuned model when you are done with it:

```ruby
client.finetunes.delete(fine_tuned_model: fine_tuned_model)
```

### Image Generation

Generate an image using DALL·E! The size of any generated images must be one of `256x256`, `512x512` or `1024x1024` - if not specified, the image will default to `1024x1024`.

```ruby
response = client.images.generate(parameters: { prompt: "A baby sea otter cooking pasta wearing a hat of some sort", size: "256x256" })
puts response.dig("data", 0, "url")
# => "https://oaidalleapiprodscus.blob.core.windows.net/private/org-Rf437IxKhh..."
```

![Ruby](https://i.ibb.co/6y4HJFx/img-d-Tx-Rf-RHj-SO5-Gho-Cbd8o-LJvw3.png)

### Image Edit

Fill in the transparent part of an image, or upload a mask with transparent sections to indicate the parts of an image that can be changed according to your prompt...

```ruby
response = client.images.edit(parameters: { prompt: "A solid red Ruby on a blue background", image: "image.png", mask: "mask.png" })
puts response.dig("data", 0, "url")
# => "https://oaidalleapiprodscus.blob.core.windows.net/private/org-Rf437IxKhh..."
```

![Ruby](https://i.ibb.co/sWVh3BX/dalle-ruby.png)

### Image Variations

Create n variations of an image.

```ruby
response = client.images.variations(parameters: { image: "image.png", n: 2 })
puts response.dig("data", 0, "url")
# => "https://oaidalleapiprodscus.blob.core.windows.net/private/org-Rf437IxKhh..."
```

![Ruby](https://i.ibb.co/TWJLP2y/img-miu-Wk-Nl0-QNy-Xtj-Lerc3c0l-NW.png)
![Ruby](https://i.ibb.co/ScBhDGB/img-a9-Be-Rz-Au-Xwd-AV0-ERLUTSTGdi.png)

### Moderations

Pass a string to check if it violates OpenAI's Content Policy:

```ruby
response = client.moderations(parameters: { input: "I'm worried about that." })
puts response.dig("results", 0, "category_scores", "hate")
# => 5.505014632944949e-05
```

### Whisper

Whisper is a speech-to-text model that can be used to generate text based on audio files:

#### Translate

The translations API takes as input an audio file in any of the supported languages and transcribes the audio into English.

```ruby
response = client.translate(
  parameters: {
    model: "whisper-1",
    file: File.open('path_to_file', 'rb'),
  })
puts response.parsed_response['text']
# => "Translation of the text"
```

#### Transcribe

The transcriptions API takes as input the audio file you want to transcribe and returns the text in the desired output file format.

```ruby
response = client.transcribe(
  parameters: {
    model: "whisper-1",
    file: File.open('path_to_file', 'rb'),
  })
puts response.parsed_response['text']
# => "Transcription of the text"
```

## Development

After checking out the repo, run `bin/setup` to install dependencies. You can run `bin/console` for an interactive prompt that will allow you to experiment.

To install this gem onto your local machine, run `bundle exec rake install`.

## Release

First run the specs without VCR so they actually hit the API. This will cost about 2 cents. You'll need to add your `OPENAI_ACCESS_TOKEN=` in `.env`.

```
NO_VCR=true bundle exec rspec
```

Then update the version number in `version.rb`, update `CHANGELOG.md`, run `bundle install` to update Gemfile.lock, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and tags, and push the `.gem` file to [rubygems.org](https://rubygems.org).

## Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/alexrudall/ruby-openai. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [code of conduct](https://github.com/alexrudall/ruby-openai/blob/main/CODE_OF_CONDUCT.md).

## License

The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).

## Code of Conduct

Everyone interacting in the Ruby OpenAI project's codebases, issue trackers, chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/alexrudall/ruby-openai/blob/main/CODE_OF_CONDUCT.md).
==> ruby-openai-3.7.0/.circleci/config.yml <==
version: 2.1 # Use 2.1 to enable using orbs and other features.

# Declare the orbs that we'll use in our config.
orbs:
  ruby: circleci/ruby@1.0

jobs:
  rubocop:
    parallelism: 1
    docker:
      - image: cimg/ruby:3.1-node
    steps:
      - checkout
      - ruby/install-deps
      - run:
          name: Run Rubocop
          command: bundle exec rubocop
  test:
    parameters:
      ruby-image:
        type: string
    parallelism: 1
    docker:
      - image: << parameters.ruby-image >>
    steps:
      - checkout
      - ruby/install-deps
      - run:
          name: Run tests
          command: bundle exec rspec -fd

workflows:
  version: 2
  checks:
    jobs:
      - rubocop
      - test:
          matrix:
            parameters:
              ruby-image:
                - cimg/ruby:2.6-node
                - cimg/ruby:2.7-node
                - cimg/ruby:3.0-node
                - cimg/ruby:3.1-node
                - cimg/ruby:3.2-node

==> ruby-openai-3.7.0/pull_request_template.md <==
## All Submissions:

* [ ] Have you followed the guidelines in our [Contributing document](../blob/main/CONTRIBUTING.md)?
* [ ] Have you checked to ensure there aren't other open [Pull Requests](../pulls) for the same update/change?
* [ ] Have you added an explanation of what your changes do and why you'd like us to include them?

==> ruby-openai-3.7.0/ruby-openai.gemspec <==
require_relative "lib/openai/version"

Gem::Specification.new do |spec|
  spec.name          = "ruby-openai"
  spec.version       = OpenAI::VERSION
  spec.authors       = ["Alex"]
  spec.email         = ["alexrudall@users.noreply.github.com"]

  spec.summary       = "OpenAI API + Ruby! 🤖❤️"
  spec.homepage      = "https://github.com/alexrudall/ruby-openai"
  spec.license       = "MIT"
  spec.required_ruby_version = Gem::Requirement.new(">= 2.6.0")

  spec.metadata["homepage_uri"] = spec.homepage
  spec.metadata["source_code_uri"] = "https://github.com/alexrudall/ruby-openai"
  spec.metadata["changelog_uri"] = "https://github.com/alexrudall/ruby-openai/blob/main/CHANGELOG.md"
  spec.metadata["rubygems_mfa_required"] = "true"

  # Specify which files should be added to the gem when it is released.
  # The `git ls-files -z` loads the files in the RubyGem that have been added into git.
  spec.files = Dir.chdir(File.expand_path(__dir__)) do
    `git ls-files -z`.split("\x0").reject { |f| f.match(%r{^(test|spec|features)/}) }
  end
  spec.bindir        = "exe"
  spec.executables   = spec.files.grep(%r{^exe/}) { |f| File.basename(f) }
  spec.require_paths = ["lib"]

  spec.add_dependency "httparty", ">= 0.18.1"

  spec.post_install_message = "Note if upgrading: The `::Ruby::OpenAI` module has been removed and all classes have been moved under the top level `::OpenAI` module. To upgrade, change `require 'ruby/openai'` to `require 'openai'` and change all references to `Ruby::OpenAI` to `OpenAI`."
end

==> ruby-openai-3.7.0/Gemfile <==
source "https://rubygems.org"

# Include gem dependencies from ruby-openai.gemspec
gemspec

gem "byebug", "~> 11.1.3"
gem "dotenv", "~> 2.8.1"
gem "rake", "~> 13.0"
gem "rspec", "~> 3.12"
gem "rubocop", "~> 1.48.1"
gem "vcr", "~> 6.1.0"
gem "webmock", "~> 3.18.1"

==> ruby-openai-3.7.0/CODE_OF_CONDUCT.md <==
# Contributor Covenant Code of Conduct

## Our Pledge

In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, nationality, personal appearance, race, religion, or sexual identity and orientation.

## Our Standards

Examples of behavior that contributes to creating a positive environment include:

* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members

Examples of unacceptable behavior by participants include:

* The use of sexualized language or imagery and unwelcome sexual attention or advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a professional setting

## Our Responsibilities

Project maintainers are responsible for clarifying the standards of acceptable behavior and are expected to take appropriate and fair corrective action in response to any instances of unacceptable behavior.
Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.

## Scope

This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event. Representation of a project may be further defined and clarified by project maintainers.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at alexrudall@users.noreply.github.com. All complaints will be reviewed and investigated and will result in a response that is deemed necessary and appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.

Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership.
## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4, available at [https://contributor-covenant.org/version/1/4][version]

[homepage]: https://contributor-covenant.org
[version]: https://contributor-covenant.org/version/1/4/

==> ruby-openai-3.7.0/.rubocop.yml <==
AllCops:
  TargetRubyVersion: 2.6
  NewCops: enable
  SuggestExtensions: false

Style/Documentation:
  # Skips checking to make sure top level modules / classes have a comment.
  Enabled: false

Layout/LineLength:
  Max: 100
  Exclude:
    - "**/*.gemspec"

Metrics/BlockLength:
  Exclude:
    - "spec/**/*"

Style/StringLiterals:
  EnforcedStyle: double_quotes

Style/FrozenStringLiteralComment:
  Enabled: false

==> ruby-openai-3.7.0/.rspec <==
--format documentation
--color
--require spec_helper

==> ruby-openai-3.7.0/LICENSE.txt <==
The MIT License (MIT)

Copyright (c) 2020 Alex

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.