
Commit 26a6c29

Merge pull request #36 from alexrudall/add_new_endpoints

Add Answers endpoint

2 parents fa4c744 + ce28c76

File tree

10 files changed: +330 −3 lines

CHANGELOG.md

Lines changed: 6 additions & 0 deletions
```diff
@@ -5,6 +5,12 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+## [1.2.0] - 2021-04-08
+
+### Added
+
+- Add Client#answers endpoint for question/answer response on documents or a file.
+
 ## [1.1.0] - 2021-04-07
 
 ### Added
```

Gemfile.lock

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,7 +1,7 @@
 PATH
   remote: .
   specs:
-    ruby-openai (1.1.0)
+    ruby-openai (1.2.0)
       dotenv (~> 2.7.6)
       httparty (~> 0.18.1)
```

README.md

Lines changed: 26 additions & 0 deletions
````diff
@@ -92,6 +92,32 @@ You can alternatively search using the ID of a file you've uploaded:
 client.search(engine: "ada", file: "abc123", query: "happy")
 ```
 
+### Answers
+
+Pass documents, a question string, and an example question/response to get an answer to a question:
+
+```
+response = client.answers(parameters: {
+  documents: ["Puppy A is happy.", "Puppy B is sad."],
+  question: "which puppy is happy?",
+  model: "curie",
+  examples_context: "In 2017, U.S. life expectancy was 78.6 years.",
+  examples: [["What is human life expectancy in the United States?", "78 years."]],
+})
+```
+
+You can alternatively answer using the ID of a file you've uploaded:
+
+```
+response = client.answers(parameters: {
+  file: "123abc",
+  question: "which puppy is happy?",
+  model: "curie",
+  examples_context: "In 2017, U.S. life expectancy was 78.6 years.",
+  examples: [["What is human life expectancy in the United States?", "78 years."]],
+})
+```
+
 ## Development
 
 After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake spec` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
````
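The README examples return an HTTParty response whose JSON body carries the model's answers in an `answers` array (the key the specs below assert on). A minimal sketch of extracting one, using a hypothetical response body rather than a live API call:

```ruby
require "json"

# Hypothetical JSON shaped like an Answers endpoint response; with a real
# call you would read response.parsed_response["answers"] directly.
raw = '{"object": "answer", "model": "curie", "answers": ["Puppy A."]}'
parsed = JSON.parse(raw)
answer = parsed["answers"][0]
puts answer
```

Only the `answers` key is confirmed by the specs in this commit; the other fields in the sample body are illustrative.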

lib/ruby/openai/client.rb

Lines changed: 11 additions & 0 deletions
```diff
@@ -7,6 +7,17 @@ def initialize(access_token: nil)
       @access_token = access_token || ENV["OPENAI_ACCESS_TOKEN"]
     end
 
+    def answers(version: default_version, parameters: {})
+      self.class.post(
+        "/#{version}/answers",
+        headers: {
+          "Content-Type" => "application/json",
+          "Authorization" => "Bearer #{@access_token}"
+        },
+        body: parameters.to_json
+      )
+    end
+
     def completions(engine:, version: default_version, parameters: {})
       self.class.post(
         "/#{version}/engines/#{engine}/completions",
```
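The new `answers` method serializes its `parameters` hash with `to_json` and POSTs it to `/#{version}/answers`. A standalone sketch (no network call; parameters borrowed from the README example) of the body that serialization produces:

```ruby
require "json"

# Parameters mirroring the README's Answers example; Client#answers sends
# exactly parameters.to_json as the POST body.
parameters = {
  documents: ["Puppy A is happy.", "Puppy B is sad."],
  question: "which puppy is happy?",
  model: "curie",
  examples_context: "In 2017, U.S. life expectancy was 78.6 years.",
  examples: [["What is human life expectancy in the United States?", "78 years."]]
}

body = parameters.to_json
puts body
```

Note that symbol keys become JSON string keys, so the server sees `"question"`, `"documents"`, and so on.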

lib/ruby/openai/version.rb

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,5 +1,5 @@
 module Ruby
   module OpenAI
-    VERSION = "1.1.0".freeze
+    VERSION = "1.2.0".freeze
   end
 end
```

spec/fixtures/cassettes/ada_answers_documents_which_puppy_is_happy_.yml

Lines changed: 75 additions & 0 deletions

spec/fixtures/cassettes/davinci_answers_file_which_puppy_is_happy_.yml

Lines changed: 79 additions & 0 deletions

spec/fixtures/cassettes/files_upload_answers.yml

Lines changed: 64 additions & 0 deletions
spec/ruby/openai/client/answers_spec.rb

Lines changed: 67 additions & 0 deletions
```diff
@@ -0,0 +1,67 @@
+RSpec.describe OpenAI::Client do
+  describe "#answers", :vcr do
+    let(:question) { "which puppy is happy?" }
+    let(:examples) { [["What is human life expectancy in the United States?", "78 years."]] }
+    let(:examples_context) { "In 2017, U.S. life expectancy was 78.6 years." }
+
+    context "with a file" do
+      let(:cassette) { "#{engine} answers file #{question}".downcase }
+      let(:filename) { "puppy.jsonl" }
+      let(:file) { File.join(RSPEC_ROOT, "fixtures/files", filename) }
+      let!(:file_id) do
+        response = VCR.use_cassette("files upload answers") do
+          OpenAI::Client.new.files.upload(parameters: { file: file, purpose: "answers" })
+        end
+        JSON.parse(response.body)["id"]
+      end
+      let(:response) do
+        OpenAI::Client.new.answers(
+          parameters: {
+            model: engine,
+            question: question,
+            examples: examples,
+            examples_context: examples_context,
+            file: file_id
+          }
+        )
+      end
+
+      context "with engine: davinci" do
+        let(:engine) { "davinci" }
+
+        it "answers the question" do
+          VCR.use_cassette(cassette) do
+            expect(response.parsed_response["answers"][0]).to include("puppy A is happy")
+          end
+        end
+      end
+    end
+
+    context "with documents" do
+      let(:cassette) { "#{engine} answers documents #{question}".downcase }
+      let(:documents) { ["Puppy A is happy", "Puppy B is sad."] }
+
+      let(:response) do
+        OpenAI::Client.new.answers(
+          parameters: {
+            model: engine,
+            question: question,
+            examples: examples,
+            examples_context: examples_context,
+            documents: documents
+          }
+        )
+      end
+
+      context "with engine: ada" do
+        let(:engine) { "ada" }
+
+        it "answers the question" do
+          VCR.use_cassette(cassette) do
+            expect(response.parsed_response["answers"][0]).to include("Puppy A.")
+          end
+        end
+      end
+    end
+  end
+end
```
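The cassette names the spec interpolates line up with the fixture `.yml` files listed earlier. A sketch of that naming; the underscore substitution at the end approximates VCR's filename sanitization (the exact regex VCR uses is an assumption):

```ruby
engine = "davinci"
question = "which puppy is happy?"

# Cassette name exactly as the spec builds it
cassette = "#{engine} answers file #{question}".downcase
puts cassette

# Approximation of how VCR's filesystem persister turns the cassette name
# into a filename: runs of non-word characters become "_"
sanitized = cassette.gsub(/[^\w]+/, "_")
puts sanitized
```

The sanitized form matches the fixture added in this commit, `davinci_answers_file_which_puppy_is_happy_.yml`.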

spec/ruby/openai/client/search_spec.rb

Lines changed: 0 additions & 1 deletion
```diff
@@ -42,7 +42,6 @@
         )
       end
       let(:best_match) { JSON.parse(response.body)["data"].max_by { |d| d["score"] }["document"] }
-      let(:cassette) { "#{engine} search #{query}".downcase }
 
       context "with engine: ada" do
         let(:engine) { "ada" }
```
