
Conversation

@coolbaluk

@coolbaluk coolbaluk commented May 8, 2024

Supersedes #731

Based on the original fork of @tanzyy96 just brought up to date.

We've been running this for a couple of weeks and it has served us well.

Example usage:

stream, err := client.CreateThreadAndStream(ctx, openai.CreateThreadAndRunRequest{
	RunRequest: openai.RunRequest{
		AssistantID: AssistantID,
	},
	Thread: openai.ThreadRequest{
		Messages: Messages,
	},
})
if err != nil {
	// handle error
}
defer stream.Close()

for {
	resp, err := stream.Recv()
	if errors.Is(err, io.EOF) {
		break
	}
	if err != nil {
		// handle error
	}
	_ = resp // process the streamed event
}
		

@codecov

codecov bot commented May 8, 2024

Codecov Report

Attention: Patch coverage is 75.67568% with 18 lines in your changes missing coverage. Please review.

Project coverage is 97.92%. Comparing base (774fc9d) to head (bb4bf7f).
Report is 71 commits behind head on master.

Files with missing lines Patch % Lines
run.go 75.67% 12 Missing and 6 partials ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #737      +/-   ##
==========================================
- Coverage   98.46%   97.92%   -0.54%     
==========================================
  Files          24       26       +2     
  Lines        1364     1835     +471     
==========================================
+ Hits         1343     1797     +454     
- Misses         15       26      +11     
- Partials        6       12       +6     


@opvexe

opvexe commented May 11, 2024

May I ask a question? How do I continue the conversation in the next turn using the last thread_id?
	cc := openai.NewClientWithConfig(config)

	stream, _ := cc.CreateRunStreaming(context.Background(), "thread_VEPTeIWj1umjdFyUd0Aj4lb2", openai.RunRequest{
		AssistantID: "asst_IuAlZbLkuIgcky26QPB2TQy0",
	})

	//stream, _ := cc.CreateThreadAndStream(context.Background(), openai.CreateThreadAndRunRequest{
	//	RunRequest: openai.RunRequest{
	//		AssistantID: "asst_IuAlZbLkuIgcky26QPB2TQy0",
	//	},
	//	Thread: openai.ThreadRequest{
	//		Messages: []openai.ThreadMessage{
	//			{
	//				Role:    openai.ThreadMessageRoleUser,
	//				Content: "What did I just ask?",
	//			},
	//		},
	//	},
	//})

	defer stream.Close()

	for {
		resp, err := stream.Recv()
		if errors.Is(err, io.EOF) {
			break
		}

		t.Log("thread_id", resp.ID)

		for _, content := range resp.Delta.Content {
			t.Log(content.Text.Value)
		}
	}

@tanzyy96

tanzyy96 commented May 12, 2024

> May I ask a question? How do I continue the conversation using the last thread_id? [quoted code omitted]

There's an API for inserting a message into a thread. Example below:

_, err = a.client.CreateMessage(ctx, threadId, openai.MessageRequest{
    Role:    openai.ChatMessageRoleUser,
    Content: messageText,
})
if err != nil {
    logger.Error("failed to create message", zap.Error(err))
    return err
}

outStream, err = a.client.CreateRunStreaming(ctx, threadId, openai.RunRequest{
    AssistantID: assistantId,
})
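
Putting the two calls together, the continue-a-thread flow is: `CreateMessage` to append the user message, then `CreateRunStreaming`, then the usual `Recv` loop until `io.EOF`. Running that end to end needs API credentials, so here is a minimal offline sketch of just the consume-until-EOF pattern; `event` and `mockStream` are hypothetical stand-ins for the library's streaming types, not part of go-openai:

```go
package main

import (
	"errors"
	"fmt"
	"io"
)

// event and mockStream are hypothetical stand-ins for the library's
// streaming types; they exist only to make the loop runnable offline.
type event struct{ text string }

type mockStream struct {
	events []event
	pos    int
}

// Recv returns the next event, or io.EOF once the stream is exhausted,
// mirroring the Recv contract used in the snippets above.
func (s *mockStream) Recv() (event, error) {
	if s.pos >= len(s.events) {
		return event{}, io.EOF
	}
	e := s.events[s.pos]
	s.pos++
	return e, nil
}

func (s *mockStream) Close() error { return nil }

// drain consumes the stream until io.EOF and concatenates the text deltas.
func drain(s *mockStream) (string, error) {
	defer s.Close()
	var out string
	for {
		resp, err := s.Recv()
		if errors.Is(err, io.EOF) {
			break
		}
		if err != nil {
			return out, err
		}
		out += resp.text
	}
	return out, nil
}

func main() {
	text, err := drain(&mockStream{events: []event{{"Hel"}, {"lo"}}})
	if err != nil {
		panic(err)
	}
	fmt.Println(text) // prints "Hello"
}
```

The key point is that `io.EOF` signals normal end-of-stream and must be checked with `errors.Is` before treating the error as a failure.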

@opvexe

opvexe commented May 12, 2024

> There's an API for inserting a message into a thread. [quoted example omitted]

Thanks very much!

@liushaobo-maker

Has this been merged yet? I really need this right now.

@coolbaluk
Author

@sashabaranov

Hi Sasha,

What is needed to get this one through review?

I believe it's the most straightforward of the streaming implementations and will give a good base for further development.

I can confirm it has been running extensively under production workloads for a while on our side.

@opvexe

opvexe commented Aug 16, 2024

I have switched to the official OpenAI SDK: https://github.com/openai/openai-go/blob/main/api.md

@nspr-io nspr-io closed this by deleting the head repository Nov 3, 2025