Replies: 2 comments · 1 reply
- Hi. Do you know how much text it accepts in the prompt field in PRO?
  - From my tests, it appears to shorten uploaded context before it is processed by the selected model. I think it does some kind of preprocessing to save on tokens, which makes it unusable for our purposes.
- A lot of web chat platforms are already supported, but Perplexity AI is not yet. Is a feature for this already in the works, or are there specific reasons why compatibility with this provider is not possible? Thank you very much!