feat(model): return logprobs for prompt (also for max_tokens=0) #84
nicpopovic wants to merge 3 commits into hyperonym:master from
Conversation
Thanks, @nicpopovic! Please give us some time to test and review the changes. 😉
Whoops, looks like I didn't test with encoder_decoder models, sorry about that... Tests are passing for me now.
@nicpopovic Sorry for the late reply. We carefully reviewed your proposed changes and fully understand the purpose. However, merging this pull request may encounter some difficulties: the logic of … We are planning to refactor …
Hi,
Cool project!
OpenAI's API allows you to get logprobs for a prompt without generating any text (max_tokens=0, logprobs=1).
So far, your API returns zero as the logprob for every input token.
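For reference, this is roughly the request shape being emulated (parameter names follow the OpenAI completions API; the model name here is just a placeholder, and echo=True is my assumption for how the prompt tokens get echoed back alongside their logprobs):

```python
import json

# Hypothetical request body: score the prompt without generating.
# max_tokens=0 disables generation, logprobs=1 requests per-token
# logprobs, and echo=True returns the prompt tokens themselves.
payload = {
    "model": "example-model",  # placeholder, not a real deployment
    "prompt": "Hello world",
    "max_tokens": 0,
    "logprobs": 1,
    "echo": True,
}
body = json.dumps(payload)
```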
This PR is a quick implementation of this feature (not particularly elegant, but maybe you'll find it useful).
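The core idea (a hedged sketch, not the actual code in this PR): each prompt token's logprob is the log-softmax of the model's logits at the previous position, evaluated at that token's id; the first token has no preceding context, so its logprob is null, as in OpenAI's responses. A toy pure-Python illustration with made-up logits:

```python
import math

def log_softmax(logits):
    # Numerically stable log-softmax over a list of raw logits.
    m = max(logits)
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    return [x - log_z for x in logits]

def prompt_logprobs(token_ids, logits_per_position):
    """Return one logprob per prompt token.

    logits_per_position[i] are the (toy) logits produced after seeing
    token_ids[0..i]; the first token has no preceding context, so its
    logprob is None.
    """
    out = [None]
    for pos in range(1, len(token_ids)):
        lsm = log_softmax(logits_per_position[pos - 1])
        out.append(lsm[token_ids[pos]])
    return out
```

With uniform logits over a 3-token vocabulary, every scored token gets logprob -log(3), which is a quick sanity check for the shifting logic.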
Cheers,
Nicholas