Update our model selection to include GPT-5, and use it as part of the table generation #16711
Conversation
QA Wolf here! As you write new code it's important that your test coverage is keeping up.
Amazing job working this out - seems like it was a real pain! Just to check, does this mean we are now using the undici approach for our mock model responses, but nock everywhere else?
If there is a specific error you remember seeing, it could be worth documenting somewhere what to do if you need to mock Node's built-in fetch.
That's right, yeah. I know it's not ideal, but migrating everything over from Nock seemed like it would be more of a pain. And at least now we have a reasonable concept of how to do this, as we will certainly see it more often as more libs move towards undici / the internal fetch over node-fetch.
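To document the pattern discussed above: the sketch below is a minimal stand-in for undici's MockAgent that intercepts Node's built-in fetch by swapping `globalThis.fetch` for a fake. The helper name and route-table shape are hypothetical, not from this repo; it only illustrates the general approach of mocking the built-in fetch rather than node-fetch.

```javascript
// Hypothetical sketch: intercept Node's built-in fetch (Node 18+) by
// replacing globalThis.fetch. The mockFetch helper and the
// "METHOD url" route-table keys are illustrative assumptions.
function mockFetch(routes) {
  const realFetch = globalThis.fetch;
  globalThis.fetch = async (url, init = {}) => {
    const key = `${(init.method ?? 'GET').toUpperCase()} ${url}`;
    if (key in routes) {
      // Response is a global in Node 18+, same as fetch itself.
      return new Response(JSON.stringify(routes[key]), {
        status: 200,
        headers: { 'content-type': 'application/json' },
      });
    }
    // Fail loudly on anything the test didn't stub.
    throw new Error(`Unmocked fetch: ${key}`);
  };
  // Caller invokes this to put the real fetch back after the test.
  return () => { globalThis.fetch = realFetch; };
}
```

A test would call `const restore = mockFetch({...})`, exercise the code under test, then call `restore()` in teardown so other tests see the real fetch again.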
Description
This PR updates our model selector to include the three new GPT-5 models, along with accompanying parameter updates to support them.
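To illustrate what "accompanying updates around parameters" can look like, here is a hedged sketch. The model ids, the `MODELS` table, and the `buildRequestParams` helper are assumptions for illustration, not the PR's actual code; the parameter split reflects OpenAI's chat API, where reasoning-class models take `max_completion_tokens` and only the default temperature.

```javascript
// Illustrative only: model list and parameter mapping are assumptions,
// not the actual values from this PR.
const MODELS = {
  'gpt-5': { reasoning: true },
  'gpt-5-mini': { reasoning: true },
  'gpt-5-nano': { reasoning: true },
  'gpt-4o': { reasoning: false },
};

function buildRequestParams(model, { maxTokens, temperature }) {
  if (!(model in MODELS)) throw new Error(`Unknown model: ${model}`);
  // Reasoning models accept max_completion_tokens and only the default
  // temperature, so temperature is dropped for them; older models keep
  // the max_tokens / temperature pair.
  return MODELS[model].reasoning
    ? { model, max_completion_tokens: maxTokens }
    : { model, max_tokens: maxTokens, temperature };
}
```

Centralizing this mapping in one helper keeps the call sites model-agnostic, so adding a future model family is a one-line table change.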
Launchcontrol