What Happened?
The prompt_tokens field in the usage object only counts the tokens read from the input, not the tokens served from the prompt cache. To be OpenAI-compliant, the cache-read tokens have to be added into prompt_tokens (in the OpenAI format, cached tokens are included in prompt_tokens and reported as a subset under prompt_tokens_details.cached_tokens).
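For illustration, a minimal sketch of the expected normalization (the function name and the separate input/cache-read fields are assumptions for this example, not the project's actual response schema):

```python
# Minimal sketch, assuming the upstream provider reports non-cached input
# tokens and cache-read tokens separately (field names are illustrative).

def to_openai_usage(input_tokens: int, cache_read_tokens: int, output_tokens: int) -> dict:
    """Build an OpenAI-compliant usage object.

    In the OpenAI format, prompt_tokens covers the entire input, including
    tokens served from the prompt cache; the cached portion is reported as a
    subset under prompt_tokens_details.cached_tokens.
    """
    prompt_tokens = input_tokens + cache_read_tokens  # add cache reads back in
    return {
        "prompt_tokens": prompt_tokens,
        "completion_tokens": output_tokens,
        "total_tokens": prompt_tokens + output_tokens,
        "prompt_tokens_details": {"cached_tokens": cache_read_tokens},
    }


# Example: 200 fresh input tokens + 1800 cache-read tokens + 50 output tokens
# -> prompt_tokens should be 2000, not 200.
print(to_openai_usage(200, 1800, 50))
```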
What Should Have Happened?
prompt_tokens should report the full input size, i.e. the non-cached input tokens plus the cache-read tokens, matching the OpenAI usage format.
Relevant Code Snippet
No response
Your Twitter/LinkedIn
No response