openai[patch]: fix get_num_tokens for function calls (#25785)

Closes https://github.com/langchain-ai/langchain/issues/25784

See additional discussion
[here](0a4ee864e9 (r145147380)).
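
For context, a hedged sketch of the failure mode this addresses: when a message's OpenAI-format dict contains a non-string value (for example a `function_call` payload), the token counter previously passed that value straight to tiktoken, which only accepts strings. The model name, dummy key, and message contents below are illustrative; `get_num_tokens_from_messages` counts tokens locally and makes no API call.

```python
from langchain_core.messages import AIMessage
from langchain_openai import ChatOpenAI

# No request is sent; the key only needs to satisfy client construction.
llm = ChatOpenAI(model="gpt-4o-mini", api_key="sk-placeholder")

# An assistant message carrying a function call; its dict form includes a
# non-string "function_call" value, which previously broke token counting.
msg = AIMessage(
    content="",
    additional_kwargs={
        "function_call": {"name": "get_weather", "arguments": '{"city": "Paris"}'}
    },
)

# Before this fix: TypeError from tiktoken (cannot encode a non-string value).
# After this fix: the value is cast to str and counted normally.
print(llm.get_num_tokens_from_messages([msg]))
```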
ccurme authored 2024-08-27 16:18:19 -04:00 · committed by GitHub
parent 2aa35d80a0 · commit 2e5c379632
2 changed files with 6 additions and 3 deletions


@@ -947,7 +947,7 @@ class BaseChatOpenAI(BaseChatModel):
                 else:
                     # Cast str(value) in case the message value is not a string
                     # This occurs with function messages
-                    num_tokens += len(encoding.encode(value))
+                    num_tokens += len(encoding.encode(str(value)))
                 if key == "name":
                     num_tokens += tokens_per_name
         # every reply is primed with <im_start>assistant