Cache the serialized tool-schema length for count_tokens_approximately

count_tokens_approximately was calling json.dumps(tool_dict) and discarding
everything but the result's length on every invocation, even though the dict
returned by convert_to_openai_tool(tool) is stable for a given tool. Stash
the character count on the tool instance under _openai_function_chars,
alongside the _openai_function_dict schema cache from the previous commit.
BaseTool.__setattr__ pops both keys whenever args_schema, description, or
name is mutated, so dynamic tool re-registration and in-place edits
invalidate the cache correctly.
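A minimal sketch of the caching and invalidation pattern. FakeTool and
to_openai_dict are illustrative stand-ins (not the real BaseTool or
convert_to_openai_tool); only the two cache-attribute names mirror the
ones described above:

```python
import json

# Cache keys as described above; everything else in this sketch is a
# stand-in, not the real langchain implementation.
_CACHE_KEYS = ("_openai_function_chars", "_openai_function_dict")
_INVALIDATING = ("name", "description", "args_schema")


class FakeTool:
    def __init__(self, name, description):
        self.name = name
        self.description = description

    def __setattr__(self, key, value):
        # Mutating any schema-relevant field drops both cached entries,
        # so the next token count re-serializes the tool from scratch.
        if key in _INVALIDATING:
            for cache_key in _CACHE_KEYS:
                self.__dict__.pop(cache_key, None)
        super().__setattr__(key, value)

    def to_openai_dict(self):
        # Stand-in for convert_to_openai_tool(tool).
        return {
            "type": "function",
            "function": {"name": self.name, "description": self.description},
        }


def tool_char_count(tool):
    """Serialized-schema length, cached on the instance under
    _openai_function_chars so json.dumps runs at most once per
    schema version."""
    cached = getattr(tool, "_openai_function_chars", None)
    if cached is None:
        cached = len(json.dumps(tool.to_openai_dict()))
        setattr(tool, "_openai_function_chars", cached)
    return cached
```

Because invalidation lives in __setattr__, callers of tool_char_count never
see a stale count after an in-place edit; the next call simply recomputes.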
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>