Fix additional_headers fallback in validate_parms_base #700

Conversation

@izo0x90 (Contributor) commented Jan 17, 2025

PR Title

Fix additional_headers fallback in validate_parms_base

PR Description

`additional_headers` can be passed explicitly as `None` in the data dict processed by `validate_parms_base`, in which case the `.get` fallback does not return the intended default value of an empty dict.

A specific use case where this can be observed is when the `VertexAI` object is created via LangChain's `.configurable_fields`/`with_config` functionality, which allows specifying runtime parameters for steps in the chain.

Example:

    llm = VertexAI(model_name="gemini-pro", project="project-ai").configurable_fields(
        temperature=ConfigurableField(
            id="temperature",
            name="LLM Temperature",
            description="The temperature of the LLM",
        )
    )

    qa_chain = create_stuff_documents_chain(
        llm.with_config(configurable={"temperature": 0.9}),
        TEMPLATE,
        document_prompt=DOC_TEMPLATE,
        output_parser=ANSWER_PARSER,
    )
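The underlying pitfall can be sketched in isolation (a standalone Python snippet, not the actual `validate_parms_base` code): `dict.get`'s default is only used when the key is *absent*, not when it is present with an explicit `None` value.

```python
# Simulate a data dict where the key exists but was explicitly set to None,
# as happens when with_config/configurable_fields forwards unset fields.
data = {"additional_headers": None}

# dict.get only falls back to the default when the key is missing,
# so this returns None rather than the intended empty dict.
headers = data.get("additional_headers", {})
assert headers is None

# Fix pattern: coalesce an explicit None to the intended default.
headers = data.get("additional_headers") or {}
assert headers == {}
```

The `or {}` idiom treats any falsy value (including an explicit `None`) as "use the default", which is the behavior the PR restores.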

@lkuligin lkuligin merged commit 4691610 into langchain-ai:main Jan 23, 2025
15 checks passed