
DeepSeek's official website is under attack; filling in a DeepSeek api_key does not work and raises an error. #17

Open
DAAworld opened this issue Feb 18, 2025 · 17 comments

@DAAworld

I am using deepseek-v3 from Alibaba's DashScope (Lingji) platform. How should the configuration be written?
OPENAI_API_KEY=sk-5523020cf22bf5a
OPENAI_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1
DEEPSEEK_API_KEY=
COMPLETION_MODEL=deepseek-v3

This is what I wrote, and litellm then errored that no llm_provider was provided. When I write it as openai/deepseek-v3, it errors that the model cannot be found.
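For reference, a minimal litellm sketch (assuming the DashScope base URL above; the key is a placeholder) of how the provider prefix is intended to be used: the `openai/` prefix tells litellm to route through its OpenAI-compatible client, and `api_base` then points that client at DashScope instead of api.openai.com.

```python
import litellm

# Route through litellm's OpenAI-compatible client ("openai/" prefix), but send the
# request to DashScope's compatible-mode endpoint rather than api.openai.com.
response = litellm.completion(
    model="openai/deepseek-v3",
    messages=[{"role": "user", "content": "hello"}],
    api_base="https://dashscope.aliyuncs.com/compatible-mode/v1",
    api_key="sk-your-api-key",  # placeholder key
)
print(response.choices[0].message.content)
```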

@DAAworld
Author

Writing it as openai/deepseek-v3 gives this error:
File "/mnt/data/rz/project/AutoAgent/autoagent/core.py", line 108, in get_chat_completion
assert litellm.supports_function_calling(model = create_model) == True, f"Model {create_model} does not support function calling, please set FN_CALL=False to use non-function calling mode"
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/litellm/utils.py", line 1833, in supports_function_calling
raise Exception(
Exception: Model not found or error in checking function calling support. You passed model=deepseek-v3, custom_llm_provider=openai. Error: This model isn't mapped yet. model=deepseek-v3, custom_llm_provider=openai. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json.
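A small sketch of the check that fails above (assuming the litellm version shown in the traceback): `supports_function_calling` consults litellm's built-in model map, and a name it has never mapped, such as `deepseek-v3` served through a generic OpenAI-compatible URL, raises instead of returning False, which is what trips AutoAgent's assertion.

```python
import litellm

# Names missing from litellm's model map raise here (as in the traceback above)
# instead of returning False, so AutoAgent's assert never gets a clean answer.
for name in ["deepseek-v3", "openai/deepseek-v3"]:
    try:
        print(name, "->", litellm.supports_function_calling(model=name))
    except Exception as exc:
        print(name, "-> error:", exc)
```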

@DAAworld
Author

Writing it as deepseek-v3 gives this error:
File "/mnt/data/rz/project/AutoAgent/autoagent/cli.py", line 206, in main
user_mode(model, context_variables, False)
File "/mnt/data/rz/project/AutoAgent/autoagent/cli.py", line 269, in user_mode
response = client.run(agent, messages, context_variables, debug=debug)
File "/mnt/data/rz/project/AutoAgent/autoagent/core.py", line 384, in run
completion = self.get_chat_completion(
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 336, in wrapped_f
return copy(f, *args, **kw)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 475, in call
do = self.iter(retry_state=retry_state)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 376, in iter
result = action(retry_state)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 398, in
self._add_action_func(lambda rs: rs.outcome.result())
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/concurrent/futures/_base.py", line 451, in result
return self.__get_result()
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
raise self._exception
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 478, in call
result = fn(*args, **kwargs)
File "/mnt/data/rz/project/AutoAgent/autoagent/core.py", line 108, in get_chat_completion
assert litellm.supports_function_calling(model = create_model) == True, f"Model {create_model} does not support function calling, please set FN_CALL=False to use non-function calling mode"
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/litellm/utils.py", line 1833, in supports_function_calling
raise Exception(
Exception: Model not found or error in checking function calling support. You passed model=deepseek-v3, custom_llm_provider=None. Error: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=deepseek-v3
Pass model as E.g. For 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..) Learn more: https://docs.litellm.ai/docs/providers

@tjb-tech
Collaborator


Hi, we recommend using OpenAI-compatible endpoints. Note that OpenAI-compatible endpoints do not support native function calling, so you need to set FN_CALL=False. The correct command to use is below. Our repo has also been updated, so please pull the latest changes.

COMPLETION_MODEL=openai/deepseek-v3 API_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1 FN_CALL=False OPENAI_API_KEY=sk-your-api-key auto main
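As a quick standalone sanity check before wiring things into AutoAgent (a sketch, assuming the official `openai` Python package and a valid DashScope key; the key below is a placeholder), you can confirm that the compatible-mode endpoint itself responds:

```python
from openai import OpenAI

# Point the standard OpenAI client at DashScope's OpenAI-compatible endpoint.
client = OpenAI(
    api_key="sk-your-api-key",  # placeholder; use your DashScope key
    base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
)
resp = client.chat.completions.create(
    model="deepseek-v3",
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```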

@DAAworld
Author


After updating the code, running the command again gives this error:

(autoagent) rz@rz-v100:/mnt/data/rz/project/AutoAgent$ auto main
Failed to create and switch to new branch. Error: fatal: a branch named 'autoagent_mirror_None' already exists.

Successfully switched to new branch: autoagent_mirror_None
Container 'auto_agent' has been created and started.
⠙ Creating environment...
Browser Env
[2025-02-18 15:30:37]
⠼ Creating environment...
Browser Env
[2025-02-18 15:31:42]
⠇ Creating environment...
Browser Env
[2025-02-18 15:31:43]
Starting browser env...
Browser Env
[2025-02-18 15:31:43]
Traceback (most recent call last):
File "/mnt/data/rz/miniconda3/envs/autoagent/bin/auto", line 8, in
sys.exit(cli())
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/click/core.py", line 1161, in call
return self.main(*args, **kwargs)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/click/core.py", line 1082, in main
rv = self.invoke(ctx)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/click/core.py", line 1697, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/click/core.py", line 1443, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/click/core.py", line 788, in invoke
return __callback(*args, **kwargs)
File "/mnt/data/rz/project/AutoAgent/autoagent/cli.py", line 191, in main
code_env, web_env, file_env = create_environment(docker_config)
File "/mnt/data/rz/project/AutoAgent/autoagent/cli.py", line 151, in create_environment
web_env = BrowserEnv(browsergym_eval_env = None, local_root=docker_config.local_root, workplace_name=docker_config.workplace_name)
File "/mnt/data/rz/project/AutoAgent/autoagent/environment/browser_env.py", line 368, in init
self.init_browser()
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 336, in wrapped_f
return copy(f, *args, **kw)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 475, in call
do = self.iter(retry_state=retry_state)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 376, in iter
result = action(retry_state)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 398, in
self._add_action_func(lambda rs: rs.outcome.result())
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/concurrent/futures/_base.py", line 451, in result
return self.__get_result()
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
raise self._exception
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 478, in call
result = fn(*args, **kwargs)
File "/mnt/data/rz/project/AutoAgent/autoagent/environment/browser_env.py", line 393, in init_browser
self.process.start()
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/process.py", line 121, in start
self._popen = self._Popen(self)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/context.py", line 224, in _Popen
return _default_context.get_context().Process._Popen(process_obj)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/context.py", line 288, in _Popen
return Popen(process_obj)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/popen_spawn_posix.py", line 32, in init
super().init(process_obj)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/popen_fork.py", line 19, in init
self._launch(process_obj)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/popen_spawn_posix.py", line 47, in _launch
reduction.dump(process_obj, fp)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/reduction.py", line 60, in dump
ForkingPickler(file, protocol).dump(obj)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/connection.py", line 968, in reduce_connection
df = reduction.DupFd(conn.fileno())
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/connection.py", line 170, in fileno
self._check_closed()
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/connection.py", line 136, in _check_closed
raise OSError("handle is closed")
OSError: handle is closed
(autoagent) rz@rz-v100:/mnt/data/rz/project/AutoAgent$
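For context on what this traceback means (a minimal hypothetical sketch, not AutoAgent code): with the spawn start method, every argument handed to a `Process` is pickled, and pickling a `multiprocessing` Connection whose handle has already been closed fails in exactly this way while trying to duplicate its file descriptor.

```python
import multiprocessing as mp

def worker(conn):
    conn.send("hello")

if __name__ == "__main__":
    ctx = mp.get_context("spawn")      # spawn pickles the arguments of the child process
    parent, child = ctx.Pipe()
    child.close()                      # simulate a connection whose handle is gone
    p = ctx.Process(target=worker, args=(child,))
    p.start()                          # raises OSError: handle is closed while pickling
```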

@tjb-tech
Collaborator


It looks like something went wrong when starting the browser environment. Were you able to start the browser environment normally before (i.e. reach AutoAgent's start screen)? You could also check whether this is caused by insufficient memory.

@DAAworld
Author


Before the update I could reach the mode-selection screen; after the update I no longer can.

@DAAworld
Author


Is this related to launching a headed browser from the command line?

@tjb-tech
Collaborator


You can try turning off the headed browser by setting the following in https://github.com/HKUDS/AutoAgent/blob/main/autoagent/environment/browser_env.py:

env = gym.make(
                'browsergym/openended',
                task_kwargs={'start_url': 'about:blank', 'goal': 'PLACEHOLDER_GOAL'},
                wait_for_user_message=False,
                headless=True,
                disable_env_checker=True,
                tags_to_mark='all',
                action_mapping = action_mapping
            )

since a headed browser requires considerably more compute resources.
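If it helps, here is a minimal standalone check (a sketch, assuming browsergym and gymnasium are installed and that `action_mapping` can be left at its default) that the headless environment can start outside AutoAgent:

```python
import gymnasium as gym
import browsergym.core  # noqa: F401 - importing registers the 'browsergym/openended' task

env = gym.make(
    'browsergym/openended',
    task_kwargs={'start_url': 'about:blank', 'goal': 'PLACEHOLDER_GOAL'},
    wait_for_user_message=False,
    headless=True,           # no visible browser window, much lower memory footprint
    disable_env_checker=True,
    tags_to_mark='all',
)
obs, info = env.reset()
print("headless browser env started")
env.close()
```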

@tjb-tech
Collaborator


Have you managed to resolve it?

@DAAworld
Author

写成deepseek-v3报错 File "/mnt/data/rz/project/AutoAgent/autoagent/cli.py", line 206, in main user_mode(model, context_variables, False) File "/mnt/data/rz/project/AutoAgent/autoagent/cli.py", line 269, in user_mode response = client.run(agent, messages, context_variables, debug=debug) File "/mnt/data/rz/project/AutoAgent/autoagent/core.py", line 384, in run completion = self.get_chat_completion( File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 336, in wrapped_f return copy(f, *args, **kw) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 475, in call do = self.iter(retry_state=retry_state) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 376, in iter result = action(retry_state) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 398, in self._add_action_func(lambda rs: rs.outcome.result()) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/concurrent/futures/_base.py", line 451, in result return self.__get_result() File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result raise self._exception File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 478, in call result = fn(*args, **kwargs) File "/mnt/data/rz/project/AutoAgent/autoagent/core.py", line 108, in get_chat_completion assert litellm.supports_function_calling(model = create_model) == True, f"Model {create_model} does not support function calling, please set FN_CALL=False to use non-function calling mode" File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/litellm/utils.py", line 1833, in supports_function_calling raise Exception( Exception: Model not found or error in checking function calling support. You passed model=deepseek-v3, custom_llm_provider=None. Error: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=deepseek-v3 Pass model as E.g. For 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..) Learn more: https://docs.litellm.ai/docs/providers

你好,建议使用openai兼容endpoints,注意openai兼容endpoints不支持原生function calling,需要设置FN_CALL=False,以下是正确的使用命令。我们的repos又更新了,您可以pull一下最新更新。
COMPLETION_MODEL=openai/deepseek-v3 API_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1 FN_CALL=False OPENAI_API_KEY=sk-your-api-key auto main

我更新代码之后再执行命令就报这个错误
(autoagent) rz@rz-v100:/mnt/data/rz/project/AutoAgent$ auto main Failed to create and switch to new branch. Error: fatal: 一个分支名 'autoagent_mirror_None' 已经存在。
Successfully switched to new branch: autoagent_mirror_None Container 'auto_agent' has been created and started. ⠙ Creating environment... Browser Env [2025-02-18 15:30:37] ⠼ Creating environment... Browser Env [2025-02-18 15:31:42] ⠇ Creating environment... Browser Env [2025-02-18 15:31:43] Starting browser env... Browser Env [2025-02-18 15:31:43] Traceback (most recent call last): File "/mnt/data/rz/miniconda3/envs/autoagent/bin/auto", line 8, in sys.exit(cli()) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/click/core.py", line 1161, in call return self.main(*args, **kwargs) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/click/core.py", line 1082, in main rv = self.invoke(ctx) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/click/core.py", line 1697, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/click/core.py", line 1443, in invoke return ctx.invoke(self.callback, **ctx.params) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/click/core.py", line 788, in invoke return __callback(*args, **kwargs) File "/mnt/data/rz/project/AutoAgent/autoagent/cli.py", line 191, in main code_env, web_env, file_env = create_environment(docker_config) File "/mnt/data/rz/project/AutoAgent/autoagent/cli.py", line 151, in create_environment web_env = BrowserEnv(browsergym_eval_env = None, local_root=docker_config.local_root, workplace_name=docker_config.workplace_name) File "/mnt/data/rz/project/AutoAgent/autoagent/environment/browser_env.py", line 368, in init self.init_browser() File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 336, in wrapped_f return copy(f, *args, **kw) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 475, in call do = self.iter(retry_state=retry_state) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 376, in iter result = action(retry_state) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 398, in self._add_action_func(lambda rs: rs.outcome.result()) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/concurrent/futures/_base.py", line 451, in result return self.__get_result() File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result raise self._exception File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 478, in call result = fn(*args, **kwargs) File "/mnt/data/rz/project/AutoAgent/autoagent/environment/browser_env.py", line 393, in init_browser self.process.start() File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/process.py", line 121, in start self._popen = self._Popen(self) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/context.py", line 224, in _Popen return _default_context.get_context().Process._Popen(process_obj) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/context.py", line 288, in _Popen return Popen(process_obj) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/popen_spawn_posix.py", line 32, in init super().init(process_obj) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/popen_fork.py", line 19, in init self._launch(process_obj) File 
"/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/popen_spawn_posix.py", line 47, in _launch reduction.dump(process_obj, fp) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/reduction.py", line 60, in dump ForkingPickler(file, protocol).dump(obj) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/connection.py", line 968, in reduce_connection df = reduction.DupFd(conn.fileno()) File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/connection.py", line 170, in fileno self._check_closed() File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/multiprocessing/connection.py", line 136, in _check_closed raise OSError("handle is closed") OSError: handle is closed (autoagent) rz@rz-v100:/mnt/data/rz/project/AutoAgent$

It looks like something went wrong while starting the browser environment. Were you able to start the browser environment normally before (i.e., reach AutoAgent's start screen)? You could also check whether this is caused by insufficient memory.
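As a quick way to narrow it down, the sketch below (my own, not part of AutoAgent; it assumes Playwright and its Chromium build are installed in the autoagent env) simply tries to launch a headless Chromium directly. If even this fails or hangs, the problem is the browser runtime or resources rather than AutoAgent itself:

# Standalone check, independent of AutoAgent: can Playwright start a headless Chromium here?
# Assumes `playwright install chromium` has been run in this environment.
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto("https://example.com")
    print(page.title())   # should print "Example Domain" if the browser works
    browser.close()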

Is this related to launching a headed (non-headless) browser from the command line?

Have you managed to solve this?

I can now get to the mode selection, but with the settings written as you suggested above:

Required Github Tokens

GITHUB_AI_TOKEN=ghp_xdsrGualJxvi1nNXfj

Optional API Keys

OPENAI_API_KEY=sk-556b15aa9623020cf22bf5a
OPENAI_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1
DEEPSEEK_API_KEY=
COMPLETION_MODEL=openai/deepseek-v3
FN_CALL=False
ANTHROPIC_API_KEY=
GEMINI_API_KEY=
HUGGINGFACE_API_KEY=
GROQ_API_KEY=
XAI_API_KEY=

there is no response; after a while it just keeps retrying.
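To check whether the DashScope endpoint itself answers when called through LiteLLM (outside of AutoAgent), something like the minimal sketch below can be tried. The api_key/api_base values are just the ones from the config above, and litellm.completion with api_key/api_base is plain LiteLLM usage, nothing AutoAgent-specific:

import litellm

# Direct sanity check against the OpenAI-compatible DashScope endpoint, bypassing AutoAgent.
resp = litellm.completion(
    model="openai/deepseek-v3",   # "openai/" prefix routes to the generic OpenAI-compatible provider
    api_key="sk-...",             # the DashScope key from the .env above
    api_base="https://dashscope.aliyuncs.com/compatible-mode/v1",
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)

If this single call also stalls or keeps retrying, the issue is on the endpoint / rate-limit side rather than inside AutoAgent.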

@tjb-tech
Collaborator

Hi, this may be related to your LLM's rate limit. You can set DEBUG=True and MC_MODE=False to see the detailed message log, which would help us debug. Looking forward to your further feedback.
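For example (assuming these flags are read from the same .env file as the other settings):

DEBUG=True
MC_MODE=False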

@DAAworld
Author

From my testing this is related to FN_CALL: with FN_CALL=False the base_url is used correctly, but with FN_CALL=True the base_url falls back to OpenAI's. Could you tell me which models available in mainland China can reproduce the dialogue in the README? With Alibaba's DashScope (灵积) platform I currently cannot.
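If the goal is to keep FN_CALL=True with a model LiteLLM has not mapped yet (the "This model isn't mapped yet" error), one thing that might help, purely as an untested sketch, is to register the model's metadata with LiteLLM before AutoAgent runs its supports_function_calling check; the numbers below are placeholders, not official DashScope limits or pricing:

import litellm

# Sketch only: add the unmapped model to LiteLLM's model map so that
# litellm.supports_function_calling() can resolve it instead of raising.
litellm.register_model({
    "deepseek-v3": {
        "max_tokens": 8192,              # placeholder value
        "input_cost_per_token": 0.0,     # placeholder value
        "output_cost_per_token": 0.0,    # placeholder value
        "litellm_provider": "openai",
        "mode": "chat",
        "supports_function_calling": True,
    }
})
print(litellm.supports_function_calling(model="openai/deepseek-v3"))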

@tjb-tech
Collaborator

Is your current situation that it runs normally but the results are not satisfactory? The agent's behaviour does depend on the base model's capability; you could try large models such as deepseek v3, deepseek r1, or qwen max. Claude-3.5 and gpt-4o are of course the first choice.

@DAAworld
Author

Tell me what do you want to create with Agent Chain? (type "exit" to quit, press "Enter" to continue): I want to createFinancial Agent that can help me to search the
financial information online.You may help me to - get balance sheets for a given ticker over a given period.-get cash flow statements for a given ticker over a given
period.-get income statements for a given ticker over a given period.
Your request: I want to createFinancial Agent that can help me to search the
financial information online.You may help me to - get balance sheets for a given ticker over a given period.-get cash flow statements for a given ticker over a given
period.-get income statements for a given ticker over a given period.
@agent Former Agent will help you, be patient...
Traceback (most recent call last):
File "/mnt/data/rz/miniconda3/envs/autoagent/bin/auto", line 8, in
sys.exit(cli())
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/click/core.py", line 1161, in call
return self.main(*args, **kwargs)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/click/core.py", line 1082, in main
rv = self.invoke(ctx)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/click/core.py", line 1697, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/click/core.py", line 1443, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/click/core.py", line 788, in invoke
return __callback(*args, **kwargs)
File "/mnt/data/rz/project/AutoAgent/autoagent/cli.py", line 213, in main
meta_agent(model, context_variables, False)
File "/mnt/data/rz/project/AutoAgent/autoagent/cli_utils/metachain_meta_agent.py", line 216, in meta_agent
agent_form, output_xml_form, messages = agent_profiling(agent_former, client, messages, context_variables, requirements, debug)
File "/mnt/data/rz/project/AutoAgent/autoagent/cli_utils/metachain_meta_agent.py", line 29, in agent_profiling
response = client.run(agent_former, messages, context_variables, debug=debug)
File "/mnt/data/rz/project/AutoAgent/autoagent/core.py", line 384, in run
completion = self.get_chat_completion(
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 336, in wrapped_f
return copy(f, *args, **kw)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 475, in call
do = self.iter(retry_state=retry_state)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 376, in iter
result = action(retry_state)
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 398, in
self._add_action_func(lambda rs: rs.outcome.result())
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/concurrent/futures/_base.py", line 451, in result
return self.__get_result()
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
raise self._exception
File "/mnt/data/rz/miniconda3/envs/autoagent/lib/python3.10/site-packages/tenacity/init.py", line 478, in call
result = fn(*args, **kwargs)
File "/mnt/data/rz/project/AutoAgent/autoagent/core.py", line 135, in get_chat_completion
assert agent.tool_choice == "required", f"Non-function calling mode MUST use tool_choice = 'required' rather than {agent.tool_choice}"
AssertionError: Non-function calling mode MUST use tool_choice = 'required' rather than None

What is causing this error? I'm using the DashScope (灵积) endpoint with deepseek-v3.

@tjb-tech
Collaborator

Hi, this is because deepseek-v3 usage goes through a tool-use conversion step, while agent profiling does not need to call any tools. We will fix this bug. Have you used the other features, such as user mode?
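Until we push that fix, one thing you could try locally, only as a sketch based on the assertion message (not a tested patch), is to set tool_choice explicitly on the profiling agent before client.run is called in autoagent/cli_utils/metachain_meta_agent.py:

# Sketch of a local workaround in agent_profiling(): the non-function-calling
# path asserts agent.tool_choice == "required", so set it explicitly first.
agent_former.tool_choice = "required"
response = client.run(agent_former, messages, context_variables, debug=debug)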

@DAAworld
Author

Thanks for your reply. Using user mode also produces errors, in two cases: requests the model can answer directly, such as "write me a poem", work without error; but when I ask it to "check the weather for me", it fails with an error saying there is no Playwright down / page_down ... tool (I did not capture a screenshot of that error).
