Description
Describe the bug
I want to understand whether save_files_as_artifacts_plugin is expected to work only with Gemini models served by Google AI Studio or Vertex AI. If you use e.g. LiteLLM with OpenRouter, together with GCS artifact storage, you get an error when you try to send an image (or any binary file) to the model.
To Reproduce
Please share a minimal code and data to reproduce your problem.
I have defined the artifact_service_uri together with the save_files_as_artifacts_plugin plugin, using FastAPI (get_fast_api_app).
In my agent, I have included the load_artifacts tool. The model in my agent is defined as follows:
from google.adk.agents import Agent
from google.adk.models.lite_llm import LiteLlm

agent = Agent(name="root_agent",
              model=LiteLlm(model="openrouter/openai/gpt-4o",
                            base_url="https://openrouter.ai/api/v1"))
When I send an image, I see the following error:
litellm.exceptions.BadRequestError: litellm.BadRequestError: OpenrouterException - {"error":{"message":"Provider returned error","code":400,"metadata":{"raw":"{\n \"error\": {\n \"message\": \"Invalid Value: 'file'. This model does not support file content types.\",\n \"type\": \"invalid_request_error\",\n \"param\": \"messages[7].content[2].type\",\n \"code\": \"invalid_value\"\n }\n}","provider_name":"Azure","is_byok":false}},"user_id":"user_2omSh8lHArKRHfj0B31X7M3qoHT"}
Is this behavior expected, given that OpenRouter cannot access the GCS file directly? Should I use the in-memory artifact service instead, since it is stateless, or is this feature only for Gemini models hosted on GCP / AI Studio?
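For context, the error says the provider rejects a `file` content part, and OpenAI-compatible APIs (which OpenRouter proxies) generally expect inline images as base64 data URIs in an `image_url` content part instead. A minimal stdlib-only sketch of that message shape (the `image_part` helper and the sample 1x1 PNG bytes are my own illustration, not part of ADK or LiteLLM):

```python
import base64

def image_part(png_bytes: bytes) -> dict:
    """Build an OpenAI-style content part carrying the image inline as a data URI."""
    b64 = base64.b64encode(png_bytes).decode("ascii")
    return {"type": "image_url",
            "image_url": {"url": f"data:image/png;base64,{b64}"}}

# A 1x1 transparent PNG, used here only as sample binary data.
PNG = base64.b64decode(
    "iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJ"
    "AAAADUlEQVR42mNkYPhfDwAChwGA60e6kgAAAABJRU5ErkJggg=="
)

message = {"role": "user",
           "content": [{"type": "text", "text": "Describe this image."},
                       image_part(PNG)]}
```

If the plugin emits a `file` part that references the GCS artifact by URI, only backends that can dereference that URI (e.g. Gemini on Vertex AI) could consume it, which would explain the 400 from the Azure-backed OpenRouter provider.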
Steps to reproduce the behavior:
Note that I am using a Dockerfile based on FROM python:3.11.
Desktop (please complete the following information):
- OS: Linux
- Python version (python -V): 3.11
- ADK version (pip show google-adk): 1.23 (latest)
Model Information:
- Are you using LiteLLM: Yes
- Which model is being used: openrouter/openai/gpt-4o (via OpenRouter)