hey,
we have a client that processes big files using genai context caching, and it works great.
I've been looking into a way to save conversations and load back into them later.
genai has an "Interactions API", which is perfect for us as it handles most of the work and hands out keys to load back into conversations.
The thing is, it doesn't support context caches. I'm guessing that's because a context cache has a TTL and lives in a different service.
Is there a way to use interactions together with cached context? Maybe store the conversation without the cached content, and create a new cache when resuming the interaction?
def _create_first_interaction(self, cached_name, textInput) -> str:
    interaction = self.client.interactions.create(
        model=self.gemini_model,
        store=True,
        input=textInput,
        generation_config=types.GenerateContentConfig(
            thinking_config=types.ThinkingConfig(thinking_budget=-1),
            cached_content=cached_name,
            stop_sequences=["[DONE]"],
            temperature=0.0,
        ),
    )
    debug(f"Created interaction: {interaction}")
    self.interaction_id = interaction.get("name")
    return self.interaction_id
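For what it's worth, here is roughly what I had in mind for the "store without the cache, rebuild the cache on load" idea. This is only a sketch: the `caches.create` / `interactions.create` shapes mirror my snippet above, `previous_interaction_id` is an assumed continuation parameter, and the fake client is just a stand-in so the flow is runnable offline. Does something like this look workable?

```python
class FakeClient:
    """Stand-in for the genai client so the flow below runs offline."""

    class caches:
        @staticmethod
        def create(model, contents, ttl):
            # Real SDK: client.caches.create(...) returns an object with .name
            class Cache:
                name = "cachedContents/demo"
            return Cache()

    class interactions:
        @staticmethod
        def create(model, store, input, previous_interaction_id=None,
                   cached_content=None):
            # Real SDK: client.interactions.create(...) as in the snippet above
            return {
                "name": "interactions/demo",
                "previous": previous_interaction_id,
                "cached_content": cached_content,
            }


def resume_with_fresh_cache(client, model, interaction_id, file_contents, prompt):
    # 1. Re-create the cache; the original one has likely expired with its TTL.
    cache = client.caches.create(model=model, contents=file_contents, ttl="3600s")
    # 2. Continue the stored conversation, pointing the new turn at the new cache.
    return client.interactions.create(
        model=model,
        store=True,
        input=prompt,
        previous_interaction_id=interaction_id,  # assumed continuation param
        cached_content=cache.name,
    )


result = resume_with_fresh_cache(
    FakeClient(), "gemini-2.0-flash",
    "interactions/old123", ["<big file>"], "next question",
)
```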
Any other ideas? Maybe something without interactions?