# AI Assistant UI Integration with Storage Service

## Overview

The UI should now integrate with the changes made to sidekick service to use storage service.

## Why now?

I'll be in India for a while, and need something relevant to do that involves primarily heads-down work. This is not only a natural progression of the sidekick-storage work I just finished, it's something we'll want to do eventually anyway.

## Goals:

0. As we implement this, most users' chats prior to this work should be unaffected.
1. AI Assistant UI should, upon opening for the first time, re-hydrate the state of the conversation from storage service.
2. AI Assistant UI should, at any point, be up to date with the conversation state in storage service as messages are sent.
3. AI Assistant UI should be able to Reset Chat, which will create a new conversation in storage service and clear the UI.
4. Other ad-hoc usages of sidekick service that do not need conversation history should not be affected.

## Solution:

Let's conceive of `/sidekick/chat` as a sort of CRUD endpoint for Conversations. Since this endpoint currently creates or updates conversations with new messages by accepting whole conversations, its POST handler should then _return_ the whole, newly updated conversation under `:messages`, not just OAI's response message. For convenience with ad-hoc/non-conversational usage, it can still return a `:message` too.

The UI should use that returned collection of messages to inform its conversation state, rather than storing the convo in local storage and reading/managing it from there.

We'll also add a GET handler for `/sidekick/chat/:id` that returns the full convo from storage. Because we're now fetching that data, we should handle loading states. Ideally, we start fetching the convo once the query editor is opened, not once the sidebar is.

Regarding Reset Chat: today it _already_ clears the locally-stored chat id.
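To make the proposed response shape concrete, here is a minimal sketch of how the UI could consume the updated POST response. All names (`Message`, `ChatResponse`, `applyChatResponse`) are hypothetical, not the real API; the point is that the UI replaces its conversation state wholesale with the server-returned `:messages`, rather than appending locally:

```typescript
// Hypothetical shapes for the proposed /sidekick/chat POST response.
interface Message {
  role: "user" | "assistant";
  content: string;
}

interface ChatResponse {
  id: string;          // conversation id
  messages: Message[]; // the whole, newly updated conversation (`:messages`)
  message: Message;    // just the latest reply (`:message`), for ad-hoc use
}

// The UI sets its conversation state directly from the server's view of
// the conversation, so it can never drift from storage service.
function applyChatResponse(response: ChatResponse): Message[] {
  return response.messages;
}
```

Ad-hoc callers that don't care about history would read only `response.message` and ignore the rest.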
Thus, any subsequent `send-message` calls will automatically create a new conversation (and ID), effectively leaving the previous conversation's files to TTL naturally per policy. Then, along with the new ID, a new set of files for this convo will be created until Reset Chat is invoked again. **Thus, no work to be done there.**

## On mapping queries to convos:

We need a way for callers of `GET /sidekick/chat/:id` to know which convo id to use when attempting to resume conversations.

**Today**, the UI knows which query maps to which conversation by storing the convo id in local storage, keyed by the query's `id` as passed as the arg to `use-ai-assistant`. The hook then manages making the `send-message` API calls with or without the convo id if one exists, and stores any new one returned by `send-message`. We would suffer the same risk of losing in-flight convos if the user clears local site data. **We need something more durable than this.**

**I propose the addition of 2 new optional columns on Conversation: `:entity-type` and `:entity-id`.** This keeps conversations _fairly_ agnostic to how they are associated with various user-facing entities (eg per query, segment) or usage patterns (eg an across-app help chatbot), and if we support filtering by these columns, we can support both resuming any single conversation and listing all conversations of a given type or set thereof. IE, this would support implementing both a "Help Chats List/History" and a "My/All Chats" UX.

**Why these two columns?**

1. `:entity-id` uniquely identifies the specific entity associated.
2. `:entity-type` is a string that can be used to filter conversations by type or construct a direct-link URL to an entity and its convo.
   - This could be "query", "segment", "help", etc.
   - Unless we're sure we won't want to match chats to anything other than queries, we need this info to distinguish them.
   - This is only useful if there's ever any chats not associated with the Query Editor, which is likely.
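To illustrate the two lookup patterns the columns would unlock, here is a sketch over a client-side list of conversations. The `Conversation` shape and function names are assumptions for illustration, not the service's real schema:

```typescript
// Hypothetical client-side view of a Conversation with the proposed columns.
interface Conversation {
  id: string;
  entityType?: string; // e.g. "query", "segment", "help"; optional
  entityId?: string;   // e.g. a query id; nil/undefined for ad-hoc chats
}

// Resume: find the single conversation mapped to a specific entity.
function findConversationFor(
  convos: Conversation[],
  entityType: string,
  entityId: string,
): Conversation | undefined {
  return convos.find(
    (c) => c.entityType === entityType && c.entityId === entityId,
  );
}

// List: all conversations of a given type, e.g. a "Help Chats History" view.
function conversationsOfType(
  convos: Conversation[],
  entityType: string,
): Conversation[] {
  return convos.filter((c) => c.entityType === entityType);
}
```

In practice the filtering would happen server-side, but the same two queries - one by `(type, id)` pair, one by type alone - are the ones the service would need to support.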
In truth, we could encode the `:entity-type` into the `:entity-id` with a delimiter - like `query__seg-4K4jiYaX8qmYQKcZk` or something - but I think two columns are cleaner, more flexible, and require less knowledge from consumers.

**What about ad-hoc chats?**

They might not even use the entity mapping columns, unless we want to filter these ad-hoc "conversations" by their use case. Eg for file pattern inference, `:entity-type` could be "file-pattern-inference", and `:entity-id` could be nil. Then, if we ever wanted to list all file pattern inference convos, we could filter by `:entity-type` == "file-pattern-inference".

### How to associate existing convos to their queries?

Just as we implemented code to "incrementally" upload Query Editor AI Assistant convos to storage service, we can do the same for query-to-convo mappings. More specifically:

1. Make the backend code change adding the new columns to Conversation.
2. Make the backend code change such that `send-message!` accepts `:entity-type` and `:entity-id` args, and upserts them for the corresponding conversation.
3. Make the UI code change that passes the current query-id as the `:entity-id` arg, and `"query"` or something for the `:entity-type`, to `send-message!`.
4. Wait some safe amount of time, eg a month.
5. We're "migrated" from local query-convo mapping to backend-stored.

## Why not stick to local storage?

Currently, conversation state is stored in local storage. Why not stick to that?

- Currently, if a user clears local site data (eg to fix a login issue), they lose conversations.

Furthermore, part of this work - exposing an endpoint for reading convos - will help us provide conversation sharing.

## Caveats and Open Questions:

### Lost data?

Since we started storing convos in storage service, any conversations already underway will have been stored **so long as the user has used the AI Assistant since then**.
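The UI side of the migration steps above could look roughly like the sketch below: every `send-message` call carries the current query id, so the backend can upsert the mapping as a side effect of normal usage. The payload shape and `buildSendMessagePayload` name are assumptions, not the actual hook internals:

```typescript
// Hypothetical payload for the extended send-message call.
interface SendMessagePayload {
  conversationId?: string; // omitted on the first message of a new convo
  message: string;
  entityType: string; // always "query" when called from the Query Editor
  entityId: string;   // the current query's id
}

// The UI always attaches the entity mapping, so conversations created (or
// continued) after this ships get associated with their query "for free".
function buildSendMessagePayload(
  queryId: string,
  message: string,
  conversationId?: string,
): SendMessagePayload {
  return {
    conversationId,
    message,
    entityType: "query",
    entityId: queryId,
  };
}
```

Because the mapping is upserted on every message, simply waiting a month means any conversation still in active use gets migrated without a backfill.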
This means that once we switch over to resuming convos from storage service, we should be covered for all conversations except those not interacted with between last week and the day of the switch-over.

- **What should we do about these convos?**
  - Before switching over to read from storage service, let's wait a bit - like a month.
- **Will those users care? Will we?**
  - After waiting long enough, a warning should suffice.
- **How many conversations would even meet these criteria? How can we find out?**
  - Though waiting should suffice, we can check Honeycomb and compare with what's in postgres/storage service.

### Merge conflicts?

Graeme and others might be working a lot in the AI Assistant UI code - there could be merge conflicts and untangling that might need to happen once the UI work is underway.

- Graeme says he'll likely be done with any AI Assistant UI work by the time I'm getting to it.

## PoA:

1. SERVICE - update send-message to return `:messages`.
2. SERVICE - expose a Read Conversation API endpoint.
3. UI - Behind a feature flag, change the UI to set convo state directly from `send-message` responses, while still storing in local storage after.
4. Sit on that for a week or two.
5. UI - Behind a feature flag, change the UI to call the new Read Conversation endpoint to resume convos instead of reading from local storage, and remove the local storage conversation state code.
6. UI - Flip feature flags.
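Step 3's dual-write behavior can be sketched as follows: the UI takes the server's messages as its new state, but keeps mirroring them into local storage so the old read path still works until step 5 removes it. The storage key and function name are illustrative assumptions:

```typescript
// Minimal interface so this sketch doesn't depend on the DOM Storage type.
interface KeyValueStore {
  setItem(key: string, value: string): void;
}

// Sketch of PoA step 3: adopt the server-returned messages as UI state,
// while still mirroring them to local storage (keyed by query id, as today)
// so nothing breaks if the feature flag is rolled back.
function persistConversation(
  storage: KeyValueStore,
  queryId: string,
  messages: unknown[],
): unknown[] {
  storage.setItem(`sidekick-convo-${queryId}`, JSON.stringify(messages));
  return messages;
}
```

Once step 5 ships and the flags are flipped, the mirroring line (and the local-storage read path) can be deleted outright.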