# ai-hackathon-midday-check-in
## the capture lab
sarah feedback/feature request
* wants an overhead view from when they were prepping so she can see their notes - so take screencaps at certain moments
* notes from students are in a box - so we can try to get cleaner images
general next steps
* prototyping on a working view UI
## the chronicle lab
* mk said the simplest thing is a bot with just a system prompt and a name - start there
* mk said
  * you might load all the bots on app start and then reload every 5 minutes
  * or store in memory - 'create a config function where it loads config from airtable on app start' and then reloads every 5 or 10 minutes
* or you might load all the bots every time something comes into slack (that's a lot of api calls)
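the middle option above could look like this - a minimal sketch, assuming a hypothetical `fetch_bot_configs()` helper standing in for the real airtable api call (not from the notes):

```python
import time

RELOAD_INTERVAL = 5 * 60  # seconds; reload roughly every 5 minutes

_cache = {"bots": None, "loaded_at": 0.0}

def fetch_bot_configs():
    # Hypothetical placeholder: a real version would call the Airtable API
    # and return a list of bot configs from the table.
    return [{"name": "chronicler", "system_prompt": "You are the scribe."}]

def get_bots():
    # Load configs on first use, then refresh once the cache is stale,
    # instead of hitting Airtable on every incoming Slack event.
    now = time.time()
    if _cache["bots"] is None or now - _cache["loaded_at"] > RELOAD_INTERVAL:
        _cache["bots"] = fetch_bot_configs()
        _cache["loaded_at"] = now
    return _cache["bots"]
```

this keeps api calls down to one every few minutes no matter how busy the slack channel gets.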
* codex etc. stuff will be able to take in screenshots of airtable fields
next up
* get simple bots working to start seeing what comes up in the console etc.
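the "system prompt and a name" bot could be sketched like this - `call_model` here is a hypothetical stand-in for whichever llm api gets used, and the print is just so something shows up in the console:

```python
def call_model(system_prompt, user_message):
    # Hypothetical placeholder: a real version would send the system
    # prompt and message to an LLM and return its reply.
    return f"[{system_prompt[:20]}...] reply to: {user_message}"

def make_bot(name, system_prompt):
    # A bot is just a name plus a system prompt, closed over a respond fn.
    def respond(message):
        reply = call_model(system_prompt, message)
        print(f"{name}: {reply}")  # log to console to see what comes up
        return reply
    return respond

chronicler = make_bot("chronicler", "You are the chronicle-lab scribe.")
```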
## the display lab
* mvp: convo between bots that exist, visible
* make it, then get feedback from users on how it looks
* shadcn ui (?) - you have the code, so then you can ask model of choice to refactor
## the composition lab
* maybe an agent whose job is to refactor all the code into the key tools we want to use (like maybe tailwind, or shadcn)