# 20240213 chris and siriana planning notes
* [project folder](https://drive.google.com/drive/folders/15i6EZDzWiKmbINlbQCmyQfk-upR2GaHt)
## chris
[chris's stable diffusion and control net doc](https://hackmd.io/@ll-23-24/r1e4nvGs6)
* running stable diffusion locally
* thinks students would have fun putting shapes on the screen, and then stable diffusion does the work to decide what everything is
* once you know what the windows do, it's very quick
* mk said if we're using it as a black box to generate
  * quick blender render of their shapes,
  * we could have shapes already there and they can move them around, etc.
* mk: teaching them rudimentary blender, and this just punches that up for the takeaway
* so blender station, and stable diffusion is last step, AI can help polish this off
* later on, we can use what they create as part of the pipeline (like backgrounds they can pose in front of)
* mk said we can have the before and after stills
* chris mentioned the pose decoder
* maybe motion capture zone later on
chris will do the blender + stable diffusion station
a series of machines that perform these steps would be slicker
## chris and siriana
what siriana wants to do/happy to do:
* teach basics of unity
* mechanics in unity
* story and character design
* she's happy to go where we need her and where others can't go
mk asked if we can connect story to unity this year
* even simple forking narrative structures based on where your character walks
* talking to an NPC that gives you two choices, you get to a new stage or level, or proximity detected (walk up to an object) - that little interaction
* siriana said there are packs you can download with set scenes (mostly 2d) that have interactions baked in; you can change what the NPC says to you. it's a little tricky but doable, since you work within the framework of an already-set game
* mk said we could also have more than 1 unity station, and it could also happen later when they come with assets
* mk said maybe the beginning is an intro to the idea of a video game engine, and then at the end we try to pull things together somehow
* station about a text-based game (colored cards)
* how to decorate with AI
* maybe a motif of all stations
* for text station, there's certain constraints you want to hold in place, but improvise quite a bit of the texture of the language
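the forking structure mk describes (an NPC offers two choices, each leading to a new stage) can be sketched as a tiny node graph. everything below — node names, text, choice labels — is invented for illustration, not from any unity asset pack:

```python
# minimal forking-narrative sketch: each node has text plus labeled
# choices pointing at other nodes. all names here are made up.
STORY = {
    "gate": {
        "text": "a guard blocks the gate. 'left path or right path?'",
        "choices": {"left": "forest", "right": "cave"},
    },
    "forest": {"text": "you reach the forest stage.", "choices": {}},
    "cave": {"text": "you reach the cave stage.", "choices": {}},
}

def advance(node, choice):
    """follow a labeled choice from the current node; stay put if invalid."""
    return STORY[node]["choices"].get(choice, node)
```

students could improvise the texture of the language (the `text` strings) while the constraint — which choice leads where — stays fixed, e.g. `advance("gate", "left")` lands on `"forest"`.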
siriana is happy to be the alpha or omega in the unity stations
* siriana will get back into unity
unity is aggressively technical and can be overwhelming, so we want to think about how to make the whole thing come together in the end in an impressive way
switching out cue sounds for collecting tokens in an asset pack like playground, changing the backing track, etc., if you have mp3s ready on the desktop
* test out unity with network storage, and see how easy it is to auto import (with scripts, like can you have a watch folder)
* **how can we generate assets at all stations and have them populate the machine at the end**
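one low-tech way to test the watch-folder idea before touching unity scripting: a plain polling sync that copies any new files from a shared drop folder into the end machine's assets directory (unity re-imports on editor refresh). the paths and the idea of a flat drop folder are assumptions, not a decided setup:

```python
import shutil
from pathlib import Path

def sync_new_assets(drop_dir, assets_dir):
    """copy files that exist in drop_dir but not yet in assets_dir.
    returns the names copied, so a poll loop can log what arrived."""
    drop, assets = Path(drop_dir), Path(assets_dir)
    assets.mkdir(parents=True, exist_ok=True)
    copied = []
    for src in sorted(drop.iterdir()):
        dest = assets / src.name
        if src.is_file() and not dest.exists():
            shutil.copy2(src, dest)
            copied.append(src.name)
    return copied

# a real watch loop would just call this every few seconds, e.g.
#   while True:
#       sync_new_assets("//network/drop", "EndStation/Assets/Imported")
#       time.sleep(5)
# (both paths are placeholders)
```

polling is crude but easy for fellows to debug on the day; if it works, the same shape could move into a unity editor script later.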
cd excited about the drum machine (percussion + foley?)
* mk said building a soundboard is definitely cool, any fellow who wants to do that is great
* midi controllers too
oculus would be an option at the omega station, maybe an input-device petting zoo? might be an extra decorative experience, but it'd be nice to have the oculus be one of the ways they can see what they did
* siriana has signed into it
* siriana has used spatial the most
## post
mk excited for chris to do the station; possibilities for ta-da moments
* render it out and decorate in stable diffusion
* importing asset into unity as a 3d object
* rendering a film out of it
* using it in conjunction with ...
other ideas
* adobe station
* sound effects
* generating textures from photos