# Audiovisual Experience
[TOC]
## To Dos
### High Level
> Legend: T = Tati; J = Juaco; C = Chris
- [ ] Explore Resolume C T
- [ ] Make visuals in:
- [ ] Resolume T C
    - [x] Processing C
- [ ] Touch Designer T
- [ ] Unreal T
- [ ] Other T
- [x] Meet w/ Guillem + Rizzutti for Resolume questions T C
- [x] Test out OSC protocols J C
    - [ ] Combine with hardware J C
- [x] Contact locations (from Monday 27 to Wednesday 29)
- [x] AKASHA C T
- [x] Hangar J
- [x] Buena Onda J
- [ ] Create interesting sounds C
- [ ] Meet up to discuss progress + findings + Fab challenge possibilities C T J
    - [ ] Then define timeline
- [ ] Set up next meeting with Carmen
## Meeting Notes
### Meeting w/ Guillem (09.05.22)
#### Two Approaches
- The radical approach: going for the most provocative, weirdest path possible
- The collection approach: going for a collection of small projects, combining them in the end
#### Investigate deeper into sampling
- In terms of arts and music, this is relatively new
-> voice recorder to play back samples? Recording the participants' voice
#### Archiving as curating
- this is a vital part of any curating: building an archive with video and audio content
-> MDEF snippets?
-> is every button doing the same thing every time? Or can it be used to explore the archive itself?
#### On interfaces
- look at it like a ritual:
    - Are people performing something?
    - Do they know they are interacting w/ something?
#### On agency
- people have agency; people as agents
- define a ruleset to guide the behavior of the visitors
- people will behave close to or according to these rules
#### On Resolume
- masks, filters, inputs (such as webcams)
- Bridge from processing to Resolume
- **Syphon** (http://syphon.v002.info)
- rendering to a virtual screen
- works with many clients
- works over the network
- **NDI** (https://www.ndi.tv)
- controlled via **OSC** (https://ccrma.stanford.edu/groups/osc/index.html)
#### OSC
- over the network
- can control video and audio triggers
- OSC Mk1(?) app (iOS) is a modular interface to trigger OSC messages
-> good for rapidly prototyping interfaces
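As a sketch of what an OSC trigger looks like on the wire: the message below can be built with the Python standard library alone. The `/composition/layers/1/clips/1/connect` address follows Resolume's documented OSC scheme and port 7000 is its default OSC input, but both should be verified against our Resolume version's preferences.

```python
import struct

def osc_message(address: str, value):
    """Encode a single-argument OSC message (int32 or float32) per the
    OSC 1.0 binary format: padded address string, padded type-tag string,
    then the big-endian argument."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary
        b += b"\x00"
        return b + b"\x00" * (-len(b) % 4)

    if isinstance(value, int):
        tag, payload = b",i", struct.pack(">i", value)
    elif isinstance(value, float):
        tag, payload = b",f", struct.pack(">f", value)
    else:
        raise TypeError("only int and float arguments in this sketch")
    return pad(address.encode("ascii")) + pad(tag) + payload

# The kind of address Resolume exposes for connecting a clip:
packet = osc_message("/composition/layers/1/clips/1/connect", 1)

# Sending it is a single UDP datagram, e.g.:
# import socket
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 7000))
```

In practice a library like python-osc does this encoding for us; seeing the raw format just demystifies what Resolume, QLAB and the phone apps are exchanging.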
#### Audio
- check if Resolume can trigger audio
- Related: **QLAB** (https://qlab.app) can trigger video sequentially
> (!) potential synergies with ML outputs
#### Navigating the latent space
- exploring the latent space of MDEF?
- get a great video and audio system in the space
- how do people explore that?
- is it two people or 100?
- is it *site-specific* or *artifact-specific*?
http://www.graffitiresearchlab.com/blog/
https://akasha.org/hub-bcn/
### Meeting w/ Carmen (Akasha Hub BCN)(13.05.22)
#### Possible timetable for installation night
- 7pm opening & networking
- 7:30 presentation
- 8:30 data collection
- 9-11pm networking
#### Input by Carmen, Objectives by Akasha
- attract people to expand in the network and collaborate
- Involve people, *make it sexy*
- how the event will be communicated depends on the concept and the desired scale
- we want people to stay from 7-11pm
- data collection to implement a feedback mechanism
- such as scanning QR-codes to poll on feedback
- but can also serve as direct data input to influence the installation
- what kind of effect do we want to have on people? What feeling do we want them to walk away with?
#### Central values of Akasha
- decentralisation
- sustainability (ecological, social)
- innovation
#### Support from Akasha
- Space
- Drinks
- PR
- Possible VJ support by [Na.B3](https://www.nab3.es/)
#### Other
- Open Planting Project (OPP) by Chris: Low-cost hydroponic planters -> possibly for 'Slow Lab' rooftop initiative
- Decentralized video platform (alternative to youtube): https://odysee.com
#### Resolume Questions
- How to work with OSC and other devices? Arduino & phones?
- How to sync music samples to video?
- Licenses and watermark?
## Resources
- [Resolume](https://resolume.com)
- [Touchdesigner](https://derivative.ca)
- [OSC Protocol](https://ccrma.stanford.edu/groups/osc/index.html) enables sending audio and video triggers over the network, across platforms
- Syphon (http://syphon.v002.info) enables rendering to a virtual screen
- NDI (https://www.ndi.tv)
- QLAB (https://qlab.app) simple software to trigger video sequentially
- Tutorial: How to sync Processing to Resolume via Spout. (https://www.youtube.com/watch?v=1OId5XCWD7A)
- Best fractal art generators for creative coders (https://aiartists.org/fractal-art-generators)
## Concept
:::info
Concept ideas and timeline here
:::
### Original Idea (06.05.22)
- Triggering soundclips through arduino
- Triggering mp4s alongside it in the same length (including an opacity layer)
- Different interfaces to touch:
- Biomaterials
- Plants
- Buttons
- Rotary knobs (→ controlling effects)
- sliders (→ controlling volume of loops?)
- Through that, building our own MIDI controller
    - Using an Arduino, probably
### Refined Concept (14.05.22)
- first thought: relating video footage of natural patterns to generative art that imitates them
- combining visuals from nature with mathematical patterns, such as fractal geometries
![](https://i.imgur.com/EST7ID5.jpg)
![](https://i.imgur.com/HS702G4.jpg)
### Refined Concept (17.05.22)
#### Existing Projects
- **Artificial Constellations**: https://tatiana-butts.github.io/tatiana-butts/weekly_reflections/term2_week10.html
- **Plant B**: https://plant-b.io
- **WebGeist**: https://github.com/chris-ernst/fabacademy-challenge-3
- **Sound decisions**: https://tatiana-butts.github.io/tatiana-butts/fabacademy/fabchallenge3.html
#### New Projects
- **VJ Set (live & interactive)**: Audiovisuals generated by touching buttons like a VJ/DJ set related to nature & algorithms
- **Fractal Loop (closed video loop)**: Morphing a mix of algorithmically created fractals & "natural" fractals in closed loop (no sound, no interactions)
- **Particles (interactive simulation instances)**: A Processing or Touchdesigner particle simulation
- **Possible NA.B3 project**: unknown
#### Exhibits Short List
1. (Main) Interactive audiovisual set
2. Fractal loop video
3. Interactive particle simulation
4. NA.B3 unknown artifact
5. WebGeist?
:::info
Questions by Clement:
- Why? - Drawing similarities from tech & nature
- What? - Interactive Installation
- Who? - Tech Hippy Folks
- Where? - Akasha Hub
- When? - End of June
:::
### Concept Summary for Carmen (18.05.22)
:::info
https://hackmd.io/@592_YzyWS_SxaHJOH3SSdg/rypywEMP5
:::
## Concept Summary
Natural and technological ecosystems are closely related, however counter-intuitive it might seem at first. Investigating these overlapping areas through our senses, we venture in hybrid worlds of sounds, visuals and haptics. Between self-mutating algorithms and recursive patterns in nature, we draw similarities and aim to provoke questions about a more sustainable and tangible view of technology.
## Possible Tech Stack
### [1] VJ Area: Hybrid Spaces (Working Title)
:::warning
Combined video, audio & hardware installation by Tatiana, Joaco, Chris.
:::
:::info
**TL;DR:** A live and interactive audiovisual installation on the main screen featuring a set of biomaterials and plants as physical triggers.
:::
#### Option 1: Trigger A & V separately
1. Arduino gathering sensor data
2. Arduino sending OSC to Touchdesigner/Resolume (see if feasible)
3. Arduino sending OSC to Ableton (works very nicely via M4L Connection Kit)
4. Sound and visuals arrive separately at the outputs
#### Option 2: control A & V through Resolume
1. Arduino gathering sensor data
2. Arduino sending OSC to Resolume (see if feasible)
3. Triggering audio samples in Resolume (or Resolume sending OSC to Ableton to trigger samples)
4. Sound and visuals are both output by Resolume
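Whichever option wins, the Arduino-to-OSC hop can be prototyped with a small Python bridge on the laptop: the Arduino prints one `name:value` line per reading over serial, and the bridge parses, normalizes and forwards it. A minimal sketch of the parsing half; the sensor name `pot1` and the 10-bit (0-1023) range are illustrative and must match the actual Arduino sketch:

```python
def parse_sensor_line(line: str, max_raw: int = 1023):
    """Parse a serial line like 'pot1:512' into (osc_address, value),
    with the raw 10-bit reading normalized to 0.0-1.0.
    Returns None for malformed lines so the bridge can skip them."""
    name, _, raw = line.strip().partition(":")
    if not name or not raw.isdigit():
        return None
    value = min(int(raw), max_raw) / max_raw
    return (f"/sensor/{name}", value)

# The forwarding loop would pair this with pyserial and python-osc, e.g.:
#   for line in serial.Serial("/dev/ttyUSB0", 115200):
#       msg = parse_sensor_line(line.decode(errors="ignore"))
#       if msg:
#           client.send_message(*msg)   # SimpleUDPClient -> Resolume/Ableton
```

Keeping the parsing pure like this makes it easy to test without hardware plugged in.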
#### Option 3?
### [2] Fractals at different scales
:::warning
Video loop by Tatiana.
:::
:::info
**TL;DR:** A video loop exploring recursive forms in nature and mathematics.
:::
This project investigates how fractals and other mathematical patterns appear in both the natural and the digital world. It would be a video that morphs a mix of algorithmically created fractals and "natural" fractals, without any sound or interaction. The loop would be a few minutes long and displayed on a screen.
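The algorithmic half of the loop could start from the classic escape-time rendering of the Mandelbrot set. A minimal Python sketch (the grid size, zoom window and character ramp are arbitrary choices for illustration; a real frame renderer would map escape counts to colors instead):

```python
def mandelbrot_escape(c: complex, max_iter: int = 50) -> int:
    """Return the iteration at which z = z*z + c escapes |z| > 2,
    or max_iter if it stays bounded -- the classic escape-time value."""
    z = 0j
    for i in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return i
    return max_iter

# Render one tiny ASCII frame; an animated loop would re-render this grid
# each frame while shrinking/shifting the zoom window.
for y in range(11):
    row = ""
    for x in range(40):
        c = complex(-2.2 + x * 0.075, -1.25 + y * 0.25)
        row += " .:-=+*#%@"[min(mandelbrot_escape(c) // 5, 9)]
    print(row)
```

The same escape-time idea transfers directly to a Processing or Touchdesigner shader for the actual video loop.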
### [3] Sonic Ecologies
:::warning
Installation by Joaco.
:::
:::info
**TL;DR:** An interactive experiment distributed across six small stations, inviting participants to shape clay from Collserola park, inspired by audio recordings taken there. The clay artefacts give sonic feedback when touched.
:::
### [4] Interactive Particles (interactive simulation instances)
:::warning
Interactive visuals by Tatiana & Chris.
:::
:::info
**TL;DR:** A live and interactive visual installation inviting participants to move particles through motion inputs.
:::
Gathering motion data input from either [Kinect](https://en.wikipedia.org/wiki/Kinect) (movement, postures) or [Leap Motion](https://en.wikipedia.org/wiki/Leap_Motion) (hands, gestures), we can let visitors interact with a particle simulation running in Touchdesigner or Processing. The particles will move according to natural laws, akin to snowflakes, water, microorganisms or smoke (tbd).
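The update loop itself is simple regardless of the host (Touchdesigner, Processing or plain Python). A minimal sketch with illustrative drift and jitter parameters; the Kinect/Leap Motion input would later replace or steer the drift vector:

```python
import random

def step(particles, dt=0.1, drift=(0.0, -0.02), jitter=0.01):
    """Advance each (x, y, vx, vy) particle one time step: constant drift
    (e.g. gravity for snowflakes) plus random jitter (Brownian wobble),
    wrapping positions into the unit square so the field loops forever."""
    out = []
    for x, y, vx, vy in particles:
        vx += drift[0] * dt + random.uniform(-jitter, jitter)
        vy += drift[1] * dt + random.uniform(-jitter, jitter)
        out.append(((x + vx * dt) % 1.0, (y + vy * dt) % 1.0, vx, vy))
    return out

# Seed a small field and run a few frames; in Touchdesigner/Processing the
# same update would feed point positions straight to the renderer.
particles = [(random.random(), random.random(), 0.0, 0.0) for _ in range(100)]
for _ in range(10):
    particles = step(particles)
```

Different "materials" (water, smoke, microorganisms) then come down to different drift/jitter settings and render styles on top of the same loop.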
### [5] 5th exhibit & further ideas:
- Possible NA.B3 project
- minting generative NFTs through [p5.js](https://p5js.org) & [fxhash](https://www.fxhash.xyz) ([example](https://editor.p5js.org/chris-ernst/sketches/216APSlbx))
- Mycelium Growth from User input
- Build a particle simulation in Touchdesigner
- Mint NFT as Event Commemorative
- Issue a POAP as a Proof of attendance
## Data Collection & Feedback
Exhibits [1], [3] and [4] will be interactive, so a substantial amount of visitor interaction is already built in. Beyond that, we are thinking about collecting thoughts and input on the exhibition itself through a QR code that leads to a web form. There are two possible ways to continue:
- **live feedback:** set up an online form that, when submitted, could trigger an audio sample and potentially display the words on the screen
- **sequential feedback:** a QR code linking to an online form, collecting data to analyze & visualize at the end of the event
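The live-feedback variant could be prototyped with nothing but the Python standard library: a tiny HTTP endpoint receives the form POST and reacts. The form field name `feedback` and the "react here" hook are placeholders; in the installation that hook would send an OSC trigger and push the text to the screen.

```python
import threading, urllib.parse, urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

messages = []  # feedback collected during the event

class FeedbackHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        fields = urllib.parse.parse_qs(self.rfile.read(length).decode())
        messages.append(fields.get("feedback", [""])[0])
        # React here: e.g. forward an OSC trigger to Resolume/Ableton
        # and display the submitted words on the main screen.
        self.send_response(204)
        self.end_headers()

    def log_message(self, *args):  # silence per-request console noise
        pass

server = HTTPServer(("127.0.0.1", 0), FeedbackHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Simulate one visitor submitting the QR-code form:
body = urllib.parse.urlencode({"feedback": "loved the fractals"}).encode()
urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/", data=body)
server.shutdown()
```

For the real event a hosted form (with the QR code pointing at it) plus a small poller would be more robust, but this shows the whole feedback-to-trigger path in one file.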
### NFTs as commemoratives
- Mint NFT as Event Commemorative
- Issue a POAP as a Proof of attendance: https://poap.xyz
## Floorplan
This is a possible layout of the exhibition space. It aims to let people move throughout the space in an organic way and interact with the different exhibits over time. We hope that this distributed way of arranging the pieces will also enable conversations to form dynamically as people interact with the objects and each other.
![](https://i.imgur.com/JHnHIx9.jpg)
## Progress
https://www.icloud.com/iclouddrive/020z0kdnRfPmfv1oasfaBL93w#8a5453d5-3b48-4cc7-9fe2-8b9ada9eec48
https://editor.p5js.org/chris-ernst/sketches/psaUNtskEl