###### tags: `Parsons` `webanimations` `svg` `vexflow` `heartbeat`
# Prototypes
## 1. From EaselJS to Web Animations
> aug-sept 2019 [**link↗**](https://heartbeatjs.org/prototypes/webanimations1/)<br/>
> **summary:** ==the prototype shows all avatars with all the available animations; the animations run at 40 bpm.==
When the first version of Groovy was built back in 2013, Flash, then the default animation framework, had just been dethroned. There was no clear successor yet, but many former Flash developers started using EaselJS because it was developed very much from a Flash perspective. Also, its main developer, Grant Skinner, was well known for his Flash libraries such as Tween (later ported to JavaScript as well).
But like most great libraries, EaselJS gradually became obsolete with the rise of numerous W3C APIs. And because of the rapid adoption of ESNext, the original EaselJS codebase feels a bit like a dinosaur. A new version of EaselJS in modern JavaScript is on its way, but it has been in beta for many years now. Meanwhile a new W3C API for animations has been implemented in modern (and also in not so modern) browsers: the Web Animations API, or WAAPI.
In EaselJS, animations are mostly made using spritesheets. This is a simple and powerful technique that has been used for button mouseovers since the late '90s. What we needed EaselJS for was the ability to control the spritesheet animation, for instance the playback speed and the position of the playhead.
With the WAAPI all of this has become available in native JavaScript, so I made a prototype that uses the original spritesheet data of Groovy2 and plain JavaScript. The spritesheet data consists of one or more images that are the actual spritesheets, and a document that describes the animations; for instance, the walk animation should use the frames 2, 3, 4, 16, 18, 2, 5.
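To give an idea of how this maps to the WAAPI, here is a minimal sketch; the frame width, the element, and the single-row frame layout are assumptions, not the actual Groovy2 data:

```javascript
// A minimal sketch of driving a spritesheet with the WAAPI.
// Assumptions: frames are 128px wide and laid out in a single row,
// and .avatar is a div with the spritesheet as its background image.
const FRAME_WIDTH = 128;
const avatar = document.querySelector('.avatar');

// the walk animation as described in the exported animation document
const walkFrames = [2, 3, 4, 16, 18, 2, 5];

// translate each frame index to a background position;
// steps(1, end) jumps to the next frame without interpolation
const keyframes = walkFrames.map((index) => ({
  backgroundPosition: `${-index * FRAME_WIDTH}px 0px`,
  easing: 'steps(1, end)',
}));

const walk = avatar.animate(keyframes, {
  duration: 700,        // one walk cycle in milliseconds
  iterations: Infinity, // loop forever
});

// the control we used EaselJS for, now native:
walk.playbackRate = 1.5; // playback speed
walk.currentTime = 200;  // playhead position in milliseconds
```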
This data was exported from Flash because the artwork was originally created in Flash. It is important to note that Flash is still a fantastic authoring tool for timeline animations; we just can't use it as a runtime engine anymore because Flash Player support has been phased out.
## 2. SVG in Web Animations
> aug-sept 2019 [**link↗**](https://heartbeatjs.org/prototypes/webanimations2/)<br/>
> **summary:** ==the prototype shows the difference between a scaled-up Groovy2 part animation in SVG and in bitmap format.==
In the previous prototype I used spritesheet images in bitmap format (PNG). While this works well, scaling is limited; you can scale up a bitmap only to a certain extent before it becomes blurry and jagged. A solution could be to use a source set like in the `<picture>` element: it tells the browser which bitmap file to load depending on the size of the viewport. For instance, if the user has a large desktop monitor, a bitmap with a higher resolution will be loaded.
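The same idea can be expressed in script as well; a minimal sketch with made-up file names that picks a spritesheet resolution based on the viewport, similar to what `<picture>` does declaratively:

```javascript
// A minimal sketch: choose a spritesheet that matches the viewport,
// like <picture> with a source set, but in script. File names are made up.
const sheet = window.matchMedia('(min-width: 1200px)').matches
  ? 'avatar-walk@2x.png'
  : window.matchMedia('(min-width: 600px)').matches
    ? 'avatar-walk.png'
    : 'avatar-walk-small.png';

document.querySelector('.avatar').style.backgroundImage = `url(${sheet})`;
```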
Another solution is to use SVG, which stands for Scalable Vector Graphics and is scalable by nature. Flash uses a form of scalable vectors as well. An upside of SVG, besides its scalability, is that SVG files can be very small and are very easy to manipulate by script or CSS. A downside is that it is more work to export a timeline animation from Flash in SVG format.
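Manipulating SVG by script works with the exact same WAAPI calls as before; a minimal sketch, assuming an inline SVG that contains a hypothetical group with the id `left-arm`:

```javascript
// A minimal sketch: the WAAPI works on SVG elements just like on HTML elements.
// #left-arm is a hypothetical group inside an inline SVG.
const arm = document.querySelector('#left-arm');

arm.animate(
  [
    { transform: 'rotate(0deg)' },
    { transform: 'rotate(25deg)' },
    { transform: 'rotate(0deg)' },
  ],
  { duration: 1500, iterations: Infinity }
);
// note: you may need to set transform-box and transform-origin in CSS
// to get the expected pivot point on SVG elements
```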
## 3. Web Animations connected to VexFlow and heartbeat
> oct-nov 2019 [**link↗**](https://heartbeatjs.org/prototypes/webanimations3/)<br/>
> **summary:** ==the prototype shows a web animation that is synced to a VexFlow score, played back by the heartbeat websequencer.==
This prototype consists of three parts:
1. A heartbeat song (a simple scale)
2. A rendering of that song to a VexFlow score
3. A web animation that follows the score while the song plays
The web animation is a Groovy avatar like the ones you saw in the first prototype. The animation is synchronized to both the bpm of the song and the playhead position of the song. The connection between VexFlow and heartbeat is accomplished using the code of this [open source example](https://github.com/abudaan/vexflow-test). The prototype is scalable and adjusts to the screen size automatically.
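A minimal sketch of that coupling; `song.bpm`, `song.millis`, and `song.durationMillis` are assumed accessors, not necessarily heartbeat's actual API, and `moveAvatarTo` is a hypothetical helper:

```javascript
// A minimal sketch of coupling the avatar animation to the song.
// song.bpm, song.millis and song.durationMillis are assumptions,
// not necessarily heartbeat's actual API.
const REFERENCE_BPM = 40; // tempo the walk cycle was authored at (assumption)

function syncAvatar(song, walk) {
  // scale the walk cycle with the tempo of the song
  walk.playbackRate = song.bpm / REFERENCE_BPM;

  // move the avatar along the score, proportional to the song position
  const progress = song.millis / song.durationMillis;
  moveAvatarTo(progress); // hypothetical helper that positions the avatar
}
```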
If you click the play button, the song plays back, highlighting the notes in the score as they are played. The avatar starts its 'walking' animation and follows the position of the song. If you click on a note, the avatar points to a tooltip that displays the name of the note.
The next prototype will build on this one, adding features such as a slider to adjust the tempo (bpm) of the song and more interactivity, for instance dragging the avatar to a position in the song.
## 4. Synchronize live MIDI with rendered MIDI
> dec 2019 [**link↗**](https://heartbeatjs.org/prototypes/video-sync/)<br/>
> **summary:** ==the prototype shows that a MIDI file rendered to audio can be synchronized to live MIDI playback of that same file in a sequencer.==
By rendering MIDI to audio I mean: the MIDI file exported to an audio file. In Logic this function is called 'bounce'; in most other DAWs it is simply called 'export to audio'.
This prototype works equally well for audio and video files; in this case I chose to use video. I created the video by recording my screen while [Pianoteq](https://www.modartt.com/pianoteq) was playing back a MIDI file of Mozart's Sonata Facile.
I embedded the video in a web page that also loads the same MIDI file into the heartbeat websequencer, and I added some logic to trigger and control the playback of the sequencer when the video controls are used.
When two or more entities (devices, programs, etc.) are synchronized, there is usually a master and one or more slaves; in this prototype the video is the master and the sequencer is the slave.
In the video you will see a short introduction that shows how I load the MIDI file into Pianoteq and select an instrument for playback. Because of this, there is an offset of about 23 seconds before the rendered audio starts, as you can see in this screenshot of the video in [kdenlive](https://kdenlive.org/en/):

The synchronization is done simply by matching the playhead positions of the video and the sequencer. Because of the offset, the playhead of the sequencer is only controlled by the video when the video's playhead position is > offset and < (offset + duration of the song).
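In code, the coupling boils down to something like this minimal sketch; the helpers that wrap the heartbeat sequencer are hypothetical, and the song duration is made up:

```javascript
// A minimal sketch of the master (video) driving the slave (sequencer).
// setSequencerMillis and deactivateSequencer are hypothetical helpers
// wrapping the heartbeat websequencer, not its actual API.
const OFFSET = 23;        // seconds of introduction before the rendered audio starts
const SONG_DURATION = 90; // duration of the song in seconds (made up)
const video = document.querySelector('video');

video.addEventListener('timeupdate', () => {
  const t = video.currentTime;
  if (t > OFFSET && t < OFFSET + SONG_DURATION) {
    // inside the song: the video controls the sequencer playhead
    setSequencerMillis((t - OFFSET) * 1000);
  } else {
    // outside the song: "sequencer inactive"
    deactivateSequencer();
  }
});
```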
In the UI you can see when the video controls the sequencer: when the playhead of the sequencer is not being controlled, it reads "sequencer inactive" below the video; when the video does control the sequencer, you see the bars-and-beats position as calculated (in real time) by the sequencer. This happens both during regular playback and when you drag the video position slider to seek.
Note that this prototype uses the same source (the MIDI file) for both the audio and the MIDI; therefore we only need a single synchronization point, and from there both playheads can keep running in their own threads. If you use a different source for the audio, for instance a performance of the piece on YouTube, this prototype falls short.
There are two solutions for this. Both need to be done manually, and the more the audio file differs from the MIDI file in terms of interpretation (tempo changes, volume changes, etc.), the more work it will involve:
1. add more synchronization points to the code that synchronizes the playheads
2. add the timing of the audio file as a tempo track to the MIDI file (sketched below)
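In data terms, the second option boils down to a list of set-tempo events interleaved with the notes; a minimal sketch with made-up values:

```javascript
// A minimal sketch of a tempo track: a list of set-tempo events.
// All values are made up for illustration.
const tempoTrack = [
  { ticks: 0,   bpm: 120 }, // opening tempo
  { ticks: 480, bpm: 112 }, // slight ritardando
  { ticks: 960, bpm: 126 }, // pushing forward again
  // ...one event per tempo inflection in the recorded performance
];
```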
The second option is the most elegant solution, so I decided to give it a try. My plan was to sync the MIDI file to a video of a beautiful interpretation of the piece by Daniel Barenboim; you can watch it [here](https://www.youtube.com/watch?v=1vDxlnJVvW8), and as you can hear, the tempo is *very* fluid.
I extracted the audio from the video and let Logic create a tempo map, but the result was very poor, probably because of the many tempo changes and the lack of a clear beat. The result in Cubase was a bit better, but I still needed to make a lot of manual adjustments.
The problem is that it isn't sufficient to adjust the tempo per bar; even within a bar there are many tempo changes, for instance in the fast scales in bars 5 to 10. This means that in some bars you have to adjust the tempo almost at the 8th-note or even 16th-note level. It is doable, but it is a lot of work; I managed to get to bar 16, and that alone took me over two hours of editing.
And so far we have only discussed tempo; there is also volume, which needs to be adjusted per phrase or even per note in order to match the interpretation.
But in this example the audio file differs a lot from the MIDI file; I can imagine that for learning purposes the MIDI files and their audio interpretations will be more alike, so in those cases extracting tempo maps from audio files and adding them to the MIDI files might be sufficient.
## 5. VAM Player
> march 2020 [**link↗**](https://heartbeatjs.org/prototypes/VAM.alpha/)<br/>
> **summary:** ==alpha version of a player that can play multiple additional audio and/or MIDI tracks together with a video.==
VAM stands for Video, Audio, MIDI. The concept is that you can enrich an existing video with additional audio and MIDI tracks. The player comes with an editor that allows you to adjust the start position of an audio or MIDI track. You can also resize the additional tracks if you want to use excerpts of the original files.
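A minimal sketch of what such a VAM document could look like; the field names and files are assumptions, not the actual VAM format:

```javascript
// A minimal sketch of a VAM document; field names and files are
// assumptions, not the actual VAM format.
const vamProject = {
  video: { url: 'lesson.mp4' },
  tracks: [
    {
      type: 'audio',
      url: 'accompaniment.mp3',
      start: 2.5,     // seconds into the video where the track starts
      offset: 0.0,    // seconds trimmed from the start of the file
      duration: 30.0, // length of the excerpt in seconds
    },
    {
      type: 'midi',   // not supported yet in the alpha version
      url: 'melody.mid',
      start: 2.5,
      offset: 1.2,
      duration: 30.0,
    },
  ],
};
```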
### Current version:
1. doesn't support MIDI files.
2. doesn't have a playback engine yet.
3. you can upload to the editor any video or audio file type that modern browsers support.
4. you can adjust the zoom level for more precise positioning of the tracks.
5. tracks can be resized and moved.
### Upcoming version:
- fix issues 1) and 2) listed above
- allow adding files by direct URL (for instance to CDN / cloud storage)
- add volume control per track so you can adjust the volumes of the additional tracks relative to each other and to the audio track of the video (if present)
- show track information, for instance duration, name, size, and date modified, in a separate pane.
- add playback controls.
- add a playhead that follows the song during playback and can be dragged to set the playback position.
- integration with the Parsons environment