# VR Research

**Instagram handles**

Arko - [arccc.co](https://www.instagram.com/arccc.co/)
Nilanjana - [onetwistedseahorse](https://www.instagram.com/onetwistedseahorse/)
Udita - [udita.palit](https://www.instagram.com/udita.palit/)
Chinmay - [bhagat_chinmay96](https://www.instagram.com/bhagat_chinmay96)
Karthik - [karth13k](https://www.instagram.com/karth13k/)

# **Chinmay Bhagat**

**Introduction**

The purpose of compiling this daily journal is to document and reflect on my experiences of creative research in the ArtScience Bangalore space offered by Srishti Manipal Institute. It also includes my personal interpretations of the engagements, alongside studio artifacts and discoveries charted in this transdisciplinary space.

**Theme**

*Zoophony is a collection of fauna sounds, choreographed to create an immersive experience.*

*Chaos arrives and transforms into a perceived silence, followed by an overwhelming wash of hissing, buzzing and the whirring of insects; then the absence of chaos creeps up on you. How strange!*

The composer behind this audio experience is Sergei Khismatov, an aspiring Berlin-based audio composer who reflects on his encounters with fauna while exploring the world. He envisions his thoughts through experimental audio mediums such as electronic music.

**Purpose**

Based on the above theme, the context chosen by the creative team lead, Yashas Shetty, and artist in residence Chaitanya Krishnan (a professional digital artist in residence at Srishti Manipal and facilitator for the Digital Media Arts discipline) was to explore these innate experiences through immersive mediums such as virtual reality and spatial audio. Virtual reality showcases the visual experience, while spatial audio showcases the auditory experience.
Since I am pursuing my specialization in the experience design stream, the relevance I see in this space and discipline of creative research is understanding and reflecting on the processes involved in three-dimensional spatial and visual experiences through immersive media such as virtual reality, and how I, as a creative practitioner, can reflect these experiences in my pursuit of creative practice.

**Timeframe**

5 weeks (October 25, 2021 - December 04, 2021)

## **Week 1**

**October 25, 2021**

**Studio Experience**

* Commencement of the TDR space - ArtScience BLR.
* Introduction to HackMD to document our creative research process.

## **October 26, 2021**

**Studio Experience**

* I chose the theme - VR & spatial audio.
* I started to research VR alongside my peers. I discovered several resources (web articles, research papers).
* I also included my **personal experiences** with VR to check my biases.
* Meanwhile, I came to realize that the medium also has its own peculiar set of limitations. I consider the **socio-cultural factors** more concerning than the technological aspects.
* I find that 3D and VR technologies require **modern, contemporary gadgets** and an ecosystem to function. In this space, we are encouraged to adopt frugal or lite online versions which can be demonstrated using **minimal equipment**.
* I learnt that VR can be simulated on the web rather than needing immersive equipment such as goggles.

## **Reference**

1. Storytelling for Virtual Reality: Methods & Principles for Crafting Immersive Narratives (https://www.researchgate.net/publication/321154281_Storytelling_for_Virtual_Reality_Methods_and_Principles_for_Crafting_Immersive_Narratives)
2. WebXR emulator extension https://blog.mozvr.com/webxr-emulator-extension/
3. Verge - The rise & fall & rise of Virtual Reality https://www.theverge.com/a/virtual-reality

## **October 27, 2021**

**Studio Experience**

* I explored a VR sketching tool (Tilt Brush) that allowed me to rediscover sketching.
Sketching, which we all know happens on 2D surfaces, can now be explored in 3D space.
* I saw my peers unleashing their artistic side through free strokes, but I wanted to explore something more definitive, which I wasn't able to create.
* I was introduced to VR space in Unity through a demonstration of one of the immersive environments by Karthik in the VR lab at the N6 campus. His creation was vivid in terms of the experience it offered.

***Reference***

1. VR Tiltbrush https://tiltbrush.com/
2. SketchFab https://sketchfab.com/chaitanyak/collections/theisro

## **October 28, 2021**

**Studio Experience**

* I was introduced by Yashas to basic operations in electronic music, explored through various platforms - a MIDI piano and digital software on laptop and iPad. It was fascinating to observe, let alone perform.
* In addition, he introduced us to audio recording, management and post-processing techniques using the open-source tool Audacity.
* Afterwards, as a group, we brainstormed how a similar experience could be speculated as an alternative reality in either 3D or VR space.
* I started with a concept demonstrated through sketches and mockups as an explorative process to understand my own thinking. I chose a simple interaction, considering the limitations of online 3D / VR.

***Reference***

1. Audacity https://www.audacityteam.org/

## **October 29, 2021**

**Studio Experience & Asynchronous Mode**

Udita and I assisted Chaitanya. Our task was to remove the backgrounds from a given set of animal images, done in Photoshop. Chaitanya said he wanted to use them to explore a mockup in 3D space.

***Reference***

1. Google Docs https://drive.google.com/drive/folders/1e4y87L5TZBmsrApUmWJAcZw1Yw-OG136
2.
SketchFab Mockup https://sketchfab.com/3d-models/proto002-explorable-level-51652e45625c40d2a2cfcc8a46346af9

## **October 30, 2021**

**Studio Experience & Asynchronous Mode**

Chaitanya introduced us to the basics of 3D modelling in Blender and asked us to explore it at our own pace during the weekend and the Diwali holiday week.

***Reference***

Blender webpage https://www.blender.org/

## Gap Week

**November 01 - 05, 2021**

**Asynchronous**

* Since we had a long holiday week, I had ample time to explore Blender and grasp the fundamentals.
* Chaitanya shared some YouTube videos as learning references. I find the tutorials interesting, but difficult to follow in real time.
* Having used Maya previously, I found Blender to have a more intuitive user experience and a more streamlined workspace. Blender is also open source, meaning I can explore the tool without limitations.
* My main focus was exploring primitive shapes rather than entering the manipulation / modelling workspace.

***Reference***

![](https://i.imgur.com/lRyn4o0.png)
*Image 1 - Chair Orthographic Reference Drawing*
![](https://i.imgur.com/UwY1wIk.png)
*Image 2 - Chair Modelling Process (Blender Screenshot)*

1. Blender Guru - Modelling From Blueprints (YouTube video) https://www.youtube.com/watch?v=Hf2esGA7vCc&ab_channel=BlenderGuru
2. Basic Table Chair Personal Model (SketchFab) https://sketchfab.com/3d-models/basic-table-chair-set-35e7306a709849a89b7dbcb08f02c9dc

# **Week 2**

**November 08, 2021**

**Online**

* A brief meetup with everyone was facilitated by Yashas. The agenda for the week was discussed, considering the quarantine notice which had changed the timetable.
* Since I am in the VR space, most of us are going to proceed in online mode. Chaitanya presented some of his concepts for the week, which were interesting.
* The collaboration task Chaitanya requested for the week is to model a set of animals based on his vision and Sergei's requirements.
***Reference***

![](https://i.imgur.com/HisFoi4.png)
*Image 1 - zoophony visual concept 1 proposed by Chaitanya*
![](https://i.imgur.com/52hXuk9.png)
*Image 2 - zoophony visual concept 2 proposed by Chaitanya*

SketchFab Links
1. https://sketchfab.com/models/9c705d7703fb494dae0c2edb20111d6f/edit
2. https://sketchfab.com/3d-models/inverted-world-88a46486959a442da389f57aeb9f5166

## **November 09, 2021**

**Asynchronous**

Chaitanya assigned us the task of modelling birds in Blender, which I assume are essential assets for visualising Sergei's zoophony project. My peers and I were anxious, as I had never attempted to model complex subjects, especially organic entities such as fauna. Chaitanya told us not to panic and gave us the following guidance.

* He said that we start with a cube and mirror the changes made.
* To simplify the process and speed up productivity, he took one animal - a hawk - and modelled it in Blender. He then instructed us to choose other hawk variants in the set.
* He specifically mentioned that we should derive our chosen variants from the hawk he modelled in class as a demonstration, rather than start from scratch.
* My personal experience with the modelling process is that organic entities such as fauna are much trickier than I had anticipated, in contrast to modelling synthetic, man-made objects.

***Reference***

![](https://i.imgur.com/3Qd1ull.jpg)
*Image 1 - Chaitanya's Reference Model - sparrowhawk (blender screenshot)*

## **November 10, 2021**

**Asynchronous & Online**

* I continued with the animal model - a Harris hawk, as requested by Chaitanya.
* I showcased some of my modelling process to gain feedback.
* The references I used for the hawk are open-source photos - side view, 3/4 view and front view.
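The "start with a cube and mirror the changes" approach can be sketched outside Blender as a simple symmetry operation: only one half of the bird is modelled, and every vertex is reflected across the centre plane, which is roughly what Blender's Mirror modifier does. A minimal pure-Python illustration; the vertex coordinates are invented for the example:

```python
# Sketch of what a mirror workflow does: reflect vertices across the
# YZ plane (x -> -x) so only half the bird needs to be modelled by hand.
# The "wing" coordinates below are invented for illustration.

def mirror_x(vertices):
    """Return mirrored copies of (x, y, z) vertices across the YZ plane."""
    return [(-x, y, z) for (x, y, z) in vertices]

# Half of a very simplified wing, modelled on the +X side only.
half_wing = [(1.0, 0.0, 0.0), (2.0, 0.5, 0.2), (3.0, 0.2, 0.1)]

# The full mesh is the original half plus its mirror image.
full_wing = half_wing + mirror_x(half_wing)
print(full_wing)
```

The payoff is that any edit made to the modelled half is automatically matched on the other side, which is why symmetric animals are a natural fit for this workflow.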
*Key Points - What I have discovered through the modelling process in Blender is the need to develop a coherent understanding of mesh geometry manipulation and of visualizing the scale and proportions of reference images.*

***Reference***

![](https://i.imgur.com/anomIqY.jpg)
*Image - Harris Hawk (Image Courtesy - Pixabay)*
![](https://i.imgur.com/h5Robbn.jpg)
*Image - Harris Hawk Model (Blender Screenshot)*

**November 11, 2021**

**Asynchronous & Online**

I finished the hawk model and proceeded to the catbird, as discussed with Chaitanya yesterday. Today Chaitanya gave essential feedback on the catbird model and also assisted me during the modelling process. I chose two different entities belonging to the same group - birds - to understand the following.

* The key takeaway was reflecting on the differences between the entities and experiences.
* For instance, the catbird is a small bird and its **proportions** are **closely** spaced. I also observed that the bird's stance is close to the ground.
* The Harris hawk is a large bird and its **proportions** are **large**. I find its stance to be higher above the ground. I also decided to model their postures at rest, as the flying posture requires me to further understand how they interact in flight.
* The form and modelling of flight require an understanding of advanced concepts of form and function such as **postures and variable geometry**.
* In the modelling process, I feel the need to **optimize** my models, as flying postures demand the generation of complex geometry.

***Reference***

![](https://i.imgur.com/h5Robbn.jpg)
*Image 1 Harris Hawk (blender screenshot)*
![](https://i.imgur.com/3iMTPG3.jpg)
*Image - Harris Hawk (Image Courtesy - Pixabay)*
![](https://i.imgur.com/iiFXPtQ.jpg)
*Image 2 Catbird (blender screenshot)*

## **November 12, 2021**

Today's discussion with Chaitanya was brief.
* Next week, Chaitanya will demonstrate how animation can be implemented, which I feel will render lively experiences from our static creations.
* To conclude the week, Chaitanya suggested I try a cobra to explore the modelling process independently.

# **Week 3**

**November 15, 2021**

**Studio Experience**

* Today Chaitanya showed how to implement basic animation and rigging on one of the models created as part of the zoophony project.
* Today's engagement, I believe, was a critical one, as I am learning about animation, which translates into dynamic experiences. I have wanted to learn animation for a long time but always felt it was complicated.
* He showcased two methods - one through **keyframing** and the other being **armature** driven.
* The difference between the two lies in the **implementation** - I feel the armature method is more adaptive and suited to animating organic entities, while keyframing can be used for primitive entities.

I need to direct my focus now towards understanding the **animation workspace** over the modelling workspace, as it is intriguing and intimidating simultaneously. I realized that in animation I need to plan what experiences my creation intends to demonstrate. It can be literal or abstract.
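At its core, the keyframing method boils down to the software storing a property's value at chosen frames and interpolating in between. A rough pure-Python sketch of linear keyframe interpolation (Blender actually defaults to Bézier easing, so this is a deliberate simplification, and the flap angles are invented):

```python
def interpolate(keyframes, frame):
    """Linearly interpolate a value between keyframes.

    keyframes: sorted list of (frame_number, value) pairs.
    """
    # Clamp outside the keyed range.
    if frame <= keyframes[0][0]:
        return keyframes[0][1]
    if frame >= keyframes[-1][0]:
        return keyframes[-1][1]
    # Find the surrounding pair of keyframes and blend linearly.
    for (f0, v0), (f1, v1) in zip(keyframes, keyframes[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)

# A wing-flap angle keyed at frames 1, 12 and 24 (at 24 fps, one flap per second).
flap = [(1, 0.0), (12, 45.0), (24, 0.0)]
print(interpolate(flap, 6))   # partway up the flap
```

The armature method adds a layer on top of this: the keyframed values drive bone rotations, and the mesh deforms to follow the bones, which is why it suits organic entities.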
Meanwhile, I also showed my work-in-progress cobra to Chaitanya for feedback. After the interactions:

*Key Takeaway - I realized that modelling an organic entity such as a cobra requires considerable experience in understanding **complex organic forms** and **proportions**, along with **subdivision of surfaces**. He also suggested I opt for a simple pose rather than a complex one when tackling organic entities.*

*Key Points - In the case of **armature animation**, it's recommended to place the primary bone and joint at the **centre or midsection**, which in turn promotes **harmonious motion**. The armature structure also has to be placed at the centre.*

***Reference***

![](https://i.imgur.com/45IgAmE.jpg)
*Image 1 - cobra reference image (Image Courtesy - Pixabay)*
![](https://i.imgur.com/VAcDLv0.png)
*Image 2 - cobra final model (blender screenshot)*
![](https://i.imgur.com/1tpOugM.png)
*Image 3 - sparrowhawk: keyframe animation (blender screenshot)*
![](https://i.imgur.com/sRGEWC8.png)
*Image 4 - harris hawk: armature animation (blender screenshot)*

## **November 16, 2021**

**Studio Experience**

Today's discussion was a key one, as we are planning how to stage the immersive experience using the zoophony audio composition as a reference framework. An intense brainstorming session commenced. Meanwhile, I decided to take on another animal from zoophony to reflect on my modelling skills - the animal I chose to model independently (with minimal assistance from Chaitanya) was a donkey.

*Key Points - Chaitanya showcased a mockup composition containing an excerpt of zoophony. Karthik explained the technical aspects of Unity, where the models are going to be imported.
The rest of us speculated on possible creative themes for staging the animals.*

***Reference***

![](https://i.imgur.com/q1DZuKo.jpg)
*Image 1 Chaitanya's Zoophony Prototype v0.1 (blender screenshot)*
![](https://i.imgur.com/awQxv3V.png)
*Image 2 donkey (dimension.com screenshot)*

## **November 17, 2021**

**Studio Experience**

Yashas met with us for quite a while. He was **astonished** by the level of dedication put in by both groups. According to his vision, we are almost **midway** through realizing the project for VR. For us in the VR group, however, he critiqued that he wishes for more **personal reflection**. On the project front, we all started brainstorming various themes to stage zoophony. One theme, proposed by Chaitanya and selected, was a small room that is going to be occupied by animals. The theme itself is **thought-provoking**, as I am imagining a lot of things. The concept was thought up by Chaitanya, while the room's visual concept and model were undertaken by Udita.

***Reference***

![](https://i.imgur.com/HuFXlTk.png)
*Image 1 donkey model (blender screenshot)*
*Image 2 zoophony room sketch by Chaitanya*
![](https://i.imgur.com/dAL4VvV.png)
*Image 3 Zoophony room model by Udita (blender screenshot)*

## **November 18, 2021**

**Studio Experience**

I am a bit exhausted by the sprint modelling process and wish to proceed at a steadier pace. Chaitanya requested all of us to listen to the zoophony audio attentively and asked us to either imagine or speculate on what we interpret. I decided to take the latter half, as I was tracing a unique pattern. To get a clear view of my track, I imported Sergei's zoophony track into Audacity and extracted the section of audio I was interested in (11 - 21 minutes).
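The section-extraction step done in Audacity can also be scripted. A minimal sketch using Python's standard `wave` module, assuming the track is available as an uncompressed WAV; the file names are hypothetical:

```python
import wave

def extract_section(src_path, dst_path, start_s, end_s):
    """Copy the [start_s, end_s] slice of a WAV file into a new file."""
    with wave.open(src_path, "rb") as src:
        rate = src.getframerate()
        src.setpos(int(start_s * rate))                      # seek to the start frame
        frames = src.readframes(int((end_s - start_s) * rate))
        with wave.open(dst_path, "wb") as dst:
            dst.setparams(src.getparams())                   # same channels/rate/width
            dst.writeframes(frames)

# e.g. the 11-21 minute section of the zoophony track (hypothetical file names):
# extract_section("zoophony.wav", "zoophony_11_21.wav", 11 * 60, 21 * 60)
```

Audacity was the right tool for listening and marking the section interactively; a script like this only helps when the same cut has to be repeated.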
***Reference***

![](https://i.imgur.com/v5AohL8.png)
*Image 1 - Zoophony Audio - Section 11 - 21 minutes (audacity screenshot)*
![](https://i.imgur.com/X2HgFyj.jpg)
*Image 2 - Zoophony Audio - Section 11 - 21 minutes (audio recording notes)*

**November 19, 2021**

The current stormy weather has forced all of us to stay back at home and resume our activities remotely, just like last week's quarantine.

![](https://i.imgur.com/QagLuq5.png)

# **Week 4**

**November 22, 2021**

**Studio Experience**

Today's engagement was exciting for me, as it centered around understanding virtual reality experiences on mobile and web platforms. I feel the mobile platform is an easy way to ideate our virtual reality creation with minimal equipment. I just need to create a scene in Unity, export it through the Unity WebGL plugin and open it in a mobile browser.

Karthik showed one of the environments he created as a mockup for VR using Unity, which he did during the weekend. I was amazed by the level of effort he puts into his work. I also engaged briefly with Google Cardboard, and we all discussed how we are going to translate our zoophony assets into an interactive environment.

For inspiration, one of my peers recommended a game - **Superhot**. I'll try to arrange a version of it to understand the experience.

***Reference***

![](https://i.imgur.com/Vcw0rqB.jpg)
Image 1 - Mi VR Headset

Google Cardboard https://arvr.google.com/cardboard/
Superhot website https://superhotgame.com/superhot-prototype

## **November 23, 2021**

**Studio Experience**

I played Superhot and I am quite mesmerized by its use of minimal colors and focused contrast.

* The textual description in between is something I appreciate.
* The game's key point is its coherence, which is rare in a fast-paced game such as a first-person shooter. It felt like I was attracted to the gore.

I speculated on some color schemes for our zoophony project using Illustrator. Chaitanya provided some critical feedback.
Chaitanya introduced us to key aspects of Blender which I had covered briefly in the previous Trans Disciplinary Research 2, where I used Autodesk Maya to show how one of my physical models could be expressed as a 3-dimensional model in digital space (this was due to the online mode at that time):

* uv mapping
* texture painting
* fbx file export

I know about UV mapping for primitives, but for organic entities the procedure is somewhat more complex. I find the **texture painting** workspace in Blender intriguing. I'll explore it sometime later.

I was shown the basics of the Unity game engine at the end of the day. But before I import my models as assets, Chaitanya recommended exporting them in **FBX format**, which is the standard, and conducting a **quality check** through viewers such as FBX Review. This is to ensure that assets or models are corrected before they break while importing into the Unity workspace.

Chaitanya requested us to **rig & animate briefly** (around 60 seconds at 24 fps) some of the animal models we created in the previous week, as they are going to be assembled together in Unity by Karthik. I am relieved that we are getting close to our deliverable. On the tooling front, I am glad that FBX Review at least is a free product from Autodesk.

***Reference***

![](https://i.imgur.com/2Rntsrg.png)
*Image 1 - uv map of 3D model demonstration (Harris Hawk)*
![](https://i.imgur.com/iUjBPzI.png)
*Image 2 - color scheme proposed by me*
![](https://i.imgur.com/rCN8l88.jpg)
*Image 3 - experimental color scheme pitched by Chaitanya*
![](https://i.imgur.com/NhZqAhe.png)
*Image 4 - Dice uv map (personal work)*
![](https://i.imgur.com/6Hluh7D.jpg)
*Image 5 - uv mapping of two cubes in blender - personal work (blender screenshots)*

*Weblinks*
1. Blender manual - image texture documentation https://docs.blender.org/manual/en/latest/render/shader_nodes/textures/image.html
2.
Blender manual - texture paint documentation https://docs.blender.org/manual/en/latest/sculpt_paint/texture_paint/introduction.html
3. Autodesk FBX Review https://www.autodesk.com/products/fbx/fbx-review
4. Unity Game Engine: https://unity.com/

## **November 24, 2021**

*Studio Experience*

It's the middle of the week and I am learning quite a lot, not just in terms of the project but of the overall processes and procedures involved in 3D creative experiences. Today I did not work much, but paid attention to the key topics addressed by Chaitanya. He demonstrated how we can add **artistic panache** to our 3D assets or models through texture painting. He showed how we can use Photoshop, or any painting software, as a companion tool while modelling in Blender.

## **November 25, 2021**

*Studio Experience*

I decided to model a horse independently, as requested by Chaitanya, as part of the zoophony experience. Instead of using photographic references, which I find confusing, I searched for a website that offers multiple projection views simultaneously with systematic dimensioning. Again, I chose two different entities belonging to the same group - quadrupeds - to understand the following.

* The key takeaway in the modelling process was reflecting on the visual differences between the entities.
* For instance, the donkey is a quadruped and its proportions are closely spaced.
* The horse is also a quadruped, but its proportions are slightly spread out. The horse has a longer torso, and its forelimbs and hindlimbs are similar to the donkey's but slightly taller.

***Reference***

![](https://i.imgur.com/ZkNKT9j.png)
*Image 1 donkey model (blender screenshot)*
![](https://i.imgur.com/mMl63O9.png)
*Image 2 horse (dimension.com screenshot)*
![](https://i.imgur.com/AxNMErv.png)
*Image 3 horse model (blender screenshot)*

## **November 27, 2021**

*Studio Experience*

Today was a fun session. I got to see another of Yashas's creative projects, about personalized portfolios for artists and other creative practitioners.
It was refreshing and inspiring. I was also able to meet Dhruva in person after a long time. His role in the research space and his domain of expertise are interesting. On the project front, I got a glimpse of our first concept in Unity, assembled by Karthik. It seems pretty coherent.

# **Week 5 - Final Week**

**November 29, 2021**

*Studio Experience*

Today we had a discussion on how the zoophony VR interaction is going to be presented to our audience in online mode, considering the limited physical interactions due to speculation about another wave of the pandemic. Yashas requested that we focus on how we are going to invite participants from beyond the space, to understand their unique perceptions of our interactive creation.

On the project front, Chaitanya asked us to choose any **one animal** and add **movement** within the room space, as he also envisions how dynamic interactions can make the experience more lively. For that, I'll reuse the room model to plot and study the movements of my chosen model. The model I have chosen is a flying rabbit, which was conceptualized by Nilanjana (my peer).

***Reference***

![](https://i.imgur.com/W2PPCIW.png)
*Image 1 - flying rabbit front view*
![](https://i.imgur.com/9qKxZef.png)
*Image 2 - flying rabbit side view*
![](https://i.imgur.com/sSgjzcn.jpg)
*Image 3 - flying rabbit flight trajectory*

## **November 30, 2021**

*Studio Experience*

I have started to experiment with adding dynamic experiences to the rigged animals. The context was that some of the animals moving around in the space would draw more engaging visual immersion.

*Key Points - One experience challenge I faced was navigating the bunny through the tight spaces of the room; the other was keeping the flapping motion in sync with the swift movement. I'll suggest she fine-tune the flapping and wing profile section later.*

So I speculated on a theme where one of the animals could be interacting around the room.
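One way to catch the tight-space problem early is to sample the planned trajectory and flag any sample that leaves the room volume before keyframing it in Blender. A rough sketch, treating the room as an axis-aligned box; the room dimensions and path points are invented for illustration:

```python
def inside_room(point, mins, maxs):
    """True if an (x, y, z) point lies within an axis-aligned room box."""
    return all(lo <= c <= hi for c, lo, hi in zip(point, mins, maxs))

def check_path(points, mins, maxs):
    """Return the indices of path samples that leave the room."""
    return [i for i, p in enumerate(points) if not inside_room(p, mins, maxs)]

# Invented room: 5 m x 4 m footprint, 3 m ceiling.
room_min, room_max = (0.0, 0.0, 0.0), (5.0, 4.0, 3.0)

# Sampled flight path: the last point deliberately exits the room,
# matching the "flies out of the room" ending of the sequence.
path = [(0.5, 0.5, 1.0), (2.5, 2.0, 1.5), (4.5, 3.5, 2.0), (6.0, 3.5, 2.5)]
print(check_path(path, room_min, room_max))
```

A flagged index in the middle of the path would mean the bunny clips a wall mid-flight; a flag only at the end is the intended exit.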
I chose Nilanjana's flying bunny as it was **avant-garde**. Back to the process: I referred back to Udita's model and planned out an interesting sequence - the rabbit starts from one corner, reaches the opposite corner and then flies out of the room. In Blender, I just had to add another layer of animation - translation motion, done through location keyframing in addition to the flying-state animation. This process was assisted and demonstrated by Chaitanya. Although the process can be a bit **tedious** for a novice, the end result - the animation playback - is rewarding.

Key points to ponder upon...

*I feel that for entities with complex or dynamic animations, it's better to go **stepwise** - the **armature movements and translation motion** have to be in sync and handled one at a time rather than approached simultaneously.*

*Chaitanya, based on his **personal experiences** with the **advanced animation** process, suggested the above be done in wireframe mode, as it allows for coherent visibility.*

***Reference***

![](https://i.imgur.com/WtP8gVo.png)
*Image 1 - Flying Rabbit Rig & Animation (blender screenshot)*

## **December 01, 2021**

*Studio Experience*

It's almost the end of 2021 and the deadline for our project is nearing. The next few days are going to be critical and intense for everyone. Today my peers and I were brainstorming ideas for the visuals (posters, social media, website) for the zoophony project. I decided to dabble with posters and take 3D itself as an explorative framework. My peer Arko had made an interesting composition of animal heads as the theme for the poster / cover art. My peers and I also reviewed a draft of the zoophony website, developed by Upendra in collaboration with Arko. I decided to take that and iterate a series of visual concepts in grayscale.
After that, I am going to explore those in the hex color scheme assigned by Arko.

***Reference***

![](https://i.imgur.com/uGSs0ti.jpg)
*Image - My vision for Zoophony (Poster Concept 1)*
![](https://i.imgur.com/pyG1Gi0.jpg)
*Image - My vision for Zoophony (Poster Concept 2)*

## **December 02, 2021**

*Studio Experience*

With just two days remaining until our open house presentation, most of us were busy with the visual and onboarding process. Meanwhile, Yashas reviewed the draft website collaborated on by Upendra and Arko. He critiqued particularly the introductory page, which according to him can make or break the spectator's interest. One of his creative suggestions was a more naturalistic theme, such as a flock of animals. It's interesting, but I am concerned about the technical aspects - whether the website can handle such complex interaction, as I understand that in general such interactions demand significant computation capability.

Meanwhile, Chaitanya wanted creative assistance in rendering an alternative variation of the room scene (the visual is more chaotic compared to the previous version). The task he assigned me specifically was the modelling of trees. There are technical constraints - he specifically mentioned working through cube modelling instead of spline modelling, as it's going to be an asset in the online VR space.

*Modelling Technique for Trees - He briefly demonstrated the process. He dissolved a cube into a single vertex and extruded it out as lines and points. To thicken them into branches, a modifier is applied, which essentially converts the vertices into polygon faces.*

I have started on that and plan to continue rendering two variations as requested.
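The branch-skeleton idea behind this technique can be sketched as a data structure: the tree is just vertices plus edges, grown by "extruding" from an existing vertex, and a thickening modifier (presumably Blender's Skin modifier; the source doesn't name it) later turns that skeleton into polygon faces. A pure-Python sketch with invented coordinates, mirroring only the skeleton-building step:

```python
# Sketch of the vertex-extrusion tree workflow: start from a single
# root vertex (the dissolved cube) and grow branches by extruding.
# Coordinates are invented for illustration.

class Skeleton:
    def __init__(self, root=(0.0, 0.0, 0.0)):
        self.vertices = [root]
        self.edges = []          # pairs of vertex indices

    def extrude(self, from_index, offset):
        """Add a new vertex offset from an existing one, joined by an edge."""
        x, y, z = self.vertices[from_index]
        dx, dy, dz = offset
        self.vertices.append((x + dx, y + dy, z + dz))
        new_index = len(self.vertices) - 1
        self.edges.append((from_index, new_index))
        return new_index

tree = Skeleton()
trunk = tree.extrude(0, (0.0, 0.0, 2.0))       # trunk going straight up
tree.extrude(trunk, (0.8, 0.0, 0.7))           # one branch off the trunk
tree.extrude(trunk, (-0.8, 0.2, 0.9))          # another branch
print(len(tree.vertices), len(tree.edges))
```

Keeping the tree as a thin edge skeleton until the very last step is what makes this approach cheap enough for an online VR asset.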
**Reference**

![](https://i.imgur.com/36yl9hG.jpg)
*Image 1 - Zoophony Room Alternative Model by Chaitanya*
![](https://i.imgur.com/X2oiUCw.png)
*Image 2 - Tree Model detailed by Chaitanya*

## **December 03, 2021**

Today is the last day, and we were busy wrapping up and packaging our VR interactive content while reflecting on what we have done in the past month of engagements.

![](https://i.imgur.com/GtMWjKt.png)
*Image 1 - Zoophony Room Final Model by Chaitanya*

## **December 04, 2021**

TDR Open House - we presented our creation to the audience and the cohort.

**Reference**

![](https://i.imgur.com/vZHAtnf.jpg)
*Image 1 - Zoophony VR Screenshot*
![](https://i.imgur.com/gZPpt75.jpg)
*Image 2 - Zoophony VR Screenshot*

# **Udita** Journal

How did it all start? By creating illusions that we are present somewhere we are not. The first examples were 360-degree murals, or panoramic paintings.

What is a stereoscope?

Can virtual reality collaborate with collage making? - David Hockney's attempt at giving a 360-degree view of a subject by joining multiple images of the same subject from a constant distance.

Storytelling through VR - what is the collaborative experience with the artist like? What are its challenges for both the composer and the artist?

27.10.2021

**VR presentation for the studio - Thoughts before and after**

I worked on a presentation on VR for the studio. While researching it, I learned about the history of VR and its evolution. How did it all start? Well, in the beginning I thought VR was a recent invention, but as I researched I came to know that virtual reality was mentioned in "The Judas Mandala", a 1982 novel by Damien Broderick, and that the early attempts at VR were panoramic paintings or murals from the nineteenth century.
This thought stayed with me and made me realise that one of the basic aims of virtual reality is to give a 360-degree experience of a space, making it more engaging for the viewer and making them feel part of the event the painting depicts. It is not just a painting but an experience.

I made an illustration of my initial response after reading about stereoscopic images - how two stereoscopic images, when seen side by side, give the viewer a sense of depth and immersion. I learned that optical illusion also plays a role in creating VR. The illustration was inspired by the evolution of VR from 2D to 3D space. What started as an attempt at a 360-degree view of a space through a 2D panoramic image led to creating 3D spaces using computers, and now advanced devices have been created to view VR spaces. VR's growth has been interesting and immense. Innovations are being made in this field, from solving problems like human phobias to creating artistic artefacts and spaces which can be interactive too.

I learned that Jaron Lanier was one of the first people to design virtual reality technology and kick-started the virtual reality industry, but today he questions his own invention. He is a musician and wants to move away from technology. I feel the COVID pandemic has pushed human beings into a shell, and the VR industry is making the most of it by creating experiences which do not require people to move out of their houses or comfort zones - one can experience music concerts, travel to different places in the world, safaris, etc.

**Studio experience**

Today in the studio we got a chance to explore Tilt Brush, where we used the joysticks and an Oculus headset to sketch and draw in space. It was quite engaging; I felt lost in the closed world of VR sketching, and I felt I was more focused on one thing, since there is no other distraction when you wear the headset.
In the second half of the day, Karthik gave us a demo of a VR game he was working on. I experienced the teleporting feature in VR for the first time. It does make it easier to navigate from one world to another, but I felt I need to play in VR a couple more times to get used to the joysticks. After this session I felt a little drained, especially because of the headset - I could feel its weight after some time.

28.10.21

**Studio experience**

Yashas demonstrated how to work on sound tracks in Audacity using the sounds recorded by the archaeoacoustics group. We also learned how to extract sound from a particular track. It was interesting to hear how we can create a new track using sound extracted from another. The music reminded me of some tracks I have heard before, especially by Tajdar Junaid. I realised this technique is used by many musicians to create new tracks.

Nilanjana and I drew around 4-5 sketches imagining musical instruments in the form of flowers and plants in a VR space. There was no particular concept behind it; it was just sketching our thoughts on amalgamating music and objects. As part of the second project we wanted to share our ideas with Chaitanya and Karthik, so we made models of our sketches and recorded demo videos of making sound with the models. While making the models, my sketches evolved into more simplistic forms because we had limited materials to work with. Sketching the forms was very organic, but while modelling I put some thought into how the experience of making sound with them would work. This exercise took me back to an interactive art installation I volunteered for. I imagined flowers blooming as we touch them, along with music playing in the background.
**29.10.21**

While discussing the second project with Chaitanya, and watching him build one of my models in Blender, I realised that when we doodle something aimlessly based on just a thought, the design becomes more streamlined and simplified if and when we need to construct it. Maybe not always; it depends on the medium of construction and its limitations. Need to think this over! Also, my background in architectural modelling made me realise that most 3D modelling software shares some basics, like drawing and then extruding, rotating, flipping and rendering.

**08.11.21**

Modelling animals requires a basic understanding of anatomy. Our bodies have various parts joined together through unique joineries, and we need to notice and observe these details to make a model in 3D modelling software like Blender. Working in Blender felt like sculpting with my hands. Modelling animals in a low-poly style called for a sense of proportion and anatomy more than actual measurements.

I really liked how Chaitanya has strategised the process of modelling 100 animals with some of us who have very basic knowledge of Blender and are still learning it. From what I understand, he builds the base of a certain type of animal (e.g. birds, where most have a similar anatomy: wings, two feet, claws, a beak) and asks us to modify that model to create another bird. The process of modification is helping me learn the minute differences between different types of birds. It is an exciting process.

**09.11.21**

Well, I made my first Blender model!!! (Though it is a modification of a base model.) Nonetheless, I learned a lot. The first bird I modelled is a yellow-billed kite.
My first attempt at the modification in Blender was an amateur one, with quite a few mistakes; I stressed the form of the bird more than using the proper technique and ended up with a makeshift, almost-similar-looking kite. But when Chaitanya guided us individually on the models, I learned the proper technique, which in fact eased my process of modification without ruining the geometry of the model. I feel in a day I have come a long way! It is true that the best way to learn a software is to make something in it!

**10.11.21**

Creating the yellow-billed kite was a great learning experience in the basics of modifying a model in Blender. Again it was all about anatomy, proportions and Blender techniques. Starting my Blender journey with a low-poly model was a good thing, because I learned to build a 3D animal using basic shapes, and on that foundation I can work on more detailed models. The basics are the same for any kind of model, but it was less overwhelming because I began by modifying a model already built by Chaitanya.

**11.11.21**

My second model was an owl. I chose the great horned owl because I wanted to attempt the exaggerated eyebrows of the owl besides creating the structure of the bird. The anatomy of an owl is quite different from that of the yellow-billed kite: it is fluffier, with shorter, fleshier claws and a shorter beak on an almost-squarish face with extended eyebrows. One thing they have in common is deep eye sockets, and since I had already worked on creating those in the kite, it made sense to modify the kite model to create the owl. I wanted the owl in a flying posture, so the claws were the trickiest part. But I completed the model in a day, which was a good development personally.

**12.11.21**

Today I began my journey of creating a Blender model from scratch. I realised it's a lot like sculpting.
One needs to place basic cuboids, stack them up according to the form of the animal being created, and then chamfer and modify them. I got a fair idea of how Chaitanya thinks before he starts creating a model of an animal: I feel he does an initial study of the animal's form, builds an image of it using simple shapes, and then starts modelling. I also realised while working on it that Blender is a little similar to SketchUp. Creating the models is beginning to be a therapeutic experience. Having ADHD, making models helps me focus better on my work. Today was a bit difficult for completing the model because we had theory and studio work to understand as well; it has not been an easy day mentally, but I am trying my best to keep working!

**15.11.21**

I am still struggling with building the paws of the alligator, but I will be done with it today! Chaitanya taught us basic 50-second animations in Blender using our own models. The best part is that Blender creates the intermediate animation frames, unlike in Photoshop, where each movement needs to be drawn to animate.

**3rd Week Insights**

Started the day with a thought - "VR is built to trick human minds" - while watching The Future of an Immersive Metaverse by Artur Sychov. I feel this aspect of VR is used to address phobias in human beings. What is fear really? Fear is an emotion triggered by anything unusual in our surroundings; maybe not always unusual situations, but something that snaps us out of our present state of mind.

I listened to Zoophony today and it felt like a mix of emotions bursting out. Sergei has made sure all the animals are heard :). Actually, Zoophony invokes different emotions every time I sit down to hear it. Sometimes I miss the sounds of some animals, so every time I hear the track there is something new happening somewhere. I am not sure about Sergei's concept behind bringing all the sounds together, but to me it felt like all the animals celebrating freedom.
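The way Blender fills in the intermediate animation frames can be sketched as keyframe interpolation: the animator keys a few poses, and every frame in between is computed automatically. Below is a minimal linear version in Python; note that Blender's default easing is actually Bézier rather than linear, and the wing-rotation example is purely illustrative.

```python
def interpolate(keyframes, frame):
    """Linear in-betweening: given sparse {frame: value} keyframes,
    return the value at any frame, the way an animation tool fills in
    the intermediate frames between two keyed poses."""
    frames = sorted(keyframes)
    if frame <= frames[0]:
        return keyframes[frames[0]]   # hold before the first key
    if frame >= frames[-1]:
        return keyframes[frames[-1]]  # hold after the last key
    for f0, f1 in zip(frames, frames[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)  # 0..1 within the segment
            return keyframes[f0] + t * (keyframes[f1] - keyframes[f0])

# Key a wing rotation (degrees) at frame 1 and frame 25; frame 13,
# halfway between the keys, is generated rather than hand-drawn.
keys = {1: 0.0, 25: 90.0}
assert interpolate(keys, 13) == 45.0
```

This is the contrast with frame-by-frame animation noted above: only the keys are authored, and the in-between frames come for free.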
During our brainstorming sessions this week to finalise the idea for the VR experience, there were ideas about how it would feel if animals inhabited indoor human spaces, and how it would be if the animals' defence mechanisms and body parts responded to the sound of the track. The larger idea that all of us were rooting for was the freedom of animals and the experience of human beings in cages: how would it be to reverse the two worlds? We even thought of the life of animals after death, their bodies floating in space, as if they can only find peace there. We raised the question of why there is a natural inequality in the animal kingdom, of which human beings are a part. A probable answer would be the rate of evolution of each animal. Also, it is as if human beings have caged their fears; again, coming back to the idea that anything unusual triggers fear in human beings.

After Sergei's green signal to our ideas, we discussed amalgamating some of them into one, something that would motivate the viewer to experience the Zoophony VR till the end. We needed the viewer to engage with different things during the experience, so the idea of bringing animals into a human living space suited the purpose: it gave the viewer elements to explore. The low-poly animal models we created look a lot like 3D toys that could sit on one's bookshelves. So we thought of placing the animals on the shelves and in different parts of the room and making them react to the sound track. The VR experience then ends with the viewer caged.

I am still getting used to Blender and its shortcut keys. I think I tend to get carried away with measurements and proportions while making a model (a habit from my architecture days); from what I understand, Blender is more about gauging proportions than building something with exact dimensions.
**4th Week Insights**

The next big step after finalising an idea is finalising the look and feel of the project and understanding the technical limitations and possibilities. Through this process I also realised that ideas can be boundless and infinite, but creating a product requires streamlining them and understanding which are realistically achievable within the time limit of the project. By the end of the day we were able to finalise the look and feel of the space, an option I created in collaboration with Arko, inspired by an image of an escape room. The image reminded us of what the room of a person with no resources and no maintenance would look like.

The concept of the VR experience designed for Zoophony is a dark one, in which the viewer is made to experience the life of caged animals. The story revolves around life after the killing of caged animals, who float like spirits around you in your own living space. Together they are creating Zoophony to celebrate their freedom from the human world, while the viewer (a human) feels caught in the middle of it and is unable to escape. At last, the viewer finds themselves in a cage! So we needed the look and feel of the room to be a grungy, dark space.

![](https://i.imgur.com/vs94pus.png)

We worked on the thought of spirits of animals haunting a living space -> feeling trapped in a room filled with water -> a dark room with marks on the walls showing statistics of the death toll of animals killed by humans -> at last finding oneself in a cage -> someone watching over you on a CCTV camera.

Working towards this concept, we needed a script that would merge Zoophony and the VR model, so we divided ourselves up and each took a part, writing down our experiences. When I started listening to the track, it felt like a burst of emotions from different directions.
![](https://i.imgur.com/xWWxMLd.jpg)

What stayed with me after listening to the track were the pauses in the right places, which made it sound like a beautiful symphony of freedom. It took me to a tropical rainforest with animals making their own sounds, all in sync with each other. It felt like, instead of the monkey making sounds to warn everyone in the jungle about the presence of the tiger, it was making sound along with the tiger's roar to show human beings their strength! It felt good.

A few of the animals and hybrids I made in Blender:

1. Tiger ![](https://i.imgur.com/NMCu2qG.png) - Tiger ![](https://i.imgur.com/6REZ0Zn.png) - The hybrid of the tiger ![](https://i.imgur.com/9aH6Qkx.png) - Animation of the hybrid model and the tiger.
2. Alligator ![](https://i.imgur.com/98XOTED.png)
3. Great Horned Owl ![](https://i.imgur.com/57trAiJ.png)

I learned animation in Blender, and felt how smooth and easy it is to animate there if the models are built well. Blender creates the intermediate frames of an animation - wow!

**5th Week Insights**

I was feeling anxious at the beginning of the week, trying to understand all we needed to do: which models were pending, learning to use Unity. One day I was sitting alone in the lab, reflecting on my work this TDR, and realised how much I have learned: new software, conceptualising and storytelling, making the models, sharing ideas. It is a one-of-a-kind experience. I enjoyed our brainstorming and collaboration sessions, where all of us would keep coming up with new ideas for the VR experience. I started to connect the dots between what Yashas tried telling us in the beginning about our way of learning in the lab and where we are with this project. It is difficult to put the multifaceted personality of the lab in a box of thoughts.
One of my biggest takeaways from this TDR experience, Zoophony VR, is that virtual reality can make a viewer experience a world inside another person's mind, which is not possible in the real world. It can make someone walk alongside them on their journey. When I heard Zoophony, the different frequencies in the track, resulting from the different animal sounds, invoked a calming effect in my mind. Bringing these two thoughts together, I feel I can take forward what I experienced during this TDR and use my learning of the software to create a version of the VR experience that draws inspiration from my mental responses to the music.

# **Karthik K**

**Exporting a WebXR scene:** After going through all the tutorials for setting up a WebXR project in Unity, I found a WebXR exporter developed by Mozilla, but its development stopped 3 years ago. I went through their framework and set up a scene. The only downside is that the export wasn't working on the web due to compatibility issues. After 8 trials/project setups I got mixed results: trial 2 was working but not exporting, and trial 8 was exporting but not controllable on the web. Finally, by going through each and every file in project 8, I was able to fix project 2 and make it work.

**Setting up controls:** Planning to visit the AR workshop and check these files in VR. We did a demo class on how VR works in general; the VR demo itself is still pending.

**DIWALI BREAK**

I worked on a few ideas for the 2nd VR project. Chaitanya gave us an intro to Blender in an online meeting. One idea was to create a resonating sound as the user touches anything inside VR. Another was to create a synthesizer that resonates as long as you move your hand while holding it. The second was really hard and definitely time-consuming, as I found from watching many tutorials on how to create one in Unity.
**NOV 8-14**

Unfortunately, the 2nd project is coming to a halt and the 1st project is the main focus, as the team decided after 2 sessions with Chaitanya discussing how the environment should be; the other team members were working on the models in Blender. I created another prototype with 28 sounds scattered in the field to see how it performs.

**LINK to the DEMO** https://drive.google.com/drive/folders/1dhUSECUBpax0GUQwEjkOFd_Enlc2WgL_?usp=sharing

It turned out okay, but we haven't decided whether we are going to move ahead with Unity or three.js for the final work. I was assigned other tasks, such as exporting it to the web. It was tricky and took a lot of learning; finally, talking to Dhruva, he said he would help me with it. In the coming week I am looking forward to a working prototype on the web with VR accessibility, along with the models Chaitanya has made for the project.

**Week NOV 14-21**

As far as the project goes, it was still not decided whether we go with Unity or three.js. I was really nervous and not sure about my capabilities or how I was going to face this week. Later, after a meeting with Yashas and the group, he gave me some confidence about what my focus should be: he drew a triangle with money, resources and time at the 3 vertices and told us how we should approach projects. After that I really felt I should give my best no matter what the outcome would be. I decided to shift my workstation to the centre so I could discuss the process and also update my work by combining the artwork that Chaitanya and the group make for the VR. It was a hell of a rollercoaster ride as far as my work went; a few times I could find a solution, and other times I had to combine similar work and run a test to see if it works. After discussing the ideas, we all together landed on the home-to-zoo VR version. Yashas wanted to see a small demo by next week, so I also took on the pressure of running through things to make the demo work. Later I contacted Dhruva, who helped me with hosting the link on the web.
For that I had to make a small pre-demo to see if this version works on the web. I was supposed to upload the whole project through Drive to give Dhruva access, as the collaboration on Unity had failed. Due to very slow internet I was only able to upload and send it to him by night. I got a text message from Dhruva after 2 days saying the version works. This was a big relief for me, as I was afraid I might be a reason for delaying the project. Later I texted Chaitanya about the hosted link; he tested it and gave a green signal. In the coming week I am excited to face new challenges in implementing the artwork from Chaitanya and the group.

# **Nilanjana Bose**

First VR pop album - https://arstechnica.com/gaming/2019/09/bjork-made-musics-first-vr-pop-album-she-opens-up-about-its-heartbreak/

Device evolution in VR - https://www.youtube.com/watch?v=yXP307L-fdM

History of VR - https://www.vrs.org.uk/virtual-reality/history.html

**Week 1 Summary**

**Understanding VR and its uses, and setting agendas for the coming weeks**

Agendas decided after the first week:

Agenda A - A VR experience based on Zoophony Part 2 by Sergei Khismatov

Agenda B - A VR instrument: an environment of musical elements responding to various interactions through sounds and music; an environment that presents itself as a wholesome musical experience.

***All that we did in the first week***

Presentation summary: The history of VR and its use in different forms of experiential art showed how humans have constantly sought ways to create an illusion of real life and surreal feelings through virtual reality. It started with the use of binocular vision and panoramic paintings to add three-dimensionality to 2D paintings and pictures, and progressed to the videorama, or experiential theatre. VR is used widely in music, where virtual reality adds a wow factor to a music video and enhances the experience of listening to a piece.
Further on, watching Notes on Blindness, a VR experience designed to show the world through a person's eyes across the phases of his losing his vision, was truly powerful; it showed how virtual reality is basically an enhanced version of true reality and opens room to read between the lines. Discovering detailed paintings like The Garden of Earthly Delights by Bosch, a triptych that tells a different story in each part of the artwork, became the starting point for project 2, where the idea of creating a VR musical instrument was developed. The musical piece 4'33" by John Cage, an assumed silent piece, something Cage described as "the absence of intended sounds", used surrounding sounds to reinforce the fact that absolute silence is unattainable.

So far, the idea of spatial sound and human interaction with a custom-made reality has been explored widely. Through the piece by Sergei Khismatov called Zoophony, we intend to bring in our understanding of the medium of virtual reality and take it to a space where the cacophony becomes an immersive experience, along with visuals that support such a piece. To build some understanding, 3D physical models were created and interacted with, while Yashas gave us sounds for the relevant interactions; these are to be polished and worked on in Blender and Unity, to understand the software's constraints and possibilities.

*A few 3D models that were built from scratch and interacted with as part of building a prototype for the VR instrument*

![](https://i.imgur.com/52L2Qff.jpg) ![](https://i.imgur.com/aMllQLL.jpg) ![](https://i.imgur.com/draNrCW.jpg)

**Diwali break**

Blender explorations: a few 3D modelling structures I practised to get the hang of Blender, while exploring visual concepts based on the Zoophony track.
Over the Diwali break I also took time to explore an idea around the animal sounds, where the cacophony would be represented with random objects. The message I was going for was about how overproduced man-made junk is starting to take up the space of animal habitats, so the growls, screeches and howls would all come out of objects with a visual resemblance to the animals, e.g. an alligator's sounds coming from a stapler.

PROS of the concept - Having objects create animal sounds will make for a fresher experience of listening to the various animal sounds, where the cacophony can be used to carry a meaning and a message. 100+ animal sounds are part of the Zoophony track, and modelling that many animals can be cumbersome; a lot of the workload can be minimised with a creative solution like this.

CONS of the concept - The idea can be confusing, as the sound of a specific animal is hard to recognise unless shown directly, and having a lot of objects be the sources of the sounds could make the interaction haphazard and hard to navigate and comprehend.

*The idea will be refined from various angles for ease of interaction.*

**Week 2**

**3D modelling week**

Work after the Diwali break. Blender shortcuts for easy navigation: https://docs.google.com/document/d/1zPBgZAdftWa6WVa7UIFUqW_7EcqOYE0X743RqFuJL3o/edit

8th November 2021

The agendas were revised, as a lot of work is to be done for Zoophony. The decision is to work on 3D models for the 100+ creatures; the VR instrument project is currently on hold. We sat with Chaitanya on a call to understand how he does 3D modelling; he shared his process and, further on, a "sparrowhawk" 3D model to use as reference, while we each took one bird to model. I decided to work on the "nightingale".

9th November 2021

First day of finding my way around Blender and trying to manipulate the "sparrowhawk" into a "nightingale"; I was learning to really notice and observe the bird now.
The thing I had missed when I started 3D modelling is that first one needs to learn to see! So that is something I did extensively on day 2, turning my bird on the x, y and z axes and seeing how a hawk and a nightingale differ in their body frame, beak structure, wings, head shape, etc.

A glimpse of what I made on day 2: ![](https://i.imgur.com/QpYWBQM.png) This was the first draft on day 2 of the 3D modelling week. The beak is off, and so are the head and the shape of the body.

10th November 2021

Chaitanya thoroughly helped us understand the use of every tool in 3D modelling, and so, after asking a lot of doubts, I started to get the hang of the Blender tools and find them much more comfortable. This is what the nightingale with closed wings started to look like - ![](https://i.imgur.com/f6ALpxZ.png)

11th November 2021

The nightingale was needed in a flying variation too, so the next hurdle was to learn to make wings in flight. I studied a few images of the nightingale in flight, found a good full wing-up view, and decided to replicate it in my 3D model. Here is how it began to look when it started to fly - ![](https://i.imgur.com/qE5xiRw.png)

12th November 2021

After understanding how birds are modelled through the manipulation of the "sparrowhawk" into the "nightingale", Chaitanya suggested we move on to modelling 3D mammals, from scratch. Honestly, it was intimidating at first, as initially we had worked on an existing model, but now we were to create a whole mammal starting from a 3D cube. I chose to work on the "rabbit", and so far this is where I have reached - ![](https://i.imgur.com/brnxbUQ.png)

Reflections - The entire process of 3D modelling and finding ways to visualise the sounds in Zoophony has been so wonderfully challenging; I honestly feel I have learnt to truly see, and see beyond, in the last few weeks.
I am truly looking forward to seeing all our work come to life and to creating a meaningful, immersive and beautiful VR experience out of it.

**Week 3**

15th November 2021

We tried some rigging to breathe life into our 3D models, and this is how the final rabbit came to look. ![](https://i.imgur.com/9Wmhh8q.png) It was then animated to try and see how it would look when it moves to the track in the VR experience. I am yet to fully get the hang of animating in Blender, but I am on quite an interesting learning curve when I look at all the work that has come together at the halfway mark of this TDR. Really looking forward to seeing how the entire VR experience comes together.

16th November 2021

The group sat with Chaitanya and brainstormed ideas for the structure and design of the environment the soundtrack will be experienced in. Things to consider for a successful VR experience with the Zoophony track:
1) The entire track is 20 minutes long, so the experience needs to be designed so that it is engaging and at the same time does not overwhelm the person interacting with it.
2) The meaning and message should be clear and, hopefully, impactful.

The group decided to work with 3 ideas for the environment: the first being the reincarnation of the animals; the second one where they weaponise their defence skills; and the third the one I had suggested, creating a room where animals inhabit a human space, the metaphor being humans invading animal habitats, here reversed so that the humans are the helpless creatures, to show what our actions are doing to these animals. The group built on this idea, and it became pretty cool once Chaitanya suggested a great end to the experience, with the human finding themselves behind bars. I truly feel that adding something as simple as a sense of being trapped will speak volumes and at the same time deliver a crisp message.
17th November 2021

Now that the idea was discussed with Sergei, we decided to move on with the idea of the room, called Room Z13, and work started on discussing how big the space would be and what we would need to create this environment. Meanwhile, we started creating weird creatures. I too feel that to capture the essence of the Zoophony soundtrack we must find a way to integrate some form of randomness, and having strange creatures will add to the weird! The space will also be crowded with random abstract shapes, and these blobs and shapes will respond to the symphony of the animal sounds. Adding crazy visual elements is going to bring out the best of the Zoophony track, so I started listening to the track again and jotting down all the visuals we could bring into the VR experience. There are parts where the track slows down, and the interesting thing I found while doing this exercise is that the track almost makes it feel as if the chaos is over at some points, and that is when the hissing or whispering of some sound creeps up on you. I have been experimenting with and finding ways to incorporate some of these ideas into the visual world of the VR experience we are creating.

18th November 2021

One weird creature I made was a duck with spider legs, and then I moved on to documenting the visuals I had thought of with reference to the Zoophony track. It is already getting pretty fun with the freedom we have in creating the weird creatures, and I am thinking of exploring more with the musical piece and the other crazy visuals possible with it. Will update about the explorations I do over the weekend soon! ![](https://i.imgur.com/PkdMegz.png)

**Week 4**

Major work done in the 4th week:
1. Deciding the look and feel of the VR space
2. Rigging our 3D models
3. Concept for the narrative that builds through the experience

**Look and feel of the VR space - Zoophony in a room**
Several colour explorations were done for the room before we finalised a dark-blue textured-wall look. Some of the explorations I worked on looked like this - ![](https://i.imgur.com/XweTl73.png) ![](https://i.imgur.com/M9X1F3W.png) ![](https://i.imgur.com/jTQvS5w.jpg) ![](https://i.imgur.com/kOvUsHx.png) ![](https://i.imgur.com/4Wh0dtw.png) ![](https://i.imgur.com/kk9MSMj.png) ![](https://i.imgur.com/Qmhbzro.png)

From these colours we went with a dark theme in the end, suggested by Udita, and the one that worked really well with the narrative. The colour and look are kept dark to match the intensity of the animal sounds in Zoophony, along with other elements added to keep the viewer engaged throughout the VR experience.

Some explorations I did for a part of the Zoophony clip that sounds pretty creepy: this part starts at around 2:21 and ends at 4:06 and consists mainly of hissing sounds and some insects. We want to visualise it as a false alarm for the arrival of peace: silence prevails for 2 short seconds, but the hissing creeps up on you and finds a way to make you uncomfortable. This part could be utilised by making the viewer read statistics of animals killed at the hands of humans; the paw marks of these creatures could be on the walls, along with tally marks made to look as if they were drawn in blood, counting the lives that disappeared in pain. ![](https://i.imgur.com/8ukiavs.png) ![](https://i.imgur.com/5Hzslli.png) ![](https://i.imgur.com/xkkYLmN.png)

**Rigging**

Made a crazy flying bunny. Including hybrid animals in the track will add to the weirdness and chaos. Check out my flying bunny that looks like a cute rodent until it flaps away when bothered. ![](https://i.imgur.com/0aUt2yk.png)

Through rigging my peacock, I learnt how complicating the mesh can break the model when rigged (: My broken peacock.
;_; ![](https://i.imgur.com/JfHhdbY.png) I will be redoing the peacock with a simpler mesh this time. Chaitanya helped me make one of the rabbits look crazier through texture paint, and it does look like an animal that resides well in the VR world of Zoophony. ![](https://i.imgur.com/T6VF1TR.png)

Having made some of these crazy hybrid animals, I imagine the experience within the VR world of Zoophony will be quite fascinating and a surprise, as most of these creatures are imaginary and add an element of curiosity for the interactor.

**Narrative building through the track of Zoophony**

Zoophony is a collection of sounds, choreographed to create a different experience in every part of listening to it. The sounds it begins with are happy, and it seems like the animals are getting along. Then there is the arrival of chaos, which transforms into a perceived silence that is followed by a part with an overwhelming amount of hissing sounds and the buzzing and whirring of insects, and the lack of chaos creeps up on you, how strange! For the only 2 minutes in which the animals are not screaming and some form of quiet sets in, the entire VR world becomes a stranger place and you yearn for that familiarity of sounds. Quite an interesting composition, the more I listen to it and try to break down its nuances and how all of that can be manifested in its visuals!

**FINALE WEEK! WEEK 5**

Data collection for the statistics to put on the walls of the room: https://www.businessinsider.in/science/environment/weve-killed-off-more-than-50-of-forest-animals-on-earth-a-new-report-found-even-more-evidence-of-a-6th-mass-extinction/articleshow/70762156.cms

29th November 2021

Today was about speeding up the 3D animal making and rigging, and trying to compose scenes by putting all of them in the VR room. Quite cool how our ghost animals look running and flapping in the room.
Did a rig for these creatures too: ![](https://i.imgur.com/18ehnUn.png) ![](https://i.imgur.com/rDHnvpW.png) ![](https://i.imgur.com/gsEwbG7.png)

The models I rigged today in Blender were a bunch of hybrids, like the spider duck, a duck with spider feet, and a goat. Following through with the ghost theme for these creatures made them look even cooler in the VR space and added variety, with the flying bunny and the spiduck making special appearances in the room. The contrast between how normal the room looks and how crazy the animals are creates an interesting visual experience. Today I experienced the VR room through the Oculus, and it was really fun to look around the space and find random creatures: a swarm of fish floating through the frame, then a huge flying rabbit, and an interestingly small tiger. The dynamic scale variations made it fun to look around the room.

30th Nov - 3rd December

Major work done over these days:
1. Building promotional collateral
2. Finalising the VR demo and testing

The promotional work included Instagram filters and a poster. There is a website for Zoophony that Arko and Upendra worked on, and the poster would carry a QR code to access the official website.

AR filter for Zoophony ![](https://i.imgur.com/iVJMDuL.jpg) AR effect for the official Zoophony poster ![](https://i.imgur.com/O5shaib.jpg)

The AR effects were created in Spark AR. They were intended as a way to attract people's interaction with each piece of promotional collateral before going into the final VR show. I had so much fun creating these effects, and I am grateful that people interacted with them and had as much fun as well. Certainly looking forward to exploring Spark AR even more and learning the possibilities within it. The Zoophony filters use the 3D animals we were all building in Blender, and the rigging was done according to the phone space, as that is the main medium of interaction for these effects.
Many iterations were made to fit the space and create a good effect before finalizing the one that was shared through Instagram. Most of the work done for the VR space consisted of introducing trees and some depth to create a dynamic visual narrative for the open-house day. VR VISUAL ZOOPHONY ![](https://i.imgur.com/vZHAtnf.jpg)

# **Arkoprabho Bhattacharjee**

IG: [Arc Co.](https://www.instagram.com/arccc.co/)

**Oct 25, 2021**

Day 1 at ArtScienceBLR! Began with introductions followed by a meeting with Umashankar Manthravadi. My head behaved like a perfect aerofoil to some of the terminologies/methodologies (inverse sine sweep, etc.) that he spoke about. An enjoyable interaction, nonetheless. Yashas took us through some of the stuff he'd built. Loved the raw nature of some of the artefacts (especially the microphones) presented. Later in the day, I went through three.js. Cool stuff, but haven't gotten my hands dirty yet. Got a little intimidated by Blender's UI. Gotta delve in though.

**Oct 26, 2021**

Spline seemed slightly more friendly. However, my computer screamed. Literally. Soaring temperatures, too. Need to figure out a way to get it to stop screaming. Didn't do much else, thanks to me underestimating a submission deadline from a recently concluded workshop. I'm definitely unhappy with the way I managed my time. Not happy about not getting much done. I did everything but research. Fricking annoying, Arko! :/

**Oct 27, 2021**

WebXR emulator extension (https://blog.mozvr.com/webxr-emulator-extension/): a Chrome/Firefox extension to emulate VR on the web. Tried it out. It worked. Gotta get comfortable with it now.
Model Viewer: to display interactive 3D models on the web & in AR (https://modelviewer.dev/)

Storytelling for Virtual Reality: Methods & Principles for Crafting Immersive Narratives (https://www.researchgate.net/publication/321154281_Storytelling_for_Virtual_Reality_Methods_and_Principles_for_Crafting_Immersive_Narratives)

**Oct 28, 2021**

John Cage - 4'33". As long as there's perception, there will be no true silence. [Whoa, wut!]

Audacity - clean noise from audio files. Listened to Yashas belting out audio pieces on Ableton Live. ArtScienceMaestro! Listened to some of the others in the lab belt out audio pieces on an iPad. Quite an en-JAM-able (enjoyable) experience, if you will!

Can haptic feedback on mobile devices be used to replicate/recreate textures? In a feasible manner, of course. The tactile nature of a painting in VR could then add to the experience. People have tried to do it (https://www.theverge.com/ces/2017/1/5/14185134/tanvas-touchscreen-haptic-feedback-ces-2017). This specific attempt, however, would not fit our needs, I think.

**Diwali break**

Tried to familiarise myself with Blender's UI. I've been a trackpad-person for the longest time. Tried getting comfortable with a mouse. Worked on an overly-simplified human body modelling tutorial. Just couldn't get myself to figure out the thumb, among other things. I finally ended up with 'The Man with the Deformed Thumb'. ![](https://i.imgur.com/K3M73LK.png)

Since we're working on an art exhibit, we will eventually need a poster to advertise the event. I started exploring ideas for the typography. I used the Oculus Rift headset and controllers as letters. Here are the explorations. ![](https://i.imgur.com/uiufKAp.png) ![](https://i.imgur.com/uzKJFXy.png)

**Nov 8, 2021**

Got back to work after a refreshingly long Diwali break! We discussed the nitty-gritty of the Zoophony project during the 10 AM call. Chaitanya shared 2 world-ideas for the project.
The first idea involved various platforms connected by wormholes. Each platform could be a grouping of several animals (the grouping logic is yet to be discussed/finalised). A user would navigate from one platform to another to experience the corresponding Zoophony. The navigation could either be initiated by pointing in the direction of a platform (which apparently is the easiest to implement, if I'm not wrong) or by some other, more nuanced interaction. The second world involved the user being inside an inverted globe. That would immediately introduce the concept of continents, and thus grouping the animals based on where they're most predominantly located. A point was made by someone about how continents might be difficult to remember/gauge (?) for some users. Chaitanya will propose these ideas to Sergey during their call later in the day. A decision will be made based on the outcome of what's discussed in the call.

At this point, I had an idea for a user-interaction for choosing where to head next. The interaction involves the dial of an old-school rotary dial-phone (please refer to the attached image for context). ![](https://i.imgur.com/x2d5Yoo.jpg) One immediate concern with the idea: can we condense all animals (100+) into 10 groups? Yet to be discussed.

Later in the day, we got on a call with Chaitanya to start modelling the animals. We began with our aerial friends. In particular, he started modelling a sparrowhawk. He took us through the entire process. Inspiration for such modelling is taken from several images of the bird/animal in question. These images act as reference for form and proportion. One must ensure to collect images from different angles and PoVs. In this line of work, observation is of prime, prime importance. Chaitanya's eye for detail was very impressive! I yearn to achieve that level of observation in this lifetime. It was quite astonishing to see how a simple cube can be extruded, slashed, and morphed into a low-poly bird.
Absolutely outstanding! Truly. The next task for us was to modify the model created by Chaitanya and morph it into other birds from the aforementioned list of animals and birds. I was assigned a Goshawk. The idea was to get the hang of the various tools within Blender, without getting too overwhelmed with creating a form from scratch.

**Nov 9, 2021 - Nov 12, 2021**

I started with the beak. It was slightly longer and more curved than a sparrowhawk's. Making sense of the edges and faces of the model, in addition to visualizing it in 3D space, was challenging. This is when I realised how important it is to be able to visualize in your head. You need to be able to see it in your head to be able to truly replicate it to relative perfection. Another important skill is the ability to think in terms of low-poly faces, especially while replicating curved surfaces. While I suspect that there is a way to use Bézier curves in Blender, we weren't making use of it. So, slashing faces at appropriate angles was quite an important learning curve to ride.

Nonetheless, after the beak, I progressed to the claws. I noticed how the Goshawk had its claws emanate from the upper portion of its fingers. I learnt how to detach parts from the main mesh, reposition them, and finally attach them back to the main mesh. Moving an object in 3D from point A to B is slightly more tedious than it might seem. Quite satisfying, though, once you achieve the repositioning. The tail didn't need much work. The wings needed feathers. That required me to modify the trailing edge of the sparrowhawk's wing. In my first attempt, I managed to mess it up royally. The lower face of the wing completely merged into the upper face, thus incorrectly altering the geometry. Next began the arduous process of redoing the wing geometry. Several rotations & extrusions later, a decent Goshawk wing was ready.
Over the next few days, I took up the task of modelling a Cockatoo, a Humpback Whale, and a Sperm Whale - the latter two being the first models that I created from scratch; by scratch, I mean from a damn cube. Ooof! More on them in a bit.

A cockatoo is a bird of grace. It is essentially a type of parrot and features the characteristic short but curved beak. Its limbs, however, are not as long as a hawk's. They're quite stout and sturdy. The claws are elaborate and elongated, though. They seem to be quite flexible too, bending backwards and tucking into the body during flight. The wings of the cockatoo were attained by modifying the wings of the Goshawk. Birds of prey tend to have sharp leading edges. The cockatoo, in contrast, has a more curved and smooth leading edge. Next, I went on to model the crest of the cockatoo. The crest was the only reason I even chose to model it.

By the time I began working on the Humpback Whale, I felt a lot more confident with Blender, at least with the very small set of tools within it that I had used. As was the case with every other model, I began by collecting several reference images. Since this, like all the previous models that I had worked on, was a symmetric model, I turned on Blender's Mirror modifier. Then began the process of chiselling a cube. I first worked on the top view of the humpback whale. Once the basic 'chassis' of the whale was ready, I began modelling the tail fin, the flippers, and finally the head. The flippers on a humpback whale are quite bumpy. Yet, these irregularities ('bumps') improve the hydrodynamic efficiency of the whale, and help keep its "grip" on the water at sharper angles and turn tighter corners, even at low speeds. All very interesting! Thankfully, the low-poly aesthetic that we were going for perfectly matched the actual bumps on the flippers of the model.

Next up, I morphed the humpback whale into a sperm whale. The mouth (the underpart of its head) was quite hard to achieve.
The body was slightly more elongated and wasn't as bulged as the humpback whale's. The trailing edge of the tail fin was slightly more rounded & curved too.

**Nov 15, 2021**

On this November day in 2021, we were finally able to add another tool to our Blender arsenal: model rigging. Rigging is the precursor to animation. It represents a 3D character model using a series of interconnected digital bones. The process begins by creating an armature, with its first bone placed inside the model's body. Next, you extrude one of the two ends of that bone, and continue doing so until a bone exists for each part of the animal's body, or at least for the parts we deem necessary for the animation we're trying to achieve. The process seemed quite intuitive to me and I seemed to grasp it almost immediately. My first rigging assignment was to rig Chaitanya's Hyena model. I ended up learning about keyframes and also how to loop animations. Quite amazing, truly!

**Nov 16, 2021**

This November Wednesday began on quite a bright note. The sun was out, but without the sweltering heat. Quite pleasant. A lot of the day was taken up by brainstorming, punctuated by brief modelling sessions. We tried to come up with a storyline for the exhibit. One primary question we tried to answer was, "What is it that we want to achieve with this project?" Defining a goal/objective became the immediate next task.

Oh, there was another activity that I failed to mention earlier. Chaitanya had come up with an exploration where the animal bodies would react to the audio being played by undulating to its rhythm. The undulations were often rapid and randomized. The undulated animal bodies somehow felt like the scales on a dinosaur's back. The immediate next thought was: what if animals could use their calls as defence mechanisms? So, each time they'd call out, the surface of their bodies would morph based on the sound that they emanate.
An interesting concept that came out of this was that an animal's inherent size would no longer matter. Their survival depends on the intensity of their war-cry! It did seem to make an impact on the minds of the others in the group, albeit for a short while! :P We did, however, park this idea to discuss other possibilities.

**Nov 17, 2021**

Today, we had a group huddle with Yashas and the AA folks after quite a while. Very nice! We went over our reflections so far and had a little chat about what our progress looked like. Post that, we finally got down to whiteboard-ing, to start sketching out the world that we were modelling in our heads the day prior. At this point, I'd like to bring up an incident. I wasn't on my best game when it came to observing. The lab already had a whiteboard, which I failed to notice. I almost needed a new pair of wrists after trying to lift one of 'em heavy beasts up two flights of stairs. Anyway, moving ahead. Chaitanya sketched out a view of the room from the aforementioned discussions. Throughout the rest of the day, we did a bit of modelling in Blender. Chaitanya also took us through some concepts and techniques of materials and textures in Blender. It was quite basic and intense, all at the same time. In the latter half of the day, I did have to break away from this project to work on a submission for my T&U class. That leaves me with some pending modelling tasks that I'll have to finish tomorrow.

I've also been trying to come up with a tagline for Zoophony. Criteria for consideration: succinct, impactful. The following are some ideas bouncing around within my head. I'm yet to assess the impactfulness of these. But, I've made a very conscious effort to keep things succinct.

* Zoophony - The Endless Eagerness to Escape (an alliteration was attempted)
* Zoophony - The Endless Urge to Escape
* Zoophony - The Cacophony that fuels the desire to Escape (not quite succinct, I know.)
* Zoophony - The Cacophony of Escape
* Zoophony - The Cacophony of Claustrophobia (another alliteration)
* Zoophony - The Claustrophobic Cacophony (yet another, oof)

I'm yet to share these with the team and get their review. Will mostly do that tomorrow.

**Nov 18 & 19, 2021**

Chaitanya had asked us to listen to the entire Zoophony track and note down what we felt throughout the piece. I did listen to the entire track and had a handful of moments stand out for me. The rest of the team did a great job of timestamping the entire piece, noting down the minutest of details that caught their attention. At the end of this exercise, we had quite a comprehensive list of ideas about what each bit of the piece could mean.

I tried an exercise where I played Zoophony in the background while I worked on something else. It definitely didn't seem overwhelming. However, when you focus your entire attention on just the piece, certain parts of it did feel like a lot was happening. But, in all honesty, with time, the piece started to sound like a symphony of emotions. It was almost as if the animals were clamouring to be heard; clamouring to end the suffocation and breathe in the fresh air of freedom. It's also quite interesting to try and understand the frame of mind Sergey might have been in while mixing and mashing, splicing and splashing all these different animal sounds together. Was he trying to attain freedom from something himself? Was this piece his way of documenting his efforts to be heard loud and clear? It also reminds me of the phrase "Ordo Ab Chao", or "Order from Chaos". The chaos in this case is the apparent cacophony of noises - a plethora of voices.

In the meantime, as a team, we had decided that we would want the entire experience to unfold within a cubical room. Thus began our efforts to model this room in Blender. This specific effort was spearheaded by Udita.
She went about building the four walls, and imported assets like a chair and a table, among other things, from a free marketplace. The rest of us worked on rigging all our previously-built models. I did seem to get quite good at rigging and was able to help out some of the others on the team with doubts and queries. Once again, Chaitanya, thanks for being so patient with us throughout the entire process.

**Week 4**

This week began with us 3D modellers trying to model hybrid animals. The idea stemmed from the fact that in most parts of the audio piece, the animal & bird sounds were indistinguishable from one another. So, might as well make weird, quirky animals. My take on this was to give an elephant trunk to a seagull. Other folks came up with quite interesting combinations themselves. Udita modelled a flying tiger, Nilanjana modelled a spiduck (a duck with spider legs), among other things, and Chinmay modelled a two-headed donkey.

But this was only one of the activities. Another major task at hand was to decide the look and feel of the aforementioned room. We were asked to explore various color combinations for the room. While Chinmay and I proposed palettes of our own, both Nilanjana and Udita went further and added the colors to a 2D mockup of the room in Illustrator. I had done a very basic study of color theory and proposed the following palette based on the emotions that colors convey: Grey (sadness), Red (anger), Green (nature, healing, fresh), Blue (trust, peace, loyalty), Black (dramatic), and Brown (rugged). Udita and I tried finding textures that could be added to the walls to make the room look like a cold and unforgiving prison cell. We were essentially going for a vibe where we wanted the user to feel what an animal feels when caged, with barely enough resources to survive. Chaitanya also wanted me to explore the Zoophony logo exercise from before in 3D, refining and modelling it accordingly.
One bit of feedback that I had received from Yashas on the previous logo exploration was that it was way too literal. I'm not sure if I must attribute this to my background in Engineering, where clarity of thought and the final outcome mattered most. In the meantime, Chaitanya also wanted me to work on typography and communication icons for the experience: essentially, a splash screen, an end screen, and any other in-game graphics that could be used. An attempt at designing the splash screen led to a composition that made its way into the marketing poster for this project and could also serve as a potential landing page hero image. The 3D composition essentially consisted of animal heads arranged in a visually pleasing lock-up.

Towards the end of the week, Yashas wanted to know if anybody in the lab could work on the website for the experience. The developer in me pushed my hand up in response to his query, and thus began the journey of designing and building a landing page for Zoophony.

**Week 5**

A lot of effort was put into building the website. Upendra was a major collaborator on this task. He started out by building the basic outline of the website. In the meantime, I had to figure out a way to load 3D models on the web. My first experiment was with p5.js, a JavaScript library for creative coding. While trying to load models, I noticed that they would always be flipped by 180 degrees. Also, I couldn't get the texture to load. As a result, I began to take a look at three.js. three.js, while powerful, seemed like it would take a lot of code to display one model. Not one bit feasible. It was at the pinnacle of my failure that I realised that I had to find another way that works. Thankfully, after some Googling, I found `<model-viewer>`, an open-source web component developed by Google. It seems like a potent all-in-one tool. It lets one perform a bunch of actions on the imported model.
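For a sense of why `<model-viewer>` felt so much lighter than three.js: the basic usage is just one script tag and one custom element. The sketch below is illustrative, not our actual page; the model path (`models/spiduck.glb`) is a hypothetical placeholder, and the CDN version number may differ from what we eventually shipped.

```html
<!-- Load the <model-viewer> web component from Google's CDN -->
<script type="module"
  src="https://ajax.googleapis.com/ajax/libs/model-viewer/3.5.0/model-viewer.min.js"></script>

<!-- Display a glTF/GLB model with orbit controls and auto-rotation.
     "models/spiduck.glb" is a hypothetical asset path. -->
<model-viewer src="models/spiduck.glb"
              alt="A low-poly hybrid animal from Zoophony"
              camera-controls
              auto-rotate>
</model-viewer>
```

The `camera-controls` attribute gives the user orbit/zoom for free, which is exactly the "bunch of actions" that would otherwise take pages of three.js boilerplate.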
With that out of the way, I began working on explorations for the hero section of the landing page. p5.js played an important role here. My first attempt was to replicate a Perlin-noise particle system as a background for the hero section. Next, I explored a Brownian-motion particle system. The exploration that I personally liked the most was one specific iteration of the Perlin noise. The team is yet to see this exploration.

Another bit of work that took up some time was writing truly mobile- and desktop-responsive code for the website. I employed the services of Tailwind CSS to build responsive components for the website. While this happened, Karthik, Dhruva, and Chaitanya wrestled with ways to make the VR experience work without a hitch on the web. They did run into some hiccups along the way and are currently in the process of working them out.

One other highlight of the final week was the film crew that came along to film a sort of documentary of the entire Zoophony project. I'm extremely grateful to Yashas for letting me be a part of the interview. Really looking forward to the final film. In the meantime, I continue to work out aesthetic elements for the website, which I'm hoping to share with Yashas and the rest of the team soon.
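As a footnote on the Perlin-noise hero-section exploration: the core trick is that a smooth noise field steers each particle's heading, so the drift looks organic rather than jittery. The actual exploration used p5.js's built-in `noise()`; the plain-JavaScript sketch below is only an assumed, simplified stand-in (hand-rolled 1D value noise) to show the principle.

```javascript
// Minimal sketch of a noise-driven flow-field particle, assuming plain JS
// instead of p5.js. makeNoise() returns a smooth, deterministic function
// mapping any x to a value in [0, 1) -- a simplified stand-in for p5's noise().
function makeNoise(seed = 1) {
  // Deterministic pseudo-random value in [0, 1) at each integer lattice point.
  const lattice = (i) => {
    const v = Math.sin(i * 127.1 + seed * 311.7) * 43758.5453;
    return v - Math.floor(v);
  };
  // Smoothstep easing so the interpolation has no visible creases.
  const smooth = (t) => t * t * (3 - 2 * t);
  return (x) => {
    const i = Math.floor(x);
    const f = smooth(x - i);
    // Linear interpolation between the two neighbouring lattice values.
    return lattice(i) * (1 - f) + lattice(i + 1) * f;
  };
}

// Each frame, a particle reads the noise field and drifts along the angle it
// encodes -- the same idea behind flow-field backgrounds in p5.js sketches.
const noise = makeNoise(42);
const particle = { x: 200, y: 200 };
for (let t = 0; t < 2; t += 0.01) {
  const angle = noise(t) * Math.PI * 2; // map noise in [0,1) to a heading
  particle.x += Math.cos(angle);
  particle.y += Math.sin(angle);
}
```

In the real p5.js version, the draw loop renders each particle every frame and samples the noise field by position and time, which is what produces the slowly swirling background.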