Using a Matterport scan of an event space as a show floor where exhibitors can display virtual content and give presentations in an open mixed reality dev environment.
Originally published October 2018
devlog
conference
janus
scans
There are several reasons why someone would want to create a conference about VR in VR.
Some people have asked, "Why Janus and not other platforms such as VRChat or High Fidelity?"
Other platforms require users to download a large native client that usually only works on desktops, lack easy tools for building the conference center, and impose long compile-and-share iteration cycles that make collaboration difficult. Also, very few offer the creator any kind of analytics, because the networking servers are proprietary.
Janus excels at reducing content creation time and effort; the visual, drag-and-drop style of editing is part of a UX that produces very fast iteration cycles. Because the project files are mostly HTML, we can make edits and get instant feedback by simply refreshing instead of recompiling. The presence server is completely open source, which at least makes analytics possible.
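To give a sense of how lightweight this is: a Janus world is essentially a web page with FireBoxRoom markup embedded in it (classically inside an HTML comment, which the clients parse). A minimal sketch, with placeholder asset filenames standing in for the scan export:

```html
<html>
<head><title>Conference Hall</title></head>
<body>
<!--
<FireBoxRoom>
  <Assets>
    <AssetObject id="hall" src="hall.obj" mtl="hall.mtl" />
  </Assets>
  <Room>
    <Object id="hall" pos="0 0 0" />
  </Room>
</FireBoxRoom>
-->
</body>
</html>
```

Editing the scene is then just editing this file and hitting refresh.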
With Janus we can also draw on previous experience from GatherVR, a VR conference about VR built in JanusVR back in May 2015. It took a total of two weeks to go from concept to execution, with 3 live speakers, 26 exhibits, and at least 40 people connected at the same time without anybody crashing. Imagine what we can do now with the power of WebVR, where users don't even have to download a client: they can just open a link in a regular 2D browser on their mobile, desktop, or standalone VR devices to attend.
In late August, while randomly searching the internet for Matterport scans, I got lucky and stumbled upon a message from a collective in Oakland, California that had scanned their space. Even though the messages were from four years ago, the links to the files still worked!
https://matterport.com/blog/2019/08/09/how-3d-and-vr-changing-event-marketing/
It turns out the 3D scans of the hackerspace and the ballroom were part of a much larger community center called the Omni Commons, where various collectives sharing DIY interests gather. Very cool!
There's quite a lot of activity happening at the commons on any given day, as evidenced in this calendar snapshot taken during the month of this writing. I thought it was cool that they color-coded the events by the rooms they take place in.
Here's what a typical workshop event looks like in the main ballroom.
Despite the tattered state of the scan, it had a charm that traditionally made CG models have trouble producing. Fixing up the rough mesh began with deleting clutter fused to it, especially on the ballroom floor, which left many holes in the ground where the piano, tables, and chairs once stood.
To fill in the floor, I cloned patches of the ground and roughly sized them to cover the holes. For tears in the walls, I duplicated brick wall sections along the sides and joined them together.
The next detail I added was spinning ceiling fans, made by adding `rotate_deg_per_sec` as an attribute to the ceiling fan objects in the Janus markup.
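As a rough sketch (object and asset names here are placeholders, and exact attribute support can vary between Janus clients), the fan markup looked something like:

```html
<FireBoxRoom>
  <Assets>
    <AssetObject id="fan" src="ceiling_fan.obj" mtl="ceiling_fan.mtl" />
  </Assets>
  <Room>
    <Object id="fan" pos="0 4 0" rotate_axis="0 1 0" rotate_deg_per_sec="120" />
  </Room>
</FireBoxRoom>
```

Here `rotate_axis` picks the spin axis (straight up) and `rotate_deg_per_sec` sets the speed, so the fans animate with no scripting at all.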
Every Monday at 2pm Pacific, users meet up in hifi://maker to share VR creations, WIPs, and more. I wanted to make this event appeal to those outside of High Fidelity by livestreaming it into a WebVR scene.
These examples also contained a portal that, upon passing through, instantly transports you into the hifi://maker world the livestream was coming from.
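In Janus markup, a streamed video plane and a walk-through portal are each a single element. A hedged sketch (the stream URL is a placeholder, and hifi:// links only resolve in clients that handle that scheme):

```html
<FireBoxRoom>
  <Assets>
    <AssetVideo id="maker_stream" src="https://example.com/maker.m3u8" auto_play="true" />
  </Assets>
  <Room>
    <Video id="maker_stream" pos="0 2 -5" scale="4 2.25 1" />
    <Link pos="3 0 -5" url="hifi://maker" title="Enter Maker Monday" />
  </Room>
</FireBoxRoom>
```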
There was now a lot of space cleared up on the main floor.
JackpotVR and GambleVR are virtual reality casinos built in 2015 using JanusVR, where users could play web-based slot machine games.
In 2017, a virtual Vegas strip experience was created separately as a means of initiating an experience after a Bitcoin payment had been processed.
GatherVR was created by two well-known JanusVR developers, Aussie and FirefoxG, who passionately believe that virtual conferencing will give people around the world new opportunities to connect, interact, share, and learn.
Since 2014, GeniusVR has experimented with conceptualizing virtual classrooms for some of the biggest online learning resources, such as Khan Academy and TED.
Did you know that learning retention rates in immersive education environments have been cited as high as 80%, compared to a 15% average for traditional reading- and lecture-based learning? See demos such as exploring the surfaces of other planets generated from NASA satellite data, and an interactive periodic table of elements with a private cinema for each element.
Spyduck is a web and backend developer, VR designer and 3D modeler. Over time he also began to focus more on developing software to record and manage company meetings in VR as well as creating metaverse bots with various capabilities for interaction and search. Currently he is the lead designer of Vesta for JanusVR.
Deskcloud is a suite of open source software for empowering individuals or teams to communicate and collaborate with little to no friction. One of the use cases built with Deskcloud is an online digital workplace where a player interacts with computers in virtual reality.
Dizzket is a professional UI/UX designer and web developer whose passion lies at the intersection of human interaction and computers. A retro gaming enthusiast, he has worked on various indie gaming projects. He enjoys working in virtual reality, having created a wide array of content for JanusVR, a 3D web browser.
A WIP 3D frontend to a p2p virtual marketplace: like a decentralized eBay built with open source software. Check openbazaar.com for progress on their web client.
In May 2018, SVVR launched the MULTIVERSE initiative, an open design initiative powering real-time, live event communications between real locations and virtual worlds. An open-source development kit is expected sometime in late 2018 for creators to start merging worlds of their own.
Streaming an SVVR MULTIVERSE event from HiFi into JanusWeb, 9-20-2018
To work around the current limitations of primitive avatar support in WebVR, I have been experimenting with a green screen pipeline to bring in more expressive avatars from other sources. I'm using VRChat as the mocap studio because of its great avatar support, and the in-game stream camera has a green screen filter which makes it super easy to record quality green screen footage anywhere!
In just a few lines of code I was able to add a chromakey shader and a billboard script to my green screen videos, creating the effect of a hologram on stage that always faces the player.
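The wiring for that effect can be sketched in Janus markup as below. The filenames are placeholders, the chromakey fragment shader itself would live in `chromakey.frag` (discarding pixels close to the key green), and the billboard behavior would come from a small custom script attached via `js_id` rather than a built-in attribute:

```html
<FireBoxRoom>
  <Assets>
    <AssetVideo id="speaker" src="greenscreen_talk.mp4" auto_play="true" loop="true" />
    <AssetShader id="chromakey" src="chromakey.frag" />
  </Assets>
  <Room>
    <Object id="plane" js_id="speaker_holo" video_id="speaker" shader_id="chromakey"
            pos="0 1 -3" lighting="false" />
  </Room>
</FireBoxRoom>
```

Disabling lighting on the plane keeps the keyed footage from being tinted by the room's lights, which helps sell the hologram look.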
Here is a clip from July 2018 of me practicing streaming my avatar and desktop simultaneously using OBS and HLS.
I'm back a few months later with RTC streaming into a WebVR site; this was recorded in October 2018.
Kent Bye appeared on an episode of Gunter's Universe on Oct 14. In VRChat you can open an in-game camera with various filters, including a green screen option. Check out this sample from almost two hours of amazing footage of the two having a conversation in the metaverse.
The green screen recording allows us to put the entire show in a whole new context. We can drop it into new worlds, or into a green screen world where we re-record clips while drag-and-dropping visual artifacts of the things the speakers are talking about.
One of the coolest things I've seen in VRChat is the dancers with full-body mocap. It turns out there's a whole community built around dancing in VRChat, and I was lucky to be introduced to a couple who didn't mind being recorded against a green screen. Since the in-game stream camera supports a green screen filter, we can shoot green screen video with perfect lighting in any world!
It would be very beneficial to scan a conference center before a VR/AR conference takes place. According to The VR Fund, there have already been nearly 100 major VR/AR conferences and tradeshows this year (2018), and none of them have really utilized the power of the immersive web.
Our ability to recall the value of a conference quickly diminishes over time; many of the presentations and digital pictures/files get scattered into the winds of social media afterwards, and conversations and business cards become forgotten.
I plan to create a vision for what the future of conferences and presentations can look like, using a Matterport scan of a real conference center that I was fortunate to get.
Having the scan of the space is useful for event planning and for delivery afterwards as well. Memories people share during the event can be collected and accessed in a WebVR time capsule that's self-hosted on the event website.
TO BE CONTINUED!
Update: Part 2: https://hackmd.io/@xr/conf2