# Lightmapping for Web

{%hackmd @xr/dark %}

###### tags: `devlog`

## Benefits

Baking lightmaps into the diffuse texture is a very cumbersome process that also bloats the project. Having a separate UV set for the lightmap allows artists to create beautiful scenes that are still performant. In addition, handling lightmaps this way lets artists keep using the workflows they already have.

## Test Space

- Download: https://booth.pm/en/items/1943320
- VCC version: https://vrchat.com/home/world/wrld_56c605b0-0be6-4a53-9be7-1933baa593df
- SDK3.0 version: https://vrchat.com/home/world/wrld_fbee6f9f-fd23-43aa-b8c1-5c30552c3dfc

![](https://i.imgur.com/iUnfXPp.png)

### MX Lightmap

- https://deploy-preview-557--thirdroom.netlify.app/docs/gltf/MX_lightmap/README.html

The `MX_lightmap` extension can be added to any `node` property with a `mesh`.

```json
"nodes": [
  {
    "mesh": 0,
    "extensions": {
      "MX_lightmap": {
        "lightMapTexture": {
          "index": 0,
          "texCoord": 1
        },
        "scale": [0.5, 0.5],
        "offset": [0.5, 0.5],
        "intensity": 1.0
      }
    }
  }
]
```

This allows you to specify lightmaps on a per-node basis, with scale and offset values per object, which is good when you want to reuse a lightmap and atlas several objects into one lightmap texture. That enables a lot more reuse: combined with the mesh instancing extension, the scale/offset can go into a per-instance buffer so each instance samples its own region of the lightmap. The lightmap texture uses the proper glTF texture info format: in glTF a texture index is rarely referenced directly; you normally go through a texture info object, which is useful for lightmaps because you need to know which UV set to sample the lightmap with.
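Purely as a sketch of how a viewer could consume this (not part of the proposal): a three.js `GLTFLoader` plugin could read the per-node extension and wire it into `material.lightMap`. The plugin class name is made up here; the extension field names come from the JSON above, and it assumes three.js r152+ (per-texture `channel` selection and per-map transforms).

```ts
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

// Hypothetical plugin: reads MX_lightmap from each node and assigns the
// lightmap to the node's mesh materials. Not an official implementation.
class MXLightmapPlugin {
  name = 'MX_lightmap';

  constructor(private parser: any) {}

  afterRoot(): Promise<void[]> | null {
    const nodeDefs = this.parser.json.nodes ?? [];
    const pending: Promise<void>[] = [];

    nodeDefs.forEach((nodeDef: any, nodeIndex: number) => {
      const ext = nodeDef.extensions?.[this.name];
      if (!ext) return;

      pending.push(
        Promise.all([
          this.parser.getDependency('node', nodeIndex),
          this.parser.getDependency('texture', ext.lightMapTexture.index),
        ]).then(([object, texture]: [THREE.Object3D, THREE.Texture]) => {
          object.traverse((child) => {
            const mesh = child as THREE.Mesh;
            if (!mesh.isMesh) return;
            const material = mesh.material as THREE.MeshStandardMaterial;
            const lightMap = texture.clone();
            lightMap.channel = ext.lightMapTexture.texCoord ?? 1; // which UV set to sample
            if (ext.scale) lightMap.repeat.fromArray(ext.scale);   // per-object atlas scale
            if (ext.offset) lightMap.offset.fromArray(ext.offset); // per-object atlas offset
            material.lightMap = lightMap;
            material.lightMapIntensity = ext.intensity ?? 1.0;
            material.needsUpdate = true;
          });
        })
      );
    });

    return pending.length ? Promise.all(pending) : null;
  }
}

// Usage:
const loader = new GLTFLoader().register((parser) => new MXLightmapPlugin(parser));
```

With the mesh instancing extension, the same idea would move `scale`/`offset` into a per-instance attribute instead of a per-material texture transform.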
### Moz Lightmaps

**Difficulty**

Trying to bake lightmaps for glTF objects using default Blender is a PAIN! To give you a perspective on the current state of lightmap baking on the web, consider that Mozilla Hubs published a guide on lightmap baking that's ~16 pages long: https://hubs.mozilla.com/labs/ez-bake-lightmapping/

![](https://i.imgur.com/MfIwXe4.jpg)

- https://github.com/MozillaReality/hubs-blender-exporter#using-lightmaps
- https://www.youtube.com/watch?v=ldHwbnMMKVY (2-hour video tutorial)

**Using Lightmaps**

To use a lightmap, create a MOZ_lightmap node from the Add > Hubs menu and hook up an image texture to the Lightmap input. Use a UV Map node to control which UV set should be used for the lightmap, as you would with any other texture in Blender.

**lightmap node**

Note that for use in Hubs, you currently MUST use the second UV set, as three.js is currently hardcoded to use that for lightmaps. This will likely be fixed in the future so the add-on does not enforce this.

**setting bake UV**

**Testing glTF lightmaps**

> Indeed the [Hubs blender plugin](https://github.com/MozillaReality/hubs-blender-exporter) works for exporting lightmaps.
> Here's a look at the shader editor for reference after baking the lightmap as a separate UV:
>
> ![blender_yPSxrfp43N](https://user-images.githubusercontent.com/32600939/111078557-779e7400-84cc-11eb-98af-4a289cdabf69.png)
>
> If the shadows are too soft, one can import the lightmap into an image-processing editor like GIMP and make the shadows darker using curves:
>
> ![shadows_lightmap](https://user-images.githubusercontent.com/32600939/111078616-bdf3d300-84cc-11eb-9c36-3ab82bf0a3ca.jpg)
>
> Here are a couple of side-by-side looks between VRChat (left) and Webaverse (right):
>
> ![sbs_atoll1](https://user-images.githubusercontent.com/32600939/111078630-cfd57600-84cc-11eb-8197-cbfb09d8d4d8.jpg)
>
> ![sbs_atoll2](https://user-images.githubusercontent.com/32600939/111078694-2c389580-84cd-11eb-8a12-3cf6a9ec944a.jpg)

https://github.com/webaverse/app/issues/960

> ![image](https://user-images.githubusercontent.com/32600939/111079016-7cfcbe00-84ce-11eb-9fbc-0d5a7b24edf3.png)
>
> ![image](https://user-images.githubusercontent.com/32600939/111079059-aa496c00-84ce-11eb-985b-baf27ac4a8a6.png)

https://github.com/webaverse/app/issues/960

---

### NEEDLE_lightmaps

- https://engine.needle.tools/docs/export.html#exporting-lightmaps
- https://github.com/needle-tools/needle-engine-support/blob/main/documentation/technical-overview.md#needle_lightmaps

Main difference: similar to Unity, objects can have separate lightmap UVs (UV1, etc.). This aligns closely with what Unity does, which is arguably the right choice: each object keeps its own UVs, so you don't need to bake everything onto the same object. With MOZ lightmaps you need individual UV instances / coordinates per object. Objects should stay individual instances (it makes sense for performance / storage); the exporter writes a lightmap index plus a scale/offset that shifts the UV coordinates per object.

![](https://i.imgur.com/J6WmfDz.png)

**How it looks in Unity**

![image](https://user-images.githubusercontent.com/32600939/215252640-e1e8c7e7-edb7-4ede-861e-f973d2d62cf5.png)

**How it looks with Needle Engine (threejs)**

![image](https://user-images.githubusercontent.com/32600939/215252667-92af3fe9-2008-4435-8650-8b21c56e5ba2.png)

**How it looks in [gltf-viewer](https://gltf-viewer.donmccurdy.com/)**

![image](https://user-images.githubusercontent.com/32600939/215252677-e28af83d-9998-42b3-924d-2aef8ce49436.png)

**How it looks in [hyperfy.io](https://hyperfy.io/uq9p8o1qjq)**

![image](https://user-images.githubusercontent.com/32600939/215253456-757c17d0-10f0-4165-b828-de89b1dadeaf.png)

This is a root extension defining a set of lightmaps for the whole glTF document. The texture itself is referenced via a JSON pointer; a lot of things in Needle's tooling use JSON pointers. NEEDLE_lightmaps covers lightmaps, environment maps, and reflection maps. The parameters are similar to those coming out of Unity, and the same structure is exported from Blender as well, so both Unity and Blender can produce exactly what you see here. Instanced objects can get proper lightmaps this way, which is otherwise hard if there's no offset per object.

```json
"NEEDLE_lightmaps": {
  "textures": [
    {
      "pointer": "textures/20",
      "type": 1,
      "index": 0
    }
  ]
}
```

> **Note**: At the moment this extension also contains environment texture references. We're planning to change that in a future release.

| Texture Type | Value |
| -- | -- |
| Lightmap | 0 |
| Environment Map | 1 |
| Reflection Map | 2 |

How lightmaps are applied is defined in the `MeshRenderer` component inside the [`NEEDLE_components`](#needle_components) extension per node:

```json
"NEEDLE_components": {
  "builtin_components": [
    {
      "name": "MeshRenderer",
      ...
      "lightmapIndex": 0,
      "lightmapScaleOffset": {
        "x": 1.00579774,
        "y": 1.00579774,
        "z": -0.00392889744,
        "w": -0.00392889744
      },
      ...
    }
  ]
}
```

> **Note**: We may change that in a future release and move lightmap-related data to a `NEEDLE_lightmap` extension entry per node.
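As a rough sketch of what that scale/offset means at runtime, assuming the Unity convention where `x, y` scale and `z, w` offset the second UV set: the effective lightmap UV is `uv1 * (x, y) + (z, w)`. In three.js that could be expressed through the lightmap's texture transform; the function below is ours, not Needle Engine's API, and assumes three.js r152+.

```ts
import * as THREE from 'three';

// Hedged sketch: apply a Unity-style lightmapScaleOffset (x, y = UV scale;
// z, w = UV offset) to a three.js lightmap via the per-map texture transform.
function applyLightmapScaleOffset(
  material: THREE.MeshStandardMaterial,
  lightmap: THREE.Texture,
  scaleOffset: { x: number; y: number; z: number; w: number },
  intensity = 1.0
): void {
  const map = lightmap.clone();
  map.channel = 1;                              // sample from the second UV set (three.js r152+)
  map.repeat.set(scaleOffset.x, scaleOffset.y); // per-object scale into the shared lightmap
  map.offset.set(scaleOffset.z, scaleOffset.w); // per-object offset into the shared lightmap
  map.flipY = false;                            // glTF textures are not flipped
  material.lightMap = map;
  material.lightMapIntensity = intensity;
  material.needsUpdate = true;
}
```

Depending on the exporter, the V axis may also need flipping between Unity's and glTF's UV conventions.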
"lightmapIndex": 0, "lightmapScaleOffset": { "x": 1.00579774, "y": 1.00579774, "z": -0.00392889744, "w": -0.00392889744 }, ... } ] } ``` > **Note**: We may change that in a future release and move lightmap-related data to a `NEEDLE_lightmap` extension entry per node. Needle and MX Lightmap are very similar, just slight syntax changes. Has intensity. Texture index from glTF spec, index / coordinate to specify which UV to use. Instancing via mesh gpu isntancing. Needle made decision that instancing should be viewer, instead of mesh gpu instancing, be smart in runtime and be smart about instancing what makes sense - export without extension by default. Wouldn't want to use for batching, should be engine level optimization instead of asset. MX lightmap will probably go back to what needle_lightmap will do. **Questions** RGBM calculations: similar to unity / needle, needs to be explained what happens with indirect lighting, how do lightmapped objects interract with lights / IBL. When you have IBL, it usually is baked into lightmaps, so basically image based lighting shouldn't effect objects anymore, if extension have proposal should have sections for how to mesh things together lighting model wise. Explain how to treat indirect / direct terms, document that, might need to patch threejs for some things. Can't use compressed textures atm because of basis limitations atm, have to use PNGs atm. Looking into EXR, could maybe abuse KTX2 (container) to store float data. Can store bitmap. I'm lost lol, ac6? Float textures? ## glTF Transform - https://github.com/donmccurdy/glTF-Transform/issues/671 ![](https://i.imgur.com/LllJNkc.png) - https://github.com/donmccurdy/glTF-Transform/issues/85 ![](https://i.imgur.com/OgImmQh.png) --- glTF webp support got merged in godot as well. webp doesn't change much in this regards atm, alternative to store alpha texture info, but makes it more straightforward for godot to read EXR get assets in / out of authoring programs (blender/unity/godot/etc), and expose things in the assets in a way you can program against them. finding common subsets across different engines --- ## Playcanvas - https://developer.playcanvas.com/en/user-manual/graphics/lighting/runtime-lightmaps/ runtime lightmaps - https://developer.playcanvas.com/en/user-manual/graphics/lighting/lightmapping/ general info - https://forum.playcanvas.com/t/is-there-a-way-extract-and-save-lightmap/24156/6 Exporting lightmaps One of the better "all in one" bakes SM SithLord did: https://somniumspace.com/parcel/1761 > There's 2 ways to do baked lighting in PlayCanvas. Regular "all in uv0" works everywhere, but PlayCanvas also supports UV1 lightmaps as well - allowing for tiled textures in uv0. ![](https://hackmd.io/_uploads/H1ap4Lf82.png)