# Style Guide/Rules
* Direct quotes should be put in >, like this
@ValorZard
> I like cats
* Indirect quotes/content originating from quotes should have the person that said it referenced at the end.
I like cats (and also dogs) by @ValorZard
* This is just so that we can know who said what originally and where we can go to for more information.
# Bevy Editor-UI research
## Target
https://github.com/bevyengine/bevy/issues/254
Before we can start work on the Bevy Editor, we need a solid UI implementation. Bevy UI already has nice "flexbox" layout, and we already have a first stab at buttons and interaction events. But Bevy UI still needs a lot more experimentation if we're going to find the "right" patterns and paradigms. Editor-Ready UI has the following requirements:
- Embraces the Bevy architecture: Bevy ECS, Bevy Scenes, Bevy Assets, Bevy Events
- A Canvas-style API for drawing widgets with shapes and anti-aliased curves
- Define a consistent way to implement widgets
- A core set of widgets: buttons, inputs, resizable panels, etc
- Theme-ability
- "Interaction" and "focus" events
- Translation-friendly. We can't be anglo-centric here
Additional "target" quotes from Discord:
> @cart It must be both modular (not just a bunch of hard coded functionality), well integrated into Bevy ECS, and express-able via Bevy Scenes. Bevy UI will be used for the editor, but what Bevy UI looks like (and if we base it on something) is still up for debate
> @KamBam For bevy’s editor, would it likely include a text editor like Godot?
> @cart It's a pretty low priority. Doesn't seem worth the investment. We should be building game engines, not text editors
## Concerns
### ECS & Functional Reactive Programming
As Cart has stated multiple times, any UI solution should feel Bevy-native, and therefore use the ECS:
> @mem is the idea that Bevy UI is built using the ECS i.e. UI elements are entities, or merely that Bevy UI is optimised specifically for interacting with the ECS
> @cart yup both!
However, in the web world there is some degree of consensus and state of the art emerging regarding functional and reactive programming, perhaps best popularised by React, but also by others like the Elm framework and Cycle.js (where the latter two are more "purely" functional). These ideas also show merit for native development, as shown by React Native and Flutter.
It's not immediately clear how FRP and ECS could live together.
> james.i.h
> i think you could do an elm style thing with ECS, components are basically just data models, you can setup separate systems to handle rendering ("views") and updates, the main disconnect is ecs has no built in concept of a messaging system, but bevy has events, which probably covers that
https://raphlinus.github.io/personal/2018/05/08/ecs-ui.html
https://raphlinus.github.io/ui/druid/2019/11/22/reactive-ui.html
https://raphlinus.github.io/rust/druid/2020/09/25/principled-reactive-ui.html
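To make the Elm comparison concrete, here is a minimal sketch (plain Rust, no Bevy dependency; all names are illustrative) of the Elm-style model/message/update triad that james.i.h describes mapping onto components, events, and systems:

```rust
// A minimal Elm-style "model / message / update" loop in plain Rust,
// standing in for what an ECS version might look like: the model is
// component data, messages are events, and `update` is a system.
#[derive(Debug, PartialEq)]
struct CounterModel {
    count: i32,
}

enum Msg {
    Increment,
    Decrement,
}

// Pure update function: consumes a message and mutates the model.
fn update(model: &mut CounterModel, msg: Msg) {
    match msg {
        Msg::Increment => model.count += 1,
        Msg::Decrement => model.count -= 1,
    }
}

fn main() {
    let mut model = CounterModel { count: 0 };
    // In Bevy terms, these messages would arrive through an EventReader.
    for msg in [Msg::Increment, Msg::Increment, Msg::Decrement] {
        update(&mut model, msg);
    }
    assert_eq!(model.count, 1);
    println!("final count: {}", model.count);
}
```

The missing piece, as noted above, is the messaging channel, which Bevy's events would likely provide.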
### Immediate Mode (imgui) vs Retained Mode
Many Discord users extol the virtues of imgui (e.g. Dear ImGui via `imgui-rs`, or `egui`). However, no imgui solution is likely to be the whole story.
> @Cart yeah we are definitely shooting for a "retained" api, but "immediate" also has value. it would be interesting to use immediate mode drawing with a retained wrapper around it
> @StarToaster I think the hardest part is fitting imgui or whatever into ecs.
> @cart and layout
There has been some appetite for developing imgui experiments as standalone plugins. It's true that imgui remains very popular, as it allows one to get going very quickly. However, see the Unity section for concerns about the long-term viability of imgui.
That said, performance seems to be a solved problem in more modern imgui implementations:
> OvermindDL1 they build a full scene in the background and as you submit the calls it just walks down it comparing the commands, and if it doesn't match then it updates it. It's extremely fast and gives you both the speed of cached UI's while being extremely dynamic in call styles
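A rough sketch of that caching approach, in plain Rust with made-up types (no real imgui library involved): draw commands submitted immediate-mode style are diffed against last frame's retained list, and only mismatches are flagged for redraw.

```rust
// Sketch of the caching trick described above: the "immediate" API submits
// draw commands each frame, and a retained cache diffs them against last
// frame's commands, so only changed entries need re-recording.
#[derive(Clone, PartialEq, Debug)]
enum DrawCmd {
    Rect { x: i32, y: i32, w: i32, h: i32 },
    Text(String),
}

struct RetainedCache {
    last_frame: Vec<DrawCmd>,
}

impl RetainedCache {
    fn new() -> Self {
        Self { last_frame: Vec::new() }
    }

    // Returns the indices of commands that changed since last frame;
    // a real renderer would only redraw those.
    fn submit(&mut self, commands: Vec<DrawCmd>) -> Vec<usize> {
        let dirty: Vec<usize> = (0..commands.len())
            .filter(|&i| self.last_frame.get(i) != Some(&commands[i]))
            .collect();
        self.last_frame = commands;
        dirty
    }
}

fn main() {
    let mut cache = RetainedCache::new();
    let frame = vec![
        DrawCmd::Rect { x: 0, y: 0, w: 10, h: 10 },
        DrawCmd::Text("Save".to_string()),
    ];
    // First submission: everything is new, so everything is dirty.
    assert_eq!(cache.submit(frame.clone()), vec![0, 1]);
    // Identical second frame: nothing needs redrawing.
    assert_eq!(cache.submit(frame), Vec::<usize>::new());
}
```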
### Layout
The layout engine is a key component of any UI solution. Bevy currently uses the flexbox-based `stretch` crate; however, see the `iced` section for concerns. There is an open question whether we should use something more custom, as Flutter does, or go for something more standards-based, like flexbox.
The layout layer should be decoupled from both rendering and other parts of the UI like scheduling, message passing or widget state.
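As a sketch of what such a decoupled layout layer might look like (illustrative only; not Bevy's or `stretch`'s actual API), layout can be a pure function from a container size and per-child preferences to placements, knowing nothing about rendering, scheduling, or widget state:

```rust
// Layout as a pure function: container size plus per-child (fixed width,
// flex factor) pairs in, computed placements out. Nothing else leaks in.
#[derive(Debug, PartialEq)]
struct Placement {
    x: f32,
    width: f32,
}

// Lay children out in a row; children with `flex > 0` share leftover
// space, loosely like flexbox `flex-grow`.
fn layout_row(container_width: f32, children: &[(f32, f32)]) -> Vec<Placement> {
    let fixed: f32 = children.iter().map(|(w, _)| w).sum();
    let total_flex: f32 = children.iter().map(|(_, f)| f).sum();
    let leftover = (container_width - fixed).max(0.0);
    let mut x = 0.0;
    children
        .iter()
        .map(|&(w, f)| {
            let extra = if total_flex > 0.0 { leftover * f / total_flex } else { 0.0 };
            let placement = Placement { x, width: w + extra };
            x += w + extra;
            placement
        })
        .collect()
}

fn main() {
    // A 20px fixed label plus two flex children sharing the remaining 80px.
    let out = layout_row(100.0, &[(20.0, 0.0), (0.0, 1.0), (0.0, 1.0)]);
    assert_eq!(out[1], Placement { x: 20.0, width: 40.0 });
    assert_eq!(out[2], Placement { x: 60.0, width: 40.0 });
}
```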
### Styling
It's true that users need to be able to customise widgets ("theme-ability" in the original target description). There hasn't been much discussion yet of what this might look like in Bevy, but separating data from presentation is long-established wisdom; CSS is the obvious example.
Localization is one particularly complex and important case of this separation of concerns.
### Rendering
A canvas-style API for drawing widgets with shapes and anti-aliased curves is one of the targets.
Concerns here probably include font rendering, and diffing so that only dirty elements are redrawn.
Seems there are multiple bedrock options for this:
> @cart https://github.com/bevyengine/bevy/issues/90
> We're currently considering lyon, pathfinder, and piet as potential providers of this functionality
### Headless Editing
Cart has expressed a desire to make it possible to hand-edit scene files.
> @cart a stretch goal for the editor design is for aspects of it to be embeddable in games
There has also been some debate on whether it should be possible to develop Bevy games entirely outside an editor:
https://github.com/WesterWest/bevy-editor/issues/1
### Widget API
@bd
>In general I use components for intra-widget (but multi-entity) communication, and events for inter-widget communication
>For instance, I am making a radio button widget. It's composed of a container and buttons as its children. The container keeps the state of which child is selected. When a button is clicked, it updates the state in the container. That's intra-widget communication. It also fires a RadioButtonSelected event, which any system can listen to, and filter based on marker components given to the widget builder.
>My experimentation right now is to make an ergonomic builder, and have a pattern that works for any widget
>The root entity of a widget has the state in a component, and for more complex widgets other entities can have a component that points to the root entity
@alice
>This would get even nicer once const generics land; you could construct a query that uses With<Widget<42>> (number standing for the entity) to grab the data for specific UI component
>(and even better once the full const generics lands, allowing us to use strings or the Label type that we've been chewing on for systems and stages for UI widgets too)
>Every UI widget should have a unique tag component associated with it for ease of querying its state.
>This can be a unit struct for simple traits or a specific generic type (either with a const generic or more unit structs) when you have many similar UI elements
```rust
// Illustrative sketch: the Entity is the widget's root, and the marker
// type T identifies which button was clicked.
struct ButtonClicked<T>(Entity /* entity of parent */, PhantomData<T>);
struct MyWidgetState(i32);

mod my_widget_buttons {
    pub struct Increment;
    pub struct Decrement;
}
use my_widget_buttons::*;

fn handler(
    mut query: Query<&mut MyWidgetState>,
    mut increment_reader: EventReader<ButtonClicked<Increment>>,
    mut decrement_reader: EventReader<ButtonClicked<Decrement>>,
) {
    for event in increment_reader.iter() {
        query.get_mut(event.0).unwrap().0 += 1;
    }
    for event in decrement_reader.iter() {
        query.get_mut(event.0).unwrap().0 -= 1;
    }
}
```
### Event-Passing Architecture
This is the most common form of UI in other game engines. Both Unity and Godot have this system (in Godot's case they are called "signals"). They make for easy, low-boilerplate code and are what most users will be used to.
The event based system sounds nice but if every widget type has an accompanying system which handles incoming events, updates state and sends out new events for other widgets that use this widget, state updates could take multiple frames to fully propagate up.
We could solve this issue by condensing into a single UI stage, with a special executor.
Another problem with this design is that most widgets and their events will need type parameters, so that a `ButtonClicked` event doesn't set off every widget that uses a button on a random button press; but this makes tracking all the relevant event readers even more challenging.
One solution would probably be to have a custom `add_widget_system` method on the app builder, along with `UiEventReader`, `UiEventWriter` and `.add_ui_event`.
But this would segregate UI systems from the rest and cause weird errors when people accidentally use normal event readers instead.
If buttons are "dumb" and the event only reports which entity was clicked, then every system for that event type will receive and handle it. Handling will probably involve a hashmap that maps the button entity to its parent widget entity (which holds the state) plus a stateless enum variant saying what the button is for, processed in a match statement.
If the event is smarter and has a marker type parameter, and widgets create custom marker types that they attach to their buttons, then not only does the event go only to the intended widget, it is already pre-matched for its purpose. Instead of a hashmap mapping the button to its parent and purpose, we can simply store the parent in the button's event and skip the mapping altogether.
By @TheRawMeatball
**Look towards Elm for some ideas for this.**
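The two designs above can be contrasted in a small self-contained sketch (plain Rust, entities modeled as bare `u32` ids; all names hypothetical):

```rust
use std::collections::HashMap;

// Entities modeled as plain ids for illustration.
type Entity = u32;

// Design 1: "dumb" buttons only report their own entity; a registry maps
// the button back to its parent widget and its purpose.
enum Purpose {
    Increment,
    Decrement,
}

fn handle_dumb_click(
    clicked: Entity,
    registry: &HashMap<Entity, (Entity, Purpose)>,
    state: &mut HashMap<Entity, i32>,
) {
    if let Some((parent, purpose)) = registry.get(&clicked) {
        let counter = state.entry(*parent).or_insert(0);
        match purpose {
            Purpose::Increment => *counter += 1,
            Purpose::Decrement => *counter -= 1,
        }
    }
}

// Design 2: a "smart" event carries the parent entity directly and is
// pre-matched by a marker type, so no registry lookup is needed.
struct ButtonClicked<T> {
    parent: Entity,
    _marker: std::marker::PhantomData<T>,
}
struct Increment;

fn handle_typed_click(event: &ButtonClicked<Increment>, state: &mut HashMap<Entity, i32>) {
    *state.entry(event.parent).or_insert(0) += 1;
}

fn main() {
    let mut registry = HashMap::new();
    registry.insert(1, (10, Purpose::Increment)); // button 1 belongs to widget 10
    let mut state: HashMap<Entity, i32> = HashMap::new();

    handle_dumb_click(1, &registry, &mut state);
    assert_eq!(state[&10], 1);

    let event = ButtonClicked::<Increment> { parent: 10, _marker: std::marker::PhantomData };
    handle_typed_click(&event, &mut state);
    assert_eq!(state[&10], 2);
}
```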
#### Channels
Passing events back and forth via naive generic specialization creates a proliferation of types and systems, slowing down our compiler and scheduler.
This example shows the problem:
```rust
// One event type and one generic system instantiation per button marker:
// this is the type/system proliferation the Channel API aims to avoid.
struct ButtonClicked<T>(Entity /* entity of parent */, PhantomData<T>);
struct MyWidgetState(i32);

mod my_widget_buttons {
    pub struct Increment;
    pub struct Decrement;
}
use my_widget_buttons::*;

// Hypothetical per-marker system that detects clicks on the matching
// button and emits the corresponding typed event.
fn button_handler<T>(/* ... */) { /* ... */ }

fn handler(
    mut query: Query<&mut MyWidgetState>,
    mut increment_reader: EventReader<ButtonClicked<Increment>>,
    mut decrement_reader: EventReader<ButtonClicked<Decrement>>,
) {
    for event in increment_reader.iter() {
        query.get_mut(event.0).unwrap().0 += 1;
    }
    for event in decrement_reader.iter() {
        query.get_mut(event.0).unwrap().0 -= 1;
    }
}

fn main() {
    App::build()
        .add_event::<ButtonClicked<Increment>>()
        .add_event::<ButtonClicked<Decrement>>()
        .add_system(button_handler::<Increment>.system())
        .add_system(button_handler::<Decrement>.system())
        .add_system(handler.system())
        .run()
}
```
To get around this, @TheRawMeatball has been [working on](https://github.com/TheRawMeatball/bevy/tree/direct-event-channels) a `Channel` API.
This collects multiple events of the same type, allowing producers and consumers to differentiate which widget is being interacted with without requiring a unique type.
The proposed API wraps a `HashMap<Id, Events<T>>` and has two `SystemParam` interfaces:
1. `ChannelReader<T>`: This has only one method `iter(id: Id)`, allowing systems to read events in a channel.
2. `ChannelWriter<T>`: This can reserve new channels with `open() -> Id`, and send messages with `send(id: Id, msg: T)`.
The idea is to coordinate work by storing the channel `Id` in the components for both the sending and receiving widget, and then use other component data on the receiving widget to specialize the action taken appropriately.
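A very rough std-only approximation of these semantics (not the actual branch's code; names follow the proposal but the details are guesses):

```rust
use std::collections::HashMap;

// Many logical event streams of one type T, keyed by a lightweight Id,
// instead of one event type per widget. The real proposal exposes this
// through ChannelReader/ChannelWriter SystemParams.
type Id = u32;

struct Channels<T> {
    next_id: Id,
    queues: HashMap<Id, Vec<T>>,
}

impl<T> Channels<T> {
    fn new() -> Self {
        Self { next_id: 0, queues: HashMap::new() }
    }

    // Like ChannelWriter::open() -> Id: reserve a fresh channel.
    fn open(&mut self) -> Id {
        let id = self.next_id;
        self.next_id += 1;
        self.queues.insert(id, Vec::new());
        id
    }

    // Like ChannelWriter::send(id, msg).
    fn send(&mut self, id: Id, msg: T) {
        self.queues.entry(id).or_default().push(msg);
    }

    // Like ChannelReader::iter(id): read the events in one channel.
    fn iter(&self, id: Id) -> impl Iterator<Item = &T> {
        self.queues.get(&id).into_iter().flatten()
    }
}

fn main() {
    let mut channels: Channels<&str> = Channels::new();
    let save_button = channels.open();
    let quit_button = channels.open();
    channels.send(save_button, "clicked");
    // Only the widget holding `save_button`'s Id sees the event.
    assert_eq!(channels.iter(save_button).count(), 1);
    assert_eq!(channels.iter(quit_button).count(), 0);
}
```

Note this sketch sidesteps the hard parts the concerns below describe: double-buffered event cleanup, channel reclamation, and safe Id reservation inside the ECS.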
Implementing this has the following outstanding concerns:
1. It is blocked on the [Scheduler overhaul PR](https://github.com/bevyengine/bevy/pull/1144), via the follow-up [EventWriter](https://github.com/bevyengine/bevy/pull/1244).
2. The `Id` type would likely be relatively heavy, carrying an `Arc<AtomicU32>` so the channel can be dropped when it becomes inaccessible. Replacing the `Arc` with a lighter data structure that better matches the use case should help.
3. "I don't know how to safely implement the reserving system, as it seems to use unsafe and I'm not sure I fully understand how the EntityReserver works (if anyone can explain it, I'd really appreciate it)" - @TheRawMeatball
Discord messages on this topic begin [here](https://discord.com/channels/691052431525675048/743663673393938453/801717257927262208).
#### An Event-lite Architecture
@bd is experimenting with an alternate architecture, where widgets directly consume the data they need.
Details are still light, but you can find the discussion [here](https://discord.com/channels/691052431525675048/743663673393938453/801799227121860608).
### Deep UI
One of the core issues we have here is how to handle "deep" UI: where we have UI elements that talk to other UI elements and so on. This would be very common in e.g. resizing a window.
If we use the standard system + events approach, it can take as long as one frame per system for everything to resolve.
You can improve this by carefully ordering your UI systems with dependencies, but that's brittle and tedious.
If we have a special looping UI stage, we can make sure that this resolves before the frame completes.
If we implement some sugar for [event-dependent systems](https://github.com/bevyengine/bevy/issues/1272) we should be able to implement this quite easily for prototyping purposes.
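A toy model of that looping stage (plain Rust, no scheduler; event and handler names invented for illustration): keep dispatching until the event queue drains, so a resize fully propagates within one frame rather than across several.

```rust
// Events flowing through a "deep" UI interaction.
enum Event {
    PanelResized,
    ChildRelayout,
}

// Run all handlers against all pending events, feeding any newly produced
// events back in, until the queue is empty. Returns the number of passes,
// i.e. how many frames this would have taken without the loop.
fn run_ui_stage(mut events: Vec<Event>, handlers: &[fn(&Event) -> Vec<Event>]) -> usize {
    let mut passes = 0;
    while !events.is_empty() {
        passes += 1;
        let mut next = Vec::new();
        for event in &events {
            for handler in handlers {
                next.extend(handler(event));
            }
        }
        events = next;
    }
    passes
}

// A resize handler that emits a follow-up event for the panel's children;
// the child event produces nothing further, so the loop terminates.
fn panel_handler(event: &Event) -> Vec<Event> {
    match event {
        Event::PanelResized => vec![Event::ChildRelayout],
        Event::ChildRelayout => vec![],
    }
}

fn main() {
    // The resize and its knock-on relayout both resolve in one stage run.
    let passes = run_ui_stage(vec![Event::PanelResized], &[panel_handler]);
    assert_eq!(passes, 2);
}
```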
### Scheduled UI
But the looping UI stage has its flaws: you're calling a ton of systems that have no effect every time until the UI stack is empty.
Stages as of 0.4 can each have their own executor. We can use this feature to design a much smarter recursive scheduler for UI systems than running each system each frame.
If all of our UI widgets are event-responsive, we know that only systems that have unprocessed events of the appropriate type need to be run.
The UI stage's scheduler could queue all of the appropriate systems up (respecting their explicit dependencies), then wait for those to resolve. If new UI events have been generated, it can repeat this process.
By doing so, we avoid scheduling UI systems that we know aren't going to do any work, and make sure that the UI events have all been handled before advancing to the next frame (see below).
@alice
>What I think we actually want is a specially scheduled UI stage, where it only inserts the appropriate UI systems if there are unprocessed events
## UI Focus
Not all UI elements should read input at all times. In general, UI elements should be able to be focused in a toggleable fashion.
This is particularly common in games when menus are open, often pausing the game, and should play nicely with our existing State solution.
When handling mouse events, the problem with focus centers on overlapping elements.
With more complex UIs, we'll frequently have overlapping elements (including the game state).
In general, you want to dispatch only to the UI element on top.
Doing so requires a central handler to ensure coordination.
Currently, this is handled by [ui_focus_system](https://github.com/bevyengine/bevy/blob/09c15ea890c3d2d13e6238ec4fa45ea58901a450/crates/bevy_ui/src/focus.rs).
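The topmost-element rule can be sketched as follows (illustrative plain Rust, not the actual `ui_focus_system` code): among all nodes whose bounds contain the cursor, dispatch only to the one with the highest z-order.

```rust
// A UI node with a rectangle and a z-order; higher z means "on top".
struct UiNode {
    id: u32,
    x: f32,
    y: f32,
    w: f32,
    h: f32,
    z: i32,
}

// Central focus pick: of all nodes under the cursor, return the topmost.
fn topmost_under_cursor(nodes: &[UiNode], cx: f32, cy: f32) -> Option<u32> {
    nodes
        .iter()
        .filter(|n| cx >= n.x && cx < n.x + n.w && cy >= n.y && cy < n.y + n.h)
        .max_by_key(|n| n.z)
        .map(|n| n.id)
}

fn main() {
    let nodes = vec![
        UiNode { id: 1, x: 0.0, y: 0.0, w: 100.0, h: 100.0, z: 0 },
        UiNode { id: 2, x: 10.0, y: 10.0, w: 50.0, h: 50.0, z: 5 },
    ];
    // The small panel overlaps the big one, so it wins where they overlap.
    assert_eq!(topmost_under_cursor(&nodes, 20.0, 20.0), Some(2));
    assert_eq!(topmost_under_cursor(&nodes, 90.0, 90.0), Some(1));
    assert_eq!(topmost_under_cursor(&nodes, 200.0, 200.0), None);
}
```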
## Responsiveness
If we use a UI stage, it should occur near the start of each frame to minimize input lag.
### Dedicated UI thread
We don't want to block input on game state: if the game itself is lagging, or waiting on vsync, UI events should still be processed as soon as possible.
Windows will only pass input events to the main thread. The typical pattern in Windows GUI apps is for the main thread (aka the UI thread) to be very quick in its callbacks and to send work off asynchronously so the application doesn't hang.
With a specialized UI stage (see **Deep UI**), moving execution to its own thread would be relatively simple if we had a way to run stages asynchronously.
This dovetails with discussions on [triggered systems](https://github.com/bevyengine/bevy/issues/1273).
# Possible UI Solutions
## Existing UI Libraries
### List of existing Rust UI APIs
[Rust GUI: Introduction, a.k.a. the state of Rust GUI libraries (As of January 2021) - DEV Community](https://dev.to/davidedelpapa/rust-gui-introduction-a-k-a-the-state-of-rust-gui-libraries-as-of-january-2021-40gl)
### Stretch
https://github.com/vislyhq/stretch
This is the layout system used by the "official" Bevy UI implementation (where any official changes will happen). However, a number of alternatives are being considered (some have been built already, some require initial prototyping, some require conversations with project owners).
We are currently surveying the space and might decide to mostly stick with the current `bevy_ui`, but there's also a good chance we will make drastic changes.
Due to current circumstances, we MIGHT have to vendor `stretch`, since the maintainers haven't updated it in a while.
### Egui
Egui is a popular existing Rust GUI library used for "easy" UI implementation.
[emilk/egui: an easy-to-use immediate mode GUI in pure Rust](https://github.com/emilk/egui)
Already existing bevy egui plugin:
[mvlabat/bevy_egui: A plugin for Egui integration into Bevy](https://github.com/mvlabat/bevy_egui)
There’s even an existing web implementation using WebGL2
https://github.com/mvlabat/bevy_egui_web_showcase
### imgui-rs
Rust bindings for Dear ImGui, a very popular C++ imgui library; however, Bevy has a strong preference for Rust-native crates.
### MegaUI
There isn’t much information about MegaUI, other than the fact it's part of the Micro/Macroquad ecosystem. [not-fl3/megaui](https://github.com/not-fl3/megaui) More information about the Macroquad suite can be found [here](https://github.com/not-fl3/macroquad/)
### Druid
> A data-first Rust-native UI toolkit.
https://github.com/linebender/druid
Created by Raph Levien, spun out of the now-defunct xi editor as a way of doing cross-platform GUI. Raph's blog is also one of the most authoritative sources on what it takes to build a GUI in Rust. See:
https://raphlinus.github.io/rust/druid/2019/10/31/rust-2020.html
Druid seems heavily inspired by Flutter. It's not clear how ready Druid is as a complete framework; most of the work seems to be going into piet, a GPU-accelerated rendering foundation.
The druid Zulip instance also hosts rust GUI chat
https://xi.zulipchat.com/
### OrbTk
> The Orbital Widget Toolkit is a cross-platform (G)UI toolkit for building scalable user interfaces with the programming language Rust. It's based on the Entity Component System Pattern and provides a functional Reactive-like API.
https://github.com/redox-os/orbtk
OrbTk also uses an ECS for its UI, which is quite similar to what we are trying to do.
### Iced
> A cross-platform GUI library for Rust focused on simplicity and type-safety. Inspired by Elm.
(Created by Discord user @lone_scientist)
https://github.com/hecrj/iced
This one is quickly gaining steam in the Rust community. Veloren is currently trying to transition their UI over to this.
This also works on the web using WASM; check it out here:
[Welcome - Iced](https://iced.rs)
#### Lessons
> Iced dropped `stretch` (Editor's note: the same lib used by Bevy) in favor of a custom layout engine a while back: https://github.com/hecrj/iced/pull/52 it might be helpful and interesting to read about why they took that approach.

This was replaced by a layout methodology taken from Druid, as the flexbox model seemingly created performance concerns.
It's important to keep in mind that iced used an older version of `stretch`, which could have been the source of the performance issues.
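For reference, the Druid-style layout protocol mentioned here is roughly "constraints go down, sizes come up". A minimal sketch (names modeled loosely on Druid's `BoxConstraints`; details invented for illustration):

```rust
// Single-pass layout protocol: the parent passes min/max size constraints
// down, and each child returns the size it chose within them.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Size {
    width: f32,
    height: f32,
}

#[derive(Clone, Copy)]
struct BoxConstraints {
    min: Size,
    max: Size,
}

impl BoxConstraints {
    // Clamp a desired size into the allowed range.
    fn constrain(&self, desired: Size) -> Size {
        Size {
            width: desired.width.clamp(self.min.width, self.max.width),
            height: desired.height.clamp(self.min.height, self.max.height),
        }
    }
}

// A leaf widget picks its preferred size within the parent's constraints
// (8px per character is a stand-in for real text measurement).
fn layout_label(bc: &BoxConstraints, text: &str) -> Size {
    let desired = Size { width: text.len() as f32 * 8.0, height: 16.0 };
    bc.constrain(desired)
}

fn main() {
    // Parent offers up to 100x100 logical pixels.
    let bc = BoxConstraints {
        min: Size { width: 0.0, height: 0.0 },
        max: Size { width: 100.0, height: 100.0 },
    };
    // Short text fits; long text is clamped to the max width.
    assert_eq!(layout_label(&bc, "hello").width, 40.0);
    assert_eq!(layout_label(&bc, "a considerably longer label").width, 100.0);
}
```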
### Flutter (Dart)
Flutter is a relatively modern cross-platform native UI framework created by Google, written in Dart, and seemingly inspired by React.
Its relative modernity means it has been able to incorporate some of the latest thinking in UI. Also, operating at Google scale, it's a good reference point on how to implement an entire framework from scratch.
Native mobile doesn't map perfectly to games (e.g. most games don't care about having platform-native-looking UI), but games **DO** care about platform-native UI performance.
Flutter also seems to be a strong inspiration for Druid.
https://flutter.dev/docs/development/ui/layout
### Conrod
https://github.com/PistonDevelopers/conrod
This UI library is basically defunct. The only major project that uses Conrod is Veloren, and they are trying to transition to Iced.
### The prototypes Cart is planning on building
### @TheRawMeatball's prototype anchor UI
### @Cole Poirier's attempt to port Elm to Rust
He wants to implement the bevy_ui evaluation example with his Rust Elm implementation. The first step is to port https://package.elm-lang.org/packages/mdgriffith/elm-ui/1.1.8/ to Rust, then refine it so it has Rusty ergonomics.
## Existing Constraint Algorithms
### Cassowary
https://docs.rs/cassowary/0.3.0/cassowary/
This is a constraint solving algorithm.
### Crochet
https://github.com/raphlinus/crochet
Not a constraint solver, but an experimental reactive UI architecture by Raph Levien, based on his [reactive UI work](https://raphlinus.github.io/ui/druid/2019/11/22/reactive-ui.html).
## Non-Rust implementations
### Unity (C#)
Unity's first GUI was a bespoke Unity authored `imgui` (referred to as `OnGUI` for disambiguation from other `imgui` libraries) for both editor and runtime.
At runtime this was a major source of frustration for many developers: `OnGUI` often had poor performance beyond even marginal complexity. Styling was based on serialized `GUIStyle` assets that would be passed into `OnGUI` functions; however, creating unique-looking UI (important in games) or bespoke functionality was difficult.
For the Unity editor, consistency is important and performance is less of a concern. These concepts map well to imgui, and indeed much of Unity's editor today is still powered by `OnGUI`. However, it's worth noting Unity has needed to add conventions to facilitate more stateful operations, namely serialization (persisting state) and undo/redo (rewinding state). These concepts layer additional complexity onto the end user trying to write successful editor UI code: imgui often isn't enough to answer the whole story by itself.
One of the most common editor use cases is to embellish default editor behavior: in other words, you don't want to replace what the editor is doing, just add a bit extra for efficiency. Generally, achieving this with custom editor code involved a fair degree of boilerplate. However, it remains true that Unity's editor extensibility is one of its selling points.
Later, based on the limitations of `OnGUI`, Unity created a bespoke retained-mode UI (`UGUI`, or UnityGUI) for runtime. This was much closer to a traditional OOP display-graph GUI as seen in Flash/.NET/Java, with class-based widgets storing state. For the end user, the biggest win was being able to edit UI in the editor, as `OnGUI` was runtime-only. This was a huge win for customization. Modification and custom elements also became easier, as there was now an API to create your own widgets. It was also possible, for the first time, to render UI on 3D canvases in the world.
The library also did a good job of separating input capture into separate modules. In general this concept has survived well: mouse pointers, touch devices, and VR devices have all been supported evolutionarily by swapping out modules, without needing to modify UI code.
Less successful were the layout routines. The esoteric layout strategy (designed to ride on top of Unity's existing transform hierarchy) has historically made it very difficult to get dynamic UIs to work well and lay out the way the author expects, a constant source of pain. And even though `UGUI` was an order of magnitude performance gain over `OnGUI`, it still struggled with "typical" UI performance bottlenecks, e.g. drawing large lists or dynamically modifying content.
More recently Unity has created `UIElements`: a Unity-flavoured HTML+CSS-like approach based on flexbox (under the hood this is believed to use Yoga, React Native's layout engine). It's still too early to evaluate its success, but the long-term goal is for this to be the blessed path for both the runtime and the editor. Most new editor features are built on this new framework.
As a developer this offers many wins. The markup approach means UI can be hot-reloaded at runtime, yet UI can still be dynamically generated from code (the markup elements are backed by class-based components). So far this approach has been well received by the community. It's worth noting that none of the Unity frameworks are opinionated about how state should flow through the application; this is mostly up to the end user.
## The Virtual DOM
> with custom readers, we could maybe somehow tell the executor not to run this system if none of these readers have anything to read
> Which sounds like free/optimized diffing of the 'vdom', were we in browser land
https://reactjs.org/docs/faq-internals.html
# General Comments
@msiglreith:
> Regarding stretch, I moved away from flexbox as it's powerful but also kinda hard to predict (might be also due to my lack of experience back then with UI layouting in general). So I went further to a variant of druid layout system, which is also adopted by iced. It's quite nice and fits well into a multi-pass UI approach. My goal for an UI was to build tools and be useable in games as well, so easy to customize, integrate into the game world and also quick to setup without building the rest of the application around it. druid is on one end of the spectrum by being completely retained. I moved towards a custom approach where the passes could be either retained or immediate (logic/state non retained, only layout data) and then to fully immediate, basically dropping the layout system completely.
> Overall, the user group is important (e.g automatic vs manual layouting?) and the application while working with the constraints given by Rust.
@ncallaway:
> We should take into account device-scaling from the early stages of UI development. `bevy` already has an issue with DPI scaling on different devices (https://github.com/bevyengine/bevy/issues/195). `bevy_ui` should set a good foundation in its core for device independence, with defaults that work across a wide range of devices.
> As part of this, we should take a thoughtful approach as to what units we want to present in `bevy_ui`. I would strongly encourage `bevy_ui` to default to logical pixel of some kind (in the same way CSS defines a pixel as 1/96 in), while leaving physical pixels as an available unit.
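A tiny sketch of the logical-to-physical pixel split being suggested (assumed semantics: UI code works in logical pixels, and only the renderer multiplies by the window's scale factor, e.g. 2.0 on a typical "retina" display):

```rust
// Convert a logical-pixel length to physical pixels for the renderer.
fn to_physical(logical: f32, scale_factor: f32) -> u32 {
    (logical * scale_factor).round() as u32
}

// Convert a physical-pixel length back into logical pixels for UI code.
fn to_logical(physical: u32, scale_factor: f32) -> f32 {
    physical as f32 / scale_factor
}

fn main() {
    // A 100-logical-pixel button is 200 physical pixels at 2x scale...
    assert_eq!(to_physical(100.0, 2.0), 200);
    // ...and 150 physical pixels at 1.5x scale.
    assert_eq!(to_physical(100.0, 1.5), 150);
    // Round-tripping recovers the logical size.
    assert!((to_logical(150, 1.5) - 100.0).abs() < 1e-6);
}
```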
An explanation of Elm UI from @Cole Poirier
> Elm's model is interesting but to me is irrelevant to this discussion as I feel the bevy ecs system takes care of events and state changes elegantly. What I would suggest looking at it is the specific elm library elm-ui https://github.com/mdgriffith/elm-ui/. It's a layout system that replaces html with function calls instead of xml-like tags, and replaces css with inline styling. One commenter on my post about this in #general said that this is the same as using inline css in html and javascript; this is not the case. What is unique about elm-ui's styling is that it has styling that is designed to do one thing only and do it intuitively. For example, in css there are multiple parameters that affect the alignment of items inside a container as well as their spacing, in elm-ui it simply has align{Left, Right, Top, Bottom} and spacing (value in logical pixels). This removes all of the complexity from css making it useful for styling things and having them do what you want without having to memorize the hundreds of pages of the css specification and knowing all the gotchas and unwanted interactions between multiple settings. This to me seems like a very rusty approach whereas css is very javascripty. To me css is essentially a failed experiment, yet remains the way is forced to do styling and layout; but in a way that feels unstable and unpredictable while building a UI, and is likely to make you want to tear your hair out. If you'd like, I'd be happy to go over elm-ui, and how I think it could be easy to implement in bevy, and why I think its the best and most powerful approach from an ergonomics and a technical point of view. (edited to spelling and clarity)