# Creating a Tools Pipeline for Horizon: Zero Dawn
https://youtu.be/KRJkBxKv1VM?t=5
So, my name is Dan Sumaili. I'm a tools programmer at Guerrilla Games, and together with my colleague Sander van der Steen, I'm going to talk about how we rebuilt our tools pipeline during the development of Horizon Zero Dawn.
For those of you who might be unfamiliar with the game, let's take a quick look at a trailer.
---
https://youtu.be/KRJkBxKv1VM?t=47
All right. So now everybody knows exactly what the game is about. Let's continue.
Prior to creating Horizon we worked on Killzone, which is a series of first-person shooters, and when we were going to make the jump from building shooters to creating the vast open world of Horizon, it was going to be a big challenge for all disciplines within our company, and the tools team was no exception.
We'll cover a few different areas of interest, but I guess the main takeaways of this talk are that you'll get a bit of insight into why we decided to rebuild our tools, learn a little bit about how we approached doing this while we were already in production, and we'll show a few ways in which we think it paid off.
So if you're somebody who is responsible for maintaining a legacy toolset, or you're just somebody who uses one on a daily basis, this will hopefully inspire you to do something about that situation.
---
https://youtu.be/KRJkBxKv1VM?t=95
So, to begin with, I'll talk a little bit about the tools that we used on Killzone Shadow Fall, followed by a bit of background on the process of actually rebuilding our toolchain. Then we'll go into some detail about our level editor, or 3D editor, and how actual data editing works. If there's time, I'll cover a few other tools we have, briefly share a couple of learnings from our projects, and a bit about our future plans.
---
https://youtu.be/KRJkBxKv1VM?t=119
So at the start of Horizon we had a pretty fragmented toolset. It was built upon a mixture of different languages and frameworks, and we had very long iteration times. In a really bad case, the time between a designer making a change and actually being able to see it in-game could be 20 or 30 minutes.
And to be frank, we had a lot of pretty unhappy users. At this time Unity and Unreal were becoming much more popular and visible, so there was some pretty strong dissatisfaction among our users when they compared the workflows and features of Guerrilla's tools to the likes of Unreal and Unity.
---
https://youtu.be/KRJkBxKv1VM?t=156
So, just a quick technical detail about Guerrilla: we use an intermediate storage format called Core Text, and we use it for all of our game data, almost everything except for textures and audio. Before the data is loaded, we transform it into an optimized binary representation through a process that we call conversion.
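To make the conversion idea concrete, here is a minimal sketch of a text-style resource being packed into a loader-friendly binary blob. The structure, field names, and layout are invented for illustration; this is not Guerrilla's actual Core Text format or conversion code.

```cpp
// Hypothetical illustration: an intermediate (text-style) resource definition
// being "converted" into a packed binary layout at build time.
#include <cstddef>
#include <cstdint>
#include <string>
#include <vector>

struct SpotLightResource {            // as it might appear in the intermediate form
    std::string name;
    float       coneAngleDegrees = 45.0f;
    float       intensity        = 1.0f;
};

// Conversion step: produce an optimized binary representation for the runtime.
std::vector<uint8_t> Convert(const SpotLightResource& res)
{
    std::vector<uint8_t> blob;
    auto append = [&blob](const void* data, std::size_t size) {
        const uint8_t* p = static_cast<const uint8_t*>(data);
        blob.insert(blob.end(), p, p + size);
    };
    const uint32_t nameLen = static_cast<uint32_t>(res.name.size());
    append(&nameLen, sizeof(nameLen));              // length-prefixed name
    append(res.name.data(), nameLen);
    append(&res.coneAngleDegrees, sizeof(float));   // raw floats, ready to load
    append(&res.intensity, sizeof(float));
    return blob;                                     // the pipeline would write this to disk
}
```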
---
https://youtu.be/KRJkBxKv1VM?t=176
So the primary editor that we had for Core Text was the Core Text Editor. Here's an image of the thing in all of its glory; it's pretty old-school. The Core Text Editor displays the contents of a Core Text file as a 2D graph: the objects, or resources, within the file are represented as nodes on the canvas, and references between those objects are represented as wires.
---
https://youtu.be/KRJkBxKv1VM?t=197
So we used this to create and configure a lot of our game data, but in terms of actual data editing it's pretty low-level. It's basically one step removed from editing your Core Text files directly in a text editor, which, incidentally, is how a lot of the configuration for the earlier Killzone games was actually done. So much of the user's energy was spent on the busywork of creating and understanding these complicated resource structures instead of being free to focus on the actual creative process.
The tool was written in C# and WinForms and a Sony framework called ATF, which was recently open-sourced. Additionally, the code base for this tool was fantastically complicated; it required a lot of effort to make even trivial changes.
So there was only a handful of programmers in the company who could actually even work on it, and there was basically nobody who enjoyed working on it at all.
---
https://youtu.be/KRJkBxKv1VM?t=242
The next major piece of our pipeline was Maya, specifically our in-house Maya plugin. The plugin allows us to edit our game environments inside of Maya and then export them into Core Text. It uses DECIMA's renderer in the viewport, so you get an accurate representation of how your assets are going to look in the game. However, it doesn't run the entire game, and it also leaves a lot of the game systems in an uninitialized state, which means that you miss a lot of the things that actually bring life to the game world.
This was built in C++ and Python and also uses the Maya API very heavily, and much like the Core Text Editor, there was only a handful of people who were actually able to work on it, and also willing to work on it.
---
https://youtu.be/KRJkBxKv1VM?t=278
So the last piece was a state machine editor called SMED. SMED is a tool that generates Lua script, which is then executed in game, and it was used by designers for much of the level scripting logic within Killzone.
---
https://youtu.be/KRJkBxKv1VM?t=289
SMED began life as a prototype. It was created by a technical designer to see if it would be possible to produce a visual script editor for our game. The deal was that if the prototype worked out and was proven to be okay, then the tools team would take ownership of it and make it into a real tool, and we would go forward from there. But even though the prototype was successful, that transfer of ownership never really happened. At some point the tool gained critical mass, and the technical designer who made the original prototype was now responsible for maintaining an important part of the toolchain. And this tool was written in Python.
---
https://youtu.be/KRJkBxKv1VM?t=322
So with this setup, a designer who's aiming to accomplish a simple task is forced to navigate this sort of Bermuda Triangle of lost productivity, flipping between these three different applications and manually copying string identifiers, file names, and values between them. The tools also didn't talk to each other, so mistakes were only indicated way after the fact, usually by some kind of impenetrable error message when you try to load the game, which is not ideal.
---
https://youtu.be/KRJkBxKv1VM?t=348
There were many other tools and utilities that were part of the daily Guerrilla workflow, but there isn't really time to go into them here.
---
https://youtu.be/KRJkBxKv1VM?t=355
One kind of tangential thing is that during Shadow Fall, possibly out of sheer frustration with the way the tools pipeline was set up, the debug UI was created. The debug UI was an in-game user interface framework. It's a pretty feature-rich framework, but most importantly for what I'm talking about right now, it was something the programmers found very easy to use, and because of that we ended up with a large number of very useful in-game tools. I'm going to show one of those quickly just so you can get an idea of what I'm talking about.
---
https://youtu.be/KRJkBxKv1VM?t=381
So this is our in-game GPU profiler. You can see it displaying timing values for various draw calls during a frame, and it allows you to jump to different points in the frame to examine the render targets or the back buffer; you can also see the inputs for the different samplers and so on. It has an X-ray view which you can move over the scene to get an idea of which intermediate buffers are contributing to the final image and what they look like at that point.
---
https://youtu.be/KRJkBxKv1VM?t=412
So that covers a little bit about the tools and some of the technology, but what about the game itself? When starting out, we knew that Horizon was going to be very different to Killzone. We hadn't made an RPG before, we hadn't made an open-world game before, and aside from that, this game was going to include new types of content which we just hadn't built before, like quests and interactive conversations. Additionally, the tools team at Guerrilla is relatively small, so depending on how you counted, we had around four to six full-time tools programmers for a team size of about 220 people. So we had to be really efficient about how we used our time and resources.
---
https://youtu.be/KRJkBxKv1VM?t=445
All things considered, it was pretty clear that our existing tools pipeline would not easily scale to meet the demands of Horizon, so we decided to start again.
---
https://youtu.be/KRJkBxKv1VM?t=454
So at this point we were ready to define some initial goals for the new toolchain. We decided to start by building a tools framework which was going to act as the basis of all of our future tools development. This was intended mainly to mitigate the arbitrary linguistic and technological differences between the tools and to enforce some consistency between all of them.
DECIMA, which I'll cover a little bit later on, is our in-house engine, and we wanted to leverage as much of it as possible while building the new tools.
And lastly, the debug UI had shown us that if it's easy to do so, programmers will happily make tools for the game systems. So by using the engine libraries and keeping the editor code simple, we would increase the chances of getting buy-in from other teams and, most importantly, help from other teams.
---
https://youtu.be/KRJkBxKv1VM?t=496
In terms of our goals for an editing environment, we wanted to make sure that we would build tools that would fit our content creators' needs. To that end, we decided to build specific editors for each discipline, which we would call editing contexts. We would build them from reusable components, so that we could then create new ones for other teams as they needed them, and these editing contexts would sit inside a single application where the content creator would be able to do all their work. This would become the integrated game development environment which we now call the DECIMA Editor.
---
https://youtu.be/KRJkBxKv1VM?t=528
Before continuing, I want to provide a little bit of background on DECIMA itself. DECIMA, as I said, is our in-house game engine. Most recently it was used for Killzone Shadow Fall, RIGS: Mechanized Combat League, and Horizon Zero Dawn, and it is being used for the upcoming Kojima Productions title Death Stranding.
Like most engines, it's separated into layers. At the base we have platform-independent game systems just above the OS; this insulates the layers above from the operating system, which makes it easier to port between different platforms. It's also important to note that our engine has always had a full PC build for development purposes. Next is the core layer, which has things like rendering and mesh primitives, lights, particles, and so on, and then game code on top of that, which adds more specific game features such as entities, crowds, and cinematics. We wanted to put the tools framework on top of that stack so that we would have access to everything below.
---
https://youtu.be/KRJkBxKv1VM?t=575
DECIMA is a very mature codebase, and it contained many libraries that would be useful for tools development; here you can see the kind of things that we were interested in taking advantage of. One of the most important features was the engine's RTTI system, a runtime type introspection system, which is exactly what we need to be able to manipulate data in an editor. So aside from getting a lot of well-written code for free, using the DECIMA engine libraries would also mean that the editor code would present a lower barrier to entry for the other programmers in the company when compared to the Core Text Editor, SMED, and the Maya plugin.
DECIMAは非常に成熟したコードベースで、ツール開発に有用なライブラリが多数含まれています。
最も重要な機能の1つは、エンジンが私たちのTTIシステムやランタイム型イントロスペクションシステムを超最適化したことです。これは、まさに「エディタ」でデータを操作するために必要なものです。
DECIMAエンジンのライブラリを使うことで、よくできたコードをたくさん無料で手に入れることができるのはもちろんですが、「Core Text Editor」や「SMED」、「Mayaプラグイン」と比べて、社内の他のプログラマーの参入障壁を低くすることができるという意味でも、私たちの「Edit」のコードは有効だと思います。
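As a rough idea of why runtime type introspection matters for an editor, here is a minimal, hypothetical reflection setup. DECIMA's actual RTTI system is certainly richer than this, and all names below are invented; the point is only that per-attribute metadata lets generic tools (attribute editors, diff/undo, serialization) read and write members by name.

```cpp
// Hypothetical reflection sketch: enough metadata for generic editor code to
// modify an attribute without compile-time knowledge of the concrete class.
#include <cstddef>
#include <cstring>
#include <string>
#include <vector>

enum class AttrType { Float, Int };

struct AttributeInfo {
    const char* name;
    AttrType    type;
    std::size_t offset;      // byte offset of the member within the object
};

struct TypeInfo {
    const char*                name;
    std::vector<AttributeInfo> attributes;
};

struct SpotLight {
    float coneAngle = 45.0f;
    float intensity = 1.0f;

    static const TypeInfo& GetTypeInfo() {
        static const TypeInfo info{
            "SpotLight",
            { { "ConeAngle", AttrType::Float, offsetof(SpotLight, coneAngle) },
              { "Intensity", AttrType::Float, offsetof(SpotLight, intensity) } } };
        return info;
    }
};

// A generic setter an attribute editor could call for any reflected type.
bool SetFloatAttribute(void* object, const TypeInfo& type,
                       const std::string& attrName, float value)
{
    for (const AttributeInfo& attr : type.attributes) {
        if (attrName == attr.name && attr.type == AttrType::Float) {
            std::memcpy(static_cast<char*>(object) + attr.offset, &value, sizeof(float));
            return true;
        }
    }
    return false;
}
```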
---
https://youtu.be/KRJkBxKv1VM?t=604
So, to fast-forward a little bit, here's the DECIMA Editor as it exists today. This is our level editor context, which is a 3D viewport onto the game, with an asset browser at the bottom, a hierarchical outliner on the left, and an attribute editor on the right: pretty standard stuff that you would expect to see.
---
https://youtu.be/KRJkBxKv1VM?t=618
This is a shot of our visual script editor, which is used for almost all of our in-game, designer-driven logic. It has a 2D diagram renderer which displays the logic graph of the script, and on the right you can see it uses the same attribute editor as the previous slide.
---
https://youtu.be/KRJkBxKv1VM?t=632
And finally, a shot of our conversation editor, or dialogue editor. It uses the same diagram renderer as the last view, but to show the logic graph of the conversation, and this time there are two new components: on the bottom left there is a timeline for positioning things inside of a sequencer, and on the bottom right a screenplay editor for the writers who work on the in-game dialogue.
---
https://youtu.be/KRJkBxKv1VM?t=653
So we had established some initial goals, and we actually needed to start moving at this point. But the thing is, Killzone was already finished, and aside from a few people who were remaining on the project to support multiplayer and DLC, everybody was moving on to Horizon. So production was already starting, and despite the fact that there were many issues with the Killzone toolchain, it did work: you could make a game with it, and the content creators were familiar with those tools. They didn't necessarily even want anything to change, which seems illogical, but that's just how it was. We also had no way to gauge how successful we'd be in this endeavor. So we were going to have to start really small and minimize risks; we couldn't disrupt the production of the game, and therefore we would need to continue supporting the old tools while we did this drive for new stuff.
---
https://youtu.be/KRJkBxKv1VM?t=696
So, with that in mind, we decided to focus on a single team, which was our audio team. At this time it was a pretty small group of users, around four people, and they have a very specific set of needs for their workflow. By essentially porting this workflow from the Core Text Editor into the fledgling DECIMA Editor, we would have an isolated test case for the viability of the framework as a whole.

Our audio team works primarily with node graphs, and node graph is a high-performance visual scripting language that we developed in-house. It generates C++, by the way, which runs natively on the target. This was something we used very heavily in Shadow Fall, and it was being proposed as a replacement for SMED and Lua, so if we were successful in porting the audio team's workflow, it would later have benefits for other teams in production as well. If you're interested in learning more about this technology, there's a talk from GDC 2014 by Andreas Varga and Anton Woldhek where they go into some detail about how our audio system works.

So, as a reminder, this is the thing that we were trying to replace, and this is our first prototype, which we built in C++ and Qt. It used Qt's QPainter class for the diagram display and the nodes, with a similar visual style to the old editor. At this point it wasn't very usable; it was basically just a read-only display for node graphs that already existed. We also had some performance issues with Qt, so we decided that, now that we had access to DECIMA's renderer, we could rewrite the diagram rendering using that. That allowed us to write the diagram renderer so that it draws a single quad and uses some pixel shader magic to draw the entire thing. It was super fast and could handle huge amounts of data. It wasn't very useful yet, though, because you still couldn't edit or save or anything, but it looked pretty good. And this is what we have today, skipping forward a few steps.

Once we added an attribute editor and a way to browse for files, save them, edit them, and so on, we managed to convince one or two members of the audio team to just try it out. This was a really, really painful process, actually: it required many iterations, and every time they came up against something that wasn't possible in the new editor, they would immediately go back to using the old tools. But we focused a lot of our energy on implementing the features they were missing, eventually achieved feature parity with the old editor, and these guys started working in the DECIMA Editor every day. So it was our initial test case, and it worked out pretty well.

We reached feature parity with the old editor much more quickly than we anticipated, and because we shed the onerous complexity of the old editor, we could implement new features very easily. Most importantly, programmers from other teams were comfortable enough with the code that they were able to contribute some pretty awesome features, but I'll come back to that later. For now, I'll hand over to Sander.
Okay, thank you, Dan. So the node graph editor gave the tools team the necessary confidence to pick up the next, larger project, and that would be creating a level editor, or 3D editor as we call it. In the next section I'll explain some of the early design decisions that led to this new editor.

Why create a 3D editor at this stage, you might wonder? As Dan mentioned, we were already in production, so this is fairly ambitious. Why did we not simply invest more in our Maya plugin? The first reason was that Horizon Zero Dawn needed a lot of quest content, which is a somewhat new content type coming from the Killzone series, and the game design workflow that we had in place in Maya was really the worst part of our Maya plugin in terms of usability; Maya is simply not designed to do any level editing. Secondly, a 3D editor like Unreal or Unity have would eventually benefit our other workflows as well. It is a formidable task, and if we hadn't started it back then, we would never have been able to finish it in time for Horizon in the first place. And finally, the game design team offered a very achievable starting point for us to work on a 3D editor: if we could facilitate only object placement in the game, we would already provide a workflow improvement for the game design team.

So now that we've established why we wanted this 3D editor, let's have a look at how we can actually achieve it.
looking at what we have available a tools framework in its infancy and a
game executable we are immediately presented with the first choice do we
integrate the desired games this game subsistence into our Editor as
illustrated on the left or do we actually merge the full game into our
Editor environment so let's let's zoom into using only game subsystems so this
is the advantage of being able to tailor make your game loop based specifically
on your Editor needs this leads to a very controllable viewport
skipping I want to gain subsystems however doing this will introduce a
second code path where these systems will be used leading to code complexity
and potentially disparity between what you'll see in your Editor versus what
you see in the final game because more code is required it will also take
longer to develop so the alternative then is integrating the full game in the
Editor game systems would truly function as they would in game and as an added
bonus we can use the earlier discussed debug UI for free to give us a head
start we would we'd also be able to play in the other and just like we would be
able to do in game as a disadvantage though it does introduce more overhead
as a lot of game subsystems are usually not required for certain editing
operations an example of this would be a weather system it's typically not very
useful for the game to be raining when you're actually just placing an object
it's both for hindrance for performance and usability so given the time
considerations that we had at hand remember that we were already in
production we just really felt that integrating the full game would be our
best option so now that we want the full game in running in the Editor how do we
We're faced with another decision: do we run the game in the same process as the editor, as we depict on the left here, or do we attach the editor to the game executable, using an out-of-process solution? Again we examined our options. An in-process game is faster to set up and easier to debug, because all data changes are direct and synchronous. A crash in the game is problematic, though, as it will inevitably take down the editor, which will cause loss of work and lots of frustration. It also makes it easy to modify in-game data directly, as you can access game objects via pointers; over time you could end up with a situation where many different systems in the editor access different systems in the game. We felt that this would lead to an unmanageable situation, and enforcing the separation, we figured, would be hard, because due to time constraints it is ever so tempting to just directly access a game system.

An out-of-process solution, on the other hand, offers more stability, primarily because you can recover from a crashing game, since the game is a separate process. Coders are forced to work in a clean way, as the editor cannot directly modify game objects via pointers. Finally, UI updates are not tied to the frame updates in the game, so your UI will remain responsive even when the game might not be. The downside is that you have to write a communication layer, resulting in a longer initial development time. This communication layer would also be asynchronous, which makes it quite tricky to debug.

Similar to the decision on integrating the full game, we went for an in-process solution because of the time constraints and pressure from production. We did decide that an out-of-process solution should always remain an option later on, and so we wanted to ensure code separation. To enforce this separation, we developed the 3D editor as if the game were running in a separate process: no data was to be shared between the editor and the game. Because of this separation, which we quite strictly enforced, I'll continue to use the word "process" to describe the editor process, meaning code executing in the editor, and the game process, meaning code executing in the game.
Anyway, luckily for us, making these early quick gains paid off quite well. I have a video to show you here from roughly a year after starting the 3D editor, and it demonstrates how a game designer uses the ability to play in the editor in their workflow. To start off, the designer explores some section of the world in play mode, decides this is a nice spot for an encounter, and goes into edit mode. Now, there are some trees in the middle here, so we're going to paint them out to clear the area a bit, and we also want to paint some more trees in the surroundings, so we'll just do that. Then, in the center of the screen, where we want to place this encounter, we want to flatten the terrain out a bit, so we'll use some terrain sculpting magic for that. So now it looks like we can actually place our encounter: we use the content hierarchy on the left to create an encounter and give it a name, create it in the middle, and use the asset browser, also on the left, to create some robots. Now I have two robots, and we want to add a rock in the middle to provide some player cover, perhaps. I'm not a game designer, so I wouldn't really know, but we just use the tools to drag and drop the rock into place, save it, and try it out. So we're immediately back in play mode, and, yeah, Aloy can shoot some robots and see if this actually works out. Pretty sweet.
With our high-level design now discussed, let's look into some subsystems that we required for this 3D editor to actually work. In this section we'll also see how we deal with the separation between editor code and game code.

A problem that you encounter quite early on when diving into this solution is that the data that is in the game is not necessarily the same as the data that is on disk. This is caused by the process called conversion that Dan explained earlier; examples of what can happen in conversion are shader compilation, mesh optimization, et cetera. Having this optimized game data is a fairly common situation in games, but it poses a problem for editing, as information is lost: saving out Core Text based objects that are read by the editor is not always possible once you're in the binary game data. Luckily, we had already stated that we did not want the editor to operate directly on the data in the game. As a result, the editor works on the Core Text counterpart, if you will, of the loaded binary data in the game. Each process therefore has its own copy of the data, and it cannot access the copy of the other process.
So data in the editor is not identical to, nor shared with, data in the game. A central concept in dealing with this problem is something that we call the editor node. Editor nodes provide all editing functionality for an object in the game. An editor node knows about a game object, but this is a one-directional relationship, which is a convenient way of keeping the game code clean: the editor node lives in a separate project and code file from the game, so the game code stays very clean; there's no #ifdef EDITOR in our code base.

A second problem caused by the game operating on converted data is hierarchy. For performance reasons, the game has little hierarchy of objects left, resulting in a long, flat list of objects that are all in world space. This is very fast and efficient for rendering, but not very good for editing, as we often want to edit in local space. So, in order to restore some hierarchy of objects for editing, we build a hierarchical tree structure of these editor nodes in the game. This is done once the game is loaded and the user enters edit mode; the tree is constructed top-down, starting at the content root, and reflects the content that is currently loaded in the game.
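As a rough sketch of the one-directional relationship described here, the following hypothetical C++ shows an editor node that knows its game object while the game object knows nothing about the editor. The class and member names are invented for illustration and are not the DECIMA codebase.

```cpp
// Hypothetical sketch of the one-directional editor-node relationship: the
// editor node knows which game object it edits; the game object carries no
// editor state, so game code stays free of editor concerns.
#include <memory>
#include <vector>

struct GameObject {               // lives on the "game process" side
    // ... runtime state only, no editor members ...
};

class EditorNode {                // lives in the editor-side project
public:
    explicit EditorNode(GameObject* target) : m_target(target) {}

    void AddChild(std::unique_ptr<EditorNode> child) {
        child->m_parent = this;
        m_children.push_back(std::move(child));
    }

    // Editing functionality (selection, transforms, patching the target)
    // hangs off this class, never off GameObject itself.
    GameObject* Target() const { return m_target; }

private:
    GameObject*                              m_target = nullptr;
    EditorNode*                              m_parent = nullptr;   // restored hierarchy
    std::vector<std::unique_ptr<EditorNode>> m_children;
};
```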
Having a single editor node tree, as shown in the image here, would violate our earlier stated goal that the editor cannot directly modify game data: the tree structure is built up in the game, and so should only be used by the code living in the game. So, in order to facilitate editing hierarchical data, we actually have two editor node trees. This allows both the editor and the game to reason about content hierarchy, do world or local space conversions, et cetera. These two trees are structurally identical but reference different data; changes made in the editor are mapped onto the node tree and communicated to the game. What we actually communicate is the next part of this presentation, for which I'll hand back over to Dan.
All right. So, making changes to objects. When defining goals or requirements, it's sometimes important to list the things that you know you don't want, and in our case, after working with the Core Text Editor and, to a lesser extent, the other legacy tools, we had a pretty good idea of what we didn't want. We didn't want any kind of proxy representations for game objects, or string-based accessors, or editor-specific boilerplate in the code, or any kind of code generation steps in the build pipeline that would make the editor work. That helped us to define some goals for how we wanted to actually write the editor code. In the editor code we wanted to work directly with the concrete objects, that is, simply the classes and structs that were defined in the engine, and data modification should just be performed with direct member access or via accessor functions. Mostly as a corollary of these two goals, we should be able to reuse any existing data transformation utilities that already existed within the engine code and within our conversion pipeline. Doing these things would mean that our game and engine programmers would also understand the editor code and would find it easy to work with.
So, being able to make changes is good, but what is best in life? In any kind of data editing environment it's critical to be able to undo changes that have been made. This allows users to experiment, to recover from their mistakes, and generally to feel in control of what's actually going on. Looking at this slide, the first goal seems almost redundant to state, but it is actually important: the code for the undo system had to be as simple and reliable as possible. In the interest of accomplishing these goals, we decided to avoid the common command pattern, which involves a virtual do and undo function, and there are two reasons for this. The first is that we didn't think the undo system, or anything related to it, should need to change in order to support new editor features. The second is that we wanted to remove the possibility of having asymmetric data transformations, that is to say, an undo implementation that isn't the exact inverse of its do implementation.

Given that we were approaching this whole thing from a completely clean slate, we might as well consider a totally ideal scenario, and in our case that would be a system that just sees all the changes being made in the editor and knows how to undo those changes automatically. We investigated this a little bit, but quickly found that such a system involved way too much overhead, and it was also pretty difficult to delineate when complicated changes were starting and ending. So the compromise we settled for is that, before modifying data, the programmer provides some hints to the undo system about which objects are going to be modified, then performs the modifications, and later indicates when those modifications are complete.
To actually describe changes in the editor we use a transaction system, and a transaction consists of commands; you can see a pretty simplistic example of such a thing here. Each command describes a simple, individual change, such as the addition or removal of an object, or an individual change to an attribute value, and these changes are annotated with our RTTI so that we have enough information to actually use these transactions to transform the editor data. The transaction system also has a subscriber mechanism, which allows you to receive notifications for all data changes that are happening in the editor. This is mostly used to update the user interface inside the tool, to have the visual representation of the data remain in sync with the backend data. So when a transaction is created, or applied against the data, it is sent around to all the registered listeners in the editor; in this case those would be the individual components that have subscribed for notifications. Each listener is free to respond to the data change however it likes, but typically a controller will inspect the transaction to see if any of the data being changed is represented in the view, and then it will just update its visual representation.
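To make the shape of this more concrete, here is a minimal sketch of what such a transaction, its commands, and the subscriber mechanism could look like. All type and member names are invented for illustration; this is not the actual DECIMA API.

```cpp
// Hypothetical shape of the transaction system described above: a transaction
// is a list of small, RTTI-annotated commands, and subscribers are notified
// whenever a transaction is applied so views can stay in sync.
#include <functional>
#include <string>
#include <variant>
#include <vector>

struct AddObjectCommand    { std::string objectId; std::string typeName; };
struct RemoveObjectCommand { std::string objectId; std::string typeName; };
struct SetValueCommand     { std::string objectId; std::string attribute;
                             std::string oldValue; std::string newValue; };

using Command = std::variant<AddObjectCommand, RemoveObjectCommand, SetValueCommand>;

struct Transaction {
    std::string          description;   // human-readable, shown in the UI / undo history
    std::vector<Command> commands;
};

class TransactionBus {
public:
    using Listener = std::function<void(const Transaction&)>;

    void Subscribe(Listener listener) { m_listeners.push_back(std::move(listener)); }

    // Called whenever a transaction has been applied to the editor data;
    // each view inspects it and updates its visual representation if affected.
    void Broadcast(const Transaction& t) {
        for (const Listener& l : m_listeners) l(t);
    }

private:
    std::vector<Listener> m_listeners;
};
```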
For our views we follow the model-view-controller pattern quite strictly, which has a few advantages in such a system, mostly code separation: our components don't need to know about each other or even communicate with each other; they're only interested in representing data, modifying data, and responding to modifications of data. It also means that the components support changes from anywhere, so they can be representing a change made by themselves, by other components, or even by framework-level operations such as reloading of files or what have you. And because of this, there's no notion of directionality in any of the editing code: views aren't concerned whether an operation is a do, an undo, or a redo; they just respond to the data changes.

One of the things we tried in the early days, when we were playing around with the system, was to essentially build these transactions by hand and then apply them against the data in order to effect a change. This is possible, but it very quickly proved to be way too cumbersome to be viable, and it didn't fit with our ease-of-use goals in coding terms. So instead we built a utility class which would just analyze changes after they'd already happened and generate a transaction that described all the changes it had observed.

Before making a change, the programmer uses this diff util to take a snapshot of the object state; a snapshot is just a shallow, attribute-wise copy of the original object. After that, the code is free to make any kind of modifications to that object, and once those changes are complete, the programmer tells the diff util that he or she is finished. The diff util then compares the original snapshot against the current state. This works by using our RTTI system, so it reads the attribute values, and therefore we only see changes on data that is actually exposed to the RTTI system; it's more fine-grained than just doing direct memory comparisons.
So how does this actually look in code? As you can see here, we have a super handy function that's going to give us a pointer to a spotlight that we intend to modify. We call the function, and then we create a diff util on the stack and use it to take a snapshot of the spotlight's current state. Then we perform some kind of modification; in this case we use an accessor function to set the cone angle to 30 degrees. Afterwards, we tell the diff util that we're done by calling commit changes, and we also supply a human-readable description of what we just did, which can be shown to the user in the user interface, for example. The diff util then compares the snapshot against the current state, and it produces a transaction that describes that change. And that's the only interaction that a programmer who is writing edit code actually has to perform to work with the undo system.
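The snippet below is a reconstruction of that on-stage walkthrough in spirit, with minimal stand-in types so the sketch compiles on its own. None of these names (DiffUtil, SpotLight, SetConeAngle) are the actual DECIMA API; they are hypothetical stand-ins for the workflow being described.

```cpp
// Reconstruction of the example: snapshot the object, modify it through an
// ordinary accessor, then commit. The diff utility compares the snapshot with
// the current state and would emit a transaction describing the change.
#include <iostream>
#include <string>

struct SpotLight {
    float coneAngle = 45.0f;
    void  SetConeAngle(float degrees) { coneAngle = degrees; }
};

class DiffUtil {
public:
    void TakeSnapshot(const SpotLight* obj) { m_snapshot = *obj; m_target = obj; }

    // In the real system this comparison would go through RTTI and broadcast a
    // transaction; here we just report the observed delta.
    void CommitChanges(const std::string& description) {
        if (m_target && m_target->coneAngle != m_snapshot.coneAngle)
            std::cout << description << ": cone angle " << m_snapshot.coneAngle
                      << " -> " << m_target->coneAngle << "\n";
    }

private:
    SpotLight        m_snapshot{};
    const SpotLight* m_target = nullptr;
};

int main() {
    SpotLight light;                       // stand-in for "get a pointer to the spotlight"
    DiffUtil  diff;                        // created on the stack
    diff.TakeSnapshot(&light);             // snapshot before modifying
    light.SetConeAngle(30.0f);             // ordinary accessor call, no editor boilerplate
    diff.CommitChanges("Set spotlight cone angle to 30 degrees");
    return 0;
}
```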
Once we have a system that describes changes in this super granular fashion, writing code to undo those changes is actually trivial. To revert a change, we take our original transaction, reverse the order of the commands, and then replace each command with its inverse: for example, an add-object command becomes a remove-object command, and a set-value command uses the old value instead of the new value. Once we have this transaction, we just apply it against the data, and the data is now back in its original state. Easy. In the same way as before, when we have a new transaction we just send a notification about that transaction to all the subscribers, and the whole system works without any kind of code changes.
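As a sketch of that inversion step, reusing the hypothetical Command and Transaction types from the earlier sketch: reverse the command order and swap each command for its inverse. This is an illustration of the idea, not the actual DECIMA implementation.

```cpp
// Inverting a transaction: reversed command order, each command replaced by
// its inverse. Applying the result restores the original state.
#include <type_traits>
#include <variant>

Transaction Invert(const Transaction& original)
{
    Transaction undo;
    undo.description = "Undo: " + original.description;

    for (auto it = original.commands.rbegin(); it != original.commands.rend(); ++it) {
        std::visit([&undo](const auto& cmd) {
            using T = std::decay_t<decltype(cmd)>;
            if constexpr (std::is_same_v<T, AddObjectCommand>) {
                undo.commands.push_back(RemoveObjectCommand{ cmd.objectId, cmd.typeName });
            } else if constexpr (std::is_same_v<T, RemoveObjectCommand>) {
                undo.commands.push_back(AddObjectCommand{ cmd.objectId, cmd.typeName });
            } else if constexpr (std::is_same_v<T, SetValueCommand>) {
                // swap old and new so the value is set back to what it was
                undo.commands.push_back(SetValueCommand{ cmd.objectId, cmd.attribute,
                                                         cmd.newValue, cmd.oldValue });
            }
        }, *it);
    }
    return undo;
}
```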
So, what went well with the system? Well, it was very, very reliable, and it's a transparent system, at least to the programmers who are using it, and it can represent any kind of complicated change. Because our in-editor views only ever respond to transactions, there is no distinction between do and undo in any of our editor view code. And I guess what's most telling about this code is the fact that it hasn't really changed all that much in the last couple of years. When we wrote it there were of course some things we didn't figure out at the beginning, so we had to tweak it a little, but at some point it worked, and it has just continued working for two and a half or three years.

That said, there are a couple of things that are not so great. The system relies very heavily on having transactions which describe all data changes that are occurring; however, as a programmer, it's pretty easy to forget to surround your changes with a diff util before you start directly modifying the object, and this can lead to some very subtle irregularities in the timeline of the history stack. We perform some validation on transactions in debug builds to try to check the changes and make sure everything makes sense, but of course we can't catch everything. And lastly, the system doesn't support any file system changes, which means that if a user moves files around using our refactoring tools, the system doesn't support undoing those changes. This is something we're actually looking into fixing right now, and we'll try to get it working for our next title.

But now I'm going to hand back to Sander, who will describe how this works with the 3D editor.

So let's map this onto everything I explained about the 3D editor before.
When we make a data change in the editor, it results in a transaction being broadcast. The editor node tree living in the editor listens for these changes and picks up the change; it determines which node in the tree is affected and sends this information to the game, along with the data change itself. The corresponding editor node in the game then receives the change and modifies the game object, thereby completing the edit.

With the information I gave before about having two editor node trees in our process, and the data differences between the editor and the game, the previous diagram is also a bit of a simplification. For starters, data in the editor is in local space and objects in the game are in world space; let's have a look at how the node trees allow us to deal with these changes. Let's say we modified the local transform of the spotlight object "my spotlight" in this little hierarchy you see here. The spotlight has a parent object, "my prefab", which for this example will be the root of the tree. Each node in the tree can act as an adapter for transactions flowing through it; to facilitate this, all transactions are applied at the root of the tree. In this case, when the local-space change passes through the prefab, it appends its own local matrix to the transaction. The transaction is then passed on to the next object, which is the final object, the spotlight, so that it can do its modification, and because the data has flowed through its parent, the data in the transaction is now in world space. Because these steps are recursive, we return back to the prefab after successful application of the change to the spotlight; should an ancestor node need to respond to changes of its children, this is the place where it can do so. More advanced objects might trigger updates to vegetation placement or terrain, as we'll see in the following video.
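Here is a deliberately simplified sketch of that adaptation step: a local-space change enters at the root and each ancestor appends its own transform before passing it down, so the change arrives at the target node expressed relative to the whole hierarchy above it. For brevity this uses translation only instead of full matrices, and every name is hypothetical.

```cpp
// Simplified, hypothetical sketch of a change being adapted as it flows down
// the editor-node tree. Each node adds its own local transform (here just a
// translation) before recursing, so the accumulated offset reflects all the
// ancestors the change has flowed through by the time it reaches the target.
#include <string>
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };

struct LocalMoveChange {
    std::string targetNode;   // which node the change is for
    Vec3        offset;       // starts out in the target's local space
};

class HierarchyNode {
public:
    HierarchyNode(std::string name, Vec3 localPos)
        : m_name(std::move(name)), m_localPos(localPos) {}

    HierarchyNode* AddChild(std::string name, Vec3 localPos) {
        m_children.emplace_back(std::move(name), localPos);
        return &m_children.back();
    }

    // Transactions are applied at the root; each node adapts the change with
    // its own transform before passing it on to its children.
    bool Apply(LocalMoveChange change) const {
        if (m_name == change.targetNode) {
            // the offset now carries the accumulated ancestor transforms,
            // i.e. it can be forwarded to the game in world terms
            return true;
        }
        change.offset.x += m_localPos.x;
        change.offset.y += m_localPos.y;
        change.offset.z += m_localPos.z;
        for (const HierarchyNode& child : m_children)
            if (child.Apply(change))
                return true;   // an ancestor could react here (terrain, vegetation, ...)
        return false;
    }

private:
    std::string                m_name;
    Vec3                       m_localPos;
    std::vector<HierarchyNode> m_children;
};
```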
In this video, which I borrowed from a colleague, we're actually moving a road node. Road nodes are part of the content hierarchy, and the parent node is a road system, which knows how to modify the terrain. So moving these roads, as you can see, actually updates the vegetation and terrain systems in the game; this is a good example of how the editor node tree facilitates this.

To the observant viewer, there is a piece of information missing here: tools like the ones we've just shown actually draw a gizmo in the game and allow the user to interactively operate on it. Plenty of tools work this way, for example moving, but also painting, et cetera. So how did we implement this in the framework? Because we already know how to describe a data change using transactions, a naively implemented tool would simply reverse the direction of the transaction and support the following data flow as well: reading the diagram from right to left, the tool sends a transaction to the node tree in the game, which then transmits the change back to the editor, where we do the same modification on the Core Text data that we have just done in game. We did not do this, because it is problematic for various reasons. First of all, it becomes unclear which process has authority over the data. Secondly, because of data loss in conversion, it might be impossible to have this code path in the first place. And thirdly and lastly, you would need some system to prevent the transaction then generated in the editor from flowing back to the game again and creating a cycle.

Because of these problems, we implemented tool support, or gizmo support, somewhat differently. The key here is that the tools do not actually modify the data; they simply broadcast an intention to the editor, telling the editor what they would like to change. It's up to the editor to actually make the data modification itself. This data modification is then processed in exactly the same way as any other transaction, and that way the editor keeps authority over the data at all times. It also automatically works with undo, as the input for the editor data modification on the left can then be the undo system instead of the tool. It is still the editor node in the game that modifies the game object as a result of a transaction in the editor; the game itself has no notion of the undo system at all.
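A small sketch of that "broadcast an intention" idea, under the assumption that a gizmo only reports what the user dragged and the editor performs the actual modification through its normal transaction path. The types and functions are invented for illustration.

```cpp
// Hypothetical sketch: the gizmo never touches data; it hands the editor an
// intention, and the editor applies the change the same way as any other edit
// (so undo and view updates come through the usual transaction path).
#include <string>

struct MoveIntention {
    std::string objectId;     // what the user is dragging
    float       dx, dy, dz;   // how far the gizmo was dragged
};

class Editor {
public:
    // The only thing an in-game tool can do is submit an intention.
    void SubmitIntention(const MoveIntention& intent) {
        ApplyMove(intent.objectId, intent.dx, intent.dy, intent.dz);
    }

private:
    void ApplyMove(const std::string& objectId, float dx, float dy, float dz) {
        // The editor owns the data: snapshot via the diff utility, modify the
        // Core Text side, commit -> transaction broadcast -> game-side editor
        // node updates the game object.
        (void)objectId; (void)dx; (void)dy; (void)dz;
    }
};
```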
There's one part of updating a running game that we haven't looked at at all, and that is structural changes to the node tree itself: adding and removing objects. This is done by a separate system that we call game sync. Game sync is a continuation of a system that we had in place before we started working on the 3D editor, which was written for hot-loading particles into a running game. The system has been heavily expanded during the development of Horizon, and it works by tracking all changes in the editor, creating a patch file, and then inserting that into the game.

Since we like working with spotlights, let's copy the spotlight in this hierarchy and see how game sync works. First, the incoming change is analyzed. The analysis phase determines which objects are modified and what other dependencies are required for successful application of the patch; these dependencies are extra objects required for the conversion process, or extra objects that might be required by the targeted game subsystem. The analysis phase writes out a single Core Text file containing all required objects, so in this example it contains the prefab, as it will contain a reference to this new spotlight, and obviously the file will contain the new spotlight itself. This Core Text file is then converted into a binary file and saved to disk, ready to be loaded and inserted into the game. The game is then paused at a safe point, meaning no streaming is happening and all threads are effectively paused. Applying the patch means copying the objects in the patch file to the affected game systems; the target objects in the game are overwritten in place, meaning that any cached pointers in the game remain valid. Finally, after applying the patch, the game and all of its subsystems are resumed.

Recapping on how game sync worked out, from the tools team we have mixed feelings. On the one hand, we have a solution that accepts all changes made throughout the editor, and even changes on disk can be tracked by monitoring the file system. It can also sync patch files to a devkit, allowing a user to see edits on the target without having to restart the game. On the other hand, getting game sync fully reliable proved impossible throughout the project. Because the game systems predate game sync, game sync tried to respect those systems as much as possible, thereby avoiding work for other code teams; this led to special-case code creeping into those systems and very complex analysis code. These complexities generated further complications and performance problems, where in the worst case applying a relatively simple change could take as much as five to ten seconds, breaking a user's workflow. All of these complexities also led to inconsistencies in the results: game sync could crash or produce incorrect results, such as objects being incorrectly placed in the world, and that led to a very low level of confidence from the users, making it high on our wish list to rewrite for a next project.

But let's not end here. For the final part of this presentation, I would like to give the word back to Dan, so he can give an overview of other tools that we have developed.
Nope, no sound. Okay. As I mentioned earlier, we managed to get some help from another team while working on the new graph editor, and the most significant contribution came in the form of a fully featured debugger, which was able to show debug information in real time from active instances of a graph running within the editor, or within a connected instance of the game running on target. The debugger had all the usual features you'd expect to see, such as variable inspection and execution visualization, and it was also gathering this information over time. This meant that the editor could facilitate historical debugging: it would highlight previous executions on a timeline where the execution flow had changed, so it was very easy for users to scrub back in time and figure out when some particular condition was met or had changed. This debug functionality turned out to be super useful during the development of Horizon Zero Dawn, and it spawned similar debuggers for other systems. All of those debuggers and visualizations were made with very minimal involvement from the tools team, which for us was a pretty good indicator that we had succeeded in making the tools framework something that could be used by the rest of the programmers in the company.

Here's a video of the node graph debugger I was talking about. You can see how the graph is currently executing; this is actually the graph for the main character's breathing sounds. You can hover over inputs on the graph and see what is actually going on, how the value is changing over time. You can see the execution path being visualized, but you can also pause the execution of the graph in-game and then start to look at previous executions: you can see how long they were taking in microseconds and so on, and you can step through those executions, see when the control flow changed, flip back and forth, and see how the values are actually behaving in this graph.

Something else that was completely new for Horizon was interactive conversations, and with this feature we were not beholden to any kind of existing workflow. It meant that we were free to build a new editing context from scratch to meet the specialized needs of the teams that were going to be using this tool: the writers, the quest designers, and the cinematic artists who would actually make the conversations within Horizon.
So here's an overview of the editor itself. It's kind of hard to read, to be honest, but we have a graph that controls the flow of the conversation, so the main character is speaking to somebody and is presented with dialogue choices, and the graph can also check for conditions, like whether you have accomplished a particular quest objective and so on. Let's take a look at a conversation in Horizon.

(A recorded in-game conversation plays: "...gave them the best work he'd ever do, and look at it now. They defaced it because they hated what it stood for. They defaced a thing slaves like him lived and died for. Slaves they've already forgotten. Short-sighted bastards, the lot; if they thought their sun set off this bridge, they'd jump after it." "I thank you for honoring my wish. I hope you can find peace.")

Okay, so that was actually from quite a while ago in production.
But what's interesting about this conversation is that it's completely auto-generated. Aside from the audio, all of the animations, camera cuts, gesture beats, and so on were selected by a data-driven, rule-based system, and the auto-generation system will also generate robo-voice and facial animations where required. Basically, it means that a designer can produce game-ready content by typing in a few sentences for a conversation and then immediately see it in the game, or even game-sync it into an active instance of the game and get quick feedback on how the actual flow of the conversation is going to be. The system was pretty heavily inspired by the work of CD Projekt Red and BioWare, who have both delivered talks in the last several years about how their conversation systems work. If you have any interest in the subject, I highly recommend you check out the talk from last year's GDC, shown here, about The Witcher 3's dialogue systems. But this is something we were only able to do because we'd built this new framework, which meant it was easy for us to experiment and just try out new ideas.
So, something else: the profiler framework. At the beginning of this talk I showed a video of our GPU profiler; this is a video of our global profiler, which is for profiling larger segments of time, more on the order of milliseconds. We had this profiler framework already running in the game, and we were able to expose it to the Editor so that we could start getting profile data back from basically any tool. We can see how much time is being spent in certain areas, drill into really quite small timescales, and get an overview of what's actually being done in the Editor or the game. Exposing this profiler framework to the Editor allowed us to build a front end for the profiler which can show profile data in real time from any tool. That of course includes the game, but it also applies to the executables that we use to run our conversion process and to any of the daemons that are running to facilitate our other tools-framework stuff. It's interesting to note that the Editor also runs the profiler framework, which means that it can profile itself in real time. That's super handy for being able to investigate performance issues quickly, or just to figure out why the Editor is running slowly on one guy's machine.
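A minimal sketch of the kind of scoped instrumentation such a framework provides, with invented names rather than the actual profiler API:

```python
import time
from contextlib import contextmanager

SAMPLES = []  # (scope name, start time, duration in milliseconds)

@contextmanager
def profile_scope(name: str):
    # Record how long the enclosed block took; a front end could stream
    # SAMPLES out of the process and display them in real time.
    start = time.perf_counter()
    try:
        yield
    finally:
        SAMPLES.append((name, start, (time.perf_counter() - start) * 1000.0))

# Usage inside any tool (Editor, game, converter, daemon):
with profile_scope("Editor.LoadLevel"):
    with profile_scope("Editor.LoadLevel.ParseCoreFiles"):
        pass  # ... actual work ...
```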
Something else we built is a world map. One of the things about working on an open-world game is that it can be pretty hard to get an overview of where things are happening, so to that end we started working on a two-dimensional map representation of our game world, and displaying game data on that world map is a pretty logical progression. Thanks to a pluggable overlay system, it's possible to create visualizations for basically whatever we want. Right now we use it for showing stuff like the locations of quests or the locations of JIRA bugs, and it's pretty handy for getting a quick eye on what's actually happening in the game. Unfortunately it came online very late in the project and wasn't really widely used, but it's something we'll probably continue to work on pretty heavily for the next project. So here's a quick video of that.
This is just a view that shows some world encounters, so you're able to filter and see where you might run into particular types of robots within the world. Again, there's filtering, so you can get a quick overview of exactly what you want to see. There's another overlay we have which shows the performance stats for each tile, gathered on a daily basis. It shows whether our tiles are within memory, CPU and GPU budgets and so on: tiles that are in budget are green, and tiles that are out of budget are red. Fortunately this screenshot is from very late in production, I think right before gold master, so everything is green because we were in budget. But we can drill into a tile and get an overview of the different performance stats over very long periods of time, because we gather these stats all the time.
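To make the pluggable overlay idea concrete, here's a toy sketch of overlays that contribute map markers and per-tile budget colors; the interface and names are invented for illustration, not the actual system:

```python
# Toy sketch of a pluggable world-map overlay system.

class MapOverlay:
    name = "base"
    def markers(self):
        """Return (x, y, label) markers to draw on the 2D map."""
        return []
    def tile_color(self, tile_stats):
        """Optionally color a world tile, e.g. by budget status."""
        return None

class QuestOverlay(MapOverlay):
    name = "quests"
    def __init__(self, quests):
        self.quests = quests
    def markers(self):
        return [(q["x"], q["y"], q["name"]) for q in self.quests]

class BudgetOverlay(MapOverlay):
    name = "performance"
    def tile_color(self, tile_stats):
        # Green when the tile is within memory/CPU/GPU budgets, red otherwise.
        in_budget = all(tile_stats[k] <= tile_stats[k + "_budget"]
                        for k in ("memory", "cpu_ms", "gpu_ms"))
        return "green" if in_budget else "red"

OVERLAYS = {}
def register_overlay(overlay: MapOverlay):
    OVERLAYS[overlay.name] = overlay

register_overlay(QuestOverlay([{"name": "Main Quest 01", "x": 120.0, "y": 340.0}]))
register_overlay(BudgetOverlay())
```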
So, the last thing that actually came up: now that all of our tools are written in C++, a few of the programmers at Guerrilla had the idea of creating a C API for the tools framework. We could use this to expose certain areas of the tools framework's functionality to other languages, and with that in mind we exposed stuff like file loading and saving and object manipulation, and then we wrote a wrapper in Python for the DECIMA API. This allows tech artists, Designers, and basically anybody who knows a bit of Python to create scripts that will create game content or modify existing game content, using the same tools-framework functionality that we actually used to build the Editor. So, revisiting the C++ code example from earlier, we can look at the same code written in Python: we retrieve the spotlight to edit, and in Python we create a transaction so that we can actually start modifying data; then we just use some accessors to modify the cone angle once again, and then we commit the transaction. This is how somebody would do the same things that we do in the Editor, with Python.
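For illustration, here's roughly what that flow could look like; every name below (the decima module, find_object, create_transaction, set_attribute) is invented to mirror the steps just described and is not the actual DECIMA Python API:

```python
# Hypothetical sketch only: names are invented stand-ins for the flow the
# talk describes (find object -> open transaction -> set attribute -> commit).
import decima  # hypothetical Python wrapper over the tools-framework C API

# Retrieve the spotlight we want to edit.
spotlight = decima.find_object("Lights/DemoSpotlight")

# Create a transaction so we can start modifying data.
transaction = decima.create_transaction("Adjust spotlight cone")

# Use an accessor to modify the cone angle.
spotlight.set_attribute("ConeAngle", 45.0)

# Commit the transaction so the edit is applied, just like an edit made in the Editor.
transaction.commit()
```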
There's also a lot of other really cool stuff that I would like to show today, but there just isn't enough time, so I'll move on to the conclusions.
So, now that the game is finished and we have some time to take stock, we realize that we were much more successful in this whole endeavor than we anticipated, but there's still a lot of work left to do. We'd like to bring more of our teams' workflows into the new Editor: a lot of our environment set dressing is still being done in Maya, and some of our VFX work involving particles is still performed in a standalone particle Editor tool. So we'll start building new Editor editing contexts for all these different workflows and continue to move everybody in the company into our Editor. Not to mention, we'd like to rewrite the game sync system, because it was a source of a lot of mind-boggling debugging problems throughout the project. We also want to take the 3D renderer out of process so we can stabilize the Editor: even though we had a really quick start with the 3D Editor and made a lot of progress, it meant that towards the end of the project, when the game was being optimized very heavily, that optimization had a really big impact on the stability of the Editor. So we would like to separate these two processes so that we don't lose users' work to crashes in the game. And although we made a lot of progress on the tools during this project, the user interface was built in, let's say, an organic fashion, which means that a lot of the tools and concepts within the Editor's user interface are kind of confusing or irritating, and users find it pretty hard to discover new features. So we want to work on this stuff quite a lot in the future.
Thanks to everyone at Guerrilla who helped out with this stuff, and special thanks to the people whose work you've seen here today. Also, we're hiring, so if you're interested in working with what is frankly just a bunch of really awesome people with a lot of talent, then please click that link and send us your resume, and we'll take a look. But yeah, that's it. So, does anybody have any questions? [Applause]
Hi, I have a question. I'm a long-time Kojima Productions fan, so I've been following the announcements about the DECIMA engine and how Kojima Productions have integrated their workflow into DECIMA. After seeing some video content, just mass-market content, on how they were changing the workflow of the DECIMA engine from previous titles to fit the new Death Stranding title, I was wondering: is there anything in particular that you learned from that experience, or that was integrated as a result of that experience, once the Kojima Productions team came on?

Yeah, I mean, those guys were working for many years on their engine, and Fox Engine had a lot of cool capabilities, so we learned a lot of technical stuff from them. But in terms of workflow changes, they're much more focused on cinematics than we were, so our cinematics pipeline was something that we analyzed quite heavily to understand how we could improve it to fit their workflows, where they wanted very long sequences with a lot of complicated interaction between many characters. That was a pretty big part of it, but yeah, that's about it.
A couple of questions. One would be: how did you get buy-in from production or management, and were there any problems with that? And also, in the early days, when you were working on stuff that wasn't yet usable, were you having to support two systems for people editing the game?

Yes, so the question is how we got buy-in from other teams, and how we approached dealing with two separate paths for editing. Getting buy-in from management wasn't that difficult, because we had the backing of our technical directors to just push forward with this in a way that we thought made sense. Getting buy-in from our users was much harder, because, yeah, the tools were really kind of crappy for a very long time, and that meant nobody wanted to use them, so we had to invest a lot of energy in making sure we met all the demands of reaching feature parity. In terms of editing, having two different systems of editing was a pretty tricky thing. It's very hard to summarize shortly, because there are many different angles to it: there are all these different systems that we have in the game, and moving to new data formats in the new Editor would make editing much easier. I can give you some specific examples in more detail, but it's a very hard thing to summarize. Mainly, all the old tools pretty much had a feature freeze once this got rolling. We did still support them; I mean, as I mentioned in this presentation, not all teams have moved over to the new Editor yet, but the number of new features for the old tools has been zero or very small since we started this.
Hey, thanks for the talk. Do you have any data-massaging steps, like a pipeline step or something like that, and how does that work with the diff patch file that you do? Or is that actually what you mean in the game sync stuff, where it's done on the game side, where the Editor data has to be kind of massaged, you know, strings need to be hashed and all the things need to be generated for the runtime to work?

OK, so you're talking specifically about how the Editor data is transferred into the game and how that transformation process works? OK. We have a conversion pipeline that is basically just lazily converting the graph of content that is going to be loaded into the game, and that conversion process can involve basically anything. In the simplest form we just take some pre-converted data and write it to a binary stream in accordance with the RTTI schema, but in complicated cases it can involve a lot of baking operations before writing out a stream in a certain way, or whatever. This is something which game systems and game objects are free to implement however they want, so there are a lot of different things happening there, basically.

But does that run in the game process? Technically it does, but not inside the game instance. We use the same executable, but it's a different instance: we rename the executable file and then we run the conversion process, so it will basically just eat up the core text files and spit out core binary files, and those core binary files are then actually loaded by the game. So this is essentially an offline process, but it's done on demand: as you're working in the Editor or whatever and you load up the game, the game will make a request to a converter worker for this operation, it will convert the data, and then that will get loaded into the game.
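As a rough sketch of that on-demand flow, here's a toy converter that turns a text core file into a binary one; the file formats and names are invented stand-ins, with the real serialization being driven by the RTTI schema:

```python
# Toy sketch of on-demand conversion: the game asks a converter worker for
# a resource; the worker turns a text "core" file into a binary core file
# that the game then loads. Formats and names are invented for illustration.
import json
import pathlib
import struct

def convert(core_text_path: str) -> pathlib.Path:
    """Convert a text core file into a binary core file and return its path."""
    data = json.loads(pathlib.Path(core_text_path).read_text())
    out_path = pathlib.Path(str(core_text_path) + ".bin")
    with out_path.open("wb") as f:
        # Write each attribute as a length-prefixed key plus a float value;
        # this toy example assumes numeric attribute values only.
        for key, value in data.items():
            encoded = key.encode("utf-8")
            f.write(struct.pack("<I", len(encoded)))
            f.write(encoded)
            f.write(struct.pack("<f", float(value)))
    return out_path

def on_game_request(resource: str) -> pathlib.Path:
    # Convert lazily, only when the game (or Editor) actually asks for it.
    return convert(resource)
```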
Cool, thanks.

A question about, I guess, the less technical logistics of making this happen. You guys started out with one prototype: was that pushed from engineers to higher levels, like did you have to pitch it, or was it an order from up there somewhere? How did you guys...

We don't really have a strong hierarchy or anything like that at Guerrilla, so it wasn't anybody saying "hey, you need to make this". It was more that, as a tools team, we discussed it quite extensively and looked at the options we were faced with, and it was kind of like: the new game was so big and complicated that, while it would be technically possible to make it with the old tools, it would just require so much effort that we decided we should at least explore making a new Editor and try that out. So that was something we just sort of did, and it was a conclusion that we reached by consensus as a team, basically. All right.
I had a quick question about the basic stuff, like the menus and the docking panels and the trees: were they all rewritten using this framework that was started, or was nothing reused from the old code, if you see what I mean?

Well, all of the lower-level stuff, like file handles and what have you, was already present inside the DECIMA engine. For the user interface we have sort of two different parts: we have our own custom control framework, as we call it, which handles simple things like buttons and any kind of two-dimensional rendering; and then for most of the bread-and-butter controls of the user interface, such as buttons and menus and whatever, we use a framework called Qt, "cute" or "Q-T" depending on how you want to pronounce it. But yeah, that's it. OK.
All right. Oh, a quick question: at what point in the process did you decide to create the DECIMA API, and how much of an undertaking was it to create the Python wrappers for your tech artists?

That was something that happened very recently, actually. I don't recall the exact date, but I think it was within the last two or three months or something. In terms of getting it online, it was a very easy thing to make, in a way; it took much more time to really drill down and understand exactly what this thing was going to do and how each API call was going to look and so on, and then one guy just sat down and wrote it, and it took him, I think, a couple of weeks to make the thing go.

Alright, OK, I think there are no more questions, so thank you. Thank you.
[Applause]