My Work at Unity


I worked as a developer at Unity Technologies from 2009 to 2020. When I started, there were around 20 employees worldwide and Unity was still largely unknown. When I left, there were over 3000 employees and Unity had become the most widely used game engine in the industry.

As you can imagine, I worked on a variety of projects in that 12-year timespan. My self-chosen job title was "Creative Programmer" (formally Senior Software Engineer once Unity got more rigid about job titles), and designing and implementing for usability, elegance and intuitiveness was my core focus. Here's a non-comprehensive overview of my contributions, with a few behind-the-scenes tidbits here and there.

Keep in mind that this is a personal page, not a balanced and neutral account of events. While I occasionally mention other people, the focus is not on their contributions, as that would greatly expand the scope in terms of both research and length. That said, I do strive for accuracy. Apart from chief executives of the company, everyone mentioned by name has had a chance to provide feedback and have their requests addressed. If you spot any factually incorrect information, please let me know!

Locomotion System and other advanced animation

2007-2009 - Downloadable on the Unity website

For my Master's Thesis at Aarhus University I developed a Locomotion System to make characters walk and run on uneven terrain. I completed the project in collaboration with Unity prior to becoming an employee.

The collaboration essentially meant that I developed the tech based on the Unity engine (which was not widely known back then) and made it available for free for Unity's users. In exchange Unity provided me with a Mac to develop on (the Unity editor was Mac-only back then) as well as occasional tech support from CTO Joachim Ante. The arrangement happened because CEO David Helgason saw a post of mine about my Master's Thesis project on a Danish game development online forum in 2007 and reached out to me. At the time, I hadn't heard of Unity.

The Unity engine and editor were much easier to work with than the OGRE renderer I would have otherwise used, and it also meant I got my foot in the door of the game industry, so I was happy with the arrangement.

Video demonstrating the Locomotion System I implemented as part of my Master's Thesis.

The Locomotion System was released as a downloadable resource on the Unity website in 2008. When Unity later launched its Asset Store, it was migrated there.

I held talks about the Locomotion System at Unity's Unite 2008 conference as well as at GDC 2009 (sadly not recorded). Much more info about the Locomotion System here:

I got hired by Unity Technologies in January 2009 and moved from Aarhus to Copenhagen. There was never any job interview; I guess the Locomotion System collaboration was convincing enough for both parties. Well, I had also briefly worked at Unity Studios, an Aarhus-based sister company to Unity Technologies that later went fully independent.

While my primary field of work at Unity was editor UI, I also continued doing a bit of advanced animation work in the early years. During my first year, my colleague Paulius Liekis and I did a presentation on Character Animation Tips & Tricks at Unite 2009. We discussed a range of animation techniques such as realistic foot placement (using my Locomotion System), procedural aiming and head turning, and how to smoothly turn procedural adjustments on and off while reloading a gun. We released an accompanying demo project for Unity 2.6 in 2010 (see further down in the Demo projects section).

Character Animation Tips & Tricks presentation by Paulius Liekis and me at Unite 2009.

Animation View

2009 - Shipped in Unity 2.6

I don't think there were teams when I first joined Unity, but I worked with Unity CCO (Chief Creative Officer) Nicholas Francis on the Unity Editor in what would later be expanded into the Editor Team.

The first project I worked on was developing the frontend part of a new integrated animation editor called the Animation View. It allowed animating almost any property on any component, including transforms, materials, lights and script values, and could also trigger certain method calls in scripts.

The Animation View allows animating properties directly inside Unity.

Part of the effort was developing a view that could efficiently display many curves. The curves could be edited with manual or automated control over tangents. The view was independently zoomable on the horizontal and vertical axes, and I equipped it with smart grid rendering that dynamically faded in finer-grained grid lines based on the current zoom level (a treatment I would later apply to the Scene View as well, this time in 3D).
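
As a small aside, here's a toy sketch of how such zoom-dependent grid fading can be computed. This is just the general idea, not the actual Animation View code, and the names are made up for the example.

```csharp
using UnityEngine;

public static class AdaptiveGrid
{
    // Given the current zoom (world units per pixel) and a minimum allowed
    // spacing between grid lines in pixels, pick the finest power-of-ten grid
    // step that respects the spacing, and a fade factor for that level.
    public static void GetFinestGridLevel(float unitsPerPixel, float minPixelSpacing,
                                          out float gridStep, out float fade)
    {
        // Smallest allowed step in curve-space units at the current zoom level.
        float minStep = minPixelSpacing * unitsPerPixel;

        // Round up to the nearest power of ten (..., 0.1, 1, 10, 100, ...).
        gridStep = Mathf.Pow(10f, Mathf.Ceil(Mathf.Log10(minStep)));

        // 0 when this level has only just become visible, 1 when it is ten
        // times the minimum spacing (at which point an even finer level
        // starts fading in). Coarser levels are drawn fully opaque.
        fade = Mathf.InverseLerp(minStep, minStep * 10f, gridStep);
    }
}
```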

Later in Unity's history, the Animation View (now called Animation Window) was partially rewritten multiple times by various teams. Some things improved, some things regressed (don't get me started), but the core functionality remains largely the same.

I also worked on a companion feature called the Curve Popup Window that shipped in Unity 3.0. With this, any public script variable of type AnimationCurve got exposed in the Inspector as a curve field. Clicking it opened a popup window with a curve editor similar to the one in the Animation View.
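
On the user side, this is all it takes; the component and field names below are just made up for illustration.

```csharp
using UnityEngine;

public class Bouncer : MonoBehaviour
{
    // Any serialized AnimationCurve field shows up in the Inspector as a
    // clickable curve preview, which opens the curve editor popup.
    public AnimationCurve heightOverTime = AnimationCurve.EaseInOut(0f, 0f, 1f, 1f);

    void Update()
    {
        // Sample the curve at runtime to drive some behavior.
        float t = Mathf.PingPong(Time.time, 1f);
        transform.position = new Vector3(0f, heightOverTime.Evaluate(t), 0f);
    }
}
```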

Demo projects

2010-2011 - Shipped for Unity 2.6, 3.0, 3.4

In the early years I contributed to several demo projects for Unity. These all served a dual purpose: making Unity look good and being educational projects users could inspect and learn from.

Character Animation / 3rd Person Shooter Demo

Screenshot from the 3rd Person Shooter demo which showcases various animation techniques.

Following up on the Unite '09 presentation I did together with Paulius Liekis on Character Animation Tips & Tricks, we released an accompanying demo project for Unity 2.6 in 2010, with 3D models by Ethan Vosburgh. Besides animation techniques, I implemented gameplay, menus and enemy drone behavior, and even lent my voice to the character, including the sophisticated line "It's gonna blow!".

Bootcamp

Unity 3.0 shipped with a new bundled demo project called Bootcamp, featuring a soldier character controlled in third person. It was developed primarily by game studio Aquiris, while I assisted with implementing animation techniques similar to those used in the 3rd Person Shooter Demo I'd worked on earlier.

AngryBots

Unity 3.4 shipped with a new bundled demo project called AngryBots, a twin-stick shooter with a sci-fi theme and overhead perspective. This was developed by Unity's internal Demo Team at the time. While not formally a member of that team, I nevertheless worked on the demo, implementing animation techniques for the avatar and enemies, enemy behaviors and simple AI, and modular gameplay elements like triggers and doors that level designers could place and configure without any additional programming needed.

Multi-object editing

2012 - Shipped in Unity 3.5

Unity got support for multi-object editing in version 3.5 with backend functionality implemented by Jonas Echterhoff and frontend handled by me. You could select multiple objects of the same type and edit them simultaneously in the Inspector. If not all the selected objects had the same value for a property, a dash would appear in place of the property value.

All component editors with custom logic (which a lot of them had) had to be rewritten. Things became tricky when the UI controls didn't directly map to the underlying serialized data. It was not a very enjoyable task, but it had to be done.
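
For a sense of what that looks like from the editor-scripting side today, here's a minimal custom inspector that supports multi-object editing; the Health component and its field are made up for the example. Going through SerializedProperty is what gives you the mixed-value dash and multi-object application for free.

```csharp
using UnityEngine;
#if UNITY_EDITOR
using UnityEditor;
#endif

// Runtime component (illustrative).
public class Health : MonoBehaviour
{
    public float maxHealth = 100f;
}

#if UNITY_EDITOR
[CustomEditor(typeof(Health))]
[CanEditMultipleObjects] // opt in to editing several selected objects at once
public class HealthEditor : Editor
{
    public override void OnInspectorGUI()
    {
        serializedObject.Update();

        // PropertyField automatically shows a dash when the selected objects
        // have different values, and applies edits to all of them.
        EditorGUILayout.PropertyField(serializedObject.FindProperty("maxHealth"));

        serializedObject.ApplyModifiedProperties();
    }
}
#endif
```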

PropertyDrawers

2012 - Shipped in Unity 4.0

During almost all my years at Unity, there were yearly events called "NinjaCamp" and later "Hackweek": week-long gatherings where all developers (or, early on, all employees) got together in one location, either at a Unity office or a rented venue, to work on whatever they wanted in small groups. Many important Unity features started out this way.

During one of these NinjaCamps, together with my colleague Lasse Järvensivu (who introduced me to how reflection works in .NET), I came up with the idea for PropertyDrawers, a feature allowing custom UI for serialized fields in the Inspector. Afterwards I wrapped up the project and got it shipped in Unity 4.0.

Left: The Inspector displays various properties without customization. Right: PropertyDrawers are used to display properties with sliders and custom layouts, widgets and message boxes.

PropertyDrawers was a huge boon for editor customization since, unlike custom inspectors or editors, they can be reused across multiple objects. They let you write custom UI that's consistently applied to a given serializable type or to properties marked with specific custom attributes in code.
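
As a small illustration of how this looks in practice, here's a drawer for a made-up serializable type; once the drawer exists, every field of that type, on any component, is drawn with it.

```csharp
using UnityEngine;
#if UNITY_EDITOR
using UnityEditor;
#endif

// Illustrative serializable type used in runtime scripts.
[System.Serializable]
public struct IntRange
{
    public int min;
    public int max;
}

#if UNITY_EDITOR
// Draws any IntRange field as two small fields on a single Inspector line.
[CustomPropertyDrawer(typeof(IntRange))]
public class IntRangeDrawer : PropertyDrawer
{
    public override void OnGUI(Rect position, SerializedProperty property, GUIContent label)
    {
        EditorGUI.BeginProperty(position, label, property);
        position = EditorGUI.PrefixLabel(position, label);

        float half = position.width * 0.5f;
        var minRect = new Rect(position.x, position.y, half - 2f, position.height);
        var maxRect = new Rect(position.x + half + 2f, position.y, half - 2f, position.height);

        EditorGUI.PropertyField(minRect, property.FindPropertyRelative("min"), GUIContent.none);
        EditorGUI.PropertyField(maxRect, property.FindPropertyRelative("max"), GUIContent.none);

        EditorGUI.EndProperty();
    }
}
#endif
```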

Unity 4.3 shipped with a feature by Aras Pranckevičius called MaterialPropertyDrawers, which was inspired by PropertyDrawers but was developed specifically for shaders.

In Unity 4.5, I supplemented PropertyDrawers with DecoratorDrawers, which are used for decorative elements in the Inspector, such as spacing, headers, or notes. While a property can have only one PropertyDrawer, it can have multiple DecoratorDrawers.
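
The built-in [Header] and [Space] attributes are examples backed by DecoratorDrawers, which is why several of them can be stacked on the same field alongside a single PropertyDrawer-driven attribute like [Range]; the component below is just illustrative.

```csharp
using UnityEngine;

public class EnemySettings : MonoBehaviour
{
    [Header("Movement")] // drawn by a DecoratorDrawer
    [Space(4)]           // another DecoratorDrawer on the same field
    public float speed = 3f;

    [Range(0f, 1f)]      // handled by a PropertyDrawer; only one per property
    public float aggression = 0.5f;
}
```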

By the way, it's "drawer" as in something that draws something, not the furniture. Yeah, that naming was not the best.

Mecanim animation improvements

2012-2013 - Shipped in Unity 4.0, 4.1

In 2011 Unity acquired Mecanim, a Montreal-based startup specializing in animation middleware. They developed the Mecanim animation system, which became a standout feature in Unity 4.0, with its state machine-based animation control.

I was skeptical of certain aspects of the new animation tech. Animation state machines and root-motion-based animations often lead to janky and unresponsive results in the hands of game developers without AAA game experience — something the vast majority of our users did not have. Nevertheless, I did want to help make it the best it could be. I assisted with the frontend of the tool, particularly the UI for making the rig of an imported model compatible with Mecanim's standard humanoid bones.

Algorithmic automated setup of humanoid bones

In addition to the frontend work, I also envisioned and implemented a major improvement to the humanoid setup workflow, where an algorithm could automatically identify which bones in an arbitrary rig should be mapped to which standard humanoid bones.

Configuring a humanoid avatar in Unity requires matching up dozens of bones from a rig to Mecanim's standard humanoid bones. The algorithm I implemented could often do it in a single click.

I wrote a technical blog post detailing the state space search algorithm I developed here:

2D blending for animation blend trees

Mecanim supported visual setup of animation blend trees, and as part of that you would use blend nodes where multiple motions could be blended based on a single parameter value, such as speed. However, authoring blend trees for motions corresponding to two parameters, such as a direction vector, was cumbersome and limited, requiring a blend node with multiple nested child blend nodes. For Unity 4.1 I addressed this by implementing three 2D Blend Nodes.

One was a simple directional blend. The other two were based on the Gradient Band Interpolation and Polar Gradient Band Interpolation algorithms described in my Master's Thesis.
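
As a rough illustration of the underlying idea, here's a minimal sketch of (cartesian) gradient band interpolation as described in the thesis; Mecanim's actual implementation and the polar variant differ in details.

```csharp
using UnityEngine;

public static class GradientBandBlend
{
    // Computes normalized blend weights for a set of example motions, each with
    // a sample position in 2D parameter space (e.g. sideways and forward speed).
    // Assumes all sample positions are distinct.
    public static float[] ComputeWeights(Vector2 point, Vector2[] samples)
    {
        int count = samples.Length;
        var weights = new float[count];
        float total = 0f;

        for (int i = 0; i < count; i++)
        {
            // The influence of sample i is limited by every other sample j:
            // project the query point onto the direction from i to j and keep
            // the smallest remaining "band" value.
            float influence = 1f;
            for (int j = 0; j < count; j++)
            {
                if (j == i) continue;
                Vector2 toJ = samples[j] - samples[i];
                Vector2 toP = point - samples[i];
                float band = 1f - Vector2.Dot(toP, toJ) / toJ.sqrMagnitude;
                influence = Mathf.Min(influence, band);
            }
            weights[i] = Mathf.Max(0f, influence);
            total += weights[i];
        }

        // Normalize so the weights sum to one.
        if (total > 0f)
            for (int i = 0; i < count; i++)
                weights[i] /= total;

        return weights;
    }
}
```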

Apart from the runtime functionality, I also implemented visualizations of the weight "field" associated with each motion when selected. When none were selected, I used the sum of the squared weights to still give a rough idea about the fields. The visualization appeared more blue where a single motion dominated, and fainter in areas where multiple motions blended together.

Unity UI - layout systems

2013-2015 - Shipped in Unity 4.6

In Unity 4.6, Unity finally got a built-in UI system (sometimes referred to as uGUI) where UI could be set up visually. Prior to that, users had largely relied on third party Asset Store solutions for their UI needs, since the built-in options up to that point were highly inadequate.

Context

Unity kickstarted the development of its new UI system by hiring Michael Lyashenko, the developer of the popular third party UI solution NGUI, to create a UI system for Unity derived from it. Once he joined Unity, he also brought on Philip Cosgrave, whom he knew from previous jobs.

From Unity's existing ranks, the UI Team was joined by developers Tim Cooper, Leonardo Carneiro, and (a bit later) me. We also had UX designer Stine Munkesø Kjærbøll embedded in the team for a good while.

Michael ended up quitting Unity long before the UI system was released (to work on a new version of NGUI and focus on game development), while Philip stayed on. After Michael's departure, the system continued to evolve and diverge significantly from its NGUI roots. Here's a live Q&A video with the team from September 2014:

Live Q&A video with the UI Team from September 2014.

RectTransform with freeform anchors

I sort of invited myself into the UI Team when I critiqued its layout solution and proposed a concept for a more intuitive alternative with more flexible anchoring functionality, and the ability to manipulate anchoring handles directly in the Scene view instead of only through Inspector properties. The implementation of this concept eventually became known as the RectTransform.

For some unique insight into how this concept was conceived, my original critique and new proposal can be seen in these two videos, which were posted to the closed Unity alpha list on Oct 15, 2013. Yes, I posted my thoughts to the alpha list despite being an employee myself, which was not uncommon at the time. This allowed our alpha users to weigh in on my points and suggestions.

My proposal was met with universal praise from alpha list users and colleagues, and I joined the UI Team to take it from prototype to production. Among other things, this involved implementing significant changes in Unity's core Transform handling. It was a fairly involved undertaking, and I worked on it for a while, with some assistance from colleagues more familiar with Unity's core backend.

During development I posted this video update to the alpha list in December 2013 and this one to the beta list in January 2014. Here's a bit more polished video from May 2014 where I presented the UI system publicly a few months before it was released:

Presentation video of the new UI system a few months prior to its release.

The video covers several of the unique aspects of the freeform anchors. At time stamp 5:18 the anchors are explained, and at 25:53 the powerful implications for animating layouts are demonstrated, which wouldn't be possible with anchors based on discrete enum/dropdown values. I also wrote a blog post to accompany the video:

How UI freeform anchors became an industry standard

The UI freeform anchors concept I designed and implemented for Unity's UI system ended up being copied by both Unreal and Godot, making it effectively an industry standard in game UI tooling.

Visual comparison of Freeform Anchors in Unity, Unreal and Godot.

To be clear, I'm not referring to the general concept of anchoring UI elements — there were plenty of examples of that predating my work. Rather, I'm specifically talking about freeform anchors defined by all of these traits combined:

  • Anchoring can be set separately for each edge of a UI element (left, right, top, bottom) so that anchoring and stretching are unified into one combined concept. Each edge has an absolute offset* from its anchor, in pixels or other non-relative units.
  • Anchors can be dragged around to any point within the parent container, as opposed to being discrete enum/dropdown values. The anchors can be thought of as two coordinate pairs, a min and max anchor, which are specified as normalized (zero to one) values within the parent container rectangle.
  • The anchors are displayed as draggable handles that are pointy, so they don't overlap when set to the same point. If the anchors are together, the UI element is anchored to that point and does not stretch. If they are split apart horizontally, vertically, or both, different edges of the UI element are anchored differently, which causes the UI element to stretch with the parent container, proportional to how far apart the anchors are split.
  • A presets menu shows a grid of the common point and stretch anchor configurations. However, since all values for both anchors and positions are simply floating point numbers, with no reliance on discrete states, it's possible to interpolate from any layout to any other, for example using standard animation keyframes without any UI-specific functionality needed.

The video further up shows demonstrations of this, particularly at the timestamps mentioned below the video.

* There is some flexibility in how these offsets are handled and serialized. In a system without a concept of pivots, each offset could simply be the edge's position relative to its anchor. At Unity, we chose a slightly more complex pivot-centric implementation where the values are stored as an "anchored position" and a "size delta", which I won't go into detail about here. However, the editor UI for the properties can display values in a more intuitive way that doesn't have to correspond directly to the serialized values.
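
To make the model concrete, here are two small snippets using the RectTransform API as it shipped; the helper method names are just for the example.

```csharp
using UnityEngine;

public static class AnchorExamples
{
    // Anchor an element to the parent's top-right corner without stretching:
    // both anchors at the same normalized point, size given by sizeDelta,
    // and an absolute pixel offset via anchoredPosition.
    public static void AnchorToTopRight(RectTransform rt, Vector2 size, Vector2 margin)
    {
        rt.anchorMin = rt.anchorMax = new Vector2(1f, 1f);
        rt.pivot = new Vector2(1f, 1f);
        rt.sizeDelta = size;
        rt.anchoredPosition = -margin;
    }

    // Stretch horizontally with the parent while keeping fixed pixel margins:
    // anchors split apart on X, with each edge a fixed offset from its anchor.
    public static void StretchHorizontally(RectTransform rt, float leftMargin, float rightMargin)
    {
        rt.anchorMin = new Vector2(0f, rt.anchorMin.y);
        rt.anchorMax = new Vector2(1f, rt.anchorMax.y);
        rt.offsetMin = new Vector2(leftMargin, rt.offsetMin.y);   // left edge, relative to left anchor
        rt.offsetMax = new Vector2(-rightMargin, rt.offsetMax.y); // right edge, relative to right anchor
    }
}
```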

Here are some illustrative GIFs I made for the official documentation page about the feature.

Animated GIF showing a button horizontally stretching to follow the sides of its parent container.
Animated GIF showing a button horizontally stretching proportionally to its parent container.

Auto Layout system

Besides the anchor-based layout, I also devised and worked on an additional "auto layout system" for handling use cases not addressed by the RectTransform's top-down layout hierarchy. This was used for components such as vertical layout groups, horizontal layout groups and grid layout groups, where layout adapts to content size in a bottom-up manner, inspired by how it works in IMGUI.

The auto layout system was not part of the original design and scope of the UI system, and ended up being slightly more convoluted to use than it might have been, had it been part of the original design or allocated more development time. Still, it provided significant value over not having any such system at all.
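
For illustration, here's roughly how the auto layout components are combined in practice; the helper below is just a sketch. A layout group drives its children from their preferred sizes, and a ContentSizeFitter lets the container itself adapt to its content.

```csharp
using UnityEngine;
using UnityEngine.UI;

public static class AutoLayoutExample
{
    // Turn a container into a vertical list that grows to fit its children.
    public static void MakeAutoSizedList(GameObject container)
    {
        var group = container.AddComponent<VerticalLayoutGroup>();
        group.childControlHeight = true;       // size children from their preferred heights
        group.childForceExpandHeight = false;  // don't stretch them to fill leftover space
        group.spacing = 4f;

        var fitter = container.AddComponent<ContentSizeFitter>();
        fitter.verticalFit = ContentSizeFitter.FitMode.PreferredSize; // container adapts to content
    }
}
```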

Other UI features

I was involved in many other aspects of the UI system:

  • The integration with the animation system for interaction states such as hover.
  • The keyboard/gamepad automatic navigation mode (demonstrated at 18:26 in the video further up) that automatically determines adjacency between interactive controls without manual setup, and visualizes it with arrows for purposes of inspection and debugging (a small code sketch of the navigation modes follows after this list).
  • The design of many of the UI controls, such as toggles, sliders and scroll-views.
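
Here's the promised sketch of the navigation modes: the same Selectable can rely on automatically computed adjacency or be wired explicitly (the helper names are illustrative).

```csharp
using UnityEngine;
using UnityEngine.UI;

public static class NavigationExample
{
    // Automatic mode (the default): adjacency between controls is computed
    // from their positions, with no manual setup required.
    public static void UseAutomatic(Selectable control)
    {
        Navigation nav = control.navigation;
        nav.mode = Navigation.Mode.Automatic;
        control.navigation = nav;
    }

    // Explicit mode: opt out of the automatic adjacency and wire a link by hand.
    public static void LinkVertically(Selectable upper, Selectable lower)
    {
        Navigation nav = lower.navigation;
        nav.mode = Navigation.Mode.Explicit;
        nav.selectOnUp = upper;
        lower.navigation = nav;
    }
}
```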

Post-release development

I privately kept a list of testimonials/praise that the layout features, and the UI system in general, got from colleagues and users due to how much of an improvement it was.

We shipped several updates for the UI system after the initial release, and were still at it a year after release. Here's a talk about UI performance I did together with Tim at Unite 2015 in Seoul:

Eventually though, most of the team moved on to other projects, and the only person left developing and maintaining the UI system was Philip. Later, the third party extension TextMeshPro (which offered improved text rendering) was bought by Unity, but never seamlessly integrated. Eventually the UI system would get flak for certain use cases that never got addressed and bugs that never got fixed, due to the lack of resources allocated to maintaining, improving and evolving it. Instead of investing in improving the existing solution, a sizable team was eventually formed to develop a new web-inspired UI solution.

Custom build configurations

2014 - Unreleased

I think custom build configurations was the first feature I worked on that was scrapped due to changes in prioritization from above, but it wouldn't be the last. I cared a lot about our users' experience, so I was especially passionate about features with potential for radically improving an aspect of the product. This made it especially hard for me to accept cancellations of such features.

The image shows the existing build window on the left and the one I was developing on the right.

Left: The regular Build Settings window in Unity. Right: The never shipped Build Configurations window that was meant to replace it.

The idea was to replace options specified for a fixed set of target platforms with custom-defined build configurations, and the things you used to be able to specify per target platform could instead be specified per build configuration.

This way it would be possible to specify more than one build configuration for the same platform; for example one for iPhone and one for iPad, or one for regular desktop builds and one for Steam builds.

Custom build configurations also supported inheritance, allowing one configuration to inherit from another and override only select player settings. Furthermore they were integrated with texture overrides and with build config defines.
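
Since the feature never shipped, there's no real API to show, but a purely hypothetical sketch of the inheritance idea might look like this; none of these type or member names existed in Unity.

```csharp
using System.Collections.Generic;

// Hypothetical illustration only: a build configuration overrides selected
// settings and falls back to its parent configuration for everything else.
public class BuildConfiguration
{
    public string name;
    public BuildConfiguration parent; // e.g. a "Steam" config inheriting from "Desktop"

    readonly Dictionary<string, object> overrides = new Dictionary<string, object>();

    public void Set(string setting, object value) => overrides[setting] = value;

    // Walk up the inheritance chain until some configuration defines the setting.
    public object Get(string setting)
    {
        if (overrides.TryGetValue(setting, out object value))
            return value;
        return parent != null ? parent.Get(setting) : null;
    }
}
```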

Blog post for posterity:

Unity would eventually ship other functionality related to build configurations, but I'm not currently familiar with the details.

New Input System (earlier iterations)

2015-2017

The New Input System started its journey because René Damm and I thought it was important to offer a better replacement for Unity's highly lacking Input Manager, so we decided to work on that together.

We spent a few years working on this, gathering user feedback, defining use cases and requirements, researching various input devices, developing the design, implementing and presenting working prototypes, and working on the implementation.

Here's a video from 2016 we posted publicly, which explains the problem space, challenges and ideas at a conceptual level and with some hands-on examples:

Video from 2016 explaining the design of the New Input System.

And a blog post:

And here are a few videos from during development that we posted internally at Unity:

I thought it was well on track, considering it had to support a dizzying array of interlocking requirements: reusable input schemes for local multiplayer; input schemes for different input devices; standardized handling of device archetypes, such as gamepads with partial overlap in functionality and keyboards with regional differences; defining distinct control schemes for different in-game activities; handling both axis/button input and touch/gesture-based input; taking evolving VR hardware requirements into account — and much more. Delaying consideration of certain aspects until after an initial release risked leading to a dead end. After all, there was a reason we couldn't just extend the existing Input Manager, but had to start from scratch. We didn't want to face the same situation again due to a lack of consideration of all requirements from the beginning.

We also regularly got input from other stakeholders in the organization, and had constructive and positive exchanges with them, reinforcing my belief that we were on the right path.

But apparently, not everyone agreed. One day I got back from vacation and found that everything had changed — our team had been disbanded in my absence. It was the most bewildering day of my career, as I tried to piece together what had happened from completely contradictory accounts.

I never did figure out the whole picture, but it clearly involved serious communication failures, power plays over staffing resources between different parts of the organization, and one high-up manager from elsewhere in the organization who had decided that the feature had failed.

As I learned after the fact, this manager considered his part of the organization a vital "customer" to the new Input System, and insisted that the feature did not meet their needs, which had not previously been communicated to us. Furthermore, he'd made his decision without wanting to hear any input or defense from us.

He used this declared failure, along with staff count disputes among his peers, as justification to assume control of the team's responsibilities under his branch of the organization. René moved to this branch to continue working on the Input System, whereas I was neither interested in moving there nor sure the option was ever on the table.

The experience was rather traumatic for me, and miles away from the Unity culture I thought I knew. It made it clear the company was changing in ways I wasn't comfortable with, and my first doubts about staying began at this point. I chose to remain for the time being, but swore to myself I'd never work under that manager.

As for the new Input System, many aspects changed drastically after I left. Still, some of the core ideas and designs remained mostly intact. I suppose it still carries my thumbprints, though I'm not the best judge of how much.

The next team I joined would be the Scene Management Team, but let's first catch up on some part time projects I worked on in parallel around this time.

Visible assets in scenes

2016-2017 - Unreleased

This is another feature I'm very frustrated never shipped. On the surface it may seem like a small quality-of-life improvement, but it would have let several clunky workflows in Unity fall into place and be redesigned much more elegantly.

I collaborated on this feature with Steen Lund and Mads Nyholm during two offsites (Hackweeks) in 2016 and 2017. This was our presentation at the end of the first Hackweek:

Presentation of Visible Assets in Scenes from Unity Hackweek 2016.

Below is a video of our more polished results at the end of the second Hackweek. Notice how assets in the scene are now displayed as children of the GameObjects referencing them. (Also, beware of the bad humor in the presentation style.)

Presentation of Visible Assets in Scenes from Unity Hackweek 2017.

There were other workflows this could have improved, besides the ones in the video. At one point, another team expressed great interest in the feature, hoping to build functionality on top of it. However, the resources and prioritization to make it happen never materialized.

Save play mode changes

2017 - Unreleased

Unity lets you test your game by entering play mode without having to make a build. All editor tools remain available in play mode, but any changes you make are for testing only and are discarded when you exit. This can be highly inconvenient, and easy to forget as well. A way to save play mode changes was a top requested feature for a long time.

There were no official resources allocated for developing this feature, but I worked on it when I could. Unity once had a policy allowing developers to spend one day a week on self-directed projects, known for a long time as FAFF, short for "Fridays Are For Fun". Sometimes Steen Lund, who was more fluent in C++ than me, would work on the feature too.

2017 video demonstration of the feature-complete Save Play mode Changes feature.

The project reached near-completion and was essentially functional, even handling complex Undo edge cases. Yet a few issues remained, relating to more arcane aspects of Unity's C++ codebase, that I couldn't resolve on my own. At this point Steen was not able to spend more time on it either without official resource allocation. As a result, it never shipped. I tried to appeal this on multiple occasions without luck.

Improved Prefabs (aka "Nested Prefabs")

2017-2020 - Shipped in Unity 2018.3+

After my time on the Input Team came to an unexpected end, I accepted an invitation to join the Scene Management Team, also known as the Prefab Team. They were in an early stage of working on improved Prefab workflows popularly known as 'Nested Prefabs'.

By many accounts, this was Unity's most requested feature of all time. It was also a feature I had been advocating for prioritizing for years, so I was happy to get a chance to be involved in developing it. I also already knew many people on the Prefab Team well. I joined the team and remained there for the rest of my time at Unity.

I developed the workflows and frontend of the Improved Prefabs features together with Mads Nyholm, whom I'd previously worked with during my time on the Editor Team. On the backend were Steen Lund, whom I'd frequently worked with, as well as Jakob Schou Jensen and, later, Thomas Andersen. Also working on the feature were Stine Kjærbøll from product management, Nikoline Høgh from UX, and Illia Komendantov from QA.

Iconography

When I first joined the team, the initial concept for Prefab icons in the Hierarchy view was that a Prefab that was nested within another would be displayed with a "nested Prefab" icon, whereas a Prefab instance that was added in the scene as a child to an outer Prefab instance (but wasn't part of the outer Prefab Asset) would be displayed with a normal Prefab icon.

I argued for flipping this: Nested Prefabs should be treated as the norm and not require special denotation. But a Prefab instance added in the hierarchy under another Prefab instance is a modification to that outer Prefab instance (an addition), which is typically important to be aware of. These, I argued, could be displayed with a plus (+) overlay on their icons. And this could be extended to a consistent iconographic system: Added components could show a plus overlay too, and removed ones a minus.

We put a lot of thought into things like this: consistent iconography, terminology, APIs and so on, to help make the (honestly quite complex) functionality as approachable and intuitive as possible. It was a big challenge, especially since we had to work around various legacy decisions and existing functionality. Fortunately, we had the time to get the details right on this project, and I'm very proud of what we ended up delivering.

Applying and reverting overrides

A major source of Prefab complexity is the ability to revert and apply overrides on a Prefab instance. Nested Prefabs and Variants further increased this complexity, as modifications to different parts of a Prefab instance could have different sets of Prefab targets to apply to, since some parts of the instance might belong to one or more nested Prefabs, and other parts not.

I implemented context menus for applying or reverting individual overrides: property modifications, component additions and removals, and added GameObjects.

I also contributed to the Overrides dropdown, which displays a hierarchy of the Prefab GameObjects and provides a diff view of each component's changes.
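
These operations are also exposed through the public PrefabUtility scripting API that shipped alongside the new workflows. Here's a minimal editor-scripting sketch that applies or reverts all overrides on the selected instance at once; the menu paths and class name are made up for the example, and the context menus described above additionally let individual overrides be applied to specific Prefab targets.

```csharp
// Editor script (place in an Editor folder).
using UnityEditor;
using UnityEngine;

public static class PrefabOverrideExample
{
    [MenuItem("Examples/Apply All Overrides On Selection")]
    static void ApplySelection()
    {
        GameObject selected = Selection.activeGameObject;
        if (selected == null || !PrefabUtility.IsPartOfPrefabInstance(selected))
            return;

        // Apply every override on the outermost Prefab instance back to its asset.
        GameObject root = PrefabUtility.GetOutermostPrefabInstanceRoot(selected);
        PrefabUtility.ApplyPrefabInstance(root, InteractionMode.UserAction);
    }

    [MenuItem("Examples/Revert All Overrides On Selection")]
    static void RevertSelection()
    {
        GameObject selected = Selection.activeGameObject;
        if (selected == null || !PrefabUtility.IsPartOfPrefabInstance(selected))
            return;

        // Discard every override, returning the instance to the asset's state.
        GameObject root = PrefabUtility.GetOutermostPrefabInstanceRoot(selected);
        PrefabUtility.RevertPrefabInstance(root, InteractionMode.UserAction);
    }
}
```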

In general Mads and I worked closely together on the frontend of the feature, and even on things that were handled mostly by him or me, we'd regularly bounce ideas off each other. This was in addition to more formal meetings and team-wide discussions.

Here's a video we shared internally at Unity showing the status as of December 2017:

Internal video demonstration of the improved Prefab features in December 2017.

Release

At the end of the Unite Berlin 2018 keynote, Nikoline announced that the new Prefab workflows ("Nested Prefabs!") were available in preview, to the effect of a literal confetti explosion and enthusiastic applause.

The conference had quite a few talks centered around the new Prefab workflows. Here's a technical deep dive talk by Steen and me:

Prefabs Technical Deep Dive presentation by Steen Lund and me at Unite 2018.

Prefab mode in context

After the initial release, the team continued further improving the Prefab workflows. The primary feature I remember in this regard was Prefab Mode in Context. This was an improvement to Prefab Mode which, when a Prefab was opened for editing via an instance in the scene, would display the scene as context while editing the asset.

Here's a Meet the Devs session from (online-only) Unite 2020 where Mads Nyholm and I discussed our work, covering past and upcoming Prefab features in the first half of the session, and answering questions from the user community in the second half:

Meet the Devs session from Unite 2020 where Mads Nyholm and I discussed new Prefab features and answered questions from the community.

You can find many more talks about Prefabs on the Unity YouTube channel here, including several by my former teammates from the Scene Management / Prefab team. And here's a blog post I wrote about Prefab Mode in Context:

Prefab black-boxing

Prefab black-boxing aka encapsulation was a highly requested feature to promote structure and clarity in a project's Prefab usage and reduce the risk of human error. We worked on it in late 2019 to early 2020.

Internal video demonstration of the Prefab black-boxing prototype in December 2019.

I worked on a prototype to expose Prefab properties as seen in the video above. While the prototype functioned as shown, changes to exposed properties still created overrides on the underlying properties too, so the encapsulation was not truly enforced beyond the user interface. That's why it was only a prototype. It never moved past this stage, as it was put on hold indefinitely for reasons coming from above that I no longer recall.

Particle system improvements

2019 - Shipped in Unity 2020.1

For Unity HackWeek 2019 I implemented two improvements to Unity's Shuriken particle system: Improved stretched particles, and improved moving fire. These shipped in Unity 2020.1, and you can see them demonstrated in this quick one-minute video:

My Unity Hackweek 2019 presentation about improved stretched particles and improved moving fire.

Calling it quits

Dec 2020

In December 2020 I left Unity after having worked there for 12 years. I wrote more about that here:

To give a bit more context, I decided to stop for a variety of reasons, most of them interrelated.

Unity had become increasingly corporate in a way I didn't like at all. Upper management kept making tone-deaf decisions that faced internal protests, only to go ahead anyway and then face the same protests from users along with negative media coverage. To their credit, they backpedaled often enough, but seemingly without ever taking any lessons from the backlash. This was tragic to witness, compared to the incredible goodwill and appreciation the company received from the community in earlier days.

The increasingly corporate culture also meant job areas got more rigid. Unity had a strong culture of open discussion from its early days, and while the company tried to maintain that appearance, in practice speaking up was increasingly frowned upon and discouraged.

A pattern slowly emerged where I would get stern words from managers above me (not team leads) warning me about how I was getting negative attention from managers of other departments, while I would simultaneously get frequent private messages from other employees "on the floor" thanking me for speaking up about things they didn't dare to. But the friction and feeling of fighting an uphill battle eventually got strong enough that even I, reluctantly, began to keep my head down. I did not like what this was doing to my mental health, and towards the end I felt utterly alienated in a company I'd been in longer than 99% of the other people in it.

For more thoughts on Unity's culture changing over the years, I recommend the blog post "Random thoughts about Unity" by fellow Unity old-timer Aras Pranckevičius.

Remember the manager I swore I would never work under, back in the section about the Input System? Well, that person got promoted to head of all of R&D, meaning all developers working on the Unity engine and editor, including me, now worked under him. While I didn't quit my job immediately when this happened, my resolve to leave strengthened.

That said, I ended up staying for almost another year. During this time I moved from Denmark to Finland to follow my partner who had gotten a job there. Unity let me relocate and be formally moved to the Finnish part of the organization, while still staying in the Copenhagen-based Scene Management Team. This was during Covid, so the entire team worked remotely anyway. Even though I ended up quitting just six months later, I'm still thankful they let me relocate like this, as it reduced things to worry about during a major transition in my life.

In the final months of my time at Unity, all interesting projects had been put on hold and the team was focused on bug fixing, due to metrics indicating a large number of legacy bugs in our team's area. I don't mind fixing my own bugs, or bugs in areas of code I'm well familiar with, but I was a developer focused on frontend and C# code, while most of the bugs we had to fix were in backend C++ parts of the code that I hated touching because I was utterly inefficient at it. This prolonged bug fixing marathon was soul crushing for me, as I felt my talents were completely wasted. For the first time in my career I came to work each day just longing for the time I could leave again. (I know that's the norm for many, and I genuinely feel for them.)

Meanwhile, the VR game I was developing on the side, Eye of the Temple, was generating excitement among VR players and receiving positive press coverage. The prospect of going full-time indie was alluring.

Unity went public on the stock market in late 2020. Once I realized my stock options were valuable enough to easily cover my living expenses while I completed development of Eye of the Temple (which was very far along already), the last mental barrier dissolved, and it felt like an obvious choice to start a new chapter of my life as a full-time indie developer. After quitting in December 2020 I released Eye of the Temple in October 2021, a Quest version in April 2023, and I'm now working on a new game.

It's now (in 2025) over four years since I quit Unity, and I don't regret a thing. That said, I genuinely enjoyed the majority of my time at Unity. I had wonderful and talented colleagues, interesting projects to work on, and a chance to do my little part in reshaping the game industry to make it possible for far more people to make games.
